Act now and download your Microsoft 70-646 test today! Do not waste time on worthless Microsoft 70-646 tutorials. Download the updated Microsoft PRO: Windows Server 2008, Server Administrator exam with real questions and answers and begin to learn Microsoft 70-646 like a true professional.
Q101. - (Topic 18)
You are designing a Windows Server 2008 R2 deployment strategy for the Minneapolis campus servers. Which deployment strategy should you recommend?
A. Install from media.
B. Use a discover image in WDS.
C. Deploy a VHD image.
D. Deploy a WIM image.
Answer: D
Explanation:
Requirements - BitLocker is needed on all disks in Minneapolis, and installations must be done remotely.
VHD Image
According to the official MS courseware book 6433A, a VHD cannot contain more than one partition. If true, that rules VHD images out, because you need BitLocker and BitLocker requires two partitions, so answer C is wrong. See also http://technet.microsoft.com/en-us/library/dd363560.aspx:
A supported .vhd image. The only supported operating systems are Windows Server 2008 R2, Windows 7 Enterprise, and Windows 7 Ultimate. Fixed, dynamic, and differencing .vhd images are supported. However, note that a supported image cannot contain the following: More than one operating system.
More than one partition.
Applications or data (instead of an operating system).
A 64-bit operating system that is partitioned with a GUID partition table (GPT).
So again, further evidence that C is not the right answer, as BitLocker needs two partitions.
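For what it's worth, if you did need to deploy a VHD image with WDS, the cited TechNet article notes it can only be added and managed from the command line with WDSUTIL, not from the snap-in. A minimal sketch, assuming a made-up .vhd path and image group name:

    REM Create an image group for VHD images, then add the .vhd as an install image
    wdsutil /Add-ImageGroup /ImageGroup:"VHD Images"
    wdsutil /Verbose /Progress /Add-Image /ImageFile:"C:\VHDs\w2k8r2.vhd" /ImageType:Install /ImageGroup:"VHD Images"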
At first I was leaning toward answer B, for the following reasons.
WDS Images
WDS uses two different types of images: install images and boot images. Install images are the operating system images that will be deployed to computers running Windows Server 2008 R2, Windows Server 2008, Windows 7, or Windows Vista. A default installation image named Install.wim is located in the \Sources directory of the installation DVD. If you are using WDS to deploy Windows 7 to computers with different processor architectures, it will be necessary to add separate installation images for each architecture to the WDS server. Architecture-specific images can be found on the architecture-specific installation media; for example, the Itanium image is located on the Itanium installation media, and the x64 default installation image is located on the x64 installation media. Although it is possible to create custom images, it is necessary to have only one image per processor architecture. For example, deploying Windows Server 2008 R2 Enterprise edition x64 to a computer with two x64 processors and to a computer with eight x64 processors in SMP configuration only requires access to the default x64 installation image.
Boot images are used to start a client computer prior to the installation of the operating system image. When a computer starts off a boot image over the network, a menu is presented that displays the possible images that can be deployed to the computer from the WDS server. The Windows Server 2008 R2 Boot.wim file allows for advanced deployment options, and this file should be used instead of the Boot.wim file that is available from other sources.
In addition to the basic boot image, there are two separate types of additional boot images that can be configured for use with WDS. The capture image is a boot image that starts the WDS capture utility. This utility is used with a reference computer, prepared with the Sysprep utility, as a method of capturing the reference computer’s image for deployment with WDS. The second type of additional boot image is the discover image. Discover images are used to deploy images to computers that are not PXE-enabled or on networks that don’t allow PXE. These images are written to CD, DVD, or USB media, and the computer is started from the media rather than from the PXE network card, which is the traditional method of using WDS.
I'm going to make a big assumption that the Minneapolis servers are on a different subnet, which makes sense because they are all different campuses for a college. But if there is a DHCP server, or IP Helper is enabled, then that won't be a problem. So B may not be the answer.
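For reference, if PXE really were blocked, a discover image is created from an existing boot image with WDSUTIL and then written to removable media. A rough sketch (the image name and output path are placeholders):

    REM Create a discover boot image from a boot image already on the WDS server
    wdsutil /New-DiscoverImage /Image:"Microsoft Windows Setup (x64)" /Architecture:x64 /DestinationImage /FilePath:"C:\Temp\discover.wim"
    REM Burn discover.wim into boot media and start the target server from that media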
Media Install
The scenario specifically says they use WDS for deployment, and WDS is all about using images, so that would rule out a media install. You can do unattended media installs, but that requires sending a DVD and a corresponding USB key with an answer file to the site and having them inserted into the server. GDI uses PXE-enabled network cards, which implies media is not used, as images are stored centrally. That leaves answer D: deploying a WIM install image through WDS satisfies both the remote-installation requirement and the BitLocker two-partition requirement.
Q102. - (Topic 1)
Your network consists of a single Active Directory domain. All domain controllers run Windows Server 2008 R2. All servers run Windows Server 2008 R2. All client computers run Windows 7.
You need to generate a monthly report on the status of software updates for the client computers.
Your solution must meet the following requirements:
. Display all of the operating system updates that installed successfully.
. Display all of the Microsoft Application updates that installed successfully.
. Display all of the operating system updates that failed to install.
. Display all of the Microsoft Application updates that failed to install.
. Minimize administrative effort.
. Minimize costs.
What should you do?
A. Install Microsoft System Center Essentials (Essentials) 2007. Deploy management agents on all client computers.
B. Install Microsoft System Center Configuration Manager (SysMgr) 2007. Deploy management agents on all client computers.
C. Install Windows Server Update Services (WSUS) 3.0 SP2. Configure Windows Update by using a Group Policy object (GPO).
D. Deploy Microsoft Baseline Security Analyzer (MBSA) 2.1 on the client computers. Run MBSA on each client computer, and save the report to a shared folder on the network.
Answer: C
Explanation:
WSUS 3.0 SP2 is a free download, it covers both operating system and Microsoft application updates, and its built-in Update and Computer Status reports show successful and failed installations, so it meets the requirements at the lowest cost and administrative effort.
http://technet.microsoft.com/en-us/library/dd939886%28WS.10%29.aspx
What’s new in this release?
- Integration with Windows Server 2008 R2
- Support for the BranchCache feature in Windows Server 2008 R2
- Support for Windows 7 client computers
New features
- Automatic approval rules include the ability to specify the approval deadline date and time for all computers or for specific computer groups.
- Improved handling of language selection for downstream servers includes a new warning dialog that appears when you decide to download updates only for specified languages.
- New Update and Computer Status reports let you filter updates that are approved for installation. You can run these reports from the WSUS administration console or use the application programming interface (API) to incorporate this functionality into your own reports.
Windows Update Agent improvements
- Client computer scan time is faster than in previous versions.
- Computers that are managed by WSUS servers can now run “scoped” scans against those servers, instead of performing a full scan. This results in faster scans for applications that use Microsoft Update APIs, such as Windows Defender.
- User experience improvements help users organize updates and provide greater clarity on update value and behavior.
- Imaged computers are more clearly displayed in the WSUS administration console. For more information, see article 903262 in the Microsoft Knowledge Base.
- Prevents APIs that are called by non-local system callers in a non-interactive session from failing.
- Prevents error code 0x80070057 when you try to install 80 or more updates at the same time from the Windows Update Web page or from the Microsoft Update Web page.
- Improves scan times for Windows Update.
- Improves the speed at which signature updates are delivered.
- Enables support for Windows Installer reinstallation functionality.
- Improves error messaging.
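For context, the GPO in answer C (Computer Configuration\Administrative Templates\Windows Components\Windows Update) simply writes well-known policy values to the clients' registries. A sketch of the equivalent values, assuming a hypothetical WSUS server URL:

    REM These values are normally delivered by the GPO; shown here only to illustrate what it configures
    reg add "HKLM\Software\Policies\Microsoft\Windows\WindowsUpdate" /v WUServer /t REG_SZ /d "http://wsus01.contoso.com" /f
    reg add "HKLM\Software\Policies\Microsoft\Windows\WindowsUpdate" /v WUStatusServer /t REG_SZ /d "http://wsus01.contoso.com" /f
    reg add "HKLM\Software\Policies\Microsoft\Windows\WindowsUpdate\AU" /v UseWUServer /t REG_DWORD /d 1 /f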
Q103. - (Topic 1)
...
Your network consists of a single Active Directory domain. All servers run Windows Server 2008 R2. All client computers run Windows 7. Some users have laptop computers and work remotely from home.
You need to plan a data provisioning infrastructure to secure sensitive files. Your plan must meet the following requirements:
Files must be stored in an encrypted format.
Files must be accessible by remote users over the Internet.
Files must be encrypted while they are transmitted over the Internet.
What should you include in your plan?
A. Deploy one Microsoft SharePoint Foundation 2010 site. Require users to access the SharePoint site by using a Secure Socket Tunneling Protocol (SSTP) connection.
B. Deploy two Microsoft SharePoint Foundation 2010 sites. Configure one site for internal users. Configure the other site for remote users. Publish the SharePoint sites by using HTTPS.
C. Configure a Network Policy and Access Services (NPAS) server to act as a VPN server. Require remote users to access the files by using an IPsec connection to the VPN server.
D. Store all sensitive files in folders that are encrypted by using Encrypting File System (EFS). Require remote users to access the files by using Secure Socket Tunneling Protocol (SSTP).
Answer: D
Explanation:
MCITP Self-Paced Training Kit Exam 70-646 Windows Server Administration:
Encrypting File System
Encrypting File System (EFS) is another method through which you can ensure the integrity of data. Unlike BitLocker, which encrypts all data on a volume using a single encryption key that is tied to the computer, EFS allows for the encryption of individual files and folders using a public encryption key tied to a specific user account. The encrypted file can only be decrypted using a private encryption key that is accessible only to the user. It is also possible to encrypt documents to other users’ public EFS certificates. A document encrypted to another user’s public EFS certificate can only be decrypted by that user’s private certificate.
Security Groups cannot hold encryption certificates, so the number of users that can access an encrypted document is always limited to the individual EFS certificates that have been assigned to the document. Only a user that originally encrypts the file or a user whose certificate is already assigned to the file can add another user’s certificate to that file. With EFS there is no chance that an encrypted file on a departmental shared folder might be accessed by someone who should not have access because of incorrectly configured NTFS or Shared Folder permissions. As many administrators know, teaching regular staff to configure NTFS permissions can be challenging. The situation gets even more complicated when you take into account Shared Folder permissions. Teaching staff to use EFS to limit access to documents is significantly simpler than explaining NTFS ACLs.
If you are considering deployment of EFS throughout your organization, you should remember that the default configuration of EFS uses self-signed certificates. These are certificates generated by the user’s computer rather than a Certificate Authority and can cause problems with sharing documents because they are not necessarily accessible from other computers where the user has not encrypted documents. A more robust solution is to modify the default EFS Certificate Template that is provided with a Windows Server 2008 Enterprise Certificate Authority to enable autoenrollment. EFS certificates automatically issued by an Enterprise CA can be stored in Active Directory and applied to files that need to be shared between multiple users.
Another EFS deployment option involves smart cards. In organizations where users authenticate using smart cards, their private EFS certificates can be stored on a smart card and their public certificates stored within Active Directory. You can learn more about configuring templates for autoenrollment in Chapter 10, “Certificate Services and Storage Area Networks.”
MORE INFO: More on EFS
For more information on Encrypting File System in Windows Server 2008, consult the following TechNet article: http://technet2.microsoft.com/windowsserver2008/en/library/f843023b-bedd-40dd-9e5b-f1619eebf7821033.mspx?mfr=true.
Quick Check
1. From a normal user’s perspective, in terms of encryption functionality, how does EFS differ from BitLocker?
2. What type of auditing policy should you implement to track access to sensitive files?
Quick Check Answers
1. BitLocker works on entire volumes and is transparent to the user. EFS works on individual files and folders and can be configured by the user.
2. Auditing Object Access.
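Back on the EFS half of answer D: on the file server, encrypting a folder and granting a second user access to an encrypted file can be done with cipher.exe. A quick sketch (the paths and the certificate thumbprint are placeholders):

    REM Encrypt a folder so that files added to it are encrypted
    cipher /e "D:\SensitiveFiles"
    REM Add another user's EFS certificate (by SHA-1 thumbprint) to an encrypted file
    cipher /adduser /certhash:0123456789ABCDEF0123456789ABCDEF01234567 "D:\SensitiveFiles\forecast.xlsx"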
Windows Server 2008 VPN Protocols
Windows Server 2008 supports three different VPN protocols: Point-to-Point Tunneling Protocol (PPTP), Layer Two Tunneling Protocol over IPsec (L2TP/IPsec), and Secure Socket Tunneling Protocol (SSTP). The factors that will influence the protocol you choose to deploy in your own network environment include client operating system, certificate infrastructure, and how your organization’s firewall is deployed.
For example, you must plan to deploy PPTP or L2TP/IPsec for Windows XP remote access clients, because these clients cannot use SSTP.
SSTP Secure Socket Tunneling Protocol (SSTP) is a VPN technology that makes its debut with Windows Server 2008. SSTP VPN tunnels allow traffic to pass across firewalls that block traditional PPTP or L2TP/IPsec VPN traffic. SSTP works by encapsulating Point-to-Point Protocol (PPP) traffic over the Secure Sockets Layer (SSL) channel of the Secure Hypertext Transfer Protocol (HTTPS) protocol. Expressed more directly, SSTP piggybacks PPP over HTTPS. This means that SSTP traffic passes across TCP port 443, which is almost certain to be open on any firewall between the Internet and a public-facing Web server on an organization’s screened subnet. When planning for the deployment of SSTP, you need to take into account the following considerations:
SSTP is only supported with Windows Server 2008 and Windows Vista with Service Pack 1.
SSTP requires that the client trust the CA that issues the VPN server’s SSL certificate.
The SSL certificate must be installed on the server that will function as the VPN server prior to the installation of Routing and Remote Access; otherwise, SSTP will not be available.
The SSL certificate subject name and the host name that external clients use to connect to the VPN server must match, and the client Windows Vista SP1 computer must trust the issuing CA.
SSTP does not support tunneling through Web proxies that require authentication.
SSTP does not support site-to-site tunnels. (PPTP and L2TP do.)
MORE INFO More on SSTP
To learn more about SSTP, see the following SSTP deployment walkthrough document at http://download.microsoft.com/download/b/1/0/b106fc39-936c-4857-a6ea-3fb9d1f37063/Deploying%20SSTP%20Remote%20Access%20Step%20by%20Step%20Guide.doc.
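Since SSTP rides on HTTPS, a quick way to check that the VPN server has an SSL certificate bound to TCP 443 (which must be in place before Routing and Remote Access is installed) is:

    REM Show the SSL certificate bindings registered with HTTP.SYS; SSTP uses the 0.0.0.0:443 binding
    netsh http show sslcert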
Q104. - (Topic 14)
You need to recommend an administrative solution for the local support technicians in the satellite offices. The solution must meet the company's security requirements.
What should you include in the recommendation?
A. Active Directory delegation
B. Administrator Role Separation
C. managed service accounts
D. Restricted Groups
Answer: B
Explanation:
http://technet.microsoft.com/en-us/library/cc753170%28WS.10%29.aspx This topic explains how you can use Administrator Role Separation (ARS) on a read-only domain controller (RODC) to delegate RODC administration to a user who is not a member of the Domain Admins group. One problem encountered by administrators of domain controllers in perimeter networks is that domain controllers typically have to be set up and administered by domain administrators. Administrative operations, such as applying software updates, performing an offline defragmentation, or backing up the system, cannot be delegated. With the introduction of RODCs, domain administrators can delegate both the installation and the administration of RODCs to any domain user, without granting them any additional rights in the domain. The ability to perform this delegation is called ARS.
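In practice, the delegation is done either by naming a delegated administrator account when the RODC is installed, or afterward on the RODC with the dsmgmt tool. A minimal sketch, assuming a hypothetical CONTOSO\BranchTech account (the commands after dsmgmt are typed at its prompts):

    REM On the RODC, delegate local administration to a support technician
    dsmgmt
    local roles
    add CONTOSO\BranchTech administrators
    list roles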
Q105. - (Topic 4)
You need to recommend a backup solution for the VMs that meets the museum's technical requirements. What should you include in the recommendation?
A. On each VM, perform a full server backup by using Windows Server Backup.
B. On each physical node, perform a full server backup by using Windows Server Backup.
C. Deploy Microsoft System Center Data Protection Manager 2010 and create a new protection group.
D. Deploy Microsoft System Center Virtual Machine Manager (VMM) 2008 R2 and schedule checkpoints.
Answer: C
Explanation:
http://technet.microsoft.com/en-us/library/ff399260.aspx
What is Data Protection Manager?
Microsoft System Center Data Protection Manager (DPM) 2010 is a member of the Microsoft System Center family of management products, designed to help IT professionals manage their Windows environment. DPM provides Windows backup and recovery—delivering seamless data protection for Microsoft application and file servers by using integrated disk and tape media. DPM performs replication, synchronization, and recovery point creation to provide reliable protection and rapid recovery of data for both system administrators and end users. Creating a DPM protection group lets the VMs be protected centrally, at the Hyper-V host level and across every node, from one console, which the per-server Windows Server Backup jobs in answers A and B cannot match.
What is a custom volume?
You can assign a custom volume to a protection group member, in place of the DPM storage pool. A custom volume is a volume that is not in the DPM storage pool and is specified to store the replica and recovery points for a protection group member. Any volume that is attached to the DPM server can be selected as a custom volume, except the volume that contains the system and program files. To use custom volumes for a protection group member, two custom volumes must be available: one volume to store the replica and one volume to store the recovery points.
Q106. - (Topic 1)
Your network is configured as shown in the following diagram.
You deploy an enterprise certification authority (CA) on the internal network. You also deploy a Microsoft Online Responder on the internal network. You need to recommend a secure method for Internet users to verify the validity of individual certificates.
The solution must minimize network bandwidth.
What should you recommend?
A. Deploy a subordinate CA on the perimeter network.
B. Install a standalone CA and the Network Device Enrollment Service (NDES) on a server on the perimeter network.
C. Install a Network Policy Server (NPS) on a server on the perimeter network. Redirect authentication requests to a server on the internal network.
D. Install Microsoft Internet Information Services (IIS) on a server on the perimeter network.
Configure IIS to redirect requests to the Online Responder on the internal network.
Answer: D
Explanation:
An Online Responder answers certificate status requests by using the Online Certificate Status Protocol (OCSP). A single OCSP response covers an individual certificate and is far smaller than a full certificate revocation list, which minimizes bandwidth, and redirecting requests through an IIS server in the perimeter network means the internal Online Responder is never exposed directly to the Internet.
http://www.ipsure.com/blog/2010/installation-and-configuration-of-active-directory-certificate-services-on-windows-server-2008-r2-1/
http://msdn.microsoft.com/en-us/library/cc732956.aspx
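One way to implement answer D on the perimeter IIS server is the HTTP Redirect feature, configurable with appcmd. A sketch, assuming a hypothetical internal responder URL and that requests arrive at the default web site:

    REM Redirect incoming requests on the perimeter web server to the internal Online Responder
    %windir%\system32\inetsrv\appcmd set config "Default Web Site" /section:httpRedirect /enabled:true /destination:"http://ocsp.contoso.internal/ocsp" /exactDestination:false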
Q107. - (Topic 1)
Your network contains a Windows Server 2008 R2 server that functions as a file server. All
users have laptop computers that run Windows 7.
The network is not connected to the Internet.
Users save files to a shared folder on the server.
You need to design a data provisioning solution that meets the following requirements:
. Users who are not connected to the corporate network must be able to access the files and the folders in the corporate network.
. Unauthorized users must not have access to the cached files and folders.
What should you do?
A. Implement a certification authority (CA). Configure IPsec domain isolation.
B. Implement a certification authority (CA). Configure Encrypting File System (EFS) for the drive that hosts the files.
C. Implement Microsoft SharePoint Foundation 2010. Enable Secure Socket Layer (SSL) encryption.
D. Configure caching on the shared folder. Configure offline files to use encryption.
Answer: D
Explanation:
MCITP Self-Paced Training Kit Exam 70-646 Windows Server Administration:
Lesson 2: Provisioning Data
Lesson 1 in this chapter introduced the Share And Storage Management tool, which gives you access to the Provision Storage Wizard and the Provision A Shared Folder Wizard. These tools allow you to configure storage on the volumes accessed by your server and to set up shares. When you add the Distributed File System (DFS) role service to the File Services server role you can create a DFS Namespace and go on to configure DFSR. Provisioning data ensures that user files are available and remain available even if a server fails or a WAN link goes down. Provisioning data also ensures that users can work on important files when they are not connected to the corporate network.
In a well-designed data provisioning scheme, users should not need to know the network path to their files, or from which server they are downloading them. Even large files should typically download quickly—files should not be downloaded or saved across a WAN link when they are available from a local server. You need to configure indexing so that users can find information quickly and easily. Offline files need to be synchronized quickly and efficiently, and whenever possible without user intervention. A user should always be working with the most up-to-date information (except when a shadow copy is specified) and fast and efficient replication should ensure that where several copies of a file exist on a network they contain the same information and latency is minimized.
You have several tools that you use to configure shares and offline files, configure storage, audit file access, prevent inappropriate access, prevent users from using excessive disk resource, and implement disaster recovery. However, the main tool for provisioning storage and implementing a shared folder structure is DFS Management, specifically DFS Namespaces. The main tool for implementing shared folder replication in a Windows Server 2008 network is DFS Replication.
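A sketch of answer D from the command line, assuming a hypothetical share named Data on D:\Data. The encryption half is normally delivered through the "Encrypt the Offline Files cache" Group Policy setting; the registry value shown for it here is an assumption for illustration:

    REM Share the folder with client-side caching of documents enabled
    net share Data=D:\Data /grant:everyone,change /cache:documents
    REM Policy value behind the "Encrypt the Offline Files cache" GPO setting on client computers
    reg add "HKLM\Software\Policies\Microsoft\Windows\NetCache" /v EncryptCache /t REG_DWORD /d 1 /f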
Q108. - (Topic 1)
.....
You are designing a server infrastructure to support a new stateful Application.
The server infrastructure must meet the following requirements:
Use two servers, each with two NIC cards and 32 GB of RAM.
Provide access to the Application in the event of the failure of a single server.
Provide the ability to scale up the Application.
Minimize the attack surface of each server.
Minimize server disk space requirements.
You need to design a server infrastructure that meets the requirements.
What should you recommend? (More than one answer choice may achieve the goal. Select the BEST answer.)
A. Perform a Server Core installation of Windows Server 2008 R2 Standard Edition. Configure both servers in a failover cluster.
B. Perform a Server Core installation of Windows Server 2008 R2. Configure both servers in a Windows Network Load Balancing array.
C. Install Windows Server 2008 R2 on both servers. Use DNS Round Robin to balance the load between the servers.
D. Install Windows Server 2008 R2 on both servers. Configure both servers in a Windows Network Load Balancing array.
Answer: A
Explanation:
Failover clusters are designed for applications that have long-running in-memory state, or that have large, frequently updated data states. These are called stateful applications, and they include database applications and messaging applications. Typical uses for failover clusters include file servers, print servers, database servers, and messaging servers.
Not B (stateful application in this scenario): Network Load Balancing is intended for applications that do not have long-running in-memory state. These are called stateless applications. A stateless application treats each client request as an independent operation, and therefore it can load-balance each request independently. Stateless applications often have read-only data or data that changes infrequently. Front-end Web servers, virtual private networks (VPNs), File Transfer Protocol (FTP) servers, and firewall and proxy servers typically use Network Load Balancing. Network Load Balancing clusters can also support other TCP- or UDP-based services and applications.
Note:
* Windows Server 2008 provides two clustering technologies: failover clusters and Network Load Balancing (NLB). Failover clusters primarily provide high availability; Network Load Balancing provides scalability and at the same time helps increase availability of Web-based services.
* Server Core provides you with a minimal installation of Windows Server 2008 that supports installing only certain server roles. Server Core includes Network Load Balancing and Failover Clustering.
Reference: Failover Cluster Overview
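As a sketch of what answer A looks like on Server Core (the node names and cluster IP address are made up, and the cluster.exe switches are from memory, so verify them in your environment):

    REM On each Server Core node, enable the failover clustering feature
    dism /online /enable-feature /featurename:FailoverCluster-Core
    REM From one node, create the two-node failover cluster
    cluster /cluster:APPCLUSTER /create /nodes:"NODE1 NODE2" /ipaddress:10.0.0.50/255.255.255.0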
Q109. - (Topic 7)
You need to recommend changes to the infrastructure to ensure that DFS meets the company's security requirements. What should you include in the recommendation?
A. Upgrade DFS2 to Windows Server 2008 R2.
B. Implement access-based enumeration (ABE).
C. Implement Authentication Mechanism Assurance.
D. Configure the DFS namespace to use Windows Server 2008 mode.
Answer: A
Explanation:
Users must only be able to modify the financial forecast reports on DFS1. DFS2 must contain a read-only copy of the financial forecast reports. Both servers are part of the same replication group, and the namespace is in Windows 2000 Server mode.
http://blogs.technet.com/b/filecab/archive/2009/04/01/configuring-a-read-only-replicated-folder.aspx
Please read the following notes carefully before deploying the read-only replicated folders feature.
a) Feature applicability: The read-only replicated folders feature is available only on replication member servers which are running Windows Server 2008 R2. In other words, it is not possible to configure a replicated folder to be read-only on a member server running either Windows Server 2003 R2 or Windows Server 2008.
b) Backwards compatibility: Only the server hosting read-only replicated folders needs to be running Windows Server 2008 R2. The member server that hosts a read-only replicated folder can replicate with partners that are on Windows Server 2003 or Windows Server 2008. However, to configure and administer a replication group that has a read-only replicated folder, you need to use the DFS Management MMC snap-in on Windows Server 2008 R2.
c) Administration of read-only replicated folders: In order to configure a replicated folder as a read-only replicated folder, you need to use the DFS Management MMC snap-in on Windows Server 2008 R2. Older versions of the snap-in (available on Windows Server 2003 R2 or Windows Server 2008) cannot configure or manage a read-only replicated folder. In other words, these snap-ins will not display the option to mark a replicated folder 'read-only'.
d) Schema updates: If you have an older version of the schema (pre-Windows Server 2008), you will need to update your Active Directory schema to include the DFS Replication schema extensions for Windows Server 2008.
http://blogs.technet.com/b/filecab/archive/2009/01/21/read-only-replicated-folders-on-windows-server-2008-r2.aspx
Why deploy read-only replicated folders? Consider the following scenario. Contoso Corporation has a replication infrastructure similar to that depicted in the diagram below. Reports are published to the datacenter server and these need to be distributed to Contoso’s branch offices. DFS Replication is configured to replicate a folder containing these published reports between the datacenter server and branch office servers.
The DFS Replication service is a multi-master file replication engine – meaning that changes can be made to replicated data on any of the servers taking part in replication. The service then ensures that these changes are replicated out to all other members in that replication group and that conflicts are resolved using ‘last-writer-wins’ semantics.
Now, a Contoso employee working in a branch office accidentally deletes the ‘Specs’ sub-folder from the replicated folder stored on that branch office’s file server. This accidental deletion is replicated by the DFS Replication service, first to the datacenter server and then via that server to the other branch offices. Soon, the ‘Specs’ folder gets deleted on all of the servers participating in replication. Contoso’s file server administrator now needs to restore the folder from a previously taken backup and ensure that the restored contents of the folder once again replicate to all branch office file servers.
Administrators need to monitor their replication infrastructure very closely in order to prevent such situations from arising or to recover lost data if needed. Strict ACLs are a way of preventing these accidental modifications from happening, but managing ACLs across many branch office servers and for large amounts of replicated data quickly degenerates into an administrative nightmare. In case of accidental deletions, administrators need to scramble to recover data from backups (often up-to-date backups are unavailable) and in the meantime, end-users face outages leading to loss of productivity.
This situation can be prevented by configuring read-only replicated folders on branch office file servers. A read-only replicated folder ensures that no local modifications can take place and the replica is kept in sync with a read-write enabled copy by the DFS Replication service. Therefore, read-only replicated folders enable easy-to-deploy and low-administrative-overhead data publication solutions, especially for branch office scenarios.
How does all this work? For a read-only replicated folder, the DFS Replication service intercepts and inspects every file system operation. This is done by virtue of a file system filter driver that layers above every replicated folder that is configured to be read-only. Volumes that do not host read-only replicated folders, or volumes hosting only read-write replicated folders, are ignored by the filter driver.
Only modifications initiated by the service itself are allowed – these modifications are typically caused by the service installing updates from its replication partners. This ensures that the read-only replicated folder is maintained in sync with a read-write enabled replicated folder on another replication partner (presumably located at the datacenter server). All other modification attempts are blocked – this ensures that the contents of the read-only replicated folder cannot be modified locally. As shown in the below figure, end-users are unable to modify the contents of the replicated folder on servers where it has been configured to be read-only. The behavior is similar to that of a read-only SMB share – contents can be read and attributes can be queried for all files, however, modifications are not possible.
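On Windows Server 2008 R2 the read-only flag itself is set in the DFS Management snap-in (per the notes above), but a replication group's memberships can at least be inspected from the command line with DfsrAdmin. A sketch with a hypothetical replication group name:

    REM List the memberships of a replication group, including per-member settings
    dfsradmin membership list /rgname:"Financial Reports"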
Q110. - (Topic 5)
You need to recommend changes to the DFS infrastructure that meet the company's technical requirements. What should you recommend implementing in each branch office? (Each correct answer presents part of the solution. Choose two.)
A. a DFS namespace server
B. a DFS replica
C. a standalone DFS namespace
D. BranchCache in Distributed Cache mode
E. BranchCache in Hosted Cache mode
Answer: A,B
Explanation:
Answer A gives each branch a local namespace server to answer referrals; answer B gives each branch a local replica so the data itself is served locally.
When deploying domain-based namespaces, you can add additional namespace servers to host a namespace. This has several advantages: If one namespace server hosting the namespace goes down, the namespace will still be available to users who need to access shared resources on your network. Adding another namespace server thus increases the availability of your namespace. If you have a namespace that must be available to users all across your organization but your Active Directory network has more than one site, then each site should have a namespace server hosting your namespace. That way, when users in a site need to contact a namespace server for referrals, they can do so locally instead of sending traffic requests to other sites. This improves performance and reduces unnecessary WAN traffic. Note that adding additional namespace servers is only supported for domain-based namespaces, not standalone namespaces.
http://technet.microsoft.com/en-us/library/cc732863%28v=ws.10%29.aspx
DFS Namespaces enables you to group shared folders located on different servers by transparently connecting them to one or more namespaces. A namespace is a virtual view of shared folders in an organization. When you create a namespace, you select which shared folders to add to the namespace, design the hierarchy in which those folders appear, and determine the names that the shared folders show in the namespace. When a user views the namespace, the folders appear to reside on a single, high-capacity hard disk. Users can navigate the namespace without needing to know the server names or shared folders hosting the data. The path to a namespace is similar to a Universal Naming Convention (UNC) path of a shared folder, such as \\Server1\Public\Software\Tools. If you are familiar with UNC paths, you know that in this example the shared folder, Public, and its subfolders, Software and Tools, are all hosted on Server1. Now, assume you want to give users a single place to locate data, but you want to host data on different servers for availability and performance purposes. To do this, you can deploy a namespace similar to the one shown in the following figure. The elements of this namespace are described after the figure.
Namespace server. A namespace server hosts a namespace. The namespace server can be a member server or a domain controller.
Namespace root. The root is the starting point of the namespace. In the previous figure, the name of the root is Public, and the namespace path is \\Contoso\Public. This type of namespace is known as a domain-based namespace, because it begins with a domain name (for example, Contoso) and its metadata is stored in AD DS. Although a single namespace server is shown in the previous figure, a domain-based namespace can be hosted on multiple namespace servers.
Folder. Folders help build the namespace hierarchy. Folders can optionally have folder targets. When users browse a folder with targets in the namespace, the client computer receives a referral that directs the client computer to one of the folder targets.
Folder targets. A folder target is a UNC path of a shared folder or another namespace that is associated with a folder in a namespace. In the previous figure, the folder named Tools has two folder targets, one in London and one in New York, and the folder named Training Guides has a single folder target in New York. A user who browses to \\Contoso\Public\Software\Tools is transparently redirected to the shared folder \\LDN-SVR-01\Tools or \\NYC-SVR-01\Tools, depending on which site the user is in.
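For reference, answer A's per-branch namespace servers can also be added to an existing domain-based namespace from the command line with dfsutil, not just from the snap-in. A sketch reusing the example namespace above (the branch server name is made up; double-check the syntax against your dfsutil version):

    REM Add a branch office server as an additional namespace server for \\Contoso\Public
    dfsutil root adddom \\BRANCH-SVR-01\Public
    REM Add a local folder target so branch clients are referred to local data
    dfsutil target add \\Contoso\Public\Software\Tools \\BRANCH-SVR-01\Tools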