
Examples of 70-646 cbt nuggets free download




It is impossible to pass the Microsoft 70-646 exam in the short term without help. Come to Exambible soon and find the most advanced, accurate, and guaranteed Microsoft 70-646 practice questions. You will get surprising results from our renewed PRO: Windows Server 2008, Server Administrator practice guides.

2021 Nov ms 70-646:

Q61. - (Topic 1) 

Your network consists of a single Active Directory domain. All servers run Windows Server 2008 R2. A server named Server1 has the Remote Desktop Services server role installed. You notice that several users consume more than 30 percent of the CPU resources throughout the day. You need to prevent users from consuming more than 15 percent of the CPU resources. Administrators must not be limited by the amount of CPU resources that they can consume. 

What should you do? 

A. Implement Windows System Resource Manager (WSRM), and configure user policies. 

B. Implement Windows System Resource Manager (WSRM), and configure session policies. 

C. Configure Performance Monitor, and create a user-defined Data Collector Set. 

D. Configure Performance Monitor, and create an Event Trace Session Data Collector Set. 

Answer: A

Explanation: 

You can use tools such as the Windows System Resource Manager and Performance Monitor to determine memory and processor usage of Terminal Services clients. Once you understand how the Terminal Server’s resources are used, you can determine the necessary hardware resources and make a good estimate as to the Terminal Server’s overall client capacity. Terminal Server capacity directly influences your deployment plans: A server that has a capacity of 100 clients is not going to perform well when more than 250 clients attempt to connect. Monitoring tools are covered in more detail in “Monitoring Terminal Services” later in this lesson. 

Windows System Resource Manager 

Windows System Resource Manager (WSRM) is a feature that you can install on a Windows Server 2008 computer that controls how resources are allocated. The WSRM console, shown in Figure 5-9, allows an administrator to apply WSRM policies. WSRM includes four default policies and also allows administrators to create their own. The two policies that will most interest you as someone responsible for planning and deploying Terminal Services infrastructure are Equal_Per_User and Equal_Per_Session. The Equal_Per_User WSRM policy ensures that each user is allocated resources equally, even when one user has more sessions connected to the Terminal Server than other users. Apply this policy when you allow users to have multiple sessions to the Terminal Server—it stops any one user from monopolizing hardware resources by opening multiple sessions. The Equal_Per_Session policy ensures that each session is allocated resources equally. If applied on a Terminal Server where users are allowed to connect with multiple sessions, this policy can allow those users to gain access to a disproportionate amount of system resources in comparison to users with single sessions. 
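To make the difference between the two default policies concrete, here is a minimal Python sketch. It is not WSRM's actual allocation algorithm, and the user names and session counts are hypothetical; it only illustrates how CPU would be divided when one user opens multiple sessions under each policy.

```python
# Simplified illustration (not WSRM's real algorithm) of how the two default
# policies divide CPU among connected users when one user opens extra sessions.

def equal_per_user(sessions_per_user):
    """Each user gets an equal share, regardless of how many sessions they open."""
    share = 100 / len(sessions_per_user)
    return {user: share for user in sessions_per_user}

def equal_per_session(sessions_per_user):
    """Each session gets an equal share, so multi-session users receive more."""
    total_sessions = sum(sessions_per_user.values())
    return {user: 100 * count / total_sessions
            for user, count in sessions_per_user.items()}

# Hypothetical load: alice has opened three sessions, bob and carol one each.
load = {"alice": 3, "bob": 1, "carol": 1}
print(equal_per_user(load))     # alice, bob, carol each get ~33.3%
print(equal_per_session(load))  # alice gets 60%, bob and carol 20% each
```

Under Equal_Per_Session, the multi-session user monopolizes resources; Equal_Per_User prevents exactly that, which is why it is the policy to apply when users are allowed multiple sessions.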


Q62. - (Topic 6) 

You need to recommend a security solution for the documents in the finance department. The solution must meet the company's security requirements. 

What should you include in the recommendation? 

A. access-based enumeration (ABE) and Encrypting File System (EFS) 

B. access-based enumeration (ABE) and Windows BitLocker Drive Encryption (BitLocker) 

C. Active Directory Rights Management Services (AD RMS) 

D. File Server Resource Manager (FSRM) file screens 

Answer: C

Explanation: 

http://technet.microsoft.com/en-us/library/dd996658%28WS.10%29.aspx Rights policy templates are used to control the rights that a user or group has on a particular piece of rights-protected content. Active Directory Rights Management Services (AD RMS) stores rights policy templates in the configuration database. Optionally, it may maintain a copy of all rights policy templates in a shared folder that you specify. 


Q63. - (Topic 1) 

Your network contains an Active Directory forest named contoso.com. 

You plan to deploy a new child domain named branch.contoso.com. The child domain will contain two domain controllers. Both domain controllers will have the DNS Server server role installed. All users and computers in the branch office will be members of the branch.contoso.com domain. 

You need to plan the DNS infrastructure for the child domain to meet the following requirements: 

• Ensure resources in the root domain are accessible by fully qualified domain names. 
• Ensure resources in the child domain are accessible by fully qualified domain names. 
• Provide name resolution services in the event that a single server fails for a prolonged period of time. 
• Automatically recognize when new DNS servers are added to or removed from the contoso.com domain. 

What should you include in your plan? 

A. On both domain controllers, add a conditional forwarder for contoso.com and create a standard primary zone for branch.contoso.com. 

B. On both domain controllers, modify the root hints to include the domain controllers for contoso.com. On one domain controller, create an Active Directory-integrated zone for branch.contoso.com. 

C. On one domain controller create an Active Directory-integrated zone for branch.contoso.com and create an Active Directory-integrated stub zone for contoso.com. 

D. On one domain controller, create a standard primary zone for contoso.com. On the other domain controller, create a standard secondary zone for contoso.com. 

Answer: C

Explanation: 

http://technet.microsoft.com/en-us/library/cc772101.aspx http://technet.microsoft.com/en-us/library/cc771898.aspx 

Understanding DNS Zone Replication in Active Directory Domain Services 

Applies To: Windows Server 2008, Windows Server 2008 R2 

You can store Domain Name System (DNS) zones in the domain or application directory partitions of Active Directory Domain Services (AD DS). A partition is a data structure in AD DS that distinguishes data for different replication purposes. For more information, see Understanding Active Directory Domain Services Integration. The following table describes the available zone replication scopes for AD DS-integrated DNS zone data. 

When you decide which replication scope to choose, consider that the broader the replication scope, the greater the network traffic caused by replication. For example, if you decide to have AD DS-integrated DNS zone data replicated to all DNS servers in the forest, this will produce greater network traffic than replicating the DNS zone data to all DNS servers in a single AD DS domain in that forest. 

AD DS-integrated DNS zone data that is stored in an application directory partition is not replicated to the global catalog for the forest. The domain controller that contains the global catalog can also host application directory partitions, but it will not replicate this data to its global catalog. 

AD DS-integrated DNS zone data that is stored in a domain partition is replicated to all domain controllers in its AD DS domain, and a portion of this data is stored in the global catalog. This setting is used to support Windows 2000. 

If an application directory partition's replication scope replicates across AD DS sites, replication will occur with the same intersite replication schedule as is used for domain partition data. 

By default, the Net Logon service registers domain controller locator (Locator) DNS resource records for the application directory partitions that are hosted on a domain controller in the same manner as it registers domain controller locator (Locator) DNS resource records for the domain partition that is hosted on a domain controller. 

Primary zone 

When a zone that this DNS server hosts is a primary zone, the DNS server is the primary source for information about this zone, and it stores the master copy of zone data in a local file or in AD DS. When the zone is stored in a file, by default the primary zone file is named zone_name.dns and it is located in the %windir%\System32\Dns folder on the server. 

Secondary zone 

When a zone that this DNS server hosts is a secondary zone, this DNS server is a secondary source for information about this zone. The zone at this server must be obtained from another remote DNS server computer that also hosts the zone. This DNS server must have network access to the remote DNS server that supplies this server with updated information about the zone. Because a secondary zone is merely a copy of a primary zone that is hosted on another server, it cannot be stored in AD DS. 

Stub zone 

When a zone that this DNS server hosts is a stub zone, this DNS server is a source only for information about the authoritative name servers for this zone. The zone at this server must be obtained from another DNS server that hosts the zone. This DNS server must have network access to the remote DNS server to copy the authoritative name server information about the zone. 

You can use stub zones to: 

• Keep delegated zone information current. By updating a stub zone for one of its child zones regularly, the DNS server that hosts both the parent zone and the stub zone will maintain a current list of authoritative DNS servers for the child zone. 

• Improve name resolution. Stub zones enable a DNS server to perform recursion using the stub zone's list of name servers, without having to query the Internet or an internal root server for the DNS namespace. 

• Simplify DNS administration. By using stub zones throughout your DNS infrastructure, you can distribute a list of the authoritative DNS servers for a zone without using secondary zones. However, stub zones do not serve the same purpose as secondary zones, and they are not an alternative for enhancing redundancy and load sharing. 

There are two lists of DNS servers involved in the loading and maintenance of a stub zone: 

• The list of master servers from which the DNS server loads and updates a stub zone. A master server may be a primary or secondary DNS server for the zone. In both cases, it will have a complete list of the DNS servers for the zone. 

• The list of the authoritative DNS servers for a zone. This list is contained in the stub zone using name server (NS) resource records. When a DNS server loads a stub zone, such as widgets.tailspintoys.com, it queries the master servers, which can be in different locations, for the necessary resource records of the authoritative servers for the zone widgets.tailspintoys.com. The list of master servers may contain a single server or multiple servers, and it can be changed anytime. 
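In practice, a stub zone is doing little more than keeping a current copy of a zone's NS records. As a rough illustration of the same lookup an administrator might perform manually, the following sketch uses the third-party dnspython package (assumed to be installed); the zone name is the hypothetical widgets.tailspintoys.com from the text and may not resolve publicly.

```python
# Illustrative only: look up the NS records that a stub zone would keep current.
# Requires the third-party dnspython package (pip install dnspython).
import dns.resolver

def authoritative_name_servers(zone):
    """Return the name-server targets currently published for the zone."""
    answer = dns.resolver.resolve(zone, "NS")
    return sorted(str(record.target) for record in answer)

# Hypothetical delegated child zone taken from the example in the text;
# a real query would raise NXDOMAIN if the zone does not exist publicly.
for ns in authoritative_name_servers("widgets.tailspintoys.com"):
    print(ns)
```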


Q64. - (Topic 1) 

Your company has Windows Server 2008 R2 file servers. 

You need to recommend a data recovery strategy that meets the following requirements: 

•Backups must have a minimal impact on performance. 

•All data volumes on the file server must be backed up daily. 

•If a disk fails, the recovery strategy must allow individual files to be restored. 

•Users must be able to retrieve previous versions of files without the intervention of an administrator. 

What should you recommend? 

A. Deploy File Server Resource Manager (FSRM). Use Windows Server Backup to perform a daily backup to an external disk. 

B. Deploy Windows Automated Installation Kit (Windows AIK). Enable shadow copies for the volumes that contain shared user data. Store the shadow copies on a separate physical disk. 

C. Use Windows Server Backup to perform a daily backup to an external disk. Enable shadow copies for the volumes that contain shared user data. Store the shadow copies on a separate physical disk. 

D. Use Windows Server Backup to perform a daily backup to a remote network share. Enable shadow copies for the volumes that contain shared user data. Store the shadow copies in the default location. 

Answer: C

Explanation: 

Shadow Copies of Shared Folders 

Implementing Shadow Copies of Shared Folders will reduce an administrator’s restoration workload dramatically because it almost entirely eliminates the need for administrator intervention in the recovery of deleted, modified, or corrupted user files. Shadow Copies of Shared Folders work by taking snapshots of files stored in shared folders as they exist at a particular point in time. This point in time is dictated by a schedule; the default schedule for Shadow Copies of Shared Folders takes snapshots at 7:00 A.M. and 12:00 P.M. every weekday. Multiple schedules can be applied to a volume, and the default schedule is actually two schedules applied at the same time. 

To enable Shadow Copies of Shared Folders, open Computer Management from the Administrative Tools menu, right-click the Shared Folders node, click All Tasks and then click Configure Shadow Copies. This will bring up the Shadow Copies dialog box, shown in Figure 12-1. This dialog box allows you to enable and disable Shadow Copies on a per-volume basis. It allows you to edit the Shadow Copy of Shared Folder settings for a particular volume. It also allows you to create a shadow copy of a particular volume manually. 

Figure 12-1 Enabling Shadow Copies 

Enabling Shadow Copies on a volume will automatically generate an initial shadow copy for that volume. Clicking Settings launches the dialog box shown in Figure 12-2. From this dialog box, you can configure the storage area, the maximum size of the copy store, and the schedule of when copies are taken. Clicking Schedules allows you to configure how often shadow copies are generated. On volumes hosting file shares that contain files that are updated frequently, you would use a frequent shadow copy schedule. On a volume hosting file shares where files are updated less frequently, you should configure a less frequent shadow copy schedule. 

Figure 12-2 Shadow Copy settings 

When a volume regularly experiences intense read and write operations, such as a commonly used file share, you can mitigate the performance impact of Shadow Copies of Shared Folders by storing the shadow copy data on a separate volume. If a volume has less space available than the set limit, the service will remove the oldest shadow copies that it has stored as a way of freeing up space. Finally, no matter how much free space is available, a maximum of 64 shadow copies can be stored on any one volume. When you consider how scheduling might be configured for a volume, you will realize how this directly influences the length of shadow copy data retention. Where space is available, a schedule where shadow copies are taken once every Monday, Wednesday, and Friday allows shadow copies from 21 weeks previously to be retrieved. The default schedule allows for the retrieval of up to 6 weeks of previous shadow copies. 
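The retention figures quoted above follow directly from the 64-copy-per-volume limit; a quick arithmetic sketch in Python:

```python
# Quick arithmetic behind the retention figures above: at most 64 shadow copies
# are kept per volume, so the retention window is 64 divided by the number of
# copies the schedule creates each week.
MAX_COPIES_PER_VOLUME = 64

def retention_weeks(copies_per_week):
    return MAX_COPIES_PER_VOLUME / copies_per_week

# Mon/Wed/Fri schedule: 3 copies per week -> roughly 21 weeks of history.
print(round(retention_weeks(3)))    # 21
# Default schedule: 7:00 A.M. and 12:00 P.M. on weekdays -> 10 copies per week.
print(round(retention_weeks(10)))   # 6 (about 6 weeks)
```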

When planning the deployment of Shadow Copies of Shared Folders, it is important to remember that you configure settings on a per-volume basis. This means that the storage area, maximum size, and schedules for different volumes can be completely separate. If you plan shares in such a way that each volume hosts a single share, you can optimize the shadow copy settings for that share based on how the data is used, rather than trying to compromise in finding an effective schedule for very different shared folder usage patterns. 

Quick Check 
1. On what basis (server, volume, share, disk, or folder) are Shadow Copies of Shared Folders enabled? 
2. What happens to shadow copy data when the volume that hosts it begins to run out of space? 
Quick Check Answers 
1. Shadow Copies of Shared Folders are enabled on a per-volume basis. 
2. The oldest shadow copy data is automatically deleted when volumes begin to run out of space. 


Q65. - (Topic 12) 

You need to recommend a strategy to ensure that the administration of AD LDS is encrypted. 

What should you include in the recommendation? 

A. a server authentication certificate 

B. client authentication certificates 

C. Digest authentication 

D. Windows Integrated authentication 

Answer: A

Explanation: 

http://technet.microsoft.com/en-us/library/cc725767%28WS.10%29.aspx The Lightweight Directory Access Protocol (LDAP) is used to read from and write to Active Directory Lightweight Directory Services (AD LDS). By default, LDAP traffic is not transmitted securely. You can make LDAP traffic confidential and secure by using Secure Sockets Layer (SSL) / Transport Layer Security (TLS) technology. To enable SSL-based encrypted connections to AD LDS, you must request and obtain a server authentication certificate from a trusted certification authority (CA) in your organization or from a trusted third-party CA. For more information about installing and using a CA, see Certificate Services (http://go.microsoft.com/fwlink/?LinkID=48952). 
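Once the server authentication certificate is in place, administration tools and scripts can bind to the AD LDS instance over SSL/TLS. The following is only a sketch of such an encrypted LDAP bind using the third-party ldap3 package; the host name, service account, and port are hypothetical (636 is the conventional LDAPS port, but an AD LDS instance may listen on a custom SSL port chosen at setup time).

```python
# Minimal sketch of an LDAP bind over SSL/TLS using the third-party ldap3
# package (pip install ldap3). Host, port, and credentials are hypothetical.
from ldap3 import Server, Connection, ALL

server = Server("adlds01.contoso.com", port=636, use_ssl=True, get_info=ALL)
conn = Connection(server,
                  user="CONTOSO\\svc_adlds_admin",   # hypothetical service account
                  password="<password>",
                  auto_bind=True)                    # raises an exception if the bind fails

print(conn.bound)   # True once the encrypted bind has succeeded
conn.unbind()
```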


Avant-garde windows server 2008 administrator exam 70-646 lab manual:

Q66. - (Topic 7) 

You need to recommend a solution to ensure that all of the client computers that run Windows 7 meet the company's security requirements. 

What should you include in the recommendation? 

A. Encrypting File System (EFS) 

B. the AppLocker Group Policy settings 

C. the IPSec enforcement method 

D. Windows BitLocker Drive Encryption (BitLocker) 

Answer: D

Explanation: 

BitLocker Drive Encryption is a full-disk encryption feature. It is designed to protect data by providing encryption for entire volumes. By default, it uses the AES encryption algorithm in CBC mode with a 128-bit key, combined with the Elephant diffuser for additional disk-encryption-specific security not provided by AES. The latest version of BitLocker, included in Windows 7 and Windows Server 2008 R2, adds the ability to encrypt removable drives. These drives can be read, but not written, by Windows XP using the Microsoft BitLocker To Go Reader program. 
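To make the cipher and mode named above concrete, here is a small sketch of AES-128 in CBC mode using the third-party cryptography package. It is emphatically not BitLocker's implementation: it omits the Elephant diffuser, key protectors, and all volume metadata, and the plaintext is an arbitrary placeholder.

```python
# Illustration of AES-128 in CBC mode only -- not BitLocker itself (no Elephant
# diffuser, no key protectors, no volume handling). Uses the third-party
# "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)   # 128-bit key, as mentioned in the text
iv = os.urandom(16)    # CBC requires a per-message initialization vector

plaintext = b"placeholder sector contents"
padder = padding.PKCS7(algorithms.AES.block_size).padder()
padded = padder.update(plaintext) + padder.finalize()

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(padded) + encryptor.finalize()

# Round-trip check: decrypt and strip the padding again.
decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
unpadder = padding.PKCS7(algorithms.AES.block_size).unpadder()
padded_plain = decryptor.update(ciphertext) + decryptor.finalize()
assert unpadder.update(padded_plain) + unpadder.finalize() == plaintext
```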


Q67. - (Topic 5) 

You need to recommend changes to the DFS infrastructure that meet the company's security requirements. What should you recommend? 

A. Modify the NTFS permissions and the share permissions of the DFS targets. 

B. Modify the referrals settings of the DFS namespace and the NTFS permissions of the DFS targets. 

C. Migrate the namespace to Windows Server 2008 mode and modify the referrals settings. 

D. Migrate the namespace to Windows Server 2008 mode and enable access-based enumeration (ABE). 

Answer: D

Explanation: 

Access-based enumeration (ABE), which for a domain-based DFS namespace requires Windows Server 2008 mode, lets you hide files and folders from users who do not have permission to access them. 


Q68. - (Topic 14) 

You need to recommend a solution for deploying and managing App2. 

What should you recommend? 

A. Publish App2 as a RemoteApp program. 

B. Deploy App2 by using a Group Policy logon script. 

C. Assign App2 by using Group Policy software distribution. 

D. Publish App2 by using Group Policy software distribution. 

Answer:

Explanation: 

http://support.microsoft.com/kb/816102 

This step-by-step article describes how to use Group Policy to automatically distribute programs to client computers or users. You can use Group Policy to distribute computer programs by using the following methods: 

Assigning Software 

You can assign a program distribution to users or computers. If you assign the program to a user, it is installed when the user logs on to the computer. When the user first runs the program, the installation is finalized. If you assign the program to a computer, it is installed when the computer starts, and it is available to all users who log on to the computer. When a user first runs the program, the installation is finalized. 

Publishing Software 

You can publish a program distribution to users. When the user logs on to the computer, the published program is displayed in the Add or Remove Programs dialog box, and it can be installed from there. 


Q69. - (Topic 1) 

A company has file servers that run a Server Core installation of Windows Server 2008. 

You are designing the migration of the file servers to Windows Server 2008 R2. After the migration, you will install the Remote Desktop Services server role on the file servers. 

You need to ensure that shared resources on the file servers are available after the migration, and minimize administrative effort. 

What should you recommend? (More than one answer choice may achieve the goal. Select the BEST answer.) 

A. Move the shared resources off of the existing file servers. Perform a clean installation of Windows Server 2008 R2 on the file servers. Move the shared resources back onto the file servers. 

B. Upgrade the existing file servers to a Server Core installation of Windows Server 2008 R2, and then upgrade the file servers to a full installation of Windows Server 2008 R2. 

C. Deploy new file servers with Windows Server 2008 R2 installed. Migrate the shared resources to the new file servers. 

D. Deploy new file servers with a Server Core installation of Windows Server 2008 R2. Migrate the shared resources to the new file servers. 

Answer: D

Explanation: 

The key here is to minimize administrative effort, together with the Remote Desktop Services requirement. 

Server Core would not allow Remote Desktop Services because it has no GUI, so that would rule out answer A. You also cannot upgrade from a Server Core installation to a full installation; see http://www.windowsitpro.com/article/tips/can-i-upgrade-fromserver-core-2008-to-the-full-windows-server-2008- or http://serverfault.com/questions/92523/upgrade-fromwindows-2008-server-core-to-full-windows-2008-server. 

Upgrade considerations for Server Core installations of Windows Server 2008: 

• You can use the Server Core installation option only by performing a clean installation. 
• You cannot upgrade from earlier versions of Windows to Server Core installations of Windows Server 2008. 
• You cannot upgrade from non-Server Core installations of Windows Server 2008 to Server Core installations of Windows Server 2008. 
• You cannot convert Server Core installations of Windows Server 2008 to non-Server Core installations of Windows Server 2008. 
• You can upgrade Server Core installations of Windows Server 2008 only to Server Core installations of Windows Server 2008 R2. 

Answer C is possible, but again you are asked to minimize effort, so D is one step less, reducing your effort and possible downtime. 
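For quick reference, the upgrade rules listed above can be encoded as data. The sketch below is a purely hypothetical helper (the edition labels and function name are mine, and it performs no real migration work); it simply restates the supported in-place paths.

```python
# Hypothetical helper that encodes the upgrade rules listed above as data;
# it performs no real migration work and the edition labels are informal.
SUPPORTED_UPGRADES = {
    # (source, target): in-place upgrade supported?
    ("2008 Full", "2008 R2 Full"): True,
    ("2008 Server Core", "2008 R2 Server Core"): True,
    ("2008 Server Core", "2008 Full"): False,      # no Core-to-full conversion
    ("2008 Full", "2008 Server Core"): False,      # Core requires a clean installation
    ("2003", "2008 Server Core"): False,           # no upgrade from earlier Windows to Core
}

def can_upgrade_in_place(source, target):
    return SUPPORTED_UPGRADES.get((source, target), False)

print(can_upgrade_in_place("2008 Server Core", "2008 Full"))  # False -> an in-place path is out
```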


Q70. - (Topic 3) 

You need to recommend a solution for users in the branch office to access files in the main office. What should you include in the recommendation? 

A. a BranchCache server that operates in Distributed Cache mode 

B. a BranchCache server that operates in Hosted Cache mode 

C. a domain-based Distributed File System (DFS) namespace and DFS Replication 

D. a standalone Distributed File System (DFS) namespace and DFS Replication 

Answer: B

Explanation: 

http://technet.microsoft.com/en-us/library/dd755969%28WS.10%29.aspx 

Requirement: minimize the amount of time it takes for users in the branch offices to access files on the file servers in the main office. 

BranchCache is a feature in Windows 7 and Windows Server 2008 R2 that can reduce wide area network (WAN) utilization and enhance network application responsiveness when users access content in a central office from branch office locations. When you enable BranchCache, a copy of the content that is retrieved from the Web server or file server is cached within the branch office. If another client in the branch requests the same content, the client can download it directly from the local branch network without needing to retrieve the content by using the wide area network (WAN). This whitepaper provides an overview of BranchCache, explains the different modes in which BranchCache operates, and describes how BranchCache is configured. The paper also explains how BranchCache works with Web servers and file servers and the steps BranchCache takes to determine that the content is up-to-date. 

Hosted Cache mode 

The Hosted Cache is a central repository of data downloaded from BranchCache-enabled servers into the branch office by BranchCache-enabled clients. The configuration of Hosted Cache mode is described later in this document. Hosted Cache mode does not require a dedicated server. The BranchCache feature can be enabled on a server that is running Windows Server 2008 R2, which is located in a branch that is also running other workloads. In addition, BranchCache can be set up as a virtual workload and run on a server with other workloads, such as File and Print. Figure 2 illustrates Hosted Cache mode and provides a simplified illustration of the document caching and retrieval process.
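The following toy model illustrates the Hosted Cache idea described above. It is not the BranchCache protocol (there are no content hashes, peer discovery, or security handshakes, and the content name is a made-up placeholder); it only shows that a branch client consults a shared local cache first and crosses the WAN only on a miss.

```python
# Toy model of Hosted Cache mode (not the real BranchCache protocol): branch
# clients check a shared local cache first and only cross the WAN on a miss.
class HostedCache:
    def __init__(self):
        self._store = {}                 # content id -> data cached in the branch

    def get(self, content_id):
        return self._store.get(content_id)

    def put(self, content_id, data):
        self._store[content_id] = data

def fetch(content_id, hosted_cache, fetch_from_main_office):
    data = hosted_cache.get(content_id)
    if data is not None:
        return data, "served from the branch (no WAN traffic)"
    data = fetch_from_main_office(content_id)    # expensive WAN round trip
    hosted_cache.put(content_id, data)
    return data, "fetched over the WAN and cached in the branch"

cache = HostedCache()
wan = lambda cid: f"<contents of {cid}>"         # stand-in for the main-office file server
print(fetch("report.docx", cache, wan)[1])       # fetched over the WAN and cached in the branch
print(fetch("report.docx", cache, wan)[1])       # served from the branch (no WAN traffic)
```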