Proper study guides for the Microsoft 70-464 (Developing Microsoft SQL Server 2012 Databases) certification begin with Microsoft 70-464 preparation products designed to deliver validated 70-464 questions and help you pass the 70-464 test on your first attempt. Try the free 70-464 demo right now.
Q61. (Topic 7)
You need to implement a new version of usp_AddMobileLocation. Develop the solution by selecting and arranging the required code blocks in the correct order. You may not need all of the code blocks.
Answer:
Q62. You need to recommend a solution that resolves the concurrency problems.
What should you include in the recommendation?
A. Modify the stored procedures to use the SERIALIZABLE isolation level.
B. Modify the order in which usp_AcceptCandidate accesses the Applications table and the Candidates table.
C. Modify the order in which usp_UpdateCandidate accesses the Applications table and the Candidates table.
D. Modify the stored procedures to use the REPEATABLE READ isolation level.
Answer: C
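Answer C targets the classic deadlock pattern in which two procedures touch the same two tables in opposite order. The procedure bodies appear only in the case-study exhibits, so the following T-SQL is a hedged sketch with hypothetical columns; the point is simply that usp_UpdateCandidate is rewritten to acquire locks on the tables in the same order that usp_AcceptCandidate uses.

-- Sketch only: columns and the concrete order are assumptions. Both
-- procedures must take locks on Candidates and Applications in the same
-- sequence so their lock requests cannot interleave into a deadlock.
ALTER PROCEDURE dbo.usp_UpdateCandidate
    @CandidateID int,
    @ApplicationID int
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;
        UPDATE dbo.Candidates               -- first table, matching usp_AcceptCandidate
        SET LastUpdated = SYSDATETIME()
        WHERE CandidateID = @CandidateID;

        UPDATE dbo.Applications             -- second table, matching usp_AcceptCandidate
        SET StatusID = 2
        WHERE ApplicationID = @ApplicationID;
    COMMIT TRANSACTION;
END;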
Topic 6, Coho Winery
Overview
You are a database developer for a company named Coho Winery. Coho Winery has an office in London.
Coho Winery has an application that is used to process purchase orders from customers and retailers in 10 different countries.
The application uses a web front end to process orders from the Internet. The web front end adds orders to a database named Sales. The Sales database is managed by a server named Server1.
An empty copy of the Sales database is created on a server named Server2 in the London office. The database will store sales data for customers in Europe.
A new version of the application is being developed. In the new version, orders will be placed either by using the existing web front end or by loading an XML file.
Once a week, you receive two files that contain the purchase orders and the order details of orders from offshore facilities.
You run the usp_ImportOrders stored procedure and the usp_ImportOrderDetails stored procedure to copy the offshore facility orders to the Sales database.
The Sales database contains a table named Orders that has more than 20 million rows.
Database Definitions
Database and Tables
The following scripts are used to create the database and its tables:
Stored Procedures
The following are the definitions of the stored procedures used in the database:
Indexes
The following indexes are part of the Sales database:
Data Import
The XML files will contain the list of items in each order. Each retailer will have its own XML schema and will be able to use different types of encoding. Each XML schema will use a default namespace. The default namespaces are not guaranteed to be unique.
For testing purposes, you receive an XSD file from a customer.
For testing purposes, you also create an XML schema collection named ValidateOrder. ValidateOrder contains schemas for all of the retailers.
The new version of the application must validate the XML file, parse the data, and store the parsed data along with the original XML file in the database. The original XML file must be stored without losing any data.
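The exhibit code is not reproduced here, but a common pattern for this requirement is to keep the incoming document in an untyped xml column (typed xml storage can normalize away whitespace and other details) while validation is performed by assigning the value to a variable typed with the ValidateOrder collection. The table, column, and namespace names below are assumptions.

-- Hypothetical staging table: the untyped xml column preserves the
-- original document exactly; the typed variable enforces ValidateOrder.
CREATE TABLE dbo.OrderImport
(
    OrderImportID int IDENTITY(1,1) PRIMARY KEY,
    OriginalXml   xml NOT NULL,                           -- untyped: no data loss
    ReceivedAt    datetime2(0) NOT NULL DEFAULT SYSDATETIME()
);
GO
DECLARE @raw xml = N'<Order xmlns="urn:retailer1"><Item ProductID="1" /></Order>';
DECLARE @validated xml(ValidateOrder);
SET @validated = @raw;                                    -- fails if the document violates any schema
INSERT dbo.OrderImport (OriginalXml) VALUES (@raw);       -- store the original bytes unchanged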
Reported Issues
Performance Issues
You notice the following for the usp_GetOrdersAndItems stored procedure:
The stored procedure takes a long time to complete.
Less than two percent of the rows in the Orders table are retrieved by usp_GetOrdersAndItems.
A full table scan runs when the stored procedure executes.
The amount of disk space used and the amount of time required to insert data are very high.
You notice that the usp_GetOrdersByProduct stored procedure uses a table scan when the stored procedure is executed.
Page Split Issues
Updates to the Orders table cause excessive page splits on the IX_Orders_ShipDate index.
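Page splits on an index like IX_Orders_ShipDate are typically reduced by rebuilding it with a lower fill factor so leaf pages retain free space for rows inserted out of key order. The 80 percent value below is only an illustrative assumption.

-- Leave roughly 20% free space per leaf page; ONLINE = ON requires Enterprise Edition.
ALTER INDEX IX_Orders_ShipDate ON dbo.Orders
REBUILD WITH (FILLFACTOR = 80, ONLINE = ON);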
Requirements
Site Requirements
Users located in North America must be able to view sales data for customers in North America and Europe in a single report. The solution must minimize the amount of traffic over the WAN link between the offices.
.. ..
Bulk Insert Requirements
The usp_ImportOrderDetails stored procedure takes more than 10 minutes to complete. The stored procedure runs daily. If the stored procedure fails, you must ensure that the stored procedure restarts from the last successful set of rows.
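One way to satisfy the restart requirement, sketched below under the assumption of a hypothetical staging table and checkpoint table (the real usp_ImportOrderDetails body is shown only as an exhibit), is to commit the import in key-ordered batches and record the last key committed, so a rerun resumes where the previous execution stopped.

DECLARE @lastKey int =
    (SELECT ISNULL(MAX(LastOrderDetailID), 0) FROM dbo.ImportCheckpoint);
DECLARE @maxKey int;

WHILE 1 = 1
BEGIN
    -- Upper bound of the next 5,000-row batch, in key order.
    SELECT @maxKey = MAX(k.OrderDetailID)
    FROM (SELECT TOP (5000) OrderDetailID
          FROM staging.OrderDetails
          WHERE OrderDetailID > @lastKey
          ORDER BY OrderDetailID) AS k;

    IF @maxKey IS NULL BREAK;              -- nothing left to import

    BEGIN TRANSACTION;
        INSERT dbo.OrderDetails (OrderDetailID, OrderID, ProductID, Quantity)
        SELECT OrderDetailID, OrderID, ProductID, Quantity
        FROM staging.OrderDetails
        WHERE OrderDetailID > @lastKey AND OrderDetailID <= @maxKey;

        UPDATE dbo.ImportCheckpoint SET LastOrderDetailID = @maxKey;
    COMMIT TRANSACTION;                    -- batch and checkpoint succeed together

    SET @lastKey = @maxKey;
END;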
Index Monitoring Requirements
The usage of indexes in the Sales database must be monitored continuously. Monitored data must be maintained if a server restarts. The monitoring solution must minimize the usage of memory resources and processing resources.
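sys.dm_db_index_usage_stats is cleared whenever the instance restarts, so the usual low-overhead approach is a SQL Server Agent job that periodically appends the DMV contents to a permanent table. The history table name and schedule below are assumptions.

-- Run periodically (for example, hourly) from a SQL Server Agent job.
INSERT dbo.IndexUsageHistory
        (CaptureTime, DatabaseID, ObjectID, IndexID,
         UserSeeks, UserScans, UserLookups, UserUpdates)
SELECT SYSDATETIME(), database_id, object_id, index_id,
       user_seeks, user_scans, user_lookups, user_updates
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('Sales');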
Q63. You need to implement a solution that meets the site requirements.
What should you implement?
A. A non-indexed view on Server1
B. A non-indexed view on Server2
C. A distributed view on Server1
D. A distributed view on Server2
Answer: C
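As a hedged illustration of answer C (table and column names are assumptions, and Server2 is assumed to be registered as a linked server with CHECK constraints on the partitioning column), a distributed view on Server1 combines the local North American data with the European data stored in London.

CREATE VIEW dbo.OrdersAllRegions
AS
SELECT OrderID, CustomerID, Region, OrderTotal
FROM dbo.Orders                          -- local data on Server1
UNION ALL
SELECT OrderID, CustomerID, Region, OrderTotal
FROM Server2.Sales.dbo.Orders;           -- remote data on Server2 (London)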
Q64. (Topic 8)
You have a SQL Server 2012 database named DB1. DB1 contains four filegroups named FG1, FG2, FG3, and FG4. You execute the following code:
Two million rows are added to dbo.Sales.
You need to move the data from the first partition to a new table named SalesHistory and, starting on December 31, 2012, repartition dbo.Sales to support new sales data for three months.
Which code segment should you execute?
To answer, move the appropriate code segments from the list of code segments to the answer area and arrange them in the correct order.
Answer:
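The graded drag-and-drop sequence appears only in the answer area; purely as a general illustration of the statements involved (the partition function, scheme, boundary values, and filegroup below are assumptions), moving out the oldest partition and extending the range looks like this:

-- Metadata-only move of partition 1 into the (empty, identically structured) history table.
ALTER TABLE dbo.Sales SWITCH PARTITION 1 TO dbo.SalesHistory;
-- Remove the now-empty boundary, then add a new boundary for the next quarter.
ALTER PARTITION FUNCTION pfSales() MERGE RANGE ('2012-10-01');
ALTER PARTITION SCHEME psSales NEXT USED FG4;
ALTER PARTITION FUNCTION pfSales() SPLIT RANGE ('2013-03-31');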
Q65. You plan to create a stored procedure that inserts data from an XML file to the OrderDetails table. The following is the signature of the stored procedure:
The following is the XSD file used to create the ValidateOrder schema collection:
You develop a code segment that retrieves the number of items and loops through each item. Each time the loop runs, a variable named @itemNumber is incremented.
You need to develop a code segment that retrieves the product ID of each item number in the loop.
Which code segment should you develop?
A. SET @productID = @items.value('/Root/Product/productID', 'int')
B. SET @productID = @items.value('/Root/Product[' + @itemNumber + ']/@productID', 'int')
C. SET @productID = @items.value('/Root/Product[' + @itemNumber + ']/productID', 'int')
D. SET @productID = @items.value('/Root/Product/@productID', 'int')
Answer: B
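Outside of the answer choices, position-based XQuery access is normally written with sql:variable() rather than string concatenation; the following minimal, self-contained sketch of that pattern mirrors the /Root/Product/@productID path from option B.

DECLARE @items xml =
    N'<Root><Product productID="11" /><Product productID="12" /></Root>';
DECLARE @itemNumber int = 2, @productID int;

-- sql:variable() injects the loop counter into the XQuery predicate.
SET @productID = @items.value(
    '(/Root/Product[position() = sql:variable("@itemNumber")]/@productID)[1]',
    'int');
SELECT @productID;   -- returns 12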
Topic 7, Fourth Coffee
Background
Corporate Information
Fourth Coffee is a global restaurant chain. There are more than 5,000 locations worldwide.
Physical Locations
Currently a server at each location hosts a SQL Server 2012 instance. Each instance contains a database called StoreTransactions that stores all transactions from point of sale and uploads summary batches nightly.
Each server belongs to the COFFECORP domain. Local computer accounts access the StoreTransactions database at each store using sysadmin and datareaderwriter roles.
Planned changes
Fourth Coffee has three major initiatives:
The IT department must consolidate the point of sales database infrastructure.
The marketing department plans to launch a mobile application for micropayments.
The finance department wants to deploy an internal tool that will help detect fraud.
Initially, the mobile application will allow customers to make micropayments to buy coffee and other items on the company web site. These micropayments may be sent as gifts to other users and redeemed within an hour of ownership transfer. Later versions will generate profiles based on customer activity that will push texts and ads generated by an analytics application.
When the consolidation is finished and the mobile application is in production, the micropayments and point of sale transactions will use the same database.
...
Existing Environment
Existing Application Environment
Some stores have been using several pilot versions of the micropayment application. Each version currently is in a database that is independent from the point of sales systems. Some versions have been used in field tests at local stores, and others are hosted at corporate servers. All pilot versions were developed by using SQL Server 2012.
Existing Support Infrastructure
The proposed database for consolidating micropayments and transactions is called CoffeeTransactions. The database is hosted on a SQL Server 2014 Enterprise Edition instance and has the following file structures:
Business Requirements
General Application Solution Requirements
The database infrastructure must support a phased global rollout of the micropayment application and consolidation.
The micropayment and point of sale data will be consolidated into the CoffeeTransactions database. The infrastructure also will include a new CoffeeAnalytics database for reporting on content from CoffeeTransactions.
Mobile applications will interact most frequently with the micropayment database for the following activities:
. Retrieving the current status of a micropayment
. Modifying the status of the current micropayment
. Canceling the micropayment
The mobile application will need to meet the following requirements:
. Communicate with web services that assign a new user to a micropayment by using a stored procedure named usp_AssignUser.
. Update the location of the user by using a stored procedure named usp_AddMobileLocation.
The fraud detection service will need to meet the following requirements:
. Query the current open micropayments for users who own multiple micropayments by using a stored procedure named usp_LookupConcurrentUsers.
. Persist the current user locations by using a stored procedure named usp_MobileLocationSnapshot.
. Look at the status of micropayments and mark micropayments for internal investigations.
. Move micropayments to the dbo.POSException table by using a stored procedure named usp_DetectSuspiciousActivity.
. Detect micropayments that are flagged with a StatusId value that is greater than 3 and that occurred within the last minute (see the sketch after this list).
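The POSTransaction schema appears only in the exhibits, so the following detection query is a sketch with an assumed TransactionTime column.

SELECT MicropaymentID, StatusId, TransactionTime
FROM dbo.POSTransaction
WHERE StatusId > 3
  AND TransactionTime >= DATEADD(MINUTE, -1, SYSDATETIME());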
The CoffeeAnalytics database will combine imports of the POSTransaction and MobileLocation tables to create a UserActivity table for reports on the trends in activity. Queries against the UserActivity table will include aggregated calculations on all columns that are not used in filters or groupings.
Micropayments need to be updated and queried for only a week after their creation by the mobile application or fraud detection services.
Performance
The most critical performance requirement is keeping the response time for any queries of the POSTransaction table predictable and fast.
Web service queries will take priority over the fraud detection agent queries in performance tuning decisions.
Scalability
Queries of the user of a micropayment cannot return while the micropayment is being updated, but can show different users during different stages of the transaction.
The fraud detection service frequently will run queries over the micropayments that occur over different time periods that range between 30 seconds and ten minutes.
The POSTransaction table must have its structure optimized for hundreds of thousands of active micropayments that are updated frequently.
All changes to the POSTransaction table will require testing in order to confirm the expected throughput that will support the first year's performance requirements.
Updates of a user's location can tolerate some data loss.
Initial testing has determined that the POSTransaction and POSException tables will be migrated to memory-optimized tables.
Availability
In order to minimize disruption at local stores during consolidation, nightly processes will restore the databases to a staging server at corporate headquarters.
Technical Requirements
Security
The sensitive nature of financial transactions in the store databases requires certification of the COFFECORP\Auditors group at corporate that will perform audits of the data. Members of the COFFECORP\Auditors group cannot have sysadmin or datawriter access to the database. Compliance requires that the data stewards have access to any restored StoreTransactions database without changing any security settings at a database level.
Nightly batch processes are run by the services account in the COFFECORP\StoreAgent group and need to be able to restore the store databases and verify that their schemas match.
No Windows group should have more access to store databases than is necessary.
Maintainability
You need to anticipate when the POSTransaction table will need index maintenance.
When the daily maintenance finishes, micropayments that are one week old must be available for queries in the UserActivity table. Micropayments will be queried most frequently within their first week and require support for in-memory queries for data within that first week.
The maintenance of the UserActivity table must allow frequent maintenance on the day's most recent activities with minimal impact on the use of disk space and the resources available to queries. The processes that add data to the UserActivity table must be able to update data from any time period, even while maintenance is running.
The index maintenance strategy for the UserActivity table must provide the optimal structure for both maintainability and query performance.
All micropayment queries must use the most permissive isolation level available for maximum throughput.
In the event of unexpected results, all stored procedures must return error messages as text to the calling web service.
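A minimal sketch of that error-handling requirement (the procedure, parameter, and column names are assumptions) wraps the work in TRY...CATCH and hands the error text back to the web service.

CREATE PROCEDURE dbo.usp_AssignUser_Sketch
    @UserID int,
    @MicropaymentID bigint,
    @ErrorText nvarchar(2048) OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        UPDATE dbo.POSTransaction
        SET UserID = @UserID
        WHERE MicropaymentID = @MicropaymentID;   -- hypothetical columns
        SET @ErrorText = NULL;
    END TRY
    BEGIN CATCH
        SET @ErrorText = ERROR_MESSAGE();         -- plain text for the calling web service
    END CATCH;
END;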
Any modifications to stored procedures will require the minimal amount of schema changes necessary to increase the performance.
Performance
Stress testing of the mobile application on the proposed CoffeeTransactions database uncovered performance bottlenecks. The sys.dm_os_wait_stats Dynamic Management View (DMV) shows high wait_time values for the WRITELOG and PAGEIOLATCH_UP wait types when updating the MobileLocation table.
Updates to the MobileLocation table must have minimal impact on physical resources.
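Because the location updates can tolerate some data loss and the waits point at transaction log writes, one common SQL Server 2014 approach is a non-durable memory-optimized table. This is a hedged sketch with assumed columns; the database needs a MEMORY_OPTIMIZED_DATA filegroup.

CREATE TABLE dbo.MobileLocation_InMemory
(
    MobileLocationID bigint IDENTITY(1,1) NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    UserID           int           NOT NULL,
    Latitude         decimal(9, 6) NOT NULL,
    Longitude        decimal(9, 6) NOT NULL,
    RecordedAt       datetime2(0)  NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);   -- no log writes; data is lost on restart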
Supporting Infrastructure
The stored procedure usp_LookupConcurrentUsers has the current implementation:
The current stored procedure for persisting a user location is defined in the following code:
The current stored procedure for managing micropayments needing investigation is defined in the following code:
The current table, before implementing any performance enhancements, is defined as follows:
Q66. (Topic 8)
You are planning two stored procedures named SProc1 and SProc2.
You identify the following requirements:
. SProc1 must return a table.
. SProc2 must return a scalar value.
You need to identify which option must be implemented for each stored procedure to return the desired data.
Which options should you identify?
To answer, drag the appropriate option to the correct requirement in the answer area. (Answer choices may be used once, more than once, or not at all.)
Answer:
Q67. While testing usp_GetFutureSessions, you discover that IX_Sessions is accessed by a scan rather than a seek.
You need to minimize the amount of time it takes to execute usp_GetFutureSessions.
What should you do? (Each correct answer presents part of the solution. Choose all that apply.)
A. Option A
B. Option B
C. Option C
D. Option D
E. Option E
F. Option F
Answer: B,E
Explanation: Future delivery dates.
Q68. Which data type should you use for ProductType?
A. varchar(11)
B. nvarchar(11)
C. char(11)
D. bigint
Answer: C
Topic 5, Litware, Inc
Overview
General Overview
You are a database developer for a company named Litware, Inc. Litware has a main office in Miami.
Litware has a job posting web application named WebApp1. WebApp1 uses a database named DB1. DB1 is hosted on a server named Server1. The database design of DB1 is shown in the exhibit. (Click the Exhibit button.)
WebApp1 allows a user to log on as a job poster or a job seeker. Candidates can search for job openings based on keywords, apply to an opening, view their application, and load their resume in Microsoft Word format. Companies can add a job opening, view the list of candidates who applied to an opening, and mark an application as denied.
Users and Roles
DB1 has five database users named Company, CompanyWeb, Candidate, CandidateWeb, and Administrator.
DB1 has three user-defined database roles. The roles are configured as shown in the following table.
Keyword Search
The keyword searches for the job openings are performed by using the following stored procedure named usp_GetOpenings:
Opening Update
Updates to the Openings table are performed by using the following stored procedure named usp_UpdateOpening:
Problems and Reported Issues
Concurrency Problems
You discover that deadlocks frequently occur.
You identify that a stored procedure named usp_AcceptCandidate and a stored procedure named usp_UpdateCandidate generate deadlocks. The following is the code for usp_AcceptCandidate:
Salary Query Issues
Users report that when they perform a search for job openings without specifying a minimum salary, only job openings that specify a minimum salary are displayed.
Log File Growth Issues
The current log file for DB1 grows constantly. The log file fails to shrink even when the daily SQL Server Agent Shrink Database task runs.
Performance Issues
You discover that a stored procedure named usp_ExportOpenings takes a long time to run and executes a table scan when it runs.
You also discover that the usp_GetOpenings stored procedure takes a long time to run and that the non-clustered index on the Description column is not being used.
Page Split Issues
On DB1, the number of page splits per second spikes every few minutes.
Requirements
Security and Application Requirements
Litware identifies the following security and application requirements:
. Only the Administrator, Company, and CompanyWeb database users must be able to execute the usp_UpdateOpening stored procedure.
. Changes made to the database must not affect WebApp1.
Locking Requirements
Litware identifies the following locking requirements:
. The usp_GetOpenings stored procedure must not be blocked by the usp_UpdateOpening stored procedure.
. If a row is locked in the Openings table, usp_GetOpenings must retrieve the latest version of the row, even if the row was not committed yet.
Integration Requirements
Litware exports its job openings to an external company as XML data. The XML data uses the following format:
A stored procedure named usp_ExportOpenings will be used to generate the XML data. The following is the code for usp_ExportOpenings:
The stored procedure will be executed by a SQL Server Integration Services (SSIS) package named Package1.
The XML data will be written to a secured folder named Folder1. Only a dedicated Active Directory account named Account1 is assigned the permissions to read from or write to Folder1.
Refactoring Requirements
Litware identifies the following refactoring requirements:
. New code must be written by reusing the following query:
. The results from the query must be able to be joined to other queries.
Upload Requirements
Litware requires users to upload their job experience in a Word file by using WebApp1. WebApp1 will send the Word file to DB1 as a stream of bytes. DB1 will then convert the Word file to text before the contents of the Word file is saved to the Candidates table.
A database developer creates an assembly named Conversions that contains the following:
. A class named Convert in the SqlConversions namespace
. A method named ConvertToText in the Convert class that converts Word files to text
The ConvertToText method accepts a stream of bytes and returns text. The method is used in the following stored procedure:
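The stored procedure itself is shown only as an exhibit; as a hedged sketch, a SQL CLR method such as SqlConversions.Convert.ConvertToText is typically surfaced through CREATE ASSEMBLY and a CLR scalar function, which the upload procedure then calls. The file path, function name, and parameter below are assumptions.

-- Register the Conversions assembly and expose ConvertToText as a CLR function.
CREATE ASSEMBLY Conversions
FROM 'C:\Assemblies\Conversions.dll'        -- hypothetical path
WITH PERMISSION_SET = SAFE;
GO
CREATE FUNCTION dbo.ConvertToText (@document varbinary(max))
RETURNS nvarchar(max)
AS EXTERNAL NAME Conversions.[SqlConversions.Convert].ConvertToText;
GO
-- Inside the upload procedure, the incoming byte stream would be converted
-- before it is stored in the Candidates table, for example:
-- UPDATE dbo.Candidates SET ResumeText = dbo.ConvertToText(@resumeBytes) WHERE ...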
Job Application Requirements
A candidate can only apply to each job opening once.
Data Recovery Requirements
All changes to the database are performed by using stored procedures. WebApp1 generates a unique transaction ID for every stored procedure call that the application makes to the database.
If a server fails, you must be able to restore data to a specified transaction.
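Restoring to a specified transaction typically relies on full recovery plus marked transactions, with the application-generated transaction ID used as the transaction (mark) name. The ID value and backup paths below are assumptions.

-- In each stored procedure: mark the transaction with the WebApp1 ID
-- (assume WebApp1 generated the ID 'TX0001' for the call in question).
DECLARE @TranName nvarchar(32) = N'TX0001';
BEGIN TRANSACTION @TranName WITH MARK 'WebApp1 call';
    -- data modifications performed by the stored procedure
COMMIT TRANSACTION @TranName;

-- Recovery: restore the full backup without recovery, then stop the log
-- restore at the mark that matches the requested transaction ID.
RESTORE DATABASE DB1 FROM DISK = N'D:\Backups\DB1.bak' WITH NORECOVERY;
RESTORE LOG DB1 FROM DISK = N'D:\Backups\DB1_log.trn'
    WITH STOPATMARK = 'TX0001', RECOVERY;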
Q69. (Topic 8)
You have a database that contains three tables. The tables are configured as shown in the following table.
You have the following query:
The execution plan for the query is shown in the exhibit. (Click the Exhibit button.)
You need to create one index to minimize the amount of time it takes to execute the query.
What should you do?
To answer, drag the appropriate columns to the correct locations in the answer area.
(Answer choices may be used once, more than once, or not at all.)
Answer:
Q70. You have a Microsoft SQL Azure database that contains a table named Employees.
You create a non-clustered index named EmployeeName on the name column.
You write the following query to retrieve all of the employees that have a name that starts with the letters JOH:
You discover that the query performs a table scan.
You need to ensure that the query uses EmployeeName.
What should you do?
A. Recreate EmployeeName as a unique index
B. Recreate EmployeeName as a clustered index
C. Replace LEFT(name,3) = 'JOH' by using name like 'JOH%'
D. Replace LEFT(name,3) = 'JOH' by using substring(name, 1, 3) = 'JOH'
Answer: C
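For reference, the rewrite in answer C works because a leading-constant LIKE pattern is sargable, while LEFT(name, 3) wraps the indexed column in a function and forces a scan. A minimal sketch against the Employees table:

-- The optimizer can seek on the EmployeeName index for a prefix pattern.
SELECT name
FROM dbo.Employees
WHERE name LIKE 'JOH%';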