getcertified4sure.com

Top Microsoft 70-470 preparation labs Choices




Exam Code: 70-470 (Practice Exam Latest Test Questions VCE PDF)
Exam Name: Recertification for MCSE: Business Intelligence
Certification Provider: Microsoft
Free Today! Guaranteed Training- Pass 70-470 Exam.

2021 Dec 70-470 free practice questions

Q111. - (Topic 1) 

You need to identify the reports that produce the errors that Marc is receiving. 

What should you do? 

A. Write a query by using the Subscriptions table in the report server database. 

B. Use the Windows Event Viewer to search the Application log for errors. 

C. Write a query by using the ExecutionLog3 view in the report server database. 

D. Search the ReportServerService_<timestamp>.log file for errors. 

Answer:


Q112. - (Topic 5) 

You need to implement the aggregation designs for the cube. 

What should you do? 

A. Use the Usage-Based Optimization Wizard. 

B. Use the Aggregation Design Wizard. 

C. Partition the cube by month. 

D. Implement cache warming in SSAS via an SSIS package. 

Answer:


Q113. - (Topic 1) 

You need to create the sales territory and product measures. 

Which aggregate function should you use for both measures? 

A. COUNT (DISTINCT column_name) 

B. Distinct Count 

C. Distinct 

D. Count 

Answer:


Q114. - (Topic 6) 

You need to design the dimCustomers table. 

Which design approach should you use? 

A. Reference dimension 

B. Type 2 slowly changing dimension 

C. Junk dimension 

D. Conformed dimension 

E. Type 1 slowly changing dimension 

Answer: B

Topic 7, Contoso, Ltd Case B

General Background

You are the business intelligence (BI) solutions architect for Contoso, Ltd, an online retailer.

You produce solutions by using SQL Server 2012 Business Intelligence edition and Microsoft SharePoint Server 2010 Service Pack 1 (SP1) Enterprise edition. 

A SharePoint farm has been installed and configured for intranet access only. An Internet-facing web server hosts the company's public e-commerce website. Anonymous access is not configured on the Internet-facing web server. 

Data Warehouse

The data warehouse is deployed on a SQL Server 2012 relational database instance. The data warehouse is structured as shown in the following diagram.

The following Transact-SQL (T-SQL) script is used to create the FactSales and FactPopulation tables: 

The FactPopulation table is loaded each year with data from a Windows Azure Marketplace commercial dataset. The table contains a snapshot of the population values for all countries of the world for each year. The world population for the last year loaded exceeds 6.8 billion people.

ETL Process

SQL Server Integration Services (SSIS) is used to load data into the data warehouse. All SSIS projects are developed by using the project deployment model.

A package named StageFactSales loads data into a data warehouse staging table. The package sources its data from numerous CSV files exported from a mainframe system. The CSV file names begin with the letters GLSD followed by a unique numeric identifier that never exceeds six digits. The data content of each CSV file is identically formatted. 
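The GLSD file name convention above (the prefix followed by a numeric identifier of at most six digits) can be expressed as a simple pattern. A minimal Python sketch for illustration; the `.csv` extension check is an assumption, and the actual SSIS Foreach Loop container would use a wildcard file mask instead:

```python
import re

# GLSD followed by one to six digits, then a .csv extension
# (the extension is an assumption; adjust it if the exports differ).
GLSD_PATTERN = re.compile(r"^GLSD\d{1,6}\.csv$", re.IGNORECASE)

def matches_stage_pattern(filename: str) -> bool:
    """Return True if the file name follows the GLSD export convention."""
    return GLSD_PATTERN.match(filename) is not None

print(matches_stage_pattern("GLSD000123.csv"))   # True
print(matches_stage_pattern("SALES01.csv"))      # False
```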

A package named LoadFactFreightCosts sources data from a Windows Azure SQL Database database that has data integrity problems. The package may retrieve duplicate rows from the database. 

The package variables of all packages have the RaiseChangedEvent property set to true. A package-level event handler for the OnVariableValueChanged event consists of an Execute SQL task that logs the System::VariableName and System::VariableValue variables. 
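The event-handler setup described above is essentially a change-notification hook: each assignment to a watched variable invokes a handler with the variable's name and its new value. A minimal Python analogue, for illustration only; in SSIS the OnVariableValueChanged event is raised natively when RaiseChangedEvent is true:

```python
class LoggedVariable:
    """Calls a handler with (name, new_value) on every assignment."""

    def __init__(self, name, value, on_change):
        self.name = name
        self._value = value
        self._on_change = on_change

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        self._value = new_value
        # analogue of OnVariableValueChanged logging VariableName/VariableValue
        self._on_change(self.name, new_value)

log = []
var = LoggedVariable("RowCount", 0, lambda name, val: log.append((name, val)))
var.value = 42
print(log)  # [('RowCount', 42)]
```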

Data Models

SQL Server Analysis Services (SSAS) is used to host the Corporate BI multidimensional database. The Corporate BI database contains a single data source view named Data Warehouse. The Data Warehouse data source view consists of all data warehouse tables. All data source view tables have been converted to named queries.

The Corporate BI database contains a single cube named Sales Analysis and three database dimensions: Date, Customer and Product. The dimension usage for the Sales Analysis cube is as shown in the following image. 

The Customer dimension contains a single multi-level hierarchy named Geography. The structure of the Geography hierarchy is shown in the following image. 

The Sales Analysis cube's calculation script defines one calculated measure named Sales Per Capita. The calculated measure expression divides the Revenue measure by the Population measure and multiplies the result by 1,000. This calculation represents revenue per 1,000 people. 

The Sales Analysis cube produces correct Sales Per Capita results for each country of the world; however, the Grand Total for all countries is incorrect, as shown in the following image (rows 2-239 have been hidden). 
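The Grand Total defect described above is the classic ratio-aggregation pitfall: a measure such as Sales Per Capita must be computed as (total revenue / total population) x 1,000 at every level, rather than by aggregating the per-country ratios. A small illustration with made-up figures:

```python
countries = [
    # (name, revenue, population) -- hypothetical figures for illustration only
    ("Country A", 50_000.0, 1_000_000),
    ("Country B", 900.0, 10_000),
]

def sales_per_capita(revenue: float, population: int) -> float:
    """Revenue per 1,000 people, as the cube's calculated measure defines it."""
    return revenue / population * 1_000

per_country = [sales_per_capita(r, p) for _, r, p in countries]

# Correct grand total: the ratio of the totals, not the total of the ratios.
grand_total = sales_per_capita(
    sum(r for _, r, _ in countries),
    sum(p for _, _, p in countries),
)

print(per_country)       # [50.0, 90.0]
print(sum(per_country))  # 140.0 -- adding the ratios gives a meaningless number
```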

A role named Analysts grants Read permission for the Sales Analysis cube to all sales and marketing analysts in the company. 

SQL Server Reporting Services (SSRS) is configured in SharePoint integrated mode. All reports are based on shared data sources. 

Corporate logo images used in reports were originally configured as data-bound images sourced from a SQL Server relational database table. The image data has been exported to JPG files. The image files are hosted on the Internet-facing web server. All reports have been modified to reference the corporate logo images by using the fully qualified URLs of the image files. A red X currently appears in place of the corporate logo in reports. 

Users configure data alerts on certain reports. Users can view a report named Sales Profitability on demand; however, notification email messages are no longer being sent when Sales Profitability report data satisfies alert definition rules. The alert schedule settings for the Sales Profitability report are configured as shown in the following image. 

Business Requirements

Data Models

Users must be able to:

- Provide context to measures and filter measures by using all related data warehouse dimensions.
- Analyze measures by order date or ship date.

Additionally, users must be able to add a measure named Sales to the report canvas by clicking only once in the Power View field list. The Sales measure must allow users to analyze the sum of the values in the Revenue column of the FactSales data warehouse table. Users must be able to change the aggregation function of the Sales measure. 

Analysis and Reporting

A sales manager has requested the following query results from the Sales Analysis cube for the 2012 fiscal year:

- Australian postal codes and sales in descending order of sales.
- Australian states and the ratio of sales achieved by the 10 highest customer sales made for each city in that state.

Technical Requirements

ETL Processes

If an SSIS package variable value changes, the package must log the variable name and the new variable value to a custom log table.

The StageFactSales package must load the contents of all files that match the file name pattern. The source file name must also be stored in a column of the data warehouse staging table. In the design of the LoadFactSales package, if a lookup of the dimension surrogate key value for the product code fails, the row details must be emailed to the data steward and written as an error message to the SSIS catalog log by using the public API. 

You must configure the LoadFactFreightCosts package to remove duplicate rows, by using the least development effort. 
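Removing the duplicate rows with the least development effort is commonly handled in SSIS with a Sort transformation set to discard rows with duplicate sort key values; the underlying logic is first-occurrence deduplication on a key. An illustrative Python sketch with hypothetical column names:

```python
def dedupe(rows, key_columns):
    """Keep the first occurrence of each key; drop later duplicates."""
    seen = set()
    kept = []
    for row in rows:
        key = tuple(row[column] for column in key_columns)
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept

rows = [
    {"ShipmentId": 1, "FreightCost": 10.0},
    {"ShipmentId": 2, "FreightCost": 7.5},
    {"ShipmentId": 1, "FreightCost": 10.0},  # duplicate row from the source
]
print(dedupe(rows, ["ShipmentId"]))  # two rows remain
```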

Data Models

Users of the Sales Analysis cube frequently filter on the current month's data. You must ensure that queries to the Sales Analysis cube default to the current month in the Order Date dimension for all users.

You must develop and deploy a tabular project for the exclusive use as a Power View reporting data source. The model must be based on the data warehouse. Model table names must exclude the Dim or Fact prefixes. All measures in the model must format values to display zero decimal places. 

Analysis and Reporting

Reports must be developed that combine the SSIS catalog log messages with the package variable value changes.


Q115. - (Topic 4) 

You need to implement the Customer Sales and Manufacturing data models. 

What should you do? (Each correct answer presents a partial solution. Choose all that apply.) 

A. Use the Database Synchronization Wizard to upgrade the database to tabular mode. 

B. Use SQL Server Integration Services (SSIS) to copy the database design to the SSAS instance, and specify tabular mode as the destination. 

C. Use SQL Server Data Tools (SSDT) to redevelop and deploy the projects. 

D. Use the current SSAS instance. 

E. Install a new instance of SSAS in tabular mode. 

Answer: C,E 

Explanation: 

C: Tabular models are authored in SQL Server Data Tools (SSDT) using the new tabular model project templates. You can import data from multiple sources, and then enrich the model by adding relationships, calculated columns, measures, KPIs, and hierarchies. Models can then be deployed to an instance of Analysis Services, where client reporting applications can connect to them. Deployed models can be managed in SQL Server Management Studio just like multidimensional models. They can also be partitioned for optimized processing and secured at the row level by using role-based security.

E: If you are installing Analysis Services to use the new tabular modeling features, you must install Analysis Services in a server mode that supports that type of model. The server mode is Tabular, and it is configured during installation. After you install the server in this mode, you can use it to host solutions that you build in the tabular model designer. A tabular mode server is required if you want tabular model data access over the network.

From the scenario:

- Deploy a data model to allow the ad-hoc analysis of data. The data model must be cached and source data from an OData feed.
- All SSAS databases other than the Research database must be converted to tabular BI Semantic Models (BISMs) as part of the upgrade to SSAS 2012. The Research team must have access to the Research database for modeling throughout the upgrade. To facilitate this, you detach the Research database and attach it to SSAS01.

The Business Intelligence Semantic Model (BISM) is a single unified BI platform with both multidimensional and tabular data modeling capabilities, offering the best of both worlds and a choice for the developer.

Reference: Install Analysis Services in Tabular Mode 

Reference: Tabular Modeling (SSAS Tabular) 


Update 70-470 study guide:

Q116. - (Topic 4) 

You need to implement the security requirement for the sales representatives. 

Which MDX expression should you use? 

A. Option A 

B. Option B 

C. Option C 

D. Option D 

Answer:

Topic 5, Data Architect 

General Background 

You are a Data Architect for a company that uses SQL Server 2012 Enterprise edition. 

You have been tasked with designing a data warehouse that uses the company's financial 

database as the data source. From the data warehouse, you will develop a cube to simplify 

the creation of accurate financial reports and related data analysis. 

Background

You will utilize the following three servers:

- ServerA runs SQL Server Database Engine. ServerA is a production server and also hosts the financial database.
- ServerB runs SQL Server Database Engine, SQL Server Analysis Services (SSAS) in multidimensional mode, SQL Server Integration Services (SSIS), and SQL Server Reporting Services (SSRS).
- ServerC runs SSAS in multidimensional mode.

The financial database is used by a third-party application and the table structures cannot be modified.

The relevant tables in the financial database are shown in the exhibit. (Click the Exhibit button.) 

The SalesTransactions table is 500 GB and is anticipated to grow to 2 TB. The table is partitioned by month. It contains only the last five years of financial data. The CouponUsed, OnSale, and Closeout columns contain only the values Yes or No. Each of the other tables is less than 10 MB and has only one partition. 
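Low-cardinality flag columns such as CouponUsed, OnSale, and Closeout are the textbook input to a junk dimension: a single small table holding every combination of the flags behind one surrogate key referenced from the fact table. Generating its 2^3 = 8 rows is straightforward (an illustrative sketch; the JunkKey column name is an assumption):

```python
from itertools import product

flags = ["CouponUsed", "OnSale", "Closeout"]

# One row per Yes/No combination; JunkKey is a hypothetical surrogate key.
junk_dimension = [
    {"JunkKey": key, **dict(zip(flags, combo))}
    for key, combo in enumerate(product(["Yes", "No"], repeat=len(flags)), start=1)
]

print(len(junk_dimension))   # 8
print(junk_dimension[0])
```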

The SecurityFilter table specifies the sites to which each user has access. 

Business Requirements

The extract, transform, load (ETL) process that updates the data warehouse must run daily between 8:00 P.M. and 5:00 A.M. so that it does not impact the performance of ServerA during business hours. The cube data must be available by 8:00 A.M.

The cube must meet the following business requirements:

- Ensure that reports display the most current information available.
- Allow fast access to support ad-hoc reports and data analysis.

Business Analysts will access the data warehouse tables directly, and will access the cube by using SSRS, Microsoft Excel, and Microsoft SharePoint Server 2010 PerformancePoint Services. These tools will access only the cube and not the data warehouse. 

Technical Requirements

SSIS solutions must be deployed by using the project deployment model. You must develop the data warehouse and store the cube on ServerB. When the number of concurrent SSAS users on ServerB reaches a specific number, you must scale out SSAS to ServerC and meet the following requirements:

- Maintain copies of the cube on ServerB and ServerC.
- Ensure that the cube is always available on both servers.


- Minimize query response time.

The cube must meet the following technical requirements:

- The cube must be processed by using an SSIS package.
- The cube must contain the prior day's data up to 8:00 P.M. but does not need to contain same-day data.
- The cube must include aggregation designs when it is initially deployed.
- A product dimension must be added to the cube. It will contain a hierarchy comprised of product name and product color.

Because of the large size of the SalesTransactions table, the cube must store only aggregations—the data warehouse must store the detailed data. Both the data warehouse and the cube must minimize disk space usage. 

As the cube size increases, you must plan to scale out to additional servers to minimize processing time. 

The data warehouse must use a star schema design. The table design must be as denormalized as possible. The history of changes to the Customer table must be tracked in the data warehouse. The cube must use the data warehouse as its only data source. 
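Tracking the history of Customer changes in a star schema is the classic Type 2 slowly changing dimension pattern: rather than overwriting an attribute, the current row is closed out and a new row with a fresh surrogate key becomes current. A minimal sketch of that logic, with illustrative column names:

```python
from datetime import date

def apply_scd2(dimension, business_key, new_attrs, effective_date, next_key):
    """Close the current row for the key and append a new current row."""
    for row in dimension:
        if row["CustomerId"] == business_key and row["IsCurrent"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return  # no attribute changed; nothing to do
            row["IsCurrent"] = False
            row["EndDate"] = effective_date
    dimension.append({
        "SurrogateKey": next_key,
        "CustomerId": business_key,
        **new_attrs,
        "StartDate": effective_date,
        "EndDate": None,
        "IsCurrent": True,
    })

dim = [{"SurrogateKey": 1, "CustomerId": "C1", "City": "Sydney",
        "StartDate": date(2010, 1, 1), "EndDate": None, "IsCurrent": True}]
apply_scd2(dim, "C1", {"City": "Melbourne"}, date(2012, 6, 1), next_key=2)
print(len(dim))  # 2: the closed-out history row plus the new current row
```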

Security settings on the data warehouse and the cube must ensure that queries against the SalesTransactions table return only records from the sites to which the current user has access. 

The ETL process must consist of multiple SSIS packages developed in a single project by using the least amount of effort. The SSIS packages must use a database connection string that is set at execution time to connect to the financial database. All data in the data warehouse must be loaded by the SSIS packages. 

You must create a Package Activity report that meets the following requirements:

- Track SSIS package execution data (including package name, status, start time, end time, duration, and rows processed).
- Use the least amount of development effort.


Q117. - (Topic 8) 

You execute the SalesbyCategory report and receive the following error message: "Members, tuples, or sets must use the same hierarchies in the function." 

You need to ensure that the query executes successfully. 

Which two actions should you perform? Each correct answer presents part of the solution. 

A. Move the Product clause from line 08 to line 10. 

B. Move the Date and Product clauses on line 11 to axis 0. 

C. Move the Date clause from line 10 to line 08. 

D. Move the Measures clause on line 02 to axis 1. 

Answer:


Q118. DRAG DROP - (Topic 9) 

You have a business intelligence (BI) infrastructure that contains three servers. The servers are configured as shown in the following table. 

You need to recommend a health monitoring solution for the BI infrastructure. 

The solution must meet the following requirements:

- Monitor the status of the Usage Data Collection feature.
- Monitor the number of end-users accessing the solution.
- Monitor the amount of cache used when the users query data.

Which health monitoring solution should you recommend using on each server? To answer, drag the appropriate monitoring solutions to the correct servers. Each monitoring solution may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. 


Answer: 


Q119. - (Topic 9) 

You are designing an extract, transform, load (ETL) process for loading data from a SQL Server database into a large fact table in a data warehouse each day with the prior day's sales data.

The ETL process for the fact table must meet the following requirements:

- Load new data in the shortest possible time.
- Remove data that is more than 36 months old.
- Ensure that data loads correctly.
- Minimize record locking.
- Minimize impact on the transaction log.

You need to design an ETL process that meets the requirements. 

What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.) 

A. Partition the destination fact table by date. Insert new data directly into the fact table and delete old data directly from the fact table. 

B. Partition the destination fact table by date. Use partition switching and staging tables both to remove old data and to load new data. 

C. Partition the destination fact table by customer. Use partition switching both to remove old data and to load new data into each partition. 

D. Partition the destination fact table by date. Use partition switching and a staging table to remove old data. Insert new data directly into the fact table. 

Answer:
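Partition switching, which several of the answer choices rely on, is a metadata-only operation: an entire partition is swapped with a staging table almost instantly, which is why it minimizes locking and transaction-log activity. The sliding-window bookkeeping can be sketched as follows (an illustrative simulation only; real T-SQL would use ALTER TABLE ... SWITCH):

```python
# Simulate a fact table partitioned by month as {month_number: rows}.
fact = {month: [f"row-{month}"] for month in range(1, 38)}  # months 1..37

def slide_window(fact, new_month, new_rows, window_months=36):
    """'Switch in' the new month, then drop months outside the retention window."""
    fact[new_month] = new_rows                       # metadata-only switch in
    cutoff = new_month - window_months
    for month in [m for m in fact if m <= cutoff]:
        del fact[month]                              # switch old partition out

slide_window(fact, new_month=38, new_rows=["row-38"])
print(min(fact), max(fact), len(fact))  # 3 38 36
```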


Q120. - (Topic 8) 

You need to deploy a solution for the planned self-service reports that will be used by the sales department managers. 

What is the best solution you should deploy? More than one answer choice may achieve the goal. Select the BEST answer. 

A. A filter 

B. A KPI 

C. A calculated column 

D. A measure 

Answer: