Power BI Questions
Prepare the Data

1. HOTSPOT
You plan to create the Power BI model shown in the exhibit. (Click the Exhibit tab.) The data has the following refresh requirements:
- Customer must be refreshed daily.
- Date must be refreshed once every three years.
- Sales must be refreshed in near real time.
- SalesAggregate must be refreshed once per week.
You need to select the storage modes for the tables. The solution must meet the following requirements:
- Minimize the load times of visuals.
- Ensure that the data is loaded to the model based on the refresh requirements.
Which storage mode should you select for each table? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: Dual. Customer should use the Dual storage mode. Tables with this setting can act as either cached or not cached, depending on the context of the query that's submitted to the Power BI dataset. In some cases, queries are fulfilled from cached data; in other cases, they are fulfilled by executing an on-demand query against the data source.
Note: You set the Storage mode property to one of three values: Import, DirectQuery, or Dual.
Box 2: Dual. You can set the dimension tables (Customer, Geography, and Date) to Dual to reduce the number of limited relationships in the dataset and improve performance.
Box 3: DirectQuery. Sales should use the DirectQuery storage mode. Tables with this setting aren't cached. Queries submitted to the Power BI dataset (for example, DAX queries) that return data from DirectQuery tables can be fulfilled only by executing on-demand queries against the data source, using the query language of that source (for example, SQL).
Box 4: Import. Tables with this setting are cached. Queries submitted to the Power BI dataset that return data from Import tables can be fulfilled only from cached data.
Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-storage-mode

2. You have a project management app that is fully hosted in Microsoft Teams. The app was developed by using Microsoft Power Apps. You need to create a Power BI report that connects to the project management app. Which connector should you select?
A. Microsoft Teams Personal Analytics
B. SQL Server database
C. Dataverse
D. Dataflows

Explanation:
The answer is Dataverse, which stores the data for Power Apps apps. In Power BI Desktop, the Power Platform category provides the following data connections: Power BI datasets, Power BI dataflows, Common Data Service (Legacy), Dataverse, and Dataflows. Microsoft Teams Personal Analytics (Beta) is listed among the other data sources.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-data-sources

3. For the sales department at your company, you publish a Power BI report that imports data from a Microsoft Excel file located in a Microsoft SharePoint folder. The semantic model contains several measures. You need to create a Power BI report from the existing data that will use many of the measures created in the initial Power BI semantic model. The solution must minimize development effort. Which type of data source should you use?
A. Power BI dataset
B. a SharePoint folder
C. Power BI dataflows
D. an Excel workbook

Explanation:
Connecting to the published Power BI dataset lets the new report reuse the existing measures without recreating them, which minimizes development effort.
Note: Connect to a SharePoint folder from Power Query Desktop:
1. From Get Data, select SharePoint folder.
2. Paste the SharePoint site URL into the Site URL text box in the SharePoint folder dialog box. In this example, the site URL is https://contoso.sharepoint.com/marketing/data. If the site URL you enter is invalid, a warning icon will appear next to the URL text box. Select OK to continue.
3. If this is the first time you've visited this site address, select the appropriate authentication method. Enter your credentials and choose which level to apply these settings to.
Then select Connect.
4. When you select the SharePoint folder you want to use, file information about all of the files in that SharePoint folder is displayed, along with file information about any files in any subfolders.
5. Etc.
Reference: https://docs.microsoft.com/en-us/power-query/connectors/sharepointfolder

4. You import two Microsoft Excel tables named Customer and Address into Power Query. Customer contains the following columns: Customer ID, Customer Name, Phone, Email Address, Address ID. Address contains the following columns: Address ID, Address Line 1, Address Line 2, City, State/Region, Country, Postal Code. Each Customer ID represents a unique customer in the Customer table. Each Address ID represents a unique address in the Address table. You need to create a query that has one row per customer. Each row must contain City, State/Region, and Country for each customer. What should you do?
A. Merge the Customer and Address tables.
B. Group the Customer and Address tables by the Address ID column.
C. Transpose the Customer and Address tables.
D. Append the Customer and Address tables.

Explanation:
A merge queries operation joins two existing tables together based on matching values from one or more columns. You can choose different types of joins, depending on the output you want.
Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-overview

5. HOTSPOT
You have two Azure SQL databases that contain the same tables and columns. For each database, you create a query that retrieves data from a table named Customer. You need to combine the Customer tables into a single table. The solution must minimize the size of the data model and support scheduled refresh in powerbi.com. What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: Append Queries as New. When you have additional rows of data that you'd like to add to an existing query, you append the query. There are two append options:
* Append queries as new displays the Append dialog box to create a new query by appending multiple tables.
* Append queries displays the Append dialog box to add additional tables to the current query.
Incorrect: When you have one or more columns that you'd like to add to another query, you merge the queries.
Box 2: Disable loading the query to the data model. By default, all queries from Query Editor are loaded into the memory of the Power BI model. You can disable the load for some queries, especially queries that are used as intermediate transformations to produce the final query for the model. Disabling load doesn't mean the query won't be refreshed; it only means the query won't be loaded into memory. When you click Refresh in Power BI, or when a scheduled refresh happens, even queries marked as Disable Load are refreshed, but their data is used as an intermediate source for other queries instead of being loaded directly into the model. This is a basic but important performance-tuning tip as your Power BI model grows.
Reference: https://docs.microsoft.com/en-us/power-query/append-queries https://radacad.com/performance-tip-for-power-bi-enable-load-sucks-memory-up

6. DRAG DROP
In Power Query Editor, you have three queries named ProductCategory, ProductSubCategory, and Product. Every Product has a ProductSubCategory. Not every ProductSubCategory has a parent ProductCategory. You need to merge the three queries into a single query. The solution must ensure the following:
- No data should be removed from the original queries.
- Provide the best performance in Power Query.
How should you merge the tables? To answer, drag the appropriate merge types to the correct queries. Each merge type may be used once, more than once, or not at all.
You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: Inner. Every Product has a ProductSubCategory, so a standard join is needed. One of the join kinds available in the Merge dialog box in Power Query is an inner join, which brings in only matching rows from both the left and right tables.
Box 2: Left outer. Not every ProductSubCategory has a parent ProductCategory. A left outer join keeps all the rows from the left table and brings in any matching rows from the right table.
Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-inner https://docs.microsoft.com/en-us/power-query/merge-queries-left-outer

7. You are building a Power BI report that uses data from an Azure SQL database named erp1. You import the following tables. You need to perform the following analyses:
- Orders sold over time that include a measure of the total order value
- Orders by attributes of products sold
The solution must minimize update times when interacting with visuals in the report. What should you do first?
A. From Power Query, merge the Order Line Items query and the Products query.
B. Create a calculated column that adds a list of product categories to the Orders table by using a DAX function.
C. Calculate the count of orders per product by using a DAX function.
D. From Power Query, merge the Orders query and the Order Line Items query.

Explanation:
A merge queries operation joins two existing tables together based on matching values from one or more columns. Join the Orders and the Order Line Items tables.
Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-overview

8. You have a Microsoft SharePoint Online site that contains several document libraries. One of the document libraries contains manufacturing reports saved as Microsoft Excel files. All the manufacturing reports have the same data structure. You need to use Power BI Desktop to load only the manufacturing reports to a table for analysis. What should you do?
A. Get data from a SharePoint folder and enter the site URL. Select Transform, then filter by the folder path to the manufacturing reports library.
B. Get data from a SharePoint list and enter the site URL. Select Combine & Transform, then filter by the folder path to the manufacturing reports library.
C. Get data from a SharePoint folder, enter the site URL, and then select Combine & Load.
D. Get data from a SharePoint list, enter the site URL, and then select Combine & Load.

Explanation:
Get data from the SharePoint folder, then filter by the folder path so that only the files in the manufacturing reports library are combined. Combine & Load alone would load the data from all of the files in the SharePoint folder.
Note: Connect to a SharePoint folder from Power Query Desktop:
1. From Get Data, select SharePoint folder.
2. Paste the SharePoint site URL into the Site URL text box in the SharePoint folder dialog box. In this example, the site URL is https://contoso.sharepoint.com/marketing/data. If the site URL you enter is invalid, a warning icon will appear next to the URL text box.
3. Select OK to continue.
4. If this is the first time you've visited this site address, select the appropriate authentication method. Enter your credentials and choose which level to apply these settings to. Then select Connect.
5. When you select the SharePoint folder you want to use, file information about all of the files in that SharePoint folder is displayed, along with file information about any files in any subfolders.
6. Select Combine & Transform Data to combine the data in the files of the selected SharePoint folder and load the data into the Power Query Editor for editing.
Or select Combine & Load to load the data from all of the files in the SharePoint folder directly into your app.
Reference: https://docs.microsoft.com/en-us/power-query/connectors/sharepointfolder

9. DRAG DROP
You have a Microsoft Excel workbook that contains two sheets named Sheet1 and Sheet2. Sheet1 contains the following table named Table1. Sheet2 contains the following table named Table2. You need to use Power Query Editor to combine the products from Table1 and Table2 into the following table that has one column containing no duplicate values. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

10. You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do?
A. Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date.
B. Change the data type of the Logged column to Date.
C. Split the Logged column by using at as the delimiter.
D. Apply a transformation to extract the first 11 characters of the Logged column.

Explanation:
For this question, the correct approaches are:
* Create a column by example that starts with 2018-12-31 and set the data type of the new column to Date. We need a date column.
* Split the Logged column by using at as the delimiter. We extract both the date and the time by splitting the column.
The incorrect approaches are:
* Add a conditional column that outputs 2018 if the Logged column starts with 2018 and set the data type of the new column to Whole Number.
* Apply the Parse function from the Data transformations options to the Logged column.
* Apply a transformation to extract the first 11 characters of the Logged column. This extracts the date, but we would need to set the column type to Date as well.
* Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date.
* Change the data type of the Logged column to Date.
* Create a column by example that starts with 2018-12-31.
Note: What is a date hierarchy? A date hierarchy is a way of structuring your Date table for comfortable drilling down or up through the data based on different grouped time slices. For example, you can create a date hierarchy that puts months under years and weeks under months. This allows you to view such fields as sales for the whole year, then drill down to the month when the sale happened in a particular year, and drill down further to a particular week of that month. Custom date hierarchies include Year-Month, Year-Quarter-Month, and Year-Month-Week hierarchies.
Note 2: A CSV is a comma-separated values file, which allows data to be saved in a tabular format. CSVs look like a garden-variety spreadsheet but with a .csv extension, and can be used with almost any spreadsheet program, such as Microsoft Excel or Google Sheets. CSV files have no data types, but when you import CSV files into Power BI you can add data types.
Reference: https://codingsight.com/create-and-use-custom-date-hierarchies-in-power-bi https://www.bigcommerce.com/ecommerce-answers/what-csv-file-and-what-does-it-mean-my-ecommerce-business/

11. You have a Microsoft Excel file in a Microsoft OneDrive folder. The file must be imported to a Power BI dataset. You need to ensure that the dataset can be refreshed in powerbi.com. Which two connectors can you use to connect to the file?
Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Excel Workbook
B. Text/CSV
C. Folder
D. SharePoint folder

Explanation:
A: Connect to an Excel workbook from Power Query Desktop:
1. Select the Excel option in the connector selection.
2. Browse for and select the Excel workbook you want to load, then select Open.
3. Etc.
C: The Folder connector supports these capabilities: folder path, combine, combine and load, combine and transform. To connect to a folder from Power Query Online:
1. Select the Folder option in the connector selection.
2. Enter the path to the folder you want to load.
Reference: https://docs.microsoft.com/en-us/power-query/connectors/excel https://docs.microsoft.com/en-us/power-query/connectors/folder

12. HOTSPOT
You are profiling data by using Power Query Editor. You have a table named Reports that contains a column named State. The distribution and quality data metrics for the data in State are shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: 69. There are 69 distinct (different) values.
Note: Column Distribution allows you to get a sense of the overall distribution of values within a column in your data previews, including the count of distinct values (the total number of different values found in a given column) and unique values (the total number of values that appear only once in a given column).
Box 2: 4
Reference: https://systemmanagement.ro/2018/10/16/power-bi-data-profiling-distinct-vs-unique/

13. HOTSPOT
You have two CSV files named Products and Categories.
The Products file contains the following columns: ProductID, ProductName, SupplierID, CategoryID. The Categories file contains the following columns: CategoryID, CategoryName, CategoryDescription. From Power BI Desktop, you import the files into Power Query Editor. You need to create a Power BI dataset that will contain a single table named Product. The Product table will include the following columns: ProductID, ProductName, SupplierID, CategoryID, CategoryName, CategoryDescription. How should you combine the queries, and what should you do on the Categories query? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: Merge. There are two primary ways of combining queries: merging and appending.
* When you have one or more columns that you'd like to add to another query, you merge the queries.
* When you have additional rows of data that you'd like to add to an existing query, you append the query.
Box 2: Disable the query load. In many situations, it makes sense to break down your data transformations into multiple queries. One popular example is merging, where you merge two queries into one to essentially do a join. In this type of situation, some queries are not relevant to load into Desktop because they are intermediate steps, even though they are still required for your data transformations to work correctly. For these queries, you can make sure they are not loaded by clearing 'Enable load' in the context menu of the query in Desktop or on the Properties screen.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-include-in-report-refresh

14. You have an Azure SQL database that contains sales transactions. The database is updated frequently. You need to generate reports from the data to detect fraudulent transactions. The data must be visible within five minutes of an update. How should you configure the data connection?
A. Add a SQL statement.
B. Set the Command timeout in minutes setting.
C. Set Data Connectivity mode to Import.
D. Set Data Connectivity mode to DirectQuery.

Explanation:
DirectQuery: No data is imported or copied into Power BI Desktop. For relational sources, the selected tables and columns appear in the Fields list. For multidimensional sources like SAP Business Warehouse, the dimensions and measures of the selected cube appear in the Fields list. As you create or interact with a visualization, Power BI Desktop queries the underlying data source, so you're always viewing current data.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-use-directquery

15. DRAG DROP
You have a folder that contains 100 CSV files. You need to make the file metadata available as a single dataset by using Power BI. The solution must NOT store the data of the CSV files. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Explanation:
Step 1: From Power BI Desktop, select Get Data, and then select Folder. Open Power BI Desktop, select Get Data > More..., and choose Folder from the All options on the left. Enter the folder path, select OK, and then select Transform data to see the folder's files in Power Query Editor.
Step 2: From Power Query Editor, expand the Attributes column.
Step 3: From Power Query Editor, combine the Content column.
Note: Combine files behavior. To combine binary files in Power Query Editor, select Content (the first column label) and select Home > Combine Files. Or you can just select the Combine Files icon next to Content.
Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-combine-binaries
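To meet the requirement that none of the CSV data is stored, the binary Content column can be dropped after connecting to the folder, keeping only the metadata columns. A minimal Power Query M sketch; the folder path and the expanded attribute field names are assumptions for illustration:

```m
let
    // Hypothetical folder path containing the 100 CSV files
    Source = Folder.Files("C:\Data\CsvFiles"),
    // Remove the binary Content column so no file data is stored
    NoContent = Table.RemoveColumns(Source, {"Content"}),
    // Expand the Attributes record to surface extra metadata fields
    Expanded = Table.ExpandRecordColumn(NoContent, "Attributes", {"Size", "Hidden"})
in
    Expanded
```

The remaining columns (Name, Extension, the date columns, and Folder Path) describe the files without loading their contents into the model.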
16. A business intelligence (BI) developer creates a dataflow in Power BI that uses DirectQuery to access tables from an on-premises Microsoft SQL server. The Enhanced Dataflows Compute Engine is turned on for the dataflow. You need to use the dataflow in a report. The solution must meet the following requirements:
- Minimize online processing operations.
- Minimize calculation times and render times for visuals.
- Include data from the current year, up to and including the previous day.
What should you do?
A. Create a dataflows connection that has DirectQuery mode selected.
B. Create a dataflows connection that has DirectQuery mode selected and configure a gateway connection for the dataset.
C. Create a dataflows connection that has Import mode selected and schedule a daily refresh.
D. Create a dataflows connection that has Import mode selected and create a Microsoft Power Automate solution to refresh the data hourly.

Explanation:
A daily update is adequate. When you set up a refresh schedule, Power BI connects directly to the data sources using connection information and credentials in the dataset to query for updated data, then loads the updated data into the dataset. Any visualizations in reports and dashboards based on that dataset in the Power BI service are also updated.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-desktop-file-local-drive

17. DRAG DROP
You publish a dataset that contains data from an on-premises Microsoft SQL Server database. The dataset must be refreshed daily. You need to ensure that the Power BI service can connect to the database and refresh the dataset. Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Explanation:
Step 1: Configure an on-premises data gateway. Refreshing a data source from the Power BI service requires an on-premises gateway to be set up on your machine or server.
This helps establish the connection between your Power BI reports and the SQL Server dataset in the Power BI service.
Step 2: Add a data source. Configure data source and dataset settings in Power BI for SQL Server connectivity through a data gateway. Add a gateway data source.
Step 3: Add the dataset owner to the data source. After you add a data source to a gateway, you give users and email-enabled security groups access to the specific data source (not the entire gateway). The access list for the data source controls only who is allowed to publish reports that include data from the data source. Report owners can create dashboards and apps, and then share those items with other users. To add users to a data source:
1. From the page header in the Power BI service, select Settings > Manage gateways.
2. Select the data source where you want to add users.
3. Select Manage Users from the top ribbon.
4. Enter the users and mail-enabled security groups from your organization who will access the selected data source, and assign the user role. Select Share, and the added member's name is added to the list of people who can publish reports that use this data source.
Step 4: Configure a scheduled refresh. Configure a refresh schedule to ensure your Power BI dataset has recent data. After you create the data source, it's available to use with either DirectQuery connections or through scheduled refresh.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/service-gateway-sql-tutorial https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-data-sources?source=recommendations

18. You attempt to connect Power BI Desktop to a Cassandra database. From the Get Data connector list, you discover that there is no specific connector for the Cassandra database. You need to select an alternate data connector that will connect to the database. Which type of connector should you choose?
A. Microsoft SQL Server database
B. ODBC
C. OLE DB
D. OData

Explanation:
With built-in support for ODBC on Microsoft Windows, CData ODBC drivers provide self-service integration with analytics tools such as Microsoft Power BI. The CData ODBC Driver for Cassandra links your Power BI reports to operational Cassandra data.
Reference: https://www.cdata.com/kb/tech/cassandra-odbc-power-bi.rst

19. DRAG DROP
You receive annual sales data that must be included in Power BI reports. From Power Query Editor, you connect to the Microsoft Excel source shown in the following exhibit. You need to create a report that meets the following requirements:
- Visualizes the Sales value over a period of years and months
- Adds a slicer for the month
- Adds a slicer for the year
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Explanation:
Step 1: Select the Month and MonthNumber columns.
Step 2: Select Unpivot other columns. This command unpivots the unselected columns. Use this command in a query when not all columns are known; new columns added during a refresh operation are also unpivoted.
1. To open a query, locate one previously loaded from the Power Query Editor, select a cell in the data, and then select Query > Edit. For more information, see Create, load, or edit a query in Excel.
2. Select the columns you don't want to unpivot. To select more than one column contiguously or discontiguously, press Shift+Click or Ctrl+Click on each subsequent column.
3. Select Transform > Unpivot Other Columns.
Step 3: Rename the Attribute column as Year and the Value column as Sales. You might want to unpivot data, sometimes called flattening the data, to put it in a matrix format so that all similar values are in one column. This is necessary, for example, to create a chart or a report.
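The unpivot steps above can be sketched in Power Query M; the sample year columns and values are assumptions for illustration:

```m
let
    // Toy stand-in for the Excel source: one column per year
    Source = #table(
        {"Month", "MonthNumber", "2022", "2023"},
        {{"Jan", 1, 100, 120}, {"Feb", 2, 90, 110}}),
    // Unpivot every column except Month and MonthNumber; the new
    // attribute/value columns are named Year and Sales directly,
    // so the rename happens inside the call rather than as a separate step
    Unpivoted = Table.UnpivotOtherColumns(Source, {"Month", "MonthNumber"}, "Year", "Sales")
in
    Unpivoted
```

The result has one row per month and year, which supports the month and year slicers.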
Reference: https://support.microsoft.com/en-us/office/unpivot-columns-power-query-0f7bad4b-9ea1-49c1-9d95- f588221c7098 25 20. HOTSPOT You are using Power BI Desktop to connect to an Azure SQL database. The connection is configured as shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct solution is worth one point. 26 Explanation: Box 1: 10 minutes Command timeout in minutes - If your connection lasts longer than 10 minutes (the default timeout), you can enter another value in minutes to keep the connection open longer. This option is only available in Power Query Desktop. Box 2: only tables that contain data Navigate using full hierarchy - If checked, the navigator displays the complete hierarchy of tables in the database you're connecting to. If cleared, the navigator displays only the tables whose columns and rows contain data. Reference: https://docs.microsoft.com/en-us/power-query/connectors/azuresqldatabase 21. HOTSPOT You have the Azure SQL databases shown in the following table. You plan to build a single PBIX file to meet the following requirements: Data must be consumed from the database that corresponds to each stage of the development lifecycle. Power BI deployment pipelines must NOT be used. The solution must minimize administrative effort. What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. 27 22. You are creating a query to be used as a Country dimension in a star schema. A snapshot of the source data is shown in the following table. You need to create the dimension. The dimension must contain a list of unique countries. Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point. A. Delete the Country column. B. Remove duplicates from the table. C. 
Remove duplicates from the City column. D. Delete the City column. E. Remove duplicates from the Country column. Reference: https://learn.microsoft.com/en-us/power-bi/guidance/star-schema 23. DRAG DROP You use Power Query Editor to preview the data shown in the following exhibit. 28 You need to clean and transform the query so that all the rows of data are maintained, and error values in the discount column are replaced with a discount of 0.05. The solution must minimize administrative effort. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Explanation: Step 1: Select the discount column Step 2: Select Replace errors to replace each error value with 0.05 Make the replacement before the data type. Note: Replace errors If instead of removing rows with errors, you want to replace the errors with a fixed value, you can do so as well. To replace rows that have errors, first select the column that contains errors. On the Transform tab, in the Any column group, select Replace values. From the drop-down menu, select Replace errors. Step 3: For the discount column, change Data Type to Decimal number Reference: https://learn.microsoft.com/en-us/power-query/dealing-with-errors 24. HOTSPOT You attempt to use Power Query Editor to create a custom column and receive the error message shown in the following exhibit. 29 Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point. Explanation: Box 1: mismatched data types One text, the other number. Box 2: A1 Combination operator (&) For example: "A" & "BC" -> Text concatenation: "ABC" Reference: https://learn.microsoft.com/en-us/powerquery-m/operators 25. From Power Query Editor, you attempt to execute a query and receive the following error message. Datasource. 
Error: Could not find file. What are two possible causes of the error? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. You do not have permissions to the file. B. An incorrect privacy level was used for the data source. C. The file is locked. D. The referenced file was moved to a new location. 30 26. You have data in a Microsoft Excel worksheet as shown in the following table. You need to use Power Query to clean and transform the dataset. The solution must meet the following requirements: If the discount column returns an error, a discount of 0.05 must be used. All the rows of data must be maintained. Administrative effort must be minimized. What should you do in Power Query Editor? A. Select Replace Errors. B. Edit the query in the Query Errors group. C. Select Remove Errors. D. Select Keep Errors. Explanation: Replace errors If instead of removing rows with errors, you want to replace the errors with a fixed value, you can do so as well. To replace rows that have errors, first select the column that contains errors. On the Transform tab, in the Any column group, select Replace values. From the drop-down menu, select Replace errors. 31 Incorrect: Not C: Remove errors To remove rows with errors in Power Query, first select the column that contains errors. On the Home tab, in the Reduce rows group, select Remove rows. From the drop-down menu, select Remove errors. Reference: https://learn.microsoft.com/en-us/power-query/dealing-with-errors 27. You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do? A. Apply the Parse function from the Data transformations options to the Logged column. B. 
Change the data type of the Logged column to Date. C. Split the Logged column by using at as the delimiter. D. Create a column by example that starts with 2018-12-31. Explanation: The correct answer is C. Splitting the Logged column by using at as the delimiter extracts the date and the time into separate columns; the date column can then be set to the Date data type and used with a built-in date hierarchy. The other options are incorrect: * Apply the Parse function from the Data transformations options to the Logged column. Parsing the combined date-and-time text does not produce a usable Date column. * Change the data type of the Logged column to Date. The at 08:59 suffix prevents a direct conversion to Date. * Create a column by example that starts with 2018-12-31. This extracts the date, but the new column would also need its data type set to Date. Note: What Is a Date Hierarchy? The Date hierarchy is a way of structuring your Date table for comfortable drilling down or drilling up the data based on different grouped time slices. For example, you can create the Date hierarchy to put months under years and weeks under months. This allows you to view such fields as sales for the whole year. Plus, you can then drill down to the month when the sale happened in a particular year, and drill down further to a particular week of that month. Examples of Date Hierarchies We can create different types of (custom) Date hierarchy including the following: Year-Month Hierarchy. Year-Quarter-Month Hierarchy. Year-Month-Week Hierarchy. Note 2: A CSV is a comma-separated values file, which allows data to be saved in a tabular format.
CSVs look like a garden-variety spreadsheet but with a .csv extension. CSV files can be used with almost any spreadsheet program, such as Microsoft Excel or Google Spreadsheets. CSV files have no data types, but when you import CSV files into Power BI you can add data types. Reference: https://codingsight.com/create-and-use-custom-date-hierarchies-in-power-bi https://www.bigcommerce.com/ecommerce-answers/what-csv-file-and-what-does-it-mean-my-ecommerce-business/ 28. DRAG DROP You have two Microsoft Excel workbooks in a Microsoft OneDrive folder. Each workbook contains a table named Sales. The tables have the same data structure in both workbooks. You plan to use Power BI to combine both Sales tables into a single table and create visuals based on the data in the table. The solution must ensure that you can publish a separate report and dataset. Which storage mode should you use for the report file and the dataset file? To answer, drag the appropriate modes to the correct files. Each mode may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point. Explanation: Box 1: Import Report file: Import or upload Excel data There are two ways to explore Excel data in Power BI: upload and import. When you upload your workbook, it appears in Power BI just like it would in Excel Online. But you also have some great features to help you pin elements from your worksheets to your dashboards. When you import your data, Power BI imports any supported data in tables and any data model into a new Power BI dataset. Upload to Power BI You can use the Upload button to upload files to the Power BI service. In the workspace where you want to add the file, select Upload at the top of the page. In the drop-down list, select: OneDrive for Business to connect to files that are stored in OneDrive for Business.
SharePoint to connect to files on any SharePoint site that you have access to. Browse to upload files from your computer. Box 2: DirectQuery Dataset file: Note: In Microsoft Power BI Desktop, you can specify the storage mode of a table. The storage mode lets you control whether or not Power BI Desktop caches table data in-memory for reports. Setting the storage mode provides many advantages. You can set the storage mode for each table individually in your model. This action enables a single dataset to combine tables that use different storage modes. You set the Storage mode property to one of these three values: Import: Imported tables with this setting are cached. Queries submitted to the Power BI dataset that return data from Import tables can be fulfilled only from cached data. DirectQuery: Tables with this setting aren't cached. Queries that you submit to the Power BI dataset—for example, DAX queries—and that return data from DirectQuery tables can be fulfilled only by executing on-demand queries to the data source. Queries that you submit to the data source use the query language for that data source, for example, SQL. Dual: Tables with this setting can act as either cached or not cached, depending on the context of the query that's submitted to the Power BI dataset. In some cases, you fulfill queries from cached data. In other cases, you fulfill queries by executing an on-demand query to the data source. Changing the Storage mode of a table to Import is an irreversible operation. After this property is set, it can't later be changed to either DirectQuery or Dual. Reference: https://learn.microsoft.com/en-us/power-bi/connect-data/service-excel-workbook-files#import-or-upload-excel-data https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-storage-mode 29. You use Power Query to import two tables named Order Header and Order Details from an Azure SQL database. The Order Header table relates to the Order Details table by using a column named Order ID in each table.
You need to combine the tables into a single query that contains the unique columns of each table. What should you select in Power Query Editor? A. Merge queries B. Combine files C. Append queries Explanation: A merge queries operation joins two existing tables together based on matching values from one or multiple columns. You can choose to use different types of joins, depending on the output you want. Incorrect: * Append queries (Power Query) An append operation creates a new query that contains all rows from a first query followed by all rows from a second query. The append operation requires at least two queries. These queries can also be based on different external data sources. Reference: https://learn.microsoft.com/en-us/power-query/merge-queries-overview https://support.microsoft.com/en-au/office/append-queries-power-query-e42ca582-4f62-4a43-b37f-99e2b2a4813a 30. HOTSPOT You have a folder that contains 50 JSON files. You need to use Power BI Desktop to make the metadata of the files available as a single dataset. The solution must NOT store the data of the JSON files. Which type of data source should you use, and which transformation should you perform? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Explanation: Box 1: Folder Data source type Data sources in Power BI Desktop File data sources The File category provides the following data connections: Excel Workbook Text/CSV XML JSON Folder PDF Parquet SharePoint folder Box 2: Delete the Content column Deleting the Content column keeps the file metadata (such as name, extension, and date modified) without loading the data of the files themselves. Reference: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-data-sources 31. You have a PBIX file that imports data from a Microsoft Excel data source stored in a file share on a local network. You are notified that the Excel data source was moved to a new location. You need to update the PBIX file to use the new location. What are three ways to achieve the goal?
Each correct answer presents a complete solution. A. From the Datasets settings of the Power BI service, configure the data source credentials. B. From the Data source settings in Power BI Desktop, configure the file path. C. From Current File in Power BI Desktop, configure the Data Load settings. D. From Power Query Editor, use the formula bar to configure the file path for the applied step. E. From Advanced Editor in Power Query Editor, configure the file path in the M code. Explanation: B: 1. Open the file in Power BI Desktop. 2. Go to File > Options and settings > Data source settings. 3. Ensure that 'Data sources in current file' is selected, choose the data source that needs to be changed, then click Change Source. Reference: https://community.fabric.microsoft.com/t5/Service/How-to-change-Data-source-in-Power-Bi-service/m-p/712947 32. You use Power Query Editor to preview a query that contains sales order data in the following columns: Tax Amount Sales Order ID Freight Amount Subtotal Amount Total Item Quantity The Sales Order ID column uniquely identifies each sales order. The Subtotal Amount and Total Item Quantity columns are always populated, but the Tax Amount and Freight Amount columns are sometimes null when an order has no associated amount. You need to query the data to identify the following metrics by month: The average item quantity per order The average freight amount per order The maximum item quantity per order How should you modify the query? A. In the Total Item Quantity column, replace the null values with 0. B. In the Tax Amount column, remove rows that contain null values. C. In the Freight Amount column, remove rows that contain null values. D. In the Freight Amount column, replace the null values with 0. Explanation: D: Replacing the null values in the Freight Amount column with 0 keeps every row and treats orders with no freight as a zero amount when the average freight amount per order is calculated. Incorrect: Not A: No null values in the Total Item Quantity column. Not B: Tax Amount not used in the metrics. Not C: We need to use the rows for calculations, not remove them. 33.
HOTSPOT You have a Power BI semantic model that contains the data sources shown in the following table. You need to configure the privacy levels of the data sources. What should you configure for each data source? To answer, select the appropriate options in the answer area. NOTE: Each correct answer is worth one point. Explanation: Box 1: Private Employee review data: Privacy Levels for Cloud Data Sources There are three levels: Private, Organizational, and Public. Private data sources are completely isolated from other data sources. Organizational data sources are isolated from all public data sources but are visible to other organizational data sources. And public data sources are not isolated at all. Box 2: Organizational Sales opportunities: Reference: https://powerbi.microsoft.com/en-us/blog/privacy-levels-for-cloud-data-sources 34. You plan to use Power BI Desktop to create a bug tracking dashboard that will pull data from Analytics in Azure DevOps. From Power BI Desktop, you need to configure a data connector to authenticate to Azure DevOps. The solution must meet the following requirements: Use Analytics views. Filter data from the cloud. Which connector should you use? A. OData queries B. Azure DevOps (Boards only) C. Azure DevOps Server (Boards only) D. OData Feed Explanation: Azure DevOps, Connect Analytics with Power BI Data Connector You can establish a seamless connection between Power BI and Azure DevOps by using Data Connector. This integration allows you to extract valuable insights from your Azure DevOps data and create compelling reports within Power BI. Power BI Data Connector uses Analytics views. Important Analytics views only support Azure Boards data (work items). Analytics views don't support other data types, such as pipelines. Connect to an Analytics view 1. Open Power BI Desktop. 2. Sign in to the service. Upon first-time access, you're required to sign in and have your credentials authenticated.
Choose between Windows or Personal Access Token to authenticate. Power BI Desktop saves your credentials so you only have to do so once. 3. Select Connect upon verification of your credentials. 4. Select Get Data > Online Services, Azure DevOps (Boards only) for cloud services or select Azure DevOps Server (Boards only) for on-premises. Then select Connect. Analytics views only support queries against work items and test cases. 5. Specify the basic parameters to connect to your data. Reference: https://learn.microsoft.com/en-us/azure/devops/report/powerbi/data-connector-connect 35. HOTSPOT You use Power Query Editor to preview the data shown in the following exhibit. You confirm that the data will always start on row 3, and row 3 will always contain the column names. How should you shape the query? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Explanation: Box 1: Remove top rows To configure the query to start on row 3, select Remove top rows and remove the first two rows. Why do we want to remove rows from our reports? Power BI is a useful tool that can do some automated processing on reports when we import them. However, this process isn't typically sufficient, and there will often be cases where we will have to make manual adjustments to a report after it has been exported from the source. This can happen when reports are exported from the data source in formats that are more difficult to interpret, such as a text file, or when they include elements that aren't part of the data set, like page numbers or report dates. It can also occur when the original data contains an incorrect entry. One of the easiest ways to correct these issues is simply to remove the rows that contain them. We can do that in Power Query Editor using a function called Remove Rows.
(If you are unfamiliar with Power Query Editor, it may be beneficial to take a moment to learn a bit about it, but it's not necessary to complete the tutorial we will be covering here.) The Remove Rows function can be used to complete these six different types of row removals: Remove Blank Rows: removes every blank row within a query Remove Top Rows: removes a specified number of rows from the top of a report Remove Bottom Rows: removes a specified number of rows from the bottom of a report Remove Errors: removes rows that contain errors within a specified column or column(s) Remove Duplicates: removes rows that contain identical values within a specified column or column(s) Remove Alternate Rows: removes and keeps rows within a query using a specified pattern. Box 2: Use first row as headers Promote the First Row to Headers When Power Query Editor connects to a source it will typically use the first row of the data set as column headers. However, in cases where the values in the first row are not formatted using the same conventions as the values that are in the data set, Power Query Editor will not automatically assign column headers. We need to properly set the column headers in order to set the data type in the next step. If we do not, Power Query Editor will incorrectly flag the Salary column header as an error when we change the data type of the Salary column from Text to Fixed Decimal Number. 1. Navigate to the Transform tab 2. Click the Use First Row as Headers button Reference: https://iterationinsights.com/article/power-bi-basics-cleansing-reports-using-the-remove-rows-function-in-power-query-editor/ 36. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. You have a data source that contains a column. The column contains case sensitive data. You have a Power BI semantic model in DirectQuery mode. You connect to the model and discover that it contains undefined values and errors. You need to resolve the issue. Solution: You implicitly convert the values into the required type. Does this meet the goal? A. Yes B. No Explanation: Correct: * You normalize casing in the source query or Power Query Editor. Because the engine that stores and queries data in Power BI is case insensitive, take special care when you work in DirectQuery mode with a case-sensitive source. Power BI assumes that the source has eliminated duplicate rows. Because Power BI is case insensitive, it treats two values that differ only by case as duplicate, whereas the source might not treat them as such. In such cases, the final result is undefined. To avoid this situation, if you use DirectQuery mode with a case-sensitive data source, normalize casing in the source query or in Power Query Editor. Incorrect: * You add an index key and normalize casing in the data source. * You change the semantic model mode. * You implicitly convert the values into the required type. Reference: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-data-types 43 37. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. You have a data source that contains a column. The column contains case sensitive data. 
You have a Power BI semantic model in DirectQuery mode. You connect to the model and discover that it contains undefined values and errors. You need to resolve the issue. Solution: You change the semantic model mode. Does this meet the goal? A. Yes B. No Explanation: Correct: * You normalize casing in the source query or Power Query Editor. Because the engine that stores and queries data in Power BI is case insensitive, take special care when you work in DirectQuery mode with a case-sensitive source. Power BI assumes that the source has eliminated duplicate rows. Because Power BI is case insensitive, it treats two values that differ only by case as duplicate, whereas the source might not treat them as such. In such cases, the final result is undefined. To avoid this situation, if you use DirectQuery mode with a case-sensitive data source, normalize casing in the source query or in Power Query Editor. Incorrect: * You add an index key and normalize casing in the data source. * You change the semantic model mode. * You implicitly convert the values into the required type. Reference: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-data-types 38. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. You have a data source that contains a column. The column contains case sensitive data. You have a Power BI semantic model in DirectQuery mode. You connect to the model and discover that it contains undefined values and errors. 44 You need to resolve the issue. Solution: You normalize casing in the source query or Power Query Editor. Does this meet the goal? A. Yes B. No 39. 
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. You have a data source that contains a column. The column contains case sensitive data. You have a Power BI semantic model in DirectQuery mode. You connect to the model and discover that it contains undefined values and errors. You need to resolve the issue. Solution: You add an index key and normalize casing in the data source. Does this meet the goal? A. Yes B. No Explanation: Correct: * You normalize casing in the source query or Power Query Editor. Because the engine that stores and queries data in Power BI is case insensitive, take special care when you work in DirectQuery mode with a case-sensitive source. Power BI assumes that the source has eliminated duplicate rows. Because Power BI is case insensitive, it treats two values that differ only by case as duplicate, whereas the source might not treat them as such. In such cases, the final result is undefined. To avoid this situation, if you use DirectQuery mode with a case-sensitive data source, normalize casing in the source query or in Power Query Editor. Incorrect: * You add an index key and normalize casing in the data source. * You change the semantic model mode. * You implicitly convert the values into the required type. Reference: 45 https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-data-types 40. You have a Microsoft Excel file in a Microsoft OneDrive folder. The file must be imported to a Power BI semantic model. You need to ensure that the semantic model can be refreshed in PowerBi.com. Which two connectors can you use to connect to the file? 
Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. Web B. Excel Workbook C. Folder D. Text/CSV E. SharePoint folder Explanation: B: Connect to an Excel workbook from Power Query Desktop To make the connection from Power Query Desktop: 1. Select the Excel option in the connector selection. 2. Browse for and select the Excel workbook you want to load. Then select Open. 3. Etc. C: Folder connector capabilities supported Folder path Combine Combine and load Combine and transform Connect to a folder from Power Query Online To connect to a folder from Power Query Online: 1. Select the Folder option in the connector selection. 2. Enter the path to the folder you want to load. Reference: https://docs.microsoft.com/en-us/power-query/connectors/excel https://docs.microsoft.com/en-us/power-query/connectors/folder Model the Data 1. You are creating a report in Power BI Desktop. You load a data extract that includes a free text field named col1. You need to analyze the frequency distribution of the string lengths in col1. The solution must not affect the size of the model. What should you do? A. In the report, add a DAX calculated column that calculates the length of col1 B. In the report, add a DAX function that calculates the average length of col1 C. From Power Query Editor, add a column that calculates the length of col1 D. From Power Query Editor, change the distribution for the Column profile to group by length for col1 Explanation: D: In Power Query Editor, the Column profile view can group a column's value distribution by text length, which shows the frequency distribution of string lengths without adding any column to the model. Adding a calculated column (options A and C) would increase the size of the model. Note: The LEN DAX function returns the number of characters in a text string. DAX is a collection of Power BI functions, operators, and constants that can be used in a formula, or expression, to calculate and return one or more values. Stated more simply, DAX helps you create new information from data already in your model. Reference: https://docs.microsoft.com/en-us/dax/len-function-dax 2. You have a collection of reports for the HR department of your company.
The datasets use row-level security (RLS). The company has multiple sales regions. Each sales region has an HR manager. You need to ensure that the HR managers can interact with the data from their region only. The HR managers must be prevented from changing the layout of the reports. How should you provision access to the reports for the HR managers? A. Publish the reports in an app and grant the HR managers access permission. B. Create a new workspace, copy the datasets and reports, and add the HR managers as members of the workspace. C. Publish the reports to a different workspace other than the one hosting the datasets. D. Add the HR managers as members of the existing workspace that hosts the reports and the datasets. Reference: https://kunaltripathy.com/2021/10/06/bring-your-power-bi-to-power-apps-portal-part-ii/ 3. You need to provide a user with the ability to add users to the member role of a workspace. The solution must use the principle of least privilege. Which role should you assign to the user? A. Viewer B. Admin C. Contributor D. Member Explanation: The Member role allows adding members or others with lower permissions to the workspace. Reference: https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-roles-new-workspaces 4. You have a Power BI query named Sales that imports the columns shown in the following table. Users only use the date part of the Sales_Date field. Only rows with a Status of Finished are used in analysis. You need to reduce the load times of the query without affecting the analysis. Which two actions achieve this goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. Remove the rows in which Sales [Status] has a value of Canceled. B. Remove Sales [Sales Date]. C. Change the data type of Sales [Delivery Time] to Integer. D. Split Sales [Sale Date] into separate date and time columns. E. Remove Sales [Canceled Date].
Explanation: A: Removing rows that are not used in the analysis reduces the amount of data loaded and improves query performance. D: Splitting the Sales_Date column into separate date and time columns reduces column cardinality and makes comparisons on the sales date faster. 5. You build a report to analyze customer transactions from a database that contains the tables shown in the following table. You import the tables. Which relationship should you use to link the tables? A. one-to-many from Transaction to Customer B. one-to-one between Customer and Transaction C. many-to-many between Customer and Transaction D. one-to-many from Customer to Transaction Explanation: One on the primary key side (the Customer table), many on the foreign key side (the Transaction table) of the relationship. 6. You have a custom connector that returns ID, From, To, Subject, Body, and Has Attachments for every email sent during the past year. More than 10 million records are returned. You build a report analyzing the internal networks of employees based on whom they send emails to. You need to prevent report recipients from reading the analyzed emails. The solution must minimize the model size. What should you do? A. From Model view, set the Subject and Body columns to Hidden. B. Remove the Subject and Body columns during the import. C. Implement row-level security (RLS) so that the report recipients can only see results based on the emails they sent. Explanation: B: The Subject and Body columns are not needed in the report. Dropping them during the import resolves the security problem and minimizes the model size. 7. HOTSPOT You create a Power BI semantic model that contains the table shown in the following exhibit. You need to make the table available as an organizational data type in Microsoft Excel. How should you configure the properties of the table? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Explanation: Box 1: Cost Center The Row label field value is used in Excel so users can easily identify the row.
It appears as the cell value for a linked cell, in the Data Selector pane, and in the Information card. Box 2: ID The Key column field value provides the unique ID for the row. This value enables Excel to link a cell to a specific row in the table. Box 3: Yes In the Data Types Gallery in Excel, your users can find data from featured tables in your Power BI datasets. Reference: https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-create-excel-featured-tables 52 8. You have the Power BI model shown in the following exhibit. A manager can represent only a single country. You need to use row-level security (RLS) to meet the following requirements: The managers must only see the data of their respective country. The number of RLS roles must be minimized. Which two actions should you perform? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. Create a single role that filters Country [Manager Email] by using the USERNAME DAX function. B. Create a single role that filters Country [Manager Email] by using the USEROBJECTID DAX function. C. For the relationship between Purchase Detail and Purchase, select Apply security filter in both directions. D. Create one role for each country. E. For the relationship between Purchase and Purchase Detail, change the Cross filter direction to Single. Explanation: A: You can take advantage of the DAX functions username() or userprincipalname() within your dataset. You can use them within expressions in Power BI Desktop. When you publish your model, it will be used within the Power BI service. Note: To define security roles, follow these steps. Import data into your Power BI Desktop report, or configure a DirectQuery connection. 1. From the Modeling tab, select Manage Roles. 2. From the Manage roles window, select Create. 53 3. Under Roles, provide a name for the role. 4. Under Tables, select the table to which you want to apply a DAX rule. 5. 
In the Table filter DAX expression box, enter the DAX expressions. This expression returns a value of true or false. For example: [Entity ID] = “Value”. 6. After you've created the DAX expression, select the checkmark above the expression box to validate the expression. Note: You can use username() within this expression. 7. Select Save. C: By default, row-level security filtering uses single-directional filters, whether the relationships are set to single direction or bi-directional. You can manually enable bi-directional cross-filtering with row-level security by selecting the relationship and checking the Apply security filter in both directions checkbox. Select this option when you've also implemented dynamic row-level security at the server level, where row-level security is based on username or login ID. Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls 9. HOTSPOT You have a Power BI imported dataset that contains the data model shown in the following exhibit. 54 Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point. Explanation: Box 1: Assume Referential Integrity 55 When connecting to a data source using DirectQuery, you can use the Assume Referential Integrity selection to enable running more efficient queries against your data source. This feature has a few requirements of the underlying data, and it is only available when using DirectQuery. Note: The following requirements are necessary for Assume referential integrity to work properly: Data in the From column in the relationship is never Null or blank For each value in the From column, there is a corresponding value in the To column Box 2: Star schema Star schema is a mature modeling approach widely adopted by relational data warehouses. It requires modelers to classify their model tables as either dimension or fact. 
Generally, dimension tables contain a relatively small number of rows. Fact tables, on the other hand, can contain a very large number of rows and continue to grow over time. Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-assume-referential-integrity https://docs.microsoft.com/en-us/power-bi/guidance/star-schema 10. HOTSPOT You have a Power BI model that contains a table named Sales and a related date table. Sales contains a measure named Total Sales. You need to create a measure that calculates the total sales from the equivalent month of the previous year. How should you complete the calculation? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Explanation: Box 1: CALCULATE Box 2: PARALLELPERIOD PARALLELPERIOD returns a table that contains a column of dates that represents a period parallel to the dates in the specified dates column, in the current context, with the dates shifted a number of intervals either forward in time or back in time. Syntax: PARALLELPERIOD(<dates>, <number_of_intervals>, <interval>) dates: A column that contains dates. number_of_intervals: An integer that specifies the number of intervals to shift the dates. interval: The interval by which to shift the dates. The value for interval can be one of the following: year, quarter, month. Incorrect: SAMEPERIODLASTYEAR returns a table that contains a column of dates shifted one year back in time from the dates in the specified dates column, in the current context. Syntax: SAMEPERIODLASTYEAR(<dates>) DATESMTD returns a table that contains a column of the dates for the month to date, in the current context. Syntax: DATESMTD(<dates>) Box 3: 'DATE' [Month] Reference: https://docs.microsoft.com/en-us/dax/parallelperiod-function-dax https://docs.microsoft.com/en-us/dax/sameperiodlastyear-function-dax 11. DRAG DROP You plan to create a report that will display sales data from the last year for multiple regions. You need to restrict access to individual rows of the data on a per-region basis by using roles.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Explanation:
You can define roles and rules within Power BI Desktop. When you publish to Power BI, it also publishes the role definitions. To define security roles, follow these steps.
1. Import data into your Power BI Desktop report. (Step 1)
2. From the Modeling tab, select Manage Roles.
3. From the Manage roles window, select Create. (Step 2)
4. Under Roles, provide a name for the role.
5. Under Tables, select the table to which you want to apply a DAX rule.
6. In the Table filter DAX expression box, enter the DAX expression. This expression returns a value of true or false. For example: [Entity ID] = "Value".
7. After you've created the DAX expression, select the checkmark above the expression box to validate the expression.
8. Select Save.
Step 3: Assign users to the role. You can't assign users to a role within Power BI Desktop; you assign them in the Power BI service. After you've created your roles, test the results of the roles within Power BI Desktop.
Step 4: Publish the report. Now that you're done validating the roles in Power BI Desktop, publish your report to the Power BI service.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
12. DRAG DROP
You create a data model in Power BI. Report developers and users provide feedback that the data model is too complex. The model contains the following tables.
The model has the following relationships:
- There is a one-to-one relationship between Sales_Region and Region_Manager.
- There are more records in Manager than in Region_Manager, but every record in Region_Manager has a corresponding record in Manager.
- There are more records in Sales_Manager than in Sales_Region, but every record in Sales_Region has a corresponding record in Sales_Manager.
You need to denormalize the model into a single table. Only managers who are associated to a sales region must be included in the reports. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Explanation:
Step 1: Merge [Sales_Region] and [Sales_Manager] by using an inner join.
Inner join: Returns only the rows that have a match in both the left and right tables; rows without a match are excluded.
Note: Sales_Region and Sales_Manager: There are more records in Sales_Manager than in Sales_Region, but every record in Sales_Region has a corresponding record in Sales_Manager.
Step 2: Merge [Region_Manager] and [Manager] by using an inner join. Only managers who are associated to a sales region must be included in the reports.
Note: Region_Manager and Manager: There are more records in Manager than in Region_Manager, but every record in Region_Manager has a corresponding record in Manager.
Step 3: Merge [Sales_Region] and [Region_Manager] by using a right join as a new query named [Sales_Region_and_Region_Manager]. There is a one-to-one relationship between Sales_Region and Region_Manager.
Reference: https://www.tutorialgateway.org/joins-in-power-bi/
13. You have a Microsoft Power BI report. The size of the PBIX file is 550 MB. The report is accessed by using an App workspace in shared capacity on powerbi.com. The report uses a semantic model containing tables that use the Import storage mode. One of the tables is a fact table that contains 12 million rows. The semantic model is scheduled to refresh twice a day, at 08:00 and 17:00. The report is a single page that contains 15 AppSource visuals and 10 default visuals. Users say that the report is slow to load the visuals when they access and interact with the report.
You need to recommend a solution to improve the performance of the report. What should you recommend?
A. Change any DAX measures to use iterator functions.
B. Enable visual interactions.
C. Replace the default visuals with AppSource visuals.
D. Split the visuals onto multiple pages.
Explanation:
For this question, the correct answers are:
- Remove unused columns from tables in the data model.
- Split the visuals onto multiple pages. A single page with many visuals can make your report slow to load, so reduce the number of visuals on each page where appropriate.
For this question, the incorrect answers are:
- Change any DAX measures to use iterator functions.
- Change the imported dataset to DirectQuery.
- Enable visual interactions. Visual interactions would not affect the loading of the visuals.
- Implement row-level security (RLS). Performance is not related to security.
- Increase the number of times that the dataset is refreshed. Increasing the refresh rate would decrease performance.
- Replace the default visuals with AppSource visuals. The default visuals are certified by Microsoft, as are the AppSource visuals.
Details: Remove unused columns from tables in the data model. There are different techniques to help reduce the data loaded into Import models. Despite the efficiencies achieved by the VertiPaq storage engine, it is important that you strive to minimize the data that is loaded into your models. This is especially true for large models, or models that you anticipate will grow to become large over time. In particular, you can remove unnecessary columns. Model table columns serve two main purposes:
- Reporting, to achieve report designs that appropriately filter, group, and summarize model data
- Model structure, by supporting model relationships, model calculations, security roles, and even data color formatting
Columns that don't serve these purposes can probably be removed. Removing columns is referred to as vertical filtering.
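As a minimal sketch of vertical filtering, a Power Query (M) query can select only the needed columns before the data is loaded into the model; the server, database, table, and column names below are illustrative assumptions, not part of the question:

```m
let
    // Connect to the source (server and database names are illustrative)
    Source = Sql.Database("ServerName", "SalesDB"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Vertical filtering: keep only the columns the reports actually use
    Trimmed = Table.SelectColumns(Sales, {"OrderDate", "ProductKey", "SalesAmount"})
in
    Trimmed
```

Selecting the required columns explicitly, rather than removing unwanted ones one by one, also keeps the query stable if new columns are later added to the source table.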
We recommend that you design models with exactly the right number of columns based on the known reporting requirements. Your requirements may change over time, but bear in mind that it's easier to add columns later than it is to remove them: removing columns can break reports or the model structure.
Note: You can optimize model performance in several ways:
- Remove unnecessary rows and columns
- Identify poorly performing measures, relationships, and visuals
- Reduce cardinality levels to improve performance
Use the minimum data necessary to display information. The more data a visual has to display, the longer it takes to load. It seems obvious, but sometimes we overlook this. Most of the time a visual does not need a vast amount of data for the user to understand the message, so it is preferable to reduce the data to the minimum necessary to increase loading speed. If you are concerned that a performance improvement might have a negative impact on the user experience, you can choose to keep a larger amount of data than the user will need, rather than leaving the default option that keeps the datasets whole. Remember: less is more.
Reference:
https://docs.microsoft.com/en-us/power-bi/guidance/import-modeling-data-reduction#remove-unnecessary-columns
https://community.powerbi.com/t5/Desktop/Visuals-are-loading-extremely-slow/td-p/1565668
14. HOTSPOT
You are creating a Microsoft Power BI imported data model to perform basket analysis. The goal of the analysis is to identify which products are usually bought together in the same transaction across and within sales territories. You import a fact table named Sales as shown in the exhibit. (Click the Exhibit tab.) The related dimension tables are imported into the model. Sales contains the data shown in the following table. You are evaluating how to optimize the model. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Explanation:
Box 1: Yes
Those two columns are not needed for the analysis.
Box 2: No
The surrogate key OrderDateKey can be removed from the analysis.
Box 3: No
The tax charged is not relevant to the analysis.
15. You have a Microsoft Power BI data model that contains three tables named Orders, Date, and City. There is a one-to-many relationship between Date and Orders and between City and Orders. The model contains two row-level security (RLS) roles named Role1 and Role2. Role1 contains the following filter.
City[State Province] = "Kentucky"
Role2 contains the following filter.
Date[Calendar Year] = 2020
If a user is a member of both Role1 and Role2, what data will they see in a report that uses the model?
A. The user will see data for which the State Province value is Kentucky or where the Calendar Year is 2020.
B. The user will receive an error and will not be able to see the data in the report.
C. The user will only see data for which the State Province value is Kentucky.
D. The user will only see data for which the State Province value is Kentucky and the Calendar Year is 2020.
Explanation:
Row-level security (RLS) with Power BI can be used to restrict data access for given users. Filters restrict data access at the row level, and you can define filters within roles. When a user is a member of multiple roles, the role filters are additive: the user sees rows that satisfy either filter.
Incorrect:
Not B: A model relationship is limited when there's no guaranteed "one" side. You get an error message if you belong to multiple RLS roles and at least one of the roles relies on a limited relationship. But here both relationships have a guaranteed "one" side.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
16. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals.
Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: From Power Query Editor, you import the table and then add a filter step to the query.
Does this meet the goal?
A. Yes
B. No
Explanation:
This would load the entire table in the first step. Instead, you should add a WHERE clause to the SQL statement.
Reference: https://docs.microsoft.com/en-us/power-query/native-database-query
17. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: You write a DAX expression that uses the FILTER function.
Does this meet the goal?
A. Yes
B. No
Explanation:
Instead, you should add a WHERE clause to the SQL statement.
Note: DAX is not a language designed to fetch data the way SQL is; rather, it is used for data analysis purposes. It is always a better and recommended approach to transform the data as close to the data source as possible.
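Consistent with that recommendation, the sampling in this scenario can be pushed to SQL Server with a native database query; a minimal Power Query (M) sketch, in which the server, database, column names, and date filter are illustrative assumptions:

```m
let
    // Native database query with a WHERE clause so only a sample is imported
    // (server, database, column names, and filter value are illustrative)
    Source = Sql.Database(
        "ServerName",
        "SalesDB",
        [Query = "SELECT OrderID, OrderDate, Amount FROM dbo.[Order] WHERE OrderDate >= '2023-01-01'"]
    )
in
    Source
```

Because the WHERE clause runs at the source, only the sampled rows are transferred and loaded into the model.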
For example, if your data source is a relational database, it's better to go with T-SQL. SQL is a structured query language, whereas DAX is a formula language used for data analysis purposes. When our data is stored in structured database systems such as SQL Server or MySQL, we use SQL to fetch the stored data.
Reference: https://www.learndax.com/dax-vs-sql-when-to-use-dax-over-sql/
18. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: You add a WHERE clause to the SQL statement.
Does this meet the goal?
A. Yes
B. No
Explanation:
Power Query enables you to specify your native database query in a text box under Advanced options when connecting to a database. In the example below, you'll import data from a SQL Server database using a native database query entered in the SQL statement text box.
1. Connect to a SQL Server database using Power Query. Select the SQL Server database option in the connector selection.
2. In the SQL Server database popup window, specify the Server and Database where you want to import data from by using a native database query.
3. Under Advanced options, select the SQL statement field and paste or enter your native database query, then select OK.
Reference: https://docs.microsoft.com/en-us/power-query/native-database-query
19.
DRAG DROP
You are preparing a financial report in Power BI. You connect to the data stored in a Microsoft Excel spreadsheet by using Power Query Editor as shown in the following exhibit. You need to prepare the data to support the following:
- Visualizations that include all measures in the data over time
- Year-over-year calculations for all the measures
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Reference: https://docs.microsoft.com/en-us/power-query/unpivot-column
20. HOTSPOT
You are creating an analytics report that will consume data from the tables shown in the following table. There is a relationship between the tables. There are no reporting requirements on employee_id and employee_photo. You need to optimize the data model. What should you configure for employee_id and employee_photo? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Explanation:
Box 1: Hide
The employee_id column is needed for the relationship, so it cannot be deleted.
Box 2: Delete
Reference: https://community.powerbi.com/t5/Desktop/How-to-Hide-a-Column-in-power-Bi/m-p/414470
21. HOTSPOT
You plan to create a Power BI dataset to analyze attendance at a school. Data will come from two separate views named View1 and View2 in an Azure SQL database. View1 contains the columns shown in the following table. View2 contains the columns shown in the following table. The views can be related based on the Class ID column. Class ID is the unique identifier for the specified class, period, teacher, and school year. For example, the same class can be taught by the same teacher during two different periods, but the class will have a different class ID. You need to design a star schema data model by using the data in both views.
The solution must facilitate the following analysis:
- The count of classes that occur by period
- The count of students in attendance by period by day
- The average number of students attending a class each month
In which table should you include the Teacher First Name and Period Number fields? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Explanation:
Box 1: Teacher fact
Fact tables store observations or events, and can be sales orders, stock balances, exchange rates, temperatures, and so on. A fact table contains dimension key columns that relate to dimension tables, and numeric measure columns.
Note: Star schema is a mature modeling approach widely adopted by relational data warehouses. It requires modelers to classify their model tables as either dimension or fact.
Box 2: Attendance fact
Incorrect: Dimension tables describe business entities, the things you model. Entities can include products, people, places, and concepts including time itself. The most consistent table you'll find in a star schema is a date dimension table. A dimension table contains a key column (or columns) that acts as a unique identifier, and descriptive columns.
Reference: https://docs.microsoft.com/en-us/power-bi/guidance/star-schema
22. You have the Power BI model shown in the following exhibit. There are four departments in the Departments table. You need to ensure that users can see the data of their respective department only. What should you do?
A. Create a slicer that filters Departments based on DepartmentID.
B. Create a row-level security (RLS) role for each department, and then define the membership of the role.
C. Create a Department ID parameter to filter the Departments table.
D. To the ConfidentialData table, add a calculated measure that uses the CURRENTGROUP DAX function.
Explanation:
Row-level security (RLS) with Power BI can be used to restrict data access for given users.
Filters restrict data access at the row level, and you can define filters within roles.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
23. In Power BI Desktop, you are building a sales report that contains two tables. Both tables have row-level security (RLS) configured. You need to create a relationship between the tables. The solution must ensure that bidirectional cross-filtering honors the RLS settings. What should you do?
A. Create an inactive relationship between the tables and select Apply security filter in both directions.
B. Create an active relationship between the tables and select Apply security filter in both directions.
C. Create an inactive relationship between the tables and select Assume referential integrity.
D. Create an active relationship between the tables and select Assume referential integrity.
Explanation:
By default, row-level security filtering uses single-directional filters, whether the relationships are set to single direction or bi-directional. You can manually enable bi-directional cross-filtering with row-level security by selecting the relationship and checking the Apply security filter in both directions checkbox. Select this option when you've also implemented dynamic row-level security at the server level, where row-level security is based on username or login ID.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
24. HOTSPOT
You have a column named UnitsInStock as shown in the following exhibit. UnitsInStock has 75 non-null values, of which 51 are unique. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.
Explanation:
Box 1: 75 rows
Is nullable allows NULL values in the column.
Box 2: reduce
Reference: https://blog.crossjoin.co.uk/2019/01/20/is-nullable-column-property-power-bi/
25. HOTSPOT
You have a Power BI report.
You have the following tables. You have the following DAX measure.
Accounts := CALCULATE ( DISTINCTCOUNT ( Balances[AccountID] ), LASTDATE ( 'Date'[Date] ) )
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Explanation:
Box 1: No
It will show the total number of accounts that were live on the last day of the year only.
Note: DISTINCTCOUNT counts the number of distinct values in a column. LASTDATE returns the last date in the current context for the specified column of dates.
Box 2: No
It will show the total number of accounts that were live on the last day of the month only.
Box 3: Yes
Reference:
https://docs.microsoft.com/en-us/dax/distinctcount-function-dax
https://docs.microsoft.com/en-us/dax/lastdate-function-dax
26. You have the tables shown in the following table. The Impressions table contains approximately 30 million records per month. You need to create an ad analytics system to meet the following requirements:
- Present ad impression counts for the day, campaign, and site_name. The analytics for the last year are required.
- Minimize the data model size.
Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Create one-to-many relationships between the tables.
B. Group the Impressions query in Power Query by Ad id, Site name, and Impression date. Aggregate by using the Count Rows function.
C. Create a calculated table that contains Ad id, Site name, and Impression date.
D. Create a calculated measure that aggregates by using the COUNTROWS function.
Explanation:
Incorrect:
Not C: A calculated table would increase the data model size.
Not D: The data must still be grouped by Impression date and the other columns; a measure alone does not reduce the model size.
27. HOTSPOT
You are creating a Microsoft Power BI data model that has the tables shown in the following table. The Products table is related to the ProductCategory table through the ProductCategoryID column.
Each product has one product category. You need to ensure that you can analyze sales by product category. How should you configure the relationship from ProductCategory to Products? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Explanation:
Box 1: One-to-many
The one-to-many and many-to-one cardinality options are essentially the same, and they're also the most common cardinality types.
Incorrect: A many-to-many relationship means both columns can contain duplicate values. This cardinality type is infrequently used. It's typically useful when designing complex model requirements. You can use it to relate many-to-many facts or to relate higher-grain facts, for example, when sales target facts are stored at the product category level and the product dimension table is stored at the product level.
Box 2: Single
Incorrect: Bear in mind that bi-directional relationships can negatively impact performance. Furth