Cris Personal PL300 Exam Topics PDF

Summary

This document contains Power BI exam questions and answers, focusing on areas like data modeling, storage modes, and data connections in Power BI.

Full Transcript

Question #1 Topic 1 HOTSPOT - You plan to create the Power BI model shown in the exhibit. (Click the Exhibit tab.) The data has the following refresh requirements:
✑ Customer must be refreshed daily.
✑ Date must be refreshed once every three years.
✑ Sales must be refreshed in near real time.
✑ SalesAggregate must be refreshed once per week.
You need to select the storage modes for the tables. The solution must meet the following requirements:
✑ Minimize the load times of visuals.
✑ Ensure that the data is loaded to the model based on the refresh requirements.
Which storage mode should you select for each table? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Box 1: Dual - Customer should use the Dual storage mode. Tables with this setting can act as either cached or not cached, depending on the context of the query that is submitted to the Power BI dataset. In some cases, queries are fulfilled from cached data; in other cases, they are fulfilled by executing an on-demand query to the data source. Note: You set the Storage mode property to one of three values: Import, DirectQuery, and Dual.
Box 2: Dual - You can set the dimension tables (Customer, Geography, and Date) to Dual to reduce the number of limited relationships in the dataset and improve performance.
Box 3: DirectQuery - Sales should use the DirectQuery storage mode. Tables with this setting are not cached. Queries submitted to the Power BI dataset (for example, DAX queries) that return data from DirectQuery tables can be fulfilled only by executing on-demand queries to the data source. Queries submitted to the data source use the query language of that data source, for example, SQL.
Box 4: Import - Tables with this setting are cached. Queries submitted to the Power BI dataset that return data from Import tables can be fulfilled only from cached data.
Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-storage-mode
Dual (Composite) mode sits between Import and DirectQuery. It is a hybrid approach: like Import, Dual caches the data in the table, but it leaves it to Power BI to determine the best way to query the table depending on the query context.
1) Sales must be refreshed in near real time, so DirectQuery.
2) SalesAggregate is refreshed once per week (and performance is also required), so Import.
3) Date and Customer each have relationships with both the Sales and SalesAggregate tables, so Dual, to support performance for both DirectQuery (Sales) and Import (SalesAggregate).

Question #2 Topic 1
You have a project management app that is fully hosted in Microsoft Teams. The app was developed by using Microsoft Power Apps. You need to create a Power BI report that connects to the project management app. Which connector should you select?
A. Microsoft Teams Personal Analytics
B. SQL Server database
C. Dataverse
D. Dataflows
Correct Answer: C
You can use the Microsoft Power BI template to import data into Power BI from Project for the web and Project Online. When you use the template, you are connected to your Microsoft Dataverse instance, where your Microsoft Project web app data is stored.
References:
https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-data-sources
https://support.microsoft.com/en-us/office/use-power-bi-desktop-to-connect-with-your-project-data-df4ccca1-68e9-418c-9d0f-022ac05249a2

Question #3 Topic 1
For the sales department at your company, you publish a Power BI report that imports data from a Microsoft Excel file located in a Microsoft SharePoint folder.
The data model contains several measures. You need to create a Power BI report from the existing data. The solution must minimize development effort. Which type of data source should you use?
A. Power BI dataset
B. a SharePoint folder
C. Power BI dataflows
D. an Excel workbook
Correct Answer: B
Connect to a SharePoint folder from Power Query Desktop:
1. From Get Data, select SharePoint folder.
2. Paste the SharePoint site URL you copied in Determine the site URL into the Site URL text box in the SharePoint folder dialog box. In this example, the site URL is https://contoso.sharepoint.com/marketing/data. If the site URL you enter is invalid, a warning icon will appear next to the URL text box. Select OK to continue.
3. If this is the first time you have visited this site address, select the appropriate authentication method. Enter your credentials and choose which level to apply these settings to. Then select Connect.
4. When you select the SharePoint folder you want to use, file information about all of the files in that SharePoint folder is displayed, along with file information about any files in any subfolders.
Reference: https://docs.microsoft.com/en-us/power-query/connectors/sharepointfolder
It should be A (Power BI dataset), because the scenario states that a report is already published and that the data model contains measures. To be able to reuse the measures in the data model, you should connect to the existing dataset (created when you published the report) instead of starting from scratch with the files in the SharePoint folder. After reading the question several times, the biggest takeaway is that it asks directly for data: a SharePoint folder holds data files, but it is not the data itself, so the existing dataset is the stronger answer.

Question #4 Topic 1
You import two Microsoft Excel tables named Customer and Address into Power Query.
Customer contains the following columns:
✑ Customer ID
✑ Customer Name
✑ Phone
✑ Email Address
✑ Address ID
Address contains the following columns:
✑ Address ID
✑ Address Line 1
✑ Address Line 2
✑ City
✑ State/Region
✑ Country
✑ Postal Code
Each Customer ID represents a unique customer in the Customer table. Each Address ID represents a unique address in the Address table. You need to create a query that has one row per customer. Each row must contain City, State/Region, and Country for each customer. What should you do?
A. Merge the Customer and Address tables.
B. Group the Customer and Address tables by the Address ID column.
C. Transpose the Customer and Address tables.
D. Append the Customer and Address tables.
Correct Answer: A
A merge queries operation joins two existing tables together based on matching values from one or multiple columns. You can choose different types of joins, depending on the output you want. You essentially want to join the columns together, which is what Merge does.
Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-overview

Question #5 Topic 1 HOTSPOT
You have two Azure SQL databases that contain the same tables and columns. For each database, you create a query that retrieves data from a table named Customer. You need to combine the Customer tables into a single table. The solution must minimize the size of the data model and support scheduled refresh in powerbi.com. What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Append Queries as New - When you have additional rows of data that you would like to add to an existing query, you append the query. There are two append options:
* Append queries as new displays the Append dialog box to create a new query by appending multiple tables.
* Append queries displays the Append dialog box to add additional tables to the current query.
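Power Query's append is a row-wise union of tables that share the same columns. As a rough analogy only (pandas stands in for Power Query here, and the sample rows are invented for illustration), "Append Queries as New" behaves like this:

```python
import pandas as pd

# Two Customer queries with identical columns, one per Azure SQL database
customers_db1 = pd.DataFrame({"CustomerID": [1, 2], "Name": ["Ann", "Ben"]})
customers_db2 = pd.DataFrame({"CustomerID": [3], "Name": ["Cho"]})

# "Append Queries as New": stack the rows into a single new table
combined = pd.concat([customers_db1, customers_db2], ignore_index=True)
print(len(combined))  # 3 rows, one per customer across both sources
```

The two source queries would then have their load disabled, so only the combined table lands in the model.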
Incorrect: When you have one or more columns that you would like to add to another query, you merge the queries.
Box 2: Disable loading the query to the data model - By default, all queries from Query Editor are loaded into the memory of the Power BI model. You can disable the load for some queries, especially queries used as intermediate transformations to produce the final query for the model. Disabling load does not mean the query won't be refreshed; it only means the query won't be loaded into memory. When you click Refresh in Power BI, or when a scheduled refresh happens, even queries marked as Disable Load are refreshed, but their data is used as an intermediate source for other queries instead of being loaded directly into the model. This is a basic but important performance-tuning tip as your Power BI model grows.
References:
https://docs.microsoft.com/en-us/power-query/append-queries
https://radacad.com/performance-tip-for-power-bi-enable-load-sucks-memory-up
- Append Queries as New
- Disable loading the query to the data model

Question #6 Topic 1 DRAG DROP
In Power Query Editor, you have three queries named ProductCategory, ProductSubCategory, and Product. Every Product has a ProductSubCategory. Not every ProductSubCategory has a parent ProductCategory. You need to merge the three queries into a single query. The solution must ensure the best performance in Power Query. How should you merge the tables? To answer, drag the appropriate merge types to the correct queries. Each merge type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.
Select and Place:
Correct Answer:
Box 1: Inner - Every Product has a ProductSubCategory, so a standard join is needed.
One of the join kinds available in the Merge dialog box in Power Query is an inner join, which brings in only matching rows from both the left and right tables.
Box 2: Left outer - Not every ProductSubCategory has a parent ProductCategory. A left outer join keeps all the rows from the left table and brings in any matching rows from the right table.
References:
https://docs.microsoft.com/en-us/power-query/merge-queries-inner
https://docs.microsoft.com/en-us/power-query/merge-queries-left-outer
* The first option is an inner join because we want to keep only the rows where there is a matching ProductSubCategory for each Product.
* The second option is a left outer join because not every ProductSubCategory has a parent ProductCategory, so we want to keep all ProductSubCategories while matching them with any available ProductCategories.

Question #7 Topic 1
You are building a Power BI report that uses data from an Azure SQL database named erp1. You import the following tables. You need to perform the following analyses:
✑ Orders sold over time that include a measure of the total order value
✑ Orders by attributes of products sold
The solution must minimize update times when interacting with visuals in the report. What should you do first?
A. From Power Query, merge the Order Line Items query and the Products query.
B. Create a calculated column that adds a list of product categories to the Orders table by using a DAX function.
C. Calculate the count of orders per product by using a DAX function.
D. From Power Query, merge the Orders query and the Order Line Items query.
Correct Answer: D
A merge queries operation joins two existing tables together based on matching values from one or multiple columns. Join the Orders and the Order Line Items tables.
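The merge join kinds used in Questions #6 and #7 can be sketched with a pandas analogy (the table and column names here are invented to mirror the scenario, not taken from the exam exhibits):

```python
import pandas as pd

# One subcategory deliberately lacks a parent category (CategoryID is missing)
subcategory = pd.DataFrame({"SubCatID": [1, 2], "CategoryID": [10, None]})
category = pd.DataFrame({"CategoryID": [10], "CategoryName": ["Bikes"]})
product = pd.DataFrame({"ProductID": [100, 101], "SubCatID": [1, 2]})

# Every Product has a ProductSubCategory -> an inner join loses no products
prod_sub = product.merge(subcategory, on="SubCatID", how="inner")

# Not every ProductSubCategory has a parent ProductCategory -> a left outer
# join keeps the unmatched row; its CategoryName simply stays empty (NaN)
full = prod_sub.merge(category, on="CategoryID", how="left")
print(len(full))  # 2 products survive; one has no CategoryName
```

Swapping the second join to inner would silently drop the uncategorized product, which is exactly why the exam answer is left outer there.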
Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-overview
D is correct. This is the header/detail schema, and the most optimal approach is to flatten the header into the detail table. Source: https://www.sqlbi.com/articles/header-detail-vs-star-schema-models-in-tabular-and-power-bi/

Question #8 Topic 1
You have a Microsoft SharePoint Online site that contains several document libraries. One of the document libraries contains manufacturing reports saved as Microsoft Excel files. All the manufacturing reports have the same data structure. You need to use Power BI Desktop to load only the manufacturing reports to a table for analysis. What should you do?
A. Get data from a SharePoint folder and enter the site URL. Select Transform, then filter by the folder path to the manufacturing reports library.
B. Get data from a SharePoint list and enter the site URL. Select Combine & Transform, then filter by the folder path to the manufacturing reports library.
C. Get data from a SharePoint folder, enter the site URL, and then select Combine & Load.
D. Get data from a SharePoint list, enter the site URL, and then select Combine & Load.
Correct Answer: A
Note: Connect to a SharePoint folder from Power Query Desktop:
1. From Get Data, select SharePoint folder.
2. Paste the SharePoint site URL you copied in Determine the site URL into the Site URL text box in the SharePoint folder dialog box. In this example, the site URL is https://contoso.sharepoint.com/marketing/data. If the site URL you enter is invalid, a warning icon will appear next to the URL text box.
3. Select OK to continue.
4. If this is the first time you have visited this site address, select the appropriate authentication method.
Enter your credentials and choose which level to apply these settings to. Then select Connect.
5. When you select the SharePoint folder you want to use, file information about all of the files in that SharePoint folder is displayed, along with file information about any files in any subfolders.
6. Select Combine & Transform Data to combine the data in the files of the selected SharePoint folder and load the data into the Power Query Editor for editing. Or select Combine & Load to load the data from all of the files in the SharePoint folder directly into your app.
Reference: https://docs.microsoft.com/en-us/power-query/connectors/sharepointfolder
We have to import Excel files from SharePoint, so we need the SharePoint folder connector, which provides access to the files stored in the library. A SharePoint list is a collection of content that has rows and columns (like a table) and is used for task lists, calendars, and so on. Since we have to filter to only the manufacturing reports, we select Transform and then filter by the corresponding folder path.
To load only the manufacturing reports (Microsoft Excel files) from the specified SharePoint document library into Power BI Desktop for analysis: get data from a SharePoint folder and enter the site URL, then select Transform and filter by the folder path to the manufacturing reports library. Option B is not correct because it selects SharePoint list, and you need to access files in a document library, not a list. Options C and D both use Combine & Load, which combines all the files in the folder and loads them immediately. Since we only want to load files from a specific folder within the SharePoint library, we should use the Transform option to filter to the desired files before loading them into Power BI.
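The "Transform, then filter by folder path" step can be pictured as filtering the connector's file list before combining. This is a conceptual Python sketch, not the actual Power Query implementation; the site URL, library names, and file names are invented:

```python
# Metadata rows like those the SharePoint folder connector returns,
# one per file across every document library on the site
files = [
    {"Name": "jan.xlsx", "Folder Path": "https://contoso.sharepoint.com/sites/ops/Manufacturing Reports/"},
    {"Name": "feb.xlsx", "Folder Path": "https://contoso.sharepoint.com/sites/ops/Manufacturing Reports/"},
    {"Name": "hr.xlsx",  "Folder Path": "https://contoso.sharepoint.com/sites/ops/HR Docs/"},
]

# Filter by folder path first, so only the manufacturing reports are combined
manufacturing = [f for f in files if "Manufacturing Reports" in f["Folder Path"]]
print([f["Name"] for f in manufacturing])  # ['jan.xlsx', 'feb.xlsx']
```

Combine & Load skips this filtering step, which is why it would pull in files from every library.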
https://youtu.be/XuLnSYjmsJo

Question #9 Topic 1 DRAG DROP
You have a Microsoft Excel workbook that contains two sheets named Sheet1 and Sheet2. Sheet1 contains the following table named Table1. Sheet2 contains the following table named Table2. You need to use Power Query Editor to combine the products from Table1 and Table2 into the following table that has one column containing no duplicate values. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
Correct Answer:
1. Import from Excel
2. Append Table2 to Table1
3. Remove duplicates
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data

Question #10 Topic 1
You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do?
A. Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date.
B. Change the data type of the Logged column to Date.
C. Split the Logged column by using at as the delimiter.
D. Apply a transformation to extract the first 11 characters of the Logged column.
Correct Answer: D
Extract the date, which is the first 11 characters (the 10-character date plus a trailing space), and then set the column's type to Date. CSV files carry no data types. Note: A CSV is a comma-separated values file, which stores data in a tabular format. CSVs look like a garden-variety spreadsheet but have a .csv extension, and can be used with most spreadsheet programs, such as Microsoft Excel or Google Spreadsheets.
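Extracting the leading date portion of a value such as "2018-12-31 at 08:59" can be sketched as follows (Python stands in for the Power Query steps Transform > Extract > First Characters followed by setting the type to Date):

```python
from datetime import datetime

logged = "2018-12-31 at 08:59"

# The date is the first 10 characters; once typed as a date, the built-in
# date hierarchy (year/quarter/month/day) becomes available on the column
logged_date = datetime.strptime(logged[:10], "%Y-%m-%d").date()
print(logged_date)  # 2018-12-31
```

The time portion is simply discarded, because a date hierarchy only needs the date.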
Reference: https://www.bigcommerce.com/ecommerce-answers/what-csv-file-and-what-does-it-mean-my-ecommerce-business/
An alternative view: splitting the Logged column by using "at" as the delimiter (option C) would also separate the date and time into their own columns, after which the date column could be typed as Date and used with the built-in date hierarchy. Option A is incorrect because the last 11 characters of the value contain the time, not the date. Option B is incorrect because the data in the Logged column is in a non-standard format and cannot be converted directly to the Date data type.

Question #11 Topic 1
You have a Microsoft Excel file in a Microsoft OneDrive folder. The file must be imported to a Power BI dataset. You need to ensure that the dataset can be refreshed in powerbi.com. Which two connectors can you use to connect to the file? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.
A. Excel Workbook
B. Text/CSV
C. Folder
D. SharePoint folder
E. Web
Correct Answer: AC
A: Connect to an Excel workbook from Power Query Desktop:
1. Select the Excel option in the connector selection.
2. Browse for and select the Excel workbook you want to load. Then select Open.
C: Folder connector capabilities supported: folder path, combine, combine and load, combine and transform. To connect to a folder from Power Query Online:
1. Select the Folder option in the connector selection.
2. Enter the path to the folder you want to load.
References:
https://docs.microsoft.com/en-us/power-query/connectors/excel
https://docs.microsoft.com/en-us/power-query/connectors/folder
Community vote distribution: DE (83%).

Question #12 Topic 1 HOTSPOT
You are profiling data by using Power Query Editor. You have a table named Reports that contains a column named State. The distribution and quality data metrics for the data in State are shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: 69 - 69 distinct (different) values. Note: Column Distribution shows the overall distribution of values within a column in the data previews, including the count of distinct values (the total number of different values found in a given column) and unique values (the total number of values that appear only once in a given column).
Box 2: 4
Reference: https://systemmanagement.ro/2018/10/16/power-bi-data-profiling-distinct-vs-unique/

Question #13 Topic 1 HOTSPOT
You have two CSV files named Products and Categories. The Products file contains the following columns:
✑ ProductID
✑ ProductName
✑ SupplierID
✑ CategoryID
The Categories file contains the following columns:
✑ CategoryID
✑ CategoryName
✑ CategoryDescription
From Power BI Desktop, you import the files into Power Query Editor. You need to create a Power BI dataset that will contain a single table named Product. The Product table includes the following columns:
✑ ProductID
✑ ProductName
✑ SupplierID
✑ CategoryID
✑ CategoryName
✑ CategoryDescription
How should you combine the queries, and what should you do on the Categories query? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Merge - There are two primary ways of combining queries: merging and appending. When you have one or more columns that you would like to add to another query, you merge the queries. When you have additional rows of data that you would like to add to an existing query, you append the query.
Box 2: Disable the query load - Managing loading of queries: In many situations it makes sense to break down your data transformations into multiple queries. One popular example is merging, where you merge two queries into one to essentially do a join. In this type of situation, some queries are not relevant to load into Desktop because they are intermediate steps, while they are still required for your data transformations to work correctly. For these queries, you can make sure they are not loaded in Desktop by unchecking Enable load in the context menu of the query in Desktop, or on the Properties screen.
References:
https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data
https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-include-in-report-refresh
Combine the queries by performing a Merge. On the Categories query, disable the query load.

Question #14 Topic 1
You have an Azure SQL database that contains sales transactions. The database is updated frequently. You need to generate reports from the data to detect fraudulent transactions. The data must be visible within five minutes of an update. How should you configure the data connection?
A. Add a SQL statement.
B. Set the Command timeout in minutes setting.
C. Set Data Connectivity mode to Import.
D. Set Data Connectivity mode to DirectQuery.
Correct Answer: D
DirectQuery: No data is imported or copied into Power BI Desktop. For relational sources, the selected tables and columns appear in the Fields list.
For multidimensional sources like SAP Business Warehouse, the dimensions and measures of the selected cube appear in the Fields list. As you create or interact with a visualization, Power BI Desktop queries the underlying data source, so you are always viewing current data.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-use-directquery
Community vote distribution: D (100%). D is the correct answer. DirectQuery allows Power BI to query the data source (here, an Azure SQL database) in real time or near real time. When data is updated in the database, DirectQuery ensures that the reports reflect the most current data without the need to import and refresh the data into the Power BI model.

Question #15 Topic 1 DRAG DROP
You have a folder that contains 100 CSV files. You need to make the file metadata available as a single dataset by using Power BI. The solution must NOT store the data of the CSV files. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
Correct Answer:
Step 1: From Power BI Desktop, select Get Data, and then select Folder. Open Power BI Desktop, select Get Data > More…, and choose Folder from the All options on the left. Enter the folder path, select OK, and then select Transform data to see the folder's files in Power Query Editor.
Step 2: From Power Query Editor, expand the Attributes column.
Step 3: From Power Query Editor, combine the Content column. Combine files behavior: To combine binary files in Power Query Editor, select Content (the first column label) and select Home > Combine Files. Or you can just select the Combine Files icon next to Content.
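The metadata-only requirement of Question #15 (keep file attributes, never the file contents) can be pictured with a small Python sketch; the throwaway folder and file names are invented purely for the illustration:

```python
import csv
import os
import tempfile

# Create a throwaway folder with two CSV files, standing in for the 100 files
folder = tempfile.mkdtemp()
for name in ("a.csv", "b.csv"):
    with open(os.path.join(folder, name), "w", newline="") as f:
        csv.writer(f).writerow(["col1", "col2"])

# Collect only metadata (name, size, modified time); the contents are never read,
# which mirrors removing the Content column and keeping the Attributes column
metadata = [
    {"Name": e.name, "Size": e.stat().st_size, "Modified": e.stat().st_mtime}
    for e in os.scandir(folder) if e.name.endswith(".csv")
]
print(sorted(m["Name"] for m in metadata))  # ['a.csv', 'b.csv']
```

Combining the Content column, by contrast, would load the data itself, which the scenario forbids.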
Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-combine-binaries
Because the solution must not store the data of the CSV files, the community-suggested sequence is: get data and select Folder; remove the Content column; expand the Attributes column.

Question #16 Topic 1
A business intelligence (BI) developer creates a dataflow in Power BI that uses DirectQuery to access tables from an on-premises Microsoft SQL server. The Enhanced Dataflows Compute Engine is turned on for the dataflow. You need to use the dataflow in a report. The solution must meet the following requirements:
✑ Minimize online processing operations.
✑ Minimize calculation times and render times for visuals.
✑ Include data from the current year, up to and including the previous day.
What should you do?
A. Create a dataflows connection that has DirectQuery mode selected.
B. Create a dataflows connection that has DirectQuery mode selected and configure a gateway connection for the dataset.
C. Create a dataflows connection that has Import mode selected and schedule a daily refresh.
D. Create a dataflows connection that has Import mode selected and create a Microsoft Power Automate solution to refresh the data hourly.
Correct Answer: C
A daily update is adequate. When you set up a refresh schedule, Power BI connects directly to the data sources using connection information and credentials in the dataset to query for updated data, then loads the updated data into the dataset. Any visualizations in reports and dashboards based on that dataset in the Power BI service are also updated.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-desktop-file-local-drive
Community vote distribution: C (91%). C, because one of the requirements is to minimize online processing operations.
Although the dataflow uses DirectQuery, the dataset can be refreshed with Import. https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-directquery

Question #17 Topic 1 DRAG DROP
You publish a dataset that contains data from an on-premises Microsoft SQL Server database. The dataset must be refreshed daily. You need to ensure that the Power BI service can connect to the database and refresh the dataset. Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer:
https://www.youtube.com/watch?v=WpfGIC3i2D8&list=PLApPcvU5-R24K3mbxORV7T3ckVLfDjmHF&index=40
https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-sql-tutorial
Set up an on-premises data gateway: Download and install an on-premises data gateway on a machine that has access to the SQL Server database. Make sure that the gateway is registered to the same workspace as the dataset.
Configure a data source: In the Power BI service, go to the dataset settings and select the data source. Then enter the necessary details, including the server name, database name, and credentials.
Schedule refresh: In the dataset settings, go to the Scheduled refresh tab and set up a refresh schedule. Ensure that the gateway is selected under Data source credentials.
Publish the dataset: Finally, publish the dataset to the Power BI service. The dataset will be refreshed according to the schedule you set up, and the on-premises data gateway will allow the service to connect to the SQL Server database.

Question #18 Topic 1
You attempt to connect Power BI Desktop to a Cassandra database. From the Get Data connector list, you discover that there is no specific connector for the Cassandra database. You need to select an alternate data connector that will connect to the database.
Which type of connector should you choose?
A. Microsoft SQL Server database
B. ODBC
C. OLE DB
D. OData
Correct Answer: B
B is correct because it allows you to connect to data sources that aren't identified in the Get Data lists. The ODBC connector lets you import data from any third-party ODBC driver simply by specifying a Data Source Name (DSN) or a connection string. As an option, you can also specify a SQL statement to execute against the ODBC driver. The documentation lists examples of data sources to which Power BI Desktop can connect by using the generic ODBC interface: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-connect-using-generic-interfaces

Question #19 Topic 1 DRAG DROP
You receive annual sales data that must be included in Power BI reports. From Power Query Editor, you connect to the Microsoft Excel source shown in the following exhibit. You need to create a report that meets the following requirements:
✑ Visualizes the Sales value over a period of years and months
✑ Adds a slicer for the month
✑ Adds a slicer for the year
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer: The correct three actions, in sequence, are:
Action 1: Select the Month and MonthNumber columns. These columns will be used for the slicers to filter the data by month.
Action 2: Select Unpivot other columns. This transforms the 2019, 2020, and 2021 columns into rows, creating an Attribute column that contains the years and a Value column that contains the sales data. This step makes the data suitable for visualization and filtering by year.
Action 3: Rename the Attribute column to Year and the Value column to Sales. Renaming the columns provides a more descriptive and meaningful structure for the data.
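The unpivot step above reshapes year columns into rows. A pandas analogy (melt) with invented sample values shows the same transformation:

```python
import pandas as pd

# Wide-format sales, one column per year (sample values invented)
wide = pd.DataFrame({
    "Month": ["January", "February"],
    "MonthNumber": [1, 2],
    "2019": [100, 110],
    "2020": [120, 130],
})

# "Unpivot other columns": keep Month/MonthNumber, turn year columns into rows.
# Renaming Attribute->Year and Value->Sales happens via var_name/value_name here.
long = wide.melt(id_vars=["Month", "MonthNumber"],
                 var_name="Year", value_name="Sales")
print(len(long))  # 4 rows: 2 months x 2 years
```

In the long format, Year and Month are ordinary columns, so each can drive its own slicer.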
After performing these actions, the data will be in a format that allows you to create visuals and add slicers for the month and year in Power BI.

Question #20 Topic 1 HOTSPOT
You are using Power BI Desktop to connect to an Azure SQL database. The connection is configured as shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.

Question #21 Topic 1 HOTSPOT
You have the Azure SQL databases shown in the following table. You plan to build a single PBIX file to meet the following requirements:
✑ Data must be consumed from the database that corresponds to each stage of the development lifecycle.
✑ Power BI deployment pipelines must NOT be used.
✑ The solution must minimize administrative effort.
What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Correct Answer: The given answer is correct. To meet the requirements, use a single parameter in the PBIX file that controls which database is used for data consumption, based on the stage of the development lifecycle. A Text parameter can be used in Power BI to achieve this. The parameter switches between the different database connections, and could include values such as Development, Staging, and Production, which correspond to the databases shown in the table. The parameter is then used in the queries to select the data source for the chosen stage of the development lifecycle. By using a single parameter, we minimize administrative effort and ensure that the report works with each stage of the development lifecycle.

Question #22 Topic 1
You are creating a query to be used as a Country dimension in a star schema.
A snapshot of the source data is shown in the following table. You need to create the dimension. The dimension must contain a list of unique countries. Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Delete the Country column.
B. Remove duplicates from the table.
C. Remove duplicates from the City column.
D. Delete the City column.
E. Remove duplicates from the Country column.
Correct Answer: DE
Question #23 Topic 1 DRAG DROP -
You use Power Query Editor to preview the data shown in the following exhibit. You need to clean and transform the query so that all the rows of data are maintained, and error values in the Discount column are replaced with a discount of 0.05. The solution must minimize administrative effort. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer: The correct three consecutive actions are:
Action 1: Select the Discount column. This is the column we need to transform.
Action 3: From the Discount column, change the data type to Decimal Number. This step is necessary to work with numeric values.
Action 5: Select Replace Errors to replace each error value with 0.05. This addresses the requirement of replacing error values with the desired discount value.
The remaining actions are not used:
Action 2: Select the Price column. Selecting the Price column is not relevant to the requirement of cleaning and transforming the Discount column.
Action 4: From the Discount column, change the data type to Whole Number. Changing the data type of the Discount column to a whole number is not appropriate, since the discount values are decimal numbers and you want to replace errors with 0.05, which is not a whole number.
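A minimal M sketch of the three actions above, assuming a query with a Discount column (the inline sample data and step names are illustrative, not from the exhibit):

```m
let
    // Illustrative sample: "N/A" will become an error once Discount is typed as a number
    Source = Table.FromRecords({
        [Product = "A", Discount = "0.10"],
        [Product = "B", Discount = "N/A"],
        [Product = "C", Discount = "0.20"]
    }),
    // Change Discount to Decimal Number; non-numeric values turn into error values
    Typed = Table.TransformColumnTypes(Source, {{"Discount", type number}}),
    // Replace any error in Discount with 0.05, keeping all rows intact
    Fixed = Table.ReplaceErrorValues(Typed, {{"Discount", 0.05}})
in
    Fixed
```

Table.ReplaceErrorValues is the step the UI generates behind Replace Errors; unlike Remove Errors, it preserves every row, which satisfies the "all rows must be maintained" requirement.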
Question #24 Topic 1 HOTSPOT -
You attempt to use Power Query Editor to create a custom column and receive the error message shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.
Question #25 Topic 1
From Power Query Editor, you attempt to execute a query and receive the following error message.
Datasource.Error: Could not find file.
What are two possible causes of the error? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.
A. You do not have permissions to the file.
B. An incorrect privacy level was used for the data source.
C. The file is locked.
D. The referenced file was moved to a new location.
Correct Answer: AD
So, the correct selections are:
A. You do not have permissions to the file.
D. The referenced file was moved to a new location.
The other options cause different messages:
B. If an incorrect privacy level is set for a data source, you might receive an error related to data privacy, such as: "Formula.Firewall: Query 'QueryName' references other queries or steps, so it may not directly access a data source. Please rebuild this data combination."
C. If a file is locked, for example because it is open in another application that has locked the file for exclusive access, the error message might be something like: "DataSource.Error: The process cannot access the file because it is being used by another process."
Question #26 Topic 1
You have data in a Microsoft Excel worksheet as shown in the following table. You need to use Power Query to clean and transform the dataset. The solution must meet the following requirements:
If the discount column returns an error, a discount of 0.05 must be used.
All the rows of data must be maintained.
Administrative effort must be minimized.
What should you do in Power Query Editor?
A. Select Replace Errors.
B. Edit the query in the Query Errors group.
C. Select Remove Errors.
D. Select Keep Errors.
Correct Answer: A
A. Select Replace Errors is the correct answer, because selecting Replace Errors allows you to replace any errors in the discount column with a specified value, which in this case is 0.05 per the requirement.
Option B is not necessary for this specific task.
Option C would remove rows with errors entirely, which is not in line with the requirement to maintain all rows of data.
Option D would keep rows with errors as they are, which is not what we want, since we want to replace errors with a specific value.
Question #27 Topic 1
You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do?
A. Apply the Parse function from the Data transformations options to the Logged column.
B. Change the data type of the Logged column to Date.
C. Split the Logged column by using at as the delimiter.
D. Create a column by example that starts with 2018-12-31.
Correct Answer: C
The answer is C: splitting on the delimiter "at" is the easiest path, and Power Query automatically adjusts the resulting column to the Date data type upon clicking OK. This method also retains the time portion as a column in Time format. If time were not a requirement, you might be better off using Extract Table Using Examples to minimize the data imported.
Question #28 Topic 1 DRAG DROP -
You have two Microsoft Excel workbooks in a Microsoft OneDrive folder. Each workbook contains a table named Sales.
The tables have the same data structure in both workbooks. You plan to use Power BI to combine both Sales tables into a single table and create visuals based on the data in the table. The solution must ensure that you can publish a separate report and dataset. Which storage mode should you use for the report file and the dataset file? To answer, drag the appropriate modes to the correct files. Each mode may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.
Correct Answer:
Question #29 Topic 1
You use Power Query to import two tables named Order Header and Order Details from an Azure SQL database. The Order Header table relates to the Order Details table by using a column named Order ID in each table. You need to combine the tables into a single query that contains the unique columns of each table. What should you select in Power Query Editor?
A. Merge queries
B. Combine files
C. Append queries
Correct Answer: A
A. There are two primary ways of combining queries: merging and appending. When you have one or more columns that you'd like to add to another query, you merge the queries. When you have one or more rows of data that you'd like to add to an existing query, you append the queries. https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data
Question #30 Topic 1
You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do?
A. Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date.
B.
Change the data type of the Logged column to Date.
C. Split the Logged column by using at as the delimiter.
D. Apply the Parse function from the Date transformations options to the Logged column.
Correct Answer: C
Question #31 Topic 1 HOTSPOT -
You have a folder that contains 50 JSON files. You need to use Power BI Desktop to make the metadata of the files available as a single dataset. The solution must NOT store the data of the JSON files. Which type of data source should you use, and which transformation should you perform? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Correct Answer:
Question #32 Topic 1
You have a PBIX file that imports data from a Microsoft Excel data source stored in a file share on a local network. You are notified that the Excel data source was moved to a new location. You need to update the PBIX file to use the new location. What are three ways to achieve the goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.
A. From the Datasets settings of the Power BI service, configure the data source credentials.
B. From the Data source settings in Power BI Desktop, configure the file path.
C. From Current File in Power BI Desktop, configure the Data Load settings.
D. From Power Query Editor, use the formula bar to configure the file path for the applied step.
E. From Advanced Editor in Power Query Editor, configure the file path in the M code.
Correct Answer: BDE
Topic 2 - Question Set 2
Question #1 Topic 2
You are creating a report in Power BI Desktop. You load a data extract that includes a free text field named col1. You need to analyze the frequency distribution of the string lengths in col1. The solution must not affect the size of the model.
What should you do?
A. In the report, add a DAX calculated column that calculates the length of col1
B. In the report, add a DAX function that calculates the average length of col1
C. From Power Query Editor, add a column that calculates the length of col1
D. From Power Query Editor, change the distribution for the Column profile to group by length for col1
Question #2 Topic 2
You have a collection of reports for the HR department of your company. The datasets use row-level security (RLS). The company has multiple sales regions. Each sales region has an HR manager. You need to ensure that the HR managers can interact with the data from their region only. The HR managers must be prevented from changing the layout of the reports. How should you provision access to the reports for the HR managers?
A. Publish the reports in an app and grant the HR managers access permission.
B. Create a new workspace, copy the datasets and reports, and add the HR managers as members of the workspace.
C. Publish the reports to a different workspace other than the one hosting the datasets.
D. Add the HR managers as members of the existing workspace that hosts the reports and the datasets.
Correct Answer: A
Reference: https://kunaltripathy.com/2021/10/06/bring-your-power-bi-to-power-apps-portal-part-ii/
Question #3 Topic 2
You need to provide a user with the ability to add members to a workspace. The solution must use the principle of least privilege. Which role should you assign to the user?
A. Viewer
B. Admin
C. Contributor
D. Member
Correct Answer: D
The Member role allows adding members, or others with lower permissions, to the workspace.
Reference: https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-roles-new-workspaces
Question #4 Topic 2
You have a Power BI query named Sales that imports the columns shown in the following table.
Users only use the date part of the Sales_Date field. Only rows with a Status of Finished are used in analysis. You need to reduce the load times of the query without affecting the analysis. Which two actions achieve this goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.
A. Remove the rows in which Sales[Status] has a value of Canceled.
B. Remove Sales[Sales_Date].
C. Change the data type of Sale[Delivery_Time] to Integer.
D. Split Sales[Sale_Date] into separate date and time columns.
E. Remove Sales[Canceled Date].
Correct Answer: AE
A: Removing rows that are not used in the analysis reduces the data loaded and improves query performance.
E: Because only rows with a Status of Finished are analyzed, the Canceled Date column is not needed, and removing it reduces the data loaded.
Question #5 Topic 2
You build a report to analyze customer transactions from a database that contains the tables shown in the following table. You import the tables. Which relationship should you use to link the tables?
A. one-to-many from Transaction to Customer
B. one-to-one between Customer and Transaction
C. many-to-many between Customer and Transaction
D. one-to-many from Customer to Transaction
Correct Answer: D
The "one" side is the primary key side (the Customer table); the "many" side is the foreign key side (the Transaction table) of the relationship.
Question #6 Topic 2
You have a custom connector that returns ID, From, To, Subject, Body, and Has Attachments for every email sent during the past year. More than 10 million records are returned. You build a report analyzing the internal networks of employees based on whom they send emails to. You need to prevent report recipients from reading the analyzed emails. The solution must minimize the model size. What should you do?
A. From Model view, set the Subject and Body columns to Hidden.
B. Remove the Subject and Body columns during the import.
C.
Implement row-level security (RLS) so that the report recipients can only see results based on the emails they sent.
Correct Answer: B
The Subject and Body columns are not needed in the report. Dropping them resolves the security problem and minimizes the model size.
Question #7 Topic 2 HOTSPOT -
You create a Power BI dataset that contains the table shown in the following exhibit. You need to make the table available as an organizational data type in Microsoft Excel. How should you configure the properties of the table? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Cost Center - The Row label field value is used in Excel so users can easily identify the row. It appears as the cell value for a linked cell, in the Data Selector pane, and in the Information card.
Box 2: ID - The Key column field value provides the unique ID for the row. This value enables Excel to link a cell to a specific row in the table.
Box 3: Yes - In the Data Types Gallery in Excel, your users can find data from featured tables in your Power BI datasets.
Reference: https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-create-excel-featured-tables
Question #8 Topic 2
You have the Power BI model shown in the following exhibit. A manager can represent only a single country. You need to use row-level security (RLS) to meet the following requirements:
✑ The managers must only see the data of their respective country.
✑ The number of RLS roles must be minimized.
Which two actions should you perform? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.
A. Create a single role that filters Country[Manager_Email] by using the USERNAME DAX function.
B. Create a single role that filters Country[Manager_Email] by using the USEROBJECTID DAX function.
C.
For the relationship between Purchase Detail and Purchase, select Apply security filter in both directions.
D. Create one role for each country.
E. For the relationship between Purchase and Purchase Detail, change the Cross filter direction to Single.
Correct Answer: AC
A: You can take advantage of the DAX functions username() or userprincipalname() within your dataset. You can use them within expressions in Power BI Desktop. When you publish your model, they will be used within the Power BI service.
Note: To define security roles, follow these steps.
1. Import data into your Power BI Desktop report, or configure a DirectQuery connection.
2. From the Modeling tab, select Manage Roles.
3. From the Manage roles window, select Create.
4. Under Roles, provide a name for the role.
5. Under Tables, select the table to which you want to apply a DAX rule.
6. In the Table filter DAX expression box, enter the DAX expression. This expression returns a value of true or false. For example: [Entity ID] = "Value". Note: You can use username() within this expression.
7. After you've created the DAX expression, select the checkmark above the expression box to validate the expression.
8. Select Save.
C: By default, row-level security filtering uses single-directional filters, whether the relationships are set to single direction or bi-directional. You can manually enable bi-directional cross-filtering with row-level security by selecting the relationship and checking the Apply security filter in both directions checkbox. Select this option when you've also implemented dynamic row-level security at the server level, where row-level security is based on username or login ID.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
Question #9 Topic 2 HOTSPOT -
You have a Power BI imported dataset that contains the data model shown in the following exhibit.
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Assume Referential Integrity - When connecting to a data source using DirectQuery, you can use the Assume Referential Integrity selection to enable running more efficient queries against your data source. This feature has a few requirements of the underlying data, and it is only available when using DirectQuery.
Note: The following requirements are necessary for Assume referential integrity to work properly:
Data in the From column in the relationship is never Null or blank
For each value in the From column, there is a corresponding value in the To column
Box 2: Star schema - Star schema is a mature modeling approach widely adopted by relational data warehouses. It requires modelers to classify their model tables as either dimension or fact. Generally, dimension tables contain a relatively small number of rows. Fact tables, on the other hand, can contain a very large number of rows and continue to grow over time.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-assume-referential-integrity https://docs.microsoft.com/en-us/power-bi/guidance/star-schema
Changing the cross filter direction setting can have a significant impact on query performance. Setting it to Single when appropriate can often improve performance by reducing unnecessary filtering in both directions. In a star schema, the central table (usually a fact table) is connected to dimension tables through one-to-many or many-to-many relationships. https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-relationships-understand
You can tell the storage mode from the icons next to the table names.
When using DirectQuery, the icon resembles a table; when using Import, the icon appears exactly as you see it in the question.
Question #10 Topic 2 HOTSPOT -
You have a Power BI model that contains a table named Sales and a related date table. Sales contains a measure named Total Sales. You need to create a measure that calculates the total sales from the equivalent month of the previous year. How should you complete the calculation? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: CALCULATE -
Box 2: PARALLELPERIOD - PARALLELPERIOD returns a table that contains a column of dates that represents a period parallel to the dates in the specified dates column, in the current context, with the dates shifted a number of intervals either forward in time or back in time.
Syntax: PARALLELPERIOD(dates, number_of_intervals, interval)
dates: A column that contains dates.
number_of_intervals: The number of intervals by which to shift the dates.
interval: The granularity of the shift. The value for interval can be one of the following: year, quarter, month.
Incorrect: SAMEPERIODLASTYEAR returns a table that contains a column of dates shifted one year back in time from the dates in the specified dates column, in the current context.
Syntax: SAMEPERIODLASTYEAR(dates)
DATESMTD returns a table that contains a column of the dates for the month to date, in the current context.
Syntax: DATESMTD(dates)
Box 3: 'DATE'[Month]
Reference: https://docs.microsoft.com/en-us/dax/parallelperiod-function-dax https://docs.microsoft.com/en-us/dax/sameperiodlastyear-function-dax
Community answer: CALCULATE, SAMEPERIODLASTYEAR, 'DATE'[DATE]. PARALLELPERIOD could work in principle, but the second answer slot accepts only a single argument, and PARALLELPERIOD requires three arguments; SAMEPERIODLASTYEAR takes only the dates column. See https://radacad.com/dateadd-vs-parallelperiod-vs-sameperiodlastyear-dax-time-intelligence-question, https://www.youtube.com/watch?v=dBSOYxyRR_w, and https://learn.microsoft.com/en-us/dax/sameperiodlastyear-function-dax
Question #11 Topic 2 DRAG DROP -
You plan to create a report that will display sales data from the last year for multiple regions. You need to restrict access to individual rows of the data on a per-region basis by using roles. Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
Correct Answer: You can define roles and rules within Power BI Desktop. When you publish to Power BI, it also publishes the role definitions. To define security roles, follow these steps.
Step 1: Import data into your Power BI Desktop report.
Step 2: Create the roles:
1. From the Modeling tab, select Manage Roles.
2. From the Manage roles window, select Create.
3. Under Roles, provide a name for the role.
4. Under Tables, select the table to which you want to apply a DAX rule.
5. In the Table filter DAX expression box, enter the DAX expression. This expression returns a value of true or false. For example: [Entity ID] = "Value".
6. After you've created the DAX expression, select the checkmark above the expression box to validate the expression.
7. Select Save.
After you've created your roles, test the results of the roles within Power BI Desktop.
Step 3: Publish the report. Now that you're done validating the roles in Power BI Desktop, publish your report to the Power BI service.
Step 4: Assign users to the role. You can't assign users to a role within Power BI Desktop; you assign them in the Power BI service.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
In short: 1. Import data. 2. Create the roles in Power BI Desktop. 3. Publish the report. 4. Assign users to the role.
https://learn.microsoft.com/en-us/training/modules/row-level-security-power-bi/2-static-method https://www.youtube.com/watch?v=MxU_FYSSnYU
Step 1 makes sense because you first need to get the data into Power BI before you can do anything.
Step 2: if you published the report at this point, everyone would already have access, so you can't publish yet; therefore, you need to create the roles in Power BI Desktop.
Step 3: you can't assign users to the role yet, because Power BI Desktop doesn't have that ability. Assignment is done in the workspace in the Power BI service, which can only happen after the report is published to that workspace; therefore, step 3 has to be publishing.
Step 4: after publishing, you can choose the report and the users to assign to the role.
You cannot assign users to a role before publishing the report. You can create and define roles in Power BI Desktop, but you can only assign users or groups to those roles once the report is published to the Power BI service. Once users are assigned to a role, they will only see the data that they have permission to view based on their assigned role when they access the report.
Question #12 Topic 2 DRAG DROP -
You create a data model in Power BI. Report developers and users provide feedback that the data model is too complex. The model contains the following tables.
The model has the following relationships:
✑ There is a one-to-one relationship between Sales_Region and Region_Manager.
✑ There are more records in Manager than in Region_Manager, but every record in Region_Manager has a corresponding record in Manager.
✑ There are more records in Sales_Manager than in Sales_Region, but every record in Sales_Region has a corresponding record in Sales_Manager.
You need to denormalize the model into a single table. Only managers who are associated to a sales region must be included in the reports. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Select and Place:
Correct Answer:
Step 1: Merge [Sales_Region] and [Sales_Manager] by using an inner join. An inner join returns only the rows present in both the left and right tables when there is a match; otherwise, it returns zero records.
Note: There are more records in Sales_Manager than in Sales_Region, but every record in Sales_Region has a corresponding record in Sales_Manager.
Step 2: Merge [Region_Manager] and [Manager] by using an inner join. Only managers who are associated to a sales region must be included in the reports.
Note: There are more records in Manager than in Region_Manager, but every record in Region_Manager has a corresponding record in Manager.
Step 3: Merge [Sales_Region] and [Region_Manager] by using an inner join as a new query named [Sales_Region_and_Region_Manager]. There is a one-to-one relationship between Sales_Region and Region_Manager.
Reference: https://www.tutorialgateway.org/joins-in-power-bi/
In other words, actions 3, 1, 6:
1. Merge [Region_Manager] and [Manager] by using an inner join.
2. Merge [Sales_Region] and [Sales_Manager] by using an inner join.
3.
Merge [Sales_Region] and [Region_Manager] by using an inner join.
Either 1-3-6 or 3-1-6 is correct; steps 1 and 3 can be done in either order. Step 6 merges the results of the joins in steps 1 and 3, so step 6 is the last step. Step 6 merges the two smaller tables; it is arguably better to merge the larger tables first, because that means the last inner join will have fewer rows to merge.
Question #13 Topic 2
You have a Microsoft Power BI report. The size of the PBIX file is 550 MB. The report is accessed by using an App workspace in shared capacity of powerbi.com. The report uses an imported dataset that contains one fact table. The fact table contains 12 million rows. The dataset is scheduled to refresh twice a day at 08:00 and 17:00. The report is a single page that contains 15 AppSource visuals and 10 default visuals. Users say that the report is slow to load the visuals when they access and interact with the report. You need to recommend a solution to improve the performance of the report. What should you recommend?
A. Change any DAX measures to use iterator functions.
B. Enable visual interactions.
C. Replace the default visuals with AppSource visuals.
D. Split the visuals onto multiple pages.
Correct Answer: D
One page with many visuals may make your report load slowly. Reduce the number of visualizations on a single page as appropriate.
Reference: https://community.powerbi.com/t5/Desktop/Visuals-are-loading-extremely-slow/td-p/1565668
Question #14 Topic 2 HOTSPOT -
You are creating a Microsoft Power BI imported data model to perform basket analysis. The goal of the analysis is to identify which products are usually bought together in the same transaction across and within sales territories. You import a fact table named Sales as shown in the exhibit. (Click the Exhibit tab.)
The related dimension tables are imported into the model. Sales contains the data shown in the following table. You are evaluating how to optimize the model. For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: Yes - Those two columns are not needed in the analysis. The primary key of the fact table is not needed; only the foreign keys that connect to the dimension tables are needed.
Box 2: No - SalesOrderNumber identifies the orders, and the date is the same for all rows with the same SalesOrderNumber, so the surrogate key OrderDateKey can be removed from the analysis.
Box 3: No - TaxCharged is not relevant for the analysis.
Question #15 Topic 2
You have a Microsoft Power BI data model that contains three tables named Orders, Date, and City. There is a one-to-many relationship between Date and Orders and between City and Orders. The model contains two row-level security (RLS) roles named Role1 and Role2. Role1 contains the following filter:
City[State Province] = "Kentucky"
Role2 contains the following filter:
Date[Calendar Year] = 2020
If a user is a member of both Role1 and Role2, what data will they see in a report that uses the model?
A. The user will see data for which the State Province value is Kentucky or where the Calendar Year is 2020.
B. The user will receive an error and will not be able to see the data in the report.
C. The user will only see data for which the State Province value is Kentucky.
D. The user will only see data for which the State Province value is Kentucky and the Calendar Year is 2020.
Correct Answer: A
Row-level security (RLS) with Power BI can be used to restrict data access for given users. Filters restrict data access at the row level, and you can define filters within roles.
When a user is a member of multiple roles, the role filters are additive: the filters are combined with OR logic, so the user sees rows that satisfy the filter of any of their roles. The user therefore sees data where the State Province value is Kentucky or the Calendar Year is 2020.
Incorrect: Not B: A model relationship is limited when there's no guaranteed "one" side. You get an error message if you belong to multiple RLS roles and at least one of the roles relies on a limited relationship. But here both relationships have a guaranteed "one" side.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
Question #16 Topic 2
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: From Power Query Editor, you import the table and then add a filter step to the query.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
This would load the entire table in the first step. Instead, you add a WHERE clause to the SQL statement.
Reference: https://docs.microsoft.com/en-us/power-query/native-database-query
Question #17 Topic 2
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it.
As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: You write a DAX expression that uses the FILTER function. Does this meet the goal?
A. Yes
B. No
Correct Answer: B
Instead, you add a WHERE clause to the SQL statement.
Note: DAX is not designed to fetch data the way SQL is; it is a formula language used for data analysis. It is always better, and recommended, to transform the data as close to the data source as possible. For example, if your data source is a relational database, use T-SQL. When data is stored in a structured database system such as SQL Server, MySQL, or another relational database, you use SQL to fetch the stored data.
Reference: https://www.learndax.com/dax-vs-sql-when-to-use-dax-over-sql/
Also, common sense: you are trying to import a sample of the data, meaning the data is not yet in Power BI, so where would you filter with DAX? On the SQL Server side? That is not possible, hence answer B is correct.
Question #18 Topic 2
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI.
Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: You add a WHERE clause to the SQL statement. Does this meet the goal?
A. Yes
B. No
Correct Answer: A
Power Query enables you to specify your native database query in a text box under Advanced options when connecting to a database. In the example below, you import data from a SQL Server database by using a native database query entered in the SQL statement text box.
1. Connect to a SQL Server database using Power Query. Select the SQL Server database option in the connector selection.
2. In the SQL Server database popup window, specify the Server and Database from which you want to import data by using a native database query.
3. Under Advanced options, select the SQL statement field, paste or enter your native database query, and then select OK.
Reference: https://docs.microsoft.com/en-us/power-query/native-database-query
The correct answer is A. This means that the data is filtered at the source database itself, using a SQL query with a WHERE clause.
Question #19 Topic 2
DRAG DROP - You are preparing a financial report in Power BI. You connect to the data stored in a Microsoft Excel spreadsheet by using Power Query Editor as shown in the following exhibit. You need to prepare the data to support the following:
✑ Visualizations that include all measures in the data over time
✑ Year-over-year calculations for all the measures
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Select and Place:
Question #20 Topic 2
HOTSPOT - You are creating an analytics report that will consume data from the tables shown in the following table.
There is a relationship between the tables. There are no reporting requirements on employee_id and employee_photo. You need to optimize the data model. What should you configure for employee_id and employee_photo? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area:
Correct Answer:
Box 1: Hide - employee_id is needed for the relationship, so it cannot be deleted; hide it instead.
Box 2: Delete - employee_photo has no reporting requirement and is not used in a relationship, so it can be deleted to reduce the model size.
Reference: https://community.powerbi.com/t5/Desktop/How-to-Hide-a-Column-in-power-Bi/m-p/414470
Question #21 Topic 2
HOTSPOT - You plan to create a Power BI dataset to analyze attendance at a school. Data will come from two separate views named View1 and View2 in an Azure SQL database. View1 contains the columns shown in the following table. View2 contains the columns shown in the following table. The views can be related based on the Class ID column. Class ID is the unique identifier for the specified class, period, teacher, and school year. For example, the same class can be taught by the same teacher during two different periods, but the class will have a different class ID. You need to design a star schema data model by using the data in both views. The solution must facilitate the following analysis:
✑ The count of classes that occur by period
✑ The count of students in attendance by period by day
✑ The average number of students attending a class each month
In which table should you include the Teacher First Name and Period Number fields? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area:
Question #22 Topic 2
You have the Power BI model shown in the following exhibit. There are four departments in the Departments table. You need to ensure that users can see the data of their respective department only. What should you do?
A. Create a slicer that filters Departments based on DepartmentID.
B.
Create a row-level security (RLS) role for each department, and then define the membership of the role.
C. Create a DepartmentID parameter to filter the Departments table.
D. To the ConfidentialData table, add a calculated measure that uses the CURRENTGROUP DAX function.
Correct Answer: B. RLS (row-level security) is the answer any time users need to see data based on a certain value of a given dimension.
Question #23 Topic 2
In Power BI Desktop, you are building a sales report that contains two tables. Both tables have row-level security (RLS) configured. You need to create a relationship between the tables. The solution must ensure that bidirectional cross-filtering honors the RLS settings. What should you do?
A. Create an inactive relationship between the tables and select Apply security filter in both directions.
B. Create an active relationship between the tables and select Apply security filter in both directions.
C. Create an inactive relationship between the tables and select Assume referential integrity.
D. Create an active relationship between the tables and select Assume referential integrity.
Correct Answer: B
By default, row-level security filtering uses single-directional filters, whether the relationships are set to single direction or bi-directional. You can manually enable bi-directional cross-filtering with row-level security by selecting the relationship and checking the Apply security filter in both directions checkbox. Select this option when you've also implemented dynamic row-level security at the server level, where row-level security is based on username or login ID.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
RLS only works with active relationships, so there is no question of building inactive relationships here. We have also set bi-directional filtering, as the question requires.
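As a concrete sketch of the two kinds of role filters these RLS questions revolve around — the static per-department pattern from Question #22 and the dynamic, login-based filter mentioned in the Question #23 explanation — the DAX table filter expressions might look as follows. The table and column names here are illustrative assumptions, not taken from the exhibits:

```dax
-- Static role filter, defined on the Departments table for a
-- hypothetical "Finance" role (one role per department, as in Question #22):
[DepartmentName] = "Finance"

-- Dynamic role filter: each user sees only their own rows.
-- USERPRINCIPALNAME() returns the signed-in user's UPN (email address) in
-- the Power BI service; USERNAME() is the older variant whose return format
-- differs between Power BI Desktop and the service.
[EmailAddress] = USERPRINCIPALNAME()
```

Each expression is entered per role under Modeling > Manage roles in Power BI Desktop and must evaluate to TRUE or FALSE for every row of the table on which it is defined.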
Question #24 Topic 2
HOTSPOT - You have a column named UnitsInStock as shown in the following exhibit. UnitsInStock has 75 non-null values, of which 51 are unique. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point. Hot Area:
Correct Answer:
Box 1: 75 rows - The Is nullable property allows NULL values in the column.
Box 2: reduce
Reference: https://blog.crossjoin.co.uk/2019/01/20/is-nullable-column-property-power-bi/
Question #25 Topic 2
HOTSPOT - You have a Power BI report. You have the following tables. You have the following DAX measure.
Accounts := CALCULATE ( DISTINCTCOUNT ( Balances[AccountID] ), LASTDATE ( 'Date'[Date] ) )
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point. Hot Area:
Correct Answer:
Box 1: No - It will show the total number of accounts that were live on the last day of the year only.
Note: DISTINCTCOUNT counts the number of distinct values in a column. LASTDATE returns the last date in the current context for the specified column of dates.
Box 2: No - It will show the total number of accounts that were live on the last day of the month only.
Box 3: Yes
Reference: https://docs.microsoft.com/en-us/dax/distinctcount-function-dax https://docs.microsoft.com/en-us/dax/lastdate-function-dax
Question #26 Topic 2
You have the tables shown in the following table. The Impressions table contains approximately 30 million records per month. You need to create an ad analytics system to meet the following requirements:
✑ Present ad impression counts for the day, campaign, and site_name. The analytics for the last year are required.
✑ Minimize the data model size.
Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A.
Create one-to-many relationships between the tables.
B. Group the Impressions query in Power Query by Ad_id, Site_name, and Impression_date. Aggregate by using the CountRows function.
C. Create a calculated table that contains Ad_id, Site_name, and Impression_date.
D. Create a calculated measure that aggregates by using the COUNTROWS function.
Correct Answer: AB
Incorrect:
Not C: A calculated table would increase the data model size.
Not D: Impression_date and the other grouping columns are still needed, so a measure alone is not sufficient.
AB is the correct answer. Grouping in Power Query reduces the number of rows of the Impressions table that are loaded into the model, and creating relationships doesn't increase the size of the model.
Question #27 Topic 2
HOTSPOT - You are creating a Microsoft Power BI data model that has the tables shown in the following table. The Products table is related to the ProductCategory table through the ProductCategoryID column. Each product has one product category. You need to ensure that you can analyze sales by product category. How should you configure the relationship from ProductCategory to Products? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area:
Correct Answer:
Box 1: One-to-many - The one-to-many and many-to-one cardinality options are essentially the same, and they're also the most common cardinality types.
Incorrect: A many-to-many relationship means both columns can contain duplicate values. This cardinality type is infrequently used. It's typically useful when designing complex model requirements. You can use it to relate many-to-many facts or to relate higher-grain facts, for example, when sales target facts are stored at product category level and the product dimension table is stored at product level.
Box 2: Single
Incorrect: Bear in mind that bi-directional relationships can negatively impact performance.
Further, attempting to configure a bi-directional relationship could result in ambiguous filter propagation paths. In this case, Power BI Desktop may fail to commit the relationship change and will alert you with an error message.
Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-relationships-understand
"For one-to-many relationships, the cross filter direction is always from the "one" side, and optionally from the "many" side (bi-directional)." https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-relationships-understand
Question #28
You import a Power BI dataset that contains the following tables:
✑ Date
✑ Product
✑ Product Inventory
The Product Inventory table contains 25 million rows. A sample of the data is shown in the following table. The Product Inventory table relates to the Date table by using the DateKey column. The Product Inventory table relates to the Product table by using the ProductKey column. You need to reduce the size of the data model without losing information. What should you do?
A. Change Summarization for DateKey to Don't Summarize.
B. Remove the relationship between Date and Product Inventory.
C. Change the data type of UnitCost to Integer.
D. Remove MovementDate.
Correct Answer: D
The DateKey and MovementDate columns have the same information, so MovementDate can be removed.
Incorrect: Not C: Converting UnitCost to the Integer data type would lose information.
Question #29 Topic 2
HOTSPOT - You are enhancing a Power BI model that has DAX calculations. You need to create a measure that returns the year-to-date total sales from the same date of the previous calendar year. Which DAX functions should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Hot Area:
Correct Answer:
Box 1: CALCULATE - Example: Total sales on the last selected date = CALCULATE ( SUM ( Sales[Sales Amount] ), 'Sales'[OrderDateKey] = MAX ( 'Sales'[OrderDateKey] ) )
Box 2: SUM
Box 3: SAMEPERIODLASTYEAR - SAMEPERIODLASTYEAR returns a set of dates in the current selection from the previous year. Example:
-- SAMEPERIODLASTYEAR returns the selected period shifted back one year.
EVALUATE
VAR StartDate = DATE ( 2008, 07, 25 )
VAR EndDate = DATE ( 2008, 07, 31 )
RETURN
CALCULATETABLE ( SAMEPERIODLASTYEAR ( 'Date'[Date] ), 'Date'[Date] >= StartDate && 'Date'[Date] <= EndDate )
You are concerned with the quality and completeness of the sales data. You must ensure that negative and missing sales_amount values do NOT contribute to the total sales amount calculation.
B. a measure that uses the following formula: SUMX(FILTER(Sales,([sales_amount] > 0)),[sales_amount])
C. a measure that uses the following formula: SUM(Sales[sales_amount])
D. a calculated column that uses the following formula: IF(ISBLANK(Sales[sales_amount]),0,(Sales[sales_amount]))
Correct Answer: B
Question #6
You need to create a calculated column to display the month based on the reporting requirements. Which DAX expression should you use?
A. FORMAT('Date'[date],"MMM YYYY")
B. FORMAT('Date'[date_id],"MMM") & " " & FORMAT('Date'[year], "#")
C. FORMAT('Date'[date_id],"MMM YYYY")
D. FORMAT('Date'[date],"M YY")
Correct Answer: A
Topic 9 - Testlet 4
General Overview - Northwind Traders is a specialty food import company. The company recently implemented Power BI to better understand its top customers, products, and suppliers.
Business Issues - The sales department relies on the IT department to generate reports in Microsoft SQL Server Reporting Services (SSRS). The IT department takes too long to generate the reports and often misunderstands the report requirements.
Existing Environment. Data Sources - Northwind Traders uses the data sources shown in the following table. Source2 is exported daily from a third-party system and stored in Microsoft SharePoint Online.
Existing Environment. Customer Worksheet - Source2 contains a single worksheet named Customer Details. The first 11 rows of the worksheet are shown in the following table. All the fields in Source2 are mandatory. The Address column in Customer Details is the billing address, which can differ from the shipping address.
Existing Environment. Azure SQL Database - Source1 contains the following tables: Orders, Products, Suppliers, Categories, Order Details, and Sales Employees.
The Orders table contains the following columns. The Order Details table contains the following columns. The address in the Orders table is the shipping address, which can differ from the billing address. The Products table contains the following columns. The Categories table contains the following columns. The Suppliers table contains the following columns. The Sales Employees table contains the following columns. Each employee in the Sales Employees table is assigned to one sales region. Multiple employees can be assigned to each region.
Requirements. Report Requirements - Northwind Traders requires the following reports: Top Products, Top Customers, and On-Time Shipping.
The Top Customers report will show the top 20 customers based on the highest sales amounts in a selected order month or quarter, product category, and sales region. The Top Products report will show the top 20 products based on the highest sales amounts sold in a selected order month or quarter, sales region, and product category. The report must also show which suppliers provide the top products. The On-Time Shipping report will show the following metrics for a selected shipping month or quarter:
✑ The percentage of orders that were shipped late, by country and shipping region
✑ Customers that had multiple late shipments during the last quarter
Northwind Traders defines late orders as those shipped after the required shipping date. The warehouse shipping department must be notified if the percentage of late orders within the current month exceeds 5%.
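The late-order definition and 5% threshold just described (and the percentage-of-late-orders measure that Question #2 of this testlet asks for) could be sketched in DAX roughly as follows. The RequiredDate and ShippedDate column names are assumptions inferred from the scenario, since the actual Orders columns appear only in the omitted exhibit:

```dax
Late Orders % =
VAR LateOrders =
    COUNTROWS (
        FILTER (
            Orders,
            -- an order counts as late only once it has actually shipped,
            -- and shipped after the required shipping date
            NOT ISBLANK ( Orders[ShippedDate] )
                && Orders[ShippedDate] > Orders[RequiredDate]
        )
    )
VAR AllOrders =
    COUNTROWS ( Orders )
RETURN
    DIVIDE ( LateOrders, AllOrders )
```

DIVIDE avoids a division-by-zero error when no orders are in the current filter context; formatted as a percentage, the measure could drive the 5% warehouse notification through a data alert on a card or KPI visual.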
The reports must show historical data for the current calendar year and the last three calendar years.
Requirements. Technical Requirements - Northwind Traders identifies the following technical requirements:
✑ A single dataset must support all three reports.
✑ The reports must be stored in a single Power BI workspace.
✑ Report data must be current as of 7 AM Pacific Time each day.
✑ The reports must provide fast response times when users interact with a visualization.
✑ The data model must minimize the size of the dataset as much as possible, while meeting the report requirements and the technical requirements.
Requirements. Security Requirements - Access to the reports must be granted to Azure Active Directory (Azure AD) security groups only. An Azure AD security group exists for each department. The sales department must be able to perform the following tasks in Power BI:
✑ Create, edit, and delete content in the reports.
✑ Manage permissions for workspaces, datasets, and reports.
✑ Publish, unpublish, update, and change the permissions for an app.
✑ Assign Azure AD groups role-based access to the reports workspace.
Users in the sales department must be able to access only the data of the sales region to which they are assigned in the Sales Employees table. Power BI has the following row-level security (RLS) table filter DAX expression for the Sales Employees table.
[EmailAddress] = USERNAME()
RLS will be applied only to the sales department users. Users in all other departments must be able to view all the data.
Question #1
You need to design the data model and the relationships for the Customer Details worksheet and the Orders table by using Power BI. The solution must meet the report requirements. For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Hot Area:
1. No: the relationship is between CustomerCRMID in Customer Details and CustomerID in Orders.
2. Yes: the relationship is between CustomerCRMID in Customer Details and CustomerID (NCHAR) in Orders, so CustomerCRMID has to be Text as well.
3. No: it comes from Customer Details.
Question #2
You need to create a measure that will return the percentage of late orders. How should you complete the DAX expression? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area:
Question #3
You need to minimize the size of the dataset. The solution must meet the report requirements. What should you do?
A. Group the Categories table by the CategoryID column.
B. Remove the QuantityPerUnit column from the Products table.
C. Filter out discontinued products while importing the Products table.
D. Change the OrderID column in the Orders table to the Text data type.
Correct Answer: B
The QuantityPerUnit column in the Products table is not necessary for the analysis. Discontinued products cannot be filtered out because rows in Order Details can still reference them, and historical data must be kept.
Question #4
You need to design the data model to meet the report requirements. What should you do in Power BI Desktop?
A. From Power Query, add a date table. Create an active relationship to the OrderDate column in the Orders table and an inactive relationship to the ShippedDate column in the Orders table.
B. From Power Query, add columns to the Orders table to calculate the calendar quarter and the calendar month of the OrderDate column.
C. From Power BI Desktop, use the
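Option A in Question #4 above — an active relationship from the date table to OrderDate plus an inactive one to ShippedDate — is the design that usually pairs with the DAX USERELATIONSHIP function, which lets an individual measure opt into the inactive relationship. A hedged sketch; the [Total Sales] base measure and the column names are assumptions, not taken from the exhibits:

```dax
-- Assumed base measure, evaluated over the active (OrderDate) relationship:
Total Sales = SUM ( 'Order Details'[Sales Amount] )

-- Reuses the base measure over the inactive ShippedDate relationship,
-- e.g. for the On-Time Shipping report's shipping month/quarter analysis:
Sales by Shipped Date =
CALCULATE (
    [Total Sales],
    USERELATIONSHIP ( 'Date'[Date], Orders[ShippedDate] )
)
```

USERELATIONSHIP activates the named inactive relationship only for the duration of that CALCULATE, so the rest of the model continues to slice by order date.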
