Microsoft (PL-300) Microsoft Power BI Data Analyst Total: 259 Questions Link: https://certyiq.com/papers?provider=microsoft&exam=pl-300 Question: 1 CertyIQ HOTSPOT - You plan to create the Power BI model shown in the exhibit. (Click the Exhibit tab.) The data has the following refresh requirements: ✑ Customer must be refreshed daily. ✑ Date must be refreshed once every three years. ✑ Sales must be refreshed in near real time. ✑ SalesAggregate must be refreshed once per week. You need to select the storage modes for the tables. The solution must meet the following requirements: ✑ Minimize the load times of visuals. ✑ Ensure that the data is loaded to the model based on the refresh requirements. Which storage mode should you select for each table? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: Dual - Customer should use the dual storage mode. Dual: Tables with this setting can act as either cached or not cached, depending on the context of the query that's submitted to the Power BI dataset. In some cases, you fulfill queries from cached data. In other cases, you fulfill queries by executing an on-demand query to the data source. Note: You set the Storage mode property to one of these three values: Import, DirectQuery, and Dual. Box 2: Dual - You can set the dimension tables (Customer, Geography, and Date) to Dual to reduce the number of limited relationships in the dataset, and improve performance. Box 3: DirectQuery - Sales should use the DirectQuery storage mode. DirectQuery: Tables with this setting aren't cached. Queries that you submit to the Power BI dataset (for example, DAX queries) and that return data from DirectQuery tables can be fulfilled only by executing on-demand queries to the data source. Queries that you submit to the data source use the query language for that data source, for example, SQL. Box 4: Import - Import: Imported tables with this setting are cached. Queries submitted to the Power BI dataset that return data from Import tables can be fulfilled only from cached data. Note: Dual (Composite) mode: The dual storage mode sits between Import and DirectQuery. It is a hybrid approach: like importing data, the dual storage mode caches the data in the table. However, it leaves it up to Power BI to determine the best way to query the table depending on the query context. 1) Sales must be refreshed in near real time, so DirectQuery. 2) SalesAggregate is refreshed once per week, so Import (performance is also required). 3) Date and Customer both have relationships with the Sales and SalesAggregate tables, so Dual, to support good performance for both DirectQuery (Sales) and Import (SalesAggregate). Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-storage-mode Question: 2 CertyIQ You have a project management app that is fully hosted in Microsoft Teams. The app was developed by using Microsoft Power Apps. You need to create a Power BI report that connects to the project management app. Which connector should you select? A. Microsoft Teams Personal Analytics B. SQL Server database C. Dataverse D. Dataflows Answer: C Explanation: Data sources in Power BI Desktop. 
The Power Platform category provides the following data connections: Power BI datasets, Power BI dataflows, Common Data Service (Legacy), Dataverse, and Dataflows. Other data sources include Microsoft Teams Personal Analytics (Beta). You can use the Microsoft Power BI template to import data into Power BI from Project for the web and Project Online. When you're using the template, you're connected to your Microsoft Dataverse instance, where your Microsoft Project web app data is stored. https://support.microsoft.com/en-us/office/use-power-bi-desktop-to-connect-with-your-project-data-df4ccca1-68e9-418c-9d0f-022ac05249a2 Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-data-sources Question: 3 CertyIQ For the sales department at your company, you publish a Power BI report that imports data from a Microsoft Excel file located in a Microsoft SharePoint folder. The data model contains several measures. You need to create a Power BI report from the existing data. The solution must minimize development effort. Which type of data source should you use? A. Power BI dataset B. a SharePoint folder C. Power BI dataflows D. an Excel workbook Answer: A Explanation: Power BI dataset, because the case states that a report is already published and the data model contains measures. To be able to reuse the measures in the data model, you should connect to the existing dataset (which was created when you published the report) instead of starting from scratch with the files in the SharePoint folder. Question: 4 CertyIQ You import two Microsoft Excel tables named Customer and Address into Power Query. Customer contains the following columns: ✑ Customer ID ✑ Customer Name ✑ Phone ✑ Email Address ✑ Address ID Address contains the following columns: ✑ Address ID ✑ Address Line 1 ✑ Address Line 2 ✑ City ✑ State/Region ✑ Country ✑ Postal Code Each Customer ID represents a unique customer in the Customer table. Each Address ID represents a unique address in the Address table. You need to create a query that has one row per customer. Each row must contain City, State/Region, and Country for each customer. What should you do? A. Merge the Customer and Address tables. B. Group the Customer and Address tables by the Address ID column. C. Transpose the Customer and Address tables. D. Append the Customer and Address tables. Answer: A Explanation: Remember: Merge is JOIN, Append is UNION. A merge queries operation joins two existing tables together based on matching values from one or multiple columns. You can choose to use different types of joins, depending on the output you want. Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-overview Question: 5 CertyIQ HOTSPOT - You have two Azure SQL databases that contain the same tables and columns. For each database, you create a query that retrieves data from a table named Customer. You need to combine the Customer tables into a single table. The solution must minimize the size of the data model and support scheduled refresh in powerbi.com. What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: Append Queries as New - When you have additional rows of data that you'd like to add to an existing query, you append the query. There are two append options: * Append queries as new displays the Append dialog box to create a new query by appending multiple tables. 
* Append queries displays the Append dialog box to add additional tables to the current query. Incorrect: When you have one or more columns that you'd like to add to another query, you merge the queries. Box 2: Disable loading the query to the data model By default, all queries from Query Editor will be loaded into the memory of Power BI Model. You can disable the load for some queries, especially queries that used as intermediate transformation to produce the final query for the model. Disabling Load doesn't mean the query won't be refreshed, it only means the query won't be loaded into the memory. When you click on Refresh model in Power BI, or when a scheduled refresh happens even queries marked as Disable Load will be refreshed, but their data will be used as intermediate source for other queries instead of loading directly into the model. This is a very basic performance tuning tip, but very important when your Power BI model grows bigger and bigger. Reference: https://docs.microsoft.com/en-us/power-query/append-queries https://radacad.com/performance-tip-for-power-bi-enable-load-sucks-memory-up Question: 6 CertyIQ DRAG DROP - In Power Query Editor, you have three queries named ProductCategory, ProductSubCategory, and Product. Every Product has a ProductSubCategory. Not every ProductsubCategory has a parent ProductCategory. You need to merge the three queries into a single query. The solution must ensure the best performance in Power Query. How should you merge the tables? To answer, drag the appropriate merge types to the correct queries. Each merge type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point. Select and Place: Answer: Explanation: Box 1: Inner - Every Product has a ProductSubCategory. A standard join is needed. One of the join kinds available in the Merge dialog box in Power Query is an inner join, which brings in only matching rows from both the left and right tables. Box 2: Left outer - Not every ProductsubCategory has a parent ProductCategory. One of the join kinds available in the Merge dialog box in Power Query is a left outer join, which keeps all the rows from the left table and brings in any matching rows from the right table. Reference: https://docs.microsoft.com/en-us/power-query/merge-queries-inner https://docs.microsoft.com/en-us/power- query/merge-queries-left-outer Question: 7 CertyIQ You are building a Power BI report that uses data from an Azure SQL database named erp1. You import the following tables. You need to perform the following analyses: ✑ Orders sold over time that include a measure of the total order value Orders by attributes of products sold The solution must minimize update times when interacting with visuals in the report. What should you do first? A. From Power Query, merge the Order Line Items query and the Products query. B. Create a calculated column that adds a list of product categories to the Orders table by using a DAX function. C. Calculate the count of orders per product by using a DAX function. D. From Power Query, merge the Orders query and the Order Line Items query. Answer: D Explanation: D. It's the Header/Detail Schema, and the most optimal way is to flatten the header into the detail table. 
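To make the flattening described in Question 7 concrete, the following Power Query M sketch joins each line item to its order header with an inner join (the join kind discussed in Question 6) and expands only the header columns needed for analysis. It assumes the two queries are named Orders and Order Line Items, that they share a hypothetical key column named OrderID, and that OrderDate and CustomerID are placeholder header columns; adjust the names to match the actual model.

    let
        // Join each line item row to its matching order header
        Merged = Table.NestedJoin(#"Order Line Items", {"OrderID"}, Orders, {"OrderID"}, "OrderHeader", JoinKind.Inner),
        // Expand only the header columns required for the analysis
        Flattened = Table.ExpandTableColumn(Merged, "OrderHeader", {"OrderDate", "CustomerID"})
    in
        Flattened

Only the flattened query needs to be loaded to the model; the load on the intermediate queries can be disabled, as described in Question 5.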
Source: https://www.sqlbi.com/articles/header-detail-vs-star-schema-models-in-tabular-and-power-bi/ GPT: Merging the Orders query and the Order Line Items query in Power Query will allow you to create a single query that combines the necessary data from the different tables. This will make it easier and more efficient to perform the required analyses, as you will have all the information you need in one place. --- Power BI aggregates best over a star schema model. Here we have one fact table (Order Line Items) and two dimension tables (Products, Orders). Orders shares a common field with Products (ProductID) and almost certainly has a time-series field (OrderDate); Order Line Items has Price and Quantity. --- We need to summarize values such as price and quantity over time and by product attributes, but the common fields sit only in the Orders dimension, so we need to merge the Orders dimension and the Order Line Items fact into a single new fact table to design the right star schema model. => So D is correct. Question: 8 CertyIQ You have a Microsoft SharePoint Online site that contains several document libraries. One of the document libraries contains manufacturing reports saved as Microsoft Excel files. All the manufacturing reports have the same data structure. You need to use Power BI Desktop to load only the manufacturing reports to a table for analysis. What should you do? A. Get data from a SharePoint folder and enter the site URL. Select Transform, then filter by the folder path to the manufacturing reports library. B. Get data from a SharePoint list and enter the site URL. Select Combine & Transform, then filter by the folder path to the manufacturing reports library. C. Get data from a SharePoint folder, enter the site URL, and then select Combine & Load. D. Get data from a SharePoint list, enter the site URL, and then select Combine & Load. Answer: A Explanation: We have to import Excel files from SharePoint, so we need the SharePoint folder connector, which is used to get access to the files stored in the library. A SharePoint list is a collection of content that has rows and columns (like a table) and is used for task lists, calendars, etc. Since we have to filter only the manufacturing reports, we have to select Transform and then filter by the corresponding folder path. Question: 9 CertyIQ DRAG DROP - You have a Microsoft Excel workbook that contains two sheets named Sheet1 and Sheet2. Sheet1 contains the following table named Table1. Sheet2 contains the following table named Table2. You need to use Power Query Editor to combine the products from Table1 and Table2 into the following table that has one column containing no duplicate values. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Select and Place: Answer: Explanation: Import from Excel, since the data has not been loaded to Power BI initially. Append Table2 to Table1. Remove duplicates from the table appended to (Table1). Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data Question: 10 CertyIQ You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do? A. 
Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date. B. Change the data type of the Logged column to Date. C. Split the Logged column by using at as the delimiter. D. Apply a transformation to extract the first 11 characters of the Logged column. Answer: C Explanation: You should split the Logged column by using "at" as the delimiter. This will allow you to separate the date and time into separate columns, which will enable you to analyze the complaints by date and use a built-in date hierarchy. Alternatively, you could also use a transformation to extract the date and time from the Logged column and set the data type of the new columns to Date and Time, respectively. Option A is incorrect because it only extracts the last 11 characters of the Logged column, which would not include the full date. Option B is incorrect because the data in the Logged column is in a non-standard date format and cannot be directly converted to the Date data type. Option D is incorrect because it only extracts the first 11 characters of the Logged column as text and does not convert the result to the Date data type, so the built-in date hierarchy would still not be available. Question: 11 CertyIQ You have a Microsoft Excel file in a Microsoft OneDrive folder. The file must be imported to a Power BI dataset. You need to ensure that the dataset can be refreshed in powerbi.com. Which two connectors can you use to connect to the file? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. Excel Workbook B. Text/CSV C. Folder D. SharePoint folder E. Web Answer: DE Explanation: We can import an Excel file by using multiple connectors (Excel Workbook, Folder, Web, SharePoint folder), but to refresh the data from the service without a gateway, we must use the Web and SharePoint folder connectors. Question: 12 CertyIQ HOTSPOT - You are profiling data by using Power Query Editor. You have a table named Reports that contains a column named State. The distribution and quality data metrics for the data in State are shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: 69 - 69 distinct/different values. Note: Column Distribution allows you to get a sense for the overall distribution of values within a column in your data previews, including the count of distinct values (total number of different values found in a given column) and unique values (total number of values that only appear once in a given column). Box 2: 4 - Reference: https://systemmanagement.ro/2018/10/16/power-bi-data-profiling-distinct-vs-unique/ Question: 13 CertyIQ HOTSPOT - You have two CSV files named Products and Categories. The Products file contains the following columns: ✑ ProductID ✑ ProductName ✑ SupplierID ✑ CategoryID The Categories file contains the following columns: ✑ CategoryID ✑ CategoryName ✑ CategoryDescription From Power BI Desktop, you import the files into Power Query Editor. You need to create a Power BI dataset that will contain a single table named Product. The Product table will include the following columns: ✑ ProductID ✑ ProductName ✑ SupplierID ✑ CategoryID ✑ CategoryName ✑ CategoryDescription How should you combine the queries, and what should you do on the Categories query? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. 
Hot Area: Answer: Explanation: Box 1: Merge - There are two primary ways of combining queries: merging and appending. * When you have one or more columns that you'd like to add to another query, you merge the queries. * When you have additional rows of data that you'd like to add to an existing query, you append the query. Box 2: Disable the query load - Managing loading of queries - In many situations, it makes sense to break down your data transformations in multiple queries. One popular example is merging where you merge two queries into one to essentially do a join. In this type of situations, some queries are not relevant to load into Desktop as they are intermediate steps, while they are still required for your data transformations to work correctly. For these queries, you can make sure they are not loaded in Desktop by un-checking 'Enable load' in the context menu of the query in Desktop or in the Properties screen: Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data https://docs.micr osoft.com/en-us/power-bi/connect-data/refresh-include-in-report-refresh Question: 14 CertyIQ You have an Azure SQL database that contains sales transactions. The database is updated frequently. You need to generate reports from the data to detect fraudulent transactions. The data must be visible within five minutes of an update. How should you configure the data connection? A. Add a SQL statement. B. Set the Command timeout in minutes setting. C. Set Data Connectivity mode to Import. D. Set Data Connectivity mode to DirectQuery. Answer: D Explanation: DirectQuery: No data is imported or copied into Power BI Desktop. For relational sources, the selected tables and columns appear in the Fields list. For multi- dimensional sources like SAP Business Warehouse, the dimensions and measures of the selected cube appear in the Fields list. As you create or interact with a visualization, Power BI Desktop queries the underlying data source, so you're always viewing current data. Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-use-directquery Question: 15 CertyIQ DRAG DROP - You have a folder that contains 100 CSV files. You need to make the file metadata available as a single dataset by using Power BI. The solution must NOT store the data of the CSV files. Which three actions should you perform in sequence. To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Select and Place: Answer: Explanation: 1. Get data and select folder 2. Remove the content column 3. Expand the attributes column Question: 16 CertyIQ A business intelligence (BI) developer creates a dataflow in Power BI that uses DirectQuery to access tables from an on-premises Microsoft SQL server. The Enhanced Dataflows Compute Engine is turned on for the dataflow. You need to use the dataflow in a report. The solution must meet the following requirements: ✑ Minimize online processing operations. ✑ Minimize calculation times and render times for visuals. ✑ Include data from the current year, up to and including the previous day. What should you do? A. Create a dataflows connection that has DirectQuery mode selected. B. Create a dataflows connection that has DirectQuery mode selected and configure a gateway connection for the dataset. C. Create a dataflows connection that has Import mode selected and schedule a daily refresh. D. 
Create a dataflows connection that has Import mode selected and create a Microsoft Power Automate solution to refresh the data hourly. Answer: C Explanation: A daily update is adequate. When you set up a refresh schedule, Power BI connects directly to the data sources using connection information and credentials in the dataset to query for updated data, then loads the updated data into the dataset. Any visualizations in reports and dashboards based on that dataset in the Power BI service are also updated. Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-desktop-file-local-drive Question: 17 CertyIQ DRAG DROP - You publish a dataset that contains data from an on-premises Microsoft SQL Server database. The dataset must be refreshed daily. You need to ensure that the Power BI service can connect to the database and refresh the dataset. Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Answer: Explanation: Configure an on-premises data gateway. Add a data source. Add the dataset owner to the data source. Configure a scheduled refresh. Question: 18 CertyIQ You attempt to connect Power BI Desktop to a Cassandra database. From the Get Data connector list, you discover that there is no specific connector for the Cassandra database. You need to select an alternate data connector that will connect to the database. Which type of connector should you choose? A. Microsoft SQL Server database B. ODBC C. OLE DB D. OData Answer: B Explanation: B is correct because it allows you to connect to data sources that aren't identified in the Get Data lists. The ODBC connector lets you import data from any third-party ODBC driver simply by specifying a Data Source Name (DSN) or a connection string. As an option, you can also specify a SQL statement to execute against the ODBC driver. The reference details a few examples of data sources to which Power BI Desktop can connect by using the generic ODBC interface: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-connect-using-generic-interfaces Question: 19 CertyIQ DRAG DROP - You receive annual sales data that must be included in Power BI reports. From Power Query Editor, you connect to the Microsoft Excel source shown in the following exhibit. You need to create a report that meets the following requirements: Visualizes the Sales value over a period of years and months Adds a slicer for the month Adds a slicer for the year Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Answer: Explanation: Select the Month and Month Number columns. Select Unpivot Other Columns. Rename the Attribute column to Year and the Value column to Sales. Question: 20 CertyIQ HOTSPOT - You are using Power BI Desktop to connect to an Azure SQL database. The connection is configured as shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct solution is worth one point. Answer: Explanation: 10 minutes. Only tables that contain data. Question: 21 CertyIQ HOTSPOT - You have the Azure SQL databases shown in the following table. 
You plan to build a single PBIX file to meet the following requirements: Data must be consumed from the database that corresponds to each stage of the development lifecycle. Power BI deployment pipelines must NOT be used. The solution must minimize administrative effort. What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Answer: Explanation: To meet the requirements specified, we can use a single parameter in the PBIX file that controls which database is used for data consumption based on the stage of the development lifecycle. We can use a Text parameter type in Power BI to achieve this. The parameter can be used to switch between the different database connections when a user interacts with the report. The text parameter could include values such as "Development", "Staging", and "Production", which correspond to the different databases shown in the table. The parameter can then be used in the queries to dynamically filter the data based on the selected stage of the development lifecycle. By using a single parameter, we can minimize administrative effort and ensure that the report works with each stage of the development lifecycle. Question: 22 CertyIQ You are creating a query to be used as a Country dimension in a star schema. A snapshot of the source data is shown in the following table. You need to create the dimension. The dimension must contain a list of unique countries. Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point. A. Delete the Country column. B. Remove duplicates from the table. C. Remove duplicates from the City column. D. Delete the City column. E. Remove duplicates from the Country column. Answer: DE Explanation: The table has to contain unique values for "Country" column, so- delete the city column --> in fact this column is not even requested- Remove dupicates from the Country column Question: 23 CertyIQ DRAG DROP - You use Power Query Editor to preview the data shown in the following exhibit. You need to clean and transform the query so that all the rows of data are maintained, and error values in the discount column are replaced with a discount of 0.05. The solution must minimize administrative effort. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Answer: Explanation: Select the discount Column Select Replace Errors to replace each error value with 0.05 For the discount column ,Change Data Type to Decimal Number. Question: 24 CertyIQ HOTSPOT - You attempt to use Power Query Editor to create a custom column and receive the error message shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point. Answer: Explanation: mismatched data types A1 Question: 25 CertyIQ From Power Query Editor, you attempt to execute a query and receive the following error message. Datasource.Error: Could not find file. What are two possible causes of the error? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A.You do not have permissions to the file. B.An incorrect privacy level was used for the data source. C.The file is locked. D.The referenced file was moved to a new location. 
Answer: AD Explanation: A and D. A if PBI cant find the file in the given path and D due this. https://community.fabric.microsoft.com/t5/Power-Query/SOLVED-Datasource-error-could-not-find-file/td- p/252703 Question: 26 CertyIQ You have data in a Microsoft Excel worksheet as shown in the following table. You need to use Power Query to clean and transform the dataset. The solution must meet the following requirements: If the discount column returns an error, a discount of 0.05 must be used. All the rows of data must be maintained. Administrative effort must be minimized. What should you do in Power Query Editor? A.Select Replace Errors. B.Edit the query in the Query Errors group. C.Select Remove Errors. D.Select Keep Errors. Answer: A Explanation: A. Select Replace Errors - is correct. C&D will remove some rows Option B, "Edit the query in the Query Errors group", would technically also allow to achieve the required result. However, this would not be the optimal solution given the constraints provided in the scenario, which specifies that administrative effort must be minimized. Question: 27 CertyIQ You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do? A.Apply the Parse function from the Data transformations options to the Logged column. B.Change the data type of the Logged column to Date. C.Split the Logged column by using at as the delimiter. D.Create a column by example that starts with 2018-12-31. Answer: C Explanation: Split the Logged column by using at as the delimiter. You should split the Logged column by using "at" as the delimiter. This will allow you to separate the date and time into separate columns, which will enable you to analyze the complaints by date and use a built-in date hierarchy. Alternatively, you could also use a transformation to extract the date and time from the Logged column and set the data type of the new columns to Date and Time, respectively. Option A is incorrect because it only extracts the last 11 characters of the Logged column, which would not include the date. Option B is incorrect because the data in the Logged column is in a non-standard date format and cannot be directly converted to the Date data type. Option D is incorrect because it only extracts the first 11 characters of the Logged column, which would not include the time. Question: 28 CertyIQ DRAG DROP - You have two Microsoft Excel workbooks in a Microsoft OneDrive folder. Each workbook contains a table named Sales. The tables have the same data structure in both workbooks. You plan to use Power BI to combine both Sales tables into a single table and create visuals based on the data in the table. The solution must ensure that you can publish a separate report and dataset. Which storage mode should you use for the report file and the dataset file? To answer, drag the appropriate modes to the correct files. Each mode may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point. Answer: Explanation: Report file: Import. In Power BI, when you import data, it means that the data is loaded into the Power BI Desktop file. 
In this case, you would import the data from both Excel workbooks into your Power BI Desktop report file. This allows you to create visuals and reports based on the imported data. Importing the data ensures that you can work with the data even when you're not connected to OneDrive. Dataset: Direct Query. To keep the data in OneDrive and maintain a live connection to the source, you should use Direct Query for the dataset. Direct Query allows Power BI to retrieve and query data from the original data source (in this case, the Excel workbooks in OneDrive) in real-time without importing it into the dataset. This ensures that your dataset is always up-to-date and reflects changes made to the source data. Question: 29 CertyIQ You use Power Query to import two tables named Order Header and Order Details from an Azure SQL database. The Order Header table relates to the Order Details table by using a column named Order ID in each table. You need to combine the tables into a single query that contains the unique columns of each table. What should you select in Power Query Editor? A.Merge queries B.Combine files C.Append queries Answer: A Explanation: Correct answer is A. Merge combines columns. Append combines rows. The question is about related tables. Question: 30 CertyIQ You have a CSV file that contains user complaints. The file contains a column named Logged. Logged contains the date and time each complaint occurred. The data in Logged is in the following format: 2018-12-31 at 08:59. You need to be able to analyze the complaints by the logged date and use a built-in date hierarchy. What should you do? A.Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date. B.Change the data type of the Logged column to Date. C.Split the Logged column by using at as the delimiter. D.Apply the Parse function from the Date transformations options to the Logged column. Answer: C Explanation: Split the Logged column by using at as the delimiter. Question: 31 CertyIQ HOTSPOT - You have a folder that contains 50 JSON files. You need to use Power BI Desktop to make the metadata of the files available as a single dataset. The solution must NOT store the data of the JSON files. Which type of data source should you use, and which transformation should you perform? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Answer: Explanation: Folder. Delete the Content column. Question: 32 CertyIQ You have a PBIX file that imports data from a Microsoft Excel data source stored in a file share on a local network. You are notified that the Excel data source was moved to a new location. You need to update the PBIX file to use the new location. What are three ways to achieve the goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A.From the Datasets settings of the Power BI service, configure the data source credentials. B.From the Data source settings in Power BI Desktop, configure the file path. C.From Current File in Power BI Desktop, configure the Data Load settings. D.From Power Query Editor, use the formula bar to configure the file path for the applied step. E.From Advanced Editor in Power Query Editor, configure the file path in the M code. Answer: BDE Explanation: B.From the Data source settings in Power BI Desktop, configure the file path. D.From Power Query Editor, use the formula bar to configure the file path for the applied step. 
E.From Advanced Editor in Power Query Editor, configure the file path in the M code. Question: 33 CertyIQ You are creating a report in Power BI Desktop. You load a data extract that includes a free text field named col1. You need to analyze the frequency distribution of the string lengths in col1. The solution must not affect the size of the model. What should you do? A. In the report, add a DAX calculated column that calculates the length of col1 B. In the report, add a DAX function that calculates the average length of col1 C. From Power Query Editor, add a column that calculates the length of col1 D. From Power Query Editor, change the distribution for the Column profile to group by length for col1 Answer: D Explanation: A would affect the size of the model, as would C. B doesn't give you enough information about the distribution (just the average). D is the right answer. 1. Power Query Editor -> View -> enable Column profile. 2. Select the three dots (top left corner) in the profile pane that appears at the bottom of the Query Editor window. 3. Group By -> Text length. Question: 34 CertyIQ You have a collection of reports for the HR department of your company. The datasets use row-level security (RLS). The company has multiple sales regions. Each sales region has an HR manager. You need to ensure that the HR managers can interact with the data from their region only. The HR managers must be prevented from changing the layout of the reports. How should you provision access to the reports for the HR managers? A. Publish the reports in an app and grant the HR managers access permission. B. Create a new workspace, copy the datasets and reports, and add the HR managers as members of the workspace. C. Publish the reports to a different workspace other than the one hosting the datasets. D. Add the HR managers as members of the existing workspace that hosts the reports and the datasets. Answer: A Explanation: The correct answer is A, since an app prevents changes to the report layout. In the Power BI service, members of a workspace have access to datasets in the workspace, and RLS doesn't restrict this data access; RLS is used to restrict access to data, not to the layout of the report. Members are allowed to change the report layout. Reference: https://kunaltripathy.com/2021/10/06/bring-your-power-bi-to-power-apps-portal-part-ii/ Question: 35 CertyIQ You need to provide a user with the ability to add members to a workspace. The solution must use the principle of least privilege. Which role should you assign to the user? A. Viewer B. Admin C. Contributor D. Member Answer: D Explanation: The Member role allows adding members, or others with lower permissions, to the workspace. Reference: https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-roles-new-workspaces Question: 36 CertyIQ You have a Power BI query named Sales that imports the columns shown in the following table. Users only use the date part of the Sales_Date field. Only rows with a Status of Finished are used in analysis. You need to reduce the load times of the query without affecting the analysis. Which two actions achieve this goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. Remove the rows in which Sales[Status] has a value of Canceled. B. Remove Sales[Sales_Date]. C. Change the data type of Sale[Delivery_Time] to Integer. D. Split Sales[Sale_Date] into separate date and time columns. E. Remove Sales[Canceled Date]. 
Answer: AD Explanation: A: Removing uninteresting rows will increase query performance. D: Splitting the Sales_Date column will make comparisons on the Sales date faster. The Power BI Desktop data model only supports date/time, but they can be formatted as dates or times independently. Date/Time – Represents both a date and time value. Underneath the covers, the Date/Time value is stored as a Decimal Number Type. Since there's a T in the dates column before split, it's saved as a source text value. Splitting converts it to a numeric value. This reduces the size. Question: 37 CertyIQ You build a report to analyze customer transactions from a database that contains the tables shown in the following table. You import the tables. Which relationship should you use to link the tables? A. one-to-many from Transaction to Customer B. one-to-one between Customer and Transaction C. many-to-many between Customer and Transaction D. one-to-many from Customer to Transaction Answer: D Explanation: One on the primary Key side (customer table), many on the foreign key side (Transaction table) of the relation. Question: 38 CertyIQ You have a custom connector that returns ID, From, To, Subject, Body, and Has Attachments for every email sent during the past year. More than 10 million records are returned. You build a report analyzing the internal networks of employees based on whom they send emails to. You need to prevent report recipients from reading the analyzed emails. The solution must minimize the model size. What should you do? A. From Model view, set the Subject and Body columns to Hidden. B. Remove the Subject and Body columns during the import. C. Implement row-level security (RLS) so that the report recipients can only see results based on the emails they sent. Answer: B Explanation: "prevent report recipients from reading the analyzed emails" The Subject and the Body are not needed in the report. Dropping them resolves the security problem and minimizes the model. Question: 39 CertyIQ HOTSPOT - You create a Power BI dataset that contains the table shown in the following exhibit. You need to make the table available as an organizational data type in Microsoft Excel. How should you configure the properties of the table? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1:Row label: Name See: https://www.myonlinetraininghub.com/power-bi-organizational-data-types-in- excel#:~:text=Power%20BI%20Organizational%20Data%20Types%20in%20Excel%20allow%20you%20to,company%2 Box 2: ID - The Key column field value provides the unique ID for the row. This value enables Excel to link a cell to a specific row in the table. Box 3: Yes - In the Data Types Gallery in Excel, your users can find data from featured tables in your Power BI datasets. Reference: https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-create-excel-featured-tables Question: 40 CertyIQ You have the Power BI model shown in the following exhibit. A manager can represent only a single country. You need to use row-level security (RLS) to meet the following requirements: ✑ The managers must only see the data of their respective country. ✑ The number of RLS roles must be minimized. Which two actions should you perform? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point. A. Create a single role that filters Country[Manager_Email] by using the USERNAME DAX function. B. 
Create a single role that filters Country[Manager_Email] by using the USEROBJECTID DAX function. C. For the relationship between Purchase Detail and Purchase, select Apply security filter in both directions. D. Create one role for each country. E. For the relationship between Purchase and Purchase Detail, change the Cross filter direction to Single. Answer: AC Explanation: A: You can take advantage of the DAX functions username() or userprincipalname() within your dataset. You can use them within expressions in Power BI Desktop. When you publish your model, it will be used within the Power BI service. Note: To define security roles, follow these steps. Import data into your Power BI Desktop report, or configure a DirectQuery connection. 1. From the Modeling tab, select Manage Roles. 2. From the Manage roles window, select Create. 3. Under Roles, provide a name for the role. 4. Under Tables, select the table to which you want to apply a DAX rule. 5. In the Table filter DAX expression box, enter the DAX expressions. This expression returns a value of true or false. For example: [Entity ID] = Value. 6. After you've created the DAX expression, select the checkmark above the expression box to validate the expression. Note: You can use username() within this expression. 7. Select Save. C: By default, row-level security filtering uses single-directional filters, whether the relationships are set to single direction or bi-directional. You can manually enable bi-directional cross-filtering with row-level security by selecting the relationship and checking the Apply security filter in both directions checkbox. Select this option when you've also implemented dynamic row-level security at the server level, where row-level security is based on username or login ID. Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls Question: 41 CertyIQ HOTSPOT - You have a Power BI imported dataset that contains the data model shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: cross filter direction - As the answer correctly states "Assume Referential Integrity" only works for direct query connections. Box 2: Star schema - Star schema is a mature modeling approach widely adopted by relational data warehouses. It requires modelers to classify their model tables as either dimension or fact. Generally, dimension tables contain a relatively small number of rows. Fact tables, on the other hand, can contain a very large number of rows and continue to grow over time. Example: Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-assume-referential-integrity https://docs.microsoft.com/en-us/power-bi/guidance/star-schema Question: 42 CertyIQ HOTSPOT - You have a Power BI model that contains a table named Sales and a related date table. Sales contains a measure named Total Sales. You need to create a measure that calculates the total sales from the equivalent month of the previous year. How should you complete the calculation? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: CALCULATE SAMEPERIODLASTYEAR 'DATE'[DATE] Box 1: CALCULATE - Box 2: SAMEPERIODLASTYEAR accepts a data column, Month will usually be either text (Jan) or Integer (1). 
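Written out in full, the pattern described in Question 42 gives the measure sketched below; the second form, using DATEADD, is an equivalent alternative. Both are hedged sketches that assume an existing [Total Sales] measure and a date table named 'Date' that is marked as a date table.

    Total Sales PY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
    -- Equivalent alternative that shifts the filter context back one year
    Total Sales PY alt = CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, YEAR ) )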
so: CALCULATE([Total Sales], SAMEPERIODLASTYEAR('Date'[Date])) Box 3: 'DATE' [DATE] Reference: https://docs.microsoft.com/en-us/dax/parallelperiod-function-dax https://docs.microsoft.com/en- us/dax/sameperiodlastyear-function-dax Question: 43 CertyIQ DRAG DROP - You plan to create a report that will display sales data from the last year for multiple regions. You need to restrict access to individual rows of the data on a per region-basis by using roles. Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Select and Place: Answer: Explanation: With respect, you can not assign users to a role until AFTER the report has been published to the Power BI Service. Those posting that you create the role and then assign users to the role BEFORE publishing are incorrect. Roles are created in Power BI Desktop. Desktop does not have any way to assign users to the roles. They are empty when created. Role assignment happens in the service. Publish the report to the Power BI service. Go to your Workspace, using the Dataset, select the More Options menu(...) and click Security. This is where the Roles are populated. 1) Import your data into Power BI Desktop 2) Create the role definition (on the Modeling tab) 3) Publish the report to the Power BI service 4) Assign users to the role Question: 44 CertyIQ DRAG DROP - You create a data model in Power BI. Report developers and users provide feedback that the data model is too complex. The model contains the following tables. The model has the following relationships: ✑ There is a one-to-one relationship between Sales_Region and Region_Manager. ✑ There are more records in Manager than in Region_Manager, but every record in Region_Manager has a corresponding record in Manager. ✑ There are more records in Sales_Manager than in Sales_Region, but every record in Sales_Region has a corresponding record in Sales_Manager. You need to denormalize the model into a single table. Only managers who are associated to a sales region must be included in the reports. Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select. Select and Place: Answer: Explanation: 1.Merge [Region_Manager] and [Manager] by using an inner join. 3.Merge [Sales_Region] and [Sales_Manager] by using an inner join. 6.Merge [Sales_Region] and [Region_Manager] by using an inner join. Question: 45 CertyIQ You have a Microsoft Power BI report. The size of PBIX file is 550 MB. The report is accessed by using an App workspace in shared capacity of powerbi.com. The report uses an imported dataset that contains one fact table. The fact table contains 12 million rows. The dataset is scheduled to refresh twice a day at 08:00 and 17:00. The report is a single page that contains 15 AppSource visuals and 10 default visuals. Users say that the report is slow to load the visuals when they access and interact with the report. You need to recommend a solution to improve the performance of the report. What should you recommend? A. Change any DAX measures to use iterator functions. B. Enable visual interactions. C. Replace the default visuals with AppSource visuals. D. Split the visuals onto multiple pages. 
Answer: D Explanation: One page with many visuals may also make your report loading slow. Please appropriately reduce the number of visualizations on one page. Reference: https://community.powerbi.com/t5/Desktop/Visuals-are-loading-extremely-slow/td-p/1565668 Question: 46 CertyIQ HOTSPOT - You are creating a Microsoft Power BI imported data model to perform basket analysis. The goal of the analysis is to identify which products are usually bought together in the same transaction across and within sales territories. You import a fact table named Sales as shown in the exhibit. (Click the Exhibit tab.) The related dimension tables are imported into the model. Sales contains the data shown in the following table. You are evaluating how to optimize the model. For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: Yes - Those two columns not need in the analysis. Box 2: No - Can remove the surrogate key OrderDateKey from the analysis. Box 3: No - Tax charged not relevant for the analysis. Question: 47 CertyIQ You have a Microsoft Power BI data model that contains three tables named Orders, Date, and City. There is a one- to-many relationship between Date and Orders and between City and Orders. The model contains two row-level security (RLS) roles named Role1 and Role2. Role1 contains the following filter. City[State Province] = "Kentucky" Role2 contains the following filter. Date[Calendar Year] = 2020 - If a user is a member of both Role1 and Role2, what data will they see in a report that uses the model? A. The user will see data for which the State Province value is Kentucky or where the Calendar Year is 2020. B. The user will receive an error and will not be able to see the data in the report. C. The user will only see data for which the State Province value is Kentucky. D. The user will only see data for which the State Province value is Kentucky and the Calendar Year is 2020. Answer: A Explanation: A, from the Microsoft documentation (https://docs.microsoft.com/en-us/power-bi/guidance/rls-guidance): "When a report user is assigned to multiple roles, RLS filters become additive. It means report users can see table rows that represent the union of those filters." This means that you would see all data where either Role1 OR Role2 applies, so the answer is A not D. Example from MS Learn linked below: https://learn.microsoft.com/en-us/power-bi/guidance/rls-guidance "Consider a model with two roles: The first role, named Workers, restricts access to all Payroll table rows by using the following rule expression: DAX: FALSE() A rule will return no table rows when its expression evaluates to false. Yet, a second role, named Managers, allows access to all Payroll table rows by using the following rule expression: DAX: TRUE() Take care: Should a report user map to both roles, they'll see all Payroll table rows." It seems to be indeed A in that scenario. User will see the data from the first as well as the second filter, it is FILTER A OR FILTER B (not FILTER A AND FILTER B) Question: 48 CertyIQ Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. 
As a result, these questions will not appear in the review screen. You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table. Solution: From Power Query Editor, you import the table and then add a filter step to the query. Does this meet the goal? A. Yes B. No Answer: B Explanation: This would load the entire table in the first step. Instead: You add a WHERE clause to the SQL statement. Reference: https://docs.microsoft.com/en-us/power-query/native-database-query Question: 49 CertyIQ Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table. Solution: You write a DAX expression that uses the FILTER function. Does this meet the goal? A. Yes B. No Answer: B Explanation: Instead: You add a WHERE clause to the SQL statement. Note: DAX is not a language designed to fetch the data like SQL rather than used for data analysis purposes. It is always a better and recommended approach to transform the data as close to the data source itself. For example, your data source is a relational database; then, it's better to go with T-SQL. SQL is a structured query language, whereas DAX is a formula language used for data analysis purposes. When our data is stored in some structured database systems like SQL server management studio, MySQL, or others, we have to use SQL to fetch the stored data. Reference: https://www.learndax.com/dax-vs-sql-when-to-use-dax-over-sql/ Question: 50 CertyIQ Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table. Solution: You add a WHERE clause to the SQL statement. Does this meet the goal? A. Yes B. No Answer: A Explanation: Power Query enables you to specify your native database query in a text box under Advanced options when connecting to a database. In the example below, you'll import data from a SQL Server database using a native database query entered in the SQL statement text box. 1. Connect to a SQL Server database using Power Query. Select the SQL Server database option in the connector selection. 2. In the SQL Server database popup window: 3. 
Specify the Server and Database where you want to import data from using native database query. 4. Under Advanced options, select the SQL statement field and paste or enter your native database query, then select OK. Reference: https://docs.microsoft.com/en-us/power-query/native-database-query Question: 51 CertyIQ DRAG DROP - You are preparing a financial report in Power BI. You connect to the data stored in a Microsoft Excel spreadsheet by using Power Query Editor as shown in the following exhibit. You need to prepare the data to support the following: ✑ Visualizations that include all measures in the data over time ✑ Year-over-year calculations for all the measures Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. Select and Place: Answer: Explanation: 1. Use first row as header 2. Unpivot all columns other than "Measure" 3. Rename "Attribute" to "Year" 4. Change data type of "Year" column to Date Reference: https://docs.microsoft.com/en-us/power-query/unpivot-column Question: 52 CertyIQ HOTSPOT - You are creating an analytics report that will consume data from the tables shown in the following table. There is a relationship between the tables. There are no reporting requirements on employee_id and employee_photo. You need to optimize the data model. What should you configure for employee_id and employee_photo? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: Hide - Need in the relation, so cannot delete it. Box 2: Delete - Reference: https://community.powerbi.com/t5/Desktop/How-to-Hide-a-Column-in-power-Bi/m-p/414470 Question: 53 CertyIQ HOTSPOT - You plan to create Power BI dataset to analyze attendance at a school. Data will come from two separate views named View1 and View2 in an Azure SQL database. View1 contains the columns shown in the following table. View2 contains the columns shown in the following table. The views can be related based on the Class ID column. Class ID is the unique identifier for the specified class, period, teacher, and school year. For example, the same class can be taught by the same teacher during two different periods, but the class will have a different class ID. You need to design a star schema data model by using the data in both views. The solution must facilitate the following analysis: ✑ The count of classes that occur by period ✑ The count of students in attendance by period by day ✑ The average number of students attending a class each month In which table should you include the Teacher First Name and Period Number fields? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point. Hot Area: Answer: Explanation: Box 1: Teacher Dimension- Box 2: Class Dimension- teacher's dim and class dim because teacher name and period number are static information that are directly related to the keys (teacher ID and class ID) so they belong in the relevant dimension tables. Since the "Class ID is unique for the class, period, teacher and school year" this information should be included in the class dimension table and not repeated for each student's attendance to keep your model as small as possible and to avoid mistakes. Reference: https://docs.microsoft.com/en-us/power-bi/guidance/star-schema Question: 54 CertyIQ You have the Power BI model shown in the following exhibit. 
Question: 55 CertyIQ
In Power BI Desktop, you are building a sales report that contains two tables. Both tables have row-level security (RLS) configured.
You need to create a relationship between the tables. The solution must ensure that bidirectional cross-filtering honors the RLS settings.
What should you do?
A. Create an inactive relationship between the tables and select Apply security filter in both directions.
B. Create an active relationship between the tables and select Apply security filter in both directions.
C. Create an inactive relationship between the tables and select Assume referential integrity.
D. Create an active relationship between the tables and select Assume referential integrity.
Answer: B
Explanation:
By default, row-level security filtering uses single-directional filters, whether the relationships are set to single direction or bi-directional. You can manually enable bi-directional cross-filtering with row-level security by selecting the relationship and checking the Apply security filter in both directions checkbox. Select this option when you've also implemented dynamic row-level security at the server level, where row-level security is based on username or login ID.
Reference: https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
Question: 56 CertyIQ
HOTSPOT -
You have a column named UnitsInStock as shown in the following exhibit.
UnitsInStock has 75 non-null values, of which 51 are unique.
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: 75 rows - The Is nullable property only controls whether NULL values are allowed in the column; it does not change the number of values. The column holds 75 non-null values, so the first answer is 75.
Box 2: reduce - This is a simple table rather than a matrix. When the summarization is changed so the values are aggregated (for example, Sum), the individual values are rolled up into a single summarized row, so summarizing the column reduces the number of rows shown.
Reference: https://blog.crossjoin.co.uk/2019/01/20/is-nullable-column-property-power-bi/
Question: 57 CertyIQ
HOTSPOT -
You have a Power BI report.
You have the following tables.
You have the following DAX measure.
Accounts := CALCULATE ( DISTINCTCOUNT ( Balances[AccountID] ), LASTDATE ( 'Date'[Date] ) )
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: No - It will show the total number of accounts that were live on the last day of the year only.
Box 2: No - It will show the total number of accounts that were live on the last day of the month only.
Box 3: Yes
Note: DISTINCTCOUNT counts the number of distinct values in a column. LASTDATE returns the last date in the current context for the specified column of dates.
Reference: https://docs.microsoft.com/en-us/dax/distinctcount-function-dax https://docs.microsoft.com/en-us/dax/lastdate-function-dax
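To make the behavior above concrete, the following sketch contrasts the measure from the question with an unfiltered distinct count. The table and column names come from the question; the second measure is added here only for comparison.
-- Measure from the question: counts accounts only on the last date visible in the filter context
Accounts :=
CALCULATE (
    DISTINCTCOUNT ( Balances[AccountID] ),
    LASTDATE ( 'Date'[Date] )
)
-- Comparison measure (not part of the question): counts accounts across every date in the filter context
Accounts All Dates :=
DISTINCTCOUNT ( Balances[AccountID] )
In a visual sliced by year or by month, the first measure therefore reports the distinct accounts on the period's last date rather than across the whole period, which is why the first two statements are answered No.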
Question: 58 CertyIQ
You have the tables shown in the following table.
The Impressions table contains approximately 30 million records per month.
You need to create an ad analytics system to meet the following requirements:
✑ Present ad impression counts for the day, campaign, and site_name. The analytics for the last year are required.
✑ Minimize the data model size.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Create one-to-many relationships between the tables.
B. Group the Impressions query in Power Query by Ad_id, Site_name, and Impression_date. Aggregate by using the CountRows function.
C. Create a calculated table that contains Ad_id, Site_name, and Impression_date.
D. Create a calculated measure that aggregates by using the COUNTROWS function.
Answer: AB
Explanation:
Grouping the Impressions query in Power Query pre-aggregates the rows before they are loaded, which minimizes the model size, and the one-to-many relationships allow the grouped impressions to be analyzed by campaign and site.
Incorrect:
Not C: A calculated table would increase the data model size.
Not D: A measure alone does not reduce the data that is loaded, and the Impression_date and other grouping columns are still needed.
Question: 59 CertyIQ
HOTSPOT -
You are creating a Microsoft Power BI data model that has the tables shown in the following table.
The Products table is related to the ProductCategory table through the ProductCategoryID column. Each product has one product category.
You need to ensure that you can analyze sales by product category.
How should you configure the relationship from ProductCategory to Products? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: One-to-many - Several products share the same product category. The one-to-many and many-to-one cardinality options are essentially the same, and they're also the most common cardinality types.
Incorrect: A many-to-many relationship means both columns can contain duplicate values. This cardinality type is infrequently used. It's typically useful when designing complex model requirements. You can use it to relate many-to-many facts or to relate higher grain facts, for example, when sales target facts are stored at product category level and the product dimension table is stored at product level.
Box 2: Single - Single-direction filtering performs much better, and the requirement is only to analyze sales by product category.
Incorrect: Bear in mind that bi-directional relationships can impact performance negatively. Further, attempting to configure a bi-directional relationship could result in ambiguous filter propagation paths. In this case, Power BI Desktop may fail to commit the relationship change and will alert you with an error message.
Reference: https://docs.microsoft.com/en-us/power-bi/transform-model/desktop-relationships-understand
Question: 60 CertyIQ
You import a Power BI dataset that contains the following tables:
✑ Date
✑ Product
✑ Product Inventory
The Product Inventory table contains 25 million rows.
A sample of the data is shown in the following table.
The Product Inventory table relates to the Date table by using the DateKey column. The Product Inventory table relates to the Product table by using the ProductKey column.
You need to reduce the size of the data model without losing information. What should you do?
A. Change Summarization for DateKey to Don't Summarize.
B. Remove the relationship between Date and Product Inventory.
C. Change the data type of UnitCost to Integer.
D. Remove MovementDate.
Answer: D
Explanation:
The DateKey and MovementDate columns contain the same information, so MovementDate can be removed. Removing an unnecessary column is the most effective way to reduce the data model size without losing information.
Incorrect:
Not C: Changing UnitCost to the Integer data type would lose data.
Question: 61 CertyIQ
HOTSPOT -
You are enhancing a Power BI model that has DAX calculations.
You need to create a measure that returns the year-to-date total sales from the same date of the previous calendar year.
Which DAX functions should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: CALCULATE -
Example: Total sales on the last selected date = CALCULATE ( SUM ( Sales[Sales Amount] ), 'Sales'[OrderDateKey] = MAX ( 'Sales'[OrderDateKey] ) )
Box 2: SUM -
Box 3: DATESBETWEEN - This follows from the expected parameters: the exhibit shows two date arguments, which matches DATESBETWEEN, whereas SAMEPERIODLASTYEAR takes only one.
Reference: https://docs.microsoft.com/en-us/dax/calculate-function-dax https://dax.guide/sameperiodlastyear/
Question: 62 CertyIQ
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are modeling data by using Microsoft Power BI. Part of the data model is a large Microsoft SQL Server table named Order that has more than 100 million records. During the development process, you need to import a sample of the data from the Order table.
Solution: You add a report-level filter that filters based on the order date.
Does this meet the goal?
A. Yes
B. No
Answer: B
Explanation:
A report-level filter only affects what is displayed in the report; the full table is still imported. Instead, add a WHERE clause to the SQL statement.
Reference: https://docs.microsoft.com/en-us/power-query/native-database-query
Question: 63 CertyIQ
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Power BI report that imports a date table and a sales table from an Azure SQL database data source. The sales table has the following date foreign keys:
✑ Due Date
✑ Order Date
✑ Delivery Date
You need to support the analysis of sales over time based on all the date foreign keys.
Solution: For each date foreign key, you add inactive relationships between the sales table and the date table.
Does this meet the goal?
A. Yes
B. No
Answer: B
Explanation:
An inactive relationship can only be used through the USERELATIONSHIP() DAX function, and writing DAX measures is not part of the proposed solution, so the inactive relationships alone do not meet the goal. Instead: From the Fields pane, rename the date table as Due Date, and use DAX expressions to create Order Date and Delivery Date as calculated tables. Follow this refactoring methodology: create a copy of the role-playing table, providing it with a name that reflects its role. If it's an Import table, we recommend defining a calculated table. If it's a DirectQuery table, you can duplicate the Power Query query.
Source: https://learn.microsoft.com/en-us/power-bi/guidance/relationships-active-inactive
Reference: https://docs.microsoft.com/en-us/power-bi/guidance/relationships-active-inactive
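As a rough sketch of the recommended alternative, and of the USERELATIONSHIP option the explanation mentions, the snippets below assume a date table renamed to 'Due Date' with a Date column, a Sales table with Order Date and Delivery Date columns, and an existing Total Sales measure; the actual names in the model may differ.
-- Calculated tables that duplicate the renamed date table, one per role
Order Date = 'Due Date'
Delivery Date = 'Due Date'
-- USERELATIONSHIP alternative: keep one date table plus inactive relationships,
-- and activate the relevant relationship inside a measure
Sales by Delivery Date :=
CALCULATE (
    [Total Sales],
    USERELATIONSHIP ( Sales[Delivery Date], 'Due Date'[Date] )
)
With the calculated-table approach, each copy gets its own active relationship to the corresponding foreign key, so no extra DAX is needed in report measures.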
Question: 64 CertyIQ
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Power BI report that imports a date table and a sales table from an Azure SQL database data source. The sales table has the following date foreign keys:
✑ Due Date
✑ Order Date
✑ Delivery Date
You need to support the analysis of sales over time based on all the date foreign keys.
Solution: From Power Query Editor, you rename the date query as Due Date. You reference the Due Date query twice to make the queries for Order Date and Delivery Date.
Does this meet the goal?
A. Yes
B. No
Answer: A
Explanation:
Yes. Referencing the date query creates one date table per role, and each copy can have an active relationship to its foreign key. It is not the best solution from a performance standpoint, but performance is not part of the stated requirements.
Question: 65 CertyIQ
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Power BI report that imports a date table and a sales table from an Azure SQL database data source. The sales table has the following date foreign keys:
✑ Due Date
✑ Order Date
✑ Delivery Date
You need to support the analysis of sales over time based on all the date foreign keys.
Solution: From the Fields pane, you rename the date table as Due Date. You use a DAX expression to create Order Date and Delivery Date as calculated tables.
Does this meet the goal?
A. Yes
B. No
Answer: A
Explanation:
Refactoring methodology - Here's a methodology to refactor a model from a single role-playing dimension-type table to a design with one table per role.
1. Remove any inactive relationships.
2. Consider renaming the role-playing dimension-type table to better describe its role. In the example (not shown here), the Airport table is related to the ArrivalAirport column of the Flight table, so it's renamed as Arrival Airport.
3. Create a copy of the role-playing table, providing it with a name that reflects its role. If it's an Import table, we recommend defining a calculated table.
If it's a DirectQuery table, you can duplicate the Power Query query. In the example, the Departure Airport table was created by using the following calculated table definition.
Departure Airport = 'Arrival Airport'
4. Create an active relationship to relate the new table.
5. Consider renaming the columns in the tables so they accurately reflect their role. In the example, all columns are prefixed with the word Departure or Arrival. These names ensure report visuals, by default, will have self-describing and non-ambiguous labels. It also improves the Q&A experience, allowing users to easily write their questions.
6. Consider adding descriptions to role-playing tables. (In the Fields pane, a description appears in a tooltip when a report author hovers their cursor over the table.) This way, you can communicate any additional filter propagation details to your report authors.
Reference: https://docs.microsoft.com/en-us/power-bi/guidance/relationships-active-inactive
Question: 66 CertyIQ
DRAG DROP -
You receive revenue data that must be included in Microsoft Power BI reports. You preview the data from a Microsoft Excel source in Power Query as shown in the following exhibit.
You plan to create several visuals from the data, including a visual that shows revenue split by year and product.
You need to transform the data to ensure that you can build the visuals. The solution must ensure that the columns are named appropriately for the data that they contain.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
Answer:
Explanation:
Correct sequence (actions 2, 3, and 4 from the list):
1. Select Use First Row as Headers.
2. Select the Department and Product columns, and then select Unpivot Other Columns.
3. Rename the Attribute column to Year and the Value column to Revenue.
Question: 67 CertyIQ
HOTSPOT -
You have a Power BI report named Orders that supports the following analysis:
✑ Total sales over time
✑ The count of orders over time
✑ New and repeat customer counts
The data model size is nearing the limit for a dataset in shared capacity. The model view for the dataset is shown in the following exhibit.
The data view for the Orders table is shown in the following exhibit.
The Orders table relates to the Customers table by using the CustomerID column. The Orders table relates to the Date table by using the OrderDate column.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: No - This would not support total sales over time.
Box 2: No - This would not support new and repeat customer counts.
Box 3: Yes
Question: 68 CertyIQ
HOTSPOT -
You are building a financial report by using Power BI.
You have a table named financials that contains a column named Date and a column named Sales.
You need to create a measure that calculates the relative change in sales as compared to the previous quarter.
How should you complete the measure? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: CALCULATE - Calculates the sum in a modified filter context.
Box 2: DATEADD - DATEADD with an interval of -1 QUARTER shifts the dates to the previous quarter.
Box 3: DIVIDE - Use DIVIDE to compute the relative change.
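A minimal sketch of how the three selected functions fit together, using the financials table named in the question; the measure name and exact layout are illustrative, and DATEADD assumes the Date column is a contiguous date column, typically from a marked date table.
Sales QoQ % :=
VAR CurrentSales = SUM ( financials[Sales] )
VAR PreviousQuarterSales =
    CALCULATE (
        SUM ( financials[Sales] ),
        DATEADD ( financials[Date], -1, QUARTER )
    )
RETURN
    -- Relative change; DIVIDE returns blank instead of an error when the prior quarter is empty
    DIVIDE ( CurrentSales - PreviousQuarterSales, PreviousQuarterSales )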
Question: 69 CertyIQ
DRAG DROP -
You are creating a Power BI model and report.
You have a single table in a data model named Product.
Product contains the following fields:
✑ ID
✑ Name
✑ Color
✑ Category
✑ Total Sales
You need to create a calculated table that shows only the top eight products based on the highest value in Total Sales.
How should you complete the DAX expression? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
Select and Place:
Answer:
Explanation:
Box 1: TOPN - TOPN returns the top N rows of the specified table.
Syntax: TOPN ( <N_Value>, <Table>, <OrderBy_Expression>, [<Order>[, <OrderBy_Expression>, [<Order>]]…] )
Box 2: DESC - Descending order returns the highest values first.
Reference: https://docs.microsoft.com/en-us/dax/topn-function-dax
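A completed version of the expression, as a sketch consistent with the selections above (the calculated table name is illustrative):
Top 8 Products =
TOPN ( 8, Product, Product[Total Sales], DESC )
Because Total Sales is already a column of the Product table, no additional aggregation is required inside TOPN.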
Question: 70 CertyIQ
You are creating a sales report in Power BI for the NorthWest region sales territory of your company. Data will come from a view in a Microsoft SQL Server database.
A sample of the data is shown in the following table:
The report will facilitate the following analysis:
✑ The count of orders and the sum of total sales by Order Date
✑ The count of customers who placed an order
✑ The average quantity per order
You need to reduce data refresh times and report query times.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Set the data type for SalesOrderNumber to Decimal Number.
B. Remove the CustomerKey and ProductKey columns.
C. Remove the TaxAmt and Freight columns.
D. Filter the data to only the NorthWest region sales territory.
Answer: CD
Explanation:
C: Remove columns that are not used in the report.
D: Reduce the number of rows to the region that the report covers.
Incorrect:
Not A: This is not possible, and it would not reduce refresh or query times.
Not B: CustomerKey is needed to count the customers who placed an order.
Question: 71 CertyIQ
You are creating a Power BI model that contains a table named Store. Store contains the following fields.
You plan to create a map visual that will show store locations and provide the ability to drill down from Country to State/Province to City.
What should you do to ensure that the locations are mapped properly?
A. Change the data type of City, State/Province, and Country.
B. Set Summarization for City, State/Province, and Country to Don't summarize.
C. Set the data category of City, State/Province, and Country.
D. Create a calculated column that concatenates the values in City, State/Province, and Country.
Answer: C
Explanation:
Setting the data category (City, State or Province, Country) tells Power BI how to geocode each field, so the map visual plots the locations correctly. To support the drill-down, the fields can then be arranged in a hierarchy: a hierarchy is a set of fields categorized so that one level is the parent of another, and values of the parent level can be drilled down to the lower level. Right-click the field you want as level 1 of the hierarchy in the fields list, select Create Hierarchy, and then add the remaining fields as lower levels.
Reference: https://radacad.com/what-a-power-bi-hierarchy-is-and-how-to-use-it
Question: 72 CertyIQ
You are building a data model for a Power BI report.
You have data formatted as shown in the following table.
You need to create a clustered bar chart as shown in the following exhibit.
What should you do?
A. From Power Query Editor, split the Machine-User column by using a delimiter.
B. From Power Query Editor, create a column that contains the last three digits of the Machine-User column.
C. In a DAX function, create two calculated columns named Machine and User by using the SUBSTITUTE function.
D. In a DAX function, create two measures named Machine and User by using the SUBSTITUTE function.
Answer: A
Explanation:
Split a column of text (Power Query): You can split a column with a text data type into two or more columns by using a common delimiter character. For example, a Name column that contains values written as <LastName>, <FirstName> can be split into two columns by using the comma (,) character.
Note: Power Query is an Extract Transform Load (ETL) tool. It allows us to download and fetch data from different sources (data ingestion) and to combine, clean, and model that data (data wrangling).
Reference: https://support.microsoft.com/en-us/office/split-a-column-of-text-power-query-5282d425-6dd0-46ca-95bf-8e0da9539662
Question: 73 CertyIQ
DRAG DROP -
You need to create a date table in Power BI that must contain 10 full calendar years, including the current year.
How should you complete the DAX expression? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Select and Place:
Answer:
Explanation:
Box 1: YEAR - Gets the year of the current date.
Box 2: TODAY - TODAY returns the current date.
Box 3: CALENDAR - CALENDAR returns a table with a single column named Date containing a contiguous set of dates. The range of dates is from the specified start date to the specified end date, inclusive of those two dates. The following formula returns a table with dates between January 1st, 2005 and December 31st, 2015.
CALENDAR ( DATE ( 2005, 1, 1 ), DATE ( 2015, 12, 31 ) )
Reference: https://dax.guide/calendar/
Question: 74 CertyIQ
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Power BI report that imports a date table and a sales table from an Azure SQL database data source. The sales table has the following date foreign keys:
✑ Due Date
✑ Order Date
✑ Delivery Date
You need to support the analysis of sales over time based on all the date foreign keys.
Solution: You create measures that use the USERELATIONSHIP DAX function to filter sales on the active relationship between the sales table and the date table.
Does this meet the goal?
A. Yes
B. No
Answer: B
Explanation:
USERELATIONSHIP is used to activate an inactive relationship; it cannot be applied to the single active relationship to cover the other date foreign keys, and this solution does not create any inactive relationships. Instead: From the Fields pane, rename the date table as Due Date, and use DAX expressions to create Order Date and Delivery Date as calculated tables.
Reference: https://docs.microsoft.com/en-us/power-bi/guidance/relationships-active-inactive
Question: 75 CertyIQ
HOTSPOT -
You have a Power BI report that contains a measure named Total Sales.
You need to create a new measure that will return the sum of Total Sales for a year up to a selected date.
How should you complete the DAX expression? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Answer:
Explanation:
Box 1: TOTALYTD - TOTALYTD evaluates the specified expression over the interval which begins on the first day of the year and ends with the last date in the specified date column after applying the specified filters.
Syntax: TOTALYTD ( <expression>, <dates> [, <filter>] [, <year_end_date>] )
Expression - The expression to be evaluated.
Dates - The name of a column containing dates or a one-column table containing dates.
Example:
TOTALYTD ( -- 2007-01-01 : 2007-05-12
    [Sales Amount],
    'Date'[Date]
)
Box 2: 'Date'[Date]
Reference: https://dax.guide/totalytd/
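Putting the two selections together, the completed measure would look like the following sketch (the measure name is illustrative; [Total Sales] and 'Date'[Date] come from the question and the answer):
Total Sales YTD :=
TOTALYTD ( [Total Sales], 'Date'[Date] )
In a report filtered to a given date, the measure sums Total Sales from January 1 of that year through the selected date.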
Question: 76 CertyIQ
DRAG DROP -
You are modifying a Power BI model by using Power BI Desktop.
You have a table named Sales that contains the following fields.
You have a table named Transaction Size that contains the following data.
You need to create a calculated column to classify each transaction as small, medium, or large based on the value in Sales Amount.
How should you complete the code? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
Answer:
Explanation:
Box 1: FILTER
Box 2: AND
Box 3: CALCULATE
FILTER must be followed by a table reference, AND is needed to check that Sales Amount falls between the lower and upper limits, and CALCULATE is needed because it must be followed by an expression, such as DISTINCT in this case.
Reference: https://docs.microsoft.com/en-us/dax/calculate-function-dax https://docs.microsoft.com/en-us/dax/filter-function-dax
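The code being completed is not reproduced here, but the pattern the three selections imply looks roughly like the following calculated column on the Sales table. The Transaction Size column names (Size, Minimum, Maximum) are assumptions for illustration; the actual names come from the exhibit.
Transaction Class =
CALCULATE (
    DISTINCT ( 'Transaction Size'[Size] ),               -- the classification label (small, medium, or large); hypothetical column
    FILTER (
        'Transaction Size',                               -- scan the banding table
        AND (
            Sales[Sales Amount] >= 'Transaction Size'[Minimum],   -- hypothetical lower limit column
            Sales[Sales Amount] < 'Transaction Size'[Maximum]     -- hypothetical upper limit column
        )
    )
)
If more than one band matched a given Sales Amount, the implicit table-to-scalar conversion would fail, so the bands are assumed to be non-overlapping.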
Question: 77 CertyIQ
You have a Power BI report for the procurement department. The report contains data from the following tables.
There is a one-to-many relationship from Suppliers to LineItems that uses the ID and Supplier ID columns.
The report contains the visuals shown in the following table.
You need to minimize the size of the dataset without affecting the visuals. What should you do?
A. Merge Suppliers and LineItems.
B. Remove the LineItems[Description] column.
C. Remove the rows from LineItems where LineItems[Invoice Date] is before the beginning of last month.
D. Group LineItems by LineItems[Invoice ID] and LineItems[Invoice Date] with a sum of LineItems[Price].
Answer: B
Explanation:
Removing a column that is not used in the visuals reduces the size of the dataset.
Incorrect:
Not A: Merging the tables would increase the dataset.
Not C: Two of the visuals need historical data.
Not D: Grouping would not affect size.
Question: 78 CertyIQ
You have a Power BI report for the marketing department. The report reports on web traffic to a blog and contains data from the following tables.
There is a one-to-many relationship from Posts to Traffic that uses the URL and URL Visited columns.
The report contains the visuals shown in the following table.
The dataset takes a long time to refresh.
You need to modify the Posts and Traffic queries to reduce load times.
Which two actions will reduce the load times? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Remove the rows in Posts in which Posts[Publish Date] is in the last seven days.
B. Remove the rows in Traffic in which Traffic[URL Visited] does not contain blog.
C. Remove Traffic[IP Address], Traffic[Browser Agent], and Traffic[Referring URL].
D. Remove Posts[Full Text] and Posts[Summary].
E. Remove the rows in Traffic in which Traffic[Referring URL] does not start with /.
Answer: BD
Explanation:
B: Only the blog-post rows are useful for the visuals.
D: These two columns are not used in the visuals and can be removed.
Incorrect:
Not A: Three visuals need historical data.
Not C: Traffic[Referring URL] is used in one of the visuals and therefore cannot be removed.
Not E: These rows are used in three visuals.
Question: 79 CertyIQ
HOTSPOT -
You are creating a quick measure as shown in the following exhibit.
You need to create a monthly rolling average measure for Sales over time.
How should you configure the quick measure calculation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Answer:
Explanation:
1. Total Sales
2. Date
3. Months
Question: 80 CertyIQ
You have the Power BI data model shown in the following exhibit.
The Sales table contains records of sales by day from the last five years up until today’s date.
You plan to create a measure to return the total sales of March 2021 when March 2022 is selected.
Which DAX expression should you use?
A. Calculate (Sum(Sales[Sales]), PREVIOUSYEAR( dimDate[Date]))
B. TOTALYTD (SUM(Sales[Sales]), dimDate[Date] )
C. Calculate (SUM(Sales[Sales]), SAMEPERIODLASTYEAR(dimDate[Date] ))
D. SUM(Sales[Sales])
Answer: C
Explanation:
Calculate (SUM(Sales[Sales]), SAMEPERIODLASTYEAR(dimDate[Date] )). SAMEPERIODLASTYEAR shifts the dates in the current filter context back one year, so when March 2022 is selected the measure returns the total sales of March 2021. PREVIOUSYEAR, by contrast, returns the whole previous calendar year rather than only the matching month.
Question: 81 CertyIQ
You use Power BI Desktop to load data from a Microsoft SQL Server database.
While waiting for the data to load, you receive the following error.
You need to resolve the error. What are two ways to achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Reduce the number of rows and columns returned by each query.
B. Split long-running queries into subsets of columns and use Power Query to merge the queries.
C. Use Power Query to combine long-running queries into one query.
D. Disable query folding on long-running queries.
Answer: AB
Explanation:
A. Reduce the number of rows and columns returned by each query.
B. Split long-running queries into subsets of columns and use Power Query to merge the queries.
Both options reduce the amount of data that each individual query must return.
Question: 82 CertyIQ
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some que