DP-203

Practice DP-203 Exam

Is it difficult for you to decide whether to purchase Microsoft DP-203 exam dumps questions? CertQueen provides the FREE online Data Engineering on Microsoft Azure DP-203 exam questions below, so you can test your DP-203 skills first and then decide whether to buy the full version. We promise you the following advantages after purchasing our DP-203 exam dumps questions.
1. A free update for ONE year from the date of your purchase.
2. A full refund of the payment fee if you fail the DP-203 exam with the dumps.

 

 Full DP-203 Exam Dump Here

Latest DP-203 Exam Dumps Questions

The dumps for the DP-203 exam were last updated on Apr 23, 2025.

Viewing page 1 out of 14 pages.

Viewing questions 1 - 5 out of 71 questions.

Question#1

You have an Azure Stream Analytics job that reads data from an Azure event hub.
You need to evaluate whether the job processes data as quickly as the data arrives or whether it cannot keep up.
Which metric should you review?

A. InputEventLastPunctuationTime
B. Input Sources Received
C. Late Input Events
D. Backlogged Input Events

Question#2

You need to implement a Type 3 slowly changing dimension (SCD) for product category data in an Azure Synapse Analytics dedicated SQL pool.
You have a table that was created by using the following Transact-SQL statement.

[Exhibit: Transact-SQL CREATE TABLE statement image not shown.]
Which two columns should you add to the table? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

A. [EffectiveStartDate] [datetime] NOT NULL,
B. [CurrentProductCategory] [nvarchar] (100) NOT NULL,
C. [EffectiveEndDate] [datetime] NULL,
D. [ProductCategory] [nvarchar] (100) NOT NULL,
E. [OriginalProductCategory] [nvarchar] (100) NOT NULL,

Explanation:
A Type 3 SCD supports storing two versions of a dimension member as separate columns. The table includes a column for the current value of a member plus either the original or previous value of the member. Type 3 therefore uses additional columns to track one key instance of history, rather than storing additional rows to track each change as a Type 2 SCD does.
This type of tracking may be used for one or two columns in a dimension table. It is not common to use it for many members of the same table. It is often used in combination with Type 1 or Type 2 members.



Reference: https://k21academy.com/microsoft-azure/azure-data-engineer-dp203-q-a-day-2-live-session-review/
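The in-place tracking described above can be sketched in Python. This is an illustrative model only, not the T-SQL you would actually run against the dedicated SQL pool, and the column names ProductKey, ProductCategory, and OriginalProductCategory are assumptions for the example rather than the table from the exhibit:

```python
# Illustrative sketch of Type 3 SCD behavior: the row keeps the current value
# in one column and the original value in a second column, so history is
# tracked in place rather than by adding rows (as a Type 2 SCD would).

def apply_type3_change(row: dict, new_category: str) -> dict:
    """Return an updated copy of a Type 3 SCD row: the original value is
    captured on the first change, then only the current column changes."""
    updated = dict(row)
    if updated.get("OriginalProductCategory") is None:
        # First change: preserve the value the row was created with.
        updated["OriginalProductCategory"] = updated["ProductCategory"]
    updated["ProductCategory"] = new_category  # current value is overwritten
    return updated

row = {"ProductKey": 1, "ProductCategory": "Bikes", "OriginalProductCategory": None}
row = apply_type3_change(row, "Accessories")
row = apply_type3_change(row, "Clothing")
print(row)
# -> {'ProductKey': 1, 'ProductCategory': 'Clothing', 'OriginalProductCategory': 'Bikes'}
```

Note that the intermediate value ("Accessories") is lost: only the current and original values survive, which is exactly the limited history a Type 3 SCD trades for avoiding extra rows.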

Question#3

DRAG DROP
You have an Azure subscription that contains an Azure Data Lake Storage Gen2 account named storage1. Storage1 contains a container named container1. Container1 contains a directory named directory1. Directory1 contains a file named file1.
You have an Azure Active Directory (Azure AD) user named User1 who is assigned the Storage Blob Data Reader role for storage1.
You need to ensure that User1 can append data to file1. The solution must use the principle of least privilege.
Which permissions should you grant? To answer, drag the appropriate permissions to the correct resources. Each permission may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.


A. [Answer exhibit not shown; see the explanation below.]

Explanation:
Box 1: Execute
If you are granting permissions by using only ACLs (no Azure RBAC), then to grant a security principal
read or write access to a file, you'll need to give the security principal Execute permissions to the root folder of the container, and to each folder in the hierarchy of folders that lead to the file.
Box 2: Execute
On Directory: Execute (X): Required to traverse the child items of a directory
Box 3: Write
On file: Write (W): Can write or append to a file.
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control
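The rule in the explanation above — Execute on the container root and every directory on the path, Write on the file itself — can be sketched as a small Python helper. This is a simplified illustration of the least-privilege principle, not the Azure SDK; the function name and the single-letter permission codes are assumptions for the example:

```python
# Sketch of the least-privilege ACL rule for appending to a file in
# Azure Data Lake Storage Gen2: Execute (x) is needed to traverse the
# container root and each directory on the path; Write (w) is needed
# only on the file itself.

def least_privilege_acls(path: str) -> dict:
    """Map each segment of a Data Lake Gen2 path to the minimal ACL
    permission needed to append data to the final (file) segment."""
    segments = path.strip("/").split("/")
    acls = {segment: "x" for segment in segments[:-1]}  # traverse-only access
    acls[segments[-1]] = "w"  # write/append on the file
    return acls

print(least_privilege_acls("container1/directory1/file1"))
# -> {'container1': 'x', 'directory1': 'x', 'file1': 'w'}
```

This mirrors the three boxes in the explanation: Execute on container1, Execute on directory1, and Write on file1 — no Read permission is granted anywhere, which is what keeps the solution least-privilege.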

Question#4

HOTSPOT
You have an Azure subscription that contains an Azure Cosmos DB analytical store and an Azure Synapse Analytics workspace named WS1. WS1 has a serverless SQL pool named Pool1.
You execute the following query by using Pool1.

[Exhibit: query image not shown.]
For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.


A. [Answer exhibit not shown.]

Question#5

You have an Azure subscription that contains an Azure Data Factory data pipeline named Pipeline1, a Log Analytics workspace named LA1, and a storage account named account1.
You need to retain pipeline-run data for 90 days.
The solution must meet the following requirements:
• The pipeline-run data must be removed automatically after 90 days.
• Ongoing costs must be minimized.
Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

A. Configure Pipeline1 to send logs to LA1.
B. From the Diagnostic settings (classic) settings of account1, set the retention period to 90 days.
C. Configure Pipeline1 to send logs to account1.
D. From the Data Retention settings of LA1, set the data retention period to 90 days.

Exam Code: DP-203         Q & A: 352 Q&As         Updated: Apr 23, 2025

 
