Hello Readers! Here are some sample questions for the Microsoft Associate-level certification exam DP-200: Implementing an Azure Data Solution.
After working through the course material, answer the sample questions below to test your knowledge. The correct answers are available at the end of this article.
Here goes the Quiz:
- Duplicating a customer's content for redundancy and meeting Service Level Agreements in Azure meets which cloud technical requirement?
c. Multi-lingual support
- What data platform technology is a globally distributed, multi-model database that can offer sub-second query performance?
a. Azure SQL Database
b. Azure Cosmos DB
c. Azure Synapse Analytics
d. Blob Storage
- Which Azure Data Platform technology is commonly used to process data in an ELT framework?
a. Azure Data Factory
b. Azure Data Lake
c. Azure Databricks
d. Azure Cosmos DB
- Suppose you have two video files stored as blobs. One of the videos is business-critical and requires a replication policy that creates multiple copies across geographically diverse data centers. The other video is non-critical, and a local replication policy is sufficient. True or false: To satisfy these constraints, the two blobs will need to be in separate storage accounts.
- Dave is creating an Azure Data Lake Storage Gen 2 account. He must configure the account to process analytical data workloads with the best performance. Which option should he configure when creating the storage account?
a. On the Basic tab, set the performance option to On
b. On the Advanced tab, set the Hierarchical Namespace to enabled
c. On the Basic tab, set the performance option to Standard
d. On the Advanced tab, set the performance option to On
- How does Spark connect to databases like MySQL, Hive, and other data stores?
c. Using the REST API Layer
- By default, how does Spark deal with corrupt records when reading data?
a. They appear in a column called “_corrupt_record”
b. They get deleted automatically
c. They throw an exception and exit the read operation
- Why do you chain DataFrame methods?
a. To get accurate results
b. To avoid the creation of temporary DataFrames as local variables
c. It is the only way to do it
- Authentication for an Event Hub is defined with a combination of an Event Publisher and which other component?
a. Transport Layer Security v1.2
b. Shared Access Signature
c. Storage Account Key
- You are receiving an error message in Azure Synapse Analytics and want to view information about the service to help solve the problem. What can you use to quickly check the availability of the service?
a. Diagnose and solve problems
b. Azure Monitor
c. Network Performance Monitor
Check the correct answers here:
Explanation: High Availability. Duplicating customers' content for redundancy in order to meet Service Level Agreements is what High Availability provides in Azure.
Explanation: Azure Cosmos DB is a globally distributed, multi-model database that can offer sub-second query performance.
Explanation: Azure Data Factory (ADF) is a cloud integration service that orchestrates the movement of data between various data stores.
Explanation: True. A replication policy is a characteristic of a storage account, and every blob in the storage account must use the same policy. If you need some data to use the geo-replication strategy and other data to use the local replication strategy, you will need two storage accounts.
Explanation: To enable the best performance for analytical workloads in Data Lake Storage Gen 2, set the Hierarchical Namespace to enabled on the Advanced tab during storage account creation.
Explanation: JDBC. JDBC stands for Java Database Connectivity and is a Java API for connecting to databases such as MySQL, Hive, and other data stores. ODBC is not an option here, and the REST API layer is not how Spark connects to these databases.
Explanation: They appear in a column called "_corrupt_record". They are not deleted automatically, nor do they throw an exception and abort the read operation.
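To make this behavior concrete, here is a minimal sketch in plain Python (not PySpark itself, so it can run anywhere) of what Spark's default permissive JSON reading does: records that fail to parse are kept, placed under a "_corrupt_record" key instead of stopping the read. The function name and sample data are my own illustration, not part of any Spark API.

```python
import json

def read_json_lines(lines):
    """Mimic Spark's default permissive behavior: instead of failing,
    unparseable records land in a '_corrupt_record' column."""
    rows = []
    for line in lines:
        try:
            rows.append(json.loads(line))
        except json.JSONDecodeError:
            # The bad record is preserved verbatim, not dropped.
            rows.append({"_corrupt_record": line})
    return rows

data = [
    '{"id": 1, "name": "ok"}',
    '{"id": 2, "name": broken}',   # malformed JSON
]
rows = read_json_lines(data)
print(rows[1])  # {'_corrupt_record': '{"id": 2, "name": broken}'}
```

The key point the exam question tests: the read completes, good rows parse normally, and only the offending record is flagged.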
Explanation: To avoid the creation of temporary DataFrames as local variables. A chained expression such as myDataFrameDF.select().filter().groupBy() could instead be written as tempDF1 = myDataFrameDF.select(), tempDF2 = tempDF1.filter(), and then tempDF2.groupBy(). The two are syntactically equivalent, but notice how the second version leaves extra local variables behind.
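The contrast between the two styles can be sketched in plain Python with a tiny stand-in class (my own illustration of the fluent/chaining pattern, not the Spark API; the MiniFrame class and its methods are hypothetical):

```python
class MiniFrame:
    """Tiny stand-in for a DataFrame, just to illustrate method chaining."""
    def __init__(self, rows):
        self.rows = rows

    def select(self, *cols):
        return MiniFrame([{c: r[c] for c in cols} for r in self.rows])

    def filter(self, pred):
        return MiniFrame([r for r in self.rows if pred(r)])

    def group_count(self, col):
        counts = {}
        for r in self.rows:
            counts[r[col]] = counts.get(r[col], 0) + 1
        return counts

df = MiniFrame([
    {"city": "Pune", "sales": 10},
    {"city": "Pune", "sales": 5},
    {"city": "Delhi", "sales": 7},
])

# Unchained: each step needs its own throwaway local variable.
tempDF1 = df.select("city", "sales")
tempDF2 = tempDF1.filter(lambda r: r["sales"] > 4)
result_a = tempDF2.group_count("city")

# Chained: same result, no temporary frames cluttering the scope.
result_b = (df.select("city", "sales")
              .filter(lambda r: r["sales"] > 4)
              .group_count("city"))

print(result_a == result_b)  # True
```

Each intermediate method returns a new frame, which is what makes the chained form possible; the results are identical either way.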
Explanation: A Shared Access Signature in combination with an Event Publisher is used to define authentication for an event hub.
Explanation: Diagnose and solve problems. The Diagnose and solve problems blade can quickly show you the service availability. Azure Monitor helps you collect, analyze, and act on telemetry from both cloud and on-premises environments. Network Performance Monitor measures the performance and reachability of the networks you have configured.
Did you get a score above 70%? Then you are ready to crack the DP-200 Implementing an Azure Data Solution certification exam.
Disclaimer: These questions do NOT appear in the actual certification exam. Neither I personally nor CloudThat has any official tie-up with Microsoft regarding the certification or the kind of questions asked. These are my best guesses at the kind of questions to expect from Microsoft in general and from this examination.
You can practice further by reviewing the TestPrep material included with course enrolment in instructor-led training for the Microsoft certification exam DP-200: Implementing an Azure Data Solution, then appear for the exam and grab your Microsoft badge quickly. A comprehensive study guide to help with your DP-200 certification preparation will kickstart your career as a Data Engineer.
Also, more sample questions are coming up, so keep checking back. Please share your feedback in the comments section below.
Feel free to drop any questions in the comment box, I would love to address them. I hope you enjoyed the article. Best of luck!