Updated DP-300 Study Materials – Pass the Microsoft Azure Database Administrator Exam Smoothly

If you do not know where to start preparing for the DP-300 Administering Relational Databases on Microsoft Azure exam, choose the updated DP-300 study materials from ITExamShop as your preparation resource. The latest DP-300 study materials are written by the top team at ITExamShop, who have curated 197 practice exam questions and answers that candidates can work through regularly using the ITExamShop PDF file and testing engine. With the ITExamShop DP-300 study materials, you can pass the DP-300 Administering Relational Databases on Microsoft Azure exam and earn the Microsoft certification to go further in your career.

Below are free DP-300 questions that you can check before getting the updated DP-300 study materials:


1. You have an on-premises app named App1 that stores data in an on-premises Microsoft SQL Server 2016 database named DB1.

You plan to deploy additional instances of App1 to separate Azure regions. Each region will have a separate instance of App1 and DB1. The separate instances of DB1 will sync by using Azure SQL Data Sync.

You need to recommend a database service for the deployment. The solution must minimize administrative effort.

What should you include in the recommendation?

2. You have an Azure Data Factory that contains 10 pipelines.

You need to label each pipeline with its main purpose: ingest, transform, or load. The labels must be available for grouping and filtering in the Data Factory monitoring experience.

What should you add to each pipeline?
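
As background: in Azure Data Factory, labels that can be used for grouping and filtering in the monitoring experience are attached to pipelines as annotations. The sketch below shows where an annotation sits in a pipeline's JSON definition, built as a Python dict; the pipeline and activity names are invented for the example.

```python
# Sketch: a Data Factory pipeline definition carrying an annotation.
# The "annotations" array under "properties" is what the monitoring
# experience exposes for grouping and filtering.
import json

pipeline_definition = {
    "name": "CopySalesToReporting",      # hypothetical pipeline name
    "properties": {
        "annotations": ["ingest"],       # the label: ingest, transform, or load
        "activities": [
            {
                "name": "CopySalesData", # hypothetical activity
                "type": "Copy",
            }
        ],
    },
}

print(json.dumps(pipeline_definition, indent=2))
```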

3. Topic 3, ADatum Corporation

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview

ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a website.

Existing Environment

ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.

SALESDB collects data from the stores and the website.

DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in two different JSON formats based on the sales channel.

REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process creates reporting data in REPORTINGDB from the data in SALESDB. The process is implemented as a SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.

Requirements

Planned Changes

ADatum plans to move the current data infrastructure to Azure.

The new infrastructure has the following requirements:

✑ Migrate SALESDB and REPORTINGDB to an Azure SQL database.

✑ Migrate DOCDB to Azure Cosmos DB.

✑ The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations that must be done continuously, without gaps, and without overlapping.

✑ As they arrive, all the sales documents in JSON format must be transformed into one consistent format (a sketch of this kind of normalization follows this list).

✑ Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.
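
To make the JSON-normalization requirement above concrete, here is a minimal Python sketch that reshapes two channel-specific document formats into one schema. The field names are invented for illustration, since the case study does not show DOCDB's actual formats, and in practice the transformation would run in Stream Analytics or Data Factory.

```python
# Sketch: normalize two hypothetical sales-document formats into one schema.
# Field names are invented; the real DOCDB formats are not given above.

def normalize(doc: dict) -> dict:
    if "storeId" in doc:                       # assumed retail-store format
        return {
            "channel": "store",
            "sale_id": doc["saleId"],
            "amount": doc["total"],
        }
    return {                                   # assumed website format
        "channel": "web",
        "sale_id": doc["order"]["id"],
        "amount": doc["order"]["amountDue"],
    }

docs = [
    {"storeId": 12, "saleId": "S-001", "total": 19.99},
    {"order": {"id": "W-042", "amountDue": 5.49}},
]
print([normalize(d) for d in docs])
```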



Technical Requirements

The new Azure data infrastructure must meet the following technical requirements:

✑ Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use your own key (a verification sketch follows this list).

✑ SALESDB must be restorable to any given minute within the past three weeks.

✑ Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.

✑ Missing indexes must be created automatically for REPORTINGDB.

✑ Disk IO, CPU, and memory usage must be monitored for SALESDB.
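
For the TDE requirement above, note that configuring TDE with a customer-managed key is done at the server level (portal, PowerShell, or Azure CLI), while the resulting encryption state can be checked from T-SQL through sys.dm_database_encryption_keys. A minimal verification sketch using pyodbc follows; the connection details are placeholders.

```python
# Sketch: verify TDE status for a database, assuming pyodbc is installed
# and the placeholder connection string is filled in.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=SALESDB;"
    "UID=<user>;PWD=<password>"     # placeholders, not real values
)

# encryption_state 3 means the database is encrypted.
row = conn.execute(
    "SELECT encryption_state, encryptor_type "
    "FROM sys.dm_database_encryption_keys"
).fetchone()
print(row)  # e.g. (3, 'ASYMMETRIC KEY') when a customer-managed key is in use
```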



Which windowing function should you use to perform the streaming aggregation of the sales data?
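
An aggregation that is continuous, gap-free, and non-overlapping matches the semantics of a tumbling window in Stream Analytics. To make those semantics concrete, here is a small Python sketch that assigns each event to exactly one fixed-size window; the event data is invented.

```python
# Sketch: tumbling-window semantics - fixed-size, contiguous,
# non-overlapping windows, so every event lands in exactly one window.
from collections import defaultdict

WINDOW_SECONDS = 10
events = [(1, 20.0), (4, 5.0), (11, 7.5), (19, 2.5), (23, 9.0)]  # (ts, amount)

totals = defaultdict(float)
for ts, amount in events:
    window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
    totals[window_start] += amount

for start in sorted(totals):
    print(f"[{start}, {start + WINDOW_SECONDS}) -> {totals[start]}")
# [0, 10) -> 25.0   [10, 20) -> 10.0   [20, 30) -> 9.0
```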

4. You have an Azure SQL database named DB1.

You need to display the estimated execution plan of a query by using the query editor in the Azure portal.

What should you do first?
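
As background for this question: in T-SQL, an estimated plan can be requested by enabling SET SHOWPLAN_ALL before running a query, after which the server returns plan rows instead of executing the statement. A minimal pyodbc sketch, with placeholder connection details:

```python
# Sketch: request an estimated execution plan with SET SHOWPLAN_ALL,
# assuming pyodbc and placeholder connection details.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=DB1;"
    "UID=<user>;PWD=<password>"
)
cur = conn.cursor()
cur.execute("SET SHOWPLAN_ALL ON")      # queries now return plan rows, not data
cur.execute("SELECT * FROM sys.objects")
for row in cur.fetchall():
    print(row.StmtText)                 # one line of the estimated plan
cur.execute("SET SHOWPLAN_ALL OFF")
```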

5. You need to implement the surrogate key for the retail store table. The solution must meet the sales transaction dataset requirements.

What should you create?
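
The sales transaction dataset requirements are not reproduced in this excerpt, but the classic surrogate-key pattern in SQL Server is an IDENTITY column (a SEQUENCE object is the common alternative). An illustrative sketch, with an invented table shape:

```python
# Sketch: a surrogate key as an IDENTITY column; the table shape is
# invented for illustration, since the dataset requirements are not
# shown above.
import pyodbc

DDL = """
CREATE TABLE dbo.DimRetailStore (
    StoreKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- surrogate key
    StoreCode NVARCHAR(20) NOT NULL,                   -- natural/business key
    StoreName NVARCHAR(100) NOT NULL
);
"""

conn = pyodbc.connect("<connection string placeholder>")
conn.execute(DDL)
conn.commit()
```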

6. You have an Azure subscription that contains a server named Server1. Server1 hosts two Azure SQL databases named DB1 and DB2.

You plan to deploy a Windows app named App1 that will authenticate to DB2 by using SQL authentication.

You need to ensure that App1 can access DB2.

The solution must meet the following requirements:

✑ App1 must be able to view only DB2.

✑ Administrative effort must be minimized.

What should you create?
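
A SQL-authentication principal that exists only inside DB2, with no server-level login to manage, is a contained database user. A hedged sketch of creating one through pyodbc; the user name, password, and role grant are placeholders, and the statements must run in DB2 itself, not in master:

```python
# Sketch: create a contained database user in DB2 so App1 authenticates
# directly to that database; names and password are placeholders.
import pyodbc

conn = pyodbc.connect("<connection string to DB2 placeholder>", autocommit=True)
conn.execute("CREATE USER App1User WITH PASSWORD = '<strong password>';")
conn.execute("ALTER ROLE db_datareader ADD MEMBER App1User;")  # example grant
```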

7. What should you implement to meet the disaster recovery requirements for the PaaS solution?

8. You have an Azure Databricks resource.

You need to log actions that relate to changes in compute for the Databricks resource.

Which Databricks services should you log?
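
Compute changes in Azure Databricks are recorded by the clusters service in the workspace diagnostic logs. As an illustration, this Python sketch filters exported diagnostic records for cluster events; the record fields used (category, operationName, time) follow the generic Azure diagnostic-log shape and should be treated as assumptions here.

```python
# Sketch: filter exported Azure Databricks diagnostic records for the
# "clusters" category, which covers compute changes such as create,
# edit, resize, and delete. Field names are assumptions.
import json

def cluster_events(path: str):
    with open(path) as f:
        for line in f:                       # one JSON record per line
            record = json.loads(line)
            if record.get("category") == "clusters":
                yield record.get("time"), record.get("operationName")

for when, op in cluster_events("databricks_logs.jsonl"):  # placeholder path
    print(when, op)
```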

9. Based on the PaaS prototype, which Azure SQL Database compute tier should you use?

10. What should you use to migrate the PostgreSQL database?