We provide real DP-200 exam questions and answers braindumps in two formats: PDF and practice tests. Download the PDF and practice tests to pass the Microsoft DP-200 exam quickly and easily. The DP-200 PDF is suitable for reading and printing, so you can print it and practice as many times as you like. With the help of our Microsoft DP-200 PDF and VCE dumps and material, you can easily pass the DP-200 exam.

Online DP-200 free questions and answers from the new version:

NEW QUESTION 1

A company has a SaaS solution that uses Azure SQL Database with elastic pools. The solution will have a dedicated database for each customer organization. Customer organizations have peak usage at different periods during the year.
Which two factors affect your costs when sizing the Azure SQL Database elastic pools? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

  • A. maximum data size
  • B. number of databases
  • C. eDTU consumption
  • D. number of read operations
  • E. number of transactions

Answer: AC

NEW QUESTION 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to implement diagnostic logging for Data Warehouse monitoring. Which log should you use?

  • A. RequestSteps
  • B. DmsWorkers
  • C. SqlRequests
  • D. ExecRequests

Answer: C

Explanation:
Scenario:
The Azure SQL Data Warehouse cache must be monitored when the database is being used.
[Exhibit]
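For reference, the SqlRequests diagnostic log surfaces the same information as the sys.dm_pdw_sql_requests dynamic management view referenced below. A minimal sketch of querying that DMV directly (the column list is abbreviated):

  -- Inspect the SQL Server query distributions of the steps of PDW requests
  SELECT request_id, step_index, pdw_node_id, status, start_time, end_time, row_count
  FROM sys.dm_pdw_sql_requests
  ORDER BY start_time DESC;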
References:
https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-pdw-sql-r

NEW QUESTION 3

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.
You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse. The data to be ingested resides in Parquet files stored in an Azure Data Lake Gen 2 storage account.
You need to load the data from the Azure Data Lake Gen 2 storage account into the Azure SQL Data Warehouse.
Solution:
1. Create an external data source pointing to the Azure storage account
2. Create a workload group using the Azure storage account name as the pool name
3. Load the data using the INSERT…SELECT statement
Does the solution meet the goal?

  • A. Yes
  • B. No

Answer: B

Explanation:
You need to create an external file format and external table using the external data source. You then load the data using the CREATE TABLE AS SELECT statement.
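A hedged T-SQL sketch of that load pattern (object names such as AdlsGen2Source, ext.Sales, and the credential are hypothetical placeholders):

  -- External data source pointing to the Data Lake Gen 2 account (credential assumed to exist)
  CREATE EXTERNAL DATA SOURCE AdlsGen2Source
  WITH (
      TYPE = HADOOP,
      LOCATION = 'abfss://data@myadlsaccount.dfs.core.windows.net',
      CREDENTIAL = AdlsCredential
  );

  -- External file format describing the Parquet files
  CREATE EXTERNAL FILE FORMAT ParquetFileFormat
  WITH (FORMAT_TYPE = PARQUET);

  -- External table over the files, then load with CREATE TABLE AS SELECT
  CREATE EXTERNAL TABLE ext.Sales (SaleKey INT, Amount DECIMAL(18, 2))
  WITH (LOCATION = '/sales/', DATA_SOURCE = AdlsGen2Source, FILE_FORMAT = ParquetFileFormat);

  CREATE TABLE dbo.Sales
  WITH (DISTRIBUTION = ROUND_ROBIN, CLUSTERED COLUMNSTORE INDEX)
  AS SELECT * FROM ext.Sales;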
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store

NEW QUESTION 4

You need to develop a pipeline for processing data. The pipeline must meet the following requirements:
•Scale up and down resources for cost reduction.
•Use an in-memory data processing engine to speed up ETL and machine learning operations.
•Use streaming capabilities.
•Provide the ability to code in SQL, Python, Scala, and R.
•Integrate workspace collaboration with Git.
What should you use?

  • A. HDInsight Spark Cluster
  • B. Azure Stream Analytics
  • C. HDInsight Hadoop Cluster
  • D. Azure SQL Data Warehouse

Answer: A

NEW QUESTION 5

You are a data architect. The data engineering team needs to configure a synchronization of data between an on-premises Microsoft SQL Server database and Azure SQL Database.
Ad hoc and reporting queries are overutilizing the on-premises production instance. The synchronization process must:
• Perform an initial data synchronization to Azure SQL Database with minimal downtime
• Perform bi-directional data synchronization after the initial synchronization
You need to implement this synchronization solution. Which synchronization method should you use?

  • A. transactional replication
  • B. Data Migration Assistant (DMA)
  • C. backup and restore
  • D. SQL Server Agent job
  • E. Azure SQL Data Sync

Answer: E

Explanation:
SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances.
With Data Sync, you can keep data synchronized between your on-premises databases and Azure SQL databases to enable hybrid applications.
Compare Data Sync with Transactional Replication
[Exhibit]
References:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-sync-data

NEW QUESTION 6

You need to set up the Azure Data Factory JSON definition for Tier 10 data.
What should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Connection String
To use storage account key authentication, you use the ConnectionString property, which specifies the information needed to connect to Blob storage.
Mark this field as a SecureString to store it securely in Data Factory. You can also put the account key in Azure Key Vault and pull the accountKey configuration out of the connection string.
Box 2: Azure Blob
Tier 10 reporting data must be stored in Azure Blobs
[Exhibit]
References:
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage

NEW QUESTION 7

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.
You develop data engineering solutions for a company.
A project requires the deployment of resources to Microsoft Azure for batch data processing on Azure HDInsight. Batch processing will run daily and must:
• Scale to minimize costs
• Be monitored for cluster performance
You need to recommend a tool that will monitor clusters and provide information to suggest how to scale.
Solution: Monitor clusters by using Azure Log Analytics and HDInsight cluster management solutions.
Does the solution meet the goal?

  • A. Yes
  • B. No

Answer: A

Explanation:
HDInsight provides cluster-specific management solutions that you can add for Azure Monitor logs. Management solutions add functionality to Azure Monitor logs, providing additional data and analysis tools. These solutions collect important performance metrics from your HDInsight clusters and provide the tools to search the metrics. They also provide visualizations and dashboards for most cluster types supported in HDInsight. By using the metrics that you collect with the solution, you can create custom monitoring rules and alerts.

NEW QUESTION 8

You are developing a solution using a Lambda architecture on Microsoft Azure. The data at rest must meet the following requirements:
Data storage:
•Serve as a repository for high volumes of large files in various formats.
•Implement optimized storage for big data analytics workloads.
•Ensure that data can be organized using a hierarchical structure.
Batch processing:
•Use a managed solution for in-memory computation processing.
•Natively support Scala, Python, and R programming languages.
•Provide the ability to resize and terminate the cluster automatically.
Analytical data store:
•Support parallel processing.
•Use columnar storage.
•Support SQL-based languages.
You need to identify the correct technologies to build the Lambda architecture.
Which technologies should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit]

NEW QUESTION 9

Your company uses several Azure HDInsight clusters.
The data engineering team reports several errors with some applications using these clusters. You need to recommend a solution to review the health of the clusters.
What should you include in your recommendation?

  • A. Azure Automation
  • B. Log Analytics
  • C. Application Insights

Answer: B

NEW QUESTION 10

A company builds an application to allow developers to share and compare code. The conversations, code snippets, and links shared by people in the application are stored in a Microsoft Azure SQL Database instance. The application allows for searches of historical conversations and code snippets.
When users share code snippets, the code snippet is compared against previously shared code snippets by using a combination of Transact-SQL functions, including SUBSTRING, FIRST_VALUE, and SQRT. If a match is found, a link to the match is added to the conversation.
Customers report the following issues:
• Delays occur during live conversations
• A delay occurs before matching links appear after code snippets are added to conversations
You need to resolve the performance issues.
Which technologies should you use? To answer, drag the appropriate technologies to the correct issues. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: memory-optimized table
In-Memory OLTP can provide great performance benefits for transaction processing, data ingestion, and transient data scenarios.
Box 2: materialized view
To support efficient querying, a common solution is to generate, in advance, a view that materializes the data in a format suited to the required results set. The Materialized View pattern describes generating prepopulated views of data in environments where the source data isn't in a suitable format for querying, where generating a suitable query is difficult, or where query performance is poor due to the nature of the data or the data store.
These materialized views, which only contain data required by a query, allow applications to quickly obtain the information they need. In addition to joining tables or combining data entities, materialized views can include the current values of calculated columns or data items, the results of combining values or executing transformations on the data items, and values specified as part of the query. A materialized view can even be optimized for just a single query.
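As an illustration of Box 1, an In-Memory OLTP table is declared in T-SQL roughly as follows (table and column names are hypothetical; the database must already have a memory-optimized filegroup):

  -- Memory-optimized table to reduce latching and locking delays on live conversations
  CREATE TABLE dbo.ConversationMessages (
      MessageId INT IDENTITY NOT NULL PRIMARY KEY NONCLUSTERED,
      ConversationId INT NOT NULL,
      Body NVARCHAR(4000) NOT NULL,
      PostedAt DATETIME2 NOT NULL
  )
  WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);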
References:
https://docs.microsoft.com/en-us/azure/architecture/patterns/materialized-view

NEW QUESTION 11

Your company plans to create an event processing engine to handle streaming data from Twitter. The data engineering team uses Azure Event Hubs to ingest the streaming data.
You need to implement a solution that uses Azure Databricks to receive the streaming data from the Azure Event Hubs.
Which three actions should you recommend be performed in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit]

NEW QUESTION 12

You need to process and query ingested Tier 9 data.
Which two options should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Azure Notification Hub
  • B. Transact-SQL statements
  • C. Azure Cache for Redis
  • D. Apache Kafka statements
  • E. Azure Event Grid
  • F. Azure Stream Analytics

Answer: EF

Explanation:
Event Hubs provides a Kafka endpoint that can be used by your existing Kafka based applications as an alternative to running your own Kafka cluster.
You can stream data into Kafka-enabled Event Hubs and process it with Azure Stream Analytics, in the following steps:
• Create a Kafka-enabled Event Hubs namespace.
• Create a Kafka client that sends messages to the event hub.
• Create a Stream Analytics job that copies data from the event hub into Azure Blob storage.
Scenario:
• Tier 9 reporting must be moved to Event Hubs, queried, and persisted in the same Azure region as the company’s main office
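The Stream Analytics job itself is defined with a SQL-like query; a minimal sketch, where EventHubInput and BlobOutput are hypothetical alias names configured as the job's input and output:

  -- Copy every event from the event hub input to the Blob storage output
  SELECT *
  INTO BlobOutput
  FROM EventHubInput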
References:
https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-kafka-stream-analytics

NEW QUESTION 13

You need to provision the polling data storage account.
How should you configure the storage account? To answer, drag the appropriate Configuration Value to the correct Setting. Each Configuration Value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit]

NEW QUESTION 14

You are developing the data platform for a global retail company. The company operates during normal working hours in each region. The analytical database is used once a week for building sales projections.
Each region maintains its own private virtual network.
Building the sales projections is very resource intensive and generates upwards of 20 terabytes (TB) of data. Microsoft Azure SQL Databases must be provisioned according to the following requirements:
• Database provisioning must maximize performance and minimize cost
• The daily sales for each region must be stored in an Azure SQL Database instance
• Once a day, the data for all regions must be loaded in an analytical Azure SQL Database instance
You need to provision Azure SQL Database instances.
How should you provision the database instances? To answer, drag the appropriate Azure SQL products to the correct databases. Each Azure SQL product may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Azure SQL Database elastic pools
SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources at a set price. Elastic pools in Azure SQL Database enable SaaS developers to optimize the price-performance for a group of databases within a prescribed budget while delivering performance elasticity for each database.
Box 2: Azure SQL Database Hyperscale
A Hyperscale database is an Azure SQL database in the Hyperscale service tier that is backed by the Hyperscale scale-out storage technology. A Hyperscale database supports up to 100 TB of data and provides high throughput and performance, as well as rapid scaling to adapt to the workload requirements. Scaling is transparent to the application – connectivity, query processing, and so on, work like any other SQL database.
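As a hedged illustration, an existing database can be moved to the Hyperscale tier with T-SQL along these lines (the database name and service objective are placeholders):

  -- Move the analytical database to the Hyperscale service tier
  ALTER DATABASE [AnalyticsDb]
  MODIFY (EDITION = 'Hyperscale', SERVICE_OBJECTIVE = 'HS_Gen5_4');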

NEW QUESTION 15

A company runs Microsoft SQL Server in an on-premises virtual machine (VM).
You must migrate the database to Azure SQL Database. You synchronize users from Active Directory to Azure Active Directory (Azure AD).
You need to configure Azure SQL Database to use an Azure AD user as administrator. What should you configure?

  • A. For each Azure SQL Database, set the Access Control to administrator.
  • B. For the Azure SQL Database server, set the Active Directory to administrator.
  • C. For each Azure SQL Database, set the Active Directory administrator role.
  • D. For the Azure SQL Database server, set the Access Control to administrator.

Answer: B
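Once the Active Directory administrator is set at the server level, that administrator can create contained database users for other Azure AD identities with T-SQL, for example (the user principal name and role are illustrative):

  -- Run while connected to the target database as the Azure AD admin
  CREATE USER [engineer@contoso.com] FROM EXTERNAL PROVIDER;
  ALTER ROLE db_datareader ADD MEMBER [engineer@contoso.com];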

NEW QUESTION 16

You are developing a solution to visualize multiple terabytes of geospatial data. The solution has the following requirements:
•Data must be encrypted.
•Data must be accessible by multiple resources on Microsoft Azure.
You need to provision storage for the solution.
Which four actions should you perform in sequence? To answer, move the appropriate action from the list of actions to the answer area and arrange them in the correct order.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit]

NEW QUESTION 17

You need to ensure polling data security requirements are met.
Which security technologies should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Azure Active Directory user
Scenario: Access to polling data must be set on a per-Active Directory user basis
Box 2: DataBase Scoped Credential
SQL Server uses a database scoped credential to access non-public Azure blob storage or Kerberos-secured Hadoop clusters with PolyBase.
PolyBase cannot authenticate by using Azure AD authentication.
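A hedged T-SQL sketch of the Box 2 setup (the names, the password, and the storage key are placeholders; the secret is the storage account key PolyBase uses to reach the non-public container):

  -- A database master key must exist before a database scoped credential can be created
  CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

  -- Credential PolyBase uses to access the non-public Blob storage account
  CREATE DATABASE SCOPED CREDENTIAL PollingStorageCredential
  WITH IDENTITY = 'user', SECRET = '<storage account key>';

  -- External data source that references the credential
  CREATE EXTERNAL DATA SOURCE PollingBlobStorage
  WITH (
      TYPE = HADOOP,
      LOCATION = 'wasbs://polling@mystorageaccount.blob.core.windows.net',
      CREDENTIAL = PollingStorageCredential
  );

References: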
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql

NEW QUESTION 18

You are a data engineer. You are designing a Hadoop Distributed File System (HDFS) architecture. You plan to use Microsoft Azure Data Lake as a data storage repository.
You must provision the repository with a resilient data schema. You need to ensure the resiliency of the Azure Data Lake Storage.
What should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: NameNode
An HDFS cluster consists of a single NameNode, a master server that manages the file system namespace and regulates access to files by clients.
Box 2: DataNode
The DataNodes are responsible for serving read and write requests from the file system’s clients.
Box 3: DataNode
The DataNodes perform block creation, deletion, and replication upon instruction from the NameNode.
Note: HDFS has a master/slave architecture. An HDFS cluster consists of a single NameNode, a master server that manages the file system namespace and regulates access to files by clients. In addition, there are a number of DataNodes, usually one per node in the cluster, which manage storage attached to the nodes that they run on. HDFS exposes a file system namespace and allows user data to be stored in files. Internally, a file is split into one or more blocks and these blocks are stored in a set of DataNodes. The NameNode executes file system namespace operations like opening, closing, and renaming files and directories. It also determines the mapping of blocks to DataNodes. The DataNodes are responsible for serving read and write requests from the file system’s clients. The DataNodes also perform block creation, deletion, and replication upon instruction from the NameNode.
References: https://hadoop.apache.org/docs/r1.2.1/hdfs_design.html#NameNode+and+DataNodes

NEW QUESTION 19

Contoso, Ltd. plans to configure existing applications to use Azure SQL Database. When security-related operations occur, the security team must be informed. You need to configure Azure Monitor while minimizing administrative effort.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Create a new action group to email alerts@contoso.com.
  • B. Use alerts@contoso.com as an alert email address.
  • C. Use all security operations as a condition.
  • D. Use all Azure SQL Database servers as a resource.
  • E. Query audit log entries as a condition.

Answer: ACE

NEW QUESTION 20

You need to ensure that phone-based polling data can be analyzed in the PollingData database.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit]
Scenario:
All deployments must be performed by using Azure DevOps. Deployments must use templates used in multiple environments.
No credentials or secrets should be used during deployments.

NEW QUESTION 21

You need to mask tier 1 data. Which functions should you use? To answer, select the appropriate option in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
A: Default
Full masking according to the data types of the designated fields.
For string data types, use XXXX or fewer Xs if the size of the field is less than 4 characters (char, nchar, varchar, nvarchar, text, ntext).
B: email
C: Custom text
Custom String: masking method that exposes the first and last letters and adds a custom padding string in the middle: prefix,[padding],suffix
Tier 1 Database must implement data masking using the following masking logic:
[Exhibit]
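A hedged T-SQL sketch of applying these three masking functions (table and column names are hypothetical):

  -- Default masking: full mask according to the column's data type
  ALTER TABLE dbo.Respondents
  ALTER COLUMN PhoneNumber ADD MASKED WITH (FUNCTION = 'default()');

  -- Email masking: exposes the first letter and the constant .com suffix
  ALTER TABLE dbo.Respondents
  ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

  -- Custom string masking: prefix,[padding],suffix
  ALTER TABLE dbo.Respondents
  ALTER COLUMN CardNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)');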
References:
https://docs.microsoft.com/en-us/sql/relational-databases/security/dynamic-data-masking

NEW QUESTION 22

You need to set up access to Azure SQL Database for Tier 7 and Tier 8 partners.
Which three actions should you perform in sequence? To answer, move the appropriate three actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Scenario: Tier 7 and Tier 8 data access is constrained to single endpoints managed by partners.
Step 1: Set the Allow Azure Services to Access Server setting to Disabled
Set Allow access to Azure services to OFF for the most secure configuration.
By default, access through the SQL Database firewall is enabled for all Azure services, under Allow access to Azure services. Choose OFF to disable access for all Azure services.
Note: The firewall pane has an ON/OFF button that is labeled Allow access to Azure services. The ON setting allows communications from all Azure IP addresses and all Azure subnets. These Azure IPs or subnets might not be owned by you. This ON setting is probably more open than you want your SQL Database to be. The virtual network rule feature offers much finer granular control.
Step 2: In the Azure portal, create a server firewall rule
Set up SQL Database server firewall rules:
Server-level IP firewall rules apply to all databases within the same SQL Database server. To set up a server-level firewall rule:
• In the Azure portal, select SQL databases from the left-hand menu, and select your database on the SQL databases page.
• On the Overview page, select Set server firewall. The Firewall settings page for the database server opens.
Step 3: Connect to the database and use Transact-SQL to create a database firewall rule
Database-level firewall rules can only be configured using Transact-SQL (T-SQL) statements, and only after you've configured a server-level firewall rule.
To set up a database-level firewall rule:
• In Object Explorer, right-click the database and select New Query.
• EXECUTE sp_set_database_firewall_rule N'Example DB Rule','0.0.0.4','0.0.0.4';
• On the toolbar, select Execute to create the firewall rule.
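The server-level rule in Step 2 can also be created with T-SQL instead of the portal; a hedged sketch, run in the master database (the rule name and the partner endpoint IP are placeholders):

  -- Server-level rule allowing a single partner endpoint
  EXECUTE sp_set_firewall_rule
      @name = N'Tier7PartnerEndpoint',
      @start_ip_address = '203.0.113.10',
      @end_ip_address = '203.0.113.10';

References: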
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-tutorial

NEW QUESTION 23

You implement an Azure SQL Data Warehouse instance.
You plan to migrate the largest fact table to Azure SQL Data Warehouse. The table resides on Microsoft SQL Server on-premises and is 10 terabytes (TB) in size.
Incoming queries use the primary key Sale Key column to retrieve data, as displayed in the following table:
[Exhibit]
You need to distribute the fact table across multiple nodes to optimize performance of the table. Which technology should you use?

  • A. hash distributed table with clustered ColumnStore index
  • B. hash distributed table with clustered index
  • C. heap table with distribution replicate
  • D. round robin distributed table with clustered index
  • E. round robin distributed table with clustered ColumnStore index

Answer: A
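A hedged sketch of such a table definition in T-SQL (the column list is abbreviated and the names are illustrative):

  -- Hash distribution on the lookup key spreads the 10 TB fact table across distributions;
  -- the clustered columnstore index adds high compression and fast scans
  CREATE TABLE dbo.FactSales (
      [Sale Key] INT NOT NULL,
      [Customer Key] INT NOT NULL,
      [Amount] DECIMAL(18, 2) NOT NULL
  )
  WITH (
      DISTRIBUTION = HASH([Sale Key]),
      CLUSTERED COLUMNSTORE INDEX
  );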

NEW QUESTION 24

The data engineering team manages Azure HDInsight clusters. The team spends a large amount of time creating and destroying clusters daily because most of the data pipeline process runs in minutes.
You need to implement a solution that deploys multiple HDInsight clusters with minimal effort. What should you implement?

  • A. Azure Databricks
  • B. Azure Traffic Manager
  • C. Azure Resource Manager templates
  • D. Ambari web user interface

Answer: C

Explanation:
A Resource Manager template makes it easy to create the following resources for your application in a single, coordinated operation:
• HDInsight clusters and their dependent resources (such as the default storage account).
• Other resources (such as Azure SQL Database to use Apache Sqoop).
In the template, you define the resources that are needed for the application. You also specify deployment parameters to input values for different environments. The template consists of JSON and expressions that you use to construct values for your deployment.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-hadoop-create-linux-clusters-arm-templates

NEW QUESTION 25

Your company has an on-premises Microsoft SQL Server instance.
The data engineering team plans to implement a process that copies data from the SQL Server instance to Azure Blob storage. The process must orchestrate and manage the data lifecycle.
You need to configure Azure Data Factory to connect to the SQL Server instance.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit]

NEW QUESTION 26
......

100% Valid and Newest Version DP-200 Questions & Answers shared by Certifytools, Get Full Dumps HERE: https://www.certifytools.com/DP-200-exam.html (New 88 Q&As)