DP-900 Azure Data Fundamentals Dumps & Practice Questions
- CertiMaan
- Oct 27
- 9 min read
Updated: 8 hours ago
Get fully prepared for the Microsoft Azure Data Fundamentals DP-900 exam with our updated dumps and sample questions. These DP-900 dumps are designed to mirror the real exam format, covering key domains like core data concepts, relational and non-relational data in Azure, and analytics workloads. Whether you're aiming for DP-900 certification or looking for quick revision with practice exams and mock tests, this resource is tailored for efficient and confident learning. Perfect for beginners exploring Azure data services in 2026.
DP-900 Azure Data Fundamentals Dumps & Sample Questions List:
1. Which of the following tools can be used in macOS to access data stored in Azure SQL Database?
Azure Data Studio
Azure Storage Explorer
SQL Server Data Tools
SQL Server Management Studio
2. Which of the following statements is true about SQL Server running on a virtual machine?
Software installation and maintenance are automated, but you must do your own backups
You must install and maintain the software for the database management system yourself, but backups are automated
You're responsible for all software installation and maintenance, and for performing backups
3. Relational databases use … to enforce relationships between tables.
Columns
Partitions
Collections
Keys
4. You need to recommend a storage solution for a sales system that will receive thousands of small files per minute. The files will be in JSON, text, and CSV formats. The files will be processed and transformed before they are loaded into a data warehouse in Azure Synapse Analytics. The files must be stored and secured in folders. Which storage solution should you recommend?
Azure Cosmos DB
Azure Data Lake Storage Gen2
Azure Blob storage
Azure SQL Database
5. Normalizing the database reduces data redundancy.
No
Yes
6. An organization needs to store employees' personal details in a format that cannot easily be read by anyone; to read the data, a person would need a key. Into which category does this data fall?
Semi structured data
Encrypted data
Batch data
Streaming data
7. You are designing a data storage solution for a database that is expected to grow to 50 TB. The usage pattern is singleton inserts, singleton updates, and reporting. Which storage solution should you use?
Azure Cosmos DB that uses the Gremlin API
Azure SQL Data Warehouse
Azure SQL Database Hyperscale
Azure SQL Database elastic pools
8. Pair the visualization type with an appropriate description. A chart that shows the main contributors to the selected result or value.
Major influencers
Scatter plot
Tree map
9. You have a large amount of data held in files in Azure Data Lake storage. You want to retrieve the data in these files and use it to populate tables held in Azure Synapse Analytics. Which processing option is most appropriate?
Use Azure Synapse Link to connect to Azure Data Lake storage and download the data
Synapse Spark pool
Synapse SQL pool
10. You are designing an application that will store petabytes of medical imaging data. When the data is first created, it will be accessed frequently during the first week. After one month, the data must be accessible within 30 seconds, but files will be accessed infrequently. After one year, the data will be accessed infrequently but must be accessible within five minutes. You need to select a storage strategy for the data, and the solution must minimize costs. Which storage tier should you use for data older than one year?
Cool
Hot
Archive
11. Which is an important characteristic of relational databases?
No dependencies between tables
Lots of duplicate data
Query and manipulate data using a variant of the SQL language
Flexible data structure
12. Data analysts create data visualizations to help companies make data-driven decisions.
Yes
No
13. A company is developing a mission-critical line-of-business app that uses Azure SQL Database Managed Instance. You must design a disaster recovery strategy for the solution. You need to ensure that the database automatically recovers when full or partial loss of the Azure SQL Database service occurs in the primary region. What should you recommend?
Failover-group
Azure SQL Data Sync
SQL Replication
Active geo-replication
14. Select the correct option:
A database object that holds data - View; a database object whose content is defined by a query - Index; a database object that helps to improve the speed of data retrieval - Table
A database object that holds data - Table; a database object whose content is defined by a query - View; a database object that helps to improve the speed of data retrieval - Index
A database object that holds data - View; a database object whose content is defined by a query - Table; a database object that helps to improve the speed of data retrieval - Index
15. You need to gather real-time telemetry data from a mobile application. Which type of workload describes this scenario?
Streaming
Massively parallel processing (MPP)
Batch
Online Transaction Processing (OLTP)
16. You copy data between an Azure Storage account and an Azure SQL data warehouse by using Azure Data Factory. You need full control over the runtime environment, including deciding when to apply patches and updates to it. Which type of integration runtime should you use?
Azure integration runtime
Azure-SSIS integration runtime
Self-hosted integration runtime
17. What is [event_date] in the following T-SQL example? DELETE FROM [dbo].[tickets] WHERE [event_date] < GETDATE().
An entity node in the tickets table
A record in the tickets table
A table event in the database
A column field in the tickets table
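The DELETE statement in the question above can be illustrated with a minimal sketch. This uses Python's built-in sqlite3 module rather than Azure SQL Database, and the dates are made-up sample values; it simply shows that `event_date` behaves as a column that each row holds a value for.

```python
import sqlite3

# Illustrative tickets table where event_date is a column (SQLite, not T-SQL).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER PRIMARY KEY, event_date TEXT)")
conn.executemany("INSERT INTO tickets (event_date) VALUES (?)",
                 [("2020-01-01",), ("2099-01-01",)])

# Analogous to: DELETE FROM [dbo].[tickets] WHERE [event_date] < GETDATE()
# date('now') is SQLite's equivalent of T-SQL's GETDATE() for this comparison.
conn.execute("DELETE FROM tickets WHERE event_date < date('now')")
remaining = conn.execute("SELECT event_date FROM tickets").fetchall()
print(remaining)  # only the future-dated row is left
```

Every row supplies its own `event_date` value, which is why the WHERE clause can filter on it row by row.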
18. When using an Infrastructure as a Service (IaaS) instance of Microsoft SQL Server on Azure, it is the [.......] that hosts the SQL Server that you manage.
MySQL server
Virtual machine
PostgreSQL server
Elastic pool
19. You are developing a solution that will stream data to Azure Stream Analytics. The solution will have both streaming data and reference data. Which input type should you use for the reference data?
Azure Event Hubs
Azure IoT Hub
Azure Blob storage
Azure Cosmos DB
20. Your company uses several Azure HDInsight clusters. The data engineering team reports several errors with some applications using these clusters. You need to recommend a solution to review the health of the clusters. What should you include in your recommendation?
Azure Diagnostics
Azure Automation
Log Analytics
Application Insights
21. A company manages several on-premises Microsoft SQL Server databases. You need to migrate the databases to Microsoft Azure by using a backup process of Microsoft SQL Server. Which data technology should you use?
Azure Cosmos DB
Azure SQL Data Warehouse
Azure SQL Database single database
Azure SQL Database Managed Instance
22. An application will use Microsoft Azure Cosmos DB as its data solution. The application will use the Cassandra API to support a column-based database type that uses containers to store items. You need to provision Azure Cosmos DB. Which container name and item name should you use?
Entities and collection
Collection and row
Graph and rows
Rows and table
23. You have an e-commerce application that reads and writes data to an Azure SQL Database. What type of processing does this application use?
OLTP
OLAP
Stream
Batch
24. Which storage solution supports Role-Based Access Control (RBAC) at the folder and file level?
Azure Disk Storage
Azure Blob storage
Azure Data Lake Storage
Azure Queue storage
25. You need to develop a solution to provide data to executives. The solution must provide an interactive graphical interface, depict various key performance indicators, and support data exploration by using drill down. What should you use in Microsoft Power BI?
A dashboard
A dataflow
A report
Microsoft Power Apps
26. What database type should a data engineer set up to show the links and relationships of bank accounts, houses, cars, and persons to help data scientists detect money laundering activities?
Relational
Query
Node
Graph
27. You are developing a solution on Microsoft Azure. The data-at-rest layer must meet the following requirements: serve as a repository for high volumes of large files in various formats; implement optimized storage for big data analytics workloads; ensure that data can be organized using a hierarchical structure. Which technology should you use?
HDInsight Spark
Azure Data Lake Store
HDInsight Hadoop
Azure Databricks
28. As a data engineer at a company, you have identified the need to provision a relational database in Azure. You get approval, but your manager says that the solution must minimize ongoing maintenance. Which Azure service should you use?
Azure SQL Managed Instance
Azure Cosmos DB
Azure Table Store
MySQL via IaaS
29. Which role is responsible for creating visualizations and graphs that help companies make informed decisions?
Data analyst
Data engineer
Database administrator
Data scientist
30. Azure Databricks can process data held in many different types of storage, including Azure Blob storage, Azure Data Lake Store, Hadoop storage, flat files, SQL databases, data warehouses, and Azure services such as Cosmos DB.
No
Yes
31. Working as a consultant, you need to advise your client on a technology that allows their existing Microsoft SQL Server in Azure to access external data in their Oracle database in Azure. What do you recommend?
Route the queries via Azure SQL Data Warehouse
Upgrade to SQL Server Premium
Route the queries via Azure SQL Cosmos
PolyBase
32. When should you use a block blob, and when should you use a page blob?
Use a page blob for blobs that require random read and write access. Use a block blob for discrete objects that change infrequently.
Use a block blob for active data stored using the Hot data access tier, and a page blob for data stored using the Cool or Archive data access tiers.
Use a block blob for unstructured data that requires random access to perform reads and writes. Use a page blob for discrete objects that rarely change.
33. Normalization involves eliminating relationships between tables.
Yes
No
34. Complete the following sentence: Documents in a Cosmos DB database are organized into [_________].
Tables.
Containers.
Pages.
Partitions.
35. Key/value datastores are optimized for simple searches.
Yes
No
36. Although data within Azure Cosmos DB is encrypted at rest and in transit, data engineers have the option to disable this to increase database performance and throughput.
Yes
No
37. Before you can use the AzCopy CLI command to upload or download files from an Azure file share, you must first generate a:
Shared access signature (SAS) token.
Azure storage access key (ASAK).
Transfer security token (TST).
Azure permission token.
38. Classifying images from last year is an example of batch processing.
Yes
No
39. Relational databases use _________________________ to enforce relationships between different data tables and _____________________ to enforce referential integrity.
Sequences, index
Keys, Foreign Key
Columns, Domain constraints
Partitions, logical partitions
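The keys and foreign-key relationship the question above asks about can be sketched concretely. This is a minimal illustration using Python's sqlite3 module (not Azure SQL); table names `customers` and `orders` and the sample IDs are invented for the example.

```python
import sqlite3

# A primary key / foreign key pair enforcing a relationship between two
# tables, and referential integrity on insert (SQLite syntax).
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id))""")

conn.execute("INSERT INTO customers (id) VALUES (1)")
conn.execute("INSERT INTO orders (id, customer_id) VALUES (10, 1)")  # valid parent

violation_caught = False
try:
    # Referential integrity: customer 99 does not exist, so this must fail.
    conn.execute("INSERT INTO orders (id, customer_id) VALUES (11, 99)")
except sqlite3.IntegrityError:
    violation_caught = True
```

The `REFERENCES` clause is what makes the foreign key reject rows that point at a non-existent parent, which is the "referential integrity" half of the answer.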
40. The production workload is facing a technical issue with one of the servers. You need to collect and analyze the logs to determine the root cause of the issue. What type of analysis would you perform?
Diagnostic analysis
Prescriptive analysis
Cognitive analysis
Predictive analysis
41. Using T-SQL, fill in the blank in following command to set the status of a selection of tickets: [_________] [dbo].[tickets] SET [status] = ‘Sold’ WHERE [seat] IN (‘4B’, ‘5B’, ‘6B’).
CHANGE
INSERT
UPDATE
MODIFY
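The completed UPDATE statement from the question above can be run end to end in a small sketch. This uses Python's sqlite3 module rather than T-SQL, and the extra seat '7B' is an invented control row showing that rows outside the IN list are untouched.

```python
import sqlite3

# Mirrors: UPDATE [dbo].[tickets] SET [status] = 'Sold'
#          WHERE [seat] IN ('4B', '5B', '6B')
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (seat TEXT PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO tickets VALUES (?, 'Available')",
                 [("4B",), ("5B",), ("6B",), ("7B",)])

conn.execute("UPDATE tickets SET status = 'Sold' WHERE seat IN ('4B','5B','6B')")
sold = conn.execute("SELECT COUNT(*) FROM tickets WHERE status = 'Sold'").fetchone()[0]
print(sold)  # 3 seats marked Sold; 7B stays Available
```

UPDATE is the only option listed that is a valid SQL data-modification keyword for changing existing rows; CHANGE and MODIFY are not SQL statements, and INSERT adds rows rather than altering them.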
42. Which workload type is involved in the operation of a system and is characterized by a large number of short online transactions (INSERT, UPDATE, DELETE)? The main emphasis for the system is placed on very fast query processing, maintaining data integrity, and effectiveness measured in transactions per second.
LDAP
TCAP
OLAP
OLTP
43. Azure Databricks is based on Apache Spark.
No
Yes
44. By default, Azure Cosmos DB accounts are accessible from the internet, as long as requests are accompanied by a valid authorization token.
Yes
No
45. You have a requirement to process data whenever it changes (in real time). Which processing strategy would you recommend?
Stream Processing
Batch processing
Analytical Processing
46. What tasks are data engineers responsible for?
Explore data to identify trends.
Implement backup and recovery plan policies, tools, and processes.
Design and build analytical models.
Design and implement datastores for analytics workloads.
47. Which of the following tools can be used in macOS to access data stored in Azure SQL Database?
SQL Server Management Studio
Azure Storage Explorer
Azure Data Studio
SQL Server Data Tools
48. Pair the visualization type with an appropriate description. A graph showing the relationship between two numbers.
Major influencers
Scatter plot
Tree map
49. What is [dbo].[tickets] in the following T-SQL example? INSERT INTO [dbo].[tickets] (id, price, venue, tix_class) VALUES (NEWID(), 23.99, ‘Great Hall’, ‘Class B’)
A database
A JSON document
A graph node
A table
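The INSERT statement in the question above targets a table, and that can be shown with a short sketch. This uses Python's sqlite3 module instead of T-SQL, with `uuid.uuid4()` standing in for T-SQL's `NEWID()`; the schema is inferred from the column list in the question.

```python
import sqlite3
import uuid

# Mirrors: INSERT INTO [dbo].[tickets] (id, price, venue, tix_class)
#          VALUES (NEWID(), 23.99, 'Great Hall', 'Class B')
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tickets (
    id TEXT PRIMARY KEY, price REAL, venue TEXT, tix_class TEXT)""")

conn.execute("INSERT INTO tickets (id, price, venue, tix_class) VALUES (?, ?, ?, ?)",
             (str(uuid.uuid4()), 23.99, "Great Hall", "Class B"))
row = conn.execute("SELECT price, venue, tix_class FROM tickets").fetchone()
print(row)
```

`[dbo].[tickets]` is the table being inserted into; `dbo` is the schema name, and `tickets` is the table within it.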
FAQs
1. What is the Microsoft Azure Data Fundamentals (DP-900) certification?
The DP-900 certification validates your foundational knowledge of core data concepts and how data services are implemented using Microsoft Azure.
2. How do I become certified in Azure Data Fundamentals (DP-900)?
You must pass the DP-900 exam, which measures your understanding of data storage, processing, and analytics in Azure environments.
3. What are the prerequisites for the Microsoft DP-900 exam?
There are no prerequisites for the DP-900 exam. It’s ideal for beginners interested in cloud data management and analytics.
4. How much does the Azure Data Fundamentals certification cost?
The exam costs $99 USD, though the price may vary based on your location and currency.
5. How many questions are in the DP-900 certification exam?
The exam typically includes 40–60 multiple-choice and multiple-select questions to be completed within 60 minutes.
6. What topics are covered in the Azure Data Fundamentals (DP-900) exam?
It covers core data concepts, relational and non-relational data, data analytics, and Azure data services such as SQL Database, Cosmos DB, and Synapse Analytics.
7. How difficult is the Microsoft Azure Data Fundamentals (DP-900) exam?
It’s considered entry-level and beginner-friendly, with no coding experience required.
8. How long does it take to prepare for the Azure DP-900 certification exam?
Most candidates prepare in 2–4 weeks, depending on their familiarity with data concepts and Azure tools.
9. What jobs can I get after earning the Azure Data Fundamentals (DP-900) certification?
You can pursue roles such as Data Analyst, Database Administrator, Cloud Data Engineer, or BI Specialist.
10. How much salary can I earn with the Microsoft Azure Data Fundamentals (DP-900) certification?
Certified professionals typically earn between $70,000–$95,000 per year, depending on experience and role.
