MICROSOFT CERTIFICATION

DP-300 Azure Database Administrator Associate Practice Exam

Exam Number: 3115 | Last updated 16-Apr-26 | 786+ questions across 5 vendor-aligned objectives

The DP-300 Azure Database Administrator Associate certification validates the skills of database administrators who manage relational databases on Azure, including SQL Server and Azure SQL. The exam measures your ability to work with Azure SQL Database, Azure SQL Managed Instance, SQL Server on Azure VMs, Azure Data Studio, and Elastic Jobs, demonstrating both the conceptual understanding and the practical implementation skills required in today's enterprise environments.

The heaviest exam domains include Plan and Implement Data Platform Resources (20–25%), Monitor, Configure, and Optimize Database Resources (20–25%), and Plan and Implement a High Availability and Disaster Recovery Environment (20–25%). These areas collectively represent the majority of exam content and require focused preparation across their respective subtopics.

Additional domains tested include Implement a Secure Environment (15–20%), and Configure and Manage Automation of Tasks (15–20%). Together, these areas round out the full exam blueprint and ensure candidates possess well-rounded expertise across the certification scope.

HADR and resource optimization rank among the equally weighted top domains. Master active geo-replication, auto-failover groups, and intelligent performance tuning with Azure SQL Database Advisor and Query Performance Insight.

Every answer links to the source. Each explanation below includes a hyperlink to the exact Microsoft documentation page the question was derived from. PowerKram is the only practice platform with source-verified explanations. Learn about our methodology →

310 practice exam users · 94.7% satisfied users · 91.2% passed the exam · 4.8/5 quality rating

Test your DP-300 Azure Database Administrator Associate knowledge

10 of 786+ questions

Question #1 - Plan and Implement Data Platform Resources

A company is migrating an on-premises SQL Server database to Azure. The database uses cross-database queries and SQL Agent jobs that must be preserved.

Which Azure SQL deployment option should be selected?

A) Azure SQL Database single database
B) Azure Database for PostgreSQL
C) Azure SQL Database serverless
D) Azure SQL Managed Instance

 

Correct answer: D – Explanation:
SQL Managed Instance supports cross-database queries and SQL Agent jobs, providing near-complete SQL Server compatibility. Single database does not support cross-database queries. Serverless is a billing model, not a compatibility solution. PostgreSQL is a different database engine. Source: Check Source

Question #2 - Plan and Implement Data Platform Resources

A company migrates an on-premises SQL Server database to Azure. The database uses cross-database queries and SQL Agent jobs that must be preserved.

Which Azure SQL deployment option should be selected?

A) Azure SQL Managed Instance providing near-complete SQL Server engine compatibility
B) Azure SQL Database single database running in the General Purpose service tier
C) Azure Database for PostgreSQL Flexible Server with SQL compatibility extensions
D) Azure SQL Database serverless compute tier with auto-pause for cost optimization

 

Correct answer: A – Explanation:
SQL Managed Instance supports cross-database queries, SQL Agent jobs, and the vast majority of SQL Server engine features in a managed PaaS environment. Single database does not support cross-database queries or SQL Agent. Serverless is a compute billing model available on single database, not a compatibility solution. PostgreSQL is a different database engine entirely requiring application rewrite. Source: Check Source
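
The elimination logic in this explanation can be sketched as a tiny decision helper. This is a hypothetical illustration, not an official Microsoft tool; the feature-to-option mapping simply mirrors the reasoning above.

```python
def choose_deployment(needs_cross_db_queries: bool, needs_sql_agent: bool,
                      engine: str = "sqlserver") -> str:
    """Toy decision helper mirroring the elimination in the explanation.

    Cross-database queries and native SQL Agent jobs are instance-scoped
    features, so preserving either one points at SQL Managed Instance.
    """
    if engine != "sqlserver":
        return "Azure Database for PostgreSQL"  # different engine entirely
    if needs_cross_db_queries or needs_sql_agent:
        return "Azure SQL Managed Instance"
    return "Azure SQL Database single database"

# The migration scenario: both instance-scoped features must be preserved.
print(choose_deployment(needs_cross_db_queries=True, needs_sql_agent=True))
```

Note that "serverless" never appears as an output: it is a compute billing option layered on single database, not a separate compatibility target.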

Question #3 - Plan and Implement Data Platform Resources

A startup needs a cost-effective Azure SQL Database for dev workloads that are active only during business hours and should auto-pause when idle.

Which configuration minimizes cost?

A) Business Critical tier with reserved capacity providing premium performance and local SSD
B) Serverless compute tier with auto-pause enabled suspending the database during idle periods
C) Hyperscale tier with read scale-out replicas optimized for very large database workloads
D) Provisioned General Purpose tier with compute resources allocated continuously at all times

 

Correct answer: B – Explanation:
Serverless auto-pauses the database during inactivity, charging only for storage until the next connection resumes compute. Provisioned General Purpose charges continuously regardless of activity. Business Critical provides premium performance at premium cost, unsuitable for intermittent dev work. Hyperscale targets very large databases with read replicas, adding unnecessary cost for a development workload. Source: Check Source
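
The cost difference comes down to simple arithmetic: serverless bills compute only while the database is active, while provisioned bills around the clock. The rates below are illustrative placeholders, not real Azure pricing; only the billing shape is the point.

```python
# Back-of-the-envelope comparison of provisioned vs. serverless compute cost.
# Both rates are assumed placeholders, NOT real Azure pricing.
VCORE_SECOND_RATE = 0.000145    # assumed $/vCore-second while active (serverless)
PROVISIONED_VCORE_HOURLY = 0.26  # assumed $/vCore-hour, billed continuously

def monthly_compute_cost(active_hours_per_day: float, vcores: float,
                         serverless: bool, days: int = 30) -> float:
    if serverless:
        # Billed only while active; auto-pause means idle hours cost no compute.
        return active_hours_per_day * 3600 * vcores * VCORE_SECOND_RATE * days
    # Provisioned compute is billed 24 hours a day regardless of activity.
    return 24 * vcores * PROVISIONED_VCORE_HOURLY * days

dev = monthly_compute_cost(active_hours_per_day=9, vcores=2, serverless=True)
always_on = monthly_compute_cost(active_hours_per_day=9, vcores=2, serverless=False)
print(f"serverless: ${dev:.2f}  provisioned: ${always_on:.2f}")
```

For a database active nine hours a day, the serverless figure undercuts the always-on one; the gap widens further as active hours shrink, which is exactly the intermittent-dev-workload case in the question.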

Question #4 - Implement a Secure Environment

A financial database contains PII. The DBA needs to ensure that even administrators cannot view certain columns, such as social security numbers.

Which security feature should be configured?

A) Transparent Data Encryption protecting data at rest on disk while allowing plaintext query results
B) Row-Level Security filtering which data rows each user can see based on their identity context
C) Always Encrypted with client-managed column encryption keys preventing server-side decryption
D) Dynamic Data Masking hiding sensitive values from non-privileged users in query result sets

 

Correct answer: C – Explanation:
Always Encrypted encrypts column data with keys managed exclusively by the client application, preventing even DBAs from seeing plaintext values in query results. TDE encrypts data at rest, but all authenticated users see plaintext in queries. Dynamic Data Masking can be bypassed by users with the UNMASK permission, including admin roles. Row-Level Security controls which rows are visible, not which columns are readable. Source: Check Source
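
The trust model behind the correct answer can be illustrated in a few lines: the client holds the key, so the server (and any DBA querying it) only ever stores ciphertext. This toy XOR keystream is a teaching stand-in and emphatically not the real scheme; Always Encrypted uses AES-256 with client-managed column encryption keys.

```python
import hashlib

# Toy illustration of the Always Encrypted *trust model* only: the client
# encrypts with a key the server never sees, so a DBA can SELECT nothing but
# ciphertext. NOT real cryptography -- do not use this cipher anywhere.

def _keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def client_encrypt(plaintext: str, key: bytes) -> bytes:
    data = plaintext.encode()
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

def client_decrypt(ciphertext: bytes, key: bytes) -> str:
    stream = _keystream(key, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream)).decode()

key = b"client-managed-column-encryption-key"   # never leaves the client
stored_on_server = client_encrypt("123-45-6789", key)  # what a DBA can SELECT
assert stored_on_server != b"123-45-6789"              # no plaintext server-side
assert client_decrypt(stored_on_server, key) == "123-45-6789"
```

Contrast this with TDE, where decryption happens transparently on the server, and with masking, which rewrites results for some users but leaves the plaintext readable to anyone holding UNMASK.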

Question #5 - Implement a Secure Environment

A company needs to audit all database access, including who queried which tables and when, with 90-day log retention.

Which feature should be enabled?

A) Extended Events sessions configured for specific diagnostic debugging investigations
B) Database Mail notifications alerting administrators about each query execution event
C) Azure SQL auditing configured to export audit logs to a Log Analytics workspace
D) SQL Server Profiler running trace sessions against the Azure SQL Database instance

 

Correct answer: C – Explanation:
Azure SQL auditing captures all database operations and exports logs to Log Analytics for 90-day retention, querying, and alerting capabilities. SQL Server Profiler is a legacy tool not available for Azure SQL Database PaaS. Database Mail sends notifications but does not capture comprehensive audit trails. Extended Events are for targeted diagnostic sessions, not continuous comprehensive access auditing. Source: Check Source
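
In production these audit records land in a Log Analytics workspace and are queried with KQL; as a purely hypothetical in-memory sketch, the "who queried what, and when, within retention" shape looks like this:

```python
from datetime import datetime, timedelta

# Mock of the who/what/when audit trail the scenario asks for. Record fields
# and values are invented for illustration; real Azure SQL audit logs are
# queried in Log Analytics, not in Python lists.
RETENTION = timedelta(days=90)

audit_log = [
    {"time": datetime(2026, 1, 5), "principal": "app_user", "object": "dbo.Orders"},
    {"time": datetime(2025, 9, 1), "principal": "analyst", "object": "dbo.Customers"},
]

def within_retention(events, now):
    """Keep only events that fall inside the 90-day retention window."""
    return [e for e in events if now - e["time"] <= RETENTION]

recent = within_retention(audit_log, now=datetime(2026, 2, 1))
print([e["principal"] for e in recent])   # only the January access survives
```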

Question #6 - Monitor, Configure, and Optimize Database Resources

A DBA notices query performance degrading during peak hours. They need to identify resource-consuming queries and get tuning recommendations.

Which tools should be used?

A) Azure Service Health displaying platform incidents affecting the Azure SQL Database region
B) Query Performance Insight showing top queries with Azure SQL Database Advisor recommendations
C) Azure Traffic Analytics providing network-level traffic flow analysis between data services
D) Azure Cost Management identifying database pricing tier and reserved capacity recommendations

 

Correct answer: B – Explanation:
Query Performance Insight surfaces the most resource-intensive queries with execution statistics, and Database Advisor provides actionable index and parameterization recommendations. Cost Management tracks spending and pricing optimization. Service Health reports platform-wide outages, not query-level performance. Traffic Analytics monitors network flows, not database query performance patterns. Source: Check Source
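
Conceptually, "top resource consumers" is an aggregate-and-rank over per-execution statistics. This hypothetical sketch does that over mock (query hash, CPU ms) samples; the real view draws on Query Store data inside the database.

```python
from collections import defaultdict

# Mock samples: one tuple per query execution (query_hash, cpu_ms).
samples = [
    ("0xA1", 120), ("0xB2", 30), ("0xA1", 200), ("0xC3", 15), ("0xB2", 40),
]

def top_queries(samples, n=2):
    """Aggregate CPU per query and rank, like QPI's top-consumers view."""
    totals = defaultdict(int)
    for query_hash, cpu_ms in samples:
        totals[query_hash] += cpu_ms
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

print(top_queries(samples))   # 0xA1 dominates with 320 ms total CPU
```

The ranked list tells you *where* to look; Database Advisor then supplies the *what to do* (index and parameterization recommendations) for those queries.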

Question #7 - Monitor, Configure, and Optimize Database Resources

An e-commerce database experiences timeouts during flash sales. The elastic pool's DTUs are maxed out across 10 databases.

Which immediate action should the DBA take?

A) Move all databases to individual provisioned single-database instances immediately
B) Restart all 10 databases in the pool to clear connection pools and cached query plans
C) Delete historical data from the largest tables to free up storage space for new transactions
D) Scale the elastic pool to a higher DTU tier providing more compute and I/O resources

 

Correct answer: D – Explanation:
Scaling the elastic pool DTU tier provides more compute, memory, and I/O resources immediately during peak demand. Deleting data may help long-term storage but does not address compute/DTU exhaustion. Restarting databases causes downtime during an already-stressed period. Moving to individual instances is a major architectural change requiring planning, not an immediate response to a flash sale. Source: Check Source
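
The "is the pool maxed, and what do we scale to?" check behind this answer can be sketched as follows. The eDTU tier ladder and the 95% threshold are illustrative assumptions, not the official pool sizes.

```python
# Hedged sketch of an elastic pool scale-up decision. Tiers and threshold
# are placeholders for illustration, not Azure's actual eDTU tiers.
EDTU_TIERS = [50, 100, 200, 400, 800]

def next_tier(current_edtu, utilization_samples, threshold=0.95):
    """Recommend the next eDTU tier when sustained utilization is near 100%."""
    sustained = sum(utilization_samples) / len(utilization_samples)
    if sustained < threshold:
        return current_edtu                       # headroom remains; stay put
    higher = [t for t in EDTU_TIERS if t > current_edtu]
    return higher[0] if higher else current_edtu  # already at the top tier

print(next_tier(100, [0.99, 1.0, 0.97]))   # flash-sale saturation: go to 200
```

Because pool scaling is an online operation applied to all member databases at once, it is the one lever that relieves pressure immediately without downtime or re-architecture.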

Question #8 - Configure and Manage Automation of Tasks

A DBA needs to rebuild fragmented indexes across 20 Azure SQL databases every Sunday at 2 AM without manual intervention.

Which automation approach should be used?

A) Azure Automation with a scheduled runbook executing index maintenance across all databases
B) Azure Logic Apps with an HTTP trigger invoking a stored procedure on each database
C) Manual T-SQL script execution by the on-call DBA logging in every Sunday at 2 AM
D) SQL Agent jobs configured individually on each of the 20 Azure SQL Database instances

 

Correct answer: A – Explanation:
Azure Automation runbooks execute T-SQL maintenance scripts on configurable schedules across multiple databases without human intervention. Manual execution requires a DBA available every Sunday. SQL Agent is not available in Azure SQL Database (only in Managed Instance). Logic Apps with HTTP triggers can work but Azure Automation is purpose-built for scheduled cross-database administrative tasks. Source: Check Source
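
The core of such a runbook is deciding, per index, whether to rebuild, reorganize, or skip. This sketch only builds the T-SQL statement from a fragmentation figure, using the common 30%/5% rule of thumb from index maintenance guidance; the connection loop over the 20 databases (e.g. via pyodbc) is deliberately omitted.

```python
# Statement selection for a scheduled index-maintenance runbook (sketch).
# Thresholds follow the widely cited 30% rebuild / 5% reorganize heuristic.
def maintenance_statement(index, table, fragmentation_pct):
    if fragmentation_pct > 30:
        return f"ALTER INDEX {index} ON {table} REBUILD"
    if fragmentation_pct > 5:
        return f"ALTER INDEX {index} ON {table} REORGANIZE"
    return None   # below threshold: leave the index alone

for frag in (42.0, 12.5, 2.0):
    print(maintenance_statement("IX_Orders_Date", "dbo.Orders", frag))
```

A real runbook would wrap this in a loop over a database list, run on the Automation schedule (Sundays at 2 AM here), and log results, so no DBA has to be awake for it.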

Question #9 - Configure and Manage Automation of Tasks

A company needs elastic jobs to run data cleanup procedures across 50 databases in an elastic pool on a daily schedule.

Which feature should be configured?

A) Azure SQL Elastic Jobs executing T-SQL across multiple databases from a single job agent
B) Azure Data Factory with Copy Activity moving data between databases for cleanup processing
C) Individual stored procedure scheduling configured separately inside each of the 50 databases
D) Power Automate with the SQL connector invoking a cleanup query on one database at a time

 

Correct answer: A – Explanation:
Elastic Jobs execute T-SQL scripts across multiple databases on a schedule from a centralized job agent with monitoring and retry capabilities. Individual scheduling inside 50 databases is unmanageable and inconsistent. Data Factory Copy Activity moves data between locations rather than running in-database cleanup logic. The Power Automate SQL connector processes one database per action, lacking efficient multi-database batch execution. Source: Check Source
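
What the job agent buys you is fan-out plus per-target retry and status tracking. This hypothetical simulation runs one "cleanup step" against 50 named targets and retries a transient failure, which is the behavior the centralized agent provides out of the box.

```python
# Simulation of job-agent semantics: fan-out, retry, per-target status.
# Target names and the flaky-connection failure are invented for illustration.
def run_job(targets, step, retries=2):
    status = {}
    for db in targets:
        for attempt in range(retries + 1):
            try:
                step(db)
                status[db] = "Succeeded"
                break
            except ConnectionError:
                status[db] = "Failed"   # overwritten if a later retry succeeds
    return status

flaky_calls = {"db_07": 0}
def cleanup_step(db):
    # db_07 fails once with a transient error; the retry then succeeds.
    if db == "db_07" and flaky_calls["db_07"] == 0:
        flaky_calls["db_07"] += 1
        raise ConnectionError("transient")

result = run_job([f"db_{i:02d}" for i in range(50)], cleanup_step)
print(sum(s == "Succeeded" for s in result.values()))   # all 50 targets clean
```

Scheduling 50 independent copies of this logic, one per database, is exactly the unmanageable alternative the explanation rules out.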

Question #10 - Plan and Implement a High Availability and Disaster Recovery Environment

A banking app requires an RPO under 5 seconds and automatic failover to a secondary region for Azure SQL Database.

Which configuration provides this?

A) Log shipping configured to an Azure VM running SQL Server in the secondary Azure region
B) Locally redundant backup with monthly restore testing exercises in a test environment
C) Geo-redundant backup with manual point-in-time restore to the paired region on demand
D) Active geo-replication with an auto-failover group providing automatic regional failover

 

Correct answer: D – Explanation:
Auto-failover groups with active geo-replication provide continuous asynchronous replication with RPO under 5 seconds and automatic DNS-based failover on regional outage. Geo-redundant backup has RPO measured in hours based on the backup frequency. Locally redundant backup does not protect against regional failure scenarios. Log shipping is not available for Azure SQL Database PaaS and requires SQL Server on VMs. Source: Check Source
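
RPO under asynchronous replication is bounded by replication lag: whatever has not yet reached the secondary at failover time is lost. This sketch checks mock lag samples against the 5-second target; the sample values are invented for illustration.

```python
# RPO compliance check over mock replication-lag samples (seconds).
RPO_TARGET_SECONDS = 5.0

def meets_rpo(lag_samples_seconds, target=RPO_TARGET_SECONDS):
    """Worst observed lag must stay under the target for the RPO claim to hold."""
    return max(lag_samples_seconds) < target

print(meets_rpo([0.8, 1.2, 3.9]))   # async replication keeping up
print(meets_rpo([0.8, 7.5]))        # a lag spike breaches the target
```

Compare this second-scale bound with backup-based recovery, where RPO is measured in hours between backups, and the choice of geo-replication with an auto-failover group follows directly.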

Get 786+ more questions with source-linked explanations

Every answer traces to the exact Microsoft documentation page — so you learn from the source, not just memorize answers.

Exam mode & learn mode · Score by objective · Updated 16-Apr-26

Learn more...

What the DP-300 Azure Database Administrator Associate exam measures

  • Plan and Implement Data Platform Resources (20–25%) — Deploy and configure Azure SQL Database, SQL Managed Instance, and SQL Server on Azure VMs; plan migrations; and evaluate service tiers and sizing for a given workload.
  • Implement a Secure Environment (15–20%) — Configure authentication and authorization, protect data with features such as TDE, Always Encrypted, and Dynamic Data Masking, and implement auditing.
  • Monitor, Configure, and Optimize Database Resources (20–25%) — Monitor activity and performance, tune indexes and queries, and right-size resources using tools such as Query Store and Query Performance Insight.
  • Configure and Manage Automation of Tasks (15–20%) — Automate deployment and routine maintenance with scheduled jobs, Elastic Jobs, Azure Automation, and alert-driven actions.
  • Plan and Implement a High Availability and Disaster Recovery Environment (20–25%) — Design and test backup and restore strategies and HADR solutions, including active geo-replication, auto-failover groups, and availability groups.

  • Review the official exam guide to understand every objective and domain weight before you begin studying
  • Complete the relevant Microsoft Learn learning path to build a structured foundation across all exam topics
  • Get hands-on practice in an Azure free-tier sandbox or trial environment to reinforce what you have studied with real configurations
  • Apply your knowledge through real-world project experience — whether at work, in volunteer roles, or contributing to open-source initiatives
  • Master one objective at a time, starting with the highest-weighted domain to maximize your score potential early
  • Use PowerKram learn mode to study by individual objective and review detailed explanations for every question
  • Switch to PowerKram exam mode to simulate the real test experience with randomized questions and timed conditions

Earning this certification can open doors to in-demand database administration and data platform roles.

Microsoft provides comprehensive free training to prepare for the DP-300 Azure Database Administrator Associate exam. Start with the official Microsoft Learn learning path for structured, self-paced modules covering every exam domain. Review the exam study guide for the complete skills outline and recent updates.
