IBM C9006700 IBM Certified Db2 13 for z/OS Database Administrator – Professional


Mastering IBM C9006700 db2 v13 admin professional: What you need to know

PowerKram plus IBM C9006700 db2 v13 admin professional practice exam - Last updated: 3/18/2026

✅ 24-Hour full access trial available for IBM C9006700 db2 v13 admin professional

✅ Included FREE with each practice exam data file – no need to make additional purchases

Exam mode simulates the day-of-exam experience

Learn mode gives you immediate feedback and sources for reinforced learning

✅ All content is built on the vendor-approved exam objectives

✅ No download or additional software required

✅ New and updated exam content is added regularly and is immediately available to all users during the access period

FREE PowerKram Exam Engine | Study by Vendor Objective

About the IBM C9006700 db2 v13 admin professional certification

The IBM C9006700 db2 v13 admin professional certification validates your ability to administer IBM Db2 13 databases on z/OS at a professional level, leveraging new Db2 13 capabilities within modern IBM enterprise environments. It covers advanced skills in database design, performance optimization, utility management, recovery strategies, security hardening, and system parameter tuning for enterprise Db2 13 environments. This credential demonstrates proficiency in applying IBM-approved methodologies, platform capabilities, and enterprise-grade frameworks across real business, automation, integration, and data-governance scenarios. Certified professionals are expected to understand Db2 13 z/OS advanced administration and new feature implementation, and to deliver solutions that align with IBM standards for scalability, security, performance, and automation.

How the IBM C9006700 db2 v13 admin professional fits into the IBM learning journey

IBM certifications are structured around role‑based learning paths that map directly to real project responsibilities. The C9006700 db2 v13 admin professional exam sits within the IBM Mainframe and Data Specialty path and focuses on validating your readiness to work with:

  • Db2 13 z/OS advanced administration and new feature usage
  • Performance optimization and utility management
  • Security hardening, recovery strategies, and system tuning

This ensures candidates can contribute effectively across IBM workloads, including IBM z/OS, IBM Cloud, IBM Cloud Pak for Data, Watson AI, Red Hat OpenShift, IBM Security, IBM Automation, and other IBM platform capabilities, depending on the exam's domain.

What the C9006700 db2 v13 admin professional exam measures

The exam evaluates your ability to:

  • Administer Db2 13 databases leveraging new features and enhancements
  • Optimize database performance through advanced tuning techniques
  • Manage utilities for reorganization, backup, and recovery
  • Implement advanced security and authorization controls
  • Configure system parameters and buffer pool optimization
  • Troubleshoot complex Db2 operational issues

These objectives reflect IBM’s emphasis on secure data practices, scalable architecture, optimized automation, robust integration patterns, governance through access controls and policies, and adherence to IBM‑approved development and operational methodologies.

Why the IBM C9006700 db2 v13 admin professional matters for your career

Earning the IBM C9006700 db2 v13 admin professional certification signals that you can:

  • Work confidently within IBM hybrid‑cloud and multi‑cloud environments
  • Apply IBM best practices to real enterprise, automation, and integration scenarios
  • Design and implement scalable, secure, and maintainable solutions
  • Troubleshoot issues using IBM’s diagnostic, logging, and monitoring tools
  • Contribute to high‑performance architectures across cloud, on‑premises, and hybrid components

Professionals with this certification often move into roles such as Senior Db2 DBA, Mainframe Database Architect, and Data Management Lead.

How to prepare for the IBM C9006700 db2 v13 admin professional exam

Successful candidates typically:

  • Build practical skills using IBM Db2 13 for z/OS, Db2 Performance Expert, IBM Data Studio, Db2 Utilities Suite, IBM OMEGAMON for Db2
  • Follow the official IBM Training Learning Path
  • Review IBM documentation, IBM SkillsBuild modules, and product guides
  • Practice applying concepts in IBM Cloud accounts, lab environments, and hands‑on scenarios
  • Use objective‑based practice exams to reinforce learning

Similar certifications across vendors

Professionals preparing for the IBM C9006700 db2 v13 admin professional exam often explore related certifications across other major platforms:

Other popular IBM certifications

These IBM certifications may complement your expertise:

Official resources and career insights

Try the 24-Hour FREE trial today! No Credit Card Required

The 24-hour trial includes full access to all exam questions for the IBM C9006700 db2 v13 admin professional and the full-featured exam engine.

🏆 Built by Experienced IBM Experts
📘 Aligned to the C9006700 db2 v13 admin professional Blueprint
🔄 Updated Regularly to Match Live Exam Objectives
📊 Adaptive Exam Engine with Objective-Level Study & Feedback
✅ 24-Hour Free Access—No Credit Card Required

PowerKram offers more...

Get full access to C9006700 db2 v13 admin professional, the full-featured exam engine, and FREE access to hundreds more questions.

Test your knowledge of IBM C9006700 db2 v13 admin professional exam content

A senior DBA is administering a Db2 13 subsystem supporting a high-volume OLTP application processing 15,000 transactions per second. Buffer pool hit ratios have dropped below 90% for the most active tablespace.

How should bufferpool performance be optimized?

A) Increase all bufferpool sizes equally
B) Analyze the specific bufferpool’s getpage and read I/O statistics using Db2 Performance Expert or RMF, identify whether the low hit ratio is caused by insufficient bufferpool size for the hot data or by sequential scan workloads polluting the pool, consider creating a dedicated bufferpool for the hot tablespace with appropriate sizing, and implement PGSTEAL(LRU) algorithm tuning if needed
C) Reduce the bufferpool size to free memory for other subsystems
D) Disable bufferpool caching to force direct I/O

 

Correct answer: B – Explanation:
Targeted bufferpool analysis identifies whether sizing, workload patterns, or pool assignment is the cause. Equal increase for all (A) wastes memory on pools that do not need it. Reduction (C) worsens the problem. Disabling caching (D) dramatically degrades performance.
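As a rough sketch of what the recommended analysis and tuning might look like (pool names, the tablespace name, and the size below are hypothetical):

```sql
-- Inspect getpage and read I/O counters for the suspect pool
-DISPLAY BUFFERPOOL(BP2) DETAIL

-- Size a dedicated pool for the hot tablespace and tune page stealing
-- (VPSIZE is in buffers; the value here is illustrative)
-ALTER BUFFERPOOL(BP8) VPSIZE(200000) PGSTEAL(LRU)

-- Move the hot tablespace into the dedicated pool
ALTER TABLESPACE DBPROD.TSHOT BUFFERPOOL BP8;
```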

The DBA must implement Db2 13’s continuous delivery capability to adopt new features without a full version migration.

How does Db2 13 continuous delivery work?

A) New features require a full Db2 version upgrade each time
B) Db2 13 introduces function levels that can be activated incrementally: the DBA applies maintenance PTFs and activates new function levels without a full migration, allowing gradual adoption of new capabilities while maintaining application compatibility through a controlled sequence of function levels (for example, V13R1M100 through V13R1M500 and higher)
C) New features are automatically activated without DBA intervention
D) Continuous delivery means Db2 updates itself daily from the internet

 

Correct answer: B – Explanation:
Function level activation provides controlled feature adoption without full migration. Full version upgrade (A) is the old model. Auto-activation (C) could break applications. Internet updates (D) is not how z/OS maintenance works.

A complex query joining five tables has degraded from 3 seconds to 45 seconds after a rebind. The DBA suspects an access path regression.

How should the regression be diagnosed and resolved?

A) Accept the new access path since the optimizer chose it
B) Use EXPLAIN to compare the current access path against the previous one using PLAN_TABLE data, check if RUNSTATS was run with adequate sampling for the involved tables, use Db2 13’s statement-level PLANMGMT capability to fall back to the previous access path while investigating, and analyze the access path change to determine if statistics updates or data distribution changes caused the regression
C) Rebind again and hope for a better access path
D) Add indexes on every column in the five tables

 

Correct answer: B – Explanation:
EXPLAIN comparison with PLANMGMT fallback provides diagnostic capability with immediate relief. Accepting regression (A) impacts the application. Rebinding again (C) may reproduce the same bad plan. Indexes everywhere (D) wastes storage and slows updates.
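A sketch of the diagnostic side, assuming PLAN_TABLE already holds the old access path under QUERYNO 100 (collection, package, and query numbers are hypothetical):

```sql
-- Capture the current (regressed) access path
EXPLAIN PLAN SET QUERYNO = 101 FOR
  SELECT /* the degraded five-table join goes here */ 1
    FROM SYSIBM.SYSDUMMY1;

-- Compare old vs. new join methods, access types, and matching columns
SELECT QUERYNO, QBLOCKNO, PLANNO, METHOD, ACCESSTYPE,
       ACCESSNAME, MATCHCOLS, PREFETCH
  FROM PLAN_TABLE
 WHERE QUERYNO IN (100, 101)
 ORDER BY QUERYNO, QBLOCKNO, PLANNO;

-- Plan management: fall back to the previously saved access path
REBIND PACKAGE(COLL1.PKGJOIN) SWITCH(PREVIOUS)
```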

The DBA needs to implement row-level security so that branch managers can only query customer records for their own branch.

How should row-level security be implemented in Db2 13?

A) Create separate tables for each branch
B) Implement Db2’s Row and Column Access Control (RCAC) by creating a row permission that filters rows based on a comparison between the table’s branch_id column and a session variable set to the user’s assigned branch at connection time, activate the row access control on the table, and verify that branch managers see only their branch’s data while administrators see all data
C) Use views with WHERE clauses per branch
D) Rely on the application to filter data by branch

 

Correct answer: B – Explanation:
RCAC enforces row-level security transparently at the database engine level. Separate tables (A) create massive data duplication. Views per branch (C) are harder to maintain and can be circumvented. Application filtering (D) is bypassable with direct SQL access.
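A minimal sketch of the RCAC setup option B describes (all object names, the global variable, and the security group are hypothetical):

```sql
-- Session-scoped global variable set by the application at connect time
CREATE VARIABLE APPSCH.USER_BRANCH CHAR(4) DEFAULT NULL;

-- Row permission: administrators see all rows, managers only their branch
CREATE PERMISSION APPSCH.BRANCH_ROWS ON APPSCH.CUSTOMER
   FOR ROWS WHERE VERIFY_GROUP_FOR_USER(SESSION_USER, 'DBADMIN') = 1
               OR BRANCH_ID = APPSCH.USER_BRANCH
   ENFORCED FOR ALL ACCESS
   ENABLE;

-- The permission has no effect until row access control is activated
ALTER TABLE APPSCH.CUSTOMER ACTIVATE ROW ACCESS CONTROL;
```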

The DBA must plan a disaster recovery strategy for the Db2 subsystem with an RPO of 15 minutes and RTO of 2 hours.

What DR strategy meets these requirements?

A) Take daily full image copies and ship them to the DR site
B) Configure Db2’s active log shipping or GDPS-based log replication to the DR site at intervals meeting the 15-minute RPO, maintain full and incremental image copies at the DR site, prepare the DR Db2 subsystem with an automated recovery procedure that applies logs forward from the latest image copy, and test the recovery quarterly to validate the 2-hour RTO
C) Rely on the storage subsystem’s async replication as the sole DR mechanism
D) Document the recovery procedure without testing it

 

Correct answer: B – Explanation:
Log replication with image copies and tested recovery procedures meets RPO/RTO. Daily copies (A) give a 24-hour RPO. Storage replication alone (C) is not Db2-aware and may produce inconsistent recovery. Untested procedures (D) provide no RTO confidence.
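The forward-recovery step at the DR site might look like this (object name and log point are illustrative; in practice the log point comes from the DR site's log inventory and restart planning):

```sql
-- Restore from the latest image copy registered in SYSCOPY,
-- then apply shipped log records forward to a consistent point
RECOVER TABLESPACE DBPROD.TSCUST
        TOLOGPOINT X'00D5007B2A8C0000'
```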

Db2 13 performance monitoring shows that lock escalation events are occurring frequently for a batch process, impacting concurrent OLTP transactions.

How should lock escalation be addressed?

A) Increase the LOCKMAX parameter to the maximum to prevent escalation
B) Analyze the batch process to determine why it acquires so many page or row locks—typically due to large UPDATE or DELETE operations without frequent COMMITs. Implement COMMIT points within the batch process at regular intervals (every 5,000-10,000 rows), consider using LOCKSIZE ROW for finer-grained locking if using page locks, and review the batch SQL for potential optimization
C) Disable locking entirely for the batch process
D) Schedule the batch to run only when no OLTP transactions are active

 

Correct answer: B – Explanation:
Frequent COMMITs and lock granularity optimization address the root cause. Maximum LOCKMAX (A) may delay but still escalate. Disabling locking (C) compromises data integrity. OLTP-free windows (D) may not exist for 24/7 systems.
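On the Db2 side, the lock-granularity and escalation-threshold pieces of option B can be sketched as follows (tablespace name and threshold are hypothetical):

```sql
-- Row-level locks instead of page locks for the batch's tablespace
ALTER TABLESPACE DBPROD.TSBATCH LOCKSIZE ROW;

-- LOCKMAX sets the per-tablespace escalation threshold;
-- 0 would disable escalation entirely, which is rarely advisable
ALTER TABLESPACE DBPROD.TSBATCH LOCKMAX 10000;
```

The COMMIT interval itself belongs in the batch application; these statements only address lock granularity and the escalation threshold.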

The DBA needs to implement audit logging to track all data changes on a regulatory-compliance table.

How should audit be configured in Db2 13?

A) Rely on the application to log its own changes
B) Define a Db2 audit policy for the target table to capture INSERT, UPDATE, and DELETE operations with the authorization ID and timestamp (on Db2 for z/OS, audit policies are defined in the SYSIBM.SYSAUDITPOLICIES catalog table and started with the -START TRACE (AUDIT) command), direct audit records to SMF for centralized collection, and configure audit data retention per compliance requirements
C) Enable trace for all Db2 activity
D) Query SYSIBM.SYSLGRNX to identify changed data

 

Correct answer: B – Explanation:
Db2 AUDIT provides targeted, table-level change tracking. Application logging (A) can be bypassed. Global trace (C) generates massive overhead. SYSLGRNX (D) tracks log ranges, not individual data changes.
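On Db2 for z/OS, a table-level audit policy is typically defined in the catalog and started as an audit trace; a sketch under that assumption (policy and object names are hypothetical, and the exact SYSAUDITPOLICIES column values should be verified against the catalog documentation):

```sql
-- Define a policy auditing data changes on the compliance table
INSERT INTO SYSIBM.SYSAUDITPOLICIES
       (AUDITPOLICYNAME, OBJECTSCHEMA, OBJECTNAME, OBJECTTYPE, EXECUTE)
VALUES ('COMPL_TAB', 'APPSCH', 'REG_TABLE', 'T', 'C');

-- Start the audit trace for the policy, writing records to SMF
-START TRACE(AUDIT) DEST(SMF) AUDTPLCY(COMPL_TAB)
```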

A new application team requests a database with 20 tables. The DBA must design the database topology and storage hierarchy.

How should the database be organized?

A) Create all 20 tables in the default database
B) Design the database with logically grouped tablespaces: separate tablespaces for high-activity tables, medium-activity tables, and reference/lookup tables, assign appropriate bufferpools by activity level, use universal tablespaces for optimal space management, configure SMS storage groups for physical volume management, and document the design rationale
C) Create 20 separate databases, one per table
D) Put all 20 tables in a single tablespace

 

Correct answer: B – Explanation:
Logically grouped universal tablespaces with activity-based bufferpool assignment balance manageability, performance, and recoverability. The default database (A) mixes unrelated objects and complicates administration. One database per table (C) multiplies administrative overhead. A single tablespace (D) ties all 20 tables to the same maintenance, locking, and recovery scope.
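A fragment of what the activity-tiered design might look like (database, storage group, and pool names are hypothetical):

```sql
-- High-activity tier: partition-by-growth universal tablespace
CREATE TABLESPACE TSHOT IN DBAPP1
       USING STOGROUP SGAPP1
       MAXPARTITIONS 64
       SEGSIZE 64
       BUFFERPOOL BP2;

-- Reference/lookup tier in a quieter pool
CREATE TABLESPACE TSREF IN DBAPP1
       USING STOGROUP SGAPP1
       MAXPARTITIONS 1
       BUFFERPOOL BP1;
```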

The Db2 subsystem must be upgraded to enable a new function level. Critical applications must continue running during the activation.

How should the function level activation be performed?

A) Activate the new function level during peak hours
B) Plan the activation during a low-activity window, review the function level documentation for any new behavior changes, take a full backup before activation, activate the function level using the -ACTIVATE command, monitor the system for any anomalies during and after activation, and verify application functionality with regression tests before declaring the activation complete
C) Deactivate the old function level before activating the new one
D) Let function levels activate automatically without planning

 

Correct answer: B – Explanation:
Planned activation with backup, monitoring, and validation ensures safe function level adoption. Peak hours (A) risks impact on critical workloads. Deactivation is not required (C). Auto-activation does not happen (D).
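The activation step itself is a pair of operator commands (the function level shown is illustrative):

```sql
-- Confirm the current and highest activatable function levels
-DISPLAY GROUP DETAIL

-- Activate the target function level
-ACTIVATE FUNCTION LEVEL (V13R1M501)
```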

The DBA discovers that index rebuild times are excessive for a 500 GB partitioned table during maintenance windows.

How can index maintenance be optimized?

A) Rebuild all index partitions every time
B) Use Db2 13’s online REORG INDEX with SHRLEVEL CHANGE to reorganize indexes while the table remains accessible, target only the index partitions that are fragmented rather than rebuilding all partitions, schedule index maintenance during low-activity windows for minimal performance impact, and monitor index health using RUNSTATS data to rebuild only when fragmentation exceeds thresholds
C) Drop indexes before maintenance and recreate afterward
D) Disable indexes permanently to eliminate maintenance

 

Correct answer: B – Explanation:
Targeted online index REORG with SHRLEVEL CHANGE minimizes disruption. Full rebuild every time (A) wastes time on healthy partitions. Drop/recreate (C) causes temporary query degradation. No indexes (D) severely impacts query performance.
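A utility control statement along these lines (index name and partition number are hypothetical):

```sql
-- Online reorg of a single fragmented index partition;
-- the index stays available to readers and writers
REORG INDEX APPSCH.IXCUST
      PART 7
      SHRLEVEL CHANGE
```

Fragmentation can be judged from RUNSTATS-populated catalog statistics such as LEAFFAR in SYSIBM.SYSINDEXPART, so only partitions over threshold are reorganized.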

Get 1,000+ more questions + FREE Powerful Exam Engine!

Sign up today to get hundreds more FREE high-quality proprietary questions and FREE exam engine for C9006700 db2 v13 admin professional. No credit card required.

Sign up