SALESFORCE CERTIFICATION

Certified Data Cloud Consultant Practice Exam

Exam Number: 3720 | Last updated 14-Apr-26 | 2316+ questions across 5 vendor-aligned objectives

The Certified Data Cloud Consultant exam evaluates your ability to implement Salesforce Data Cloud — the platform’s customer data platform (CDP) — to unify customer profiles from multiple sources, build identity resolution models, and activate data across marketing, sales, and service channels. It sits at the intersection of data architecture and marketing technology.

The Data Ingestion and Modeling domain weighs in at 25%, covering data streams, connectors, data model objects, and mapping. With 20% of the exam, Identity Resolution demands serious preparation, covering match rules, reconciliation, and unified profile creation. Questions on segmentation and activation make up 20% of the test, covering segment builder, calculated insights, and activation targets. Combined, these sections account for the lion’s share of the exam and reflect the skills employers value most.

Beyond the core areas, the exam also evaluates complementary skills. Expect about 20% of exam content to cover Implementation and Troubleshooting, which spans monitoring, error handling, and performance optimization. Data Governance and Privacy commands 15% of the blueprint, which spans consent management, data sharing, and compliance. Do not overlook these sections — the exam regularly weaves them into multi-concept scenarios.

Identity resolution configuration is the most technically dense topic — practice creating match rules with different combinations of deterministic and fuzzy matching. Know the difference between data streams, data lake objects, and data model objects, since the exam uses precise terminology throughout.

Every answer links to the source. Each explanation below includes a hyperlink to the exact Salesforce documentation page the question was derived from. PowerKram is the only practice platform with source-verified explanations. Learn about our methodology →

785 practice exam users · 98.1% satisfied users · 95.2% passed the exam · 4.8/5 quality rating

Test your Certified Data Cloud Consultant knowledge

10 of 2316+ questions

Question #1 - Architect and maintain data streams, connectors, and data model objects to ensure clean, scalable data structures that power accurate reporting and integrations

A retail company wants to bring together customer data from their Salesforce CRM, e-commerce platform, email marketing tool, and mobile app into Data Cloud. Each system uses different customer identifiers.

What is the first step the consultant should take?

A) Import all data into a single Salesforce custom object
B) Wait until all source systems use the same customer identifier before starting
C) Configure data streams for each source system, map the data to the Data Cloud data model, and define the data ingestion schedule
D) Create a custom ETL pipeline outside of Data Cloud

 

Correct answer: C – Explanation:
Data streams are the ingestion mechanism for bringing external data into Data Cloud. Each source system is connected via a data stream, and data is mapped to the standard or custom Data Model Objects (DMOs). Ingestion schedules control refresh frequency. Custom objects cannot handle the volume or cross-system scope. Waiting for unified identifiers delays value — identity resolution handles different identifiers. External ETL bypasses Data Cloud’s native capabilities. Source: Trailhead: Data Cloud Overview
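To make the first step concrete, here is a small illustrative sketch in plain Python (this is a study aid, not the actual Data Cloud setup API; the DMO names and schedules are plausible assumptions, not a verified configuration):

```python
# One data stream per source system, each mapped to a Data Model Object
# with its own ingestion schedule. All values below are illustrative.
data_streams = [
    {"source": "Salesforce CRM",       "dmo": "Individual",       "schedule": "hourly"},
    {"source": "E-commerce platform",  "dmo": "Sales Order",      "schedule": "hourly"},
    {"source": "Email marketing tool", "dmo": "Email Engagement", "schedule": "daily"},
    {"source": "Mobile app",           "dmo": "Engagement Event", "schedule": "streaming"},
]

# Sanity check: every source system is covered before identity
# resolution ever runs, which is the point of answer C.
assert len({s["source"] for s in data_streams}) == 4
print("streams configured:", len(data_streams))
```

The takeaway matches the explanation: sources are connected individually, mapping happens per stream, and the refresh cadence is part of each stream's definition.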

A consultant has ingested customer data from five sources into Data Cloud. Customers exist in multiple systems with different IDs — email in the CRM, loyalty number in the POS system, and device ID in the mobile app.

How should the consultant configure identity resolution?

A) Choose a single identifier and delete records from systems that do not use it
B) Create a custom Apex class that matches records based on a single field
C) Manually merge duplicate records one by one
D) Configure match rules that use deterministic matching (email, phone, loyalty ID) and probabilistic matching (name and address similarity) to create unified individual profiles

 

Correct answer: D – Explanation:
Data Cloud identity resolution uses configurable match rules with both deterministic (exact field matching) and probabilistic (fuzzy matching) strategies. Multiple rules can be layered to progressively link records across systems into unified profiles. Manual merging does not scale. Deleting non-matching records loses data. Custom Apex cannot operate on Data Cloud objects directly. Source: Trailhead: Data Cloud Identity Resolution
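The rule layering described above can be sketched in plain Python (this is a conceptual illustration, not Data Cloud's matching engine; the records, fields, and 0.8 threshold are hypothetical):

```python
from difflib import SequenceMatcher

# Hypothetical records for the same person from two source systems.
crm = {"email": "ana@example.com", "name": "Ana Torres"}
pos = {"email": "ana@example.com", "name": "Anna Torres"}

def deterministic_match(a, b, field="email"):
    """Exact-value match on a single identifier field."""
    return a.get(field) is not None and a.get(field) == b.get(field)

def fuzzy_match(a, b, field="name", threshold=0.8):
    """Probabilistic match: string similarity above a confidence threshold."""
    score = SequenceMatcher(None, a[field].lower(), b[field].lower()).ratio()
    return score >= threshold

def resolve(a, b):
    """Layer the rules: try deterministic first, then fall back to fuzzy."""
    if deterministic_match(a, b):
        return "matched (deterministic)"
    if fuzzy_match(a, b):
        return "matched (probabilistic)"
    return "no match"

print(resolve(crm, pos))  # matched (deterministic)
```

The layering mirrors the exam concept: exact identifiers (email, phone, loyalty ID) link records first, and fuzzy similarity catches records that exact rules miss.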

A marketing team wants to create a customer segment in Data Cloud that includes all customers who purchased in the last 90 days, opened at least three emails in the last 30 days, and have a predicted high lifetime value.

What Data Cloud features should the consultant use?

A) A custom SOQL query against the Data Cloud data model
B) A Salesforce report filtered by these three criteria
C) An Excel spreadsheet exported from each source system and merged manually
D) Segment Builder with filters combining purchase behavior, email engagement, and calculated insight for predicted lifetime value

 

Correct answer: D – Explanation:
Data Cloud Segment Builder provides a visual interface for creating segments using behavioral data (purchases, email opens), calculated insights (predicted lifetime value from Einstein), and time-based filters. Segments update dynamically as data changes. Salesforce reports cannot query Data Cloud data model objects. SOQL does not directly query Data Cloud segments. Manual Excel merging is not dynamic or scalable. Source: Trailhead: Data Cloud Segmentation
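The AND logic of the three filters can be sketched in plain Python (a study illustration only; Segment Builder is a visual tool, and the profile field names here are invented for the example):

```python
from datetime import date

TODAY = date(2026, 4, 14)  # fixed "today" so the example is reproducible

# Hypothetical unified-profile rows; field names are illustrative only.
profiles = [
    {"id": 1, "last_purchase": date(2026, 3, 20), "opens_30d": 5, "predicted_ltv": "high"},
    {"id": 2, "last_purchase": date(2025, 11, 1), "opens_30d": 9, "predicted_ltv": "high"},
    {"id": 3, "last_purchase": date(2026, 4, 1),  "opens_30d": 2, "predicted_ltv": "low"},
]

def in_segment(p):
    """All three filters must hold, mirroring AND logic in a segment."""
    return (
        (TODAY - p["last_purchase"]).days <= 90   # purchased in last 90 days
        and p["opens_30d"] >= 3                   # >= 3 email opens in last 30 days
        and p["predicted_ltv"] == "high"          # calculated insight: high LTV
    )

segment = [p["id"] for p in profiles if in_segment(p)]
print(segment)  # [1]
```

Only profile 1 passes all three filters; profile 2 fails the 90-day purchase window and profile 3 fails both the engagement and LTV criteria.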

A consultant has created customer segments in Data Cloud. The marketing team wants to activate these segments by pushing them to Marketing Cloud for email campaigns and to Google Ads for retargeting.

How should the consultant configure this?

A) Share the Data Cloud login credentials with the marketing team so they can access segments directly
B) Build a custom API integration between Data Cloud and each advertising platform
C) Configure activation targets for Marketing Cloud and Google Ads in Data Cloud, mapping segment attributes to each platform’s required format
D) Export segment lists as CSV files and upload them to each platform manually

 

Correct answer: C – Explanation:
Data Cloud activation targets are pre-built connectors that push segments to downstream platforms like Marketing Cloud and advertising platforms. Each target maps Data Cloud attributes to the platform’s required format and handles automatic synchronization. Manual CSV exports are labor-intensive and stale. Custom API integrations duplicate built-in functionality. Credential sharing violates security practices. Source: Trailhead: Data Cloud Segmentation
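The attribute mapping an activation target performs can be sketched in plain Python (illustrative only; real targets are configured in Data Cloud, and the schema field names below are hypothetical, e.g. the Google Ads field is shown as a rename without the hashing a real upload would apply):

```python
# A hypothetical segment member with attributes from the unified profile.
segment_member = {"email": "ana@example.com", "first_name": "Ana",
                  "mobile_ad_id": "abc-123"}

# Per-target schemas: which attributes each platform accepts and what
# it calls them. All names here are invented for the example.
TARGET_SCHEMAS = {
    "Marketing Cloud": {"email": "EmailAddress", "first_name": "FirstName"},
    "Google Ads":      {"email": "hashed_email", "mobile_ad_id": "mobile_device_id"},
}

def to_target(member, target):
    """Rename segment attributes into the target platform's format."""
    schema = TARGET_SCHEMAS[target]
    return {schema[k]: v for k, v in member.items() if k in schema}

print(to_target(segment_member, "Google Ads"))
# {'hashed_email': 'ana@example.com', 'mobile_device_id': 'abc-123'}
```

The same segment feeds both targets; only the attribute mapping differs, which is exactly what makes pre-built activation targets preferable to one-off CSV exports.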

A company wants to use Data Cloud to create a calculated insight that predicts customer churn probability based on purchase frequency decline, support case volume increase, and engagement score drops.

What should the consultant configure?

A) An Einstein Analytics dashboard that displays churn indicators
B) A formula field on the Contact object in Sales Cloud
C) A batch Apex job that runs daily to calculate churn scores
D) A calculated insight in Data Cloud that combines behavioral metrics into a churn probability score using data from unified profiles

 

Correct answer: D – Explanation:
Data Cloud calculated insights derive new metrics from unified profile data, combining cross-system behavioral signals into actionable scores. The churn probability score leverages purchase, support, and engagement data from the unified profile. Formula fields cannot access Data Cloud data. Einstein Analytics dashboards display data but do not create calculated metrics within Data Cloud. Batch Apex operates on Salesforce objects, not Data Cloud. Source: Trailhead: Data Cloud Overview
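A minimal sketch of the blending idea, in plain Python (the metrics, weights, and formula are invented to show the concept; a real calculated insight is defined inside Data Cloud, not as a script):

```python
# Hypothetical behavioral metrics from a unified profile, each already
# normalized to a 0..1 scale. Names and weights are illustrative.
profile = {
    "purchase_freq_decline": 0.6,   # drop in purchase frequency vs. prior period
    "support_case_increase": 0.4,   # rise in support case volume
    "engagement_drop": 0.5,         # drop in engagement score
}

WEIGHTS = {"purchase_freq_decline": 0.5,
           "support_case_increase": 0.2,
           "engagement_drop": 0.3}

def churn_probability(p):
    """Weighted blend of cross-system signals into a single 0..1 score."""
    return round(sum(WEIGHTS[k] * p[k] for k in WEIGHTS), 3)

print(churn_probability(profile))  # 0.53
```

The point is the cross-system reach: purchase, support, and engagement signals live in different sources, and only the unified profile can combine them into one score.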

A healthcare company using Data Cloud needs to ensure that patient data is handled in compliance with HIPAA regulations. Only authorized users should see patient health information.

What governance measures should the consultant implement?

A) Encrypt all data and give everyone access since encryption is sufficient
B) Store patient data in a separate, disconnected system outside of Data Cloud
C) Configure Data Cloud consent management, data access policies, and field-level restrictions to ensure only authorized roles can view PHI, with audit logging enabled
D) Rely solely on Salesforce profile-based security without additional Data Cloud configuration

 

Correct answer: C – Explanation:
HIPAA compliance in Data Cloud requires layered governance: consent management tracks patient authorization, data access policies restrict who can view protected health information, field-level restrictions hide sensitive fields from unauthorized users, and audit logging provides compliance evidence. Encryption alone does not control access. Separating data defeats the purpose of unified profiles. Salesforce profiles do not fully control Data Cloud object access. Source: Trailhead: Data Cloud Overview
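The field-level restriction layer can be sketched in plain Python (a conceptual illustration; real enforcement happens through Data Cloud access policies, and the roles and field names below are hypothetical):

```python
# Hypothetical PHI fields and the roles authorized to see them.
PHI_FIELDS = {"diagnosis", "medication"}
AUTHORIZED_ROLES = {"care_coordinator", "compliance_auditor"}

def visible_fields(record, role):
    """Strip PHI fields for roles outside the authorized set."""
    if role in AUTHORIZED_ROLES:
        return dict(record)
    return {k: v for k, v in record.items() if k not in PHI_FIELDS}

patient = {"name": "J. Doe", "diagnosis": "redacted-example", "city": "Denver"}
print(sorted(visible_fields(patient, "marketing_analyst")))  # ['city', 'name']
```

Note that this models only one of the layers from the explanation; consent tracking and audit logging are separate controls that sit alongside field-level access.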

A consultant discovers that Data Cloud identity resolution is creating too many false matches — merging profiles of different customers who share the same first name and city.

How should the consultant refine the identity resolution configuration?

A) Switch to manual matching review for every potential match
B) Accept the false matches since some inaccuracy is inevitable
C) Tighten match rules by adding more required matching fields (e.g., email or phone), increasing match confidence thresholds, and adding exclusion rules for common name combinations
D) Disable identity resolution entirely to prevent false matches

 

Correct answer: C – Explanation:
Refining match rules reduces false positives by requiring more identifying fields for a match, increasing confidence thresholds for probabilistic matching, and adding exclusion rules for known problematic patterns. This balances match quality with coverage. Disabling resolution loses all matching benefit. Accepting false matches degrades data quality. Manual review does not scale. Source: Trailhead: Data Cloud Identity Resolution
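The effect of raising a confidence threshold can be seen in a tiny plain-Python sketch (illustrative only; the names and the 0.7/0.9 thresholds are invented for the example):

```python
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Fuzzy similarity score in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Two different customers who happen to share a first name and city.
score = name_similarity("Maria Lopez, Austin", "Maria Lozano, Austin")

LOOSE, TIGHT = 0.7, 0.9
print(score >= LOOSE)  # a loose threshold would merge them (a false match)
print(score >= TIGHT)  # a tighter threshold keeps them apart
```

The similar strings score high enough to clear a loose threshold but not a tight one, which is why raising thresholds (and requiring extra exact fields such as email or phone) cuts false positives.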

A consultant needs to monitor Data Cloud data ingestion health across 10 data streams. Several streams have intermittent failures that the team discovers days after the fact.

What monitoring approach should the consultant implement?

A) Create a custom polling application that checks Data Cloud status via API every hour
B) Check each data stream manually once a week
C) Configure Data Cloud monitoring dashboards and alerts that notify the team immediately when data stream ingestion failures or delays occur
D) Assume data streams are working unless users report missing data

 

Correct answer: C – Explanation:
Proactive monitoring with dashboards and alerts ensures immediate notification of ingestion failures, preventing data staleness from going undetected. Data Cloud provides monitoring capabilities for stream health, processing errors, and refresh status. Weekly manual checks allow days of undetected failures. Reactive monitoring based on user reports means issues are already impacting business. Custom polling applications duplicate built-in monitoring. Source: Trailhead: Data Cloud Overview

A consultant is designing the data model mapping for a retail client bringing e-commerce order data into Data Cloud. The source data includes orders, order line items, products, and customer records.

How should the consultant map this data to the Data Cloud data model?

A) Store the raw source data as-is without mapping to the data model
B) Map source entities to standard Data Cloud Data Model Objects — Individual for customers, Sales Order for orders, Sales Order Product for line items — preserving relationships between objects
C) Create custom Data Model Objects for every source table without using standard objects
D) Dump all data into a single Data Model Object with denormalized fields

 

Correct answer: B – Explanation:
Mapping to standard Data Model Objects (DMOs) enables Data Cloud’s built-in functionality for identity resolution, segmentation, and calculated insights. Standard DMOs like Individual, Sales Order, and Sales Order Product preserve relational integrity. Denormalized single objects lose relationships. All-custom objects miss built-in functionality tied to standard DMOs. Unmapped raw data cannot leverage Data Cloud features. Source: Trailhead: Data Cloud Overview
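How the preserved relationships pay off can be sketched in plain Python (the DMO names come from the explanation above; the row shapes and foreign-key field names are illustrative, not the real schema):

```python
# Illustrative rows: each Sales Order Product keeps a key to its Sales
# Order, and each Sales Order keeps a key to its Individual.
individual = {"Id": "IND-1"}
sales_order = {"Id": "SO-100", "IndividualId": "IND-1"}
sales_order_product = {"Id": "SOP-1", "SalesOrderId": "SO-100", "ProductId": "P-9"}

def customer_for_line_item(sop, orders, individuals):
    """Traverse the keys to recover the customer behind a line item."""
    order = orders[sop["SalesOrderId"]]
    return individuals[order["IndividualId"]]

customer = customer_for_line_item(
    sales_order_product,
    {"SO-100": sales_order},
    {"IND-1": individual},
)
print(customer["Id"])  # IND-1
```

A single denormalized object (option D) would lose exactly these keys, making "which customer bought which line items" unanswerable without re-deriving the structure.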

A consultant has configured Data Cloud for a financial services client. The client wants to use unified customer profiles to personalize the customer’s experience in real time when they visit the company’s website.

What Data Cloud capability should the consultant configure?

A) A nightly data export from Data Cloud to the website’s database
B) Data Cloud real-time data actions or web SDK integration that queries the unified profile and triggers personalization rules when the customer is identified on the website
C) Manual profile lookup by customer service agents when customers call
D) A static personalization engine that uses rules without customer data

 

Correct answer: B – Explanation:
Data Cloud supports real-time activation through data actions and web SDK integration, enabling the website to query unified profiles and trigger personalized experiences the moment a customer is identified. Nightly exports create stale data. Static rules without customer data miss personalization. Manual lookup does not apply to website interactions. Source: Trailhead: Data Cloud Overview

Get 2316+ more questions with source-linked explanations

Every answer traces to the exact Salesforce documentation page — so you learn from the source, not just memorize answers.

Exam mode & learn mode · Score by objective · Updated 14-Apr-26

Learn more...

What the Certified Data Cloud Consultant exam measures

  • Architect and maintain data streams, connectors, and data model objects to ensure clean, scalable data structures that power accurate reporting and integrations
  • Implement and monitor match rules, reconciliation, and unified profile creation to link records from disparate systems into accurate unified customer profiles
  • Design and deliver segment builder, calculated insights, and activation targets to reach the right audiences across marketing and advertising channels
  • Audit and certify consent management, data sharing, and compliance to meet regulatory requirements and maintain auditable records of system changes and access
  • Monitor and optimize ingestion, error handling, and performance to keep data pipelines reliable and unified profiles fresh

  • Review the official exam guide
  • Complete the Data Cloud trail on Trailhead — focus on data ingestion, identity resolution, and segmentation modules
  • Set up Data Cloud in a sandbox and practice ingesting data from multiple sources, configuring identity resolution, and building segments
  • Participate in a Data Cloud implementation or audit an existing deployment to understand real-world data quality challenges
  • Start with Data Ingestion and Implementation — they combine for 45% of the exam
  • Use PowerKram’s learn mode to master Data Cloud concepts with scenario-based questions
  • Run timed practice exams in PowerKram’s exam mode

Data Cloud consultants are increasingly in demand as organizations invest in customer data platforms:

  • Data Cloud Consultant — $125,000–$170,000 per year, implementing Salesforce’s customer data platform (Glassdoor salary data)
  • CDP Architect — $145,000–$195,000 per year, designing enterprise customer data strategies (Indeed salary data)
  • Marketing Technology Director — $150,000–$200,000 per year, leading martech stack integration and data unification (Glassdoor salary data)

Follow the Data Cloud Learning Path on Trailhead. The official exam guide provides the complete objective list.

Related certifications to explore

Related reading from our Learning Hub