The Salesforce architect exams are some of the most rewarding and interesting exams to earn. I really enjoy them because they offer a rare chance to dive very deep into a specific area of the platform. They can expand your capabilities within Salesforce and provide valuable understanding as you progress your career toward Technical Architect. This is the study guide for the Data Architecture and Management Designer certification exam.
Each of these exams has a study guide (like all other certifications), as well as a resource guide with linked articles, Trailhead modules, documentation, and more. To supplement those guides, I have written down some important areas to study and understand. If you understand the concepts below, you'll do well on your exam.
Data Architecture and Management Designer
The Salesforce Data Architecture and Management Designer exam focuses on your ability to understand Salesforce data architecture, particularly as it applies to large data volumes (LDV). A solid grasp of best practices will help you make good recommendations when designing a Salesforce org that can handle millions of records without hitting limits or performance issues.
Large Data Volumes
This is central to the exam, so make sure you understand the impacts of LDV. Watch this video for a description of scenarios, patterns and solutions around LDV. I found that understanding the concepts there was extremely helpful.
Make sure you understand skinny tables and indexed fields, and when you would recommend using them.
Query & Search Optimization
To use your data effectively, you need to optimize your queries. This cheat sheet is a great resource for understanding how to make your queries selective.
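Selectivity is the key idea here: a filter is selective when it returns fewer rows than the index threshold for the object's size. As a study aid, here is a small sketch of the published threshold formulas (standard index: 30% of the first million records plus 15% beyond, capped at one million rows; custom index: 10% plus 5%, capped at 333,333 rows). The function names are mine; double-check the numbers against the cheat sheet before your exam.

```python
def standard_index_threshold(total_records: int) -> int:
    """Max rows a filter on a standard-indexed field may return
    and still be considered selective by the query optimizer."""
    first_million = min(total_records, 1_000_000)
    beyond = max(total_records - 1_000_000, 0)
    # 30% of the first 1M records + 15% of the rest, capped at 1M rows.
    return min(int(0.30 * first_million + 0.15 * beyond), 1_000_000)


def custom_index_threshold(total_records: int) -> int:
    """Same idea for a custom-indexed field."""
    first_million = min(total_records, 1_000_000)
    beyond = max(total_records - 1_000_000, 0)
    # 10% of the first 1M records + 5% of the rest, capped at 333,333 rows.
    return min(int(0.10 * first_million + 0.05 * beyond), 333_333)


# An object with 2M records: a standard-index filter is selective
# below 450,000 rows; a custom-index filter below 150,000 rows.
print(standard_index_threshold(2_000_000))  # 450000
print(custom_index_threshold(2_000_000))    # 150000
```

Running the numbers like this makes it obvious why an unfiltered query (or a WHERE clause on an unindexed field) forces a full table scan on an LDV object.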
Using the Bulk API
Salesforce released a six-part series of articles on data loading for LDV, with some great recommendations and considerations. Make sure you also understand record locking and concurrency, and how they affect parallel data loads.
As part of this, make sure you understand how serialization, sharing calculations, and indexed fields affect data loading.
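One concrete locking consideration: when the Bulk API processes batches in parallel, child records that reference the same parent can contend for the parent's lock, causing retries or failures. The standard mitigation is to order child rows by their parent ID before batching, so records sharing a parent land in the same batch. A minimal sketch of that preparation step (the `AccountId` field name and 10,000-record batch size are illustrative; 10,000 is the classic Bulk API batch maximum):

```python
from itertools import islice


def order_by_parent(records, parent_field="AccountId"):
    """Sort child rows by parent ID so rows sharing a parent
    end up in the same batch, reducing parent-record lock
    contention during parallel Bulk API loads."""
    return sorted(records, key=lambda r: r[parent_field])


def batches(records, size=10_000):
    """Yield Bulk API-sized chunks of records."""
    it = iter(records)
    while chunk := list(islice(it, size)):
        yield chunk


# Interleaved child rows for two parent accounts:
rows = [
    {"AccountId": "001B", "Amount": 10},
    {"AccountId": "001A", "Amount": 20},
    {"AccountId": "001B", "Amount": 30},
]
ordered = order_by_parent(rows)
print([r["AccountId"] for r in ordered])  # ['001A', '001B', '001B']
```

If lock contention persists even with ordered batches, the fallback is to load that object in serial mode instead of parallel, trading throughput for reliability.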
Master Data Management (MDM)
You want to understand the considerations and ideas behind MDM. Which system should be the system of record? Why? How do we advise a client on an MDM solution?
What tools are available to ensure data quality? What should we consider for data backup or data archiving? What data storage limits might an organization hit, and how do you mitigate/plan for that?
Practice Questions
Universal Containers has a large volume of accounts (~300,000), each creating a related purchase record on average 1-2 times per month. The company is steadily growing and expects purchases to increase by 15%. When using a SOQL query on the Purchase__c object, what should an architect consider? Choose 2 answers.
A. Create the SOQL queries without a WHERE condition.
B. Request Salesforce Support enable Skinny Tables.
C. Reduce the number of triggers on the Purchase__c object.
D. Make the queries more selective by leveraging indexed fields.
Why? The answers are B and D. Answer A would query the entire set of purchase records, which is a terrible option. Answer C is a best practice (as few triggers as possible) but has no bearing on SOQL query performance. A skinny table allows for better query performance, as does making the WHERE clause selective with indexed fields.
Universal Containers is migrating from a legacy system to Salesforce. They will be creating 5,000 users, 2,500,000 accounts, and 10,000,000 purchase records. The visibility of these records is controlled by owner and criteria-based sharing rules. Which two approaches will minimize data loading time during the migration? Choose 2 answers.
A. Create the users, upload all data, and then deploy the sharing rules.
B. Contact Salesforce to activate indexing before uploading the data.
C. First, load all account records, and then load all user records.
D. Defer sharing calculations until the data has finished uploading.
Why? The answers are A and D. C won’t work because accounts need owners, so users must be loaded first. Activating indexing before upload will not help data loading time. Creating the users, uploading the data, and only then deploying the sharing rules allows for a more efficient upload. And you want to defer sharing calculations until the data has finished loading; this will greatly reduce the load time.
Universal Containers is implementing a Salesforce solution with both real-time web service integrations and Visualforce pages embedding data from back-end systems. UC is using a full sandbox, which integrates with full-scale back-end testing systems. Which two types of performance testing are appropriate for this project? Choose 2 answers.
A. Pre Go-live automated page-load testing against the Salesforce Full sandbox.
B. Post Go-live automated page-load testing against the Salesforce Production org.
C. Pre Go-live unit testing in the Salesforce Full sandbox.
D. Stress testing against the web services hosted by the integration middleware.
Why? The answers are A and D. Due to Salesforce's multi-tenant architecture, you can't (and shouldn't) run performance tests against production. Unit testing validates logic, not performance, so it isn't appropriate here. We do want to test the page-load times of our Visualforce pages in the full sandbox, and we need a full stress test of our middleware to ensure it can handle the system's volumes.