Universal Containers (UC) is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner.
What should a data architect do to minimize data load times due to system calculations?
A. Enable defer sharing calculations, and suspend sharing rule calculations
B. Load the data through data loader, and turn on parallel processing.
C. Leverage the Bulk API and concurrent processing with multiple batches
D. Enable granular locking to avoid the “UNABLE_TO_LOCK_ROW” error.
A
UC has millions of case records with case history and SLA data. UC's compliance team would like historical cases to be accessible for 10 years for audit purposes.
What solution should a data architect recommend?
A. Archive Case data using Salesforce Archiving process
B. Purchase more data storage to support case object
C. Use a custom object to store archived case data.
D. Use a custom Big object to store archived case data.
D
A large telecommunication provider that provides internet services to both residences and businesses has the following attributes:
* A customer who purchases its services for their home will be created as an Account in Salesforce.
* Individuals within the same house address will be created as Contacts in Salesforce.
* Businesses are created as Accounts in Salesforce.
* Some of the customers have both services at their home and business.
What should a data architect recommend for a single view of these customers without creating multiple customer records?
A. Customers are created as Contacts and related to Business and Residential Accounts using the Account Contact Relationships.
B. Customers are created as Person Accounts and related to Business and Residential Accounts using the Account Contact relationship.
C. Customers are created as Individual objects and related to Accounts for Business and Residence Accounts.
D. Customers are created as Accounts for the Residence Account, and the Parent Account field is used to relate the Business Account.
B
Universal Containers (UC) has over 10 million accounts with an average of 20 opportunities on each account. A Sales Executive at UC needs to generate a daily report for all opportunities in a specific opportunity stage.
Which two key considerations should be made to ensure the report's performance is not degraded by the large data volume?
A. Number of queries running at a time.
B. Number of joins used in report query.
C. Number of records returned by report query.
D. Number of characters in report query.
BC
A customer is facing locking issues when importing large data volumes of order records that are children in a master-detail relationship with the Account object. What is the recommended way to avoid locking issues during import?
A. Import Account records first followed by order records after sorting order by OrderID.
B. Import Account records first followed by order records after sorting orders by AccountID.
C. Change the relationship to Lookup and update the relationship to master-detail after import.
D. Import Order records and Account records separately and populate AccountID in orders using batch Apex.
B
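The idea behind answer B can be sketched in Python: sorting child Order records by their parent AccountID before batching keeps records that contend for the same parent Account lock in the same batch, so different batches rarely block each other. This is a minimal illustration; the field names and batch size are assumptions, not a real Bulk API client.

```python
def batch_orders(orders, batch_size=200):
    """Sort child records by parent AccountId so rows that lock the
    same parent Account land in the same batch, reducing contention."""
    ordered = sorted(orders, key=lambda o: o["AccountId"])
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]

orders = [{"OrderId": 3, "AccountId": "A2"},
          {"OrderId": 1, "AccountId": "A1"},
          {"OrderId": 2, "AccountId": "A2"}]
batches = batch_orders(orders, batch_size=2)
# After sorting, both "A2" orders sit next to each other, so at
# realistic batch sizes they share a batch and a single Account lock.
```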
To address different compliance requirements, such as the General Data Protection Regulation (GDPR), personally identifiable information (PII), the Health Insurance Portability and Accountability Act (HIPAA), and others, a Salesforce customer decided to categorize each data element in Salesforce with the following:
* Data owner
* Security level, such as confidential
* Compliance types, such as GDPR, PII, and HIPAA
A compliance audit would require Salesforce admins to generate reports to manage compliance.
What should a data architect recommend to address this requirement?
A. Use metadata API, to extract field attribute information and use the extract to classify and build reports
B. Use field metadata attributes for compliance categorization, data owner, and data sensitivity level.
C. Create a custom object and field to capture necessary compliance information and build custom reports.
D. Build reports for field information, then export the information to classify and report for Audits.
B
UC has one Salesforce org (Org A) and recently acquired a secondary company with its own Salesforce org (Org B).
UC has decided to keep the orgs running separately but would like to bidirectionally share opportunities between the orgs in near-real time.
Which three options should a data architect recommend to share data between Org A and Org B?
Choose 3 answers.
A. Leverage Heroku Connect and Heroku Postgres to bidirectionally sync Opportunities.
B. Install a 3rd party AppExchange tool to handle the data sharing
C. Develop an Apex class that pushes opportunity data between orgs daily via a scheduled Apex job.
D. Leverage middleware tools to bidirectionally send Opportunity data across orgs.
E. Use Salesforce Connect and the cross-org adapter to visualize Opportunities into external objects
BDE
A large healthcare provider wishes to use Salesforce to track patient care. The following actors are in Salesforce: Patients, Payment providers, and Doctors.
A data architect needs to map these actors to Salesforce objects.
What should be the optimal selection by the data architect?
A. Patients as Accounts, Payment providers as Accounts, and Doctors as Person Accounts
B. Patients as Person Accounts, Payment providers as Accounts, and Doctors as Person Accounts
C. Patients as Contacts, Payment providers as Accounts, and Doctors as Accounts
D. Patients as Person Accounts, Payment providers as Accounts, and Doctors as Contacts
B
As part of addressing General Data Protection Regulation (GDPR) requirements, UC plans to implement a data classification policy for all of its internal systems that store customer information, including Salesforce.
What should a data architect recommend so that UC can easily classify customer information maintained in Salesforce under both standard and custom objects?
A. Use App Exchange products to classify fields based on policy.
B. Use data classification metadata fields available in field definition.
C. Create a custom picklist field to capture classification of information on customer.
D. Build reports for customer information and validate.
B
UC is switching from a legacy CRM to Salesforce and wants to keep the legacy CRM and Salesforce in place until all of the functionality is deployed in Salesforce. UC wants to keep data in sync between Salesforce, the legacy CRM, and SAP. What should a data architect recommend?
A. Integrate the legacy CRM with Salesforce, and keep data in sync until the new functionality is in place
B. Do not integrate the legacy CRM with Salesforce, but integrate Salesforce with SAP
C. Integrate SAP with Salesforce and SAP with the legacy CRM, but not the legacy CRM with Salesforce
D. Suggest an MDM solution, and link the MDM to Salesforce and SAP
Two answers are possible.
CD
Universal Containers has two systems: Salesforce and an on-premises ERP system. An architect has been tasked with copying Opportunity records to the ERP once they reach a Closed/Won stage. The Opportunity record in the ERP system will be read-only for all fields copied in from Salesforce. What is the optimal real-time approach that achieves this solution?
A. Implement a Master Data Management system to determine system of record.
B. Implement a workflow rule that sends Opportunity data through Outbound Messaging.
C. Have the ERP poll Salesforce nightly and bring in the desired Opportunities.
D. Implement an hourly integration to send Salesforce Opportunities to the ERP system.
B
Universal Containers (UC) is facing data quality issues where Sales Reps are creating duplicate customer accounts, contacts, and leads. UC wants to fix this issue immediately by prompting users about a record that possibly exists in Salesforce. UC wants a report regarding duplicate records. What would be the recommended approach to help UC start immediately?
A. Create an after insert and update trigger on the account, contact and lead, and send an error if a duplicate is found using a custom matching criteria.
B. Create a duplicate rule for account, lead, and contact, use standard matching rules for these objects, and set the action to report and alert for both creates and edits.
C. Create a duplicate rule for account, lead, and contact, use standard matching rules for these objects, and set the action to block for both creates and edits.
D. Create a before insert and update trigger on account, contact, and lead, and send an error if a duplicate is found using a custom matching criteria.
B
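The "matching criteria" concept behind these options can be sketched in Python. This is a toy exact match on a normalized email; Salesforce's standard matching rules use fuzzy matching algorithms, so the function below is only an illustration, and the field names are assumptions.

```python
def normalize_email(email):
    # Trivial normalization: trim whitespace and lowercase.
    return email.strip().lower()

def find_possible_duplicates(existing_records, candidate):
    """Return existing records whose normalized email matches the candidate's."""
    key = normalize_email(candidate["Email"])
    return [r for r in existing_records if normalize_email(r["Email"]) == key]

existing = [{"Name": "Ada", "Email": "ada@example.com"},
            {"Name": "Bob", "Email": "bob@example.com"}]
dupes = find_possible_duplicates(existing, {"Name": "Ada L.", "Email": " Ada@Example.com "})
# dupes contains only the existing "Ada" record
```

A duplicate rule with the "report and alert" action (answer B) applies this kind of matching without blocking the save, which is why it fits the requirement to prompt users and still produce a duplicate report.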
Which two best practices should be followed when using SOSL for searching?
A. Use searches against single Objects for greater speed and accuracy.
B. Keep searches specific and avoid wildcards where possible.
C. Use SOSL option to ignore custom indexes as search fields are pre-indexed.
D. Use Find in “ALL FIELDS” for faster searches.
AB
Universal Containers (UC) is replacing a home-grown CRM solution with Salesforce. UC has decided to migrate operational (open and active) records to Salesforce while keeping historical records in the legacy system. UC would like historical records to be available in Salesforce on an as-needed basis.
Which solution should a data architect recommend to meet business requirement?
A. Leverage real-time integration to pull records into Salesforce.
B. Bring all data into Salesforce, and delete it after a year.
C. Leverage mashup to display historical records in Salesforce.
D. Build a swivel chair solution to go to the legacy system and display records.
C
A manager at Cloud Kicks is importing Leads into Salesforce and needs to avoid creating duplicate records.
Which two approaches should the manager take to achieve this goal? (Choose two.)
A. Acquire an AppExchange Lead de-duplication application.
B. Implement Salesforce Matching and Duplicate Rules.
C. Run the Salesforce Lead Mass de-duplication tool.
D. Create a Workflow Rule to check for duplicate records.
AB
Universal Containers (UC) loads bulk leads and campaigns from third-party lead aggregators on a weekly and monthly basis. The expected lead record volume is 500K records per week, and the expected campaign records volume is 10K campaigns per week. After the upload, Lead records are shared with various sales agents via sharing rules and added as Campaign members via Apex triggers on Lead creation. UC agents work on leads for 6 months, but want to keep the records in the system for at least 1 year for reference. Compliance requires them to be stored for a minimum of 3 years. After that, data can be deleted. What statement is true with respect to a data archiving strategy for UC?
A. UC can store long-term lead records in custom storage objects to avoid counting against storage limits.
B. UC can leverage the Salesforce Data Backup and Recovery feature for data archival needs.
C. UC can leverage recycle bin capability, which guarantees record storage for 15 days after deletion.
D. UC can leverage a “tier”-based approach to classify the record storage need.
D
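Answer D's "tier"-based classification can be sketched as a simple age-based function. The tier names and month thresholds below are assumptions derived from the scenario's 6-month working window, 1-year reference period, and 3-year compliance retention; a real strategy would map each tier to a storage location and deletion policy.

```python
def storage_tier(age_in_months):
    """Classify a lead record's storage tier by its age (illustrative thresholds)."""
    if age_in_months <= 6:
        return "active"      # agents are still working the lead
    if age_in_months <= 12:
        return "reference"   # kept in the org for reference
    if age_in_months <= 36:
        return "archive"     # compliance retention; candidate for off-platform storage
    return "purge"           # past retention; eligible for deletion

print([storage_tier(m) for m in (3, 9, 24, 48)])
# prints ['active', 'reference', 'archive', 'purge']
```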
Universal Containers (UC) needs to move millions of records from an external enterprise resource planning (ERP) system into Salesforce.
Which scenario should lead a data architect to recommend using the Bulk API in serial mode instead of parallel mode?
A. Placing 20 batches on the queue for upsert jobs.
B. Inserting 1 million orders distributed across a variety of accounts with potential lock exceptions.
C. Leveraging a controlled feed load with 10 batches per job.
D. Inserting 1 million orders distributed across a variety of accounts with lock exceptions eliminated and managed.
D
Northern Trail Outfitters (NTO) wants to start a loyalty program to reward repeat customers. The program will track every item a customer has bought and grants them points for discounts. The following conditions will exist upon implementation:
* Data will be used to drive marketing and product development initiatives.
* NTO estimates that the program will generate 100 million rows of data monthly.
* NTO will use Salesforce's Einstein Analytics and Discovery to leverage their data and make business and marketing decisions.
What should the Data Architect do to store, collect, and use the reward program data?
A. Create a custom big object in Salesforce which will be used to capture the Reward Program data for consumption by Einstein.
B. Have Einstein connect to the point of sales system to capture the Reward Program data.
C. Create a big object in Einstein Analytics to capture the Loyalty Program data.
D. Create a custom object in Salesforce that will be used to capture the Reward Program data.
A
Cloud Kicks is launching a Partner Community, which will allow users to register shipment requests that are then processed by Cloud Kicks employees. Shipment requests contain header information, and then a list of no more than 5 items being shipped.
First, Cloud Kicks will introduce its community to 6,000 customers in North America, and then to 24,000 customers worldwide within the next two years. Cloud Kicks expects 12 shipment requests per week per customer, on average, and wants customers to be able to view up to three years of shipment requests and use Salesforce reports.
What is the recommended solution for the Cloud Kicks Data Architect to address the requirements?
A. Create an external custom object to track shipment requests and a child external object to track shipment items. External objects are stored off-platform in Heroku’s Postgres database.
B. Create an external custom object to track shipment requests with five lookup custom fields for each item being shipped. External objects are stored off-platform in Heroku’s Postgres database.
C. Create a custom object to track shipment requests and a child custom object to track shipment items. Implement an archiving process that moves data off-platform after three years.
D. Create a custom object to track shipment requests with five lookup custom fields for each item being shipped. Implement an archiving process that moves data off-platform after three years.
C
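The data-volume reasoning behind this question can be checked with quick arithmetic using the scenario's stated figures (24,000 worldwide customers, 12 requests per week, three years of history, at most 5 items per request):

```python
customers = 24_000          # worldwide rollout
requests_per_week = 12      # average per customer
weeks = 52 * 3              # three years of reportable history

shipment_requests = customers * requests_per_week * weeks
shipment_items = shipment_requests * 5   # at most 5 items per request

print(shipment_requests, shipment_items)
# prints 44928000 224640000
```

Roughly 45 million request headers (and up to ~225 million item rows) is why answer C pairs a normalized parent-child custom object model with an archiving process, rather than keeping everything on-platform indefinitely.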
Cloud Kicks has the following requirements:
* Their Shipment custom object must always relate to a Product, a Sender, and a Receiver (all separate custom objects).
* If a Shipment is currently associated with a Product, Sender, or Receiver, deletion of those records should not be allowed.
* Each custom object must have separate sharing models.
What should an Architect do to fulfill these requirements?
A. Associate the Shipment to each parent record by using a VLOOKUP formula field.
B. Create a required Lookup relationship to each of the three parent records.
C. Create a Master-Detail relationship to each of the three parent records.
D. Create two Master-Detail and one Lookup relationship to the parent records.
B
Northern Trail Outfitters (NTO) has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer records.
In Salesforce, the customer is represented by the Contact object.
NTO utilizes an MDM solution with these attributes:
1. The MDM solution keeps track of the customer master with a master key.
2. The master key is a map to the record IDs from each external system that customer data is stored within.
3. The MDM solution provides de-duplication features, so it acts as the single source of truth.
How should a data architect implement the storage of the master key within Salesforce?
A. Store the master key in Heroku Postgres, and use Heroku Connect for synchronization.
B. Create a custom object to store the master key with a lookup field to Contact.
C. Create an external object to store the master key with a lookup field to Contact.
D. Store the master key on the Contact object as an external ID field (for referential imports).
D
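The value of answer D (storing the master key as an external ID on Contact) is that data loads become idempotent upserts keyed on the MDM key. A minimal in-memory sketch of that keying, mirroring how a Salesforce upsert on an external ID field matches existing rows (the `MasterKey__c` field name is illustrative):

```python
def upsert_by_master_key(store, master_key, fields):
    """Idempotent upsert keyed on an external ID: match an existing
    record by master key, or create one, then merge the new fields."""
    record = store.setdefault(master_key, {"MasterKey__c": master_key})
    record.update(fields)
    return record

contacts = {}
upsert_by_master_key(contacts, "MDM-001", {"LastName": "Nguyen"})
upsert_by_master_key(contacts, "MDM-001", {"Phone": "555-0100"})
# Still a single contact; repeated loads merge fields instead of
# creating duplicates, which is what the external ID guarantees.
```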
Universal Containers (UC) has an open sharing model for its Salesforce users to allow all its Salesforce internal users to edit all contacts, regardless of who owns the contact. However, UC management wants to allow only the owner of a contact record to delete that contact. If a user does not own the contact, then the user should not be allowed to delete the record. How should the architect approach the project so that the requirements are met?
A. Create a “before delete” trigger to check if the current user is not the owner.
B. Set the Sharing settings as Public Read Only for the Contact object.
C. Set the profile of the users to remove delete permission from the Contact object.
D. Create a validation rule on the Contact object to check if the current user is not the owner.
A
Universal Containers (UC) has a complex system landscape and is implementing a data governance program for the first time. Which two first steps would be appropriate for UC to initiate an assessment of data architecture? Choose 2 answers
A. Engage with IT program managers to assess current velocity of projects in the pipeline.
B. Engage with database administrators to assess current database performance metrics.
C. Engage with executive sponsorship to assess enterprise data strategy and goals.
D. Engage with business units and IT to assess current operational systems and data models.
CD
Universal Containers (UC) stores 10 million rows of inventory data in a cloud database. As part of creating a connected experience in Salesforce, UC would like to surface this inventory data in Sales Cloud without an import. UC has asked its data architect to determine if Salesforce Connect is needed.
Which three considerations should the data architect make when evaluating the need for Salesforce Connect?
A. You want real-time access to the latest data, from other systems.
B. You have a large amount of data and would like to copy subsets of it into Salesforce.
C. You need to expose data via a virtual private connection.
D. You have a large amount of data that you don’t want to copy into your Salesforce org.
E. You need to access small amounts of external data at any one time.
ADE