The Latest Drops

I finally got back on the horse, so to speak, and got my recording rig up and running again. I needed the break after the 100-video challenge. Now I’m back, with the goal of continuing to provide relevant learning videos.

Here are the videos and key topics for April 2024:

  1. Data Cloud Diaries - Ingesting from Amazon S3 - Cool Updates: We are back in Data Cloud, and I want to share some cool new updates! There is a new way to create Amazon S3 Data Streams, and I want to show how it works. Although it adds an extra step on the first Data Stream, I like how it separates the Credentials from the Data Stream. It makes creating additional Data Streams so much easier.

  2. Data Cloud Diaries - Data Cloud Triggered Flow: As we dive back into Data Cloud, I want to share a new feature.

    Recently I was working in Data Cloud and saw the new Data Cloud Triggered Flow. This is a Flow you can build in the Sales Cloud Org that is triggered by changes to Data Cloud DMOs. It was very easy to monitor for a changed DMO and then quickly do something in the Sales Cloud Org, such as creating a new record.

    You should definitely keep this in mind; it is a great new feature.

April 26, 2024

April 30, 2024

This week, I’m focused on reaching my 100th video! It’s been a long journey, and coincidentally enough, it will happen during the Cactusforce Conference. I’m definitely celebrating!

I’ve been rounding out the videos so that each series is balanced and close to complete. I plan to continue creating videos, but not at the rapid daily pace at which I’ve been cranking them out. I am proud that I’ve established a repository of knowledge that I can build on and that others can draw on. Cheers!

Here are the videos and key topics from Jan. 22nd - Jan. 26th.

  1. Integration Patterns: Options for Achieving Near-Real-Time Notifications to External Systems: This is a high-level overview of potential integration patterns for when a change in Salesforce needs to be transmitted to an external system in near real time. I review: -Flows vs Triggers, -Outbound Messages, -Platform Events, -Change Data Capture, -Flow HTTP Callout Actions, -External Services, -Apex Callouts. This gives you a high-level view of the possibilities, and you can then determine which one is best for your situation.

  2. Data Cloud Diaries: Overview of Methods for Loading and Ingesting Data into Data Cloud: There are now a number of ways to bring data into Data Cloud. In this session, I provide a summary of the different methods, along with thoughts on frequency and when you might want to use each. You can pull from other Salesforce products: -Salesforce Core, -Salesforce Marketing Cloud, -Salesforce B2C Commerce Cloud. You can pull from cloud storage: -Amazon, -Google, -Azure. And you can push data: -SFTP, -Ingestion API (Bulk and Streaming), -Mobile/Website API (a hedged Ingestion API sketch follows this list). I also show where you can find the reference links.

  3. Data Architecture: Survey Data Model Explained, Simple to Complex: This is one of my favorite data model patterns. It starts simply: you need dynamic Surveys (the ability to add more), which have dynamic Questions (again, the ability to add more), and you need to capture the responses. I have used this model for many years, even back before Salesforce, and I have implemented it in many different technologies, such as Java, .NET, and even FoxPro! The model can start with just a few objects, but then additional requirements come into play: -Shared Questions, -Dynamic Response Types, -Survey Versioning (a minimal object sketch follows this list). Once you have the base, you can grow this to even more complexity. It is an important pattern to understand, use, and modify.

  4. Data Architecture: Survey Data Model Expanded to Dynamic Forms: In this session I build on the Survey Data Model and show how it can be expanded to handle Dynamic Forms. Dynamic Forms is the concept where you can have: -Multiple Forms, -Questions grouped on Pages, -Questions conditionally displayed based on the answers to previous questions, -Pages conditionally displayed. Once you get a handle on the capabilities of this approach, you can see how it can be expanded to handle more advanced capabilities.

  5. Data Architecture: Using a Data Dictionary for Planning Before You Set Up in Salesforce: There is a woodworking phrase, "Measure twice, cut once," and it applies to configuring Salesforce as well. If, at the start of a project, you immediately go into a Sandbox and start adding configuration, you may end up spending more time reworking it as the design gets updated. I have been using Data Dictionaries for many years, back when I was working in FoxPro, .NET, and Java. A Data Dictionary is a definition of all the tables and fields. I had code that could generate the DB schema from it automatically and handle field updates for incremental additions. So, many years ago, when working on Service Cloud implementations, I found that I could design 90% of the configuration in a Data Dictionary spreadsheet before I even touched Salesforce.

    I put in: -Objects: Standard and Custom, -Fields: Standard and Custom, -Picklists: where they are used and which values, -Roles: building out the hierarchy. And more: Public Groups, Assignment Rules, Validations, Escalations, Milestones, etc. I would then standardize the field names and descriptions. I could design most of the solution in the spreadsheet and then just plug it in quickly. Later, I found myself using the Data Dictionary for integrations, mapping fields to systems, and more (a small schema-generator sketch follows this list).
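To make the "push" option concrete, here is a minimal sketch of a streaming insert through the Data Cloud Ingestion API using Python and the requests library. It is a sketch under assumptions: the tenant hostname, connector name, object name, and token are placeholders, and you should confirm the exact endpoint and authentication flow in the Ingestion API documentation for your org.

```python
import requests

# Placeholders/assumptions: the Data Cloud access token and tenant-specific
# hostname come from the Data Cloud OAuth token exchange; the connector and
# object names are defined by your Ingestion API connector schema.
TENANT_HOST = "your-tenant.c360a.salesforce.com"   # placeholder
ACCESS_TOKEN = "<data-cloud-access-token>"          # placeholder
CONNECTOR = "my_ingestion_connector"                # hypothetical connector API name
OBJECT_NAME = "runner_profiles"                     # hypothetical object in the connector schema

def stream_records(records):
    """Send a small batch of records to the streaming ingestion endpoint."""
    url = f"https://{TENANT_HOST}/api/v1/ingest/sources/{CONNECTOR}/{OBJECT_NAME}"
    resp = requests.post(
        url,
        json={"data": records},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(stream_records([{"id": "001", "email": "jane@example.com"}]))
```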
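To make the Survey pattern's relationships explicit, here is a plain-Python sketch of the base objects plus the shared-questions and versioning extensions. The object and field names are illustrative only, not the exact ones used in the video.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative relational sketch: Survey -< SurveyQuestion >- Question,
# with Response rows keyed by survey, question, and respondent. The
# junction object is what makes questions shareable across surveys.

@dataclass
class Survey:
    survey_id: str
    name: str
    version: int = 1             # supports the Survey Versioning extension

@dataclass
class Question:
    question_id: str
    text: str
    response_type: str = "text"  # Dynamic Response Types: text, picklist, number, ...

@dataclass
class SurveyQuestion:
    # Junction object so the same Question can be reused on many Surveys
    survey_id: str
    question_id: str
    display_order: int
    display_condition: Optional[str] = None  # hook for the Dynamic Forms expansion

@dataclass
class Response:
    survey_id: str
    question_id: str
    respondent_id: str
    value: str
```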
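And to give a flavor of the "generate the schema from the dictionary" idea, here is a tiny sketch that turns a data-dictionary-style structure into SQL DDL. It is not the spreadsheet or generator from the video, just a minimal illustration of the pattern; a real version would read the spreadsheet and also emit ALTER TABLE statements for incremental field additions.

```python
# A data dictionary captured as plain Python structures, and a generator
# that emits CREATE TABLE statements from it. Table and field names are
# illustrative only.
DATA_DICTIONARY = {
    "Survey": {
        "Name": "VARCHAR(120) NOT NULL",
        "Version": "INT DEFAULT 1",
    },
    "Question": {
        "Text": "VARCHAR(255) NOT NULL",
        "ResponseType": "VARCHAR(40)",
    },
}

def generate_ddl(dictionary):
    statements = []
    for table, fields in dictionary.items():
        cols = ["  Id VARCHAR(18) PRIMARY KEY"]
        cols += [f"  {name} {definition}" for name, definition in fields.items()]
        statements.append(f"CREATE TABLE {table} (\n" + ",\n".join(cols) + "\n);")
    return "\n\n".join(statements)

if __name__ == "__main__":
    print(generate_ddl(DATA_DICTIONARY))
```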

January 22, 2024

January 23, 2024

January 24, 2024

January 25, 2024

January 26, 2024

This was a great week to start wrapping up the Data Cloud series. I got certified to teach Data Cloud this week, and I learned much more while doing it. I have also pushed the boundaries of what it can do in this series; it was fun!

Here are the videos and key topics from Jan. 15th - Jan. 19th.

  1. Data Cloud Diaries, Data Transformation Demo: Welcome back to the Data Cloud Diaries. In these videos I have been showing different capabilities of Data Cloud, and today I wanted to show the power of Data Transformations. Data Transformations can take data in DLOs (Data Lake Objects) or DMOs (Data Model Objects) and transform it into existing or new objects. Note: DMOs go to DMOs and DLOs go to DLOs. Without code, you can do some sophisticated and complex mappings and transformations. In this video I show: -Choosing Columns, -Creating new Formula Fields, -Filtering Records, -Running Aggregate Calculations, and then mapping to a new DMO (a rough analogy of these steps appears after this list). This demonstrates some of the capabilities you have.

  2. Data Cloud Diaries: JDBC Driver Access into Data Cloud from DBeaver: Here is another cool capability: the ability to access your Data Cloud data externally using SQL through JDBC. This gives you external access to your Data Lake Objects (DLOs) and Data Model Objects (DMOs) using the powerful SQL syntax. In this video, I show you how to set up JDBC access using the DBeaver client.

  3. Data Architecture: Address, Affiliation, and Relationship Learnings from NPSP Data Model: As a Data Architect, you have to balance the requirements and design an optimal data model, and looking at other implementations is a way to learn successful approaches. By diving into the Non-Profit Consultant material, I got deep into the NPSP Data Model and saw some great patterns. In this session, I show the power of some simple concepts: -Household (Account and Contact), -Address (custom object with flexibility), -Relationship (custom object joining Contacts together), -Affiliation (custom object joining a Contact with other Accounts). Take a tour with me and see if some of these concepts help you in your future design sessions!

  4. Data Architecture: Denormalized Object for Imports, Lessons from Data Cloud, NPSP & Field Service: Importing data can be challenging when your target data model is complex and the source data does not match it. In this session, I show creating a denormalized staging SObject to receive the data directly, and then post-processing to place it in the target objects. This is not new; it follows existing patterns: a) the Non-Profit NPSP has a full Data Import process that follows this pattern, b) Data Cloud is designed to bring data in its existing structures into Data Lake Objects and then normalize it into Data Model Objects, c) on one project I brought data into a temp object and then had batch processes push it to Work Order, Service Appointment, etc. (a staging-object sketch appears after this list). This approach can help you bring the data into Salesforce and give you an audit log of how and where the data was mapped. Learn from this and take what works for you.

  5. Data Cloud Diaries: Using Data Spaces to Separate Your Data and the Extra Steps to Connect: Data Spaces are a mechanism in Data Cloud that lets you keep separate sets of data and control access to them. This is helpful when you need that control, but it adds additional steps when you want to connect. In this session, I walk through some of the additional steps needed to connect to your data (Data Lake Objects, Data Model Objects) when it is assigned to a Data Space.
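The transformation itself is built with clicks in Data Cloud, but as a rough analogy, the same four steps (choose columns, add a formula field, filter, aggregate) look like this in pandas. This is only an illustration of the data shaping, not how Data Cloud executes it, and the column names are made up.

```python
import pandas as pd

# Hypothetical DLO-style input: one row per order event
orders = pd.DataFrame({
    "customer_id": ["C1", "C1", "C2", "C3"],
    "amount":      [120.0, 80.0, 45.0, 300.0],
    "status":      ["Closed", "Closed", "Open", "Closed"],
})

# 1. Choose columns
shaped = orders[["customer_id", "amount", "status"]]

# 2. Create a new formula field
shaped = shaped.assign(amount_with_tax=shaped["amount"] * 1.08)

# 3. Filter records
shaped = shaped[shaped["status"] == "Closed"]

# 4. Run an aggregate calculation; this result is what would map to a new DMO
per_customer = shaped.groupby("customer_id", as_index=False)["amount_with_tax"].sum()
print(per_customer)
```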
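Here is a hedged sketch of the staging-object pattern using the simple_salesforce Python library: land the raw rows in a staging custom object, then post-process them into the real targets. The credentials and the object and field names (Import_Staging__c, Processed__c, and the Work Order mapping) are hypothetical, and in the video the post-processing is done with batch processes inside the org rather than an external script.

```python
from simple_salesforce import Salesforce

# Hypothetical credentials and object/field names, for illustration only.
sf = Salesforce(username="user@example.com", password="***", security_token="***")

# Step 1: land the raw source rows in a denormalized staging object as-is.
raw_rows = [
    {"Source_Ref__c": "A-100", "Asset_Name__c": "Pump 7", "Visit_Date__c": "2024-02-01"},
    {"Source_Ref__c": "A-101", "Asset_Name__c": "Pump 9", "Visit_Date__c": "2024-02-03"},
]
sf.bulk.Import_Staging__c.insert(raw_rows)

# Step 2: post-process staged rows into the real targets (here, Work Orders),
# keeping each staging record as an audit trail of how the row was mapped.
staged = sf.query("SELECT Id, Source_Ref__c, Asset_Name__c, Visit_Date__c "
                  "FROM Import_Staging__c WHERE Processed__c = false")
for row in staged["records"]:
    work_order = sf.WorkOrder.create({
        "Subject": f"Imported visit for {row['Asset_Name__c']}",
        "Description": f"Source ref {row['Source_Ref__c']}",
    })
    sf.Import_Staging__c.update(row["Id"], {
        "Processed__c": True,
        "Target_Record_Id__c": work_order["id"],
    })
```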

January 15, 2024

January 16, 2024

January 17, 2024

January 18, 2024

January 19, 2024

I really do enjoy tackling the complex, which is why I think integrations are my bread and butter. Enjoy these insights!

Here are the videos and key topics from Jan. 8th - Jan. 12th.

  1. Across APIs: Salesforce Tooling API Viewing and Creating Metadata: Going further into the Tooling API, I show how you can perform the same tasks as Visual Studio Code or other IDEs, but through the APIs directly. In this video I show how to connect to the Tooling API and automate tasks such as: -Viewing Custom Objects, -Viewing Custom Fields, -Creating a Custom Field, -Viewing Apex Classes, -Creating an Apex Class (a hedged query sketch follows this list). There are a lot of capabilities here; it’s definitely worth learning and having ready in case you need it.

  2. Across APIs: Platform Event Behavior Compared to SObject Behavior: Platform Events are a very powerful mechanism for communication both inside and outside of Salesforce. Platform Events seem to have many of the behaviors of SObjects (Standard/Custom); however, there are some significant differences. In this video, I compare the behaviors of traditional SObjects with those of Platform Events. For SObjects: -Write with DML, -Read with SOQL or SOSL, -Records are identified with a primary key (the record Id).

    For Platform Events: -Publishing sends the event to the end of the channel, -Subscribing has 3 options: -Option #1: New Events Only, -Option #2: Events after a Replay ID, -Option #3: All Events in the Retention Window (a small publish/replay sketch follows this list).

  3. Across the APIs: Resilience in Integration, Platform Events and Use of Replay ID Explained: Using Platform Events and the Replay ID properly can enable a resilient connection between Salesforce and an external subscriber. In this session I explain the Replay ID and how the different values can be used to set up a resilient connection. There are 3 options, with slightly different values depending on whether you are using the Streaming API or the Pub/Sub API: -New Messages Only, -Missed Messages starting at a particular message's Replay ID, -All Messages in the Retention Window. These are important concepts to understand as an Integration Architect.

  4. Across the APIs: Viewing Platform Events Using The Streaming Monitor: Platform Events are a very powerful tool for communication between processes and even systems. Using them can create a decoupled, flexible system. However, because there is no way to query Platform Events, the only way to see them and their contents is to subscribe with a client. The Streaming Monitor is a great tool that offers the ability to subscribe to both Standard and Custom Platform Events. In this video, I demonstrate how you can create and monitor custom Platform Events.

  5. Across the APIs: Platform Event Patterns, Inside of Salesforce, External Subscribers, and with Kafka: Platform Events can be a flexible pattern for communication between processes. In this session, I show some different patterns: -Between processes inside of Salesforce, -With external, direct Subscribers/Publishers, -Bridging to an external event bus such as Kafka. Using Platform Events can help build a flexible system that can grow over time to include more Publishers and Subscribers, even from directions you did not anticipate in the initial designs.
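For a taste of what "through the APIs directly" looks like, here is a hedged Python sketch that queries custom objects and Apex classes via the Tooling API and creates a custom field. The instance URL, API version, token, and the exact Metadata payload shape are assumptions to confirm against the Tooling API documentation for your org.

```python
import requests

INSTANCE_URL = "https://your-domain.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<session-or-oauth-token>"                 # placeholder
API_VERSION = "v59.0"                                     # adjust to your org
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}",
           "Content-Type": "application/json"}

def tooling_query(soql):
    """Run a SOQL query against the Tooling API and return the records."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/tooling/query"
    resp = requests.get(url, headers=HEADERS, params={"q": soql}, timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]

# Viewing metadata
print(tooling_query("SELECT Id, DeveloperName FROM CustomObject"))
print(tooling_query("SELECT Id, Name, ApiVersion FROM ApexClass"))

# Creating a custom field (hypothetical field; payload shape is an assumption)
field_payload = {
    "FullName": "Account.Preferred_Channel__c",
    "Metadata": {"label": "Preferred Channel", "type": "Text", "length": 40},
}
resp = requests.post(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/tooling/sobjects/CustomField",
    headers=HEADERS, json=field_payload, timeout=30)
print(resp.status_code, resp.json())
```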
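To contrast the two write paths, here is a small Python sketch that publishes a Platform Event through the standard REST sobjects endpoint and notes the replay options a subscriber would choose from. The event and field names are hypothetical, and a real subscriber would use CometD or the Pub/Sub API rather than plain REST.

```python
import requests

INSTANCE_URL = "https://your-domain.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<session-or-oauth-token>"                 # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}",
           "Content-Type": "application/json"}

# Publish: like a DML insert, but the event goes to the end of the channel.
# Order_Shipped__e and Order_Number__c are hypothetical names.
event = {"Order_Number__c": "ORD-1042"}
resp = requests.post(
    f"{INSTANCE_URL}/services/data/v59.0/sobjects/Order_Shipped__e",
    headers=HEADERS, json=event, timeout=30)
print(resp.json())  # an acknowledgement, not a queryable record

# Read: there is no SOQL over Platform Events. A subscriber on the channel
# /event/Order_Shipped__e instead picks a replay option:
#   replayId = -1  -> new events only
#   replayId = -2  -> all events still in the retention window
#   replayId = <n> -> events published after that Replay ID (resume after a gap)
```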

January 8, 2024

January 9, 2024

January 10, 2024

January 11, 2024

January 12, 2024

Happy 2024! I’ve started back at it and decided to continue the journey this week going across the APIs and adding to my integration extravaganza series.

Here are the videos and key topics from Jan. 2nd - Jan. 5th.

  1. Across the APIs: Salesforce GraphQL Introduction: This is an introduction to GraphQL for Salesforce. It is a newer API that follows the GraphQL pattern, and it has some cool features: fine control over the objects and fields returned, the ability to query multiple objects in the same request, and a strong schema. Today is an introduction to what it can do.

  2. Across APIs: Salesforce GraphQL, Object Schema, Object Query, and Aggregate Query: Delving deeper… In this powerful API, you can query many things in the same request. I show that you can: query multiple SObjects, retrieve metadata for multiple objects, and make aggregate queries. And you can do all of these in one single call that is tailored to your specific needs (a hedged query sketch follows this list). This can create efficient, fast queries for data.

  3. Across APIs: Salesforce UI API Can Power Your Own User Interface: And here is another fun Salesforce API: the UI API. With this API you can tap into the power of the Salesforce user experience, but through the API, so you can present your own user experience to your users. It offers a wide variety of data elements and structures that allow you to recreate much of the data and configuration items available in the Salesforce user interface (a record-fetch sketch follows this list). Using this, you can build your own user interface powered by Salesforce data.

  4. Across APIs: Salesforce Tooling API Introduction to Automated Developer Control: Did you know that you can automate many development tasks? The Salesforce Tooling API provides an interface for controlling Salesforce like a developer, but in a way that can be automated. Using this API you can perform a number of powerful queries, commands, and updates, and you can have external tools or scripts that help you automate Salesforce development. In this video I show how to connect to the Tooling API and automate tasks such as: -Run Apex Tests, -Query ApexLog/Debug Logs, -Retrieve an ApexLog/Debug Log, -Execute Anonymous Apex. This is just the tip of the iceberg for what you can do; I will show more in future videos!
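As a hedged illustration of "fine control in a single call," here is a Python sketch posting a query to the Salesforce GraphQL endpoint. The endpoint path and the uiapi query shape reflect my reading of the API; verify the field selections and the aggregate syntax against the GraphQL API documentation before relying on them.

```python
import requests

INSTANCE_URL = "https://your-domain.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<oauth-access-token>"                      # placeholder

# One request, scoped to exactly the objects and fields we want back.
query = """
query recentAccounts {
  uiapi {
    query {
      Account(first: 5, orderBy: { Name: { order: ASC } }) {
        edges { node { Id Name { value } Industry { value } } }
      }
    }
  }
}
"""

resp = requests.post(
    f"{INSTANCE_URL}/services/data/v59.0/graphql",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Content-Type": "application/json"},
    json={"query": query},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["data"]["uiapi"]["query"]["Account"]["edges"])
```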
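And here is a minimal sketch of pulling record data through the UI API so you could render it in your own front end. The record Id and field list are placeholders, and the real response also carries layout and display metadata that is not shown here.

```python
import requests

INSTANCE_URL = "https://your-domain.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<oauth-access-token>"                      # placeholder
RECORD_ID = "001XXXXXXXXXXXXXXX"                           # placeholder Account Id

# The UI API returns the record values along with display metadata the
# standard Salesforce UI would use, so a custom front end can mirror it.
resp = requests.get(
    f"{INSTANCE_URL}/services/data/v59.0/ui-api/records/{RECORD_ID}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"fields": "Account.Name,Account.Industry"},
    timeout=30,
)
resp.raise_for_status()
record = resp.json()
print(record["fields"]["Name"]["value"])
```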

January 2, 2024

January 3, 2024

January 4, 2024

January 5, 2024

Video Series Library

I’m striving to continuously provide fresh Salesforce content; there are now 102 videos with loads of material on integration, data architecture, security, identity, Data Cloud, certifications, and the newly introduced AI Adventures series.