
JourneyToCTA Diaries Part 9 - Data Security

  • Shreyas Dhond
  • Mar 22
  • 6 min read


In the previous blog, we explored how Salesforce relationship types enable normalized data modelling and help structure data efficiently across objects.


Building on that discussion, we examined how different relationship types—Lookup, Master-Detail, One-to-One (via design patterns), Many-to-Many (junction objects), Hierarchical, and External relationships—support the separation of data into distinct entities while maintaining logical connections between them.


Each relationship type introduces varying levels of dependency, ownership, and integrity, allowing architects to model real-world business scenarios while reducing data redundancy. From loosely coupled lookup relationships to tightly bound master-detail structures, these patterns help enforce data integrity and scalability in different ways.


We also highlighted practical considerations such as implementing one-to-one relationships using automation and leveraging junction objects for many-to-many associations, both of which are common patterns in normalized Salesforce designs.


From an architectural standpoint, selecting the appropriate relationship type is critical to ensuring data integrity, minimizing data duplication, supporting scalable data models, and accurately representing real-world business relationships within the system.


In this blog, we will shift focus to security considerations and look at some common enterprise data security concepts. Knowledge of these concepts is an essential part of a CTA aspirant's toolkit for providing a well-rounded solution to a given scenario.


Data Security


In today’s enterprise landscape, where large volumes of data are continuously captured and processed across applications, data security becomes a critical architectural consideration. For a Salesforce architect—and especially a CTA aspirant—it is essential to understand key concepts such as encryption, data obfuscation, data archival (and restore), and data purging. In this blog, we will explore each of these areas in more detail and examine how they influence secure and compliant data architecture design.


Data Encryption


Data encryption can be applied at multiple stages throughout the data lifecycle, but from an architectural standpoint it is typically categorized into two primary types: encryption at rest and encryption in transit.


Encryption at Rest


Encryption at rest focuses on protecting data stored within the system. This is often a requirement in regulated industries such as healthcare and banking, particularly for data classifications like PII (Personally Identifiable Information) and PHI (Protected Health Information).


The primary objective is to ensure that even if an attacker gains access to the underlying database or storage layer, the data remains unreadable without the appropriate encryption keys. This significantly reduces the risk of data exposure in the event of a breach.


In Salesforce, encryption at rest is supported through Salesforce Shield Platform Encryption, which enables encryption across the database, file storage, and search indexes. This allows sensitive data to remain protected while still being usable within the application based on configured access controls.


Encryption in Transit


Encryption in transit protects data as it moves between systems, ensuring that it cannot be intercepted and read during transmission.


This is typically achieved using public-key cryptography, where data encrypted with a public key can only be decrypted by the holder of the corresponding private key. In practice, protocols such as TLS use public-key cryptography during the connection handshake to authenticate the parties and establish a shared symmetric session key, which then encrypts the rest of the traffic.

The most common implementation is HTTPS, which secures communication between clients (such as browsers) and servers using SSL/TLS certificates. This ensures that data exchanged over the network remains secure and protected against interception.

Encryption in transit is critical for preventing man-in-the-middle (MITM) attacks, where an attacker attempts to intercept and read data flowing between systems.


For higher levels of security, additional mechanisms such as Mutual TLS (mTLS) can be used to enforce two-way authentication, ensuring that both the client and server verify each other before establishing a connection. Salesforce also provides cryptographic capabilities through its Crypto library, enabling architects to implement custom encryption strategies for highly sensitive data exchange scenarios.
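To make the TLS and mTLS distinction concrete, here is a minimal sketch using Python's standard-library `ssl` module (a neutral stand-in—on the Salesforce platform this is handled by the platform and by features such as mutual TLS settings, not by hand-rolled code). The certificate file names in the comments are hypothetical:

```python
import ssl

# Standard TLS client: verify the server's certificate against trusted CAs
# and check the hostname -- this is what defeats man-in-the-middle attacks.
client_ctx = ssl.create_default_context()
client_ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Mutual TLS (mTLS) server: additionally require and verify a client
# certificate, so both sides authenticate before the connection is established.
mtls_ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
mtls_ctx.verify_mode = ssl.CERT_REQUIRED  # reject clients without a valid cert
# mtls_ctx.load_cert_chain("server.pem", "server.key")   # server's own identity (hypothetical paths)
# mtls_ctx.load_verify_locations("trusted_clients.pem")  # CA(s) that signed client certs
```

The key difference is `verify_mode` on the server side: plain TLS authenticates only the server, while mTLS forces the client to present a certificate as well.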


From an architectural perspective, both encryption at rest and in transit are essential components of a defense-in-depth strategy, ensuring data is protected both when stored and while being transmitted across systems.


Data Archival (and Restoration)


In a previous blog, we discussed the concept of reference and operational data. Over time, operational data naturally becomes stale—meaning it is no longer relevant to active business processes. In such cases, retaining this data within Salesforce is often not justified, especially for large data volume (LDV) objects, where storage limits and governor constraints must be carefully managed.

This is where a well-defined data archival strategy becomes essential. The goal is to move stale data out of Salesforce into a secure, cost-effective, and retrievable storage layer, while ensuring it remains accessible if needed for compliance, auditing, or historical analysis. Effective archival reduces storage consumption and helps maintain system performance without permanently losing valuable data.
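As a simple illustration of the selection step of such a strategy, the sketch below partitions records by a retention cutoff and serializes the stale ones for an external store. The record shape, `RETENTION` policy, and field names are hypothetical; in a real Salesforce org this would be driven by queries against LDV objects and an actual archival platform:

```python
import json
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # hypothetical retention policy

def archive_stale(records, now=None):
    """Split records into (kept, archived) based on a last-activity cutoff."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION
    kept, archived = [], []
    for rec in records:
        ts = datetime.fromisoformat(rec["last_activity"])
        (archived if ts < cutoff else kept).append(rec)
    return kept, archived

records = [
    {"id": "C-1", "last_activity": "2020-01-15T00:00:00+00:00"},  # stale
    {"id": "C-2", "last_activity": datetime.now(timezone.utc).isoformat()},
]
live, stale = archive_stale(records)
archive_payload = json.dumps(stale)  # stand-in for writing to the external archive store
```

Only after the archive write is confirmed would the stale records be deleted from the live system—ordering matters, or a failure mid-process loses data.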

However, data restoration introduces additional complexity. Restoring archived data is not simply about reloading records—it requires careful handling of:

  • Partial data restoration scenarios

  • Parent-child relationships

  • Referential integrity across objects

Ensuring that restored data maintains its original structure and relationships is a key architectural challenge.
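One reason this is hard: re-inserting archived records typically assigns new record IDs, so child records must be remapped to the new parent IDs, and parents must be restored before children. A toy sketch of that ordering and remapping (the `NEW-` ID format and record shapes are hypothetical):

```python
def restore(parents, children):
    """Restore parents first, then children, remapping parent references."""
    id_map = {}
    restored_parents = []
    for i, p in enumerate(parents):
        new_id = f"NEW-{i}"  # stand-in for the ID the platform assigns on insert
        id_map[p["id"]] = new_id
        restored_parents.append({**p, "id": new_id})

    restored_children = []
    for c in children:
        old_ref = c["parent_id"]
        if old_ref not in id_map:
            # Partial restoration scenario: the parent was not restored,
            # so inserting this child would break referential integrity.
            raise ValueError(f"orphaned child {c['id']}: parent {old_ref} not restored")
        restored_children.append({**c, "parent_id": id_map[old_ref]})
    return restored_parents, restored_children

restored_parents, restored_children = restore(
    [{"id": "P-1", "name": "Acme"}],
    [{"id": "C-1", "parent_id": "P-1"}],
)
```

The explicit orphan check is the interesting part: a partial restore must either pull in the missing parents or fail loudly, never silently insert dangling references.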

Salesforce previously offered a native data recovery service, but this has been deprecated. As a result, several AppExchange solutions such as Odaseva and OwnBackup have emerged to fill this gap, providing robust backup, archival, and recovery capabilities.

More recently, Salesforce Data Cloud Archive (Salesforce Archive) has been introduced as a native solution to support data archival and restoration use cases within the platform ecosystem.

For a CTA aspirant, it is important to have a strong understanding of:

  • Archival strategies (policies, retention rules, storage patterns)

  • Restoration challenges and limitations

  • Practical experience with at least one archival/backup solution

Archiving is not just about storage optimization—it is a critical component of data lifecycle management and platform scalability.


Data Obfuscation


Data obfuscation is the process of masking sensitive data so that it is not readable through the UI or APIs, while still allowing it to be used for underlying business processes. This is especially important when dealing with personally identifiable information (PII) or other sensitive data elements.


A common example is displaying only the last four digits of a credit card number. While the full value may still be accessible and usable at the system level, it is hidden from end users to reduce the risk of exposure.
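The last-four-digits pattern can be sketched in a few lines; this is a generic illustration (not a platform feature—in Salesforce the equivalent is typically achieved with field-level security, formula fields, or Shield masking):

```python
def mask_card(number: str, visible: int = 4) -> str:
    """Mask all but the last `visible` digits, preserving formatting characters."""
    remaining = sum(ch.isdigit() for ch in number)  # digits left to process
    out = []
    for ch in number:
        if ch.isdigit():
            out.append(ch if remaining <= visible else "*")
            remaining -= 1
        else:
            out.append(ch)  # keep spaces/dashes so the layout stays familiar
    return "".join(out)
```

For example, `mask_card("4111 1111 1111 1234")` yields `"**** **** **** 1234"`—the full value never reaches the presentation layer.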


There are two primary techniques used for data obfuscation:


  • Pseudonymization: Sensitive data is replaced with a substitute value, but it can still be re-identified if required using additional information or mapping. This approach is useful when data needs to retain some level of traceability for business or operational purposes.

  • Anonymization: Data is transformed in such a way that it cannot be re-identified. This is typically used for analytics or data sharing scenarios where privacy must be strictly preserved.
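The distinction can be shown with a minimal sketch: pseudonymization below uses a keyed hash, so the same input always maps to the same token and the key holder can maintain a re-identification mapping, while anonymization simply strips the identifying fields. The key, field names, and token length are all hypothetical:

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # hypothetical key held only by a secure service

def pseudonymize(email: str) -> str:
    """Keyed hash: deterministic, so records stay linkable across datasets,
    and the key holder can re-identify via a lookup table if required."""
    return hmac.new(SECRET_KEY, email.encode(), hashlib.sha256).hexdigest()[:16]

def anonymize(record: dict) -> dict:
    """Drop direct identifiers entirely -- no path back to the individual."""
    return {k: v for k, v in record.items() if k not in {"email", "name", "ssn"}}
```

Note the trade-off: pseudonymized data remains useful for joins and analytics across records, while anonymized data sacrifices that linkability for stronger privacy guarantees.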


Another related concept is tokenization. While not strictly a form of obfuscation, it is often used alongside it. In tokenization, sensitive data is replaced with a token that has no inherent meaning to users. The token acts as a reference that can be used by a secure system to retrieve the original data when needed.
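A toy in-memory token vault illustrates the idea—the token is random, carries no meaning on its own, and only the vault can map it back (real tokenization services add persistence, access control, and auditing; the `tok_` prefix is a made-up convention):

```python
import secrets

class TokenVault:
    """Minimal token vault: replaces a sensitive value with an opaque token."""

    def __init__(self):
        self._store = {}  # token -> original value, held only inside the vault

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, meaningless to users
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Only a caller with access to the vault can recover the original."""
        return self._store[token]
```

Downstream systems can pass the token around freely; a breach of those systems exposes only opaque references, not the sensitive values themselves.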


From an architectural perspective, selecting the right technique depends on the sensitivity of the data, regulatory requirements, and how the data needs to be used. Proper implementation of obfuscation strategies is critical for ensuring data privacy while still enabling business functionality.


Data Purging


Data that is no longer relevant to a business process—or data that must be removed due to customer requests or regulatory requirements—needs to be handled carefully within the data lifecycle.


It is important to distinguish between data deletion and data purging, as they are not the same.


In Salesforce, data deletion is typically a soft delete: records are moved to the Recycle Bin, where they can still be recovered for a limited period (15 days by default). While this provides a safety net against accidental deletion, it does not fully eliminate the data from the system.


Data purging, on the other hand, is a permanent and irreversible operation. Once data is purged, it cannot be recovered. This is especially critical for handling highly sensitive data or meeting strict regulatory requirements such as data privacy laws and compliance mandates.
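The contrast can be sketched with a toy record store (a generic illustration of the lifecycle, not the platform's implementation—in Salesforce this maps to `delete`/`undelete` versus emptying the Recycle Bin or hard-delete APIs):

```python
class RecordStore:
    """Toy store contrasting soft delete (recycle bin) with a hard purge."""

    def __init__(self):
        self.live = {}
        self.recycle_bin = {}

    def delete(self, rec_id):
        """Soft delete: record moves to the recycle bin and stays recoverable."""
        self.recycle_bin[rec_id] = self.live.pop(rec_id)

    def undelete(self, rec_id):
        """Recover a soft-deleted record back into the live set."""
        self.live[rec_id] = self.recycle_bin.pop(rec_id)

    def purge(self, rec_id):
        """Hard purge: remove the record everywhere -- no recovery path."""
        self.recycle_bin.pop(rec_id, None)
        self.live.pop(rec_id, None)
```

After `purge`, there is no structure left from which to restore the record—which is exactly what regulations such as right-to-erasure requests demand, and why purging must be gated far more carefully than deletion.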


From an architectural perspective, understanding when to use deletion versus purging is essential for implementing proper data lifecycle management. For a CTA aspirant, having a clear grasp of these concepts ensures that solutions are not only scalable and performant, but also compliant and secure.


Conclusion


Data security is a critical pillar of enterprise architecture. From encryption and obfuscation to archival and purging, each technique plays a key role in protecting sensitive data while supporting scalability and compliance.


For a Salesforce architect, understanding these concepts is not just about implementation—it's about making the right trade-offs across the data lifecycle. Strong designs ensure that data is secure, governed, and managed efficiently, without compromising performance or business needs. In the next blog, we will go over data regulations and the policies that govern them, and how the security concepts we have learnt in this blog apply.






Copyright © 2024 SFDCShred
