Data Pillars and General Principles
The following pillars are defined to address current issues and challenges, and serve as a guide for the use and manipulation of data in Studio.
Clarity, organization and documentation: In Studio, tokens are identified numerically, which makes them difficult to understand directly. Therefore, it is essential to establish and document a clear structure that defines the purpose, scope, and use of each token. This allows other users to understand, maintain, and evolve the solution without relying on implicit knowledge.
Security / Privacy: Any solution must consider the proper handling of sensitive, personal, or financial data. This involves complying with applicable regulations (such as data protection or banking regulations), defining which tokens can be persisted, how data is encrypted in transit or at rest, and ensuring that critical information is not exposed beyond the necessary scope.
Performance and efficiency: Redundant data should be avoided, unnecessary token creation minimized, and array size and usage limited. It is also important to consider what data should be saved, for how long, and at what point in the process, so as not to affect application performance.
Interoperability: In Studio, data flows between processes, screens, modules, and external entities. In some cases it is necessary to establish well-defined interfaces for the exchange of information. Designing clear input/output contracts allows components to be reused without creating coupling. This practice is key when integrating functional flows or consuming external services, as it improves traceability, facilitates maintenance, and promotes a more robust architecture.
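Studio defines these contracts through its own tooling, but the idea of a clear input/output contract can be sketched in plain Python. The type names and fields below are illustrative, not part of Studio:

```python
from dataclasses import dataclass

# Hypothetical contract for a reusable balance-inquiry flow.
# Consumers depend only on these shapes, not on the flow's internals.
@dataclass(frozen=True)
class BalanceRequest:
    account_id: str
    currency: str

@dataclass(frozen=True)
class BalanceResponse:
    available: float
    currency: str

def get_balance(req: BalanceRequest) -> BalanceResponse:
    # Internal resolution can change freely as long as the
    # request/response contract stays stable.
    return BalanceResponse(available=100.0, currency=req.currency)
```

Because the contract is explicit, the flow can be swapped or versioned without touching its consumers, which is what makes reuse without coupling possible.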
General Principles
It is also important to establish a set of general principles to guide data management. Data design principles in solution development (not Studio-specific) seek to ensure that data is managed securely, efficiently, traceably, and maintainably. Below is a list of the most recognized and applicable principles.
Minimization Principle: Collect, store, and process only the data strictly necessary to meet the system's objectives. This reduces the attack surface, helps comply with regulations (such as GDPR or local data protection laws), reduces noise in the logic, improves performance, and avoids unnecessary duplication of data.
In Studio
Avoid storing sensitive or large amounts of data if it is not being used.
Limit each token's scope to the smallest necessary (local vs. global).
Clean up unused tokens.
Avoid saving transformed data to new tokens or columns.
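The minimization idea can be illustrated outside Studio with a small Python sketch: keep only the fields a flow actually uses instead of carrying the full upstream record forward. The field names are illustrative:

```python
# Full record as received from an upstream service (illustrative).
customer_record = {
    "name": "Ada",
    "account_id": "A-1",
    "card_number": "4111111111111111",  # sensitive and unused downstream
    "last_login": "2024-01-01",         # unused downstream
}

# Only the fields the next step needs.
NEEDED_FIELDS = {"name", "account_id"}

minimized = {k: v for k, v in customer_record.items() if k in NEEDED_FIELDS}
# Sensitive and unused fields never reach the rest of the flow.
```

The same reasoning applies to tokens: if a value is not read later in the flow, it should not be stored at all.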
Single Source of Truth (SSOT): Each piece of data must have a single, reliable and up-to-date source, avoiding duplication or inconsistencies. This allows data to be synchronized between entities securely, improves system integrity, and facilitates debugging and maintenance.
In Studio
Use persistent tokens or aliases wisely to maintain consistency.
Do not redefine the same value in different places in the logic.
Avoid duplicating data between screens and processes if it is already in a token.
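A minimal Python sketch of the SSOT idea, with illustrative names: the canonical value lives in exactly one place, and every reader derives from it instead of keeping a copy that can drift out of date.

```python
# Single source of truth: one canonical store (illustrative structure).
state = {"exchange_rate": 1.25}

def price_in_usd(eur: float) -> float:
    # Always read the canonical value; never cache a private copy.
    return eur * state["exchange_rate"]

# One update, and every reader immediately sees the new value.
state["exchange_rate"] = 1.5
```

In Studio terms, this is what a single well-scoped token achieves: screens and processes read the token rather than redefining the value locally.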
Separation of responsibilities: Separate data according to its function (user input, configuration data, temporary data, intermediate states, etc.). This improves readability and traceability, helps identify which data can be persisted or shared, and reduces coupling between components.
In Studio
Use different tokens for user input, transaction results, and temporary data (the latter only if necessary).
Use Lambda variables for internal calculations.
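The separation above can be sketched in Python by giving each responsibility its own container rather than mixing everything in one store. The structure and names are illustrative, not a Studio API:

```python
from dataclasses import dataclass, field

@dataclass
class FlowData:
    user_input: dict = field(default_factory=dict)  # what the user entered
    results: dict = field(default_factory=dict)     # transaction outcomes
    temp: dict = field(default_factory=dict)        # short-lived scratch data

flow = FlowData()
flow.user_input["amount"] = "150.00"
flow.temp["parsed_amount"] = float(flow.user_input["amount"])
flow.results["status"] = "approved"
flow.temp.clear()  # temporary data does not outlive the step
```

Keeping the categories separate makes it obvious which data may be persisted (results), which must never be (temp), and which needs validation (user input).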
Principle of least privilege: Data should be accessible only in the contexts where it is truly needed. This minimizes the risk of sensitive information being leaked, improves the design of interfaces and contracts, and aligns with security practices such as zero trust.
In Studio
Do not use global tokens unnecessarily.
Avoid the use of tokens that can be modified from anywhere without control.
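A Python sketch of the same idea, with illustrative names: instead of handing a component the whole mutable store, expose only a narrow, read-only view of the keys it needs.

```python
from types import MappingProxyType
from collections.abc import Mapping

# Full session store (illustrative); contains sensitive entries.
_session = {"user_id": "u-42", "pin_hash": "<hashed>", "theme": "dark"}

def view_for_screen(allowed: set) -> Mapping:
    # The screen gets only the allowed keys, as a read-only mapping,
    # so it can neither see sensitive data nor write back.
    return MappingProxyType({k: _session[k] for k in allowed})

screen_data = view_for_screen({"theme"})
```

This mirrors the token advice: a global, freely writable token is the equivalent of passing `_session` around directly.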
Audit and traceability: Critical data must be auditable: it should be possible to know when it was modified, why, and by whom (or by what logic). This is key for financial, legal, or sensitive systems. It enables compliance with regulatory or monitoring requirements and helps detect errors or fraud.
In Studio
Use DevTools to understand how data flows.
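The who/when/why of an audited change can be sketched in a few lines of Python. This is a minimal illustration of the principle, not how Studio records changes internally:

```python
import datetime

audit_log = []  # append-only record of changes to critical data

def set_audited(store: dict, key: str, value, actor: str):
    # Record old and new values, the actor, and a UTC timestamp
    # before applying the change.
    audit_log.append({
        "key": key,
        "old": store.get(key),
        "new": value,
        "actor": actor,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    store[key] = value

account = {}
set_audited(account, "limit", 5000, actor="risk-rule-7")
```

The essential property is that every change to critical data passes through one audited path, so the log is complete by construction.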
Controlled evolution of the schema: The data model should allow adaptations without breaking existing components. This avoids forced migrations or errors caused by unexpected changes, and allows for safe versioning and testing. It applies to the design of APIs, internal structures, or screen models.
In Studio
Be careful when changing array structures or contracts.
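One common way to evolve a structure without breaking existing consumers is to make new fields optional with a default, so readers tolerate both the old and the new shape. A Python sketch with illustrative names:

```python
def read_customer(record: dict) -> dict:
    # "name" existed in the original schema; "segment" was added later.
    # Giving the new field a default keeps old records parseable.
    return {
        "name": record["name"],
        "segment": record.get("segment", "standard"),
    }

old_record = read_customer({"name": "Ada"})
new_record = read_customer({"name": "Grace", "segment": "premium"})
```

The same caution applies to Studio arrays and contracts: additive, defaulted changes are safe; renaming or removing fields breaks every component that reads the old structure.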