EU LDT Toolbox glossary
A
A user role with permissions to manage various aspects of the platform, including participatory processes, user data, and system settings.
The primary (and only) user persona responsible for creating schemas, exporting and importing them, and generating Synthetic Data.
A tool within the EU LDT Toolbox that manages the execution of AI models and algorithms for digital twins. It provides AI model information and triggers AI-driven simulations.
A platform used to programmatically author, schedule, and monitor workflows within the tool.
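This description matches Apache Airflow's own summary. Assuming that platform, a minimal sketch of a programmatically authored, scheduled workflow might look as follows; the DAG id, schedule, and task body are illustrative placeholders.

```python
# A minimal sketch of a scheduled workflow, assuming Apache Airflow.
# The DAG id, schedule, and task body are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_dataset():
    # Placeholder step; a real task would call an ingestion pipeline.
    print("ingesting dataset")


with DAG(
    dag_id="ldt_daily_ingest",        # hypothetical workflow name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # run once per day
    catchup=False,                    # do not backfill missed runs
) as dag:
    PythonOperator(task_id="ingest", python_callable=ingest_dataset)
```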
An open-source distributed event streaming platform used as a Message Broker for high-performance data pipelines.
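This is Apache Kafka's standard description. As a hedged sketch of how a pipeline component might publish an event to a Kafka topic, using the third-party kafka-python client; the broker address, topic name, and payload are assumptions:

```python
# A minimal sketch of publishing an event to a Kafka topic with the
# kafka-python client; broker address and topic name are assumptions.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("sensor-readings", {"sensor_id": "s-42", "value": 21.5})
producer.flush()  # block until the message is actually delivered
```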
A system for processing and distributing data between applications, used as a Data Flow Manager and Harmonisation technology.
A set of defined rules and specifications that allows different software applications to communicate with each other. APIs are used for configuration and integration in the EU LDT Data Space Ready deployment.
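As a small illustration of one application communicating with another through an HTTP API, a hedged sketch using Python's requests library; the endpoint URL is a hypothetical placeholder:

```python
# A minimal sketch of calling an HTTP API from another application;
# the endpoint URL is an illustrative placeholder.
import requests

resp = requests.get(
    "https://example.org/api/v1/datasets",  # hypothetical endpoint
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()  # fail loudly on a non-2xx response
print(resp.json())
```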
Serves as the primary entry point for all communications between the frontend and backend services of the EU LDT Use Cases & Scenarios tool. It manages secure API communications and user authentication, implements role-based access control, handles rate limiting and load balancing, and routes event-driven communications.
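Of the responsibilities listed above, rate limiting is easy to illustrate. A minimal sketch of a per-client token bucket, not the tool's actual implementation; the limits and client id are illustrative assumptions:

```python
# A minimal token-bucket rate limiter, as an API gateway might apply
# per client; rate, capacity, and client ids are illustrative.
import time
from collections import defaultdict


class TokenBucket:
    def __init__(self, rate_per_sec: float, capacity: int):
        self.rate = rate_per_sec
        self.capacity = capacity
        self.tokens = defaultdict(lambda: capacity)   # start full
        self.stamp = defaultdict(time.monotonic)      # last refill time

    def allow(self, client_id: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.stamp[client_id]
        self.stamp[client_id] = now
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens[client_id] = min(
            self.capacity, self.tokens[client_id] + elapsed * self.rate
        )
        if self.tokens[client_id] >= 1:
            self.tokens[client_id] -= 1
            return True
        return False


limiter = TokenBucket(rate_per_sec=5, capacity=10)
print(limiter.allow("client-a"))  # True until the bucket is drained
```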
Software tools that users can access and interact with based on their roles and permissions through a unified interface.
A KPI that is defined in the system but remains inactive until the organisation reaches a specific required maturity threshold. The system maintains a registry of these KPIs, automatically activating them as the organisation progresses through maturity levels, ensuring strategic alignment is maintained while respecting implementation readiness.
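A minimal sketch of such a registry, assuming a simple integer maturity scale; the class and field names are illustrative, not the system's actual data model:

```python
# A minimal sketch of a registry that keeps KPIs dormant until the
# organisation reaches the required maturity level; names and the
# integer maturity scale are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class KPI:
    name: str
    required_maturity: int
    active: bool = False


class KPIRegistry:
    def __init__(self):
        self.kpis: list[KPI] = []

    def register(self, kpi: KPI):
        self.kpis.append(kpi)

    def on_maturity_change(self, level: int):
        # Activate every dormant KPI whose threshold is now met.
        for kpi in self.kpis:
            if not kpi.active and level >= kpi.required_maturity:
                kpi.active = True


registry = KPIRegistry()
registry.register(KPI("citizen-engagement-rate", required_maturity=3))
registry.on_maturity_change(3)  # KPI becomes active at maturity level 3
```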
A digital resource in the Integrated Environment, such as a dataset, metadata, model, document, or configuration. It is used to organise and manage important resources, support data governance and auditing, and enable integration across tools and processes.
A set of rules and policies that define how data assets are managed, shared, and protected within the system to ensure compliance and quality.
A central catalogue where all assets (such as datasets, APIs, or applications) are registered, described, and managed.
The connections between data assets that show how information flows from one source to another, helping understand dependencies and data origins.
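Lineage is naturally modelled as a directed graph from each asset to its sources. A hedged sketch with placeholder asset names:

```python
# A minimal sketch of data lineage as a directed graph of asset
# dependencies; asset names are illustrative placeholders.
LINEAGE = {
    # asset             -> assets it is derived from
    "traffic-dashboard": ["traffic-aggregates"],
    "traffic-aggregates": ["raw-sensor-feed"],
    "raw-sensor-feed": [],
}


def upstream(asset: str) -> set[str]:
    """Return every source the given asset ultimately depends on."""
    sources = set()
    for parent in LINEAGE.get(asset, []):
        sources.add(parent)
        sources |= upstream(parent)
    return sources


print(upstream("traffic-dashboard"))
# {'traffic-aggregates', 'raw-sensor-feed'}
```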
A comprehensive, immutable record of state-changing operations performed within the system. The minimum set of audit log information includes the description of the event, the audit date, the author (user), details of the change, and the operation type (create, update, delete). These logs ensure traceability and compliance with governance standards.
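A minimal sketch of a record carrying the minimum fields listed above; the class and field names are assumptions for illustration, and frozen=True approximates the entry's immutability:

```python
# A minimal sketch of an audit log record with the minimum fields the
# definition lists; names are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class OperationType(Enum):
    CREATE = "create"
    UPDATE = "update"
    DELETE = "delete"


@dataclass(frozen=True)  # frozen: records are immutable once written
class AuditLogEntry:
    description: str          # description of the event
    audit_date: datetime      # when the operation happened
    author: str               # user who performed it
    change_details: str       # details of the change
    operation: OperationType  # create, update, or delete


entry = AuditLogEntry(
    description="Dataset schema updated",
    audit_date=datetime.now(timezone.utc),
    author="admin@example.org",
    change_details="added column 'population_density'",
    operation=OperationType.UPDATE,
)
```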
The configurable policy determining how long audit logs are stored in the system before deletion. The system implements automated data retention and cleanup processes to maintain system efficiency and data privacy compliance while preserving essential historical data.
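Building on the AuditLogEntry sketch above, a hedged sketch of an automated retention sweep; the 365-day default is an assumption, not the system's configured value:

```python
# A minimal retention sweep that drops audit entries older than a
# configurable window; the 365-day default is an assumption.
from datetime import datetime, timedelta, timezone


def purge_expired(entries, retention_days: int = 365):
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    # Keep only entries newer than the cutoff; a real system would also
    # archive essential historical data before deletion.
    return [e for e in entries if e.audit_date >= cutoff]
```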
A chronological set of records that provide evidence of data access and processing activities for accountability and compliance.
Overlays digital information or 3D models onto the real-world environment through devices like smartphones, tablets, or AR glasses, enhancing the user's perception of their surroundings by blending physical and digital information.
The process of determining whether an authenticated user, service, or application has the necessary permissions to perform a specific action or access a particular resource.
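A minimal sketch of a role-based authorisation check of this kind; the role-to-permission mapping is an illustrative assumption:

```python
# A minimal role-based authorisation check; the role-to-permission
# mapping is an illustrative assumption.
ROLE_PERMISSIONS = {
    "administrator": {"schema:create", "schema:export", "user:manage"},
    "viewer": {"schema:read"},
}


def is_authorized(role: str, permission: str) -> bool:
    # Unknown roles get an empty permission set, so access is denied.
    return permission in ROLE_PERMISSIONS.get(role, set())


assert is_authorized("administrator", "schema:create")
assert not is_authorized("viewer", "schema:create")
```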
A feature that allows the system to connect to an existing external database, analyse its structure (tables, columns, types), and automatically generate a corresponding schema definition within the tool.
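A hedged sketch of the introspection step using SQLAlchemy's inspector; this is an assumption for illustration, not the tool's actual mechanism, and the connection URL is a placeholder:

```python
# A minimal sketch of introspecting an existing database and emitting
# a simple schema description with SQLAlchemy's inspector.
from sqlalchemy import create_engine, inspect

engine = create_engine("postgresql://user:pass@localhost/citydb")  # placeholder URL
inspector = inspect(engine)

schema = {}
for table in inspector.get_table_names():
    # Record each column's name and SQL type for the generated definition.
    schema[table] = {
        col["name"]: str(col["type"]) for col in inspector.get_columns(table)
    }
print(schema)
```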
A mechanism that automatically adjusts the number of resources (e.g., containers or pods) based on traffic or workload demands.
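A minimal sketch of the target-utilisation rule such a mechanism commonly applies; the formula mirrors the one used by Kubernetes' horizontal pod autoscaler, and the thresholds are illustrative assumptions:

```python
# A minimal sketch of a horizontal scaling rule: desired replicas grow
# with observed load relative to a target. Thresholds are illustrative.
import math


def desired_replicas(current: int, current_load: float, target_load: float,
                     min_r: int = 1, max_r: int = 10) -> int:
    # desired = ceil(current * currentLoad / targetLoad), clamped to bounds.
    raw = current * (current_load / target_load)
    return max(min_r, min(max_r, math.ceil(raw)))


# 4 pods at 90% CPU with a 60% target scale up to 6 pods.
print(desired_replicas(current=4, current_load=0.9, target_load=0.6))
```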