Tech Maven Geospatial has worked on several projects and conducted R&D into Data Mesh.
We also use Trino, a distributed SQL engine for federated queries: https://trino.io/ecosystem/data-source
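In a data mesh, a federated engine like Trino can join data products that live in different domain-owned stores without moving the data. Below is a minimal sketch using Trino's Python client (`pip install trino`); the coordinator host, the catalogs (`postgresql`, `hive`), and the table names are illustrative assumptions, not real endpoints:

```python
# Cross-catalog SQL that Trino can execute against two domain-owned stores.
# The catalog/schema/table names below are illustrative assumptions.
FEDERATED_QUERY = """
SELECT o.order_id, o.total, c.region
FROM postgresql.sales.orders AS o          -- operational store of the sales domain
JOIN hive.customers.customer_profiles AS c -- lakehouse table of the customer domain
  ON o.customer_id = c.customer_id
WHERE o.order_date >= DATE '2024-01-01'
"""

def run_query(sql: str):
    """Execute a query against a Trino coordinator (assumed at localhost:8080)."""
    from trino.dbapi import connect  # requires `pip install trino`
    conn = connect(host="localhost", port=8080, user="analyst")
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchall()

if __name__ == "__main__":
    for row in run_query(FEDERATED_QUERY):
        print(row)
```

Because each domain keeps serving its data from its own store, a query like this is one way the "data at the point of origin" idea becomes practical for analysts.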
Data mesh is about moving analytical data away from a monolithic data warehouse or data lake into a distributed architecture, allowing data to be shared for analytical purposes right at the point of origin. The concept was introduced by Zhamak Dehghani. If you’d like to learn more about data mesh and how to build one, these resources may help:
Placing Apache Kafka at the Heart of a Data Revolution at Saxo Bank [Podcast]
Data mesh is an architectural pattern for large-scale data management that aims to decentralize data ownership and decision-making. The key principles of data mesh are:

– Domain-oriented decentralization – Data is organized around business domains rather than technical systems. Each domain owns its data and makes decisions about it.
– Data as a product – Data is treated as a product with its own lifecycle and workflows for discovery, access, and understanding. Data products are the primary way data is exposed.
– Self-serve data infrastructure – Common data infrastructure services such as storage, processing, and governance are provided as a shared platform that domains can use on their own.
– Federated computational governance – Technical governance is decentralized so domains can choose their own technologies, while aligning on concerns like security and accessibility.
– Discoverability – Data and its meaning are organized and documented so they can easily be found and understood. Common metadata standards aid discovery.

The goal is to shift away from centralized, monolithic data lakes towards decentralized, autonomous data management aligned with business domains. This allows faster innovation and scaling in large, complex organizations.
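The "data as a product" and discoverability principles above can be made concrete with a small catalog: each domain publishes a descriptor (name, owner, schema contract, tags), and consumers search the catalog's metadata instead of spelunking a central lake. Here is a minimal sketch in Python; the descriptor fields and registry API are illustrative assumptions, not a standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataProduct:
    """Descriptor a domain team publishes for one of its data products."""
    name: str            # e.g. "sales.orders" (hypothetical)
    domain: str          # owning business domain
    owner: str           # accountable team or contact
    description: str
    schema: dict         # column name -> type: the product's contract
    tags: tuple = ()     # free-form labels that aid discovery

class Catalog:
    """Central *index* of decentralized products: metadata only, no data."""
    def __init__(self):
        self._products = {}

    def register(self, product: DataProduct) -> None:
        self._products[product.name] = product

    def find(self, tag: str) -> list:
        return [p for p in self._products.values() if tag in p.tags]

# A domain team registers its product; a consumer discovers it by tag.
catalog = Catalog()
catalog.register(DataProduct(
    name="sales.orders",
    domain="sales",
    owner="sales-data-team@example.com",
    description="All confirmed orders, updated hourly.",
    schema={"order_id": "bigint", "total": "decimal(10,2)"},
    tags=("orders", "revenue"),
))
print([p.name for p in catalog.find("revenue")])  # -> ['sales.orders']
```

The design choice worth noting is that the catalog stores only metadata: ownership and the data itself stay with the domain, which is exactly the decentralization the pattern calls for.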