This theme addresses the challenge of developing methodologies and tools for data pipelines that cross organisational and technological boundaries, enabling efficient, understandable, controllable and auditable large-scale data sharing. Specifically:
What data sharing standards, frameworks, methodologies and tools are required for fast, easy-to-use data integration, provenance tracking, data exploration and data risk assessment?
How can scalable multimodal entity linking algorithms support knowledge-based data integration in open environments through progressive approaches?
How can data access methods support digital fingerprinting based on well-defined security and privacy protocols?
How can easy-to-use, efficient data testbeds be built to support scenario-based, policy-aware risk management for data sharing, together with data-level and workflow-level provenance tracking and visualisation support?
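To make the progressive entity linking question concrete, one family of approaches resolves the most promising cross-source candidate pairs first under a fixed comparison budget (a "pay-as-you-go" strategy). The sketch below is illustrative only: the similarity measure, threshold and budget are assumptions, and a real system would use learned multimodal similarities rather than string matching.

```python
from difflib import SequenceMatcher
from itertools import product

def progressive_link(source, target, budget, threshold=0.7):
    """Progressive (pay-as-you-go) entity linking sketch: score all
    cross-source candidate pairs with a cheap similarity, then resolve
    the most promising pairs first, within a fixed comparison budget.
    Similarity measure, threshold and budget are illustrative choices."""
    score = lambda a, b: SequenceMatcher(None, a, b).ratio()
    pairs = sorted(product(source, target),
                   key=lambda p: score(*p), reverse=True)
    # Only the top `budget` pairs are examined, so useful links surface
    # early even if the full comparison space is never exhausted.
    return [(a, b) for a, b in pairs[:budget] if score(a, b) >= threshold]

source = ["ACME Corp", "Widget Ltd"]
target = ["ACME Corporation", "Widgets Limited"]
links = progressive_link(source, target, budget=4)
```

In an open environment the budget caps cost per integration round, and later rounds can refine earlier links as more evidence arrives.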
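One common building block behind digital fingerprints for shared data is a cryptographic digest of a canonical serialisation, optionally keyed per recipient so that shared copies can be audited. A minimal sketch, assuming JSON-serialisable records (the record layout and function names are hypothetical):

```python
import hashlib
import hmac
import json

def _canonical(records) -> bytes:
    # Canonical serialisation: key order and separators are fixed so
    # the same content always yields the same bytes.
    return json.dumps(records, sort_keys=True, separators=(",", ":")).encode("utf-8")

def dataset_fingerprint(records) -> str:
    """Content fingerprint: SHA-256 over the canonical serialisation,
    so any modification to the data changes the digest."""
    return hashlib.sha256(_canonical(records)).hexdigest()

def recipient_fingerprint(records, recipient_key: bytes) -> str:
    """Keyed (HMAC-SHA-256) fingerprint: binds the same content to a
    specific recipient, supporting per-recipient audit of shared copies."""
    return hmac.new(recipient_key, _canonical(records), hashlib.sha256).hexdigest()

records = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
fp = dataset_fingerprint(records)
assert fp == dataset_fingerprint(list(records))               # deterministic
assert fp != dataset_fingerprint(records + [{"id": 3}])       # tamper-evident
```

A data access method could return such a fingerprint alongside each response, letting consumers verify integrity without re-fetching the data.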
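Data-level and workflow-level provenance tracking could, under one common design, be recorded as an append-only log of activity records that link output artefacts to their inputs via content hashes. The sketch below is a simplified assumption loosely inspired by the W3C PROV model; the class and field names are hypothetical:

```python
import hashlib
import json
from datetime import datetime, timezone

class ProvenanceLog:
    """Append-only provenance log. Each entry records one workflow
    step (activity) with hashes of its input and output data, so
    lineage can be replayed or audited later."""

    def __init__(self):
        self.entries = []

    @staticmethod
    def _hash(data) -> str:
        return hashlib.sha256(json.dumps(data, sort_keys=True).encode("utf-8")).hexdigest()

    def record(self, activity, inputs, outputs, agent):
        entry = {
            "activity": activity,                       # workflow-level: what was done
            "agent": agent,                             # who or what performed it
            "inputs": [self._hash(d) for d in inputs],  # data-level lineage by hash
            "outputs": [self._hash(d) for d in outputs],
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        self.entries.append(entry)
        return entry

    def lineage_of(self, data):
        """All recorded activities that produced the given data."""
        h = self._hash(data)
        return [e for e in self.entries if h in e["outputs"]]

log = ProvenanceLog()
raw = [{"id": 1, "v": 10}]
clean = [{"id": 1, "v": 10.0}]
log.record("ingest", inputs=[], outputs=[raw], agent="pipeline/ingest")
log.record("normalise", inputs=[raw], outputs=[clean], agent="pipeline/clean")
```

Because entries reference data only by hash, the same log supports auditing across organisational boundaries without exposing the underlying records, and it is a natural substrate for the visualisation support the question asks for.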
The outcomes of this theme will be used to advocate for and influence effective data sharing and data reuse policies and regulations.