Conveners
IBERGRID: Digital Twins for GEOphysical extremes
- Isabel Campos (CSIC)
IBERGRID: Special Topics
- Jorge Gomes (LIP)
IBERGRID: Contributions
- Mário David (LIP)
IBERGRID: Special Topics
- Ignacio Blanquer Espert (Universitat Politècnica de València)
IBERGRID: Contributions
- João Pina (LIP)
IBERGRID: Contributions
- Javier Cacheiro (CESGA)
IBERGRID: Contributions
- Germán Moltó (Universitat Politècnica de València)
IBERGRID: Special Topics
- Davide Salomoni (Fondazione ICSC)
IBERGRID: Datacenters and Infrastructures
- Zacarias Benta (LIP - Minho)
IBERGRID: Conference Opening
IBERGRID: Closing
The DT-GEO project (2022-2025), funded under the Horizon Europe topic call INFRA-2021-TECH-01-01, is implementing an interdisciplinary digital twin for modelling and simulating geophysical extremes at the service of research infrastructures and related communities. The digital twin consists of interrelated Digital Twin Components (DTCs) dealing with geohazards from earthquakes to volcanoes to...
Today's computational capabilities and the availability of large data volumes are enabling the development of Digital Twins able to provide unrivaled precision. Geophysics is one field that benefits from the ability to simulate the evolution of multi-physics natural systems across a wide range of spatio-temporal scales. This is also possible thanks to access to HPC systems and programming frameworks...
FAIRness is an important quality of all kinds of data and metadata, with each letter of the FAIR acronym standing for a different criterion: Findability, Accessibility, Interoperability and Reusability.
FAIR-EVA is a tool that reports the FAIRness level of digital objects from different repositories or data portals, implemented via different plugins, allowing the user to improve the maturity...
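The core of any FAIR evaluator is turning many per-principle checks into a maturity score. A minimal sketch of that aggregation step, with illustrative check results and weighting that are assumptions for this example, not FAIR-EVA's actual API:

```python
# Hypothetical sketch of how a FAIR evaluator might aggregate
# per-principle checks into a maturity score; names and weighting
# are illustrative, not FAIR-EVA's real interface.

def fair_score(checks: dict) -> dict:
    """Average the boolean test results under each FAIR principle (0-100)."""
    return {principle: round(100 * sum(results) / len(results), 1)
            for principle, results in checks.items()}

# Example: a digital object whose metadata passes most checks.
checks = {
    "Findability":      [True, True, True, False],  # e.g. PID, rich metadata
    "Accessibility":    [True, True],               # e.g. open protocol
    "Interoperability": [True, False],              # e.g. standard vocabularies
    "Reusability":      [True, True, False],        # e.g. licence, provenance
}
scores = fair_score(checks)
```

A per-principle breakdown like this is what lets a user see which aspect of maturity to improve first.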
DT-GEO aims to provide digital twins of the earth system to mimic different system components and provide analysis, forecasts and what-if scenarios for geophysical extremes, enabling a deeper insight into these events. Addressing the complexity of mimicking the earth system, as well as the multitude of software codes required to realise the DT-GEO vision, demands a modular architecture, where...
The data management team in the DT-GEO project is following a novel approach to the characterization of the Digital Twins for geophysical extremes being developed in the project. The approach relies on the use of rich metadata to describe the components of the Digital Twins (DTCs): (i) the digital assets (DAs), namely datasets, software-services, workflows and steps within those workflows,...
Portugal and Spain share a common history of suffering from natural hazards. In 1755, we experienced the largest natural disaster ever to occur in Europe in historical times. This event included an earthquake and the second most deadly tsunami in our records, just behind the Sumatra catastrophe in 2004. The DT-GEO project (A Digital Twin for GEOphysical extremes) aims at analysing and forecasting...
We will summarise the status of the infrastructure and the research and development activities taking place under the umbrella of IBERGRID.
This presentation will provide a summary and details about the history of IBERGRID in the EGI Federation, showing the complex provider, user and innovator relationships that exist between Spain, Portugal and the rest of EGI. The talk aims to socialise and promote the benefits and achievements of the IBERGRID - EGI partnership.
Spain and Portugal have been significant contributors to the EGI...
Kubernetes (K8s) is the industry-leading container orchestration platform, offering organizations enhanced reliability, availability, and resilience for their services compared to monolithic architectures. Also, by adopting a declarative paradigm, K8s simplifies the management of multiple complex environments. When integrated with tools such as ArgoCD and GitLab CI pipelines, K8s also makes it...
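The declarative paradigm mentioned above boils down to a reconciliation loop: the platform diffs the desired state (manifests, or a Git repository in the GitOps model ArgoCD implements) against the actual cluster state and derives the actions needed to converge them. A minimal sketch of that idea, with invented resource names and no real K8s client API:

```python
# Minimal sketch of declarative reconciliation, the core idea behind
# Kubernetes and GitOps tools such as ArgoCD. All names here are
# illustrative; this is not a real K8s client API.

def reconcile(desired: dict, actual: dict) -> list:
    """Return the create/update/delete actions that converge actual to desired."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

desired = {"web": {"replicas": 3}, "db": {"replicas": 1}}
actual  = {"web": {"replicas": 2}, "cache": {"replicas": 1}}
plan = reconcile(desired, actual)   # update web, create db, delete cache
```

Because the operator only ever states the desired end state, the same loop handles drift, rollbacks and new environments uniformly.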
The third National Tripartite Event (NTE) organised by Spain took place on 24 September 2024. The event covered three blocks: the EOSC governance, new INFRAEOSC projects with Spanish partners, and updates on the EOSC Federation.
It covered the latest updates on EOSC governance, including the new task forces, the consultation processes on the Strategic Research and Innovation Agenda...
In the context of the CHAIMELEON project (https://chaimeleon.eu/) we have developed a secure processing environment to manage medical imaging data and their associated clinical data enabling researchers to share, publish, process and trace datasets in virtual environments, powered by intensive computing resources.
The environment is built on top of a Kubernetes cluster and leverages native...
Over the last two years, the Galician Marine Sciences Program (CCMM) has developed a Data Lake to support the collection and analysis of data related to Galicia’s marine ecosystem. The Data Lake architecture facilitates processing both structured and unstructured data, already integrating diverse datasets such as ocean current velocity maps, species distribution data, upwelling indices,...
OSCAR is an open-source framework built on Kubernetes (K8s) for event-driven data processing of serverless applications packaged as Docker containers. The execution of these applications can be triggered either by detecting events from object-storage systems, such as MinIO or dCache (asynchronous calls), or by directly invoking them (synchronous calls). OSCAR is being used in several research...
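The two invocation modes can be sketched in a few lines: an asynchronous path where a storage event enqueues a job for a worker, and a synchronous path that returns the result directly. The queue and handler names below are assumptions for illustration, not OSCAR's real API:

```python
# Illustrative sketch of the two serverless invocation modes OSCAR
# supports: async (object-storage event enqueues a job) and sync
# (direct call returning the output). Names are invented for this example.
import queue

jobs = queue.Queue()

def storage_event(bucket: str, key: str) -> None:
    """Asynchronous path: an upload event enqueues a processing job."""
    jobs.put({"bucket": bucket, "key": key})

def invoke_sync(payload: str) -> str:
    """Synchronous path: run the function and return its output directly."""
    return payload.upper()          # stand-in for the container's actual work

storage_event("input-bucket", "image-001.png")
pending = jobs.qsize()              # one job now waiting for a worker
result = invoke_sync("detect fish species")
```

The asynchronous path decouples producers from workers and so scales with bursty uploads, while the synchronous path suits interactive, low-latency calls.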
Recently, the computing continuum has emerged as an extension of existing cloud services, which enable users to access almost unlimited resources for the processing, storage, and sharing of data. Nevertheless, the increasing volume of data, as well as its constant production, has motivated the distribution of both data and computation across existing computational nodes....
The presentation will discuss the point of view and opportunities offered by the EuroHPC JU to support the computing needs of SMEs and large industries. This will be a remote presentation.
HPCNow! provides its customers with solutions and technologies for dealing with the most complex problems in High Performance Computing (HPC). Installing and managing an HPC cluster and deploying user applications involves a wide range of software packages. A cluster manager provides a unified way to install all the nodes, manage them, and synchronize configurations across the entire cluster. A...
The amount of data gathered, shared, and processed in frontier research is set to increase steeply in the coming decade, leading to unprecedented data processing, simulation, and analysis needs. In particular, high-energy physics and radio astronomy are gearing up for groundbreaking instruments, necessitating infrastructures many times larger than current capabilities. In this context, the...
Through the National Recovery and Resilience Program (NRRP), Italy has funded the constitution of an unprecedented national infrastructure targeting digital resources and services for science and industry. Specifically, the National Center on HPC, Big Data and Quantum Computing (“ICSC”) is an initiative funded with €320M to evolve existing public state-of-the-art network, data, and compute...
With the increase in microbial resistance to therapeutics, there is a higher demand for finding new pathways or molecular targets to treat bacterial infections. The Pseudomonas quinolone system (PQS) is a part of the quorum sensing (QS) communication system of Pseudomonas aeruginosa, which controls the production of biofilms and several other virulence factors. Inhibiting quorum sensing does...
The COVID-19 pandemic, caused by the SARS-CoV-2 virus, has led to a global health crisis, triggering an urgent need for effective therapeutic interventions to mitigate its impact. The virus primarily infects human cells by binding its spike protein (S-RDB) to the ACE2 receptor, making this interaction a key target for drug discovery. In response, this study aimed to identify novel compounds...
The AI4EOSC and iMagine projects are closely related initiatives under the European Open Science Cloud (EOSC), both designed to support research communities in leveraging artificial intelligence (AI).
The AI4EOSC project is dedicated to providing researchers with easy access to a comprehensive range of AI development services and tools. It focuses on enabling the development and deployment of...
AI models require extensive computing power to perform scalable inferences on distributed computing platforms to cope with increased workloads. This contribution summarises the work done in the AI4EOSC and iMagine projects to support AI model inference execution with OSCAR and AI4Compose. AI4EOSC delivers an enhanced set of services to create and manage the lifecycle of AI models (develop,...
Machine Learning (ML) is one of the most widely used technologies in the field of Artificial Intelligence (AI). As ML applications become increasingly ubiquitous, concerns about data privacy and security have also grown. The presentation surveys the applied ML landscape, covering the evolution of ML/DL from various aspects including data quality, data privacy awareness and federated learning. It...
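The privacy-preserving idea behind federated learning is that clients train locally and only model weights, never raw data, leave their premises; a central aggregator then averages those weights. A minimal sketch of the federated averaging (FedAvg) aggregation step, with toy two-parameter models:

```python
# Minimal sketch of federated averaging (FedAvg): each client trains on
# its own private data and shares only model weights; the server computes
# a dataset-size-weighted average to form the new global model.

def fed_avg(client_weights: list, sizes: list) -> list:
    """Weighted average of per-client model weights by local dataset size."""
    total = sum(sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, sizes)) / total
            for i in range(dim)]

# Two clients; the second holds three times as much local data,
# so it contributes 3/4 of the global update.
global_model = fed_avg([[1.0, 2.0], [3.0, 4.0]], sizes=[1, 3])
```

Real systems (e.g. Flower or TensorFlow Federated) add secure aggregation and differential privacy on top of this same averaging core.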
Dataverse is an open source data repository solution being increasingly adopted by research organizations and user communities for data sharing and preservation. Datasets stored in Dataverse are cataloged, described with metadata, and can be easily shared and downloaded. In the context of the development of a pilot catchall data repository for the Portuguese research community we...
The Horizon Europe interTwin project is developing a highly generic yet powerful Digital Twin Engine (DTE) to support interdisciplinary Digital Twins (DT). Comprising thirty-one high-profile scientific partner institutions, the project brings together infrastructure providers, technology providers, and DT use cases from Climate Research and Environmental Monitoring, High Energy and...
InterTwin co-designs and implements a prototype of an interdisciplinary Digital Twin Engine (DTE) - an open source platform based on open standards that offers the capability to integrate with application-specific Digital Twins (DTs).
While there are many components that are part of the DTE, this contribution focuses on OSCAR and DCNiOS and how they are being used in InterTwin to support...
The Horizon Europe interTwin project is developing a highly generic Digital Twin Engine (DTE) to support interdisciplinary Digital Twins (DTs). The project brings together infrastructure providers, technology providers and DT use cases from High Energy and AstroParticle Physics, Radio Astronomy, Climate Research and Environmental Monitoring. This group of experts enables the co-design of both...
In this presentation we will outline the role of the SQAaaS platform as the architectural building block for quality assurance (QA) within two ongoing EC-funded projects that are prototyping Digital Twins in diverse scientific domains: DT-GEO and interTwin.
The individual requirements of each project have shaped the SQAaaS platform to be a flexible engine that is able to evaluate both the...
This presentation provides an overview of the architecture and implementation of the new artefacts repositories for EGI.
The EGI repositories are developed, maintained and operated by LIP and IFCA/CSIC. The new repositories will host RPMs (for RHEL and compatible distributions), DEBs (for Ubuntu and compatible distributions) and Docker images for container-based services and...
DESY, one of Europe's leading synchrotron facilities, is active in various scientific fields, including High Energy and Astroparticle Physics, Dark Matter research, Physics with Photons, and Structural Biology. These fields generate large amounts of data, which are managed according to specific policies that respect restrictions on ownership, licenses, and embargo periods. Currently there is...
In the current era of Big Data, data management practices are an increasingly important consideration when doing scientific research. The scientific community's aspiration for FAIR data depends on good data management practices and policies, and interTwin's DataLake has been designed with these goals in mind. I will present the status of an application of the DataLake to the particular field of...
Effective water resources management depends on accurate river flow forecasts, which affect hydroelectric power generation, flood control, and agriculture, among other sectors. Achieving consistent projections is challenging, though, because of the complex characteristics defining the river flow, which are influenced by factors including precipitation, reservoir management, and changes in land...
Workflow languages have become indispensable for defining reproducible and scalable data analysis pipelines across various scientific domains, including bioinformatics, medical imaging, astronomy, high-energy physics, and machine learning. In recent years, languages such as the Common Workflow Language (CWL), Workflow Description Language (WDL), and Nextflow have gained significant traction...
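Whatever the surface syntax (CWL, WDL or Nextflow), a workflow engine ultimately resolves the declared steps into a dependency graph and executes them in topological order, parallelising independent branches. A sketch of that scheduling core using Python's standard library, with invented step names standing in for real pipeline tasks:

```python
# Sketch of the scheduling core common to workflow engines: steps form a
# DAG and run in topological order. Step names are illustrative only.
from graphlib import TopologicalSorter

# step -> set of steps it depends on
pipeline = {
    "fetch":    set(),
    "align":    {"fetch"},
    "qc":       {"fetch"},
    "annotate": {"align"},
    "report":   {"annotate", "qc"},
}
order = list(TopologicalSorter(pipeline).static_order())
# "fetch" runs first; "align" and "qc" may run in parallel; "report" last
```

Portability between workflow languages is easier precisely because they all reduce to this same DAG abstraction.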
Citizen Science (CS) is evolving rapidly, driven by digital technologies that make data collection and public participation more accessible. However, these technologies also introduce challenges such as complexity and fragmentation. Many projects addressing similar research questions use inconsistent methodologies, making it difficult to compare and integrate results. Fragmentation is worsened...
Biomolecular simulations have long been an important part of the drug discovery and development process, with techniques such as docking, virtual screening, molecular dynamics and quantum mechanics being routinely used in the study of the interaction and selection of small molecular drugs with their target proteins or enzymes.
More recently, the application of these techniques in aptamer...
The Infrastructure Manager (IM) is an open-source production-ready (TRL 8) service used for the dynamic deployment of customized virtual infrastructures across multiple Cloud back-ends. It has evolved in the last decade through several European projects to support the needs of multiple scientific communities. It features a CLI, a REST API and a web-based graphical user interface (GUI), called...
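Programmatic deployment through a REST API like the IM's typically means POSTing an infrastructure template with authorization headers. The sketch below only constructs such a request without sending it; the endpoint path, header names and template content are assumptions for illustration, so consult the IM documentation for the real API:

```python
# Hedged sketch of preparing a REST deployment request, in the style of
# the Infrastructure Manager's API. Endpoint, headers and template text
# are placeholders, not the real IM interface; the request is never sent.
import urllib.request

IM_URL = "https://im.example.org/infrastructures"   # placeholder endpoint
tosca_template = "tosca_definitions_version: tosca_simple_yaml_1_0\n"

req = urllib.request.Request(
    IM_URL,
    data=tosca_template.encode(),
    method="POST",
    headers={
        "Content-Type": "text/yaml",
        # illustrative credential string, not a real IM auth header
        "Authorization": "type = InfrastructureManager; username = demo",
    },
)
```

In practice the same deployment can be launched through the CLI or the web GUI instead, which wrap this REST layer.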
A new INCD data centre, located at UTAD (Universidade de Trás-os-Montes e Alto Douro, in Vila Real), has recently become operational. It offers Cloud and HPC computing, as well as data management services and repositories.
In this work we describe the architecture, deployment and configuration of the cloud IaaS infrastructure based on OpenStack. We also describe the Ceph storage architecture...
The demand for computing resources is growing every day, making the capability to expand capacity to address user needs increasingly important. For research organisations, the Open Clouds for Research Environment (OCRE) provides an opportunity to extend existing computing and platform resources to commercial providers under better conditions. This reality brings a challenge...
In this presentation we will discuss our experience of migrating the INCD Helpdesk ticketing system from Request Tracker (RT) to Zammad. We will highlight the most relevant INCD requirements and Zammad features that led to the choice of this platform. We will also describe the required steps that were taken during the migration process, including data extraction from RT and the import...
INCD (www.incd.pt) provides computing and data services to the Portuguese scientific and academic community for research and innovation in all domains. The infrastructure is oriented towards scientific computing and data-oriented services, supporting researchers and their participation in national and international projects.
INCD operates an integrated infrastructure with services being...
In this presentation we will share the experience of FCT-FCCN in setting up and managing datacenters.