Conveners
IBERGRID: Conference Opening
- There are no conveners in this block
IBERGRID: Contributions
- Germán Moltó (Universitat Politècnica de València)
- Tim Wetzel (DESY IT (Research and Innovation in Scientific Computing))
- Marcin Plociennik (PSNC)
- Patrick Fuhrmann (DESY)
- Josep Flix Molina (CIEMAT / PIC)
- Mário David (LIP)
Description
Zoom IBERGRID 2023
Meeting ID: 846 2964 6790
Password: 044413
URL: https://us06web.zoom.us/j/84629646790?pwd=5qakht6CbPHd0hm5aKtGaRsdkCHB5Y.1
The BigHPC project brings together innovative solutions to improve the monitoring of heterogeneous HPC infrastructures and applications, the deployment of applications, and the management of HPC computational and storage resources. It also aims to alleviate the current storage performance bottleneck of HPC services when dealing with the growth of data-intensive applications among the major...
udocker is a run-time tool to execute applications encapsulated in containers on both HPC and Cloud resources. It can pull container images from any registry, be it Docker Hub, the GitHub Container Registry (https://ghcr.io), the GitLab Container Registry (https://registry.gitlab.com), or others.
This presentation will...
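As an illustration of how such container execution typically looks in practice, the sketch below drives udocker from Python through its command-line interface; the image name and container name are examples and not taken from the presentation.

```python
import subprocess

# Minimal sketch (assumes udocker is installed and on the PATH): pull an image
# from a registry, create a container from it, and run a command inside it,
# all without requiring root privileges. Image and container names are examples.
image = "quay.io/someorg/someimage:latest"   # hypothetical image reference

subprocess.run(["udocker", "pull", image], check=True)
subprocess.run(["udocker", "create", "--name=myapp", image], check=True)
subprocess.run(["udocker", "run", "myapp", "cat", "/etc/os-release"], check=True)
```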
The EuroCC project aims to boost knowledge and use of high-performance computing (HPC) across Europe through a network of National Competence Centres (NCCs).
The Portuguese Competence Centre (NCC Portugal) coordinates dissemination, training, knowledge and technology transfer activities, as well as the promotion of the use of HPC. It has been a point of contact for potential users - whether...
The field of Complex Systems has rapidly gained prominence over recent decades as a result of its capacity to explore the intricate and interdependent behaviors exhibited by a wide range of natural, social, and technological systems. Driven by advances in data availability, computational methods, and interdisciplinary collaboration, Complex Systems research has become a burgeoning field of...
Software engineering best practices favour the creation of better-quality projects, where similar projects originate from a similar pre-defined layout, also called a software template. This approach greatly enhances project comprehension without the need for extensive documentation. Additionally, it allows the pre-setting of certain functionalities, simplifying further code development....
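The abstract does not name a specific templating tool; as one common illustration of generating a project from a pre-defined layout, the sketch below uses cookiecutter with a publicly available template. The template URL and project name are examples only, not those of the presented work.

```python
from cookiecutter.main import cookiecutter

# Minimal sketch: generate a new project skeleton from a pre-defined template.
# The template URL and the project name are illustrative.
cookiecutter(
    "https://github.com/audreyfeldroy/cookiecutter-pypackage",
    no_input=True,                                   # accept template defaults
    extra_context={"project_name": "demo-project"},  # override one variable
)
```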
PypKa is a tool developed by the Machuqueiro Lab at the University of Lisbon (UL), Portugal. It is a Poisson–Boltzmann-based pKa predictor for proteins that uses 3D structures as input. The tool also predicts isoelectric points and can process PDB structures to assign the correct protonation states to all residues. The impact of the PypKa cloud service is to predict pKa values of titratable sites...
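A minimal sketch of how a PypKa calculation can be set up from Python is given below; the parameter names and values follow the project's documented examples but are assumptions here and should be checked against the current PypKa documentation.

```python
from pypka import Titration

# Illustrative input (keys and values follow PypKa's documented examples and
# are assumptions here; check the current documentation before a real run).
params = {
    "structure": "protein.pdb",   # 3D structure used as input
    "ncpus": 2,
    "epsin": 15,                  # protein dielectric constant
    "ionicstr": 0.1,              # ionic strength in M
    "pbc_dimensions": 0,          # soluble protein, no periodic boundaries
}

# Running the titration yields predicted pKa values for the titratable sites.
tit = Titration(params)
```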
OIDC (OpenID Connect) is widely used to move our digital infrastructures (e-Infrastructures, HPC, storage, cloud, ...) into the token-based world. OIDC is an authentication protocol that allows users to be authenticated with an external, trusted identity provider. Although typically meant for web-based applications, there is an increasing need for integrating shell-based...
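As a generic illustration of shell- or script-based use of OIDC access tokens (not the specific solution presented), the sketch below assumes a token has already been obtained on the command line, for example by an agent, and exported as an environment variable; the endpoint URL is hypothetical.

```python
import os
import requests

# Assumption: an OIDC access token was obtained outside this script (e.g. by a
# command-line agent) and exported as ACCESS_TOKEN; the endpoint is hypothetical.
token = os.environ["ACCESS_TOKEN"]

resp = requests.get(
    "https://api.example.org/userinfo",            # token-protected endpoint
    headers={"Authorization": f"Bearer {token}"},  # standard bearer-token header
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```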
Computing and data management workflows increasingly demand access to S3 storage services with POSIX capabilities, locally mounting a file system from a remote site to perform operations directly on files and directories.
To address this requirement in distributed environments, various service integrations and needs must be considered.
In the context of this activity,...
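As a generic illustration of the POSIX side of this requirement (not the specific service integration described in the contribution), the sketch below assumes an S3 bucket has already been exposed through a FUSE mount; applications can then use ordinary file operations instead of the S3 API. The mount point and file name are examples.

```python
import os

# Assumption: a remote S3 bucket is already mounted locally through a FUSE
# client; the mount point and file name below are illustrative.
mount_point = "/mnt/s3-bucket"

# Ordinary POSIX operations work as on any local filesystem.
for name in os.listdir(mount_point):
    print(name)

with open(os.path.join(mount_point, "results.txt")) as fh:
    print(fh.read())
```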
One of the main obstacles in collecting data for life science today is compliance with the GDPR. Among others, the requirement to manage Informed Consent in a lawful, transparent and auditable way is one of the open issues that Trusted Research Environments must address. Today many hospitals and research organizations rely on some kind of Consent Management System (CMS), usually...
The interTwin project is designing and building a Digital Twin Engine (DTE) to support interdisciplinary Digital Twins.
In particular, the interTwin DTE aims to support both end users (e.g. scientists, policymakers) and DT developers, who would like an easy way to build and model their Digital Twins.
The talk will present the general project status focusing on...
interTwin is an EU-funded project, started in September 2022, that is developing an open-source platform called the Digital Twin Engine (DTE) to support the digital twins of selected communities, which can be exported to multiple scientific fields. For this reason, interTwin was designed to develop the platform by involving both scientific domain experts and computational resource...
The itwinAI framework is a comprehensive solution developed by CERN and the Jülich Supercomputing Centre (JSC) to facilitate the development, training, and maintenance of AI-based methods for scientific applications. It serves as a core module within the interTwin project, which aims at co-designing and implementing an interdisciplinary Digital Twin Engine. itwinAI streamlines the entire AI...
Dataverse is an open-source data repository solution with increasing adoption by research organizations and user communities for data sharing and preservation. Datasets stored in Dataverse are cataloged, described with metadata, and can be easily shared and downloaded. After having dedicated one year to the development and integration of a Dataverse-based repository for research data, we...
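As an illustration of the programmatic access such a repository offers (the installation URL and query below are examples, not the repository discussed in the contribution), the sketch queries a Dataverse installation through its native Search API.

```python
import requests

# Illustrative only: search a Dataverse installation for datasets through its
# native Search API. The base URL and query string are examples.
BASE_URL = "https://demo.dataverse.org"

resp = requests.get(
    f"{BASE_URL}/api/search",
    params={"q": "climate", "type": "dataset"},
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["data"]["items"]:
    print(item.get("name"), "-", item.get("global_id"))
```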
In the framework of the EOSC Association, several Task Forces have been created to study and report on Open Science, Open Data and, in particular, quality for Research Software.
Research Software (RS) is defined as software that is produced by researchers and used as an enabler for scientific activities. A major objective of the EOSC task force on Infrastructures for Quality Research Software is...
The EOSC-Synergy project is developing a toolset to bring mainstream practices closer to researchers throughout the development life cycle of EOSC software and services. The objective is twofold: on the one hand, streamlining the adoption of such practices in the scope of the EOSC and, on the other hand, providing a software-quality assessment tool to promote, measure and reward...
The H2020 C-SCALE (Copernicus – eoSC AnaLytics Engine, https://c-scale.eu/) project has created services to unlock the vast potential of Copernicus data for advanced Earth Observation analytics by providing a pan-European federated data and computing infrastructure through the EOSC Portal. As the project comes to an end, this session aims to present its main outcomes with a...
The International Lattice Data Grid (ILDG) is a community effort of physicists working on Lattice Field Theory to coordinate and enable the sharing of their large and valuable datasets from numerical simulations. ILDG started around 20 years ago and is organized as a world-wide federation of regional grids which use interoperable services (e.g. catalogues) and unified standards,...
Climate data analysis often entails downloading datasets of several terabytes in size from various sources and employing local workstations or HPC computing infrastructures for analysis. However, this approach becomes inefficient in the era of big data due to the considerable expenses linked with transferring substantial volumes of raw data over the Internet from diverse sources, encompassing...
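Not the framework presented in this contribution, but as a generic illustration of avoiding bulk downloads: when a dataset is published in an analysis-ready, cloud-optimised format such as Zarr, it can be opened lazily over HTTP so that only the needed subset is transferred. The URL and variable name below are hypothetical.

```python
import xarray as xr

# Hypothetical Zarr store URL and variable name; a real workflow would point
# at an analysis-ready, cloud-optimised copy of the dataset.
url = "https://data.example.org/era5/2m_temperature.zarr"

# Lazy open: only metadata is read here, no bulk download takes place.
ds = xr.open_zarr(url, consolidated=True)

# Subset first, compute later: only the selected period is transferred.
subset = ds["t2m"].sel(time=slice("2020-01", "2020-12"))
print(float(subset.mean().compute()))
```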
Reproducibility is a cornerstone of scientific research, ensuring the reliability and validity of results by allowing independent verification of findings. During the EGI-ACE project, EGI developed EGI Replay, a service that allows researchers to reproduce and share custom computing environments effortlessly. With Replay, researchers can replicate the execution of their analysis in a...
OpenStack, known for its open-source cloud capabilities, offers a wide range of services for creating and managing various types of clouds. However, setting up and maintaining an OpenStack cloud can be a complex and time-consuming task. In recent years, Kubernetes has gained popularity as a platform for managing containers, making it an attractive choice for simplifying the infrastructure...
In today's world of high-throughput bioinformatics and advanced experimental techniques, researchers are generating enormous datasets. Consider cryo-electron microscopy data or the complex images produced by nuclear magnetic resonance and light microscopy: they are all rich in scientific value, not just for the researchers who generate them but for the entire scientific community. The key to...
Instruct is the pan-European Research Infrastructure for structural biology, centered on bringing high-end technologies and methods to European researchers. The Instruct Image Processing Center (I2PC) has been actively promoting FAIR practices for cryoEM data acquisition and processing workflows. Efforts carried out in the scope of projects such as EOSC-Life, BY_COVID and EOSC-Synergy are driving...
The “Digital Science and Innovation” Interdisciplinary Thematic Platform (PTI) was launched by the Spanish National Research Council (CSIC) in June 2022, with the aim of innovating in all areas of digital science and data lifecycle management, from planning, acquisition, and processing to publication and preservation.
The platform groups its activity into the following four strategic areas and...
Kampal IC is an application to host networked "collective intelligence" brainstorms with an evolution model inspired by physics, limiting interactions to 2^D "nearest neighbors".
We describe its expansion to include artificial agents that can interact either with humans or with each other. Due to the computational requirements of modern AI agents, mostly based on Large Language Models, this...
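As a purely illustrative sketch (not Kampal IC's actual model), one way to realise a 2^D nearest-neighbour interaction rule is to embed agents in a D-dimensional space and let each agent interact with its 2^D closest peers; all names and parameter values below are assumptions.

```python
import numpy as np

# Illustration only: agents embedded in a D-dimensional space, each interacting
# with its 2**D nearest neighbours. Parameters are arbitrary examples.
rng = np.random.default_rng(0)
D = 3                                   # dimensionality of the embedding
n_agents = 50
positions = rng.random((n_agents, D))

k = 2 ** D                              # interaction partners per agent (8 for D=3)

# Pairwise distances; each agent's neighbours are its k closest other agents.
dists = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
np.fill_diagonal(dists, np.inf)         # exclude self-interaction
neighbours = np.argsort(dists, axis=1)[:, :k]

print(neighbours[0])                    # the 8 agents that agent 0 interacts with
```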
The Jenkins Pipeline Library (JPL) is one of the core components of the EOSC-Synergy Software and Services Quality Assurance as a Service (SQAaaS) platform, aimed at fostering the adoption of EOSC services through a quality-based approach. It is a self-contained component that facilitates the creation and execution of CI/CD pipelines and is the first line of integration with any...