12th MEFT Student Workshop

Europe/Lisbon
Anfiteatro PA1 (Instituto Superior Técnico - Campus Alameda)

Anfiteatro PA1

Instituto Superior Técnico - Campus Alameda

Av. Rovisco Pais 1, 1049-001 Lisboa
Armando Gonçalves, Diogo Lemos, Francisco Dias, Francisco Ferreira, Gonçalo Barreto, Gonçalo Costa, Gonçalo Martins, Gonçalo Ribeiro, Hugo Amaral, João Biu, João Nuno Santos, João Pedro Nunes, João Tavares, Miguel Pereira, Pedro Esperanço, Pedro Figueiredo
Description

The 12th MEFT Student Workshop is a 2-day conference during which Physics Engineering students at Instituto Superior Técnico will share their projects with an audience of colleagues and professors.

Each student will prepare a 10-minute talk: a video (4 minutes) and a short pitch (3 minutes), followed by 3 minutes of questions from the chairpersons and the audience.

The event takes the form of a typical scientific conference, with the objective of inspiring the students and kickstarting their scientific careers. It is not merely symbolic: it also represents the first step of their Master's theses.

Participants
  • Adélia Ferreira
  • António Lourenço
  • Armando Gonçalves
  • Bernardo Picão
  • Bruna Lima
  • Bruno Semião
  • Daniel Mendonça
  • Filipe Ficalho
  • Gonçalo Almeida
  • Gonçalo Costa
  • Gonçalo Martins
  • Hugo Amaral
  • Inês Freitas
  • Inês Martins
  • Inês Santos
  • Iuna Dreyer
  • Joana Bonito
  • João Barbosa
  • João Coimbra
  • João Cândido
  • João Nunes
  • João Oliveira
  • João Pedro Ferreira Biu
  • João Rodrigues
  • João Santos
  • João Tavares
  • Mafalda Rodrigues
  • Manuel Abreu
  • Marco Costa
  • Margarida Pereira
  • Maria Colaço
  • Maria Inês Nunes
  • Maria Lopes
  • Mariana Abreu
  • Marta Simões
  • Matilde Fernandes
  • Miguel Parreira
  • Miguel Pereira
  • Miriam Simões
  • Pedro Esperanço
  • Pedro Figueiredo
  • Pedro Teigão
  • Tiago Valente
  • Tomás Baltazar
  • Vasco Nunes
  • Vasco Pires
  • Vasco Santos
  • Tuesday 28 January
    • Opening Session
    • 1
      Exploring the 3HDM with Dark Matter and Machine Learning

      We investigate the constraints and phenomenology of the $Z_2\times Z'_2$ symmetric Three Higgs Doublet Model, focusing on a framework with two inert scalars as suitable dark matter candidates. Our study includes an analysis of the vacuum structure and an evaluation of the model against all theoretical and experimental constraints. We expand the analysis to unexplored regions of parameter space, populating the entire mass range and uncovering new features, such as the potential for equal contributions from both dark matter candidates to the relic density. Additionally, we employ an evolutionary Machine Learning algorithm, implemented with a recent CMA-ES Python package, to enhance parameter space exploration (a minimal sketch follows below).

      Speaker: Pedro Figueiredo
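
      A minimal sketch of the ask/tell optimization loop behind this approach, assuming the cmaes Python package; the two-parameter objective below is a hypothetical stand-in for the model's constraint-violation measure, not the actual 3HDM scan:

      import numpy as np
      from cmaes import CMA

      # Hypothetical stand-in for the real figure of merit: lower values mean
      # the sampled point violates fewer theoretical/experimental constraints.
      def constraint_penalty(x):
          return np.sum((x - 1.0) ** 2)

      optimizer = CMA(mean=np.zeros(2), sigma=1.3)   # toy 2-D parameter space
      for generation in range(50):
          solutions = []
          for _ in range(optimizer.population_size):
              x = optimizer.ask()                    # sample a candidate point
              solutions.append((x, constraint_penalty(x)))
          optimizer.tell(solutions)                  # update the search distribution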
    • 2
      Using Graph Neural Networks for Flavour Tagging in Heavy Ion Collisions

      Heavy-ion collisions create a region of extreme energy density near the collision point where quarks and gluons behave as nearly free particles. This state of matter is called the quark-gluon plasma (QGP). The QGP is very challenging to study since it is extremely short-lived and never reaches the detector. However, by analysing jets created at the same time as the QGP, which do reach the detector, and comparing them to jets from proton-proton collisions, it is possible to see how the jets produced in heavy-ion collisions were affected by the QGP. The issue then becomes identifying which particles gave origin to the jets, a task known as jet flavour tagging.

      This project aims to study how one of the ATLAS Collaboration's flavour tagging algorithms, called GN2, behaves when trying to identify jets from heavy-ion collisions, motivated by the prospect of one day discovering more about the QGP.

      Speaker: Pedro Esperanço
    • 3
      Thermal Spray

      The Thermal Spray (Plasma) coating process is a widely adopted technique for enhancing the performance and durability of critical engine parts, offering superior resistance to wear, corrosion, and high-temperature environments. However, a recurring defect has been observed: delamination at the edges of test specimens, where the coating separates from the substrate. This issue compromises the reliability of the coating process, making it unsuitable for aerospace and other high-performance applications.

      This study focuses on investigating the root causes of delamination, examining factors such as substrate preparation, thermal stresses and process parameters. Using experimental trials, surface characterization and thermal modeling, the research explores how particle velocity, spray distance and substrate temperature influence coating adhesion.

      The findings are expected to refine plasma spray methodologies, addressing delamination and enhancing coating reliability. This work will contribute to the development of more robust coatings for high-stakes applications while optimizing resource use and supporting the industry's goals for sustainability and efficiency.

      Speaker: Margarida Rodrigues
    • 4
      Wearable Sensors for Enhanced Assisted Rescue Response

      Firefighters operate in adverse and hazardous conditions, facing smoke, toxic gases and high temperatures, and risking physical exhaustion and emotional stress. Additionally, connectivity issues often hinder timely risk and health assessments, as well as the management of teams and resources, compromising the efficiency and effectiveness of firefighting strategies.

      This project focuses on enhancing firefighters’ health and performance by proposing innovative approaches that integrate physiological and environmental sensors with intelligent data analysis. The goal is to develop wearable sensors tailored for firefighting environments for real-time monitoring and to create an algorithm to detect correlations between self-reported physical fatigue and physiological metrics, eventually allowing the implementation of an automated alert system for risk prediction using machine learning.

      Speaker: Inês Freitas (Instituto Superior Técnico)
    • 10:18
      Coffee Break
    • 5
      Transport Through A Critical Magnetic Quantum Dot Away From Equilibrium

      The last decades saw substantial progress in understanding the equilibrium phases and phase transitions of low-temperature quantum matter. These so-called quantum phase transitions host genuine quantum-coherent phenomena that may display exotic properties and phase transitions whose universality classes differ from those of classical systems. However, certain systems — namely electronic transport setups — are inherently out of equilibrium, as they are coupled to different environments (leads) with distinct thermodynamic potentials. Such setups may host energy, charge, and spin currents, and must be modeled as non-equilibrium open quantum systems. These bring novel paradigms for studying quantum matter and its phase transitions, with properties not possible under equilibrium conditions. Although these represent new opportunities for fundamental and technological advancements, many challenges have yet to be addressed before these phases are understood. Recent studies indicate that non-equilibrium conditions can significantly alter the properties of equilibrium stable phases, induce exotic phase transitions, cause heating, and generate novel phases of matter. A comprehensive body of work has addressed the so-called Markovian regime, where the memory of the environment is much shorter than the system's typical time scales. In a transport setting this regime corresponds to a large bias or high temperature. The non-Markovian regime, which connects known equilibrium results to these extreme conditions, and is especially relevant for low-temperature electronic systems at small bias, remains largely unexplored.

      The current project aims to fill in some of these gaps by investigating non-equilibrium phase transitions in a voltage-biased magnetic quantum dot. This model is directly relevant to understanding the transport and magnetic properties of quantum spintronic devices in systems near criticality. Specifically, we aim to develop a theoretical framework in terms of the quantum dots' collective degrees of freedom as they undergo a quantum phase transition under an applied bias voltage.

      Speaker: Tiago Jorge (Instituto Superior Técnico)
    • 6
      Back to the Tape: Developing tools for the recovery of historical audio recordings

      Audio storage has evolved significantly, from analog phonographs to digital formats like CDs and audio files. Among these, magnetic tape has played a pivotal role, offering high-fidelity sound, rewritability, and adaptability for both personal and professional use. Magnetic tape recordings form a substantial part of the world’s cultural heritage, preserving iconic music and valuable historical archives. However, over time, wear and chemical degradation threaten these recordings, rendering them unreadable with traditional playback methods and risking irreversible damage to culturally significant content.

      To address this, a collaboration between INESC-MN and the Paul Scherrer Institute (PSI) aims to develop non-destructive, contactless techniques for audio restoration. INESC-MN utilizes tunneling magnetoresistance (TMR) sensors to detect weak magnetic fields without causing physical wear, while PSI employs high-resolution X-ray beams to read magnetic particle states. Despite their promise, these methods introduce noise and distortion, requiring advanced signal processing to ensure faithful recovery.

      This project focuses on three primary goals: restoring raw signals by reducing noise and distortion, reconstructing audio signals to replicate the tonal characteristics of original recordings, and enhancing measurement efficiency through adaptive sampling algorithms. Ultimately, the project seeks to recover degraded recordings, such as a 1980 B.B. King performance at the Montreux Jazz Festival, contributing to the preservation of audio heritage and preventing the loss of irreplaceable cultural artifacts.

      Speaker: João Barbosa
    • 7
      Design and modeling a plasma reactor for the production of O2 from the conversion of CO2

      The design and modeling of a plasma reactor for the conversion of CO₂ into O₂ offers a novel and sustainable approach to addressing the increasing levels of carbon dioxide in the Earth’s atmosphere. By utilizing plasma-assisted processes and oxygen-conducting membranes, the project investigates the potential to split CO₂ into its base components, enabling the production of a continuous oxygen flow. This approach directly addresses the urgent need to reduce greenhouse gas concentrations and to mitigate their contribution to global warming.

      In addition to its terrestrial applications, the proposed technology has significant implications for space exploration. Mars, with an atmosphere composed of over 95% CO₂, provides an ideal context for deploying this method to generate oxygen in terraforming efforts.

      The project integrates multiple phases, including reactor modeling and simulation, the development and optimization of 3 types of oxygen-conducting membranes, and experimental validation through proof-of-concept oxygen flux measurements.

      These efforts aim to refine the reactor design and optimize its efficiency, contributing to global sustainability goals by measuring pure oxygen fluxes.

      Speaker: Inês Santos (Instituto Superior Técnico)
    • 8
      Characterization of anomalous air shower events in SWGO

      The interaction of very-high-energy gamma rays and cosmic rays with the Earth’s atmosphere leads to the production of cascading particle events known as Extensive Air Showers (EAS). The aim of my project is to use state-of-the-art simulations to characterize shower features at the ground and connect them to the longitudinal development of the shower, with a focus on identifying anomalous events, as their study provides valuable insights into the underlying physical mechanisms of shower development. This study may also lead to improvements in current methods for gamma/hadron discrimination, which is crucial for observatories like the future Southern Wide-field Gamma-ray Observatory (SWGO), as it will only rely on its large and dense array to distinguish between the shower’s electromagnetic and muonic components.

      Speaker: Inês Martins (LIP)
    • 9
      Pile-up event identification and rejection in SNO+: enhancing the signal to noise ratio

      The primary objective of the SNO+ experiment is to detect neutrinoless double beta decay, a process that, if observed, would revolutionize our understanding of neutrino physics. Achieving this goal requires very low background levels. One significant source of background arises from pile-up events, which occur when two decays happen within the same trigger window and are reconstructed as a single event. To address this challenge, a comprehensive study of several discriminating variables was conducted, and the impact of applying cuts on these variables was analyzed to determine the best value for each cut. The resulting analysis showed promising progress in effectively separating pile-up events from single events (a toy illustration of such a cut-based selection follows below).

      Speaker: Tomás Baltazar
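
      To make the cut-based selection concrete, here is a toy numpy sketch; the variables, thresholds, and distributions are invented for illustration and are unrelated to the actual SNO+ analysis:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      timing_spread = rng.exponential(10.0, n)   # toy event-time RMS (ns); pile-up tends to be broader
      hit_multiplicity = rng.poisson(300, n)     # toy number of detector hits

      # Illustrative cuts; in the real analysis the thresholds are tuned per variable.
      passes = (timing_spread < 25.0) & (hit_multiplicity < 600)

      # Sacrifice = fraction of true single events rejected by the cuts,
      # estimated here from a toy truth label (Monte Carlo in the real study).
      is_single = rng.random(n) < 0.9
      sacrifice = np.mean(~passes[is_single])
      print(f"signal sacrifice: {sacrifice:.3%}")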
    • 10
      Towards the neutrinoless double-beta decay study with SNO+: radioactive background characterization with SNO+ scintillator data

      Neutrinoless double beta decay is a process that, if observed, would provide very strong evidence for the Majorana nature of neutrinos. SNO+ (Sudbury Neutrino Observatory) will begin its search in the Fall of 2025, when tellurium will be added to the scintillator. This analysis comes right before the tellurium loading. It aims to identify all the events in the region of interest, which are possible backgrounds, and to pave the way for an unambiguous measurement of neutrinoless double beta decay. First, the analysis identifies a clean period of data through the study of bismuth activity. It then applies cuts to background events in that period, such as muons, coincident events, and pile-up or neck events. For each cut, the sacrifice of signal events is evaluated with Monte Carlo simulations. After each cut, it was checked that the remaining events were uniformly distributed.

      Speaker: Manuel Abreu (LIP)
    • 11
      YOLO Application on Portuguese Highways for Photovoltaic Energy Potential Evaluation

      The world is facing an urgent challenge: the need for sustainable and clean energy solutions. Highways, with their vast and underutilised infrastructure, present an innovative opportunity through the integration of photovoltaic panels.

      In this work, I employed YOLO, a state-of-the-art computer vision algorithm, to automatically detect and identify highway structures and evaluate their potential for solar energy generation. As a first step in detecting sound barriers, I applied data augmentation techniques to artificially expand my dataset, which was then used to train a YOLOv10n model. The model achieved impressive results, with a precision of 86%, a recall of 89%, and a mean average precision (mAP) of 91%, demonstrating strong performance and highlighting the effectiveness of this approach (a workflow sketch follows below).

      This research contributes to the development of smarter and greener highways.

      Speaker: João Tavares (Instituto Superior Técnico)
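
      A rough sketch of this workflow, assuming the Ultralytics Python package (which distributes YOLOv10 models) and a hypothetical dataset configuration file:

      from ultralytics import YOLO

      # "sound_barriers.yaml" is a hypothetical dataset config listing the
      # train/val image paths and class names of the augmented dataset.
      model = YOLO("yolov10n.pt")           # pretrained YOLOv10 nano weights
      model.train(data="sound_barriers.yaml", epochs=100, imgsz=640)
      metrics = model.val()                 # reports precision, recall and mAP
      results = model("highway_frame.jpg")  # inference on a single frame (hypothetical file)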
    • 12:24
      Lunch Break
    • 12
      Reduced Models for Relativistic Plasmas: Integration of “particle-in-cell” Simulations and Machine Learning

      The main objective of this project is the development of reduced models for plasma dynamics, formulated as partial differential equations derived from particle-in-cell (PIC) simulations, with a focus on the yet unexplored regime of relativistic plasmas. The development of these models will enable the inclusion of phenomena that can only be captured by kinetic dynamics in fluid models.

      Speaker: Margarida Pereira
    • 13
      Data-driven models for anomalous resistivity in collisionless plasmas

      Plasmas compose most of the visible universe. Understanding their dynamics could lead to advances in fields such as nuclear fusion and particle acceleration, but this is not a trivial task. The multi-scale nature of plasmas makes studying them extremely difficult since behavior at microscopic scales can influence the macroscopic dynamics of the system, and the opposite can also happen. This is a long-standing challenge in plasma physics given the lack of good analytical models to describe the interplay between the different scales, and the difficulty of modeling these systems computationally while capturing the nonlinear relations of phenomena at different scales.
      The wide-scale range of plasma dynamics has led to the compartmentalization of plasma studies by either taking a fully kinetic approach to model microscopic behaviors or using fluid models to describe large-scale phenomena. Therefore, reduced models that can encapsulate the impact of microscopic physics on the macroscopic behavior of plasmas are crucial for accurately modeling them at various scales.
      In this work, we will focus on collisionless plasma shocks, which are a quintessential multi-scale problem in plasma physics. In these scenarios, instabilities induce microscopic fluctuations that are manifested as an anomalous resistivity that affects the macroscopic behavior of the system.
      In this PIC2 project, we aim to develop a foundational understanding of the physics and tools to be explored in the Thesis. In particular, to get a better grasp of collisionless plasmas and their multi-scale nature, we study a mean-field fluid description of collisionless plasma shocks to obtain a relation between the average quantities of the system and the small-scale field fluctuations that create the anomalous resistivity observed in shocks. We also investigate the physical interpretation of the kinetic fluctuations that have the greatest impact on the macroscopic behavior of the system. This method is evaluated on fully kinetic PIC simulations of collisionless shocks. This allows us to verify whether the mean-field description retains relevant information about the shock that can be used to extract a reduced model of anomalous resistivity from simulation data using machine learning tools, which will be done as part of the Thesis.

      Speaker: João Biu (IST)
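
      As a sketch of the mean-field bookkeeping involved (the notation below is assumed for illustration, not taken from the project): writing each quantity as a mean plus a fluctuation, $Q=\bar{Q}+\delta Q$ with $\langle\delta Q\rangle=0$, averaging products such as $\langle nE\rangle=\bar{n}\bar{E}+\langle\delta n\,\delta E\rangle$ leaves correlation terms that act on the mean fields like a resistive drag, which motivates an effective resistivity of the form

      $$\eta_{\mathrm{anom}}\sim\frac{\langle\delta n\,\delta E_{\parallel}\rangle}{\bar{n}\,\bar{J}_{\parallel}},$$

      a quantity that can be evaluated directly by correlating density and field fluctuations measured in the PIC simulations.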
    • 14
      Application of Deep Learning to Reflectometry Signals in Nuclear Fusion Plasmas

      In a context where the transition to sustainable energy is crucial, nuclear fusion appears as a promising solution, capable of generating large amounts of energy in a clean and sustainable way. One of the main challenges of nuclear fusion in tokamaks is controlling the shape and position of the plasma, which allows the process to be optimized and prevents the plasma from touching the walls, which would degrade the reactor and cause losses of confinement and energy, making fusion impossible.
      Microwave reflectometry is a diagnostic tool that allows monitoring the position of the plasma by analyzing the reflection of electromagnetic waves in the plasma. From the beat-frequency spectrogram, the group delay curve is obtained and, through Abel inversion, the electron density profile. However, system noise, plasma dynamics, and reflectometer characteristics can affect group delay curves, introducing errors into the extracted profiles, making it essential to develop more reliable methods for group delay curve extraction.
      Recent advances have demonstrated the effectiveness of Convolutional Neural Networks and Transformers in analyzing spectrograms, both for signal classification and for data denoising and reconstruction. Therefore, this work proposes the application of these deep learning models to obtain more reliable group delay curves and, consequently, more representative density profiles, improving plasma control and supporting the general objective of making nuclear fusion more viable.

      Speaker: Mafalda Rodrigues
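
      A minimal sketch of the kind of convolutional model in question, in PyTorch; the architecture and sizes are illustrative only, and in practice the network would be trained on pairs of noisy and clean spectrograms:

      import torch
      import torch.nn as nn

      class SpectrogramDenoiser(nn.Module):
          """Toy convolutional autoencoder for beat-frequency spectrograms."""
          def __init__(self):
              super().__init__()
              self.encoder = nn.Sequential(
                  nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
              )
              self.decoder = nn.Sequential(
                  nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
                  nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),
              )

          def forward(self, x):
              return self.decoder(self.encoder(x))

      model = SpectrogramDenoiser()
      noisy = torch.randn(8, 1, 128, 128)   # batch of toy spectrograms
      clean = torch.randn(8, 1, 128, 128)   # matching noise-free targets (toy)
      loss = nn.functional.mse_loss(model(noisy), clean)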
    • 15
      Study of shock waves in the National Ignition Facility

      Astrophysical collisionless shocks are ubiquitous in the Universe and are observed to amplify magnetic fields and accelerate electrons and protons to highly relativistic speeds in a variety of places, from supernova remnants to active galactic nuclei. In the well-established model of diffusive shock acceleration, electrons are accelerated by multiple shock front crossings; however, this requires a separate mechanism that pre-accelerates electrons to enable shock crossing. This is known as the injection problem and remains one of the most pressing open questions in shock acceleration, alongside the poorly understood energy partitioning between electrons and ions. Observational limitations in distant astrophysical settings prevent direct access to the microphysics of these shocks, driving the need for complementary approaches. Significant progress in our understanding of collisionless shock physics has been achieved through kinetic and nonlinear numerical approaches, with a primary emphasis on particle-in-cell (PIC) simulations, which describe the plasma from first principles. In recent years, further progress has been attained by synergistically combining PIC codes with laboratory experiments conducted in high-power laser facilities like the National Ignition Facility (NIF), offering a unique avenue to study the microphysics of scaled-down analogues of astrophysical shocks under reproducible and well-diagnosed conditions.
      In this work, we conduct fully kinetic one-dimensional PIC simulations under conditions relevant to upcoming NIF experiments, focusing on the formation of magnetized shocks and electron acceleration. Our results demonstrate that the formation of quasi-perpendicular shocks and the onset of non-thermal electron acceleration are achievable within the experimental time and spatial scales. These findings provide valuable insights into the feasibility of addressing the injection problem and advancing our understanding of shock microphysics through laboratory experiments.

      Speaker: Marco Costa (Instituto Superior Técnico)
    • 16
      Kinetic simulations of high-field fusion plasmas

      The ability to efficiently model highly magnetized plasmas is critical for the study of both astrophysical processes and magnetic fusion devices. However, resolving the high gyrofrequencies associated with these plasmas from first principles is computationally demanding. In this work, we explore the use of a guiding center approximation (GCA) to relax the demanding resolutions required, allowing for more efficient simulations of high-field plasmas. We review the algorithms most commonly used in kinetic simulations of plasmas and compare them with those based on the GCA approach, which we intend to implement in the particle-in-cell code OSIRIS for applications in the study of magnetic mirror machines.
      Our findings confirm the need to employ a very high resolution ($\Delta t \Omega < 0.2$) with current algorithms to reach reasonable accuracy and show good prospects for significant computational gains from employing a GCA approach.

      Speaker: João Cândido (Instituto Superior Técnico)
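
      A back-of-the-envelope check of the quoted criterion $\Delta t \Omega < 0.2$, for an assumed field strength (numbers are illustrative only):

      # Maximum time step for an electron in an assumed 10 T field
      q, m = 1.602e-19, 9.109e-31   # charge (C) and mass (kg)
      B = 10.0                      # assumed magnetic field (T)
      omega_c = q * B / m           # gyrofrequency (rad/s)
      dt_max = 0.2 / omega_c        # criterion dt * omega_c < 0.2
      print(f"omega_c = {omega_c:.3e} rad/s  ->  dt < {dt_max:.3e} s")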
    • 17
      Topological Modes in 2D Dirac Materials

      In this work, we study the potential of magnetic interface systems in two-dimensional graphene for information propagation, through a hydrodynamic approach that also draws on important theoretical concepts from topology, plasma physics, and fluid dynamics. The system is characterized by calculating the Chern numbers of its different frequency bands, followed by an analytical study of chiral and general solutions that propagate along an interface set in a graphene field-effect transistor (GFET). This involves concepts normally associated with equatorial wave dynamics in geophysics. The resulting solutions were then tested with numerical simulations run with the open-source code Dedalus. Finally, properties of the bulk-edge correspondence promoted by the modes are inferred and the behaviour of the modes over time is evaluated, showing that the Kelvin mode has very high potential for information propagation, and leading to the general conclusion that the Hall viscosity is not useful for propagation along the interface.

      Speaker: Vasco Santos
    • 18
      Towards Accurate Reaction Mechanisms for N₂-H₂ Low-Temperature Plasma Simulations

      We live in an era shaped by groundbreaking technologies, but there was a time when tools like ChatGPT or even the personal computers we now rely on to search the web did not exist. These innovations didn’t simply appear; they are the result of collective effort and years of development to reach the sophistication we see today. In the context of plasma physics, it is tempting to assume we already possess precise plasma simulation capabilities, but the reality is far from complete. This study positions itself at the frontier of plasma research, striving to advance the state of the art. By contributing to one of the most sophisticated plasma simulation tools, the LisbOn KInetics (LoKI) code developed by the PSI.COM group at the Instituto de Plasmas e Fusão Nuclear (IPFN), and by refining reaction mechanisms pivotal for simulation accuracy, this work aims to push the boundaries of plasma modeling. This includes validating and simplifying these mechanisms using machine learning and conducting experimental diagnostics through a dedicated measurement campaign at the Laboratoire de Physique des Plasmas (LPP) in Paris. Focused on nitrogen-hydrogen (N₂-H₂) plasmas, the research investigates the kinetic pathways of ammonia (NH₃) synthesis, offering critical insights for scalable production of sustainable fertilizers, hydrogen-based energy storage, alternative fuels, and advancing the broader plasma science community.

      Speaker: Armando Gonçalves
    • 19
      Fight instabilities with instabilities – a solution for fusion energy

      Tokamak research has long relied on the H-mode to reach the conditions necessary for fusion energy, but this regime is plagued by Edge Localized Modes (ELMs), which disrupt confinement and represent a hazard to plasma-facing components. The EDA H-mode is a promising regime that lacks these ELMs and thus warrants research into its viability for large-scale machines like ITER and DEMO. The EDA H-mode always features a localized instability called the Quasi-Coherent Mode (QCM), whose properties can be studied as a window into the EDA regime itself. Of particular interest is the QCM's radial location within the plasma pedestal, since it bears directly on the extrapolation of the regime to large tokamaks. This work will focus on data from the ASDEX Upgrade tokamak, particularly from the reflectometry system, to obtain a new measurement of the QCM's radial location.

      Speaker: Miguel Pereira (Instituto Superior Técnico)
    • 20
      Improving the neutrino detection at Pierre Auger Observatory

      The search for ultra-high-energy (UHE) neutrinos has garnered significant attention. These neutrinos are not only expected if the composition of UHE cosmic rays includes protons at the highest energies, but they are also anticipated if hadronic processes occur during the acceleration of these particles. UHE neutrinos are expected to be produced, escape from their astrophysical accelerators without deflection, and reach Earth.

      To date, no UHE neutrinos have been observed. The best limits at these energies come from the Pierre Auger Observatory, which distinguishes highly inclined events from the cosmic ray background to search for neutrinos. The observatory has recently undergone a major detector upgrade, adding scintillators and radio antennas to each water Cherenkov detector. This upgrade enhances the ability to scrutinize shower development features, enabling the identification of neutrino-induced showers regardless of their arrival direction.

      The student will use state-of-the-art simulations to evaluate these shower features and combine them to develop data analyses that will significantly improve Auger's sensitivity to UHE neutrinos.

      Speaker: David Dias (Instituto Superior Técnico)
    • 21
      Lattice Tetraquark Spectroscopy and Disentanglement of Excited States

      Understanding exotic QCD matter from first principles has seemed, for the past 50 years, like an intractable problem. The well-established field of Lattice QCD, invented by K. Wilson in the 70s, has proved to be a powerful tool to study these systems by implementing QCD numerically, via an ab initio, non-perturbative approach.
      In this talk, we will explain the basics of LQCD, see how to simulate tetraquark systems, and understand how these theoretical calculations can help guide future experiments and shed light on our understanding of these strange objects.

      Speaker: Bernardo Picão
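
      For reference, the standard spectroscopy relations behind disentangling excited states on the lattice: a Euclidean two-point correlator decomposes over the spectrum, and the effective mass plateaus at the ground-state energy once excited-state contamination has decayed,

      $$C(t)=\langle O(t)\,O^\dagger(0)\rangle=\sum_n |Z_n|^2 e^{-E_n t}\;\xrightarrow{\,t\to\infty\,}\;|Z_0|^2 e^{-E_0 t},\qquad m_{\rm eff}(t)=\frac{1}{a}\ln\frac{C(t)}{C(t+a)},$$

      with $a$ the lattice spacing; excited states are then separated by solving a generalized eigenvalue problem over a basis of interpolating operators.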
    • 15:30
      Coffee Break
    • 22
      Multi-Higgs Doublet Models and softly-broken symmetries

      The objective of this project is to use a Σ(36) potential, select a specific VEV, and activate one of the soft-breaking parameters that does not preserve the chosen VEV alignment. The goal is to develop a method to minimize the potential and identify its minima. With this new VEV, the mass matrix of the Higgs sector will change, allowing the determination of the masses of the physical scalars.
      Additionally, it would be interesting to investigate possible decays into non-standard Higgs particles.
      This project also aims to provide an introduction to research in particle physics, particularly in MHDM and symmetries.

      Speaker: Gonçalo Barreto
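
      A minimal numerical-minimization sketch in Python; the toy quartic potential below is a hypothetical stand-in for the Σ(36) potential, meant only to illustrate a multi-start workflow for locating competing minima:

      import numpy as np
      from scipy.optimize import minimize

      # Toy two-field quartic potential (bounded below); NOT the Sigma(36) potential.
      def V(phi):
          h1, h2 = phi
          return (-1.0 * h1**2 - 0.5 * h2**2
                  + 0.5 * h1**4 + 0.4 * h2**4 + 0.2 * h1**2 * h2**2)

      # Minimize from several random starts to catch competing minima (VEV candidates).
      starts = np.random.default_rng(1).normal(size=(20, 2))
      best = min((minimize(V, x0) for x0 in starts), key=lambda r: r.fun)
      print("candidate VEV:", best.x, " V_min:", best.fun)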
    • 23
      Real-Time Beam Monitoring Based on Cherenkov Effect for FLASH Radiotherapy

      In 2021, cancer was the second leading cause of death in the European Union, representing more than 20% of the total number of fatalities, according to Eurostat. Even though there are several treatment options, such as chemotherapy, surgery, and radiotherapy, there is no single, fully effective treatment.
      Radiotherapy (RT) is one of the most widely used therapies worldwide, accounting for more than 50% of prescribed treatments. However, it induces several undesirable side effects, such as nausea and skin irritation, which affect the patient's quality of life. But what if we could minimize them?
      In 2014, researchers used high dose rates to deliver radiation in less than a second to mice and concluded that they could achieve the same results in tumour control while sparing the healthy tissue – the so-called FLASH Effect. This resulted in a new and promising field of research, FLASH Radiotherapy, called by some the Holy Grail of cancer treatment. However, the full implementation of FLASH in the clinic poses several challenges for physicists, since it requires unprecedented beam currents to achieve FLASH dose rates and fast beam monitoring. This thesis has the bold purpose of addressing this last limitation using a Cherenkov-based detector, thus bringing FLASH one step closer to transforming cancer treatment and tackling one of modern medicine's most formidable challenges.

      Speaker: Gonçalo Machado Ribeiro (LIP)
    • 24
      The Lunar Ionising Radiation Environment – A Benchmark Model

      Understanding the lunar radiation environment is critical for ensuring astronaut safety and mission success in ESA's upcoming lunar exploration programs such as Artemis and the Lunar Gateway. This study presents the development of the detailed Lunar Energetic Radiation Environment Model (dLEREM), a Geant4-based Monte Carlo simulation tool designed to assess radiation exposure on the Moon. dLEREM builds upon the detailed Mars Energetic Radiation Environment Model (dMEREM) by adapting its computational framework to the lunar surface, considering the absence of an atmosphere, unique regolith interactions, and exposure to Galactic Cosmic Rays (GCRs), Solar Energetic Particles (SEPs), and secondary radiation.

      This presentation aims to outline the challenges of modeling the lunar radiation environment, including data limitations and the complexity of radiation interactions with the lunar surface. It will also describe the development of dLEREM, from adapting dMEREM’s architecture to incorporating lunar-specific radiation sources and validating results with mission data from LRO, Artemis, past Apollo missions, and others. By addressing these challenges, dLEREM aims to provide a comprehensive and user-friendly tool for mission planning and radiation environment assessment.

      Speaker: Bruna Lima (LIP/IST)
    • 25
      Acceleration of the ATLAS Calorimeter Calibration Algorithms Using GPUs

      The ATLAS experiment at the Large Hadron Collider (LHC) processes an extraordinary 60 terabytes of data every second in its endeavor to identify the most fundamental blocks that compose our universe. As the LHC approaches its High Luminosity Upgrade in 2030, the number of collisions per bunch crossing will increase from $\sim$54 to 200. This will significantly increase the data volume and computational demand of the trigger algorithms, which the current infrastructure cannot handle.

      This work aims to use GPUs to accelerate the calorimeter calibration algorithms used in the trigger, in order to address these new demands. The four calibration algorithms—Classification, Hadronic Calibration, Out-of-Cluster Correction, and Dead Material Correction—are being implemented on GPUs in an optimized way, benefiting from the massively parallel architecture to achieve significant computational performance improvements. Preliminary results show an 11x speedup for jet events and a 22x speedup for $t\bar{t}$ events, with 87% of calibrations showing differences of less than 10% compared to previous CPU results.
      These results highlight the potential of GPUs to enable significantly faster algorithms in the trigger system, ensuring that the computational challenges of the High Luminosity phase of the LHC can be effectively addressed.

      Speaker: Bruno Semião
    • 26
      Design optimization for a flat-panel PET scanner with DoI capability

      Cancer is a disease characterized by the uncontrolled proliferation of mutated cells that spread to other tissues and organs. Proton beams are effective in treating deep, radioresistant tumors close to sensitive organs. Positron Emission Tomography (PET) makes it possible to locate beta+ emitters produced in nuclear reactions between the proton and the nuclei of the patient's atoms, making it possible to monitor treatment in real-time.
      The main goal of this project is to fine-tune, through Monte Carlo simulations, the design of a flat-panel PET scanner in terms of spatial resolution and image distortions. The scanner is being developed at the University of Texas at Austin (UT Austin) in collaboration with the Laboratory for Instrumentation and Experimental Particle Physics (LIP), which is the external institution in this project.
      The first objective of the study is to create a simulation model of the scanner (geometry and materials) in the ANTS3 toolkit, an in-house software package for Geant4. The second objective is to establish the scanner's depth-of-interaction (DoI) determination capabilities by testing various sizes of the scintillation crystal and applying different approaches to encapsulating the crystal. The final objective is to carry out a study characterizing the resolution and image distortion of the optimized scanner over the entire field of view.

      Speaker: Marta Simões
    • 27
      Laser-Induced Breakdown Spectroscopy

      Laser-Induced Breakdown Spectroscopy (LIBS) is a powerful analytical technique used to determine the chemical composition of a sample by examining the light emitted from a plasma produced on its surface with an intense, short laser pulse.

      LIBS offers several advantages over other analytical techniques, making it a valuable tool in a wide range of applications. It requires minimal to no sample preparation, can analyse a broad range of materials, and can detect all the elements in the periodic table. With high laser repetition rates, it can perform real-time analysis, which is useful for monitoring applications. Finally, it can also operate in different atmospheric conditions.

      Ultrashort laser pulses are currently being explored as a promising tool to enhance the capabilities of LIBS for precise elemental analysis, compared to conventional nanosecond lasers. These femtosecond or picosecond pulses reduce thermal effects on the sample, offering minimal sample damage, reduced pulse-to-pulse variability, improved spatial resolution, and an increased signal-to-background ratio.

      The objective of my project is to develop a system for determining the atomic composition of various materials using the LIBS technique. My main contribution will focus on investigating the use of ultrashort laser pulses to improve signal analysis, while finding the optimal experimental conditions. In a second step, I will also optimize the design of a spectrometer using the ray-tracing software Zemax. The system will be designed to enable rapid analysis and chemical mapping, and to be competitive in price and compactness for commercial applications.

      Speaker: Gonçalo Almeida (Instituto Superior Técnico)
    • 28
      Wearable Sensors to Evaluate Stress and Enhanced Assisted Rescue Response

      This project focuses on the development of wearable sensors for real-time stress monitoring in firefighters, adapted to the unique challenges they face in high-risk environments. The project is centred on chest-worn devices due to their superior accuracy in recording physiological signals such as ECG and respiratory rate, essential for assessing stress levels. Research investigates optimal designs, materials, and sensor placements to ensure durability, comfort, and compatibility with firefighting gear. Data processing methods are developed and refined to enable accurate real-time insights into firefighters' vital signs. Preliminary results focus on the effectiveness of the prototypes and guide improvements toward robust solutions. The findings aim to support operational safety and health management in firefighting teams.

      Speaker: João Oliveira
    • 29
      Thermal channel implementation at FISSIONIST

      FISSIONIST is a low-power nuclear reactor simulator designed to replicate the operational conditions of the Reactor Português de Investigação (RPI). It will be used for the training of nuclear engineering students by providing a safe and realistic simulation of reactor dynamics.

      In this project, the new cooling circuits module will be developed, incorporating the reading of control commands, the simulation of water flow rate and temperature in critical points of the primary and secondary circuits, and the corresponding detector outputs. A basic simulation of the primary circuit flow has been done using Python, and communication with a Digital-to-Analog Converter has been successfully tested.

      Speaker: António Lourenço (Instituto Superior Técnico)
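
      A minimal lumped-parameter sketch of the kind of primary-circuit temperature simulation described above; the model and all numbers are assumptions for illustration, not the FISSIONIST implementation:

      # Single-node coolant temperature model: core heating vs. heat exchange
      # to the secondary circuit. All parameters are assumed values.
      P_core = 1.0e6        # W, heat deposited by the core
      m_cp   = 4.0e6        # J/K, thermal capacity of the loop coolant
      UA     = 5.0e4        # W/K, heat-exchanger coefficient to the secondary
      T_sec  = 30.0         # degC, secondary-side temperature

      T, dt = 30.0, 0.1
      for step in range(36000):                      # one simulated hour
          dTdt = (P_core - UA * (T - T_sec)) / m_cp
          T += dt * dTdt                             # explicit Euler update
      print(f"steady-state estimate: {T:.1f} degC")  # -> T_sec + P_core/UA = 50 degC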
    • Closing Remarks
  • Wednesday 29 January
    • 30
      Incoherent Diffraction Imaging with hard X-rays

      Incoherent Diffraction Imaging (IDI) with hard X-rays is a promising improvement in lensless imaging techniques, taking advantage of the partial coherence of scattered light to obtain structural insights at the nanoscale. Building on the foundations of Coherent Diffraction Imaging (CDI), IDI uses second-order intensity correlations to reconstruct high-resolution images while overcoming the obstacles posed by coherence limitations. This project focuses on creating an experimental campaign to test IDI with a synchrotron source at SOLEIL's Nanoscopium beamline. This study aims to improve imaging with partially incoherent X-rays by combining numerical simulations and an innovative experimental setup that includes a rotating diffuser and a resolution target. The research results show potential for more accessible, high-resolution imaging techniques for nanostructures, widening applications in material sciences, medicine and biology.

      Speaker: Matilde Fernandes (IST)
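
      For reference, the second-order intensity correlation that IDI exploits, related to the first-order coherence through the Siegert relation ($\beta$ is the speckle contrast):

      $$g^{(2)}(\mathbf{q}_1,\mathbf{q}_2)=\frac{\langle I(\mathbf{q}_1)\,I(\mathbf{q}_2)\rangle}{\langle I(\mathbf{q}_1)\rangle\,\langle I(\mathbf{q}_2)\rangle}=1+\beta\,\bigl|g^{(1)}(\mathbf{q}_1-\mathbf{q}_2)\bigr|^2,$$

      so the modulus of the sample's Fourier spectrum can be recovered from incoherent intensity fluctuations, with phase retrieval completing the image reconstruction.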
    • 31
      Assessment of the potential of radiosensitizers to improve the efficacy of Radiation Therapy

      Radiotherapy is one of the major therapeutic approaches used in cancer treatment, alongside chemotherapy and surgery. Current data show that around 50% of cancer patients undergo radiotherapy for the treatment of local tumors. This methodology is based on the deposition of ionizing energy in tumor cells, commonly through the use of high-energy gamma rays, X-rays, or charged particles, in order to damage these cells and induce tumor reduction/elimination. In this context, the use of radiosensitizing agents may play a crucial role in obtaining enhanced radiobiological effects with improved therapeutic outcomes. For this purpose, several classes of radiosensitizers with diversified mechanisms of action can be studied. Towards this goal, this project will investigate the ability of already-developed radiosensitizers to enhance radiation therapy using the clinical beam at Fundação Champalimaud. The biological and physical studies will focus on the analysis of local energy deposition and its impact on the direct effects of radiation, through quantification of deoxyribonucleic acid (DNA) damage in the form of DNA double-strand breaks (DSBs), and on long-term effects through clonogenic assays. The primary objective of this project is to assess the impact of radiosensitizers on radiotherapy treatment planning by incorporating the obtained biological data into the calculations.

      Speaker: Maria Lopes
    • 32
      Biocompatible flexible photosensors based on 2D materials and crystalline organic semiconductors

      Vision is the sense humans rely on the most in daily activities. It is enabled by natural photoreceptors in the eye that absorb the incident light and transmit the visual inputs to the brain.
      The degeneration of these natural photoreceptors leads to debilitating visual impairments, such as Age-Related Macular Degeneration (AMD), one of the leading causes of legal blindness, affecting millions of people worldwide.
      One promising strategy to restore vision is implantable artificial light sensors. These would mimic the natural photoreceptors by converting light into electric stimuli for the brain. Organic semiconductors, which combine good optoelectronic properties (such as good light detection and photogeneration) with flexibility and biocompatibility, are ideal candidates for such biological applications.
      This project aims to fabricate and characterize photodetectors based on three organic semiconductors, each capable of light sensing in the red, green, or blue portion of the visible spectrum.
      The first step will consist of the selection and purification of the organic materials, growing single crystals by Physical Vapor Transport. Photodiodes will then be fabricated and assembled in a stacked structure or pixel array to allow colour sensing. Finally, optical characterization will be performed to evaluate the performance of the devices.
      Ideally, this research will contribute to the advancement of implantable photosensors, focusing on optimizing photodetection and photocurrent generation, as well as enabling colour sensing.

      Speaker: Adélia Ferreira
    • 33
      Reconfigurable High-Power Vectorial Beams Based on Disordered Optical Metasurfaces

      Light possesses spatiotemporal characteristics, including amplitude, phase, polarization, and frequency, making it a complete energy and information carrier. As such, we must understand how and to what extent we can manipulate light so that we can use it for applications like optical microscopy, laser-induced nuclear fusion, and material processing. Optical metasurfaces are incredibly useful tools for structuring light, but they face major limitations when working with high-power beams. In this work, we will start by studying the effects of metasurfaces with a random profile on scalar beams and then move to the vectorial case. We will propose a novel design that is capable of multifunctional vectorial shaping, which promises to reduce the production cost of fabricating optical metasurfaces adequate for high-power beam shaping. As a result, we hope to make applications that depend on high-power structured light more accessible.

      Speaker: Maria Inês Nunes (Instituto Superior Técnico)
    • 34
      Analog Logic Gates of Structured Light based on nonlinear optical neural networks

      Artificial Intelligence (AI) has revolutionized many areas of information technology, such as data analysis, automation, predictive modeling, and decision-making processes. However, as we continue to generate and process an ever-growing tsunami of data, the global electricity consumption associated with these operations is becoming a significant concern. Photonics, the science of light generation, detection, and manipulation, offers a promising approach to augment or even replace traditional electronic architectures for AI. Photonic systems are renowned for their parallelism, high-dimensionality, and low-power consumption, making them ideal for handling the increasing demands of AI applications. They can help make AI faster, more efficient, and more sustainable, thereby addressing the pressing need for energy-efficient data processing. Despite these advantages, the mutual influence of AI and optics has so far been mainly confined to the realm of deep learning inference in computer vision, microscopy and other visual computing tasks. While these fields have seen significant advancements, there is a vast, unexplored potential for further integration of AI and optics.
      This Master thesis aims to transcend the boundaries of existing optical architectures and propose a novel optical system for physical neural networks. The objective is to implement an analog all-optical system realizing logic operations. This will serve as the basis to advance the frontier of nonlinear structured light. The ultimate goal is to generate optical lifeforms capable of self-organization, evolution, complex interactions, and dynamic adaptation.

      Speaker: Joana Bonito (Instituto Superior Técnico)
    • 10:30
      Coffee Break
    • 35
      Nanoelectronic chip design: from physics principles to circuit design for production

      Conventional computer hardware suffers from the “memory wall” or von Neumann bottleneck, where data transfer between memory arrays and processors leads to significant latency and energy inefficiencies. A promising solution to this issue is in-memory computing hardware, which integrates memory and computation in the same unit. Neuromorphic computing is one such architecture, designed to emulate the structure and function of the brain. This approach requires electronic components that replicate biological elements like neurons and synapses. Memristors—nanoscale resistors with non-volatile, analog conductance states, which can be tuned by an electric bias—are ideal candidates for these roles, acting as synthetic synapses within neuromorphic systems.

      This work focuses on the study of memristors, aiming to provide a solid understanding of the device. Cadence Virtuoso, an industry-standard tool for professional circuit design, is used to model and simulate the memristor at the circuit level. Additionally, Si/Ag-based memristors are fabricated at INESC MN, providing a comprehensive understanding of the nano- and microfabrication process. This foundational knowledge serves as a basis for future research in neuromorphic nanoelectronic circuits.

      Speaker: Vasco Nunes (Instituto Superior Técnico)
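
      A common textbook sketch of memristor dynamics, the linear ion-drift ("HP") model, in Python; parameters are illustrative and this is not the Si/Ag device or the Cadence Virtuoso model used in the project:

      import numpy as np

      R_on, R_off = 100.0, 16e3      # on/off resistances (ohm), illustrative
      D, mu_v = 10e-9, 1e-14         # thickness (m), ion mobility (m^2 V^-1 s^-1)
      w = 0.1 * D                    # initial width of the doped region

      dt = 1e-4
      t = np.arange(0.0, 1.0, dt)
      v = 1.2 * np.sin(2 * np.pi * 1.0 * t)   # 1 Hz sinusoidal drive
      i_hist = []
      for v_k in v:
          R = R_on * (w / D) + R_off * (1.0 - w / D)           # state-dependent resistance
          i = v_k / R
          w = np.clip(w + dt * mu_v * (R_on / D) * i, 0.0, D)  # linear drift of the state
          i_hist.append(i)
      # Plotting i_hist against v traces the pinched hysteresis loop that
      # distinguishes a memristor from a plain resistor.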
    • 36
      Development of a magnetoresistive sensor for fingerprint reading

      Fingerprints are one of the best ways to identify people: they have been used to solve crimes and link suspects to crime scenes for over 100 years, and they remain one of the most important tools in criminal investigation. Currently, fingerprint imaging is done using optical devices; by using magnetic imaging methods instead, additional data can be encoded into the fingerprint and used, for example, for security and anti-falsification purposes.

      For this we decided to use devices that exploit the property of magnetoresistance, specifically tunneling magnetoresistance devices, which provide a magnetoresistance ratio high enough that even normal toner ink, which contains a very low amount of magnetic particles, can be used for this purpose.

      Speaker: Hugo Amaral (Instituto Superior Técnico)
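
      For reference, the standard figure of merit and the angular response such a read-out relies on ($\theta$ is the angle between the free- and reference-layer magnetizations):

      $$\mathrm{TMR}=\frac{R_{\mathrm{AP}}-R_{\mathrm{P}}}{R_{\mathrm{P}}},\qquad G(\theta)=\frac{G_{\mathrm{P}}+G_{\mathrm{AP}}}{2}+\frac{G_{\mathrm{P}}-G_{\mathrm{AP}}}{2}\cos\theta.$$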
    • 37
      Relevance of electronic interactions at quasiperiodicity-driven localization transitions

      In real materials, disorder can induce (Anderson) insulating phases. Interestingly, incommensurate or quasiperiodic modulations in crystals can also profoundly affect the localization properties of the electronic wavefunction. As the quasiperiodic perturbation strength increases, single-particle states can transition from delocalized (plane-wave-like), to critical (fractal-like) and finally to localized. While some properties of these phases resemble those in disordered systems, the metal-insulator transitions are fundamentally different from their uncorrelated counterparts.

      The simplest model that captures the transition between extended and localized phases at a critical quasiperiodic modulation strength is the celebrated Aubry-André model, which features a remarkable duality between localized and delocalized states. Recently, this duality — previously thought to be fine-tuned — was shown to be a generic, though hidden, feature near the localization-delocalization transition. Remarkably, for one-dimensional systems of interacting spinless fermions, interactions were shown to become irrelevant around the transition, with eigenstates following the hidden duality scenario of the non-interacting limit.

      However, the role of spinful interactions in quasiperiodicity-driven transitions remains unexplored. The aim of this project is to explore the effects of electronic interactions near the quasiperiodicity-driven localization-delocalization transition. Specifically, we seek to determine whether such interactions become relevant at the transition, as in higher-dimensional disorder-driven transitions, or remain irrelevant, as in the spinless case.

      Speaker: Mariana Abreu (Instituto Superior Técnico)
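
      For reference, the standard form of the Aubry-André Hamiltonian mentioned above:

      $$H=-t\sum_{j}\bigl(c^{\dagger}_{j}c_{j+1}+\mathrm{h.c.}\bigr)+\lambda\sum_{j}\cos(2\pi\beta j+\phi)\,n_{j},$$

      with $\beta$ irrational: all single-particle states are extended for $\lambda<2t$ and localized for $\lambda>2t$, the self-dual point $\lambda=2t$ marking the transition.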
    • 38
      Characterization of Color Centers in diamond for quantum sensing

      Nitrogen-vacancy (NV) centers in diamond are versatile quantum systems that combine single-photon emission and spin-dependent fluorescence. These properties have made NV centers central to advances in quantum information and quantum sensing. In particular, NV-based magnetometry leverages the Zeeman effect to achieve high spatial resolution and sensitivity through the detection of fluorescence. Furthermore, the chemical inertness and biocompatibility of diamond make the NV-based magnetometer an ideal sensor for both non-biological and biological sensing applications.

      Recently, femtosecond lasers have emerged as a promising tool for creating NV centers in diamond. This project aims to investigate the quality of NV centers produced by illuminating diamond with a 515 nm laser, exploring the impact of laser parameters during both defect creation and the subsequent annealing process. Comparisons may also be made with NV centers fabricated using focused ion beam and high-temperature irradiation techniques.

      The characterization will focus on the fluorescence properties of NV centers under varying fabrication conditions. Techniques will include hyperspectral confocal microscopy for wavelength-resolved studies and fluorescence lifetime imaging microscopy (FLIM) for temporal analysis.

      Moreover, Optically Detected Magnetic Resonance (ODMR) experiments shall be performed to assess their magnetic field sensing capabilities. The magnetic sensitivity of NV centers will be quantified, and the most sensitive configurations will be tested in a proof-of-concept application.

      Speaker: Inês Gonçalves
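
      For reference, the Zeeman-split ODMR resonances that underpin this magnetometry (standard NV ground-state values):

      $$f_{\pm}=D\pm\gamma_{\mathrm{NV}}B_{\parallel},\qquad D\approx 2.87\ \mathrm{GHz},\quad \gamma_{\mathrm{NV}}\approx 28\ \mathrm{GHz/T},$$

      so the field component along the NV axis follows directly from the splitting, $B_{\parallel}=(f_{+}-f_{-})/(2\gamma_{\mathrm{NV}})$.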
    • 39
      Measurement Waiting Time Distributions as Probes of Dynamical Many-Body Quantum Phase Transitions
      Speaker: Francisco Dias
    • 40
      Magnetic sensors with superior performance: material and design optimization for angular sensing

      Magnetoresistive (MR) sensors measure magnetic field intensities with high precision by leveraging resistance changes. Tunneling Magnetoresistance (TMR) sensors, like other MR sensors, rely on the angular difference between the magnetic moments of the Free Layer (FL) and the Reference Layer (RL) to determine resistance. While TMR sensors excel in sensitivity and performance, a key challenge is stabilizing the RL magnetic moment in high-field environments, where deviations can compromise the accuracy by shifting the system's reference.

      This project aims to optimize TMR sensors for angular measurements by addressing three primary challenges: enhancing RL stability, extending the operational (plateau) range, and minimizing deviations from the measured field directions. Using a Stoner-Wohlfarth-based framework, the effects of anisotropies, coupling mechanisms, geometry, and materials on sensor performance were investigated. These insights guided the development of optimized designs, to be implemented later in a Wheatstone bridge configuration to reduce deviations and enhance sensing accuracy. The iterative combination of simulation and experimental validation will advance TMR sensors for superior angular field measurements.

      Speaker: Daniel Mendonça
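
      For reference, the energy density minimized in the Stoner-Wohlfarth framework (standard single-domain form; $\theta$, $\psi$ and $\theta_H$ are the magnetization, easy-axis and applied-field angles):

      $$\frac{E}{V}=K_{u}\sin^{2}(\theta-\psi)-\mu_{0}M_{s}H\cos(\theta-\theta_{H}),$$

      whose minima over $\theta$ give the equilibrium layer orientation, with the anisotropy field $H_{K}=2K_{u}/(\mu_{0}M_{s})$ setting the field scale of the plateau range.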
    • 41
      R-matrix analysis of nuclear-reaction cross-section data

      No theoretical method for the description of nuclear reactions can fully describe the effects inside the nucleus, owing to the complexity of the nucleus and of the nuclear forces acting within it. R-matrix theory does not attempt to describe these forces; instead, it uses quantities related to the internal properties of nuclei as parameters to be determined from experiment. In this work, we will analyse Helium-3-induced nuclear-reaction data from the Ion Beam Laboratory of IST at CTN and from the IBANDL data library using R-matrix theory, with the help of the AZURE2 computer code. The objective is to provide a fit and a theoretical interpretation of the experimental cross-section data, and also to use these data to make new theoretical predictions.

      Speaker: Gonçalo Costa (Instituto Superior Técnico)
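
      For orientation, in the single-level approximation an R-matrix resonance reduces to the familiar Breit-Wigner form ($g_J$ is the spin statistical factor):

      $$\sigma_{ab}(E)=\frac{\pi}{k^{2}}\,g_{J}\,\frac{\Gamma_{a}\Gamma_{b}}{(E-E_{r})^{2}+\Gamma^{2}/4},$$

      with resonance energies and partial widths playing the role of the internal parameters determined from the fit.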
    • 12:24
      Lunch Break
    • 42
      Plasma instabilities in Fireball experiments at CERN

      The main objective of the project is the study of electromagnetic instabilities in plasmas in the ultra-relativistic regime, focusing on the interaction of electron/positron beams with electron and ion plasmas. This study will be conducted using particle-in-cell (PIC) simulations with the OSIRIS numerical code.

      The aim is to reproduce, ab initio, the experimental conditions of the program at CERN and those planned at INFN, and their relevance to extreme phenomena present in astrophysical scenarios, such as gamma-ray bursts (GRBs), relativistic shocks, or magnetogenesis, where kinetic instabilities, such as the Weibel instability, may play a crucial role.

      The initial steps are focused on developing reduced simulations to understand which types of instabilities are most relevant under the experimental conditions to be studied. Subsequently, the project will aim to generalize the simulations from 2D to quasi-3D and 3D, reproducing the conditions of the experiments carried out or under preparation at CERN and INFN.

      Speaker: Manuel Afonso dos Santos Ratola
    • 43
      Development of a LWFA target for fine electron injection control

      Laser-wakefield acceleration (LWFA) is a research field that explores the production of high-quality particle beams by leveraging the interaction between intense laser pulses and plasma. LWFA holds promise for applications such as X-ray generation, where electron beams with high charge, tunable energy, low energy spread, and minimal emittance are essential. Achieving these beam characteristics depends on controlling laser and plasma parameters, particularly the electron plasma density profile. Sharp plasma density gradients have proven effective in generating high-quality beams by promoting localized electron injection with minimal phase variation. This project aims to advance LWFA by developing a novel gas target that precisely manipulates the density profile using a two-chamber design.

      Speaker: Diogo Lemos (IST)
    • 44
      Investigating the EDA H-mode edge

      The global pursuit of sustainable energy has placed nuclear fusion at the forefront of transformative technologies. My thesis will investigate the Enhanced Dα (EDA) H-mode as a promising no-ELM regime for future fusion reactors. This regime features the quasi-coherent mode (QCM), which enhances local transport while avoiding edge-localized modes (ELMs), and offers high energy confinement, minimal impurity accumulation, and compatibility with tungsten walls. By leveraging recent experimental observations at ASDEX Upgrade, this study will extend the EDA H-mode parameter space to the separatrix, possibly unveiling the physics behind this regime, which is not yet fully understood.

      Key objectives include recovering the plasma parameter profiles across the edge region, obtaining empirical scaling laws for separatrix parameters (a fitting sketch follows below), analysing turbulence regimes and transitions, and correlating confinement regime transitions with the separatrix operational space. A comprehensive dataset encompassing plasma current, fueling rates, and heating power will be analysed to derive insights into the physics underlying this high-performance regime.

      The results may help bridge the operational conditions of EDA H-modes in current tokamaks with the challenges posed by future devices such as ITER and SPARC, emphasizing the role of separatrix collisionality and pedestal gradients, while contributing to the broader goal of achieving stable, efficient, and sustainable fusion energy.
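      As an example of the scaling-law step mentioned above, a power law $y = C\,x_1^{a_1} x_2^{a_2}$ can be fitted by linear least squares in log space; the variable names and data below are hypothetical placeholders for the real database quantities.

      import numpy as np

      rng = np.random.default_rng(0)
      i_p = rng.uniform(0.6, 1.2, 200)       # plasma current [MA] (synthetic)
      p_heat = rng.uniform(2.0, 10.0, 200)   # heating power [MW] (synthetic)
      # synthetic "separatrix density" with a known power law plus noise
      n_sep = 2.1 * i_p**0.9 * p_heat**0.3 * rng.lognormal(0, 0.05, 200)

      # log-linear design matrix: [1, log(I_p), log(P_heat)]
      A = np.column_stack([np.ones_like(i_p), np.log(i_p), np.log(p_heat)])
      coef, *_ = np.linalg.lstsq(A, np.log(n_sep), rcond=None)
      print(f"n_sep ~ {np.exp(coef[0]):.2f} * I_p^{coef[1]:.2f} * P_heat^{coef[2]:.2f}")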

      Speaker: João Pedro Nunes
    • 45
      Linearized General Relativity in Hyperboloidal Coordinates

      In this work, we pursue the numerical solution of the Einstein equations to study the behavior of gravitational waves at future null infinity. To this end, we study the hyperboloidal coordinate system and its properties, and develop a numerical framework capable of handling some variations of the wave equation in this coordinate system, obtaining promising results.
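      As a flavour of the coordinates involved (a standard textbook example, not necessarily the choice adopted here): in Minkowski space one may define a hyperboloidal time

      $$\tau = t - \sqrt{1 + r^2},$$

      whose level sets are everywhere spacelike but become asymptotically null as $r \to \infty$, so that they reach future null infinity and gravitational radiation can be read off directly there, without an artificial outer boundary.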

      Speaker: Filipe Ficalho (Instituto Superior Técnico)
    • 46
      Solving the Teukolsky Equation with Spectral Methods

      Efficient and accurate algorithms are needed to simulate the events expected to appear in LISA's frequency range in the coming decades. With this work, we are exploring an implicit, time-symmetric integrator in the context of the Teukolsky equation. In theory, this time-domain solver offers advantages over ordinary integrators such as fourth-order Runge-Kutta (RK4) for long-time evolutions of this equation. Although several articles have already explored these time-symmetric schemes, many aspects still need to be studied in detail in order to prove their usefulness. So far, we have provided convergence tests that had not previously been shown in the context of the Teukolsky equation, along with several ideas to improve the scheme. In future work, we will implement these ideas in our code, as well as an existing version of the integrator that includes particle sources in the spatial domain. This latter point will enable us to study geodesic orbits of test particles.
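      For context, the implicit midpoint rule is the simplest time-symmetric integrator; the toy sketch below (not the Teukolsky solver itself) shows why such schemes are attractive for long evolutions: for a linear system the midpoint rule preserves the oscillator's quadratic energy exactly, where explicit schemes drift.

      import numpy as np

      def implicit_midpoint_step(y, dt, A):
          """One step of y' = A y: solve (I - dt/2 A) y_next = (I + dt/2 A) y."""
          n = len(y)
          lhs = np.eye(n) - 0.5 * dt * A
          rhs = (np.eye(n) + 0.5 * dt * A) @ y
          return np.linalg.solve(lhs, rhs)

      A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # harmonic oscillator, omega = 1
      y, dt = np.array([1.0, 0.0]), 0.1
      for _ in range(100_000):                  # evolve to t = 10^4
          y = implicit_midpoint_step(y, dt, A)
      print("energy at t = 1e4:", 0.5 * (y @ y))  # stays 0.5 to round-off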

      Speaker: Tiago Valente (Instituto Superior Técnico)
    • 47
      White Holes: Formation and Dynamics

      In this project, our first goal is to study the possible behaviours of an FLRW universe, focusing on the recollapse solution, born from a white hole in a Big Bang type of singularity. We start by briefly reviewing the maximal analytic extension of the white hole, followed by a detailed study of the FLRW metric and universe. We obtain the Friedmann equation, including the cosmological constant, which, read as an energy-conservation equation, reveals all the possible behaviours depending on the spacetime geometry (see the equation below).
      To obtain the time-reversed Oppenheimer-Snyder collapse, so that the FLRW universe can be described with the white hole at the Big Bang, we matched the FLRW metric to the Schwarzschild-de Sitter metric (in the more general case). The extrinsic curvatures were shown to agree under certain restrictions, allowing the metrics to be matched and the Penrose diagrams to be constructed for some of the different behaviours in the Schwarzschild case.
      Finally, we move to our second goal, focusing on the recollapse solution, in which the sphere of dust coming out of the white hole reaches a maximum radius and collapses into a black hole. We aim to study the dynamics of this solution, starting from the presence of a scalar field and studying the Klein-Gordon equation, leading to the radial equation for the dynamics. In future work, we will impose initial conditions, consider specific cases, and adapt the study of the dynamics to other fields and gravitational waves, culminating in the study of the quasi-normal modes of this solution.
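      For reference, the Friedmann equation discussed above, for scale factor $a(t)$ with spatial curvature $k$ and cosmological constant $\Lambda$, reads

      $$\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{k}{a^2} + \frac{\Lambda}{3},$$

      and each possible behaviour (eternal expansion, recollapse, and so on) corresponds to a different balance among the three terms on the right-hand side, in direct analogy with an energy-conservation equation.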

      Speaker: Vasco Pires (Instituto Superior Técnico)
    • 48
      Fields On Naked Singularities

      In this work, we studied the Schwarzschild solution to the Einstein equations and reproduced previously obtained results in order to create a baseline for the study of the q-metric. In the q-metric, the Kretschmann curvature scalar and the radial null geodesics on the equatorial plane were studied in order to derive the tortoise-coordinate and null-coordinate transformations.
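      For the Schwarzschild baseline (the q-metric analogues are the subject of this work), the tortoise and null coordinates take the standard form (in units $G = c = 1$)

      $$r_* = r + 2M\ln\!\left(\frac{r}{2M} - 1\right), \qquad u = t - r_*, \qquad v = t + r_*,$$

      with $r_* \to -\infty$ at the horizon $r = 2M$.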

      Speaker: Miguel Parreira (Instituto Superior Técnico)
    • 49
      Quantum Finance: Path Integrals and Hamiltonians for Options Pricing

      This work approaches finance through the lens of quantum mechanics, offering a conceptual and mathematical framework beyond traditional stochastic calculus. Tools such as Hamiltonians and path integrals capture complex financial dynamics and correlations, addressing gaps in current methodologies. This perspective enhances theoretical understanding and provides practical tools for navigating modern financial markets, fostering efficiency and competitiveness.
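      As a flavour of the correspondence (in one common formulation of quantum finance, e.g. Baaquie's): writing the stock price as $S = e^x$, the Black-Scholes equation becomes a Schrödinger-like evolution

      $$\frac{\partial C}{\partial t} = H_{BS}\,C, \qquad H_{BS} = -\frac{\sigma^2}{2}\frac{\partial^2}{\partial x^2} + \left(\frac{\sigma^2}{2} - r\right)\frac{\partial}{\partial x} + r,$$

      so that option prices can be computed as matrix elements of $e^{-\tau H_{BS}}$, or equivalently as path integrals over the log-price $x$.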

      Speaker: Pedro Teigão (ISR)
    • 50
      Characterization of the dosimetry system in terms of Hp(3) for eye lens dose assessment

      This project focuses on the characterization of a dosimetry system in terms of $H_p(3)$ for eye lens dose assessment. The study aims to validate a dedicated thermoluminescent dosimeter (TLD) with a LiF:Mg,Cu,P composition for eye lens monitoring, addressing the growing concern over radiation-induced cataracts among healthcare workers. The project assesses performance under various conditions by analysing the dosimeter's linearity, energy and angular dependence, reproducibility, and detection limits, and by estimating the measurement uncertainty, ensuring accuracy and compliance with international standards.
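      A sketch of how the linearity assessment might look in practice (all readings below are invented placeholders): fit dosimeter reading against delivered dose and report the fitted response with its uncertainty.

      import numpy as np

      dose = np.array([0.1, 0.5, 1.0, 5.0, 10.0])         # delivered Hp(3) [mSv]
      reading = np.array([0.11, 0.52, 0.98, 5.05, 9.84])  # TLD readings [mSv] (invented)

      # straight-line fit with covariance for the uncertainty estimate
      (slope, intercept), cov = np.polyfit(dose, reading, 1, cov=True)
      slope_err = np.sqrt(cov[0, 0])
      print(f"response slope = {slope:.3f} +/- {slope_err:.3f} (ideal: 1.000)")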

      Speaker: Miriam Simões
    • 51
      A comparative study of Nietzsche’s and Heisenberg’s view of understanding, language and knowledge

      Epistemological considerations are indispensable in order to interpret science and its results, and at times may even help guide scientific investigation.
      My thesis will consist of a philosophical exploration of the processes of cognition and conceptualization that sit at the basis of the production of knowledge, through a comparison of Heisenberg's and Nietzsche's perspectives, drawing also on Kant and Wittgenstein.
      This comparison results in a relativization of the Kantian understanding, which becomes grounded in the historical process of the biological evolution of the human species and of the communitarian development of language in social activity.
      Modern physics, especially quantum mechanics, will be a crucial presence in the thesis, as it both motivates the philosophical move undertaken by Heisenberg, and appears in a new light when considered from this perspective of a historically contingent understanding.

      Speaker: Francisco Ferreira
    • 15:30
      Coffee Break
    • 52
      Lagrangian formulation of General Relativity

      This study explores the Lagrangian formulation of General Relativity, a mathematical framework that connects geometry and field theory. Initially developed through the Einstein-Hilbert action, this formulation yields the fundamental Einstein field equations, uniting gravity and matter interactions. I then transitioned to modified gravity theories, emphasizing New Massive Gravity (NMG), a three-dimensional extension incorporating higher-curvature terms. A black hole solution within NMG is examined, revealing a curvature singularity hidden behind an event horizon, analogous to the BTZ black hole in classical GR. This research underscores the Lagrangian approach as a robust tool for advancing theoretical physics and extending GR's applications to broader gravitational phenomena.
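      For reference, the Einstein-Hilbert action that anchors this formulation is

      $$S = \frac{1}{2\kappa}\int d^4x\,\sqrt{-g}\,R + S_{\text{matter}}, \qquad \kappa = \frac{8\pi G}{c^4},$$

      whose variation with respect to the metric $g^{\mu\nu}$ yields the Einstein field equations $G_{\mu\nu} = \kappa\,T_{\mu\nu}$; NMG supplements the (three-dimensional) curvature scalar with specific higher-curvature invariants.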

      Speaker: Raul Santos
    • 53
      A Progressive Approach to the Theory of Relativity

      How can one explain concepts that transformed our view of the universe? That is the mission of this compact adaptation of a larger thesis, designed to teach Relativity in a clear, accessible and dynamic way. The idea is simple: to take the reader on a logical journey in which each concept arises naturally, as a solid step towards the next.

      In this version, we dive into the classic problem of velocity addition (see the formula below) and explore revolutionary ideas such as Time Dilation, Length Contraction and the Lorentz Transformations. More than merely explaining, this work invites the reader to think and to participate actively in the construction of the concepts, guided by a fluid narrative and strategic exercises.

      The full thesis extends this method to General Relativity, creating a unique resource in Portuguese that combines scientific rigour with an innovative didactic approach. Here, knowledge is not imposed but discovered by the reader, step by step, making each concept a personal achievement.
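      As a taste of the velocity-addition problem explored in the text: for collinear velocities $u$ and $v$, the relativistic composition law is

      $$u' = \frac{u + v}{1 + uv/c^2},$$

      which recovers the Galilean sum $u + v$ when $uv \ll c^2$ and never exceeds $c$.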

      Speaker: João Coimbra
    • 54
      Modelling the accretion-ejection flow around the supermassive black hole at the centre of the Milky Way

      With the recently reconstructed image of Sagittarius A* (Sgr A*) from the Event Horizon Telescope Collaboration, interest and curiosity in studying the black hole at the centre of our Galaxy have increased. The entire data collection, including the image, the spectrum and the light polarisation, has motivated many to develop and analyse different numerical models of the low-luminosity accretion-ejection flow. These state-of-the-art models also help in understanding how different physical parameters affect the final simulated images.

      The work done so far has consisted of collecting results from these models and studying the impact of each physical parameter on the generated spectra, as well as synthetic images of the parabolic jet model.

      In addition, simulation codes such as GYOTO have undergone many updates and improvements, in particular in the treatment of light polarisation, which will also help to better constrain our model.

      Finally, the aim of the proposed Master's thesis will be to combine the parabolic jet model already studied with a thick disk model into an analytical model. This model should be able to accurately fit the observed EHT data.

      Speaker: João Santos
    • 55
      Examining the Hubble tension with differences in supernova and host galaxies properties

      The Hubble constant $H_{0}$ corresponds to the present-time ($z=0$) value of the Hubble parameter $H(z)$, giving us the Universe's local expansion rate and providing crucial information about its age and history. When comparing the results for $H_{0}$ from early-time measurements using the standard $\Lambda$CDM cosmological model with those estimated directly from late-time measurements using the Cepheid-based cosmic distance ladder, a persistent 4-6$\sigma$ disagreement between the two estimates emerges. This discrepancy, known as the Hubble tension, represents a significant crisis in cosmology and may indicate the need for new physics or, at the very least, unexpectedly large systematic errors in either or both of the two principal measurements. The accuracy of the distance-ladder method relies on the precise matching of the supernova population properties and their host environments between the calibration and Hubble Flow (HF) galaxies, which is taken for granted in almost all studies using this technique.

      In this project, we investigate and compare the supernova characteristics, as well as the host-galaxy properties, of both the calibration and Hubble Flow samples, concluding that the stellar mass (M) and specific star formation rate (sSFR) distributions of the calibration-sample supernovae might not be representative of those in the Hubble Flow sample. We also extract a subsample of the HF sample that better matches the calibration sample, to understand the impact on the Hubble parameter estimation and to derive a more accurate correction for SNe Ia luminosities. However, we find no noticeable relation between the improved concordance of the property distributions of the SNe and their host galaxies and the estimated correction parameters, and conclude that a subsample of the Hubble Flow sample capable of independently reducing the Hubble tension may not exist.
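      As a sketch of the sample-comparison step (with invented placeholder data; the real analysis uses the published catalogues), a two-sample Kolmogorov-Smirnov test can quantify whether the calibration and Hubble Flow host-property distributions are consistent.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(1)
      # placeholders standing in for the catalogued host stellar masses
      calib_mass = rng.normal(10.2, 0.5, 40)   # log10(M*/Msun), calibration hosts
      hf_mass = rng.normal(10.6, 0.6, 300)     # log10(M*/Msun), Hubble Flow hosts

      stat, p_value = ks_2samp(calib_mass, hf_mass)
      print(f"KS statistic = {stat:.3f}, p = {p_value:.3g}")
      # a small p-value suggests the calibration hosts are not representative
      # of the Hubble Flow host population (repeat for sSFR, SN colour, ...)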

      Speaker: Gonçalo Martins
    • 56
      Advanced Digital Encoder Design for Next-Generation Smart RFID Tags in Textile Industry Digital Product Passports

      The fast-fashion industry’s environmental impact necessitates innovative solutions to improve sustainability and transparency in textile production. This project focuses on developing advanced digital encoders for next-generation smart RFID tags to facilitate the integration of Digital Product Passports (DPPs) in the textile industry. DPPs provide detailed product lifecycle data, enabling compliance with European Union regulations and promoting environmentally responsible consumer decisions. The proposed RFID tags utilize cutting-edge materials and manufacturing techniques, including screen printing and heat transfer, to achieve durability, flexibility and reliable performance under industrial conditions. Emphasizing eco-friendly practices, this project explores the use of recyclable materials, advanced conductive inks and scalable production processes. Collaborative efforts with INESC ensure real-world applicability, with comprehensive testing validating performance in challenging environmental and mechanical stress scenarios. This research contributes to sustainable textile innovation by addressing key challenges in RFID integration and paving the way for enhanced traceability and sustainability in global fashion markets.

      Speaker: Maria Colaço (Instituto Superior Técnico)
    • 57
      Descriptive and Predictive Modeling of Chaos in Cold Atom Physics using Explainable Deep Learning

      Ultracold atom clouds have garnered significant attention for their intricate dynamics, which may exhibit chaotic behavior and still lack a comprehensive explanatory framework. This work explores these dynamics in a $^{85}$Rb cold atom cloud through the integration of explainable deep learning techniques, motivated by the recent meteoric rise of artificial intelligence (AI). These techniques are applied to image-series data obtained in a laser-cooling experiment in a magneto-optical trap. This study aims to uncover low-dimensional structures and chaotic behaviors within the system by employing convolutional autoencoders and dimensionality-reduction methods, along with explainability strategies. Preliminary results reveal evidence of a potential phase transition between stable and turbulent regimes, with further analysis providing critical insights into system parameters. This interdisciplinary approach highlights the transformative potential of explainable AI in elucidating complex phenomena in cold atom physics.
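      A minimal convolutional-autoencoder sketch of the kind described (in PyTorch; the architecture, latent dimension, and 64x64 image size are illustrative assumptions, not the actual network used):

      import torch
      import torch.nn as nn

      class ConvAutoencoder(nn.Module):
          def __init__(self, latent_dim=8):
              super().__init__()
              self.encoder = nn.Sequential(
                  nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
                  nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
                  nn.Flatten(),
                  nn.Linear(32 * 16 * 16, latent_dim),                   # bottleneck
              )
              self.decoder = nn.Sequential(
                  nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
                  nn.Unflatten(1, (32, 16, 16)),
                  nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # 16 -> 32
                  nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid(),  # 32 -> 64
              )

          def forward(self, x):
              z = self.encoder(x)            # low-dimensional representation
              return self.decoder(z), z

      model = ConvAutoencoder()
      frames = torch.rand(4, 1, 64, 64)      # dummy batch of cloud images
      recon, latent = model(frames)
      print(recon.shape, latent.shape)       # (4, 1, 64, 64), (4, 8)

      The latent vectors, rather than the raw frames, are then the natural input for the dimensionality-reduction and explainability analyses.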

      Speaker: Mr João Rodrigues (IST PIC2 Website Team)
    • 58
      Probing Unification Scenarios with Big Bang Nucleosynthesis

      Big Bang Nucleosynthesis (BBN) is an observational cornerstone of the Hot Big Bang model and a sensitive probe of physics beyond it. Although some analytic approximations can be made, a fully consistent analysis must be done numerically, starting with the classic code by Kawano and leading to the recently developed PRyMordial, a publicly available Python code. One example of physics beyond the Standard Model to which BBN is sensitive is the class of Grand Unified Theory (GUT) models. A self-consistent perturbative analysis of the effects of variations in nature's fundamental constants, which are unavoidable in a broad class of GUT models, has recently been developed. The specific goal of this PIC project is to implement this perturbative approach in the PRyMordial code. This will enable the subsequent use of the extended code to obtain constraints on GUT models from current observations, and also detailed forecasts of the improvements expected with next-generation astrophysical facilities, such as the ANDES spectrograph for the ELT.
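      Schematically, the perturbative approach expresses the response of each primordial abundance $Y$ to small variations of the fundamental constants $\alpha_i$ through numerically computed sensitivity coefficients $c_i$,

      $$\frac{\Delta Y}{Y} \simeq \sum_i c_i\,\frac{\Delta\alpha_i}{\alpha_i},$$

      so that, once the $c_i$ are tabulated with the extended code, the observed abundances directly constrain the coupled variations predicted by each GUT model.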

      Speaker: Iuna Dreyer
    • Closing Remarks