We welcome the new year 2026 with a fresh lineup of speakers for the Atomistic Modeling Seminar. Get ready for exciting insights into materials science and atomistic modeling!
Tejs Vegge is Professor and Head of the Section for Autonomous Materials Discovery at the Department of Energy Conversion and Storage at the Technical University of Denmark. Our second speaker is an expert in computational and AI-accelerated discovery of clean energy materials, next-generation battery materials, production and storage of hydrogen and ammonia, and nanoparticle electrocatalysts for sustainable fuel production.
His group uses AI-driven simulation, machine-learning potentials, and autonomous synthesis platforms to design nanoparticles, map electrocatalyst surface behavior, and accelerate solid–liquid interface studies with near ab initio accuracy. In his presentation, “AI-accelerated computational discovery of nanoparticles, electrocatalysts, and solid-liquid interface kinetics”, Vegge will also propose a unifying, information-theoretic view of how different experimental and computational “observables” represent materials, and introduce richer physics-based representations to improve prediction and simulation efficiency.
Date: Tuesday, January 27, 2026, 3:30 pm
Location: MIBE Lecture Hall
Abstract:
Establishing large-scale infrastructures for autonomous materials discovery and synthesis is becoming a key lever for accelerating the development of advanced energy materials, from electrocatalysts to electrochemical interfaces, using self-driving labs (SDLs) and materials acceleration platforms (MAPs) [1]. A central challenge is the computational prediction and controlled synthesis of nanoparticles with targeted atomic structures. We build a simulation-to-synthesis pathway that links (i) global structure exploration of nanoalloys using symmetry constraints and machine-learned interatomic potentials [2], (ii) rapid mapping of alloy surface phase diagrams and adsorbate-driven restructuring using Bayesian evolutionary multitasking [3], and (iii) transferable atomistic simulation at scale via foundation models for materials chemistry, enabling data-efficient fine-tuning to new conditions [4]. These advances culminate in ScatterLab, an autonomous approach that designs synthesis protocols by matching real-time total scattering and pair distribution function (PDF) data to simulated target patterns, without requiring prior synthesis knowledge, and demonstrates on-demand synthesis of structurally distinct gold nanoparticles [5]. Together, the workflow illustrates how atomistic simulation and machine learning can be coupled to close the loop between “target structure” and “reproducible protocol”.
Understanding dynamic processes at solid–liquid interfaces in electrochemical devices is equally crucial for performance and durability. Ab initio molecular dynamics (AIMD) provides the accuracy needed to describe bond-making and bond-breaking at interfaces, but its cost limits statistical sampling and access to long time- and length-scales. We therefore develop and deploy machine-learning potentials and workflows that preserve ab initio fidelity while enabling mechanistic inference and rare-event sampling. For CO2 electroreduction at Au–water interfaces, we connect explicit-solvent interfacial structure to reactivity by resolving cation-coordinated inner-sphere pathways and by showing how engineering local cation conditions modulates activity and selectivity [6]. To access activated kinetics directly, we combine graph neural network potentials with enhanced sampling to accelerate metadynamics of the oxygen reduction reaction at Au–water interfaces [7]. More broadly, we demonstrate how ensembles of machine-learning potentials enable nanosecond-scale sampling of complex interfacial water–hydroxyl structures on Pt(111), yielding robust finite-temperature structure and energetics [8].
Finally, we introduce and discuss a meta-question: which experiments or simulations provide the most informative “views” of a material for a downstream prediction task? Recent ML results sometimes show competitive performance from “simple” materials representations (e.g., chemical composition) that omit explicit structure, challenging standard physics intuition. A tomographic interpretation of structure–property relations provides a unifying framework for defining material representations, material properties, and the material itself, and for using information-theoretic concepts to explain how different observables contribute complementary information [9]. In parallel, we highlight how learning richer physics objects can expand the representation toolbox, exemplified by models that predict three-dimensional charge densities on Cartesian grids with floating orbitals, offering new intermediate representations for property prediction and simulation acceleration [10].
1. Stier et al., Adv. Mater., 2024, 36, 2407791, doi.org/10.1002/adma.202407791
2. Han, Barcaro, Fortunelli, Lysgaard, Vegge, Hansen, npj Comput. Mater., 2022, 8, 121
3. Han, Lysgaard, Vegge, Hansen, npj Comput. Mater., 2023, 9, 123, doi.org/10.1038/s41524-023-01087-4
4. Batatia et al., J. Chem. Phys., 2025, doi.org/10.1063/5.0297006
5. Anker et al., ACS Nano, 2026, doi.org/10.48550/arXiv.2505.13571
6. Qin, Vegge, Hansen, J. Am. Chem. Soc., 2023, 145, 1897-1905, and ACS Catal., 2024, 14, 8168-8175
7. Yang, Bhowmik, Vegge, Hansen, Chem. Sci., 2023, 13, 3913
8. Mikkelsen, Kristoffersen, Schiøtz, Vegge, Hansen, Jacobsen, Phys. Chem. Chem. Phys., 2022, 24, 9885
9. Ortega-Ochoa, Aspuru-Guzik, Vegge, Buonassisi, arXiv, 2025, doi.org/10.48550/arXiv.2501.18163
10. Elsborg, Thiede, Aspuru-Guzik, Vegge, Bhowmik, arXiv, 2025, doi.org/10.48550/arXiv.2503.08305