Scientific foundation models extend the breakthroughs of large language models and computer vision into science. Trained on massive unlabeled scientific data — experiments, simulations, papers, instrument logs — they learn generalizable, transferable knowledge across chemistry, materials, biology, and beyond without task-specific supervision. Coupled with AI agents that can plan experiments, run tools, and interpret results, they accelerate discovery from hypothesis generation to validation.
This effort started as an MICDE Initiative on Scientific Foundation Models in May 2023 and grew into the Center for Foundation Models & AI Agents for Science in September 2024. We are supported by Los Alamos National Laboratory, NSF NAIRR, NVIDIA, OpenAI, and Microsoft.
According to GPT-5, the phrase ‘Scientific Foundation Models’ first appeared on the internet in July 2023. If that is accurate, we were among the first to introduce the term.
We are a multidisciplinary team combining expertise in physics, chemistry, materials science, and computer science. Our group brings together PIs, postdocs, and students from Michigan, along with collaborators across the globe, to build foundation models that are scalable and practically useful. We work closely with domain experts to ensure that our models respect scientific constraints, remain interpretable, and integrate directly into experimental and simulation workflows.
At SciFM, we envision a future where scientific discovery is accelerated by foundation models and AI — where computations don't just support research, but actively generate new knowledge and deliver direct solutions to today's most pressing challenges.