arXiv:2508.16067

Training a Foundation Model for Materials on a Budget

Published on Aug 22 · Submitted by teddykoker on Aug 28

Abstract

Nequix, a compact E(3)-equivariant potential, achieves high accuracy on materials modeling benchmarks at a lower training cost and with faster inference than other methods.

AI-generated summary

Foundation models for materials modeling are advancing quickly, but their training remains expensive, often placing state-of-the-art methods out of reach for many research groups. We introduce Nequix, a compact E(3)-equivariant potential that pairs a simplified NequIP design with modern training practices, including equivariant root-mean-square layer normalization and the Muon optimizer, to retain accuracy while substantially reducing compute requirements. Built in JAX, Nequix has 700K parameters and was trained in 500 A100-GPU hours. On the Matbench-Discovery and MDR Phonon benchmarks, Nequix ranks third overall while requiring less than one quarter of the training cost of most other methods, and it delivers an order-of-magnitude faster inference speed than the current top-ranked model. We release model weights and a fully reproducible codebase at https://github.com/atomicarchitects/nequix.
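
Two of the training ingredients named above are easy to miss. Equivariant root-mean-square layer normalization rescales features using only rotation-invariant statistics, so normalization does not break E(3) equivariance. Below is a minimal JAX sketch; the per-irrep block layout of shape (channels, 2l + 1) and the absence of learned per-channel scales are assumptions for illustration, not the paper's exact formulation.

```python
import jax.numpy as jnp

def equivariant_rms_norm(irrep_blocks, eps=1e-6):
    """Sketch of an equivariance-preserving RMS layer norm.

    `irrep_blocks` is a list of arrays, one per irrep, each of shape
    (channels, 2l + 1). Rescaling each block by a scalar built from
    rotation-invariant channel norms keeps the output transforming
    exactly like the input under E(3).
    """
    out = []
    for x in irrep_blocks:
        norms = jnp.linalg.norm(x, axis=-1)         # (channels,) invariant norms
        rms = jnp.sqrt(jnp.mean(norms ** 2) + eps)  # one invariant scale per block
        out.append(x / rms)
    return out
```

The Muon optimizer replaces the raw momentum update for 2-D weight matrices with an approximately orthogonalized one, computed by a few Newton-Schulz iterations. The sketch below shows only that orthogonalization step, with coefficients from the public Muon reference implementation; momentum tracking and learning-rate handling are omitted.

```python
import jax.numpy as jnp

def newton_schulz_orthogonalize(g, steps=5, eps=1e-7):
    """Map a 2-D update matrix toward the nearest semi-orthogonal
    matrix via a quintic Newton-Schulz iteration (Muon's core step)."""
    a, b, c = 3.4445, -4.7750, 2.0315   # quintic iteration coefficients
    x = g / (jnp.linalg.norm(g) + eps)  # bound the spectral norm by 1
    transposed = x.shape[0] > x.shape[1]
    if transposed:                      # iterate on the wide orientation
        x = x.T
    for _ in range(steps):
        m = x @ x.T
        x = a * x + (b * m + c * m @ m) @ x
    return x.T if transposed else x
```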

Community

Paper author and submitter:

Introducing Nequix MP, a new foundation model for materials.

Example of using Nequix to simulate the dissolution of a NaCl nanocrystal in water:
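
A minimal sketch of such a run using ASE molecular dynamics is below. The `NequixCalculator` import path, checkpoint filename, and input geometry are illustrative assumptions; see the repository linked above for the actual interface.

```python
import ase.io
from ase import units
from ase.md.langevin import Langevin

# Hypothetical import: the actual package layout of
# github.com/atomicarchitects/nequix may differ.
from nequix.calculator import NequixCalculator

# A pre-built NaCl nanocrystal solvated in water (preparation not shown).
atoms = ase.io.read("nacl_in_water.xyz")
atoms.calc = NequixCalculator("nequix-mp.ckpt")  # checkpoint name is illustrative

# Langevin dynamics at 300 K with a 1 fs timestep.
dyn = Langevin(atoms, timestep=1.0 * units.fs,
               temperature_K=300, friction=0.01 / units.fs)
dyn.attach(lambda: ase.io.write("trajectory.xyz", atoms, append=True), interval=100)
dyn.run(10_000)  # 10 ps of dynamics
```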

