Materials Learning Algorithms

MALA (Materials Learning Algorithms) is a data-driven framework that uses machine learning to accelerate electronic structure calculations. Its purpose is to construct neural-network surrogate models that bypass computationally expensive steps in state-of-the-art density functional simulations.



What Materials Learning Algorithms can do for you


MALA (Materials Learning Algorithms) is a software package for building machine-learning (ML) models that replace computationally expensive density functional theory (DFT) calculations. DFT is the most widely used method for simulating materials at the quantum level and predicting their properties.

While DFT is very efficient compared to other quantum mechanical simulation techniques, its computational cost still grows cubically with both the number of atoms and the simulated temperature, making large-scale simulations at arbitrary temperatures computationally challenging.
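To make the scaling argument concrete, here is a toy cost model (illustrative constants only, not taken from MALA) comparing a cubically scaling DFT step with an idealized, linearly scaling surrogate:

```python
# Toy cost model for the scaling argument above. The constants and the
# linear scaling of the surrogate are illustrative assumptions, not
# measured MALA performance.

def dft_cost(n_atoms: int) -> float:
    """Relative cost of a cubically scaling DFT calculation."""
    return float(n_atoms) ** 3

def surrogate_cost(n_atoms: int) -> float:
    """Relative cost of an idealized, linearly scaling ML surrogate."""
    return float(n_atoms)

# Doubling the system size multiplies the DFT cost by 2**3 = 8,
# while the surrogate cost merely doubles.
print(dft_cost(256) / dft_cost(128))              # -> 8.0
print(surrogate_cost(256) / surrogate_cost(128))  # -> 2.0
```

This gap is what makes large-scale, finite-temperature simulations a natural target for surrogate models.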

MALA is at the forefront of efforts leveraging ML to accelerate DFT calculations. It allows users to build ML models that give full access to the electronic structure and observables such as the total free energy of the system. MALA scales well with increasing system size and temperature.

MALA is designed as a modular, open-source Python package. It enables users to perform the entire modeling workflow with only a few lines of code.
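As an illustration of this workflow, here is a schematic training script. It is a sketch only: the class and method names (`mala.Parameters`, `mala.DataHandler`, `mala.Network`, `mala.Trainer`) follow the pattern used in MALA's documented examples, and the snapshot file names and paths are placeholders; consult the examples/ folder for the authoritative API.

```python
import mala

# Central parameter object controlling data handling, network, and training
# (placeholder settings, for illustration only).
parameters = mala.Parameters()
parameters.network.layer_activations = ["ReLU"]
parameters.running.max_number_epochs = 100

# Register precomputed snapshots for training
# ("snapshot0.*.npy" and "/path/to/data" are placeholders).
data_handler = mala.DataHandler(parameters)
data_handler.add_snapshot("snapshot0.in.npy", "/path/to/data",
                          "snapshot0.out.npy", "/path/to/data", "tr")
data_handler.prepare_data()

# Build the surrogate network and train it.
network = mala.Network(parameters)
trainer = mala.Trainer(parameters, network, data_handler)
trainer.train_network()
```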

MALA is jointly developed by Sandia National Laboratories (SNL) and the Center for Advanced Systems Understanding (CASUS), an institute of the Helmholtz-Zentrum Dresden-Rossendorf (HZDR). See Contributing for how to contribute code to the repository.

This repository is structured as follows:

├── examples : contains useful examples to get you started with the package
├── install : contains scripts for setting up this package on your machine
├── mala : the source code itself
├── test : test scripts used during development; will hold tests for CI in the future
└── docs : Sphinx documentation folder


Installation

Please refer to Installation of MALA.


Usage

You can familiarize yourself with the package by running the examples in the examples/ folder.




Scientific Supervision

  • Attila Cangi (CASUS)
  • Siva Rajamanickam (SNL)

Core Developers

  • Austin Ellis (ORNL)
  • Lenz Fiedler (CASUS)
  • Daniel Kotik (CASUS)
  • Normand Modine (SNL)
  • Vladyslav Oles (ORNL)
  • Gabriel Popoola (SNL)
  • Aidan Thompson (SNL)
  • Steve Schmerler (HZDR)
  • Adam Stephens (SNL)


Contributors
  • Sneha Verma (CASUS)
  • Parvez Mohammed (CASUS)
  • Nils Hoffmann (CASUS)
  • Omar Faruk (CASUS)
  • Somashekhar Kulkarni (CASUS)

Citing MALA

If you publish work which uses or mentions MALA, please cite the following paper:

J. A. Ellis, L. Fiedler, G. A. Popoola, N. A. Modine, J. A. Stephens,
A. P. Thompson, A. Cangi, and S. Rajamanickam, "Accelerating
finite-temperature Kohn-Sham density functional theory with deep
neural networks," Phys. Rev. B 104, 035120 (2021).

alongside this repository.


Programming languages

  • Python 96%
  • Fortran 4%

License

  • BSD-3-Clause
