Opportunity
Ab initio calculations based on density functional theory (DFT) are widely applied to simulate the physical properties of materials, proteins, and drugs. However, the high computational cost of this method makes it difficult to simulate large systems with more than 100 atoms or long molecular dynamics trajectories. Classical molecular dynamics (MD) is very efficient for studying the properties of large systems over long time scales. However, the interatomic potentials used in MD are usually obtained by fitting to physical and chemical properties from limited experimental or theoretical data, so their accuracy in predicting energies and forces is rather limited. An accurate interatomic potential can therefore greatly improve the efficiency of large-scale simulations, guiding the search for new materials and the design of new functionalities in existing materials.
Technology
In this invention, a deep-learning framework based on atomic graph attention networks (AGAT) is proposed. The AGAT model is trained to reproduce the energies and atomic forces from accurate DFT simulations, and the trained model is then deployed to simulate much larger systems over much longer time scales at DFT accuracy. Compared with conventional machine-learning interatomic potentials, the AGAT model provides an end-to-end solution for accurately and efficiently training on, and predicting, the energy, forces, and stress of large atomic systems.
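The minimal sketch below illustrates the general idea behind such a graph-attention interatomic potential: per-atom features are updated by attention over neighboring atoms, a per-atom readout is summed into a total energy, and forces follow as the negative gradient of that energy with respect to atomic positions. It is written in PyTorch for illustration only; the layer sizes, graph construction, and all class and function names are assumptions and do not reflect the released AGAT code.

```python
# Illustrative sketch (not the released AGAT code): an attention layer over an
# atomic neighbor graph, an energy readout, and forces from -dE/dR.
import torch
import torch.nn as nn

class AtomicGraphAttention(nn.Module):
    """One attention update over a fixed neighbor list (edges j -> i)."""
    def __init__(self, dim):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)   # feature transform
        self.attn = nn.Linear(2 * dim + 1, 1)      # score from (h_i, h_j, r_ij)

    def forward(self, h, edge_index, r_ij):
        src, dst = edge_index                      # neighbor j -> center i
        wh = self.w(h)
        hj, hi = wh[src], wh[dst]
        score = torch.exp(torch.tanh(self.attn(torch.cat([hi, hj, r_ij], dim=-1))))
        # Normalize attention weights over each atom's neighbors.
        norm = torch.zeros(h.size(0), 1, device=h.device).index_add_(0, dst, score)
        msg = (score / norm[dst]) * hj
        return h + torch.zeros_like(h).index_add_(0, dst, msg)   # residual update

class EnergyForceModel(nn.Module):
    def __init__(self, n_species=8, dim=64, n_layers=3):
        super().__init__()
        self.embed = nn.Embedding(n_species, dim)
        self.layers = nn.ModuleList(AtomicGraphAttention(dim) for _ in range(n_layers))
        self.readout = nn.Sequential(nn.Linear(dim, dim), nn.SiLU(), nn.Linear(dim, 1))

    def forward(self, species, positions, edge_index):
        positions.requires_grad_(True)
        src, dst = edge_index
        r_ij = (positions[src] - positions[dst]).norm(dim=-1, keepdim=True)
        h = self.embed(species)
        for layer in self.layers:
            h = layer(h, edge_index, r_ij)
        energy = self.readout(h).sum()             # total energy as a sum of atomic terms
        forces = -torch.autograd.grad(energy, positions, create_graph=True)[0]
        return energy, forces
```

In practice a model of this kind is fitted with a joint loss on the DFT energies and forces of the training structures, which is what ties the predicted forces to the underlying potential-energy surface.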
Advantages
- AGAT achieves near-DFT accuracy in predicting energies and forces.
- AGAT is extendable to new material systems.
- AGAT provides an end-to-end solution for obtaining the energies and forces of any given system.
Applications
- Large-scale molecular simulations (see the sketch after this list)
- Efficient electrocatalyst design
- Accurate prediction of mechanical properties
- Efficient optimization of crystal structure models
- Prediction of defect properties in materials
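As an illustration of the first application, a trained potential of this kind can be wrapped as a calculator in the Atomic Simulation Environment (ASE) and used to drive molecular dynamics of systems far larger than DFT can handle. The ASE calls below are standard; the `TrainedPotentialCalculator` class and `predict_energy_and_forces` function are hypothetical placeholders for the deployed model, not the package's actual interface.

```python
# Hypothetical deployment sketch: run MD with a trained potential via ASE.
import numpy as np
from ase import units
from ase.build import bulk
from ase.calculators.calculator import Calculator, all_changes
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution
from ase.md.verlet import VelocityVerlet

def predict_energy_and_forces(atoms):
    """Placeholder for the trained model; returns (energy [eV], forces [eV/A])."""
    return 0.0, np.zeros((len(atoms), 3))

class TrainedPotentialCalculator(Calculator):
    implemented_properties = ["energy", "forces"]

    def calculate(self, atoms=None, properties=("energy",), system_changes=all_changes):
        super().calculate(atoms, properties, system_changes)
        energy, forces = predict_energy_and_forces(self.atoms)
        self.results["energy"] = energy
        self.results["forces"] = forces

atoms = bulk("Cu", "fcc", a=3.6, cubic=True) * (4, 4, 4)   # 256-atom supercell
atoms.calc = TrainedPotentialCalculator()
MaxwellBoltzmannDistribution(atoms, temperature_K=300)      # initialize velocities
dyn = VelocityVerlet(atoms, timestep=1.0 * units.fs)
dyn.run(1000)                                               # 1 ps of MD
```

The same pattern applies to the other applications: once the potential reproduces DFT energies, forces, and stresses, it can feed structure optimization, mechanical-property, and defect calculations at a fraction of the ab initio cost.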