DeepModeling

Define the future of scientific computing together

In 2023, the team at the AI for Science Institute, Beijing introduced the v1 version of the DeePTB method, released it as a preprint on arXiv, and brought the project into the DeepModeling community. After nearly a year of rigorous peer review, the work was officially published on August 8, 2024, in the international academic journal Nature Communications under the title "Deep learning tight-binding approach for large-scale electronic simulations at finite temperatures with ab initio accuracy" [1], DOI: 10.1038/s41467-024-51006-4.

The v1 version of DeePTB focuses on a deep learning-based method for constructing tight-binding (TB) model Hamiltonians. Building on the Slater-Koster TB parameterization, it constructs electronic models equivalent to first-principles results using a minimal basis set. By incorporating the local chemical environment of atoms and bonds into the TB parameters, DeePTB predicts TB Hamiltonians with near-DFT accuracy across a range of key material systems. In combination with software such as DeePMD-kit and TBPLaS, it enables the calculation and simulation of electronic-structure properties and photoelectric responses in large-scale, finite-temperature systems of up to millions of atoms. This groundbreaking advancement has garnered widespread attention in the academic community and was ultimately published in Nature Communications. For more technical details on this version of DeePTB, interested readers can refer to the article in Nat Commun 15, 6772 (2024).
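
To make the idea concrete, the sketch below shows, in plain JAX, what an environment-dependent Slater-Koster-style hopping could look like: a bare two-center integral is modulated by a small neural correction built from neighboring-atom distances. This is a minimal illustration only; the function names, descriptor, and network are assumptions for illustration and do not reproduce DeePTB's actual parameterization or API.

```python
# Conceptual sketch (not the DeePTB implementation or API): an
# environment-dependent Slater-Koster-style hopping integral.
import jax
import jax.numpy as jnp

def t_ss_sigma(r, t0=-1.0, r0=2.5, q=2.0):
    """Bare two-center ss-sigma integral with a power-law distance decay (toy form)."""
    return t0 * (r0 / r) ** q

def env_descriptor(neighbor_distances, r_cut=6.0):
    """Smoothly weighted sums over neighbor distances as a toy local-environment feature."""
    w = 0.5 * (jnp.cos(jnp.pi * neighbor_distances / r_cut) + 1.0)
    w = jnp.where(neighbor_distances < r_cut, w, 0.0)
    return jnp.stack([jnp.sum(w), jnp.sum(w * neighbor_distances)])

def corrected_hopping(params, r_bond, neighbor_distances):
    """SK hopping rescaled by a one-hidden-layer network of the environment descriptor."""
    W1, b1, w2, b2 = params
    h = jnp.tanh(W1 @ env_descriptor(neighbor_distances) + b1)
    correction = w2 @ h + b2          # small learned shift around zero
    return t_ss_sigma(r_bond) * (1.0 + correction)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (0.1 * jax.random.normal(k1, (8, 2)), jnp.zeros(8),
          0.1 * jax.random.normal(k2, (8,)), 0.0)
print(corrected_hopping(params, 2.4, jnp.array([2.4, 2.5, 3.9])))
```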

Read more »

On February 13, 2024, DP Technology published a cover article in JACS Au titled "Node-Aligned Graph-to-Graph: Elevating Template-free Deep Learning Approaches in Single-Step Retrosynthesis." This study developed a Transformer-based Node-Aligned Graph-to-Graph (NAG2G) model, significantly improving the accuracy of single-step retrosynthesis prediction.

The NAG2G model integrates 2D molecular graphs and 3D conformational information, achieving atom mapping between products and reactants through node alignment. This approach overcomes the limitations of traditional template-based methods.

This groundbreaking achievement provides a powerful tool for chemical synthesis design, advancing the field of retrosynthesis and setting a new standard for single-step prediction methodologies.

Read more »

On the journey toward developing a Large Atomic Model (LAM), the core Deep Potential development team has launched the OpenLAM initiative for the community. OpenLAM’s slogan is "Conquer the Periodic Table!" The project aims to create an open-source ecosystem centered on microscale large models, providing new infrastructure for microscopic scientific research and driving transformative advancements in microscale industrial design across fields such as materials, energy, and biopharmaceuticals.

Read more »

On June 17, 2024, researchers Xi Cheng and Liuqing Wen from the Shanghai Institute of Materia Medica, Chinese Academy of Sciences, in collaboration with Dingyan Wang from Lingang Laboratory, published a study titled "Highly accurate carbohydrate-binding site prediction with DeepGlycanSite" in Nature Communications [1]. The study introduces DeepGlycanSite, a deep learning-based algorithm for predicting carbohydrate-binding sites on protein structures. By leveraging Uni-Mol, DeepGlycanSite identifies carbohydrate-binding sites with high accuracy, providing a powerful tool for studying carbohydrate-protein interactions.

Read more »

Pre-trained models are sweeping the AI field: they extract representative information from large-scale unlabeled data and are then fine-tuned with supervised learning on small-scale labeled downstream tasks, making them the de facto solution in many application scenarios. In drug design, there is still no consensus on the best way to represent molecules, and in materials chemistry the prediction of molecular properties is equally important. Mainstream molecular pre-training models typically start from one-dimensional sequences or two-dimensional graph structures, yet molecules are inherently three-dimensional objects. Building pre-trained models directly from three-dimensional information to obtain better molecular representations has therefore become an important and meaningful problem. To further promote research on molecular representation and pre-trained models, Uni-Mol is joining the DeepModeling community to work with community developers on advancing a three-dimensional molecular representation pre-training framework.

Read more »

From software ecosystems for electronic structure calculations and molecular dynamics, to the systematic evaluation of large models such as OpenLAM, to scientific and industrial R&D problems such as biological simulation, drug design, and molecular property prediction, a whole series of AI4Science scientific computing software and models is advancing rapidly. This progress rests on better research infrastructure, and the Dflow project is a key component of it.

Read more »

The tight-binding model based on second quantization is a widely used theoretical model in condensed matter physics. In this model:

  • Atoms in a lattice are represented as discrete points with a specific number of electrons.
  • Each electron occupies a corresponding atomic orbital.
  • Using creation and annihilation operators, electron transitions between atomic orbitals are described in the second quantization framework.
  • The Hamiltonian comprises (see the compact form below):
    • Hopping (transition) terms between atomic orbitals.
    • On-site energy levels of the orbitals.
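
As a concrete reference point, a generic Hamiltonian of this type can be written in standard textbook notation (not TBPLaS-specific conventions) as

$$ H = \sum_i \varepsilon_i \, c_i^{\dagger} c_i + \sum_{i \neq j} t_{ij} \, c_i^{\dagger} c_j , $$

where $c_i^{\dagger}$ and $c_i$ create and annihilate an electron in orbital $i$, $\varepsilon_i$ are the on-site orbital energies, and $t_{ij}$ are the hopping (transition) amplitudes between orbitals $i$ and $j$.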

Project on GitHub: https://github.com/deepmodeling/tbplas

Read more »

The slogan for OpenLAM is "Conquer the Periodic Table!" We hope to provide new infrastructure for microscale scientific research and to drive the transformation of microscale industrial design in fields such as materials, energy, and biopharmaceuticals by building an open-source ecosystem around large microscale models. Relevant models, data, and workflows will be consolidated on AIS Square, while related software development will take place in the DeepModeling open-source community. At the same time, we welcome open interaction with different communities in model development, data sharing, evaluation, and testing.

See AIS Square for more details.

Read more »

"The integration of machine learning and physical modeling is revolutionizing the paradigm of scientific research. People aiming to push the boundaries of science and solve challenging problems through computational modeling are coming together in unprecedented ways." Recently, the DeepModeling open-source community has welcomed a new member in the field of macro-scale computation. To further advance the development of the JAX-FEM project, a differentiable finite element method library, JAX-FEM will join the DeepModeling community. Together with developers and users in the community, it aims to expand the frontiers of finite element methods in the AI4Science era.

Community project homepage:
https://github.com/deepmodeling/jax-fem
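
To illustrate what "differentiable FEM" means in practice, the minimal sketch below assembles and solves a 1D Poisson problem in plain JAX and differentiates a quantity of interest with respect to the material parameter. It deliberately does not use the JAX-FEM API; the problem setup and function names are illustrative assumptions written only with standard jax.numpy calls.

```python
# Minimal sketch of a differentiable FEM solve (plain JAX, not the jax-fem API):
# -d/dx(k du/dx) = f on [0, 1], u(0) = u(1) = 0, linear elements, uniform mesh.
import jax
import jax.numpy as jnp

N = 32          # number of elements
h = 1.0 / N     # element size
f = 1.0         # constant source term

def solve_fem(k):
    """Assemble the stiffness matrix and solve K u = F for the interior nodes."""
    n = N - 1                                   # interior degrees of freedom
    main = 2.0 * k / h * jnp.ones(n)
    off = -k / h * jnp.ones(n - 1)
    K = jnp.diag(main) + jnp.diag(off, 1) + jnp.diag(off, -1)
    F = f * h * jnp.ones(n)                     # lumped load vector
    return jnp.linalg.solve(K, F)

def midpoint_displacement(k):
    """Scalar quantity of interest: solution value at the domain midpoint."""
    u = solve_fem(k)
    return u[(N - 1) // 2]

# Gradients flow through assembly and the linear solve via jax.grad.
dudk = jax.grad(midpoint_displacement)(1.0)
print(midpoint_displacement(1.0), dudk)   # analytically: u(0.5) = f/(8k), du/dk = -f/(8k^2)
```

Because the assembly and the linear solve are ordinary JAX operations, jax.grad propagates derivatives straight through the solver; this is the core capability that a differentiable FEM library generalizes to real meshes, material models, and boundary conditions.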

Read more »