DeepModeling

Define the future of scientific computing together

DeepFlame is an open-source combustion fluid dynamics platform developed for the AI for Science era [1-3], aimed at overcoming the longstanding challenges of applying traditional Computational Fluid Dynamics (CFD) in the field of combustion. Since its release, DeepFlame has garnered significant interest and attention from both academia and industry, attracting a group of outstanding developers and users. This ongoing support has provided continuous momentum for DeepFlame's development and has been a crucial driving force in its application to real-world scenarios.

In recent years, research on aerosol and spray detonation propulsion with liquid fuels has seen a resurgence, and supersonic combustion regimes such as detonation in gas-liquid two-phase systems have been attracting increasing attention. In response to these trends, the DeepFlame team has built on the OpenFOAM open-source library to couple an Euler-Lagrange model into both the high-speed flow solver dfHighSpeedFoam and the low-Mach-number flow solver dfLowMachFoam. Both solvers can now simulate two-phase reactive flows, expanding the range of scenarios to which DeepFlame can be applied.
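
As a rough sketch of what Euler-Lagrange coupling typically implies for the carrier phase (a generic textbook form, not necessarily the exact equation set implemented in either solver), the gas-phase conservation equations pick up source terms accumulated from the Lagrangian droplet parcels in each cell, e.g.

    \frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\mathbf{u}) = S_{\mathrm{mass}}, \qquad
    \frac{\partial (\rho\mathbf{u})}{\partial t} + \nabla\cdot(\rho\mathbf{u}\otimes\mathbf{u}) = -\nabla p + \nabla\cdot\boldsymbol{\tau} + S_{\mathrm{mom}},

where S_mass and S_mom represent the mass (evaporation) and momentum (drag) exchange with the droplets, and analogous source terms appear in the species and energy equations.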

Read more »

On August 19, 2024, Shuqi Lu and Zhifeng Gao from DP Technology, in collaboration with Professor Di He from Peking University, published a research article titled "Data-driven quantum chemical property prediction leveraging 3D conformations with Uni-Mol+" in Nature Communications. This study introduces Uni-Mol+, a deep learning algorithm that innovatively utilizes neural networks to iteratively optimize initial 3D molecular conformations, enabling precise prediction of quantum chemical properties. By progressively approximating Density Functional Theory (DFT) equilibrium conformations, Uni-Mol+ significantly enhances prediction accuracy, providing a powerful tool for high-throughput screening and new material design.
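
To make the core idea concrete, the sketch below shows the kind of iterative coordinate-refinement loop described in the paper, with a hypothetical update_fn standing in for the trained network (this is only a minimal illustration, not the Uni-Mol+ API):

    import numpy as np

    def refine_conformation(coords, update_fn, n_iters=4):
        """Iteratively push an initial 3D conformation toward a predicted equilibrium geometry."""
        for _ in range(n_iters):
            # the network predicts a correction to the current coordinates
            coords = coords + update_fn(coords)
        return coords

    # toy stand-in for the learned update: pull atoms gently toward their centroid
    toy_update = lambda x: 0.1 * (x.mean(axis=0) - x)

    initial_coords = np.random.rand(8, 3)   # e.g. a cheap force-field-style starting geometry
    refined_coords = refine_conformation(initial_coords, toy_update)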

Read more »

In 2023, the AI for Science Institute, Beijing team introduced v1 of the DeePTB method, which was posted on arXiv, and the project joined the DeepModeling community. After nearly a year of rigorous peer review, it was officially published on August 8, 2024, in the international academic journal Nature Communications under the title "Deep learning tight-binding approach for large-scale electronic simulations at finite temperatures with ab initio accuracy" [1], DOI: 10.1038/s41467-024-51006-4.

The v1 version of DeePTB focuses on developing a deep learning-based method for constructing tight-binding (TB) model Hamiltonians. Based on the Slater-Koster TB parameterization, it builds first-principles-equivalent electronic models using a minimal basis set. By incorporating the local chemical environment of atoms/bonds into the TB parameters, DeePTB achieves TB Hamiltonian predictions with near-DFT accuracy across a range of key material systems. By integrating with software like DeePMD-kit and TBPLaS, it enables the calculation and simulation of electronic structure properties and photoelectric responses in large-scale systems of up to millions of atoms in finite-temperature ensembles. This groundbreaking advancement has garnered widespread attention in the academic community and was ultimately published in Nature Communications. For more technical details on this version of DeePTB, interested readers can refer to the article in Nat. Commun. 15, 6772 (2024).
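
A minimal sketch of the downstream step, under the assumption that a trained model has already supplied on-site energies and hopping integrals for a toy one-dimensional chain (the values and interface here are hypothetical, not DeePTB's actual API):

    import numpy as np

    # hypothetical network outputs for a 6-site chain with one orbital per site
    onsite = np.full(6, -0.5)        # predicted on-site energies (eV)
    hopping = np.full(5, -1.0)       # predicted nearest-neighbour hoppings (eV)

    # assemble the tight-binding Hamiltonian: H[i, i] = eps_i, H[i, i+1] = H[i+1, i] = t_i
    H = np.diag(onsite) + np.diag(hopping, 1) + np.diag(hopping, -1)

    # diagonalize to obtain the electronic eigenvalues of the finite chain
    eigenvalues = np.linalg.eigvalsh(H)
    print(eigenvalues)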

Read more »

On February 13, 2024, DP Technology published a cover article in JACS Au titled "Node-Aligned Graph-to-Graph: Elevating Template-free Deep Learning Approaches in Single-Step Retrosynthesis." This study developed a Transformer-based Node-Aligned Graph-to-Graph (NAG2G) model, significantly improving the accuracy of single-step retrosynthesis prediction.

The NAG2G model integrates 2D molecular graph and 3D conformation information, achieving atom mapping between products and reactants through node alignment. This approach overcomes the limitations of traditional template-based methods.
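
As a toy illustration of what node alignment provides (a hypothetical example, not the NAG2G code): once each product atom is aligned with a reactant atom, the atom mapping is a permutation that relates the two graphs' atom orderings:

    import numpy as np

    # toy product graph (3 atoms) and its predicted alignment to reactant atom indices
    product_adj = np.array([[0, 1, 0],
                            [1, 0, 1],
                            [0, 1, 0]])
    atom_mapping = {0: 2, 1: 0, 2: 1}   # product atom i -> reactant atom atom_mapping[i]

    # express the product graph in the reactant's atom ordering via the alignment
    perm = np.array([atom_mapping[i] for i in range(len(atom_mapping))])
    aligned_adj = np.zeros_like(product_adj)
    aligned_adj[np.ix_(perm, perm)] = product_adj
    print(aligned_adj)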

This groundbreaking achievement provides a powerful tool for chemical synthesis design, advancing the field of retrosynthesis and setting a new standard for single-step prediction methodologies.

Read more »

On the journey toward developing a Large Atomic Model (LAM), the core Deep Potential development team has launched the OpenLAM initiative for the community. OpenLAM’s slogan is "Conquer the Periodic Table!" The project aims to create an open-source ecosystem centered on microscale large models, providing new infrastructure for microscopic scientific research and driving transformative advancements in microscale industrial design across fields such as materials, energy, and biopharmaceuticals.

Read more »

On June 17, 2024, researchers Xi Cheng and Liuqing Wen from the Shanghai Institute of Materia Medica, Chinese Academy of Sciences, in collaboration with Dingyan Wang from Lingang Laboratory, published a study titled "Highly accurate carbohydrate-binding site prediction with DeepGlycanSite" in Nature Communications [1]. This research introduces DeepGlycanSite, a deep learning-based algorithm for predicting carbohydrate-binding sites on protein structures with high precision. By leveraging Uni-Mol, DeepGlycanSite achieves exceptional accuracy in identifying carbohydrate-binding sites, providing a powerful tool for studying carbohydrate-protein interactions.

Read more »

Pre-trained models are sweeping through the AI field: by extracting representative information from large-scale unlabeled data and then performing supervised learning on small-scale labeled downstream tasks, they have become the de facto solution in many application scenarios. In drug design, there is still no consensus on the "best way to represent molecules," and in materials chemistry, predicting molecular properties is equally important. Mainstream molecular pre-training models typically start from one-dimensional sequences or two-dimensional graph structures, yet molecular structures inherently live in three-dimensional space. Building pre-trained models directly from three-dimensional information to obtain better molecular representations has therefore become an important and meaningful problem. To further promote research on molecular representation and pre-trained models, Uni-Mol will join the DeepModeling community to work with community developers on advancing a three-dimensional molecular representation pre-training framework.
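
A minimal sketch of why 3D information is a natural starting point (illustrative only, not Uni-Mol's actual featurization): given atom types and coordinates, a rotation- and translation-invariant pairwise-distance matrix can be fed to a model alongside the atom identities:

    import numpy as np

    # toy molecule: atom types and 3D coordinates (angstrom)
    atom_types = ["C", "O", "H", "H"]
    coords = np.array([[ 0.00,  0.00, 0.00],
                       [ 1.22,  0.00, 0.00],
                       [-0.55,  0.94, 0.00],
                       [-0.55, -0.94, 0.00]])

    # pairwise distances are invariant to rotation and translation, so they make a
    # convenient 3D-aware input representation for a pre-training model
    diff = coords[:, None, :] - coords[None, :, :]
    dist_matrix = np.linalg.norm(diff, axis=-1)
    print(dist_matrix.round(2))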

Read more »

From software ecosystems for electronic structure calculations and molecular dynamics, to the systematic evaluation of large models such as OpenLAM, and on to scientific and industrial R&D problems such as biological simulation, drug design, and molecular property prediction, a series of AI4Science scientific computing software packages and models are advancing rapidly. This progress is closely linked to better research infrastructure, and the Dflow project is a key component of that infrastructure.
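
As a flavor of what Dflow provides, the sketch below builds a one-step workflow with dflow's Python API (modeled on the project's public hello-world style usage; it assumes a reachable Argo Workflows/Kubernetes backend, and the workflow and step names are illustrative):

    from dflow import Workflow, Step, ShellOPTemplate

    # a container-based operation: what to run and in which image
    hello_templ = ShellOPTemplate(
        name="hello",
        image="alpine:latest",
        script="echo 'hello from dflow'",
    )

    # wrap the template as a workflow step and submit the workflow
    step = Step(name="hello-step", template=hello_templ)
    wf = Workflow(name="hello-dflow")
    wf.add(step)
    wf.submit()   # requires a configured Argo Workflows server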

Read more »

The tight-binding model based on second quantization is a widely used theoretical model in condensed matter physics. In this model:

  • Atoms in a lattice are represented as discrete points with a specific number of electrons.
  • Each electron occupies a corresponding atomic orbital.
  • Creation and annihilation operators describe electron transitions (hopping) between atomic orbitals in the second-quantization framework.
  • The Hamiltonian comprises (see the explicit form below):
    • Hopping (transition) terms between atomic orbitals.
    • On-site energy levels of the orbitals.
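
Written out explicitly, a generic second-quantized tight-binding Hamiltonian of this kind takes the form

    H = \sum_{i} \varepsilon_{i}\, c_{i}^{\dagger} c_{i} + \sum_{i \neq j} t_{ij}\, c_{i}^{\dagger} c_{j},

where \varepsilon_i are the on-site orbital energies, t_{ij} are the hopping amplitudes between orbitals i and j (with t_{ij} = t_{ji}^{*} for Hermiticity), and c_i^{\dagger}, c_i create and annihilate an electron in orbital i.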

Project on GitHub: https://github.com/deepmodeling/tbplas

Read more »