Posted in OpenLAM · Word count: 653 · Reading time ≈ 2 mins.
On the journey toward developing a Large Atomic Model (LAM), the core Deep Potential development team has launched the OpenLAM initiative for the community. OpenLAM’s slogan is "Conquer the Periodic Table!" The project aims to create an open-source ecosystem centered on microscale large models, providing new infrastructure for microscopic scientific research and driving transformative advancements in microscale industrial design across fields such as materials, energy, and biopharmaceuticals.
Posted in DeepFlame · Word count: 1.5k · Reading time ≈ 5 mins.
DeepFlame is an open-source combustion fluid dynamics platform developed for the AI for Science era [1-3], aimed at overcoming the longstanding challenges of applying traditional Computational Fluid Dynamics (CFD) in the field of combustion. Since its release, DeepFlame has garnered significant interest and attention from both academia and industry, attracting a group of outstanding developers and users. This ongoing support has provided continuous momentum for DeepFlame's development and has been a crucial driving force in its application to real-world scenarios.
In recent years, research on aerosol and spray detonation propulsion with liquid fuels has seen a resurgence, and supersonic combustion, such as detonation in gas-liquid two-phase systems, has been attracting increasing attention. Responding to these trends, the DeepFlame team has coupled an Euler-Lagrange model into the high-speed flow solver dfHighSpeedFoam and the low-speed flow solver dfLowMachFoam, building on the OpenFOAM open-source library. This enables both solvers to simulate two-phase reactive flows and expands the application scenarios of DeepFlame.
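To make the Euler-Lagrange idea concrete, below is a minimal conceptual sketch in Python rather than DeepFlame's actual OpenFOAM/C++ implementation: Lagrangian droplets are tracked as discrete particles that relax toward the local Eulerian gas velocity through a drag law, and the gas field receives the opposite momentum as a source term. The constant drag relaxation time, the mass loading, and the nearest-cell interpolation are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D Euler-Lagrange sketch: Lagrangian droplets coupled to an
# Eulerian gas velocity field through a simple drag law (illustrative only).
nx, L, dt, nsteps = 100, 1.0, 1e-4, 1000
dx = L / nx
u_gas = np.ones(nx)                 # Eulerian gas velocity on the grid [m/s]
x_p = np.random.rand(50) * L        # Lagrangian droplet positions
u_p = np.zeros(50)                  # droplet velocities
tau_p = 1e-3                        # drag relaxation time (assumed constant)
mass_ratio = 0.1                    # droplet-to-gas mass loading (assumed)

for _ in range(nsteps):
    # Interpolate gas velocity to droplet positions (nearest cell).
    cells = np.clip((x_p / dx).astype(int), 0, nx - 1)
    u_seen = u_gas[cells]

    # Lagrangian step: droplets relax toward the local gas velocity.
    drag = (u_seen - u_p) / tau_p
    u_p += dt * drag
    x_p = (x_p + dt * u_p) % L      # periodic domain

    # Two-way coupling: deposit the opposite momentum source on the grid.
    source = np.zeros(nx)
    np.add.at(source, cells, -mass_ratio * drag * dt)
    u_gas += source
```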
Posted in DeePTB · Word count: 1.8k · Reading time ≈ 6 mins.
In 2023, the team at the AI for Science Institute, Beijing introduced v1 of the DeePTB method, posting it on arXiv and bringing it into the DeepModeling community. After nearly a year of rigorous peer review, it was officially published on August 8, 2024, in Nature Communications under the title "Deep learning tight-binding approach for large-scale electronic simulations at finite temperatures with ab initio accuracy" [1] (DOI: 10.1038/s41467-024-51006-4).
The v1 version of DeePTB focuses on a deep learning-based method for constructing tight-binding (TB) model Hamiltonians. Based on the Slater-Koster TB parameterization, it builds first-principles-equivalent electronic models with a minimal basis set. By incorporating the localized chemical environment of atoms and bonds into the TB parameters, DeePTB predicts TB Hamiltonians with near-DFT accuracy across a range of key material systems. Integrated with software such as DeePMD-kit and TBPLaS, it enables the calculation and simulation of electronic structure properties and optoelectronic responses in finite-temperature ensembles for systems of up to millions of atoms. This groundbreaking advancement has garnered widespread attention in the academic community and was ultimately published in Nature Communications. For more technical details on DeePTB v1, interested readers can refer to the article in Nat. Commun. 15, 6772 (2024).
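As a rough, purely illustrative sketch of the central idea, and not DeePTB's actual architecture or training procedure, the snippet below pairs a Slater-Koster-style two-center hopping with a small neural network that corrects it as a function of a local-environment descriptor; the exponential baseline, the descriptor, and the network size are all assumptions chosen for brevity.

```python
import numpy as np
import torch

# Schematic environment-dependent tight-binding hopping (illustrative only):
# a Slater-Koster-style two-center baseline is corrected by a small MLP
# that reads a crude descriptor of the bond's local chemical environment.

def sk_hopping_baseline(r, v0=-1.0, r0=2.5, q=2.0):
    """Distance-dependent two-center integral, e.g. V_ss_sigma(r) (assumed form)."""
    return v0 * np.exp(-q * (r - r0) / r0)

class EnvCorrection(torch.nn.Module):
    """Tiny MLP mapping an environment descriptor to a hopping correction."""
    def __init__(self, n_desc=8):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(n_desc, 16), torch.nn.Tanh(),
            torch.nn.Linear(16, 1),
        )

    def forward(self, descriptor):
        return self.net(descriptor).squeeze(-1)

# Example: one bond of length 2.6 Angstrom with a placeholder descriptor.
r = 2.6
descriptor = torch.randn(1, 8)          # stand-in local-environment features
model = EnvCorrection()
t = sk_hopping_baseline(r) * (1.0 + model(descriptor).item())
print(f"environment-corrected hopping: {t:.4f} eV")
```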
Posted in OpenLAM · Word count: 1.1k · Reading time ≈ 4 mins.
On the journey toward developing a Large Atomic Model (LAM), the core Deep Potential development team has launched the OpenLAM initiative for the community. OpenLAM’s slogan is "Conquer the Periodic Table!" The project aims to create an open-source ecosystem centered on microscale large models, providing new infrastructure for microscopic scientific research and driving transformative advancements in microscale industrial design across fields such as materials, energy, and biopharmaceuticals.
Posted in Uni-Mol · Word count: 1.1k · Reading time ≈ 4 mins.
Pre-trained models, which extract representative information from large-scale unlabeled data and are then fine-tuned with supervised learning on small labeled downstream tasks, are sweeping through the AI field and have become the de facto solution in many application scenarios. In drug design, however, there is still no consensus on the best way to represent molecules, and in materials chemistry the prediction of molecular properties is equally important. Mainstream molecular pre-training models typically start from one-dimensional sequences or two-dimensional graph structures, yet molecules are inherently three-dimensional objects. Directly building pre-trained models from three-dimensional information to obtain better molecular representations has therefore become an important and meaningful problem. To further promote research on molecular representation and pre-trained models, Uni-Mol will join the DeepModeling community and work with community developers to advance a three-dimensional molecular representation pre-training framework.
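As a minimal illustration of what "starting from three-dimensional information" means at the data level, and not Uni-Mol's actual featurization pipeline, the sketch below uses RDKit to embed a small molecule in 3D and assembles the atom types, Cartesian coordinates, and pairwise-distance matrix that a 3D pre-training model could consume.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

# Illustrative 3-D molecular input (not Uni-Mol's actual featurization):
# atom types, Cartesian coordinates, and a pairwise distance matrix.
mol = Chem.AddHs(Chem.MolFromSmiles("CCO"))   # ethanol with explicit hydrogens
AllChem.EmbedMolecule(mol, randomSeed=0)      # generate a 3-D conformer
AllChem.MMFFOptimizeMolecule(mol)             # quick force-field relaxation

atom_types = [a.GetSymbol() for a in mol.GetAtoms()]
coords = mol.GetConformer().GetPositions()    # (n_atoms, 3) array in Angstrom
dist_matrix = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

print(atom_types)
print(dist_matrix.shape)                      # (n_atoms, n_atoms)
```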
Posted in OpenLAM · Word count: 721 · Reading time ≈ 3 mins.
On the journey toward developing a Large Atomic Model (LAM), the core Deep Potential development team has launched the OpenLAM initiative for the community. OpenLAM’s slogan is "Conquer the Periodic Table!" The project aims to create an open-source ecosystem centered on microscale large models, providing new infrastructure for microscopic scientific research and driving transformative advancements in microscale industrial design across fields such as materials, energy, and biopharmaceuticals.
Posted in Dflow · Word count: 734 · Reading time ≈ 3 mins.
From software ecosystems for electronic structure calculations and molecular dynamics, to the systematic evaluation of large models such as OpenLAM, to scientific and industrial R&D problems such as biological simulation, drug design, and molecular property prediction, a whole series of AI4Science scientific computing software and models is advancing rapidly. This progress is closely tied to better research infrastructure, and the Dflow project is a key component of that infrastructure.
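For readers unfamiliar with Dflow, here is a minimal sketch of a one-step workflow based on the Python OP interface in the deepmodeling/dflow repository; the exact class names and signatures should be checked against the current release, and the container image and backend configuration are assumptions.

```python
from dflow import Step, Workflow
from dflow.python import OP, OPIO, OPIOSign, PythonOPTemplate


class Hello(OP):
    """A trivial operation: build a greeting string."""

    @classmethod
    def get_input_sign(cls):
        return OPIOSign({"name": str})

    @classmethod
    def get_output_sign(cls):
        return OPIOSign({"msg": str})

    def execute(self, op_in: OPIO) -> OPIO:
        return OPIO({"msg": f"Hello, {op_in['name']}!"})


# Assemble and submit a one-step workflow; this assumes a configured
# Argo (or Bohrium) backend and that the container image is available.
step = Step(
    name="hello",
    template=PythonOPTemplate(Hello, image="python:3.10"),
    parameters={"name": "OpenLAM"},
)
wf = Workflow(name="hello-dflow")
wf.add(step)
wf.submit()
```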
Posted in OpenLAM · Word count: 711 · Reading time ≈ 3 mins.
The slogan for OpenLAM is "Conquer the Periodic Table!" We hope to provide a new infrastructure for microscale scientific research and drive the transformation of microscale industrial design in fields such as materials, energy, and biopharmaceuticals by establishing an open-source ecosystem around large microscale models. Relevant models, data, and workflows will be consolidated around the AIS Square; related software development will take place in the DeepModeling open-source community. At the same time, we welcome open interaction from different communities in model development, data sharing, evaluation, and testing.
Posted in OpenLAM · Word count: 1.7k · Reading time ≈ 6 mins.
The slogan for OpenLAM is "Conquer the Periodic Table!" We hope to provide a new infrastructure for microscale scientific research and drive the transformation of microscale industrial design in fields such as materials, energy, and biopharmaceuticals by establishing an open-source ecosystem around large microscale models. Relevant models, data, and workflows will be consolidated around the AIS Square; related software development will take place in the DeepModeling open-source community. At the same time, we welcome open interaction from different communities in model development, data sharing, evaluation, and testing.
Posted in OpenLAM · Word count: 625 · Reading time ≈ 2 mins.
Peter Thiel once said, "We wanted flying cars, instead we got 140 characters (Twitter)." Over the past decade, we have made great strides at the bit level (internet), but progress at the atomic level (cutting-edge technology) has been relatively slow.
The accumulation of linguistic data has propelled the development of machine learning and ultimately led to the emergence of Large Language Models (LLMs). With the push from AI, progress at the atomic level is also accelerating. Methods like Deep Potential, by learning quantum mechanical data, have increased the space-time scale of microscopic simulations by several orders of magnitude and have made significant progress in fields like drug design, material design, and chemical engineering.
The accumulation of quantum mechanical data is gradually covering the entire periodic table, and the Deep Potential team has also begun putting the DPA pre-trained model into practice. By analogy with the progress of LLMs, we are on the eve of the emergence of a general Large Atomic Model (LAM). At the same time, we believe that open source and openness will play an increasingly important role in the development of LAM.
Against this backdrop, the core developer team of Deep Potential is launching the OpenLAM Initiative to the community. This plan is still in the draft stage and is set to officially start on January 1, 2024. We warmly and openly welcome opinions and support from all parties.
The slogan for OpenLAM is "Conquer the Periodic Table!" We hope to provide a new infrastructure for microscale scientific research and drive the transformation of microscale industrial design in fields such as materials, energy, and biopharmaceuticals by establishing an open-source ecosystem around large microscale models. Relevant models, data, and workflows will be consolidated around the AIS Square; related software development will take place in the DeepModeling open-source community. At the same time, we welcome open interaction from different communities in model development, data sharing, evaluation, and testing.
OpenLAM's goals for the next three years are: in 2024, to effectively cover the periodic table with first-principles data and achieve a universal property-learning capability; in 2025, to incorporate large-scale experimental characterization data and literature data to achieve a universal cross-modal capability; and in 2026, to realize target-oriented, universal atomic-scale generation and planning capabilities. Ultimately, within 5-10 years, we aim to achieve "Large Atom Embodied Intelligence" for atomic-scale intelligent scientific discovery and synthesis design.
OpenLAM's specific plans for 2024 include:
Model Update and Evaluation Report Release:
- Starting from January 1, 2024, this effort will be driven by the Deep Potential team, and all LAM developers are welcome to participate.
- A major model version update will be released every three months, potentially covering model architecture, related data, training strategies, and evaluation criteria.
AIS Cup Competition:
- Initiated by the Deep Potential team and supported by the Bohrium Cloud Platform, the competition starts in March 2024 and concludes at the end of the year.
- Its goal is to promote a benchmarking system focused on several application-oriented metrics.
Domain Data Contribution:
- We seek collaboration with domain developers to establish "LAM-ready" datasets for pre-training and evaluation.
- Domain datasets for iterative training of the latest models will be updated every three months.
Domain Application and Evaluation Workflow Contribution:
- Domain application and evaluation workflows will be updated and released every three months.
Education and Training:
- A series of education and training events is planned for LAM developers, domain developers, and users to encourage advancement in the field.
How to Contact Us:
- Direct discussions are encouraged in the DeepModeling community.
- For more complex inquiries, please contact the project leads, Han Wang (王涵, wang_han@iapcm.ac.cn) and Linfeng Zhang (张林峰, zhanglf@aisi.ac.cn). Here's to the new future of Science!