Commit e7c6f8a

update readme

1 parent 25fc111 commit e7c6f8a

1 file changed

README.md: 42 additions & 0 deletions
@@ -19,10 +19,52 @@
`torch-molecule` is a package under active development that facilitates molecular discovery through deep learning, featuring a user-friendly, `sklearn`-style interface. It includes model checkpoints for efficient deployment and benchmarking across a range of molecular tasks. Currently, the package focuses on three main components:

1. **Predictive Models**. Done: GREA, SGIR, IRM, GIN/GCN with virtual nodes, DIR. TODO: SMILES-based LSTM/Transformers, and more.
2. **Generative Models**. Done: Graph DiT, GraphGA, DiGress. TODO: GDSS, and more.
3. **Representation Models**. Done: MoAMa, AttrMasking, ContextPred, EdgePred. Many pretrained models from Hugging Face are also available. TODO: checkpoints, and more.
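
As a rough illustration of the `sklearn`-style interface described above, the sketch below shows how a predictive model such as GREA might be trained and applied. The class name `GREAMolecularPredictor` and the exact `fit`/`predict` signatures are assumptions for illustration only, not the confirmed API; consult the package documentation for actual usage.

```python
# Hypothetical sketch of the sklearn-style workflow; the class name and
# method signatures below are illustrative assumptions, not the confirmed
# torch-molecule API.
from torch_molecule import GREAMolecularPredictor  # assumed import path

# Molecules are given as SMILES strings with scalar property labels.
train_smiles = ["CCO", "c1ccccc1", "CC(=O)O"]
train_labels = [0.32, 1.15, 0.87]

model = GREAMolecularPredictor()         # sklearn-style estimator
model.fit(train_smiles, train_labels)    # train on SMILES/label pairs
preds = model.predict(["CCN", "CCCl"])   # predict properties for new molecules
print(preds)
```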

The tables below list the currently supported models in each category, along with their references.

### Predictive Models

| Model | Reference |
|----------------------|---------------------|
| SGIR | [Semi-Supervised Graph Imbalanced Regression. KDD 2023](https://dl.acm.org/doi/10.1145/3580305.3599497) |
| GREA | [Graph Rationalization with Environment-based Augmentations. KDD 2022](https://dl.acm.org/doi/abs/10.1145/3534678.3539347) |
| DIR | [Discovering Invariant Rationales for Graph Neural Networks. ICLR 2022](https://arxiv.org/abs/2201.12872) |
| SSR | [SizeShiftReg: a Regularization Method for Improving Size-Generalization in Graph Neural Networks. NeurIPS 2022](https://arxiv.org/abs/2206.07096) |
| IRM | [Invariant Risk Minimization](https://arxiv.org/abs/1907.02893) |
| RPGNN | [Relational Pooling for Graph Representations. ICML 2019](https://arxiv.org/abs/1903.02541) |
| GNNs | [Graph Convolutional Networks. ICLR 2017](https://arxiv.org/abs/1609.02907) and [Graph Isomorphism Network. ICLR 2019](https://arxiv.org/abs/1810.00826) |
| Transformer (SMILES) | [Attention Is All You Need. NeurIPS 2017](https://arxiv.org/abs/1706.03762), applied to SMILES strings |
| LSTM (SMILES) | [Long Short-Term Memory. Neural Computation 1997](https://ieeexplore.ieee.org/abstract/document/6795963), applied to SMILES strings |
### Generative Models

| Model | Reference |
|------------|---------------------|
| Graph DiT | [Graph Diffusion Transformers for Multi-Conditional Molecular Generation. NeurIPS 2024](https://openreview.net/forum?id=cfrDLD1wfO) |
| DiGress | [DiGress: Discrete Denoising Diffusion for Graph Generation. ICLR 2023](https://openreview.net/forum?id=UaAD-Nu86WX) |
| GDSS | [Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations. ICML 2022](https://proceedings.mlr.press/v162/jo22a/jo22a.pdf) |
| MolGPT | [MolGPT: Molecular Generation Using a Transformer-Decoder Model. Journal of Chemical Information and Modeling 2021](https://pubs.acs.org/doi/10.1021/acs.jcim.1c00600) |
| GraphGA | [A Graph-Based Genetic Algorithm and Its Application to the Multiobjective Evolution of Median Molecules. Journal of Chemical Information and Computer Sciences 2004](https://pubs.acs.org/doi/10.1021/ci034290p) |
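
The generative models are expected to follow the same `sklearn`-style pattern. A hedged sketch is given below; the class name `GraphDITMolecularGenerator` and the `fit`/`generate` methods are hypothetical placeholders for illustration, not the documented API.

```python
# Hypothetical sketch of a generative workflow in the same sklearn style;
# the class name and method signatures are placeholders, not the documented API.
from torch_molecule import GraphDITMolecularGenerator  # assumed import path

# Fit the generator on training molecules given as SMILES strings.
train_smiles = ["CCO", "c1ccccc1", "CC(=O)O"]
generator = GraphDITMolecularGenerator()
generator.fit(train_smiles)

# Sample new candidate molecules as SMILES strings.
samples = generator.generate(n_samples=10)  # 'n_samples' is an assumed parameter
print(samples)
```
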
### Representation Models

#### Pretraining Methods

| Model | Reference |
|--------------|---------------------|
| MoAMa | [Motif-aware Attribute Masking for Molecular Graph Pre-training. LoG 2024](https://arxiv.org/abs/2309.04589) |
| AttrMasking | [Strategies for Pre-training Graph Neural Networks. ICLR 2020](https://arxiv.org/abs/1905.12265) |
| ContextPred | [Strategies for Pre-training Graph Neural Networks. ICLR 2020](https://arxiv.org/abs/1905.12265) |
| EdgePred | [Strategies for Pre-training Graph Neural Networks. ICLR 2020](https://arxiv.org/abs/1905.12265) |
| InfoGraph | [InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization. ICLR 2020](https://arxiv.org/abs/1908.01000) |
| Supervised | Supervised pretraining |
| Pretrained | More than ten pretrained models from [Hugging Face](https://huggingface.co) |
> **Note**: This project is in active development, and features may change.
## Installation
