Update README.md
README.md CHANGED
@@ -15,7 +15,7 @@ This repository hosts a Hugging Face Space that provides an API for submitting m
Here **XGBoost** models are trained on the Tox21 dataset, and the trained models are provided for inference. The input to the model is a **SMILES** string of the small molecule, and the output is 12 numeric values, one for each of the toxic effects in the Tox21 dataset.

-**Important:** For leaderboard submission, your Space needs to include training code. The file `train.py` should train the model using the config specified inside the `config/` folder and save the final model parameters into a file inside the `checkpoints/` folder. The model should be trained using the [Tox21_dataset](https://huggingface.co/datasets/
+**Important:** For leaderboard submission, your Space needs to include training code. The file `train.py` should train the model using the config specified inside the `config/` folder and save the final model parameters into a file inside the `checkpoints/` folder. The model should be trained using the [Tox21_dataset](https://huggingface.co/datasets/ml-jku/tox21) provided on Hugging Face. The datasets can be loaded like this:
```python
from datasets import load_dataset
ds = load_dataset("ml-jku/tox21", token=token)
```
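As a point of reference for the requirement above, here is a minimal, hypothetical sketch of what such a `train.py` could look like. The dataset schema and split name, the config file `config/config.yaml`, the Morgan-fingerprint featurization, and the checkpoint file names are all assumptions made for illustration, not the actual code of this Space.

```python
# Hypothetical train.py sketch -- schema, config keys, and file names are assumptions.
import os
import numpy as np
import xgboost as xgb
import yaml
from datasets import load_dataset
from rdkit import Chem
from rdkit.Chem import AllChem

# Hyperparameters from the (assumed) YAML config inside config/
with open("config/config.yaml") as f:
    cfg = yaml.safe_load(f)

# Load the Tox21 training split from the Hub (split name assumed);
# pass token=... as in the snippet above if the dataset requires it.
ds = load_dataset("ml-jku/tox21", split="train")

# Assumed schema: one "smiles" column plus 12 binary label columns
tasks = [c for c in ds.column_names if c != "smiles"]

def featurize(smiles: str) -> np.ndarray:
    """Morgan fingerprint (radius 2, 2048 bits) as a simple SMILES featurizer."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:  # invalid SMILES -> all-zero fingerprint
        return np.zeros(2048, dtype=np.uint8)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    return np.array(list(fp), dtype=np.uint8)

X = np.stack([featurize(s) for s in ds["smiles"]])

os.makedirs("checkpoints", exist_ok=True)
for task in tasks:
    # Drop molecules with a missing label for this task before fitting
    y = np.array([np.nan if v is None else float(v) for v in ds[task]])
    mask = ~np.isnan(y)
    clf = xgb.XGBClassifier(**cfg.get("xgboost", {}))
    clf.fit(X[mask], y[mask].astype(int))
    clf.save_model(f"checkpoints/{task}.json")
```

Any featurizer and training setup works, as long as `train.py` reads its settings from `config/` and writes the final model parameters into `checkpoints/`.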
@@ -62,7 +62,7 @@ That’s it, your model will be available as an API endpoint for the Tox21 Leade
To run (and train) the XGBoost models, clone the repository and install dependencies:
```bash
-git clone https://huggingface.co/spaces/
+git clone https://huggingface.co/spaces/ml-jku/tox21_xgboost_classifier
cd tox_21_xgb_classifier
conda create -n tox21_xgb_cls python=3.11
```
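For completeness, here is a sketch of how checkpoints saved in that layout could be used for inference, mapping a **SMILES** string to one score per Tox21 task as described above. It mirrors the hypothetical assumptions of the `train.py` sketch (one JSON model per task, Morgan-fingerprint features) and is not the Space's actual inference code.

```python
# Hypothetical inference sketch, matching the assumptions of the train.py sketch above.
import glob
import os
import numpy as np
import xgboost as xgb
from rdkit import Chem
from rdkit.Chem import AllChem

def featurize(smiles: str) -> np.ndarray:
    """Same Morgan-fingerprint featurization as assumed at training time."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:  # invalid SMILES -> all-zero fingerprint
        return np.zeros(2048, dtype=np.uint8)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    return np.array(list(fp), dtype=np.uint8)

def predict(smiles: str) -> dict:
    """Return one toxicity score per Tox21 task for a single SMILES string."""
    x = featurize(smiles).reshape(1, -1)
    scores = {}
    for path in sorted(glob.glob("checkpoints/*.json")):  # one saved model per task (assumed layout)
        clf = xgb.XGBClassifier()
        clf.load_model(path)
        task = os.path.splitext(os.path.basename(path))[0]
        scores[task] = float(clf.predict_proba(x)[0, 1])  # probability of the toxic class
    return scores

print(predict("CCO"))  # ethanol -> a dict with one score per Tox21 endpoint
```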