Align public training codebase with local training setup
README.md CHANGED
````diff
@@ -52,6 +52,8 @@ That dataset repo contains:
 ```bash
 git clone https://huggingface.co/alegendaryfish/CodonTranslator
 cd CodonTranslator
+conda env create -f environment.yml
+conda activate codontranslator
 pip install -r requirements.txt
 pip install -e .
 ```
````
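The new setup steps create a conda environment before the pip installs. A quick post-setup sanity check might look like the sketch below; the environment name comes from the `conda activate codontranslator` line in the diff, while the distribution name `CodonTranslator` and the GPU check are assumptions, not part of this commit.

```bash
# Confirm the environment from the new setup steps exists.
conda env list | grep codontranslator

# Confirm `pip install -e .` registered the package
# (distribution name assumed from the repo name).
pip show CodonTranslator

# Optional: confirm GPUs are visible before launching training.
nvidia-smi
```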
````diff
@@ -88,6 +90,11 @@ python train.py \
   --weight_decay 1e-4
 ```
 
+The included Slurm launchers use the same training flags as the local single-node H200 workflow:
+
+- `slurm/train_v3_h200_8x_single.sbatch`
+- `slurm/submit_train_v3_h200_8x_chain.sh`
+
 ### Sample
 
 ```bash
````
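The diff names the two launchers but not how to invoke them. A usage sketch, assuming they are run from the repo root: `sbatch` for the `.sbatch` file is standard Slurm, while treating the `.sh` script as a plain shell wrapper is inferred from its extension.

```bash
# Submit a single-node 8x H200 training run; the resource requests
# live inside the .sbatch file itself.
sbatch slurm/train_v3_h200_8x_single.sbatch

# Or start a chain of dependent jobs for runs longer than one
# allocation (assumed: the .sh script wraps sbatch itself).
bash slurm/submit_train_v3_h200_8x_chain.sh
```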