---
datasets:
- dtruong46me/mathqa-python
language:
- en
base_model:
- meta-llama/Llama-3.2-3B-Instruct
---

Fine-tuned on 19k examples from the dataset above, with a maximum sequence length of 512, using a QLoRA configuration, for 3 epochs.
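A training setup like the one described could be sketched as follows with `transformers` and `peft`. Only the base model, the 512 max sequence length, and the 3 epochs come from this card; the LoRA rank, alpha, dropout, target modules, and all other hyperparameters are illustrative assumptions, not the values actually used.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit quantization: the "Q" in QLoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-3B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# LoRA adapter settings -- assumed values, not confirmed by the card.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# num_train_epochs matches the card; the rest is assumed.
training_args = TrainingArguments(
    output_dir="llama3.2-3b-mathqa-qlora",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    learning_rate=2e-4,
)
```

Tokenized inputs would be truncated to 512 tokens before being passed to a `Trainer` built from `training_args`.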