---
language:
- en
---
# Model Card for LiLiZhou/roberta-large-retacred
## Model Description
## Climate performance model card
| [LiLiZhou/roberta-large-retacred](https://huggingface.co/LiLiZhou/roberta-large-retacred) | |
|--------------------------------------------------------------------------|----------------|
| 1. Is the resulting model publicly available? | Yes |
| 2. How much time did the training of the final model take?               | 1h 34m 16s     |
| 3. How much time did all experiments take (incl. hyperparameter search)? | 11d 18h 36m |
| 4. What was the power draw of the GPU and CPU?                           | 0.26 kW        |
| 5. At which geo location were the computations performed? | Denmark |
| 6. What was the energy mix at the geo location? | 155.7 to 239 gCO2eq/kWh |
| 7. How much CO2eq was emitted to train the final model? | 63.55674 to 97.5598 kg |
| 8. How much CO2eq was emitted for all experiments? | N/A |
| 9. What is the average CO2eq emission for the inference of one sample? | N/A |
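The emission range in rows 6 and 7 follows the usual estimate of emissions as energy consumed times the grid's carbon intensity. As a sketch of that arithmetic (the ~408.2 kWh energy figure is inferred from the reported numbers, not stated in the card):

```python
# emissions (kg CO2eq) = energy (kWh) * carbon intensity (kg CO2eq/kWh)

ENERGY_KWH = 408.2       # implied energy consumption (inferred, not from the card)
INTENSITY_LOW = 0.1557   # 155.7 gCO2eq/kWh, lower bound for the Danish grid
INTENSITY_HIGH = 0.239   # 239 gCO2eq/kWh, upper bound

emissions_low = ENERGY_KWH * INTENSITY_LOW    # ~63.56 kg CO2eq
emissions_high = ENERGY_KWH * INTENSITY_HIGH  # ~97.56 kg CO2eq
```

The two bounds reproduce the 63.55674 to 97.5598 kg range in row 7, so the spread comes entirely from the uncertainty in the grid's energy mix.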