Instructions to use HanningZhang/MathReward-2b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use HanningZhang/MathReward-2b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("HanningZhang/MathReward-2b", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle