Nottybro committed on
Commit c043a9f · verified · 1 Parent(s): b9b3ba6

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -14,8 +14,8 @@ Architecture: A 24-layer, 1280-embedding dimension Transformer.
  Training: Trained on the C4 dataset for over 500,000 steps (~8 hours on a TPU v3-8).
  Frameworks: Built with JAX, Flax, and Optax.
  Deployment: A live demo was created using Gradio.
- The trained model weights are hosted separately on the Hugging Face Hub, as they are too large for a standard Git repository:
- https://huggingface.co/Nottybro/wigip-1
+ See this please:
+ https://github.com/skibidibrainrot20times/wigip-1-jax-model.git
 
  My Journey
  This project was a deep dive into the real-world challenges of MLOps, including debugging file corruption, solving JAX compiler errors (XlaRuntimeError), and managing long-running jobs in a cloud environment. It was built with the help of an AI assistant for debugging and guidance.