GROQ MODEL UPDATE GUIDE
Issue Fixed: Model Decommissioned Error
The error you encountered:
The model llama3-70b-8192 has been decommissioned and is no longer supported
✅ SOLUTION APPLIED:
Updated the model in modules/llm_handler.py:
- ❌ llama3-70b-8192 (decommissioned)
- ✅ llama-3.1-70b-versatile (active)
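Rather than editing the model string in one place, the change can be wrapped in a small remapping helper so any leftover references to the old ID are silently upgraded. This is a sketch; the helper name and the internal structure of modules/llm_handler.py are assumptions, and the mapping contains only the replacement documented above.

```python
# Hypothetical helper for modules/llm_handler.py: remap decommissioned
# Groq model IDs to their active replacements before any API call.
DECOMMISSIONED = {
    "llama3-70b-8192": "llama-3.1-70b-versatile",  # the fix applied here
}

def resolve_model(model_id: str) -> str:
    """Return an active model ID, substituting a replacement if the
    requested one has been decommissioned."""
    return DECOMMISSIONED.get(model_id, model_id)
```

Calling `resolve_model("llama3-70b-8192")` then yields the active `llama-3.1-70b-versatile`, while already-valid IDs pass through unchanged.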
Currently Supported Groq Models (as of Oct 2024):
Recommended for Your Use Case:
- llama-3.1-70b-versatile - Best for detailed interview evaluation
- llama-3.1-8b-instant - Faster, good for quick responses
- mixtral-8x7b-32768 - Good alternative with large context
All Available Models:
- llama-3.1-405b-reasoning - Most powerful (if available)
- llama-3.1-70b-versatile - High quality, good balance
- llama-3.1-8b-instant - Fast responses
- llama3-groq-70b-8192-tool-use-preview - With tool support
- llama3-groq-8b-8192-tool-use-preview - Faster with tools
- mixtral-8x7b-32768 - Mixtral model
- gemma2-9b-it - Google's Gemma
- gemma-7b-it - Smaller Gemma
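The quality/speed/context trade-off above can be encoded as a small lookup so the app picks a model by priority instead of hard-coding one ID. The function and key names are hypothetical; the model IDs come from the list above and should be re-checked against Groq's current model list, since availability changes.

```python
# Sketch: select a Groq model ID by priority, using the trade-offs
# described in the list above. Names here are illustrative only.
MODELS = {
    "quality": "llama-3.1-70b-versatile",   # detailed interview evaluation
    "speed": "llama-3.1-8b-instant",        # quick responses
    "long_context": "mixtral-8x7b-32768",   # large context window
}

def pick_model(priority: str = "quality") -> str:
    """Return a model ID for the given priority, defaulting to quality."""
    return MODELS.get(priority, MODELS["quality"])
```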
For Hugging Face Spaces Deployment:
Your app should now deploy and run with the updated model; the interview functionality will no longer hit the decommissioned-model API error.
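On Hugging Face Spaces, the Groq API key should be stored as a repository secret (Settings → Variables and secrets), which the platform exposes to the app as an environment variable. A small accessor (the function name and error message are illustrative) makes the missing-secret case fail loudly:

```python
import os

def get_groq_key(env=os.environ):
    """Read the Groq API key from the environment (a Space secret on
    Hugging Face Spaces); raise a clear error if it is not configured."""
    key = env.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError("GROQ_API_KEY not set; add it as a Space secret.")
    return key
```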
Testing:
Run your app and try creating an interview - the question generation should now work without the decommissioned model error.
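A minimal smoke test can confirm the fix outside the full app. The sketch below assumes the `groq` package is installed and `GROQ_API_KEY` is set; the function name and prompt are illustrative, but the client calls follow the Groq Python SDK's chat-completions interface.

```python
import os

def smoke_test() -> str:
    """Ask the updated model for one interview question; raises if the
    model ID were still decommissioned."""
    from groq import Groq  # imported here so the file loads without the SDK
    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    resp = client.chat.completions.create(
        model="llama-3.1-70b-versatile",  # the replacement model
        messages=[
            {"role": "user",
             "content": "Generate one Python interview question."},
        ],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(smoke_test())
```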