How to use Menlo/AlphaSpace-1.5B with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Menlo/AlphaSpace-1.5B")
model = AutoModelForCausalLM.from_pretrained("Menlo/AlphaSpace-1.5B")
```
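Once loaded, the model can be used like any other causal LM in Transformers. A minimal generation sketch follows; the prompt text and generation parameters here are illustrative assumptions, not taken from the model card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Menlo/AlphaSpace-1.5B")
model = AutoModelForCausalLM.from_pretrained("Menlo/AlphaSpace-1.5B")

# Illustrative prompt; adjust to the task format described in the paper.
prompt = "Pick up the red cube and place it on the blue tray."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a small token budget, for a quick sanity check.
outputs = model.generate(**inputs, max_new_tokens=64)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Swap in `device_map="auto"` and a `torch_dtype` argument in `from_pretrained` if you want to run on GPU with reduced precision.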
This PR updates the model card to link to the correct paper and adds a link to the GitHub repository.