```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("wxgeorge/AlfredPros-CodeLlama-7b-Instruct-Solidity")
model = AutoModelForCausalLM.from_pretrained("wxgeorge/AlfredPros-CodeLlama-7b-Instruct-Solidity")
```
# Untitled Model (1)
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the Passthrough merge method.
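Conceptually, a passthrough merge of a single model copies its tensors through unchanged (aside from any dtype cast requested in the config). A minimal sketch of that idea on toy tensors, assuming a hypothetical `passthrough_merge` helper (this is not mergekit's actual API):

```python
import torch

def passthrough_merge(state_dicts):
    """Toy sketch: passthrough over one model copies each tensor unchanged,
    casting to bfloat16 as the config above requests. Hypothetical helper."""
    assert len(state_dicts) == 1, "passthrough expects a single input model"
    (source,) = state_dicts
    return {name: t.clone().to(torch.bfloat16) for name, t in source.items()}

# Tiny fake weights stand in for the real CodeLlama checkpoint
weights = {"layer.weight": torch.randn(4, 4), "layer.bias": torch.zeros(4)}
merged = passthrough_merge([weights])

# The merged tensors are numerically identical to the bf16-cast originals
assert torch.equal(merged["layer.weight"], weights["layer.weight"].to(torch.bfloat16))
assert merged["layer.bias"].dtype == torch.bfloat16
```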
### Models Merged

The following models were included in the merge:

- AlfredPros/CodeLlama-7b-Instruct-Solidity
### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: AlfredPros/CodeLlama-7b-Instruct-Solidity
merge_method: passthrough
dtype: bfloat16
```
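Assuming mergekit is installed (e.g. `pip install mergekit`), a configuration like the one above can be applied with the `mergekit-yaml` CLI; the file name `config.yml` and the output directory here are illustrative, not taken from this card:

```shell
# Save the YAML above as config.yml, then write the merged model
# to an output directory of your choosing
mergekit-yaml config.yml ./merged-model
```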
## Model tree for wxgeorge/AlfredPros-CodeLlama-7b-Instruct-Solidity

Base model: AlfredPros/CodeLlama-7b-Instruct-Solidity
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="wxgeorge/AlfredPros-CodeLlama-7b-Instruct-Solidity")
```