Mozi is the first large-scale language model for the scientific paper domain, supporting tasks such as question answering and emotional support. With the help of a large-scale language model and the evidence-retrieval model SciDPR, Mozi generates concise and accurate responses to users' questions about specific papers and provides emotional support for academic researchers.

How to use DataHammer/mozi_llama_7b with Transformers:

# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("question-answering", model="DataHammer/mozi_llama_7b")

# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("DataHammer/mozi_llama_7b", dtype="auto")
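Since mozi_llama_7b is a LLaMA-based causal language model, a more typical way to query it is via text generation rather than the extractive question-answering pipeline. A minimal sketch follows; the prompt template and the `build_prompt` helper are assumptions for illustration, not part of the model card:

```python
# Sketch: query Mozi as a causal LM. The prompt format below is an
# assumption (evidence + question), not documented by the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "DataHammer/mozi_llama_7b"


def build_prompt(question: str, evidence: str) -> str:
    # Hypothetical template pairing retrieved evidence (e.g. from SciDPR)
    # with the user's question about a paper.
    return f"Evidence: {evidence}\nQuestion: {question}\nAnswer:"


if __name__ == "__main__":
    # Loading a 7B checkpoint requires substantial memory; dtype="auto"
    # keeps the dtype stored in the checkpoint.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, dtype="auto")

    prompt = build_prompt(
        "What benchmark was the model evaluated on?",
        "The paper reports results on the QASPER benchmark.",
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The heavy model download is kept behind the `__main__` guard so the prompt-building helper can be reused or inspected without loading weights.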