## Use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="uralstech/hAI-Spec-Merged")
```

```python
# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("uralstech/hAI-Spec-Merged")
model = AutoModelForCausalLM.from_pretrained("uralstech/hAI-Spec-Merged")
```
# hAI! Spec

This is the Hugging Face model repository for the NASA Space Apps Challenge team hAI! Spec. It hosts the hAI! Spec LLM.

Our solution, hAI! Spec, is a large language model (LLM) designed to streamline the management of technical standards in the aerospace industry. Trained on the differences between historical and current versions of various standards, hAI! Spec uses a Section-By-Section training approach. This allows the model to recommend improvements to a document's consistency, relevance, and accuracy, down to individual grammatical details in extensive documents such as NASA standards.
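The Section-By-Section idea described above can be sketched as a simple training-data builder: split the old and new versions of a standard into numbered sections, then emit one (old section → revised section) pair for every section that changed. This is a minimal illustration, not the team's actual pipeline; the heading pattern, function names, and example text are all assumptions.

```python
# Sketch of a Section-By-Section training-data builder (illustrative only):
# pairs each section of an old standard with its revised counterpart so a
# model can learn the edits between versions.
import re

def split_sections(document: str) -> dict:
    """Split a standard into sections keyed by their numbered heading (e.g. '3.1')."""
    parts = re.split(r"(?m)^(\d+(?:\.\d+)*)\s+", document)
    # re.split with a capture group yields [preamble, number, body, number, body, ...]
    return {parts[i]: parts[i + 1].strip() for i in range(1, len(parts) - 1, 2)}

def build_training_pairs(old_doc: str, new_doc: str) -> list:
    """Emit one prompt/completion example per section that changed between versions."""
    old, new = split_sections(old_doc), split_sections(new_doc)
    pairs = []
    for num in sorted(old.keys() & new.keys()):
        if old[num] != new[num]:
            pairs.append({
                "prompt": f"Revise section {num}:\n{old[num]}",
                "completion": new[num],
            })
    return pairs
```

Training on such pairs keeps each example short and focused on one section, which is what lets the model attend to fine-grained edits even in very long documents.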
