NisoriV2 LLaMA 7B

A fine-tuned version of Meta’s LLaMA 7B, created by Priyanshu Bhatt.
This model (NisoriV2) has been fine-tuned on Dev Bhoomi Uttarakhand University (DBUU) data for instruction-following and question answering in a conversational style.


🧠 Model Details

  • Base Model: LLaMA 7B
  • Fine-Tuned Model: NisoriV2 LLaMA 7B
  • Fine-Tuned By: Priyanshu Bhatt (FalconX80)
  • Dataset: Dev Bhoomi Uttarakhand University (DBUU) educational & institutional data
  • Architecture: Decoder-only Transformer
  • Libraries: Hugging Face Transformers

🎯 Intended Use

This model is designed for:

  • Question Answering related to Dev Bhoomi Uttarakhand University
  • Educational and Institutional Information Retrieval
  • Conversational AI for DBUU context

📊 Example

Input Prompt

who is the chairman of dev bhoomi uttarakhand university?

Model Output

The Chairman of Dev Bhoomi Uttarakhand University is Shri Sanjay Bansal. 
He is a well-known businessman and philanthropist who has made significant contributions to the field of education. 
Under his leadership, the university has made tremendous progress and has established itself as a premier institution of higher learning in the region.
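The cleaning step in the usage code below strips `[INST]` markers, which suggests the model was trained on a LLaMA-style instruction template. As a hedged sketch (the exact template and system prompt are assumptions, not confirmed by the model card), a question could be wrapped like this before generation:

```python
def build_prompt(question: str,
                 system: str = "You are a helpful assistant for DBUU.") -> str:
    """Wrap a question in a LLaMA-2 style [INST] template.

    The system prompt and the template itself are illustrative
    placeholders; check the model's actual training format.
    """
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{question} [/INST]"

prompt = build_prompt("Who is the chairman of Dev Bhoomi Uttarakhand University?")
```

If the model was fine-tuned on plain prompts instead, passing the raw question (as in the example above) should also work.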

🚀 How to Use

import re
from transformers import AutoTokenizer, LlamaForCausalLM
import torch

model_name = "FalconX80/NisoriV2llama7b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = LlamaForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

def chat(prompt):
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,            # pass input_ids and attention_mask together
        max_new_tokens=200,  # cap generated tokens rather than total length
    )
    text = tokenizer.decode(outputs[0], skip_special_tokens=True)

    # Clean up: drop the echoed prompt and any [INST] markers
    if text.lower().startswith(prompt.lower()):
        text = text[len(prompt):].strip()
    text = re.sub(r"\[/?INST\]", "", text).strip()

    return text

print(chat("Tell me about Dev Bhoomi Uttarakhand University"))

πŸ› οΈ Training Details

  • Fine-Tuning Task: Instruction Following & QA
  • Framework: Hugging Face Transformers + PyTorch
  • Hardware Used: GPU (Kaggle / Colab / custom environment)
  • Dataset: Proprietary Dev Bhoomi Uttarakhand University (DBUU) dataset prepared for institutional knowledge and educational Q&A.
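The fine-tuning script itself is not published. As an illustrative sketch only, instruction/answer pairs like the DBUU Q&A data are often serialized into single training strings before tokenization; the field names (`instruction`, `response`) and the `[INST]` format here are assumptions, not the card's documented pipeline:

```python
def format_example(example: dict) -> str:
    """Serialize one Q&A pair into a LLaMA-style training string.

    The "instruction"/"response" keys are hypothetical field names
    chosen for this sketch.
    """
    return f"<s>[INST] {example['instruction']} [/INST] {example['response']}</s>"

sample = {
    "instruction": "Who is the chairman of DBUU?",
    "response": "Shri Sanjay Bansal.",
}
text = format_example(sample)
```

Strings like `text` would then be tokenized and fed to a causal-LM training loop (e.g. the Hugging Face `Trainer`).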

⚠️ Limitations & Biases

  • The model's knowledge is limited to the DBUU data used during fine-tuning.
  • May hallucinate facts outside the scope of DBUU.
  • Not optimized for safety-critical applications.
  • Responses may vary depending on prompt phrasing.


📬 Contact

Created by Priyanshu Bhatt
