Instructions to use millat/study-abroad-guidance-ai with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use millat/study-abroad-guidance-ai with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="millat/study-abroad-guidance-ai")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("millat/study-abroad-guidance-ai")
model = AutoModelForCausalLM.from_pretrained("millat/study-abroad-guidance-ai")

messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use millat/study-abroad-guidance-ai with vLLM:
Install from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "millat/study-abroad-guidance-ai"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "millat/study-abroad-guidance-ai",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
```

Use Docker
```shell
docker model run hf.co/millat/study-abroad-guidance-ai
```
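Because the server exposes an OpenAI-compatible API, the curl request above can also be issued from Python. The sketch below uses only the standard library; the endpoint URL assumes the default vLLM port from the serving step, and actually sending the request requires that server to be running:

```python
import json
import urllib.request

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def ask_server(payload: dict,
               url: str = "http://localhost:8000/v1/chat/completions") -> str:
    """POST the payload to the server and return the first reply's text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

payload = build_chat_request(
    "millat/study-abroad-guidance-ai",
    "What is the capital of France?",
)
# To send it (requires the server started above to be running):
# print(ask_server(payload))
```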
- SGLang
How to use millat/study-abroad-guidance-ai with SGLang:
Install from pip and serve the model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "millat/study-abroad-guidance-ai" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "millat/study-abroad-guidance-ai",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
```

Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "millat/study-abroad-guidance-ai" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "millat/study-abroad-guidance-ai",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
```

- Unsloth Studio
How to use millat/study-abroad-guidance-ai with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
```shell
# Install Unsloth Studio:
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio:
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for millat/study-abroad-guidance-ai to start chatting
```
Install Unsloth Studio (Windows)
```shell
# Install Unsloth Studio:
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio:
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for millat/study-abroad-guidance-ai to start chatting
```
Using HuggingFace Spaces for Unsloth
```shell
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for millat/study-abroad-guidance-ai to start chatting
```
Load model with FastModel
```python
# Install Unsloth first: pip install unsloth
from unsloth import FastModel

model, tokenizer = FastModel.from_pretrained(
    model_name="millat/study-abroad-guidance-ai",
    max_seq_length=2048,
)
```

- Docker Model Runner
How to use millat/study-abroad-guidance-ai with Docker Model Runner:
```shell
docker model run hf.co/millat/study-abroad-guidance-ai
```
Model Description
This model is a specialized AI system designed to assist students with personalized guidance on studying abroad. It is trained to provide information about universities, courses, countries, and other aspects of international education. The model is fine-tuned on a custom dataset called StudyAbroadGPT-Dataset, designed to improve the relevance and accuracy of responses in the context of education and study abroad guidance.
- Developed by: MD MILLAT HOSEN
- License: Apache-2.0
- Model type: Causal language model based on Mistral-7B, fine-tuned for study abroad guidance.
- Language(s) (NLP): English (en)
- Finetuned from model: unsloth/mistral-7b-bnb-4bit
Model Sources
- Repository: huggingface.co/millat/study-abroad-guidance-ai
- Datasets: millat/StudyAbroadGPT-Dataset
Uses
Direct Use
This model can be used for providing personalized, AI-generated responses to students looking for advice on studying abroad. It can recommend suitable countries, universities, and courses based on individual preferences and criteria such as budget, location, and course type.
Downstream Use
When integrated into larger applications like study abroad consultancy platforms, university recommendation systems, or educational chatbots, this model can help guide prospective students toward the best educational opportunities abroad.
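As an illustration of such an integration, the sketch below wraps an arbitrary text-generation callable behind a small advice function, so a chatbot or recommendation platform can format student queries consistently. The `make_advisor` helper and the prompt wording are hypothetical, not part of the released model; a real deployment would pass in a `transformers` pipeline or an HTTP call to a served instance:

```python
from typing import Callable

def make_advisor(generate: Callable[[str], str]) -> Callable[[str], str]:
    """Wrap a text-generation callable into a study-abroad advice function."""
    def advise(query: str) -> str:
        # Hypothetical prompt framing; adjust to the host application's needs.
        prompt = (
            "You are a study-abroad guidance assistant. "
            "Answer the student's question concisely.\n\n"
            f"Student: {query}\nAdvisor:"
        )
        return generate(prompt).strip()
    return advise

# Example with a stand-in generator (a real deployment would pass the model):
def fake_generate(prompt: str) -> str:
    return "(model output would appear here)"

advisor = make_advisor(fake_generate)
print(advisor("Where can I study Computer Science on a small budget?"))
```

Keeping the model behind a plain callable like this also makes the surrounding application easy to test without loading any weights.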
Out-of-Scope Use
This model should not be used to provide legal, financial, or medical advice. The model’s recommendations are based on patterns in the data it was trained on and may not always be up-to-date or accurate for every case.
Bias, Risks, and Limitations
The model has been trained on a dataset that may contain biases regarding countries, universities, and courses. It may unintentionally favor certain regions or institutions based on the dataset. Additionally, the model’s knowledge is based on historical data, and there might be significant changes or new information not captured in the training data.
Recommendations
Users should verify the information provided by the model through official channels such as university websites or government portals. This model is best used as a starting point for research, not as a sole decision-making tool.
How to Get Started with the Model
To use the model, you can load it with the following code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "millat/study-abroad-guidance-ai"

# Load the model and tokenizer
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Example usage
input_text = "I want to study Computer Science in Europe. What are my options?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
Training Details
Training Data
The model was fine-tuned on the millat/StudyAbroadGPT-Dataset, which includes a variety of information related to studying abroad, including university data, country information, and courses available in different fields of study. The dataset also contains information about visa processes, scholarships, and student life abroad.
Training Procedure
The model was fine-tuned using supervised learning techniques, where it was trained to predict the best possible advice for students based on their queries. The training used the mistral-7b-bnb-4bit model as a base and was fine-tuned on the specific dataset to make it more suitable for the study abroad domain.
Training Hyperparameters
- Training regime: mixed precision
- Batch size: 32
- Learning rate: 2e-5
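For reference, the reported hyperparameters can be captured in a small configuration object. This is an illustrative sketch only; the actual training script and any further settings (optimizer, warmup, number of epochs) are not published:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FinetuneConfig:
    """Reported fine-tuning hyperparameters for study-abroad-guidance-ai."""
    base_model: str = "unsloth/mistral-7b-bnb-4bit"
    mixed_precision: bool = True   # "mixed precision" training regime
    batch_size: int = 32
    learning_rate: float = 2e-5

config = FinetuneConfig()
print(config)
```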
Evaluation
Testing Data, Factors & Metrics
Testing Data
The model was evaluated using a separate test set from the StudyAbroadGPT-Dataset, which contained student queries and ideal recommendations.
Metrics
The model's performance was evaluated using standard metrics such as accuracy, F1 score, and BLEU score, assessing its ability to provide relevant and accurate information.
Results
The model achieved a high level of accuracy in recommending universities and courses, with a precision rate of 85% and a recall rate of 80%.
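From the reported precision and recall, the corresponding F1 score follows directly as their harmonic mean:

```python
precision = 0.85  # reported precision
recall = 0.80     # reported recall

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # → 0.824
```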
Model Examination
To ensure that the model is making reasonable predictions, periodic examinations are conducted by reviewing a sample of its outputs for consistency and relevance. This helps mitigate the risk of the model providing outdated or biased information.
Environmental Impact
The training of the model was conducted using high-performance GPUs on cloud-based infrastructure. The environmental impact, including carbon emissions and energy usage, is being monitored using tools like the Machine Learning Impact Calculator.
- Hardware Type: NVIDIA A100 GPUs
- Hours used: 2000 GPU hours
- Cloud Provider: AWS
- Compute Region: US-East
- Carbon Emitted: [Data Needed]
Citation
If you use this model in your research or applications, please cite it as follows:
BibTeX:

```bibtex
@misc{millat2025studyabroad,
  author = {MD MILLAT HOSEN},
  title  = {Study Abroad Guidance AI Model},
  year   = {2025},
  url    = {https://huggingface.co/millat/study-abroad-guidance-ai},
}
```
APA:
Hosen, M. M. (2025). Study Abroad Guidance AI Model. Hugging Face. Available at https://huggingface.co/millat/study-abroad-guidance-ai