---
title: AlpineLLM Live Demo
emoji: 🏔️
colorFrom: indigo
colorTo: blue
sdk: gradio
sdk_version: 5.42.0
app_file: app.py
pinned: false
hf_oauth: true
hf_oauth_scopes:
- inference-api
license: mit
short_description: A domain-specific language model for alpine storytelling.
---
# AlpineLLM Live Demo
A domain-specific language model for alpine storytelling.
Try asking about mountain adventures! 🏔️
## About AlpineLLM
AlpineLLM-Tiny-10M-Base is a lightweight base language model with ~10.8 million trainable parameters. It was pre-trained from scratch on raw text corpora drawn primarily from public-domain literature on alpinism, including expedition narratives and climbing essays.
This demo showcases the model’s text generation capabilities within its specialized domain. Please note that AlpineLLM is a base model, and it has not been fine-tuned for downstream tasks such as summarization or dialogue. Its outputs reflect patterns learned directly from the training texts.
This Space runs a free CPU-only demo of the model, so inference may take a few seconds. Because of its limited size and character-level tokenization, the tiny model's output may lack full coherence. For improved results, consider using the source repository to load larger pretrained weights and run inference on a GPU.
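To make the character-level tokenization mentioned above concrete, here is a minimal sketch of how such a tokenizer works in principle. This is illustrative only: the class name, vocabulary construction, and example corpus are assumptions, not code from the AlpineLLM repository.

```python
# Illustrative character-level tokenizer (not AlpineLLM's actual implementation).
# The vocabulary is simply the sorted set of characters seen in the corpus.
class CharTokenizer:
    def __init__(self, corpus: str):
        self.chars = sorted(set(corpus))
        self.stoi = {ch: i for i, ch in enumerate(self.chars)}  # char -> id
        self.itos = {i: ch for i, ch in enumerate(self.chars)}  # id -> char

    def encode(self, text: str) -> list[int]:
        return [self.stoi[ch] for ch in text]

    def decode(self, ids: list[int]) -> str:
        return "".join(self.itos[i] for i in ids)

tok = CharTokenizer("the alpine meadow")
ids = tok.encode("alpine")
print(tok.decode(ids))  # round-trips back to "alpine"
```

Because every character is its own token, sequences are long relative to word- or subword-level tokenizers, which is one reason a ~10M-parameter model can struggle to stay coherent over extended generations.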
Complete source code and full model documentation are available in the related repositories.
### Related Repositories
- [**🤗 AlpineLLM Model Page @ HuggingFace**](https://huggingface.co/Borzyszkowski/AlpineLLM-Tiny-10M-Base)
- [**⛏️ AlpineLLM Source Code @ GitHub**](https://github.com/Borzyszkowski/AlpineLLM)
### How to install?
The software has been tested on Ubuntu 20.04 with CUDA 12.2 and Python 3.10.
Please use a Python virtual environment to install the dependencies:
```
python3.10 -m venv venv_AlpineLLM
source venv_AlpineLLM/bin/activate
pip install -r requirements.txt
```
### How to start?
The application starts automatically upon pushing changes to the Hugging Face Space.
For local development, please run:
```
python app.py
```
### Contact and technical support
- <b>Bartek Borzyszkowski</b> <br>
Web: <a href="https://borzyszkowski.github.io/">borzyszkowski.github.io</a>