
# LexaLCM Pre0: 288M Pre-trained Large Concept Model

A pre-trained LCM model with 288M parameters based on Meta FAIR's LCM architecture.

[Paper]

Note: These instructions are for running the model on a single machine with a single GPU. If your system does not have a GPU that supports at least CUDA 12.1, or if you intend to execute this in the cloud, you'll need to modify the code per your requirements.
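
Before installing anything, it can be worth confirming that an NVIDIA driver is present at all. One quick, optional check (this is not part of the official setup steps):

```shell
# Optional: check for an NVIDIA driver. nvidia-smi ships with the driver,
# so its absence usually means no usable CUDA GPU on this machine.
if command -v nvidia-smi > /dev/null 2>&1; then
  gpu_info=$(nvidia-smi --query-gpu=name,driver_version --format=csv,noheader)
else
  gpu_info="no NVIDIA driver detected"
fi
echo "$gpu_info"
```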

## 1. Install the Intel MKL runtime

```shell
sudo apt update
sudo apt install libmkl-rt
echo 'export LD_LIBRARY_PATH=/opt/intel/mkl/lib/intel64:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc
```
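
To confirm the runtime is actually visible to the dynamic linker before moving on, an optional sanity check:

```shell
# Optional: ask the dynamic linker's cache whether libmkl_rt is visible.
if ldconfig -p 2>/dev/null | grep -q libmkl_rt; then
  mkl_status="found"
else
  mkl_status="missing"
fi
echo "libmkl_rt: $mkl_status"
```

If it reports `missing`, recheck the `LD_LIBRARY_PATH` export from the step above.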

## 2. Install dependencies

```shell
uv sync --extra gpu --extra eval --extra data
```
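
The `uv sync` and `uv run` commands assume Astral's `uv` is on your PATH; if the command above fails, check for it first:

```shell
# uv is required for the sync/run commands in this README.
if command -v uv > /dev/null 2>&1; then
  uv_status="$(uv --version)"
else
  uv_status="uv not installed"
fi
echo "$uv_status"
```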

## 3. Update the model cards' paths

The paths inside these two model cards must be updated to match where the checkpoints live on your local filesystem:

- `_LexaLCM_Pre0/Checkpoints/LCM_TwoTower_Pre0/model_card.yaml`
- `_LexaLCM_Pre0/Checkpoints/LCM_TwoTower_Pre0/checkpoints/step_250000/model_card.yaml`
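
The exact keys inside `model_card.yaml` depend on the checkpoint, but if the files contain absolute paths from the original machine, a prefix substitution with `sed` is usually enough. The sketch below runs on a throwaway copy with a hypothetical `checkpoint_dir` key and a hypothetical old prefix; point the same `sed` at the two files above once the substitution looks right:

```shell
# Demonstration on a temporary copy; OLD_PREFIX and the checkpoint_dir key
# are hypothetical - substitute whatever actually appears in your files.
tmp=$(mktemp -d)
printf 'checkpoint_dir: /home/user/LexaLCM/Checkpoints\n' > "$tmp/model_card.yaml"
OLD_PREFIX="/home/user/LexaLCM"   # prefix baked in on the original machine
NEW_PREFIX="$(pwd)"               # where the repo lives now
sed -i "s|${OLD_PREFIX}|${NEW_PREFIX}|g" "$tmp/model_card.yaml"
cat "$tmp/model_card.yaml"
```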

## 4. Test the model's inference

```shell
uv run --extra gpu scripts/run_inference.py
```