Instructions for using isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- llama-cpp-python
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with llama-cpp-python:
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp",
    filename="Wan2.2-TI2V-5B-Q2_K.gguf",
)

output = llm(
    "Once upon a time,",
    max_tokens=512,
    echo=True,
)
print(output)
```
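The `filename` above points at a GGUF file. As a quick sanity check after downloading, you can verify the file's preamble before handing it to a loader. This is a minimal sketch, assuming the standard GGUF header layout (a 4-byte `GGUF` magic followed by a little-endian `uint32` version); the `check_gguf_header` helper name is our own, not part of any library:

```python
import struct

def check_gguf_header(data: bytes) -> int:
    """Return the GGUF version if `data` starts with a valid GGUF header,
    otherwise raise ValueError. Only the first 8 bytes are inspected."""
    if len(data) < 8:
        raise ValueError("file too short to be GGUF")
    # Header preamble: 4-byte magic "GGUF", then little-endian uint32 version.
    magic, version = struct.unpack("<4sI", data[:8])
    if magic != b"GGUF":
        raise ValueError(f"not a GGUF file (magic={magic!r})")
    return version

# Example against an in-memory header: magic "GGUF", version 3.
header = b"GGUF" + struct.pack("<I", 3)
print(check_gguf_header(header))  # → 3
```

In practice you would read the first 8 bytes of the downloaded `.gguf` file (`open(path, "rb").read(8)`) and pass them to the helper.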
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- llama.cpp
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with llama.cpp:
Install from brew
```sh
brew install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
Install from WinGet (Windows)
```sh
winget install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
Use pre-built binary
```sh
# Download a pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
./llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
Build from source code
```sh
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli

# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
./build/bin/llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
Use Docker
```sh
docker model run hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
- LM Studio
- Jan
- Ollama
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Ollama:
```sh
ollama run hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
- Unsloth Studio
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
```sh
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp to start chatting
```
Install Unsloth Studio (Windows)
```powershell
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp to start chatting
```
Using HuggingFace Spaces for Unsloth
```sh
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp to start chatting
```
- Docker Model Runner
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Docker Model Runner:
```sh
docker model run hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
- Lemonade
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Lemonade:
Pull the model
```sh
# Download Lemonade from https://lemonade-server.ai/
lemonade pull isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```
Run and chat with the model
```sh
lemonade run user.wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp-Q2_K
```
List all available models
```sh
lemonade list
```
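Several of the tools above (llama.cpp, Ollama, Docker Model Runner) accept the same `hf.co/<repo_id>:<quant>` model reference. A small helper for building that string, purely illustrative (the `hf_model_ref` function name is our own):

```python
def hf_model_ref(repo_id: str, quant: str) -> str:
    """Build the hf.co model reference accepted by llama.cpp (-hf),
    Ollama, and Docker Model Runner: hf.co/<repo_id>:<quant>."""
    return f"hf.co/{repo_id}:{quant}"

print(hf_model_ref("isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp", "Q2_K"))
# → hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
```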
```make
# CMAKE generated file: DO NOT EDIT!
# Generated by "Unix Makefiles" Generator, CMake Version 3.31

# Delete rule output on recipe failure.
.DELETE_ON_ERROR:

#=============================================================================
# Special targets provided by cmake.

# Disable implicit rules so canonical targets will work.
.SUFFIXES:

# Disable VCS-based implicit rules.
% : %,v

# Disable VCS-based implicit rules.
% : RCS/%

# Disable VCS-based implicit rules.
% : RCS/%,v

# Disable VCS-based implicit rules.
% : SCCS/s.%

# Disable VCS-based implicit rules.
% : s.%

.SUFFIXES: .hpux_make_needs_suffix_list

# Command-line flag to silence nested $(MAKE).
$(VERBOSE)MAKESILENT = -s

# Suppress display of executed commands.
$(VERBOSE).SILENT:

# A target that is always out of date.
cmake_force:
.PHONY : cmake_force

#=============================================================================
# Set environment variables for the build.

# The shell in which to execute make rules.
SHELL = /bin/sh

# The CMake executable.
CMAKE_COMMAND = /usr/local/lib/python3.12/dist-packages/cmake/data/bin/cmake

# The command to remove a file.
RM = /usr/local/lib/python3.12/dist-packages/cmake/data/bin/cmake -E rm -f

# Escaping for special characters.
EQUALS = =

# The top-level source directory on which CMake was run.
CMAKE_SOURCE_DIR = /kaggle/working/stable-diffusion.cpp

# The top-level build directory on which CMake was run.
CMAKE_BINARY_DIR = /kaggle/working/stable-diffusion.cpp/build

# Include any dependencies generated for this target.
include ggml/src/CMakeFiles/ggml.dir/depend.make
# Include any dependencies generated by the compiler for this target.
include ggml/src/CMakeFiles/ggml.dir/compiler_depend.make
# Include the progress variables for this target.
include ggml/src/CMakeFiles/ggml.dir/progress.make
# Include the compile flags for this target's objects.
include ggml/src/CMakeFiles/ggml.dir/flags.make

ggml/src/CMakeFiles/ggml.dir/codegen:
.PHONY : ggml/src/CMakeFiles/ggml.dir/codegen

ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o: ggml/src/CMakeFiles/ggml.dir/flags.make
ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o: /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-dl.cpp
ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o: ggml/src/CMakeFiles/ggml.dir/compiler_depend.ts
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green --progress-dir=/kaggle/working/stable-diffusion.cpp/build/CMakeFiles --progress-num=$(CMAKE_PROGRESS_1) "Building CXX object ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && /usr/bin/c++ $(CXX_DEFINES) $(CXX_INCLUDES) $(CXX_FLAGS) -MD -MT ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o -MF CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o.d -o CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o -c /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-dl.cpp

ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.i: cmake_force
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green "Preprocessing CXX source to CMakeFiles/ggml.dir/ggml-backend-dl.cpp.i"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && /usr/bin/c++ $(CXX_DEFINES) $(CXX_INCLUDES) $(CXX_FLAGS) -E /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-dl.cpp > CMakeFiles/ggml.dir/ggml-backend-dl.cpp.i

ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.s: cmake_force
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green "Compiling CXX source to assembly CMakeFiles/ggml.dir/ggml-backend-dl.cpp.s"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && /usr/bin/c++ $(CXX_DEFINES) $(CXX_INCLUDES) $(CXX_FLAGS) -S /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-dl.cpp -o CMakeFiles/ggml.dir/ggml-backend-dl.cpp.s

ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o: ggml/src/CMakeFiles/ggml.dir/flags.make
ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o: /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-reg.cpp
ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o: ggml/src/CMakeFiles/ggml.dir/compiler_depend.ts
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green --progress-dir=/kaggle/working/stable-diffusion.cpp/build/CMakeFiles --progress-num=$(CMAKE_PROGRESS_2) "Building CXX object ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && /usr/bin/c++ $(CXX_DEFINES) $(CXX_INCLUDES) $(CXX_FLAGS) -MD -MT ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o -MF CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o.d -o CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o -c /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-reg.cpp

ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.i: cmake_force
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green "Preprocessing CXX source to CMakeFiles/ggml.dir/ggml-backend-reg.cpp.i"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && /usr/bin/c++ $(CXX_DEFINES) $(CXX_INCLUDES) $(CXX_FLAGS) -E /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-reg.cpp > CMakeFiles/ggml.dir/ggml-backend-reg.cpp.i

ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.s: cmake_force
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green "Compiling CXX source to assembly CMakeFiles/ggml.dir/ggml-backend-reg.cpp.s"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && /usr/bin/c++ $(CXX_DEFINES) $(CXX_INCLUDES) $(CXX_FLAGS) -S /kaggle/working/stable-diffusion.cpp/ggml/src/ggml-backend-reg.cpp -o CMakeFiles/ggml.dir/ggml-backend-reg.cpp.s

# Object files for target ggml
ggml_OBJECTS = \
"CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o" \
"CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o"

# External object files for target ggml
ggml_EXTERNAL_OBJECTS =

ggml/src/libggml.a: ggml/src/CMakeFiles/ggml.dir/ggml-backend-dl.cpp.o
ggml/src/libggml.a: ggml/src/CMakeFiles/ggml.dir/ggml-backend-reg.cpp.o
ggml/src/libggml.a: ggml/src/CMakeFiles/ggml.dir/build.make
ggml/src/libggml.a: ggml/src/CMakeFiles/ggml.dir/link.txt
	@$(CMAKE_COMMAND) -E cmake_echo_color "--switch=$(COLOR)" --green --bold --progress-dir=/kaggle/working/stable-diffusion.cpp/build/CMakeFiles --progress-num=$(CMAKE_PROGRESS_3) "Linking CXX static library libggml.a"
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && $(CMAKE_COMMAND) -P CMakeFiles/ggml.dir/cmake_clean_target.cmake
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && $(CMAKE_COMMAND) -E cmake_link_script CMakeFiles/ggml.dir/link.txt --verbose=$(VERBOSE)

# Rule to build all files generated by this target.
ggml/src/CMakeFiles/ggml.dir/build: ggml/src/libggml.a
.PHONY : ggml/src/CMakeFiles/ggml.dir/build

ggml/src/CMakeFiles/ggml.dir/clean:
	cd /kaggle/working/stable-diffusion.cpp/build/ggml/src && $(CMAKE_COMMAND) -P CMakeFiles/ggml.dir/cmake_clean.cmake
.PHONY : ggml/src/CMakeFiles/ggml.dir/clean

ggml/src/CMakeFiles/ggml.dir/depend:
	cd /kaggle/working/stable-diffusion.cpp/build && $(CMAKE_COMMAND) -E cmake_depends "Unix Makefiles" /kaggle/working/stable-diffusion.cpp /kaggle/working/stable-diffusion.cpp/ggml/src /kaggle/working/stable-diffusion.cpp/build /kaggle/working/stable-diffusion.cpp/build/ggml/src /kaggle/working/stable-diffusion.cpp/build/ggml/src/CMakeFiles/ggml.dir/DependInfo.cmake "--color=$(COLOR)"
.PHONY : ggml/src/CMakeFiles/ggml.dir/depend
```