# Chat Model Provider Sample
This VS Code extension demonstrates how to implement a custom chat model provider using the Language Model (LM) API. It serves as a sample implementation for developers who want to integrate their own AI models with VS Code's chat functionality.
## Features

This extension provides:
- Custom Chat Model Provider: Implements the `LanguageModelChatProvider2` interface to register custom AI models
- Multiple Model Support: Demonstrates how to provide multiple models from a single provider
- Sample Models: Includes two example models:
  - Dog Model: Responds with dog-themed messages ("Woof!")
  - Cat Model: Responds with cat-themed messages ("Meow!")
## Architecture
The extension consists of two main components:
### Extension Activation (`src/extension.ts`)

- Registers the sample chat model provider with VS Code's LM API
- Uses the vendor ID `"sample"` to identify the provider
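Registration in `activate` might look roughly like this minimal sketch. The `chatProvider` API is proposed, so the exact signature of `vscode.lm.registerChatModelProvider()` may differ; `SampleChatModelProvider` here stands in for the provider class from `src/provider.ts`:

```typescript
// src/extension.ts — sketch only; the proposed chatProvider API may change shape.
import * as vscode from 'vscode';
import { SampleChatModelProvider } from './provider';

export function activate(context: vscode.ExtensionContext) {
    // Register the provider under the vendor ID "sample" so its models
    // can surface in VS Code's model picker. Pushing the disposable onto
    // context.subscriptions unregisters it when the extension deactivates.
    context.subscriptions.push(
        vscode.lm.registerChatModelProvider('sample', new SampleChatModelProvider())
    );
}

export function deactivate() {}
```

This only runs inside an Extension Development Host; launching with `F5` (as described below) is the way to exercise it.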
### Chat Model Provider (`src/provider.ts`)

- Implements the `LanguageModelChatProvider2` interface
- Provides model information including capabilities (tool calling, vision support)
- Handles chat requests and returns appropriate responses
- Includes token counting functionality
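A skeleton of such a provider might look like the following. This is a sketch under assumptions: the method names follow this README's description, the proposed `LanguageModelChatProvider2` interface may differ in its exact shape, and the chars-per-token heuristic is purely illustrative:

```typescript
// src/provider.ts — sketch; the proposed LanguageModelChatProvider2
// interface may differ in method names, parameters, and return types.
import * as vscode from 'vscode';

export class SampleChatModelProvider {
    // Return metadata for the two sample models (shape is illustrative).
    getChatModelInfo() {
        return [
            { id: 'dog', name: 'Dog Model' },
            { id: 'cat', name: 'Cat Model' },
        ];
    }

    // Stream a canned, pet-themed reply for the selected model. The real
    // API streams fragments (e.g. ChatResponseFragment2) via `progress`.
    async provideLanguageModelChatResponse(
        model: { id: string },
        _messages: readonly unknown[],
        progress: vscode.Progress<unknown>,
        _token: vscode.CancellationToken
    ): Promise<void> {
        const reply = model.id === 'dog' ? 'Woof!' : 'Meow!';
        progress.report(new vscode.LanguageModelTextPart(reply));
    }

    // Rough token count; a real provider would use its model's tokenizer.
    async provideTokenCount(text: string): Promise<number> {
        return Math.ceil(text.length / 4);
    }
}
```

The canned replies make it easy to verify in the chat UI which model actually handled a request.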
## Model Capabilities
Each sample model declares the following capabilities:
- Tool Calling: ✅ Enabled
- Vision: ✅ Enabled
- Max Input Tokens: 120,000
- Max Output Tokens: 8,192
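Expressed as data, the capability set above might look like this. The field names here are assumptions for illustration, not the exact `LanguageModelChatInformation` shape from the proposed API:

```typescript
// Illustrative capability declaration; field names are assumptions, not
// the exact LanguageModelChatInformation shape from the proposed API.
const sampleModelCapabilities = {
    maxInputTokens: 120_000,  // 120,000-token input window
    maxOutputTokens: 8_192,   // 8,192-token response limit
    capabilities: {
        toolCalling: true,    // model may receive and emit tool calls
        imageInput: true,     // "vision": image parts accepted in prompts
    },
};
```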
## Getting Started
### Prerequisites
- VS Code version 1.103.0 or higher
- Node.js and npm installed
### Installation and Development

- Clone this repository
- Navigate to the extension directory: `cd chat-model-provider-sample`
- Install dependencies: `npm install`
- Compile the extension: `npm run compile`
- Press `F5` to launch a new Extension Development Host window
- The extension will be active and ready to provide chat models
### Building and Watching

- Build once: `npm run compile`
- Watch mode: `npm run watch` (automatically recompiles on file changes)
- Lint code: `npm run lint`
## Usage
Once the extension is active:
- Open VS Code's chat interface
- Open the model picker and choose the option to manage models
- Select the sample provider
- Check the models you want to appear in the model picker
- Send a request to the selected model
## API Usage

This extension uses the proposed `chatProvider` API. The key components include:

- `vscode.lm.registerChatModelProvider()` - Registers the provider
- `LanguageModelChatProvider2` interface - Defines the provider contract
- `LanguageModelChatInformation` - Describes model capabilities
- `ChatResponseFragment2` - Handles streaming responses
## Customization

To create your own chat model provider:

- Modify the `SampleChatModelProvider` class in `src/provider.ts`
- Update the model information in `getChatModelInfo()`
- Implement your custom logic in `provideLanguageModelChatResponse()`
- Adjust the vendor ID and model IDs as needed
- Update the `package.json` with your extension details