leex279 committed
Commit b842e0c · unverified · 2 Parent(s): e196442840dd59

Merge pull request #1124 from stackblitz-labs/leex279-patch-readme-changes-v1

Files changed (1): README.md (+5 −3)
README.md CHANGED
@@ -1,10 +1,12 @@
-# bolt.diy (Previously oTToDev)
+# bolt.diy (Previously oTToDev)
 [![bolt.diy: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.diy)
 
 Welcome to bolt.diy, the official open source version of Bolt.new (previously known as oTToDev and bolt.new ANY LLM), which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
 
-Check the [bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for more information.
+-----
+Check the [bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for more offical installation instructions and more informations.
 
+-----
 Also [this pinned post in our community](https://thinktank.ottomator.ai/t/videos-tutorial-helpful-content/3243) has a bunch of incredible resources for running and deploying bolt.diy yourself!
 
 We have also launched an experimental agent called the "bolt.diy Expert" that can answer common questions about bolt.diy. Find it here on the [oTTomator Live Agent Studio](https://studio.ottomator.ai/).
@@ -91,7 +93,7 @@ project, please check the [project management guide](./PROJECT.md) to get starte
 
 ## Features
 
-- **AI-powered full-stack web development** directly in your browser.
+- **AI-powered full-stack web development** for **NodeJS based applications** directly in your browser.
 - **Support for multiple LLMs** with an extensible architecture to integrate additional models.
 - **Attach images to prompts** for better contextual understanding.
 - **Integrated terminal** to view output of LLM-run commands.