How to Get Started with Bolt.diy: Your Open Source AI Coding Assistant (oTToDev+Bolt.new)

BREAKING NEWS: oTToDev is Now the OFFICIAL Open Source Bolt.new

Previously known as oTToDev, Bolt.diy is the open-source counterpart of Bolt.new. The platform lets you select your preferred Large Language Model (LLM) for each project. Whether you want to use OpenAI, Anthropic, Ollama, or any of the other supported providers, Bolt.diy has you covered, and it is designed to be easily extended with additional models through the Vercel AI SDK.

Why Choose Bolt.diy?

Bolt.diy is a community-driven project originally founded by Cole Medin. It has grown into a leading open-source AI coding assistant, with a thriving community adding features to make your development process smoother and more effective.


Supported LLMs

Currently, Bolt.diy supports:

  • OpenAI
  • Anthropic
  • Ollama
  • OpenRouter
  • Gemini
  • LMStudio
  • Mistral
  • xAI
  • HuggingFace
  • DeepSeek
  • Groq

And many more! New integrations are constantly being added by contributors.


Features

  • AI-Powered Development: Build full-stack web applications directly in your browser.
  • Extensible Architecture: Integrate additional LLMs easily.
  • Version Control: Revert code to earlier versions for better debugging.
  • Project Export: Download your project as a ZIP file.
  • Integrated Terminal: See the output of LLM-run commands in real time.
  • Prompt Enhancements: Attach images to prompts for better context.
  • Docker Support: Containerize your project for easy deployment.
  • Streaming Outputs: View code execution results dynamically.

Setting Up Bolt.diy Locally

Prerequisites

Before starting, ensure the following are installed:

  1. Git: Download Git
  2. Node.js: Download Node.js

After installation:

  • Windows: Check that Node.js is in your system PATH.
  • Mac/Linux: Verify the PATH using echo $PATH.
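
To confirm both tools are available, you can print their versions from a terminal; if either command fails, revisit the installation or your PATH setup:

git --version
node --version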

Step 1: Clone the Repository

Run the following command to clone the stable branch:

git clone -b stable https://github.com/stackblitz-labs/bolt.diy
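
Then change into the newly created project directory (git clone creates a folder named bolt.diy by default):

cd bolt.diy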

Step 2: Configure Environment Variables

  1. Rename .env.example to .env.local.
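
From a terminal, one way to do this is to copy the example file and keep the original as a reference (shown for Mac/Linux; on Windows you can rename the file in Explorer or use the copy command):

cp .env.example .env.local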

Add your API keys. Example:

GROQ_API_KEY=YOUR_GROQ_API_KEY
OPENAI_API_KEY=YOUR_OPENAI_API_KEY

Note: Ollama runs locally and doesn’t require an API key.

Step 3: Install Dependencies

Run:

pnpm install

If pnpm isn't installed, install it globally with npm (omit sudo on Windows):

sudo npm install -g pnpm
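
Afterwards, you can confirm pnpm is on your PATH:

pnpm --version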

Step 4: Start the Application

Run the development server:

pnpm run dev

This will start the app locally in development mode.
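
The terminal prints the local URL to open; Vite-based setups like this one typically default to http://localhost:5173. If that port is busy, passing a different one through to the underlying dev command usually works (this assumes the dev script forwards extra arguments, which is standard pnpm behaviour):

pnpm run dev -- --port 3001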


Running with Docker

Prerequisites

  • Docker: Install Docker Desktop (or Docker Engine with Docker Compose) and make sure it is running.
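
You can confirm both the Docker engine and Compose are available before building (newer installs use docker compose version instead of the second command):

docker --version
docker-compose --version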

Steps

Build the Docker image:

npm run dockerbuild

Start the container:

docker-compose --profile development up

With the development profile, changes will hot-reload automatically.


Keeping Bolt.diy Updated

Navigate to your project folder:

cd your-project-folder

Pull the latest changes:

git pull origin main

Update dependencies:

pnpm install

Restart the app:

pnpm run dev
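
If you prefer, the same sequence can be chained into a single command (this assumes your clone lives in a folder named bolt.diy):

cd bolt.diy && git pull origin main && pnpm install && pnpm run dev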

Check the Bolt.diy Docs for detailed instructions and updates. Together, we’re building the future of AI-driven coding!

Source: https://github.com/stackblitz-labs/bolt.diy