How to Set Up Void IDE on Linux

What Is Void IDE?
Void IDE is an open-source alternative to Cursor, licensed under Apache 2.0 and hosted on GitHub. It was born as a fork of the VS Code codebase, retaining full extension and theme compatibility so you can migrate seamlessly. Void ships with Agent Mode (full read/write file operations), Gather Mode (read-only code exploration), and standard AI-powered completions—all locally controlled.
Prerequisites
Before installing Void IDE, make sure you have the following:
- Ubuntu 18.04+ (Desktop or Server).
- A user with sudo privileges.
- 200 MB of free disk space for the IDE itself.
- Ollama installed and running locally for LLM hosting (a quick connectivity check follows this list):
  - Install via Homebrew: brew install ollama
  - Or via the official Linux script: curl -fsSL https://ollama.com/install.sh | sh
  - Pull a model and test it (e.g. llama2): ollama pull llama2, then ollama run llama2 "Hello"
  - Make sure the Ollama server is running: ollama serve (or, if you used the install script, it runs as a systemd service; check it with sudo systemctl status ollama)
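Before moving on, it is worth confirming that the Ollama server is actually reachable. The two commands below are a minimal sanity check, assuming Ollama is listening on its default port 11434: the first queries the HTTP API for the locally pulled models, the second shows the same list via the CLI.

curl -s http://localhost:11434/api/tags
ollama list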
Quick Install on Ubuntu
- Download the latest Linux .deb from GitHub Releases (e.g. void_1.99.30034_amd64.deb).
  Download link: https://github.com/voideditor/binaries/releases
- Install with APT to auto-resolve dependencies (a quick post-install check follows the commands below):
cd ~/Downloads
sudo apt update
sudo apt install ./void_1.99.30034_amd64.deb
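Once the package is installed, you can confirm it registered correctly and launch the editor. The commands below are a sketch that assumes the Debian package and its executable are both named void (inferred from the .deb filename); if your output differs, use the path reported by dpkg -L instead.

dpkg -s void
dpkg -L void | grep bin/
void

Inside Void, you can then point the editor at your local models: its provider settings let you select a local provider such as Ollama, which is typically reachable at http://localhost:11434.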
Void vs. Cursor AI: Pros & Cons
| | Void IDE | Cursor AI |
| --- | --- | --- |
| Source & Privacy | Fully open source under Apache 2.0: audit the code, self-host your models, and keep all data in your infrastructure. | Proprietary; code and prompts typically go through Cursor's managed service unless you opt (and pay) for enterprise hosting. |
| Model Flexibility | Connect any LLM, open-source or commercial, without vendor lock-in. | Provides a curated frontier-model backend with SLA guarantees but limited ability to swap in your own LLMs. |
| Cost | Free to use; no subscription or usage fees. | Subscription-based; costs scale with usage and seats, potentially expensive for heavy users. |
| Ecosystem | Leverages the vast VS Code extension marketplace, but has a smaller community around AI-specific plugins. | Rich, first-party integrations (Composer, diff viewers, built-in agents) with deeper AI features out of the box. |
| Stability & Polish | Early-stage UX; occasional rough edges and less extensive documentation. | Mature, polished UI and streamlined onboarding optimized for rapid productivity. |
| Community | Community-driven roadmap; rapid iteration on GitHub with weekly contributor meetups. | Backed by Anysphere with deep pockets (recent $9 bn valuation); robust support but less direct community control. |
Windows is easy too: just download the MSI installer package from the GitHub Releases page and run the one-click install.
Happy coding!