Open Brain
● NEURAL OS — macOS arm64


The ultimate Control Tower for developers operating multiple projects with AI.
Unify Cursor, Windsurf, Claude Code, and Aider sessions entirely offline.
100% Secure · Local Inference · Zero Telemetry

Download Now See Core Modules ↓
OPEN BRAIN OS — TERMINAL v1.2.3

4 Nodes · 3.2B Params · 512 KIs Synced · 20ms Latency · 100% Offline
Session security · Externalized prompts · Autopilot uptime
9 Specialized Tabs · 15+ IPC Channels · 30s Auto-Sync Cycle · €0 Cloud Dependencies

Deep Context UNION Sync

Stop wasting API tokens re-explaining architectures to Windsurf, Cursor, and Aider individually.

Open Brain operates as a central daemon that compiles your Git activity and Knowledge Items into standard .cursorrules, CLAUDE.md, and .windsurfrules files, keeping them in sync within milliseconds.

  • Native injection into CLI and GUI agents.
  • Automatic cleanup across workspaces.
CURSOR CLAUDE AIDER
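The sync step can be sketched in TypeScript as below. Note that `KnowledgeItem`, `compileRules`, and `syncWorkspace` are hypothetical names invented for illustration; the real daemon's file contents and internal types may differ.

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Hypothetical shape of a Knowledge Item; Open Brain's actual schema is not shown here.
interface KnowledgeItem {
  title: string;
  body: string;
}

// Compile Knowledge Items into one rules payload shared by every agent file.
export function compileRules(project: string, items: KnowledgeItem[]): string {
  const header = `# ${project} — generated by Open Brain, do not edit by hand\n`;
  const sections = items.map((ki) => `## ${ki.title}\n${ki.body}`);
  return [header, ...sections].join("\n");
}

// Write the same payload under each agent's expected filename in the workspace root.
export function syncWorkspace(root: string, project: string, items: KnowledgeItem[]): void {
  const payload = compileRules(project, items);
  for (const target of [".cursorrules", ".windsurfrules", "CLAUDE.md"]) {
    fs.writeFileSync(path.join(root, target), payload, "utf8");
  }
}
```

One payload, three filenames: each agent reads its own convention, but the context stays identical everywhere.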

100% OFF-GRID Inference

Invoke Ollama's Llama 3.2 1B natively from the control terminal. The Neural Engine analyzes code diffs, logs, and processes without transmitting a single byte outside your firewall.

Paranoia-grade isolation. You can inspect proprietary infrastructure, monitor raw SSH ports, and evaluate secret-laden env files, knowing you are completely decoupled from external APIs.

[system] generic-arm64-darwin...
> boot llama-3.2-1b-instruct --ctx 8192
[ok] Weights loaded into VRAM (0.8GB)
[log] Listening strictly on 127.0.0.1:11434
> input: Analyze local docker-compose.yml
[inference] "Found 3 isolated networks..."
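A minimal client for that loopback daemon might look like this, assuming Ollama's documented `/api/generate` endpoint on `127.0.0.1:11434`; the model tag `llama3.2:1b` is an assumption about how the weights are named locally.

```typescript
// Request body for Ollama's /api/generate endpoint.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

export function buildRequest(prompt: string, model = "llama3.2:1b"): GenerateRequest {
  // stream:false asks Ollama to return a single JSON object instead of a chunk stream.
  return { model, prompt, stream: false };
}

// Every byte stays on the loopback interface; no external host is ever contacted.
export async function askLocal(prompt: string): Promise<string> {
  const res = await fetch("http://127.0.0.1:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildRequest(prompt)),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```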

AST-Aware Git Radar

Manual documentation is archaic. Let the Git Radar daemon observe your local .git trees dynamically.

It deciphers AST mutations, refactors, and commit diffs across your projects. Upon detecting complex architectural modifications, Open Brain automatically compiles and serializes a new semantic Knowledge Item.
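A simplified radar pass could look like the sketch below. The churn threshold and `shouldEmitKI` heuristic are invented stand-ins for whatever Open Brain actually uses to detect complex architectural modifications.

```typescript
import { execSync } from "node:child_process";

// Parse `git diff --numstat` output into per-file change counts.
export function parseNumstat(out: string): { file: string; added: number; removed: number }[] {
  return out
    .trim()
    .split("\n")
    .filter(Boolean)
    .map((line) => {
      const [added, removed, file] = line.split("\t");
      return { file, added: Number(added), removed: Number(removed) };
    });
}

// Decide whether a change set is large enough to serialize a new Knowledge Item.
// The threshold of 50 changed lines is an illustrative assumption.
export function shouldEmitKI(changes: { added: number; removed: number }[], threshold = 50): boolean {
  const churn = changes.reduce((n, c) => n + c.added + c.removed, 0);
  return churn >= threshold;
}

// One radar tick over a local repo: diff the last commit and check the heuristic.
export function scanRepo(repoPath: string): boolean {
  const out = execSync("git diff --numstat HEAD~1 HEAD", { cwd: repoPath, encoding: "utf8" });
  return shouldEmitKI(parseNumstat(out));
}
```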

Cryptography Vault

Never inject naked API keys into env files that get accidentally pushed to origin. Open Brain acts as your secure keychain proxy.

Store your OpenAI, Anthropic, Gemini, or database credentials centrally, encrypted with the Fernet symmetric encryption scheme and written to disk with restricted file permissions. Map them in dynamically when spinning up your IDE instances.
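The vault flow can be sketched as follows. Fernet itself comes from the Python `cryptography` ecosystem; this sketch substitutes Node's built-in AES-256-GCM purely to illustrate the encrypt-then-store-with-restricted-permissions pattern, and `saveVault` plus the token layout are illustrative assumptions, not Open Brain's actual on-disk format.

```typescript
import * as crypto from "node:crypto";
import * as fs from "node:fs";

// Encrypt a secret with AES-256-GCM; token = base64(iv || authTag || ciphertext).
export function encryptSecret(key: Buffer, plaintext: string): string {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

// Reverse the layout above and verify the auth tag before returning the plaintext.
export function decryptSecret(key: Buffer, token: string): string {
  const raw = Buffer.from(token, "base64");
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28);
  const ct = raw.subarray(28);
  const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}

// Persist the vault readable by the owner only (mode 0600).
export function saveVault(file: string, token: string): void {
  fs.writeFileSync(file, token, { mode: 0o600 });
}
```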

CAPABILITIES

Everything you control
from a single place

Each tab is a production tool. No cloud APIs, no subscriptions. Your data, your machine.

🤖 Neural Terminal

Chat with Llama 3.2 locally. Your AI understands your full runtime context without exposing a single byte to the internet.

⚡ Prompt Vault

Repository of reusable prompts with tagging, search, and one-click injection straight into the Neural Terminal.

📡 Knowledge Base

Automatic indexing of architectural decisions and snippets into structured Knowledge Items. Encode sessions into persistent memory.

🔗 IDE Sync (UNION)

Generates `.cursorrules`, `.windsurfrules`, and `CLAUDE.md` directives so every AI assistant syncs context back to Open Brain dynamically.

🔑 API Manager

Stop hardcoding sensitive keys. Store any API key in a secure vault and have it retrieved globally whenever a project requires it.

🖥️ Server Monitor

Direct SSH live checks. Watch RAM, disk, Docker containers, and PM2 processes in real time to monitor stability and uptime.
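As a sketch of one such probe, here is how the output of `free -m` from a remote host could be parsed into RAM stats. The SSH transport itself (handled by Node SSH2 in the real stack) is omitted, and the `MemStats` shape is an assumption.

```typescript
export interface MemStats {
  totalMb: number;
  usedMb: number;
}

// Parse the `Mem:` row of `free -m` output into used/total megabytes.
export function parseFreeM(out: string): MemStats {
  const mem = out.split("\n").find((l) => l.startsWith("Mem:"));
  if (!mem) throw new Error("unexpected `free -m` output");
  // Columns after the label: total, used, free, shared, buff/cache, available.
  const cols = mem.trim().split(/\s+/);
  return { totalMb: Number(cols[1]), usedMb: Number(cols[2]) };
}
```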

🌐 MCP Server

Exposes local intelligence across your entire IDE ecosystem via the Model Context Protocol, providing Llama 3.2 context directly inside Claude.

🛠️ Maintenance Tools

Surgical tools to find zombie processes holding your ports, nuke stale caches, and visualize system logs, all embedded in the panel.

🎯 Git Radar (Auto)

Let the AI autonomously generate knowledge documentation from raw `git diff` signals, updating the system as soon as you finish coding.

ARCHITECTURE

Built with
production-grade tech

Modern stack, local-first, zero cloud dependencies. Everything runs on your machine.

Electron 41
React 19
TypeScript 5
Vite 8
Tailwind CSS v4
Framer Motion
shadcn/ui
IPC Channels
Node SSH2
Local JSON Persistence
Filesystem API
macOS arm64 Native
LIVE ARCHITECTURE MAP
WORKFLOW

How it works

Install, open, and the panel syncs automatically with your environment.

1

Install the DMG

Download the native macOS arm64 installer and drag it to Applications. Lightning-fast setup.

2

Auto-detection

The panel automatically reads your ~/.openbrain/ repository and loads KI artifacts globally.

3

Live Context Injection

Fire up the UNION IDE Sync modules to pass parameters directly to Claude Code or Cursor AI.

4

Activate Inference

Use the Neural Terminal to ask Ollama's Llama 3.2 1B questions about your local infrastructure.

5

Operate with absolute privacy

Keep every code snippet on your host machine while still giving your agents full project knowledge.
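Step 2's auto-detection can be sketched as a simple directory scan. The assumption that KI artifacts live as flat `*.json` files directly under `~/.openbrain/` is mine for illustration; the real layout is not documented here.

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Load every *.json Knowledge Item artifact found under the given root directory.
export function loadKnowledgeItems(root: string): Record<string, unknown>[] {
  if (!fs.existsSync(root)) return []; // fresh install: nothing to load yet
  return fs
    .readdirSync(root)
    .filter((f) => f.endsWith(".json"))
    .map((f) => JSON.parse(fs.readFileSync(path.join(root, f), "utf8")));
}
```

In the app this would run against `path.join(os.homedir(), ".openbrain")` at startup.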

NEURAL OS DOWNLOAD

Deploy Natively

Optimized exclusively for macOS arm64 architecture (Apple Silicon). Download the `.dmg` package.
No cloud accounts needed.

Get the Master Build

Download the verified DMG installer from the official shared repository. Installation takes less than 30 seconds.

GITHUB REPO DOWNLOAD DMG

Compiled by Nacho (v1.2.3)