· GET STARTED ·

Requirements

Liege runs the heavy LLM work locally on your machine, so your hardware matters. Here's what you need.

Operating system

| OS | Supported | Notes |
| --- | --- | --- |
| Windows 10 / 11 (x64) | ✅ Primary | Most tested. PowerShell used for env setup. |
| macOS 13+ (Apple Silicon) | ✅ Supported | M1/M2/M3. Intel Macs work but are slower. |
| Linux (Ubuntu 22+) | ⚠️ Unsupported | Should work, but not tested. Use at your own risk. |

Hardware minimums

|  | Minimum | Recommended |
| --- | --- | --- |
| RAM | 16 GB | 32 GB |
| GPU VRAM | 8 GB (RTX 3060 or better) | 12 GB+ (RTX 4070 / M2 Pro / M3 Pro) |
| Disk | 30 GB free | 100 GB free (for models + logs) |
| CPU | 4 cores | 8+ cores |
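If you want to eyeball the disk and memory minimums from a terminal, something like the following works on macOS and Linux (Windows users can check Task Manager or `systeminfo`). This is a convenience sketch, not part of Liege itself:

```shell
# Free disk space on the current volume (you want ~30 GB free minimum)
df -h .

# Total RAM:
#   Linux:  reads MemTotal from /proc/meminfo (kB)
#   macOS:  sysctl reports hw.memsize in bytes; divide by 1024^3 for GB
grep MemTotal /proc/meminfo 2>/dev/null || sysctl -n hw.memsize
```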
No GPU? You can still run Liege: local CPU inference on small models handles basic tasks. Heavy content drafting and intel tasks will escalate to Claude (budget ~$30–$50/mo), and CPU inference runs roughly 10x slower than GPU.

Software prerequisites

At minimum, install the tools the version check below expects: Node.js (with npm), Git, and Ollama (for local model serving).

Cloud accounts you'll connect

You don't need all of these on day one. Connect what applies to your business:

Budget assumptions

Local inference is $0 per run. The main recurring cost is Claude API usage when heavy tasks escalate beyond local models (budget roughly $30–$50/mo, as noted above).

Check your machine now

Run this quick check from PowerShell (Windows) or Terminal (Mac):

```shell
node --version
npm --version
git --version
ollama --version
```

All four should print a version. If any say "command not found," install them first.
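The four checks can also be rolled into one loop that flags anything missing. This is a convenience sketch using the standard `command -v` lookup, not an official Liege script:

```shell
# Report which of the required tools are on PATH
for tool in node npm git ollama; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "OK: $tool"
  else
    echo "MISSING: $tool - install it before continuing"
  fi
done
```

Any line starting with `MISSING:` tells you what to install before moving on.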

Next

Ready? Pick your OS: Windows install · Mac install