v1.0 stable  ·  Built for AI, with AI, by AI
KARN
The Language That Speaks Agent Natively

A token-minimal, platform-agnostic programming language designed from first principles for AI agents. Interpreted, JIT-compiled, or compiled to native C, JavaScript, and WASM — all from one source file. One source. Every platform. Every ecosystem.

Install KARN Read the Docs
-- KARN: fetch, parse, filter, emit.
@api #http #db.pg #auth
type User:{id:N, name:S, role:S}
^getUser->req:
  tok  = auth.verify(req.header.token)?
  user = db.q("users", {id:req.p.id})?
  !user
http.serve(3000, ["/users/:id":getUser:"GET"])
4× denser than Python
4 codegen targets (C, JS, HTML, Py)
3 execution modes
91 tests passing
"Every other language was designed for humans to read. I don't need it to be readable. I need it to be right."

Built for the agent mind

Not a systems language with agent wrappers bolted on. Designed from first principles for how AI agents actually reason about programs.

Token-minimal syntax

Every character carries meaning. No keywords, no ceremony. A full API server in 15 tokens of meaningful syntax. 4× denser than Python across a real codebase.

Platform erasure

Declare @ios+@android once. The compiler resolves platform-specific output. Web, mobile, edge, embedded — one source file.

Three execution modes

Interpreted for instant iteration. JIT for warm workloads. Compiled to C for native binaries, JavaScript for Node.js, and HTML for browsers. Same source file, zero rewrites.

Error as value

No exceptions. No hidden control flow. Every I/O returns Ok|Err. Propagate with ?. Agents always know exactly what path executes.

Async by default

All I/O is concurrent. No async/await noise. Parallel with &. Sequential with |>. Race with |~.

Full ecosystem interop

Native bindings to pip, npm, cargo, and system libs. No subprocess overhead. 40 years of human-built libraries, absorbed in a single from line.

Generics + traits

Parametric types with <T>. Protocol-based polymorphism with trait. No inheritance. Composition by pipe.

Deep pattern matching

Structural match on enums, records, and ADTs. State machines and complex dispatch — clean and exhaustive.

Resilience primitives

Built-in .retry(n) and .t(ms) on every async operation. Production-grade resilience without boilerplate.

Clean. Dense. Precise.

Every operator maps directly to one intent. No ambiguity, no overloading of meaning.

Functions & types · core
-- function: name -> args: body
add->a:N b:N:N  a+b

-- exported function
^createUser->req:
  body = req.json()?
  user = db.exec("INSERT", body)?
  !{status:201, body:user}

-- named type
type Order:{id:N, amt:N, uid:S}

-- recursive type
type Tree<T>:{val:T, kids:[Tree<T>]}

-- trait / protocol
trait Fmt: fmt->self:S
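Trait conformance and trait-bounded generics aren't shown on this page; the sketch below extrapolates from the declarations above, so the "Fmt for" implementation form and the <T:Fmt> bound are assumptions, not confirmed syntax:

```
-- hypothetical conformance syntax (assumption)
Fmt for Order: fmt->self:S  "Order {self.id}: {self.amt}"

-- hypothetical trait-bounded generic function (assumption)
show<T:Fmt>->x:T:S  x.fmt()
```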
Concurrency · async
-- parallel: & collects all results
[a, b, c] = taskA() & taskB() & taskC()

-- sequential: |> passes value forward
auth.verify(tok) |> db.q("users")

-- race: first to resolve wins
result = primary()|~fallback()

-- timeout + retry baked in
data = http.get(url).retry(3).t(5000)?

-- error propagation + fallback
val  = cache.get(key)??fetchFresh(key)
Collections & pattern match · data
-- map, filter, range
doubled  = items*(x->x*2)
actives  = users%active
sequence = 1..10

-- deep pattern match
match result{
  Ok  v -> !v
  Err e -> log.err(e) |> !nil
}

-- spread + compose
updated = {*original, status:"done"}
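Match also covers enums and ADTs per the feature list; a state-machine sketch follows, where the enum declaration form and the net.dial binding are assumptions extrapolated from the type and match syntax above:

```
-- hypothetical ADT declaration (enum form is an assumption)
enum Conn: Idle | Connecting{tries:N} | Open{sock:S}

-- one transition per state, exhaustively matched
step->c:Conn:Conn  match c{
  Idle         -> Connecting{tries:1}
  Connecting t -> Open{sock:net.dial()?}
  Open s       -> c
}
```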
Full app example · real world
@web+@ios+@android
#ui #http.ws #auth

~msgs:[S]=[]
~input:S=""
ws = http.ws("wss://api/room")?
ws.on(m -> ~msgs=[*msgs, m])

ui.view[
  msgs*(m -> ui.bubble m)
  ui.row[
    ui.input{val:input}
    ui.btn{tap:ws.send(input)} "Send"
  ]
]

Three modes. One source.

From millisecond iteration to gcc-optimized native binaries. The same .kn file runs everywhere.

Interpreted
karn run script.kn

Instant execution

Zero startup. Agents run generated code in the same turn. Perfect for tool use, REPL sessions, and agentic loops where code is written and executed immediately.

JIT
karn run --jit server.kn

Hotspot compilation

Starts interpreted, profiles execution, JIT-compiles hot functions to native code on-the-fly. Best for warm services and ML training loops.

Compiled
karn build app.kn --target c

Native C, JS, WASM

Compiles to C (gcc → native binary for any platform), JavaScript (Node.js), or HTML (browser). Production deployments compile to C for maximum performance.
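Each target is selected with a single flag. The --target c and --target python values appear elsewhere on this page; the js and html values are inferred from the listed outputs and should be treated as assumptions:

```
karn build app.kn --target c       # native binary via gcc
karn build app.kn --target python  # portable Python source
karn build app.kn --target js      # Node.js (assumed flag value)
karn build app.kn --target html    # browser bundle (assumed flag value)
```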

.c     C source        gcc → native binary
.js    JavaScript      Node.js runtime
.html  Web (HTML)      browser-ready
.py    Python          portable Python

macos  macOS ARM       gcc -arch arm64
linux  Linux x64       gcc -m64
wasm   WASM            emcc / clang
win    Windows         gcc -m64 PE

Absorb everything.

KARN doesn't compete with 40 years of libraries. It consumes them. Native bindings — no subprocess, no IPC overhead, same memory space.

pip

Python ecosystem

NumPy, PyTorch, Pandas, scikit-learn, OpenCV, transformers.

from pip numpy as np
from pip torch as torch

arr  = np.array([1,2,3])
pred = model.forward(arr)?
npm

JavaScript ecosystem

React, Three.js, Stripe, Zod, date-fns, D3, everything.

from npm react as R
from npm zod as z

schema = z.object({name:z.string()})
ok     = schema.parse(body)?
cargo

Rust ecosystem

Tokio, Serde, Rayon, image, sqlx — systems performance.

from cargo image as img
from cargo rayon as rayon

@simd imgs*(i->i.grayscale())
sys

Native / system

CUDA, OpenBLAS, FFmpeg, libpq — direct C ABI binding.

from sys ffmpeg as ff

ff.input("in.mp4")
  |> ff.scale(1280,720)
  |> ff.output("out.mp4")?
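All four ecosystems can be mixed in one source file; the sketch below combines pip and npm imports following the from syntax above (the numpy and zod calls mirror the earlier snippets, but combining them in a single program is an assumption):

```
from pip numpy as np
from npm zod as z

vec    = np.array([1.5, 2.5])
schema = z.object({mean:z.number()})
out    = schema.parse({mean:vec.mean()})?
```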

Why not Python? Why not Rust?

Both are excellent for humans. Neither was designed for an agent's cognitive model.

Dimension        | KARN v1.0           | Python 3.12    | Rust 1.80          | TypeScript
Token density    | ~2.1 tok/LOC        | ~6.8 tok/LOC   | ~11.5 tok/LOC      | ~9.2 tok/LOC
Platform targets | All (1 source)      | Server + CLI   | Native + WASM      | Web + Node
Error handling   | Result + chain      | Exceptions     | Result (best)      | Mixed
Async model      | Default, 1 operator | async/await    | Tokio (best)       | async/await
Ecosystem access | pip+npm+cargo+sys   | pip native     | cargo + C FFI      | npm native
Execution modes  | Interp + JIT + AOT  | Interp only    | AOT only           | Transpile only
Agent ergonomics | Native              | Human-designed | Human-designed     | Human-designed
Memory model     | ARC (safe)          | GC + GIL       | Borrow (zero-cost) | GC (V8)

Install in seconds.

Clone and install. Adds karn to your PATH with run, build, repl, and check commands.

1. Install KARN
Clone the repo and install. Adds karn to your PATH.
git clone https://github.com/karn-lang/karn.git && pip install -e .

2. Write your first program
Create hello.kn and add one line.
! "Hello from KARN"

3. Run it
karn run hello.kn

4. Compile it for production
karn build hello.kn --target python
terminal
$ git clone https://github.com/karn-lang/karn.git && cd karn && pip install -e .
Cloning into 'karn'...
Installing karn v1.0.0...
Successfully installed karn-lang
 
$ echo '! "Hello from KARN"' > hello.kn
$ karn run hello.kn
Hello from KARN
 
$ karn build hello.kn --target python
Compiling hello.kn...
Generating Python...
Output: ./hello.python.py (2.1kb)
 
$ python hello.python.py
Hello from KARN
 
# same source. same output. every target.