In Koru, Coding Is Compiler Coding
We wrote a post about the tyranny of general-purpose languages. It introduced four levels of Koru coding: flow orchestration, event design, proc implementation, library design. We argued that the language structure naturally guides developers to the right level.
We left something out.
There’s a level above library design. And discovering it — not by planning it, but by watching the language reveal it as we built real things — is what this post is about.
What We Left Out
The four-level post described library design like this:
// HTTP library
~event http.get { url: []const u8 }
| ok { body: []const u8, status: u16 }
| not_found
| server_error { code: u16, msg: []const u8 }
| network_error []const u8

Clean event declarations. Thoughtful branch design. That’s genuinely good library work, and it’s what something like our gzip library looks like:
~pub event compress {
data: []const u8,
level: ?Level,
allocator: std.mem.Allocator
}
| compressed { data: []const u8, original_size: usize }
| error []const u8

Events and procs. C wrapped in a clean Koru interface. Levels 1 through 3. This is real, useful, good work.
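For readers who don’t write Koru, the shape of that compress event — data and an optional level in, either a compressed branch or an error branch out — can be sketched in Python. This is my illustration of the branch-style API, not orisha’s or the gzip library’s actual code:

```python
import gzip
from typing import Optional

def compress(data: bytes, level: Optional[int] = None) -> dict:
    """Sketch of the compress event's contract: instead of raising,
    return a 'compressed' branch on success or an 'error' branch on failure."""
    try:
        out = gzip.compress(data, compresslevel=level if level is not None else 9)
        return {"branch": "compressed", "data": out, "original_size": len(data)}
    except (ValueError, OSError) as e:
        return {"branch": "error", "message": str(e)}
```

The caller then switches on the branch name, much like a Koru continuation switches on | compressed and | error.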
But then we built orisha.
The Router That Reveals Everything
Here is a complete HTTP server:
~import "$orisha"
~orisha:handler = orisha:router(req)
| [GET /] |> response { status: 200, body: "Hello World" }
| [GET /users/:id] p |> response { status: 200, body: p.id }
| [POST /users] _ |> response { status: 201, body: "Created" }
| [*] |> response { status: 404, body: "Not Found" }
~orisha:serve(port: 3000)
| shutdown s |> std.io:print.ln("{{s.reason}}")
| failed f |> std.io:print.ln("{{f.msg}}")

That [GET /users/:id] syntax? That’s not in Koru. We invented it. The orisha:router transform reads the continuation branches, parses the method and path pattern out of the branch labels, generates conditions, extracts parameters, and inlines everything, producing a flat if/else chain with zero runtime overhead.
173,000 requests per second. No dispatch table. No hash map. The router is gone before the first request arrives.
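To make “the router is gone” concrete, here is a Python sketch, my illustration rather than actual generated output, of what the inlined result is equivalent to: a flat chain of segment comparisons with parameters bound in place, no table or map consulted at dispatch time:

```python
def handle(method: str, path: str) -> dict:
    """Hand-written equivalent of the flat if/else chain the transform emits
    for the four routes in the example server."""
    seg = path.strip("/").split("/") if path != "/" else []
    if method == "GET" and seg == []:
        return {"status": 200, "body": "Hello World"}
    if method == "GET" and len(seg) == 2 and seg[0] == "users":
        p_id = seg[1]  # the :id parameter, extracted inline
        return {"status": 200, "body": p_id}
    if method == "POST" and seg == ["users"]:
        return {"status": 201, "body": "Created"}
    return {"status": 404, "body": "Not Found"}  # the [*] catch-all
```

In the real thing this chain is native code baked in at compile time; the sketch only shows the shape of the output, not its performance.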
And then there’s the full server:
~import "$orisha"
~import "$std/runtime"
~import "$std/fmt"
~pub event greet { name: []const u8 }
| greeted []const u8
~greet = std.fmt:ln("Hello, {{name:s}}!")
| line l |> greeted l.text
~pub event add { a: i64, b: i64 }
| sum i64
~add = sum a + b
~std.runtime:register(scope: "api") {
greet(5)
add(2)
}
~orisha:static(name: "site", root: "public", fallback: "index.html")
~orisha:handler = orisha:router(req)
| [GET /api/health] |> response { status: 200, body: "{\"status\":\"ok\"}", content_type: "application/json" }
| [POST /api/eval] _ |> std.runtime:run(source: req.body orelse "", scope: "api", budget: 1000)
| result r |> std.fmt:ln("{\"result\":{{r.value.toJsonBuf():s}},\"budget_used\":{{r.used}}}")
| line l |> response { status: 200, body: l.text, content_type: "application/json" }
| exhausted e |> std.fmt:ln("{\"error\":\"budget_exhausted\",\"used\":{{e.used}}}")
| line l |> response { status: 429, body: l.text, content_type: "application/json" }
| parse_error e |> std.fmt:ln("{\"error\":\"parse_error\",\"message\":\"{{e.message:s}}\"}")
| line l |> response { status: 400, body: l.text, content_type: "application/json" }
| event_denied e |> std.fmt:ln("{\"error\":\"event_denied\",\"name\":\"{{e.name:s}}\"}")
| line l |> response { status: 403, body: l.text, content_type: "application/json" }
| [*] |> orisha:static_router(name: "site")
~orisha:serve(port: 3000)
| shutdown s |> std.io:print.ln(text: s.reason)
| failed f |> std.io:print.ln(text: f.msg)

83 lines. What’s happening here:
- Compiled native events: greet and add run at Zig speed
- A sandboxed live interpreter: ~std.runtime:register declares which events the interpreter can call, and budget: 1000 cuts it off after 1000 operations
- Compile-time embedded static files: ~orisha:static walks the filesystem, reads every file, gzip-compresses where beneficial, computes SHA256 ETags, and bakes everything into the binary at koruc time
- A router that composes all three: pattern-matched, inlined, producing the right HTTP status codes for every error branch
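The per-file work of that embedding step can be sketched in ordinary Python, using the same ingredients the list names (gzip where it helps, SHA256 for the ETag). The function name and dict layout are hypothetical, not orisha’s internals:

```python
import gzip
import hashlib

def embed_asset(name: str, content: bytes) -> dict:
    """Sketch of per-file compile-time work: gzip only when it actually
    shrinks the file, and derive an ETag from a SHA256 of the content."""
    compressed = gzip.compress(content)
    use_gzip = len(compressed) < len(content)
    return {
        "name": name,
        "body": compressed if use_gzip else content,
        "gzip": use_gzip,
        "etag": hashlib.sha256(content).hexdigest()[:16],  # truncated for a header
    }
```

At serve time nothing is computed: the body, flag, and ETag are already sitting in the binary.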
This is not a demo. This is a working server.
What The Library Author Is Actually Doing
Here is the type signature of orisha:router:
~[comptime|transform]pub event router {
invocation: *const Invocation,
item: *const Item,
program: *const Program,
allocator: std.mem.Allocator
}
| transformed *const Program

The library author is writing a rewrite rule. The compiler’s transform pass finds every invocation of orisha:router in the program and calls this proc, handing it a pointer to the exact invocation being compiled, a pointer to the containing flow item, and the entire program AST. It returns a new program.
They have:
- The full program — global visibility over everything the user wrote
- A direct pointer to the current node — no searching required
- An allocator — to build whatever AST they want
They walk the continuation branches, parse [GET /users/:id] out of branch labels that don’t exist in Koru’s grammar, generate Zig conditions and parameter extraction, and return a rewritten program. The [*] catch-all with orisha:static_router nested inside it? That’s a nested transform, composed at the AST level.
The user never sees any of this. They write | [GET /users/:id] p |> and get parameter extraction for free.
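The label parsing itself is easy to picture. Here is a Python sketch (hypothetical — the real transform works on Koru’s AST, not strings) of turning a [GET /users/:id] label into a method plus a segment pattern:

```python
def parse_route_label(label: str) -> tuple:
    """Split a '[METHOD /path/:param]' branch label into its method and
    segments, tagging each segment as a literal or a named parameter."""
    method, path = label.strip("[]").split(" ", 1)
    segments = []
    for seg in path.strip("/").split("/"):
        if seg.startswith(":"):
            segments.append(("param", seg[1:]))
        elif seg:
            segments.append(("literal", seg))
    return method, segments
```

From a pattern like this, the transform generates the comparison for each literal segment and the extraction for each parameter, then splices the result into the program.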
The Keywords Are The Same Thing
This is not a special capability reserved for orisha. Here is ~if:
~[keyword|comptime|transform]pub event if {
expr: Expression,
invocation: *const Invocation,
item: *const Item,
program: *const Program,
allocator: std.mem.Allocator
}
| transformed *const Program

~if is a library event. It lives in control.kz in the standard library. When you write ~if(x > 0) | then |> ... | else |> ..., the compiler runs the if transform, which finds the | then |> and | else |> branches, constructs a ConditionalNode, and returns a rewritten program.
The [keyword] annotation is what lets you write ~if(...) instead of ~std.control:if(...). It promotes the event name into unqualified syntax. The machinery is identical.
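A toy version of that mechanism, in Python with made-up node types that stand in for Koru’s actual AST: a transform receives the continuation branches and returns a single conditional node.

```python
from dataclasses import dataclass

@dataclass
class Branch:
    label: str   # e.g. "then" or "else"
    body: str    # stand-in for the branch's sub-AST

@dataclass
class ConditionalNode:
    condition: str
    then_body: str
    else_body: str

def if_transform(condition: str, branches: list) -> ConditionalNode:
    """Toy analogue of the ~if transform: locate the 'then' and 'else'
    branches and rewrite them into one conditional node."""
    by_label = {b.label: b for b in branches}
    return ConditionalNode(
        condition=condition,
        then_body=by_label["then"].body,
        else_body=by_label["else"].body,
    )
```

The real transform does this against *const Program pointers and returns a rewritten program, but the move is the same: branches in, control-flow node out.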
~for, ~capture, ~const — all transforms. All user-space. The orisha team and the standard library team are playing with the same tools.
The Compiler Pipeline Is Also Just Koru
There’s a further level. Here is the default compiler pipeline, in compiler.kz:
~coordinate = context_create(program_ast, allocator)
| created c0 |> frontend(ctx: c0.ctx)
| ctx c1 |> analysis(ctx: c1)
| ctx c2 |> std.optimizer:optimize(ctx: c2)
| ctx c3 |> emission(ctx: c3)
| ctx c4 |> coordinated { ast: c4.ctx.ast, code: c4.code }

coordinate is an [abstract] event. frontend, analysis, emission: all abstract. A user program can write ~std.compiler:coordinate = my_pipeline(...) and replace the entire compilation process. Or just override emission to target a different backend.
This is the compiler’s pipeline written in the language the compiler compiles.
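The overridable-pipeline pattern is familiar outside Koru too. A minimal Python sketch, with stage names mirroring the post but everything else hypothetical: stages held as data, so a caller can rebind any one of them.

```python
def frontend(ctx):
    return {**ctx, "parsed": True}

def analysis(ctx):
    return {**ctx, "checked": True}

def emission(ctx):
    return {**ctx, "code": "native"}

# The default pipeline, analogous to the abstract events in compiler.kz.
DEFAULT_STAGES = {"frontend": frontend, "analysis": analysis, "emission": emission}

def coordinate(program, stages=DEFAULT_STAGES):
    """Run the stages in order, threading the context through each one."""
    ctx = {"ast": program}
    for name in ("frontend", "analysis", "emission"):
        ctx = stages[name](ctx)
    return ctx

# Overriding a single stage, e.g. emitting for a different backend:
def wasm_emission(ctx):
    return {**ctx, "code": "wasm"}
```

In Koru the rebinding happens by redefining an abstract event rather than passing a dict, but the effect is the same: the pipeline is user-replaceable, wholesale or stage by stage.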
The Levels, Completed
So the four levels from the earlier post are right, but incomplete. Here they are in full:
Level 1 — Flows. What you read when you look at Koru code. Events chaining through branches. Readable, composable, accessible to anyone.
Level 2 — Event declarations. The domain model. Naming things, describing their shape and all possible outcomes. This is where the problem space lives.
Level 3 — Procs. Implementation. Where the Zig goes. Mechanical once the event exists — could be written by a junior, an AI, or a domain specialist.
Level 4 — Transforms, expands, compiler overrides. Where you extend the language itself. Where ~if and ~orisha:router live. Where the library author becomes a compiler author.
And the key property of all four levels: none of them are locked. There’s no special access, no privileged API, no plugin system. A library author writing a router has exactly the same tools as the people who wrote the control flow keywords. The only difference is complexity — and complexity is earned, not granted.
The Ergonomics Claim
Here is the claim: Koru is the only language where a library can be simultaneously this inviting to a newcomer and this satisfying to someone who wants optimal code.
Not because it has the most powerful metaprogramming — though it does. But because the levels shield each other.
A newcomer sees | [GET /users/:id] p |> and understands it immediately. They never see the transform that generates it. They never read ast_functional.replaceFlowRecursive. They get a clean, readable API that feels like the language was designed for HTTP servers.
An expert opens orisha/lib/index.kz, reads the router transform, and sees exactly what it’s doing — no magic, no hidden machinery, just a well-written compiler pass using the same tools they have access to.
The gzip library works at levels 1-3. The router works at level 4. The user of gzip and the user of orisha both write clean Koru. The authors never left the language.
What We Discovered
We didn’t plan this. We built a gzip library, then an HTTP framework, then looked at what we’d made.
The gzip library wraps C. Clean events, good error branches, no surprises. Levels 1-3.
The router doesn’t exist at runtime. The library author wrote a compiler pass. The user writes routing tables. Level 4.
The standard library’s ~if is written the same way as ~orisha:router. There’s no difference in kind between a keyword and a library.
The compiler pipeline is written in the language it compiles. You can override it.
This is what we discovered: the levels were always there. The language just had to grow into them.
The orisha web framework is available at github.com/korulang/orisha. The Koru compiler is at github.com/korulang/koru. Both are early but working.