The Fever Dreams Before Dawn: BEIST and the Prehistory of Koru



The overwrought ancestors of event continuations.


The Discovery

We did digital archaeology on our own codebase.

We were looking for traces of something called “Semantic Algorithm Evolution” - a phrase that surfaced in an old directory called 6digit-beist copy. What we found was the conceptual prehistory of Koru: fever dreams from late 2024 that contained genuinely good ideas buried under layers of overwrought presentation.

This is the prequel to The Birth of Event Continuations. That post documented the Night of August 27, 2025 - the sleepless night when S-expression chains and ring buffers emerged. But the concepts that became event continuations and taps? They were born earlier, in a project called BEIST.

Bidirectional Entropy Intelligence Synthesis Technology.

Yes, really.


The BEIST Era

BEIST was an attempt to make LLMs generate algorithms through “semantic evolution” rather than direct code generation. The theory was sound: instead of asking an LLM to produce complete, syntactically correct code in one shot, you would guide it through stages of understanding, each stage producing fragments that built toward implementation.

The execution was… enthusiastic.

The Report Generator

We found a file called beist-evolution-report-generator.ts. It was 1,055 lines of TypeScript that generated beautiful HTML reports showing the “evolution” of algorithms through semantic stages.

The reports had:

  • Gradient backgrounds (dark mode, naturally)
  • Collapsible sections with smooth animations
  • Interactive flow graphs with play/pause/step controls
  • Confidence bars showing “evolution progress”
  • A “Prompt Audit” section for “transparency”

All of this… to generate sphere volume calculations:

// Test with problems that are NOT in any prompts
const cleanTests = [
  'Calculate the volume of a sphere given its radius',
  'Convert miles to kilometers',
  'Calculate body mass index (BMI) from weight and height'
];

Freshman homework dressed up as “Semantic Algorithm Evolution.”

And yes, there was a hardcoded API key committed to source control:

const apiKey = 'sk-or-v1-99695d1f4ed8...';  // Yes, really. Committed to source control.

We were young. We were naive. We were very, very tired.


The Actually Good Ideas

But here’s the thing: buried under the gradient backgrounds and confidence bars were three genuinely powerful concepts that would eventually become core Koru features.

1. Mnemonics: Named Conditions

In BEIST, a mnemonic was a boolean condition with a semantic name:

import { z } from 'zod';

const MnemonicSchema = z.object({
  name: z.string().describe('Semantic name for this condition'),
  expression: z.string().describe('Boolean expression'),
  purpose: z.string().describe('Why this condition is needed'),
  enables: z.string().describe('What this mnemonic enables when true')
});

// Example mnemonic
{
  name: "has_more_elements",
  expression: "state.index < state.array.length",
  purpose: "Check if iteration should continue",
  enables: "process_next_element"
}

The insight was separation: the name carries the intent, the expression carries the implementation.

An LLM could reason about has_more_elements without parsing state.index < state.array.length. A human could understand the algorithm’s structure by reading the mnemonic names alone. The boolean expression was an implementation detail.
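To make this concrete, here is a minimal sketch - ours, not recovered BEIST code - of what the separation buys: each mnemonic compiles into a named predicate over state. The State shape and the Function-based evaluator are illustrative assumptions:

interface Mnemonic {
  name: string;
  expression: string;
}

type State = { index: number; array: unknown[] };

// The name travels with the predicate; the expression stays an
// implementation detail inside the generated function body.
function compileMnemonic(m: Mnemonic): (state: State) => boolean {
  return new Function('state', `return (${m.expression});`) as
    (state: State) => boolean;
}

const hasMoreElements = compileMnemonic({
  name: 'has_more_elements',
  expression: 'state.index < state.array.length',
});

hasMoreElements({ index: 2, array: [1, 2, 3] }); // true
hasMoreElements({ index: 3, array: [1, 2, 3] }); // false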

2. Mutations: Named State Changes

A mutation was a state transformation with a semantic name:

const MutationSchema = z.object({
  name: z.string().describe('Semantic name for this transformation'),
  updates: z.record(z.string()).describe('Field assignments'),
  purpose: z.string().describe('What this mutation accomplishes'),
  requires: z.array(z.string()).describe('What conditions must be true')
});

// Example mutation
{
  name: "advance_to_next",
  updates: {
    "index": "state.index + 1",
    "current": "state.array[state.index + 1]"
  },
  purpose: "Move iteration forward",
  requires: ["has_more_elements"]
}

Again, separation: the name advance_to_next tells you what happens semantically. The expressions tell the compiler what to emit.
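A matching sketch for mutations (again our illustration, not BEIST source): gate on the requires mnemonics, evaluate every right-hand side against the old state, then apply the updates as one atomic step:

interface Mutation {
  name: string;
  updates: Record<string, string>;
  requires: string[];
}

type State = Record<string, unknown>;

function applyMutation(
  m: Mutation,
  state: State,
  mnemonics: Map<string, (s: State) => boolean>,
): State {
  // A mutation fires only when every named precondition holds.
  for (const req of m.requires) {
    const holds = mnemonics.get(req);
    if (!holds || !holds(state)) return state; // precondition failed: no-op
  }
  // Every expression sees the old state, so the update reads as one step.
  const next: State = { ...state };
  for (const [field, expr] of Object.entries(m.updates)) {
    next[field] = new Function('state', `return (${expr});`)(state);
  }
  return next;
}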

3. Traveling Mutations: Subscription by Naming

This one was wild. The “Traveling Mutations Architecture” proposed inverting the traditional model of shared mutable state:

Traditional:  [Worker A] → writes → [Shared State] ← writes ← [Worker B]

                                   Contention Point!

Traveling:    [Worker A] → emits mutation → travels to → [Subscriber's Local State]

                                                       No contention!

Instead of workers contending over shared state, mutations would travel to subscribers. And subscription was determined by naming convention:

state = {
    work__task: null,           // Subscribes to "work:task" channel
    pixels__update: null,       // Subscribes to "pixels:update" channel
    local_counter: 0            // Pure local state (no double underscore)
}

The double underscore meant “I want to receive mutations on this channel.” The system would route mutations automatically based on field names. Zero configuration. Zero ceremony.

And crucially: if no one subscribed to a channel, no code was generated for it. Zero-cost observability through dead code elimination.
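A sketch of that routing in TypeScript (hypothetical - BEIST never shipped it): derive the channel from the field name, and wire a listener only for fields that opt in. Fields without the double underscore produce no wiring at all, which is the zero-cost property in miniature:

type Listener = (payload: unknown) => void;

// work__task -> "work:task"; fields without "__" are pure local state.
function channelOf(field: string): string | null {
  return field.includes('__') ? field.replace('__', ':') : null;
}

function wireSubscriptions(
  state: Record<string, unknown>,
  subscribe: (channel: string, listener: Listener) => void,
): void {
  for (const field of Object.keys(state)) {
    const channel = channelOf(field);
    if (channel === null) continue; // no subscription, nothing generated
    subscribe(channel, (payload) => {
      state[field] = payload; // the mutation "travels" into local state
    });
  }
}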


The Evolution

Here’s how those three concepts transformed into Koru:

| BEIST Concept | What It Was | Koru Feature | What It Became |
| --- | --- | --- | --- |
| Mnemonics | Named boolean conditions | Event branches | \| valid \| invalid \| timeout |
| Mutations | Named state changes | Taps | ~tap(source -> dest) - a user-space library! |
| Subscription by naming | field__channel convention | Wildcard patterns | ~tap(* -> event), ~tap(event -> *) |
| Zero-cost if unused | Dead code elimination | Compile-time AST transform | Taps that literally disappear |

Mnemonics → Event Continuations

The mnemonic pattern - named conditions that control flow - became the branch syntax:

BEIST mnemonic:
{
  name: "request_succeeded",
  expression: "response.status == 200",
  enables: "process_response"
}

Koru event continuation:
~http.get(url: endpoint)
| ok response |> process(response)
| error e |> handle_error(e)
| timeout |> retry(endpoint)

The branches ok, error, timeout are semantic names. They carry intent. The payload bindings carry data. The continuation arrows carry control flow.

But there’s something deeper here: the bounded context.

In BEIST, we talked about “semantic algorithm evolution” - the idea that understanding should evolve through stages, each stage bounded by what had been learned so far. Each mnemonic created a context: “within this condition being true, these things are valid.”

Event continuations capture this exactly. When you write:

~fetch(url: endpoint)
| ok data |>
    // Inside this continuation, 'data' exists and is valid
    // The fetch succeeded - that's our bounded context
    parse(data)
    | valid record |> store(record)
    | invalid e |> log_error(e)
| error msg |>
    // Different bounded context: the fetch failed
    // 'data' doesn't exist here - and can't
    retry(endpoint)

Each branch creates a bounded semantic context. Within | ok data |>, you know the fetch succeeded. Within | error msg |>, you know it failed. The compiler enforces these boundaries - you literally cannot access data in the error branch because it doesn’t exist in that context.

This is “semantic algorithm evolution” made structural. The understanding doesn’t evolve through LLM prompts; it evolves through the type system. Each branch is a stage of understanding, bounded by what is known to be true at that point.
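For the flavor in a mainstream type system, a TypeScript discriminated union is a rough analogue (our sketch, not Koru semantics): each case narrows the type, so the compiler enforces the bounded context exactly as described above. parse and retry are hypothetical stubs:

type FetchResult =
  | { tag: 'ok'; data: string }
  | { tag: 'error'; msg: string };

declare function parse(data: string): void;
declare function retry(msg: string): void;

function handle(result: FetchResult): void {
  switch (result.tag) {
    case 'ok':
      parse(result.data); // `data` exists only in this bounded context
      return;
    case 'error':
      retry(result.msg); // accessing `result.data` here is a type error
      return;
  }
}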

But there’s an even deeper connection. Look at a Koru event signature:

~event read { conn: *Connection[connected] }
| data { bytes: []const u8, conn: *Connection[connected] }
| closed { conn: *Connection[closed] }

Just from this signature, you know:

  • It takes a connection that must be in the connected state
  • It either returns data (connection stays connected) or signals closed (connection transitions to closed)
  • The bytes come from somewhere, the connection is threaded through

You know 97% of how to implement it just from reading the signature.
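A rough TypeScript analogue of that signature (ours - Koru's typestate machinery is richer) shows how much the types alone pin down:

type Connection<S extends 'connected' | 'closed'> = { fd: number; state: S };

type ReadResult =
  | { tag: 'data'; bytes: Uint8Array; conn: Connection<'connected'> }
  | { tag: 'closed'; conn: Connection<'closed'> };

// Connected in; either still-connected data out, or a closed connection
// out. Passing a Connection<'closed'> in is a type error.
declare function read(conn: Connection<'connected'>): ReadResult;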

And a subflow makes the coupling explicit:

~user.fetch = found { name: "Test User", email: "test@example.com" }

This says everything: “user.fetch always returns found with these fields.” The ~event.name = ... syntax binds the implementation directly to the event shape.

This is what “Semantic Algorithm Evolution” was actually reaching for. Not LLM-guided code generation through prompt stages, but signatures that capture semantics so completely that implementation becomes obvious. The event declaration IS the semantic understanding. The ~proc or ~subflow is bounded by that understanding.

BEIST tried to evolve understanding through LLM conversations. Koru captures understanding in type signatures and enforces it structurally.

Mutations → Taps

The mutation pattern - named transformations with semantic purposes - became the tap system:

BEIST mutation:
{
  name: "log_request",
  updates: { /* side effect */ },
  purpose: "Record HTTP request for debugging"
}

Koru tap:
~import "$std/taps"

~tap(http.get -> *)
| ok r |> log.info("Request succeeded:", r.url)
| error e |> log.error("Request failed:", e.msg)

Taps observe event transitions without interfering with the main flow. They’re read-only. They’re non-exhaustive (you can observe just errors, ignoring successes). And they compile to inlined code at transition points - zero runtime overhead.

And here’s the kicker: taps are a user-space library. You ~import "$std/taps" and get the tap keyword. It’s implemented using the same [keyword|comptime|transform] system available to any Koru library. The language is so metacircular that even its observation system is not built-in.

Subscription by Naming → Wildcard Patterns

The field__channel naming convention became wildcard pattern matching:

BEIST subscription:
state = {
    http__response: null,    // Subscribes to "http:response"
    db__query: null          // Subscribes to "db:query"
}

Koru wildcard tap:
~import "$std/taps"

~tap(* -> http.response)   // Observe all calls TO http.response
~tap(http.* -> *)          // Observe all branches FROM any http.* event
~tap(* -> *)               // Observe ALL transitions (universal profiling)
| Profile p |> profiler(p)

Same concept, cleaner syntax. And the same zero-cost property: wildcards that match nothing generate no code.
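The matching itself is simple. A sketch of plausible semantics (ours, not the compiler's actual code): * matches anything, and a trailing .* matches any name under that prefix:

interface Transition { source: string; dest: string }

function matchesSide(pattern: string, name: string): boolean {
  if (pattern === '*') return true;
  if (pattern.endsWith('.*')) return name.startsWith(pattern.slice(0, -1));
  return pattern === name;
}

function matchesTransition(pattern: Transition, t: Transition): boolean {
  return matchesSide(pattern.source, t.source) &&
    matchesSide(pattern.dest, t.dest);
}

matchesTransition(
  { source: 'http.*', dest: '*' },
  { source: 'http.get', dest: 'log.error' },
); // true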


The Self-Hosting Triumph

The most satisfying evolution: taps are implemented in Koru itself.

In BEIST, the “traveling mutations” system was a theoretical architecture document. In Koru, the tap system is a compile-time AST transform written in Koru:

// From koru_std/taps.kz
// LESSON IN METAPROGRAMMING: This file implements event observation as a
// compile-time AST transformation. Taps are not a language primitive - they're
// built using the same transform system available to any Koru library.

~[keyword|comptime|transform]pub event tap {
    expr: Expression,
    invocation: *const Invocation,
    program: *const Program,
    allocator: std.mem.Allocator
}
| transformed { program: *const Program }

The tap system walks the AST, finds matching transitions, and injects observation calls. Then it removes itself from the output. The feature bootstraps through the language’s own metaprogramming facilities.
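A toy version of the mechanism in TypeScript (a sketch of the idea, nothing like the real taps.kz): walk the transition sites, inline an observer call wherever a tap pattern matches, and emit nothing otherwise:

interface Transition { source: string; dest: string }
interface Tap { pattern: Transition; observer: string }

const matches = (pat: string, name: string): boolean =>
  pat === '*' || pat === name ||
  (pat.endsWith('.*') && name.startsWith(pat.slice(0, -1)));

// Emit observer calls for every matching tap, then the transition
// itself. Taps that match nothing contribute no output at all: the
// zero-cost property falls out of the loop structure.
function injectTaps(transitions: Transition[], taps: Tap[]): string[] {
  const out: string[] = [];
  for (const t of transitions) {
    for (const tap of taps) {
      if (matches(tap.pattern.source, t.source) &&
          matches(tap.pattern.dest, t.dest)) {
        out.push(`${tap.observer}("${t.source} -> ${t.dest}");`);
      }
    }
    out.push(`transition("${t.source}", "${t.dest}");`);
  }
  return out;
}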

The BEIST architecture document described what should happen. Koru makes it happen - and the mechanism is visible, debuggable, and written in the language itself.


The Artifacts

The original BEIST files still exist in our archives. If you want to see:

  • 1,055 lines of HTML report generation for sphere volumes: 6digit-beist copy/beist/beist-evolution-report-generator.ts
  • The traveling mutations architecture: 6digit-beist copy/beist/docs/traveling-mutations-architecture.md
  • The “fragment evolution” concept: 6digit-beist copy/CRITICAL_INSIGHTS.md
  • The semantic dataflow vision: 6digit-beist copy/beist/docs/semantic-dataflow-architecture.md

We’re not hiding our embarrassing past. Those fever dreams led somewhere real.


The Lesson

Good ideas can emerge from overwrought experiments.

The BEIST “Semantic Algorithm Evolution” project produced:

  • Gradient HTML reports for freshman math problems
  • Hardcoded API keys in source control
  • Architecture documents for systems that didn’t exist
  • Grand visions documented at 2 AM

But it also produced:

  • The mnemonic/mutation separation that became event branches and taps
  • The subscription-by-naming pattern that became wildcard matching
  • The zero-cost-if-unused principle that became compile-time AST transforms
  • The bounded semantic context idea that became exhaustive branch handling

You don’t get to the clean ideas without going through the messy ones. The fever dreams before dawn are part of the story.


The Timeline (Extended)

| Date | What Happened |
| --- | --- |
| Late 2024 | BEIST “Semantic Algorithm Evolution” experiments; mnemonics, mutations, and traveling subscriptions conceived; many gradient HTML reports generated; at least one API key exposed |
| Aug 27, 2025, 1:18 AM | VISION.md: “Zig with ring buffers” |
| Aug 27, 2025, 2:42 AM | S-expression chains with branching |
| Aug 31, 2025 | beist-os: Lock-free ring buffer (60M events/sec) |
| Sep 2, 2025 | Events carrying continuations (then field) |
| Sep 14, 2025 | Full event continuations spec |
| Sep 16, 2025 | Koru initial commit |
| Sep 17, 2025 | Hello World compiles |
| Jan 2026 | This blog post (digital archaeology) |

The concepts from late 2024 found their form in late 2025. The fever dreams became a language.


What’s Next

If you haven’t read it yet, The Birth of Event Continuations covers the Night of August 27, 2025 - when the syntax crystallized.

Phantom Obligations shows what becomes possible when the compiler sees all branches exhaustively.

And the paper formalizes it all.

But this post is about the prehistory. The messy experiments. The overwrought presentations. The ideas that survived the journey from fever dream to language feature.

Every clean abstraction has an embarrassing origin story. This is ours.


This post was created through digital archaeology - searching old repositories for the conceptual ancestors of Koru’s core features. The BEIST files are real. The hardcoded API key was real (and has since been rotated). The journey from sphere volume calculations to a programming language paper was real.

Sometimes you have to build the wrong thing enthusiastically before you can build the right thing simply.