Two faces of the same code
The wire format is what your LLM sees and what's stored on disk — flat, slot-based, every keyword a single token. The display format renders the same code as a Laudan archive entry, for humans. Round-trip is lossless.
Wire
fn safe_div
vis appearing
eff pure
in a: int
in b: int
out int?
ex safe_div(10, 2) == Some(5)
ex safe_div(7, 0) == None
ens result.is_some() iff b != 0
do
if b == 0 { return None }
return Some(a / b)
end
Display
══════════════════════════════
LAUDAS ▸ vol I ▸ safe_div
appearing · pure
─────────────────────────────
in   a : int
     b : int
out  int?
ex   safe_div(10, 2) → Some(5)
     safe_div( 7, 0) → None
ens  result.is_some() iff b ≠ 0
─────────────────────────────
do
if b == 0 { return None }
return Some(a / b)
══════════════════════════════
The verifier finds bugs your tests don't
The ex slots check a handful of concrete inputs; the Z3-backed verifier checks the postcondition for every input. Here's the verifier catching a bug in a deliberately broken safe_div:
do
if b == 0 { return Some(0) } // bug — should be None
return Some(a / b)
end
ver ✗ ens result.is_some() iff b != 0
counterexample: a = -1, b = 0 → result = some(0)
a = -1 is an input you never tested. That's the difference between testing and verification.
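Laudas does this check symbolically with Z3, over all inputs at once. As a rough Python approximation of the same idea (the function and names below are illustrative re-creations, not Laudas output), a brute-force sweep of the postcondition finds counterexamples that hand-picked examples miss:

```python
from typing import Optional

def safe_div(a: int, b: int) -> Optional[int]:
    """Python port of the deliberately broken body above."""
    if b == 0:
        return 0          # bug: Some(0) instead of None
    return a // b

def ens(a: int, b: int) -> bool:
    # ens result.is_some() iff b != 0
    return (safe_div(a, b) is not None) == (b != 0)

# A couple of hand-picked examples say nothing about the rest of the
# input space. Sweeping a grid (a crude stand-in for Z3's symbolic
# search over *all* inputs) turns up counterexamples immediately:
counterexamples = [(a, b) for a in range(-5, 6) for b in range(-5, 6)
                   if not ens(a, b)]
print(counterexamples[0])   # (-5, 0)
```

The grid sweep only approximates verification: Z3 reasons about unbounded integers, so it finds violations no finite grid could guarantee to cover.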
Every diagnostic is also emitted as structured JSON with ranked
suggested fixes — designed to be consumed by an LLM, not a human.
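The post doesn't show the JSON schema, but as a purely illustrative shape (every field name here is an assumption), a diagnostic for the failure above might carry something like:

```json
{
  "slot": "ens",
  "status": "fail",
  "claim": "result.is_some() iff b != 0",
  "counterexample": { "a": -1, "b": 0, "result": "some(0)" },
  "suggested_fixes": [
    { "rank": 1, "patch": "return None on b == 0",
      "why": "the ens clause requires None exactly when b == 0" },
    { "rank": 2, "patch": "weaken the ens clause",
      "why": "only if Some(0) on zero division is intended" }
  ]
}
```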
What makes Laudas different
- Verifier-first feedback loop. Diagnostics are structured for an LLM, with ranked fixes and plain-English explanation.
- Specs are tests. `ex` slots run as tests. `req`/`ens` are checked by Z3. `prose` is LLM-checked.
- Effects in signatures. A function that calls `eff io` code must declare `eff io` itself. No "pure" functions secretly hitting a database.
- One way to do it. No macros, no operator overloading, no implicit conversions. Smaller hallucination surface.
- Tokenizer-aware grammar. Every keyword is a single token in cl100k_base / o200k_base. Slot order fixed. Grammar-mask friendly.
- Day-one Python interop. Any pip-installable package via `extern python "module.func"`.
- Spec-first inversion. Empty `do` block? `laudas request-body file.laud` calls Claude, fills the body, voronin verifies.
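To make the interop bullet concrete, here is a sketch of what a Python call might look like. The post only shows the `extern python "module.func"` form, so the slot layout is copied from safe_div above, and the effect annotation and the call syntax after the string are assumptions:

```
fn url_quote
vis appearing
eff io                  // assumption: whether FFI calls need an effect isn't shown
in s: str
out str
ex url_quote("a b") == "a%20b"
do
  return extern python "urllib.parse.quote"(s)
end
```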
Laudas programs run
A fn main is the entry point. laudas run calls it with
the CLI arguments and uses its int return value as the exit code.
Combined with the io and text modules and
Python FFI, that's enough to build real shell tools.
The first one in the repo: examples/csv2json.laud, twenty lines of Laudas:
type Crew { name: str, role: str, ship: str }
fn parse_row
vis appearing
eff pure
in row: str
out Crew
ex parse_row("Mira,engineer,Voronin") == Crew { name: "Mira", role: "engineer", ship: "Voronin" }
do
let parts = text.split(row, ",")
return Crew { name: parts.at(0), role: parts.at(1), ship: parts.at(2) }
end
fn main
vis appearing
eff io
out int
do
let raw = io.read_stdin()
let rows = text.split(raw, "\n")
let crew = rows.skip(1).filter(r -> r.length() > 0).map(r -> parse_row(r))
io.println(text.to_json(crew))
return 0
end
Pipe a CSV in, get JSON out:
$ Get-Content examples/crew.csv | laudas run examples/csv2json.laud
[{"name":"Mira","role":"engineer","ship":"Voronin"},
{"name":"Dara","role":"captain","ship":"Osei"},
{"name":"Cass","role":"navigator","ship":"Telles"}]
The parse_row function carries its own ex slots, so
laudas examples/csv2json.laud (no piping) still verifies the
parsing logic end-to-end. The two modes — verify the spec and
run the program — coexist on the same file.
Install
Standalone (Windows, no Python required)
Download laudas.exe (~18 MB), drop it on PATH, run it.
laudas demo_fixed.laud               # parse + run examples + verify
laudas --show file.laud              # render as Laudan archive entries
laudas run program.laud [ARGS...]    # execute the program's `main`
laudas request-body spec.laud        # fill empty `do` blocks via Claude
From source (any OS with Python 3.10+)
git clone https://github.com/IshiakiZ/laudas
cd laudas
pip install -e .
laudas demo_fixed.laud
The crossing
Laudas takes thematic vocabulary from The Disappearing and The Answering, a sci-fi trilogy in which AI-built rockets carry humanity to an exoplanet called Laudas. The hook — AIs built the rockets that saved humans — mirrors the language's positioning as AI-built infrastructure for the post-AI computing era.
The thematic frame is paint, not structure: removable without
harming the language. But it's the through-line that makes the project
feel coherent, from the visibility keywords (appearing /
disappearing) to the trait-impl keyword
(answering) to the toolchain (mira,
voronin, osei) to the archival display format.