Einlang

Explicit indices, in-language autodiff, and tensor code you can still read.

Tensor math that stays legible.

Most tensor stacks hide the important structure behind helper calls, layer wrappers, and gradient machinery. Einlang puts indices, reductions, recurrences, and derivatives directly in the source so the program still reads like the idea.


Why it feels different

  • The computational shape stays in view.
  • Gradients live in the same notation.
  • Examples scale without changing style.

01 Why

Tensor code should still be readable.

Too often, tensor programs vanish into helper calls and framework ceremony. Einlang keeps the important relationships on the page, so the source still reads like the idea behind it.

You can see the axes

Indices and reductions are written directly instead of being implied by library calls.

You can see the derivative

Gradient expressions appear beside the original definitions, not in a separate system.

You can stay in one notation

The same style carries from simple algebra to recurrences, optimization, and larger models.

The payoff

Programs become easier to inspect, easier to explain, and easier to trust.

02 Syntax

Syntax features, side by side.

This section is about syntax, not slogans. Each example shows one language feature in a more typical style and in Einlang, so you can see what the syntax keeps on the page.

Einstein notation

Indices and reductions are written directly instead of being hidden in a string or helper call.

NumPy / JAX

y = np.einsum("bi,ci->bc", x, W) + bias

Einlang

let y[b, c] = sum[i](x[b, i] * W[c, i]) + bias[c];
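The two forms compute the same contraction. A quick NumPy check of that equivalence, with arbitrary illustrative shapes (b=4, c=5, i=3):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))      # [b, i]
W = rng.normal(size=(5, 3))      # [c, i]
bias = rng.normal(size=5)        # [c]

# einsum form, as in the snippet above
y_einsum = np.einsum("bi,ci->bc", x, W) + bias

# explicit-loop form, mirroring the Einlang indices one-to-one
y_loops = np.empty((4, 5))
for b in range(4):
    for c in range(5):
        y_loops[b, c] = sum(x[b, i] * W[c, i] for i in range(3)) + bias[c]

assert np.allclose(y_einsum, y_loops)
```

The loop body is a direct transliteration of the Einlang line: the `sum[i](...)` reduction becomes the inner generator, and the free indices `b, c` become the loops.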

Where clauses

Index relations stay attached to the expression instead of being managed as loop bookkeeping.

Python loops

for oh in range(H_out):
    for ow in range(W_out):
        out[oh, ow] = sum(
            inp[oh + kh, ow + kw] * kernel[kh, kw]
            for kh in range(KH)
            for kw in range(KW)
        )

Einlang

let out[oh, ow] = sum[kh, kw](
  input[ih, iw] * kernel[kh, kw]
) where ih = oh + kh, iw = ow + kw;
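Reading the where clause in NumPy terms: `ih = oh + kh` means each output element is a sliding patch of the input multiplied elementwise by the kernel, then summed. A small sketch that checks the loop form against that patch reading (array sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
inp = rng.normal(size=(6, 6))
kernel = rng.normal(size=(3, 3))
H_out = inp.shape[0] - kernel.shape[0] + 1   # valid output height: 4
W_out = inp.shape[1] - kernel.shape[1] + 1   # valid output width: 4

# loop form from the snippet above
out = np.empty((H_out, W_out))
for oh in range(H_out):
    for ow in range(W_out):
        out[oh, ow] = sum(
            inp[oh + kh, ow + kw] * kernel[kh, kw]
            for kh in range(3) for kw in range(3)
        )

# patch reading: each output is a 3x3 window dotted with the kernel
for oh in range(H_out):
    for ow in range(W_out):
        assert np.isclose(out[oh, ow], (inp[oh:oh+3, ow:ow+3] * kernel).sum())
```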

Autodiff syntax

The derivative is part of the program text, not a separate API wrapped around it.

JAX / framework style

def loss_fn(W):
    logits = x @ W.T + bias
    probs = jax.nn.softmax(logits)
    return -(target * jnp.log(probs)).sum()

dloss_dW = jax.grad(loss_fn)(W)

Einlang

let logits[b, c] = sum[i](x[b, i] * W[c, i]) + bias[c];
let probs[b, c] = exp(logits[b, c]) / sum[k](exp(logits[b, k]));
let loss = -sum[b, c](target[b, c] * log(probs[b, c]));
let dloss_dW = @loss / @W;
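The @loss / @W line stands in for the gradient both versions compute. For targets whose rows sum to one, the closed form is d loss / d logits = probs - target, chained through logits = x @ W.T. A NumPy sketch that checks this against central finite differences (shapes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
B, C, I = 3, 4, 5
x = rng.normal(size=(B, I))
W = rng.normal(size=(C, I))
bias = rng.normal(size=C)
target = np.eye(C)[rng.integers(0, C, size=B)]   # one-hot rows

def loss(W):
    logits = x @ W.T + bias
    z = logits - logits.max(axis=1, keepdims=True)   # shift for stability
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return -(target * np.log(probs)).sum()

# analytic gradient: dloss/dlogits = probs - target, chained through logits = x @ W.T
logits = x @ W.T + bias
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
grad = (probs - target).T @ x    # shape [C, I], same as W

# spot-check a few entries with central finite differences
eps = 1e-6
for c, i in [(0, 0), (1, 3), (3, 4)]:
    Wp = W.copy(); Wp[c, i] += eps
    Wm = W.copy(); Wm[c, i] -= eps
    num = (loss(Wp) - loss(Wm)) / (2 * eps)
    assert np.isclose(grad[c, i], num, atol=1e-4)
```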

Fibonacci recurrence

The recurrence follows the same form as the repo examples: base cases first, then the recursive rule.

Python loop

fib = np.zeros(11, dtype=int)
fib[0] = 1
fib[1] = 1
for n in range(2, 11):
    fib[n] = fib[n - 1] + fib[n - 2]

Einlang

let fib[0] = 1;
let fib[1] = 1;
let fib[n in 2..11] = fib[n-1] + fib[n-2];
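Running the loop version pins down the values the Einlang rules should produce (with both base cases equal to 1):

```python
# loop form from the snippet above: base cases set before the rule runs
fib = [0] * 11
fib[0] = 1
fib[1] = 1
for n in range(2, 11):
    fib[n] = fib[n - 1] + fib[n - 2]

assert fib == [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```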

Comprehensions and ranges

Generated arrays stay as expressions, and ranges are part of the syntax.

Python

squares = [i * i for i in range(1, 10)]

Einlang

let squares = [i * i | i in 1..10];
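One detail worth noting: judging by the Fibonacci example, where n in 2..11 filled indices 2 through 10, Einlang ranges appear to be half-open like Python's range, so both versions here should yield nine squares:

```python
# half-open range: 1..10 reads as range(1, 10), i.e. values 1 through 9
squares = [i * i for i in range(1, 10)]

assert squares == [1, 4, 9, 16, 25, 36, 49, 64, 81]
assert len(squares) == 9
```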

Block expressions

Multiple statements can still produce a value, so setup code stays inside the expression that uses it.

Python

tmp = x + y
if tmp > 0:
    result = tmp * scale
else:
    result = 0

Einlang

let result = {
  let tmp = x + y;
  if tmp > 0 { tmp * scale } else { 0 }
};
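Python has no block expression, but the closest single-expression analog binds the temporary with an assignment expression (the values here are placeholders):

```python
x, y, scale = 2.0, 3.0, 10.0

# nearest Python analog: bind tmp inside the expression with the walrus operator
result = tmp * scale if (tmp := x + y) > 0 else 0

assert result == 50.0
```

The condition is evaluated first, so `tmp` is bound before `tmp * scale` runs; unlike the Einlang block, though, `tmp` leaks into the enclosing scope.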

03 Showcase

The same language, from first read to full workload.

Einlang is most convincing when the same syntax appears in both small fragments and larger programs. These are good places to start.

04 Start

Run one file, then keep going.

A small example is enough to begin. From there, the examples and the reference docs open up the rest of the language.