Explicit indices, in-language autodiff, and tensor code you can still read.
Most tensor stacks hide the important structure behind helper calls, layer wrappers, and gradient machinery. Einlang puts indices, reductions, recurrences, and derivatives directly in the source so the program still reads like the idea.
Why it feels different
let logits[b, c] = sum[i](x[b, i] * W[c, i]) + bias[c];
let probs[b, c] = exp(logits[b, c]) / sum[k](exp(logits[b, k]));
let loss = -sum[b, c](target[b, c] * log(probs[b, c]));
let dloss_dW = @loss / @W;
Too often, tensor programs vanish into helper calls and framework ceremony. Einlang keeps the important relationships on the page, so the source still reads like the idea behind it.
You can see the axes
Indices and reductions are written directly instead of being implied by library calls.
You can see the derivative
Gradient expressions appear beside the original definitions, not in a separate system.
You can stay in one notation
The same style carries from simple algebra to recurrences, optimization, and larger models.
The payoff
Programs become easier to inspect, easier to explain, and easier to trust.
This section is about syntax, not slogans. Each example shows one language feature in a more typical style and in Einlang, so you can see what the syntax keeps on the page.
Einstein notation
Indices and reductions are written directly instead of being hidden in a string or helper call.
NumPy / JAX
y = np.einsum("bi,ci->bc", x, W) + bias
Einlang
let y[b, c] = sum[i](x[b, i] * W[c, i]) + bias[c];
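As a sanity check, the einsum form and the explicit index formula compute the same thing. A minimal NumPy sketch (shapes and the `rng` seed are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
B, C, I = 4, 3, 5                      # arbitrary batch, class, feature sizes
x = rng.standard_normal((B, I))
W = rng.standard_normal((C, I))
bias = rng.standard_normal(C)

# einsum form: y[b, c] = sum_i x[b, i] * W[c, i] + bias[c]
y_einsum = np.einsum("bi,ci->bc", x, W) + bias

# explicit-loop form, spelling out the same indices
y_loops = np.empty((B, C))
for b in range(B):
    for c in range(C):
        y_loops[b, c] = sum(x[b, i] * W[c, i] for i in range(I)) + bias[c]

assert np.allclose(y_einsum, y_loops)
```

The Einlang line above corresponds to the loop form, with the reduction index written out rather than encoded in the subscript string.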
Where clauses
Index relations stay attached to the expression instead of being managed as loop bookkeeping.
Python loops
for oh in range(H_out):
    for ow in range(W_out):
        out[oh, ow] = sum(
            inp[oh + kh, ow + kw] * kernel[kh, kw]
            for kh in range(KH)
            for kw in range(KW)
        )
Einlang
let out[oh, ow] = sum[kh, kw](
    input[ih, iw] * kernel[kh, kw]
) where ih = oh + kh, iw = ow + kw;
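The loop version can be run end to end. A self-contained NumPy sketch with arbitrary sizes, cross-checked against `numpy.lib.stride_tricks.sliding_window_view`:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(1)
H, W_in, KH, KW = 6, 6, 3, 3           # arbitrary input and kernel sizes
inp = rng.standard_normal((H, W_in))
kernel = rng.standard_normal((KH, KW))
H_out, W_out = H - KH + 1, W_in - KW + 1

# explicit loops: out[oh, ow] = sum_{kh, kw} inp[oh + kh, ow + kw] * kernel[kh, kw]
out = np.zeros((H_out, W_out))
for oh in range(H_out):
    for ow in range(W_out):
        out[oh, ow] = sum(
            inp[oh + kh, ow + kw] * kernel[kh, kw]
            for kh in range(KH)
            for kw in range(KW)
        )

# the same reduction as one einsum over sliding windows
windows = sliding_window_view(inp, (KH, KW))   # shape (H_out, W_out, KH, KW)
assert np.allclose(out, np.einsum("xykw,kw->xy", windows, kernel))
```

Both NumPy versions bury the index relation `ih = oh + kh` either in loop bodies or in stride tricks; the Einlang `where` clause states it next to the expression.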
Autodiff syntax
The derivative is part of the program text, not a separate API wrapped around it.
JAX / framework style
def loss_fn(W):
    logits = x @ W.T + bias
    probs = jax.nn.softmax(logits)
    return -(target * jnp.log(probs)).sum()

dloss_dW = jax.grad(loss_fn)(W)
Einlang
let logits[b, c] = sum[i](x[b, i] * W[c, i]) + bias[c];
let probs[b, c] = exp(logits[b, c]) / sum[k](exp(logits[b, k]));
let loss = -sum[b, c](target[b, c] * log(probs[b, c]));
let dloss_dW = @loss / @W;
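The gradient the `@loss / @W` line denotes can be verified by hand. The closed form for softmax cross-entropy, `dL/dW = (probs - target)^T @ x`, is standard; the sketch below (plain NumPy, arbitrary shapes and seed) checks it against finite differences:

```python
import numpy as np

rng = np.random.default_rng(2)
B, C, I = 4, 3, 5                      # arbitrary batch, class, feature sizes
x = rng.standard_normal((B, I))
W = rng.standard_normal((C, I))
bias = rng.standard_normal(C)
target = np.eye(C)[rng.integers(0, C, size=B)]   # one-hot targets

def loss_fn(W):
    logits = x @ W.T + bias
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return -(target * np.log(probs)).sum()

# closed-form gradient of softmax cross-entropy w.r.t. W
logits = x @ W.T + bias
logits = logits - logits.max(axis=1, keepdims=True)
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
grad = (probs - target).T @ x

# central finite differences over every entry of W
eps = 1e-6
num_grad = np.zeros_like(W)
for idx in np.ndindex(*W.shape):
    Wp = W.copy(); Wp[idx] += eps
    Wm = W.copy(); Wm[idx] -= eps
    num_grad[idx] = (loss_fn(Wp) - loss_fn(Wm)) / (2 * eps)

assert np.allclose(grad, num_grad, atol=1e-4)
```

This is the derivative Einlang writes as a single expression next to the definitions it differentiates.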
Fibonacci recurrence
Base cases come first, then the recursive rule, in the same tested form as the repo examples.
Python loop
fib = np.zeros(11, dtype=int)
fib[0] = 1
fib[1] = 1
for n in range(2, 11):
    fib[n] = fib[n - 1] + fib[n - 2]
Einlang
let fib[0] = 1;
let fib[1] = 1;
let fib[n in 2..11] = fib[n-1] + fib[n-2];
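Running the Python loop above as a complete program produces the familiar sequence:

```python
import numpy as np

# base cases first, then the recurrence, mirroring the three Einlang rules
fib = np.zeros(11, dtype=int)
fib[0] = fib[1] = 1
for n in range(2, 11):
    fib[n] = fib[n - 1] + fib[n - 2]

print(fib.tolist())  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```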
Comprehensions and ranges
Generated arrays stay as expressions, and ranges are part of the syntax.
Python
squares = [i * i for i in range(1, 10)]
Einlang
let squares = [i * i | i in 1..10];
Block expressions
Multiple statements can still produce a value, so setup code stays inside the expression that uses it.
Python
tmp = x + y
if tmp > 0:
    result = tmp * scale
else:
    result = 0
Einlang
let result = {
    let tmp = x + y;
    if tmp > 0 { tmp * scale } else { 0 }
};
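For comparison, keeping the setup inside a single Python expression requires an assignment expression (the walrus operator, Python 3.8+). The values below are placeholders:

```python
x, y, scale = 2.0, 3.0, 10.0           # placeholder inputs

# the binding of tmp lives inside the expression, like the Einlang block
result = tmp * scale if (tmp := x + y) > 0 else 0
print(result)  # 50.0
```

In the conditional expression, the condition is evaluated first, so `tmp` is bound before either branch uses it.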
Einlang is most convincing when the same syntax appears in both small fragments and larger programs. These are good places to start.
A first read
Start with the smallest examples to get the feel of the notation.
Vision-style models
Convolutional examples keep local structure legible instead of hiding it in layers of framework code.
Attention-heavy programs
DeiT and Whisper artifacts show how the notation holds up under denser workloads.
Beyond standard ML
Optimization, recurrence, and numerical examples show the same language in a wider setting.
A small example is enough to begin. From there, the examples and the reference docs open up the rest of the language.
python3 -m pip install "git+https://github.com/einlang/einlang.git"
python3 -m einlang examples/hello.ein