#inference #llm

llwm (app)

Run LLM inference with WebAssembly

1 unstable release

0.0.0 Dec 1, 2024

#109 in #inference

Download history: 103/week @ 2024-11-25, 52/week @ 2024-12-02

155 downloads per month

Apache-2.0

7KB

llama.wasm

Run LLM inference with WebAssembly. This is experimental.
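
The page does not show the crate's actual API, so the sketch below only illustrates the general shape such a module tends to take: a Rust library compiled to `wasm32-unknown-unknown` that exports C-ABI functions a host runtime (e.g. Wasmtime or a browser) can call. Every name here (`wasm_alloc`, `wasm_free`, `generate`, `run_model`) is hypothetical and not taken from llwm.

```rust
// Hypothetical sketch of a C-ABI Wasm inference surface; none of these
// names come from llwm itself. Build with:
//   cargo build --target wasm32-unknown-unknown --release

use std::alloc::{alloc, dealloc, Layout};

/// Allocate `len` bytes inside the module so the host can write a prompt.
#[no_mangle]
pub extern "C" fn wasm_alloc(len: usize) -> *mut u8 {
    unsafe { alloc(Layout::from_size_align(len, 1).unwrap()) }
}

/// Release a buffer previously returned by `wasm_alloc`.
#[no_mangle]
pub unsafe extern "C" fn wasm_free(ptr: *mut u8, len: usize) {
    dealloc(ptr, Layout::from_size_align(len, 1).unwrap());
}

/// Run "inference" on the UTF-8 prompt at `ptr..ptr+len`, overwrite the
/// buffer with the output, and return the number of bytes written.
#[no_mangle]
pub unsafe extern "C" fn generate(ptr: *mut u8, len: usize) -> usize {
    let prompt = String::from_utf8_lossy(std::slice::from_raw_parts(ptr, len)).into_owned();
    let output = run_model(&prompt);
    let n = output.len().min(len);
    std::ptr::copy_nonoverlapping(output.as_ptr(), ptr, n);
    n
}

/// Stand-in for the real model call; echoing keeps the sketch self-contained.
fn run_model(prompt: &str) -> String {
    format!("echo: {prompt}")
}
```

A host would call `wasm_alloc`, copy the prompt into linear memory, call `generate`, read the result back, and finish with `wasm_free`.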

No runtime deps