1 unstable release
| Version | Release date |
|---|---|
| 0.0.0 | Dec 1, 2024 |
#109 in #inference
155 downloads per month
7KB
llama.wasm
Run LLM inference with WebAssembly. This crate is experimental.
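The listing does not document the crate's API, so as a rough illustration of what "run LLM inference with WebAssembly" can look like from Rust, the sketch below uses hypothetical names (`Model`, `load`, `generate`); they are assumptions for illustration, not llama.wasm's actual interface.

```rust
// Hypothetical usage sketch only: llama.wasm's real API is not shown on this
// page, so the types and methods below are illustrative assumptions.

/// Stand-in for a loaded model; a wasm32 build would typically hold weight
/// bytes passed in by the JavaScript/host side (e.g. a fetched GGUF file).
pub struct Model {
    weights: Vec<u8>,
}

impl Model {
    /// Construct a model from raw bytes supplied by the WebAssembly host.
    pub fn load(weights: Vec<u8>) -> Self {
        Model { weights }
    }

    /// Generate text for a prompt. Stubbed here; a real implementation
    /// would run the transformer forward pass token by token.
    pub fn generate(&self, prompt: &str, max_tokens: usize) -> String {
        let _ = (self.weights.len(), max_tokens);
        format!("<completion for: {prompt}>")
    }
}

fn main() {
    let model = Model::load(vec![0u8; 4]); // placeholder bytes, not real weights
    println!("{}", model.generate("Hello from wasm!", 16));
}
```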