This mini-tool is built in KotlinJS using fritz2 and runs Deep Learning inference with onnxruntime and its WebGL backend (the most widely supported one). onnxruntime additionally supports WebGPU, WASM (CPU), and WebNN with a simple flag change.
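As a minimal sketch of what that wiring looks like from KotlinJS, the snippet below creates an inference session through dynamic interop. It assumes the onnxruntime-web bundle is already loaded on the page and exposes the global `ort` object, and that kotlinx-coroutines is on the classpath (fritz2 already pulls it in); `"model.onnx"` is a placeholder path, not the tool's actual model.

```kotlin
import kotlinx.coroutines.MainScope
import kotlinx.coroutines.await
import kotlinx.coroutines.launch
import kotlin.js.Promise

fun main() {
    // Global object exposed by the onnxruntime-web script tag (assumption).
    val ort = js("globalThis.ort")
    MainScope().launch {
        // Switching backend is just a flag change:
        // "webgl", "wasm", "webgpu" or "webnn".
        val options = js("""({ executionProviders: ["webgl"] })""")
        val session = (ort.InferenceSession.create("model.onnx", options)
                as Promise<dynamic>).await()
        console.log("Loaded model, inputs:", session.inputNames)
    }
}
```

Because onnxruntime-web's API is Promise-based and untyped from Kotlin's point of view, the sketch casts the dynamic result to `Promise<dynamic>` and awaits it inside a coroutine; a typed `external` facade would be the cleaner long-term option.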