Running Llama 3.2 100% locally in the browser on WebGPU w/ Transformers.js

Posted by xenovatech@reddit | LocalLLaMA | 37 comments