How Far Can a MacBook M5 Air Go? Testing Billion-Parameter AI Models Locally
Posted by Aham_bramhasmmi@reddit | LocalLLaMA | View on Reddit | 5 comments
How many billion-parameter models can it actually handle?
CalligrapherFar7833@reddit
None that are actually usable for anything but chat with zero knowledge.
Aham_bramhasmmi@reddit (OP)
Token output is slow when I use a bigger model like the 12-billion-parameter Mistral Nemo, hence asking if someone has tried anything that works well.
CalligrapherFar7833@reddit
Again, no usable models at that size for anything but chat with zero knowledge.
Asleep-Party-1870@reddit
Doesn't that depend on how much memory you have?
Aham_bramhasmmi@reddit (OP)
I have the 16GB RAM / 512GB storage variant.
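For context on why 16GB is tight for a 12B model, a rough back-of-the-envelope memory estimate can be sketched as below. The formula (parameters times bits per weight, plus a flat overhead for KV cache and runtime buffers) and the 4.5 bits/weight figure for a typical 4-bit quantization are approximations, not measurements from the thread:

```python
def model_memory_gb(params_billion: float, bits_per_weight: float,
                    overhead_gb: float = 1.0) -> float:
    """Rough RAM estimate for running a quantized model locally.

    weights  = params * bits_per_weight / 8  (in GB, since 1e9 params
               at 8 bits/weight is ~1 GB)
    overhead = assumed flat allowance for KV cache and runtime buffers
    """
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# Mistral Nemo 12B at ~4.5 bits/weight (typical 4-bit quantization):
print(round(model_memory_gb(12, 4.5), 2))  # → 7.75
```

On a 16GB machine the OS and other apps also compete for that memory, so a ~7-8GB model leaves little headroom, which is consistent with the slow output the OP reports.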