Running LLMs in RAM?

Posted by Electronic_Image1665@reddit | LocalLLaMA | View on Reddit | 8 comments

Hey guys, I've been seeing posts here and there about people running local models partly in RAM, and I hadn't heard of this until I found this subreddit. Is there a good source of information on how to do it? I'm running a 4060 Ti 16GB, and I also have an RX 6700 Nitro, but I took that one out since most of my web searches said that trying to use both at the same time would be a huge pain and I'd be better off selling it. But I do have 64 GB of RAM. Thanks!
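For context, the usual way this is done is layer offloading in llama.cpp (or frontends built on it, like Ollama and LM Studio): you load a quantized GGUF model and specify how many transformer layers to keep in VRAM; the remaining layers live in system RAM and run on the CPU. A minimal sketch of the CLI invocation, where the model path and layer count are placeholders, not specific recommendations:

```shell
# Sketch: llama.cpp's llama-cli with partial GPU offload.
# The model path below is a placeholder for any GGUF file you've downloaded.
# -ngl (--n-gpu-layers) sets how many layers are kept in VRAM;
# everything else stays in system RAM and runs on the CPU.
# -c sets the context size, -p gives a prompt.
llama-cli -m ./models/model.Q4_K_M.gguf -ngl 24 -c 4096 -p "Hello"
```

The practical tuning loop is to raise `-ngl` until VRAM is nearly full: more layers on the GPU means faster generation, while any layers left in RAM trade speed for the ability to run models bigger than 16 GB of VRAM. Tools like Ollama pick an offload split automatically, so you may not need to set this by hand at all.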