Training Llama3.2:3b on my WhatsApp chats with my wife

Posted by jayjay_1996@reddit | LocalLLaMA | View on Reddit | 115 comments

Hi all,

So my wife and I have been dating since 2018. ALL our chats are on WhatsApp.

I am an LLM noob, but I wanted to export the chat as a .txt file and then feed it into an LLM so I could ask questions like:

Etc

So far, the idea has been to chunk the messages, store them in a vector DB, and then use Llama to interact with it. But the results have been quite horrible. I've tried temperatures from 0.1 to 0.5 and k from 3 to 25, and I broke the chat into chunks of 4000 with an overlap of 100.
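
For context, the pipeline is roughly along these lines. This is just a minimal sketch, assuming Chroma as the vector DB, the ollama Python client, and character-based chunking; the file name and example question are placeholders, not my real setup:

```python
# Minimal RAG sketch: ingest the WhatsApp .txt export, chunk it,
# store the chunks in Chroma, then answer questions with llama3.2:3b via Ollama.
# Chroma and the ollama client are illustrative choices, not the only option.
import chromadb
import ollama

CHUNK_SIZE = 4000   # characters per chunk
OVERLAP = 100       # characters of overlap between consecutive chunks
K = 5               # number of chunks retrieved per question

def chunk_text(text, size=CHUNK_SIZE, overlap=OVERLAP):
    """Split the raw chat export into overlapping character chunks."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

# Ingest: read the export and store each chunk. Chroma embeds the documents
# with its default sentence-transformer embedding function.
with open("whatsapp_export.txt", encoding="utf-8") as f:
    raw = f.read()

client = chromadb.PersistentClient(path="chat_db")
collection = client.get_or_create_collection("whatsapp")
chunks = chunk_text(raw)
collection.add(documents=chunks, ids=[f"chunk-{i}" for i in range(len(chunks))])

# Query: retrieve the k most similar chunks and hand them to the model.
question = "What did we plan for our first anniversary?"  # placeholder question
results = collection.query(query_texts=[question], n_results=K)
context = "\n---\n".join(results["documents"][0])

response = ollama.chat(
    model="llama3.2:3b",
    messages=[
        {"role": "system", "content": "Answer using only the chat excerpts provided."},
        {"role": "user", "content": f"Chat excerpts:\n{context}\n\nQuestion: {question}"},
    ],
    options={"temperature": 0.1},
)
print(response["message"]["content"])
```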

Any better ideas out there? Would love to hear! And if it works I could share the ingestion script! 🙇