Roundtable chat with Talkie-1930 and Gemma 4 31B
Posted by facethef@reddit | LocalLLaMA | View on Reddit | 18 comments
Talkie-1930-13b-it and Gemma 4 31b in the same chat.
Talkie is a 13B vintage language model from 1930. https://talkie-lm.com/introducing-talkie
Hosted version if you can't run them both locally https://opper.ai/ai-roundtable/chat
OwnerByDane@reddit
This is awesome! I was chatting with a user on MachineLearning about how to use historical data to look at how language, tone, truth-seeking behavior, etc. showed up in conversations from the 1980s and 1990s, and how they compare to today. It would be great to build a model to talk to our 1990s selves, or just for sociometric research in language evolution.
crantob@reddit
I can point you to several examples of language decay and subversion leading to loss of resolution (differentiation, discrimination) and false framing.
But just speaking from empirical observation, it's rare that anyone is really interested enough in the truth of the matter to consider evidence contradicting his biases.
And with that subtle meta-reference, I wave with a smile.
OwnerByDane@reddit
Agreed all around. Cognitive dissonance is a difficult thing for anyone to get past, so people become intransigent. Truth-seeking behavior is rare. The question would be: how has that changed in the last 40 years (assuming it has)? We have the data and tools to understand that now, so it might be a useful exercise.
crantob@reddit
If you can find the time, there are wonderful things written centuries ago that inform us about ourselves, with enough distance to be less acutely painful.
facethef@reddit (OP)
Yeah, I think this is super cool too. Contact the Talkie team, maybe they're up for it.
OwnerByDane@reddit
I will. Thanks for the post and the idea
yarikfanarik@reddit
how do you run roundtable?
facethef@reddit (OP)
Just ask a question to multiple models at the same time here or do you mean something else? https://opper.ai/ai-roundtable/chat
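For anyone wanting to replicate this locally rather than via the hosted page, the core of a "roundtable" is just a shared transcript that two models take turns replying to. Here is a minimal sketch of that turn-taking loop; the model interface is kept abstract (each "model" is a callable over the transcript), so in practice you would wrap it around whatever local inference you run. All names and the stub replies below are illustrative, not from the actual Opper roundtable, whose code is not public as far as this thread shows.

```python
# Minimal roundtable sketch: two (or more) models alternate replies on a
# shared transcript. Each model is modeled as a callable that takes the
# transcript so far and returns its next message; real backends would do
# local inference inside that callable.

def roundtable(models, opening, turns):
    """Alternate `turns` replies between named models on a shared transcript."""
    transcript = [("user", opening)]
    names = list(models)
    for i in range(turns):
        name = names[i % len(names)]          # round-robin speaker order
        reply = models[name](transcript)      # model sees full history
        transcript.append((name, reply))
    return transcript

# Stub "models" standing in for real local backends (hypothetical):
talkie = lambda t: f"Talkie has seen {len(t)} message(s) so far."
gemma = lambda t: f"Gemma has seen {len(t)} message(s) so far."

log = roundtable({"talkie": talkie, "gemma": gemma},
                 "Hello, both of you!", turns=4)
for speaker, text in log:
    print(f"{speaker}: {text}")
```

Swapping the stubs for real local models is then just a matter of making each callable format the transcript into a prompt and call your local inference server or library of choice.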
yarikfanarik@reddit
you said "if you can't run them both locally" but I can't find a GitHub repo or anything about running this roundtable locally
cershrna@reddit
I think they mean how can it be run locally?
blackbirdind398@reddit
Same question here, also curious.
Iory1998@reddit
How can I use it to chat with two models at the same time locally?
Eyelbee@reddit
Check this out
danigoncalves@reddit
Poor talkie...
Kodix@reddit
That is hilarious. It reads like an actual joke.
facethef@reddit (OP)
Very funny, also objectively the best choice
TheRealMasonMac@reddit
facethef@reddit (OP)
Big Chungus is definitely Chinese if you ask Talkie