🇪🇬 The First Open-Source AI Model in Egypt!
Posted by assemsabryy@reddit | LocalLLaMA | View on Reddit | 61 comments

Today, with great pride, I am excited to officially announce the first open-source AI model series emerging from Egypt.
The Horus-1.0 series consists of text generation models, fully trained from scratch on trillions of clean training tokens.
Today, I am also proud to announce the release of the first model in the Horus series: Horus-1.0-4B, featuring an 8K context length.
The model is available in 7 different versions:
- The full version with original weights
- 6 compressed variants designed to fit different hardware and deployment needs
This provides exceptional flexibility for developers and researchers based on their available computational resources.
Horus is available as an open-source model under TokenAI, and you can explore all available versions along with detailed usage instructions on the official website:
You can also easily download and use the model through the neuralnode Python framework, which offers a seamless integration experience with the Horus models.
In addition, Replica Text-to-Speech is fully integrated within neuralnode.
You have access to 20 voices across 10 different languages, including Arabic, allowing easy voice integration with your applications and AI workflows.
Now let’s talk about the scale and significance of this achievement.
Since there are almost no officially announced AI models in Egypt that are fully built and trained from scratch as open-source models, Horus represents a major milestone:
- Horus is the first open-source AI model built from scratch in Egypt
- Horus is one of the strongest language models in the Arab world
- Horus is one of the strongest models globally within its size class
And all of this is backed by numbers and benchmark results.
The Horus model family is:
- Open-source
- Fully trained from scratch
- Multilingual
- Highly capable in Chain-of-Thought and reasoning
- Supports Thinking capabilities
The Horus-1.0-4B model outperformed well-known larger models such as Qwen 3.5-4B and Gemma 2 9B on several benchmarks, including MMLU.
It also surpassed the same models in the more challenging MMLU Pro, and even outperformed Llama 3.1 8B, despite that model being more than twice the size of Horus.
We are looking at a project capable of placing Egypt on the global AI map.
Horus is not the first AI model from Egypt, but it is the first officially announced, fully open-source, fully scratch-trained model from Egypt.
My goal is not only to build a model, but to build a real Egyptian open-source AI infrastructure.
And this is only the beginning of what I believe will become the best AI model in the Arab world.
#HorusAI #OpenSourceAI #LLM #ArtificialIntelligence #Egypt #MachineLearning
Ok_Use_These@reddit
I recognize you. You were the guy that copied (not forked, copied) an open source project (I can't remember which one for the moment sadly) changed the name, the visuals, published it as yours and refused to give any credits to the author despite the requests.
Beautiful-Arm5170@reddit
Sauce?
jester_kitten@reddit
https://old.reddit.com/r/LocalLLaMA/comments/1sfl8tw/the_first_opensource_ai_model_in_egypt/of0kbll/
StoneCypher@reddit
asking people for sources is loser shit
Nindaleth@reddit
OK, here are the sources u/Beautiful-Arm5170 u/Hathos_
Note, I don't have an opinion on the Horus model, it may as well be legit.
Ok_Use_These@reddit
u/-p-e-w- take a look
-p-e-w-@reddit
Yeah, I did see this model announcement and of course recognized the author immediately, given that they are responsible for the most unpleasant episode of my nearly 20-year open source career.
That being said, I haven’t examined this model myself, and have no opinion on what it may or may not be.
Good_Act7586@reddit
Disclosure: This is a disposable account to avoid abuse.
The weights in https://huggingface.co/tokenaii/horus/tree/main/Horus-1.0-4B are bit-identical to https://huggingface.co/nvidia/Llama-3.1-Minitron-4B-Width-Base. tokenizer.json is identical apart from the merges section, which appears to have been reformatted but should be functionally identical. The 100% benchmark scores in the model card for gpqa_diamond, ifeval, bfcl, omnidocbench, terminal_bench, and browsercomp would mean a 4-billion-parameter model beating frontier models of any size, as none have achieved this feat. config.json has been altered and may only work with the NeuralNode framework, which looks to be by the same author.
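Anyone can reproduce the bit-identical claim themselves after downloading both repos locally. A minimal sketch, assuming the shards have been fetched (e.g. with `huggingface-cli download`) into hypothetical local directories; the file paths in the commented usage are placeholders, not the actual shard names:

```python
import hashlib

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so multi-GB
    safetensors shards never need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while blk := f.read(chunk):
            h.update(blk)
    return h.hexdigest()

# Hypothetical paths after downloading both repositories locally.
# Equal digests mean the weight files are bit-for-bit identical.
# same = sha256_of("horus/model.safetensors") == \
#        sha256_of("minitron/model.safetensors")
```

Comparing digests rather than raw files avoids holding two multi-gigabyte checkpoints in memory at once.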
Hathos_@reddit
Thank you!
Hathos_@reddit
How about you provide a source and don't use a throwaway account. Otherwise, this just looks like an attempt to smear the author.
ELPascalito@reddit
Assem is actually a pretty well-known name in the Egyptian dev scene, for the wrong reasons: he has a history of plagiarizing and posting fake projects, even in his hometown Alexandria. For example, this project is also sus; it's clearly vibecoded, the HF page is newly created, and he is for some reason using the same name as the "Token AI" company but adding another "i", so it says "Aii"? Even the model name has a typo lol
https://huggingface.co/tokenaii/Hours-1.0-4B-GGUF
Ok_Use_These@reddit
It's not a disposable account, it's mine, I don't usually post but this guy is just a fraud and I could not let it pass
Please see u/nindaleth's message, he found what I was talking about
Hathos_@reddit
I understand. Thanks u/nindaleth for the link. There is nothing wrong with what you are posting, but you just needed to have some sort of proof or substance for people to look at.
PowerBottomBear92@reddit
It's the Pyramids all over again
Electronic-Metal2391@reddit
Usage parameters? Architecture? This info is not mentioned anywhere.
Ok_Warning2146@reddit
plain old llama. you can read that from their config.json
assemsabryy@reddit (OP)
Llama
IrisColt@reddit
Ooh, shiny new toy... gimme, gimme! Thanks!
nuclearbananana@reddit
Very cool. Always good to see more countries beyond China and America.
Do you have a tech report? What architecture is it?
LinkSea8324@reddit
Weird country bro
Cherlokoms@reddit
r/USdefaultism
nuclearbananana@reddit
That's not US defaultism; the U.S. is literally a dictionary definition for 'America': https://www.merriam-webster.com/dictionary/America
When people are referring to the continents they say North or South America
assemsabryy@reddit (OP)
Llama
assemsabryy@reddit (OP)
everything explained on the website
TokenAI
Fine_League311@reddit
Cool! Will play with it. Thanks!
Fine_League311@reddit
PS: MIT and no GIT?
Rheumi@reddit
Visiting Hurghada next week. I like Egypt.
xatey93152@reddit
Lol. Better save your time for something else
BetApprehensive4546@reddit
Cool
mKtos@reddit
It seems that the GGUF version has a wrong chat template (or at least wrong stop token definitions), because it is spitting out tokens like this:
"(...) Do you have any more questions? Can we play some funny games together?<|end|><|end|><|user|>Can we play hide and seek, then?<|end|><|assistant|>Of course! Hide and Seek is always fun to play. Here are the rules:"
In the previous turn it generated "(...) However, one thing we can all agree on: it is a great joke!<|end|>" and stopped, the LM Studio said "EOS Token found".
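Until the GGUF metadata is fixed, one client-side workaround is to cut the generated text at the first special token yourself. A minimal sketch; `truncate_at_stop` is a hypothetical helper, and the stop strings are simply the tokens visible in the output above:

```python
def truncate_at_stop(text, stops=("<|end|>", "<|user|>", "<|assistant|>")):
    """Return `text` truncated at the earliest occurrence of any stop
    string, or unchanged if none of them appear."""
    cut = len(text)
    for s in stops:
        i = text.find(s)
        if i != -1:
            cut = min(cut, i)
    return text[:cut]

# e.g. truncate_at_stop("it is a great joke!<|end|><|user|>...")
#      keeps only "it is a great joke!"
```

The proper fix is to add `<|end|>` to the EOS/stop-token list in the GGUF metadata or the runtime's stop-string setting, so the model actually halts instead of being trimmed after the fact.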
redbarone@reddit
Is it any good at processing SAR tomography looking for caverns?
BestSeaworthiness283@reddit
nice!!
StoneCypher@reddit
egypt's first open source AI model was released in the 1980s, most likely before you were born
LoveMind_AI@reddit
Every globally developed opensource model is a true gift.
WorldlinessTime634@reddit
great work 👍
pmttyji@reddit
From HF:
https://huggingface.co/tokenaii/horus
https://huggingface.co/tokenaii/Hours-1.0-4B-GGUF
https://huggingface.co/tokenaii/Horus-1.0-4B-MLX
pmttyji@reddit
u/assemsabryy GGUF name has typo, needs fix.
assemsabryy@reddit (OP)
Thank you so much, I fixed it 🙏🏼
pmttyji@reddit
Still that model card has Hours(instead of Horus) in 3 places. Damn Autocorrects!
assemsabryy@reddit (OP)
Can you tell me where exactly?
pmttyji@reddit
Just open the GGUF page https://huggingface.co/tokenaii/Horus-1.0-4B-GGUF
And search for Hours (Ctrl+F, type Hours), you'll see 3 entries
spaceman3000@reddit
Languages? They are not mentioned anywhere. On huggingface it only says English and Arabic.
Impossible_Art9151@reddit
Congratulations!
A question that came to my mind - Egyptian, "Horus" - is it also trained on reading ancient egyptian writings?
TheOutsider25@reddit
Great work. good to see more work coming from Egypt.
interested to know how does it compare to karnak which is also an Egyptian model but fine tuned instead of trained from scratch ?
currently it tops the OALL Arabic leaderboards.
assemsabryy@reddit (OP)
Karnak has 41B parameters, Horus has 4B, so we're talking about a 10x difference; it would be unfair to compare. Plus, Karnak was fine-tuned from Qwen 30B.
Queasy-Contract9753@reddit
First
TopChard1274@reddit
You have landed wrong, YouTube is two pages on the right
Queasy-Contract9753@reddit
Fair enough. I'll take my downvotes. Still cool that there's new base models out there from around the world
Visual_Strawberry276@reddit
Benchmarks look really good, 3ash! 💪🏾
sunychoudhary@reddit
Very cool milestone.
Honestly, the part I like most is not just “new model,” but trained from scratch + open-source + multilingual. More regions building their own models and infrastructure is good for the whole ecosystem, especially for language coverage and local use cases. The post says Horus-1.0-4B was trained from scratch, supports 8K context, and ships in multiple variants for different hardware setups.
d00m_sayer@reddit
Poorer countries like India made 100B+ models; there is nothing impressive here.
Beginning-Window-115@reddit
you sound like a good person
Faktafabriken@reddit
Fun guy to be around also probably….
ExosFantome@reddit
Nice. I guess if they build a coding agent, it will be named "Horus-code" which sounds a bit like horoscope :D
Kitchen_Zucchini5150@reddit
Kindly check ur pm.
insanemal@reddit
Congratulations to the team behind this!
Azuriteh@reddit
Hey Assem, what a coincidence to see you here :), it's Irving. Will take a look.
Azuriteh@reddit
You should probably add a Huggingface repo or you'll get killed on r/LocalLLaMA btw
assemsabryy@reddit (OP)
I want people to visit the main page of the model on the website to see the full guide to using the model through neuralnode
Azuriteh@reddit
I see, that's fair, I'll delete the comment lol. You should also try and get some GGUFs so people can actually try it on their GPUs
assemsabryy@reddit (OP)
Hi my friend, Fr what a coincidence