The first AI Model in Egypt 🇪🇬

Following up on the Horus project — the first fully built-from-scratch language model in Egypt.

If this is your first time hearing about Horus: it’s a language model built entirely from scratch, and it’s open-source.


Hugging Face repo: https://huggingface.co/tokenaii/horus

About a week ago, the source code used to train the model was also released, making it available for developers to explore, use, and build on.

https://github.com/tokenaii/horus-1.0

This makes Horus the first fully trained-from-scratch LLM in Egypt, developed by Assem Sabry and TokenAI.

Today, I’m sharing some early details about the next version: Horus 1.5 Instruct.

This new version is expected to perform roughly 5x better than Horus 1.0, with a 64K context length, 8x the 8K context of Horus 1.0 4B.

But it’s not just about scaling — Horus 1.5 comes with major improvements in architecture and overall capability, pushing the model to a completely different level.

Also, there are updates about a new cybersecurity model from TokenAI.

A specialized model designed to detect vulnerabilities and fix them instantly. It’s planned to be a large-scale model, trained on trillions of tokens of highly specialized security data, which should make it extremely powerful.

All of this is AI built entirely in Egypt.

TokenAI is starting to seriously shift the AI scene in Egypt and the Arab world, and what we're building is honestly something exceptional.

More official announcements about the next Horus models are coming soon: bigger, stronger, and significantly more efficient.

submitted by /u/assemsabryy
