Reka Edge 2603 multimodal support has been merged into llama.cpp

Hi r/LocalLLaMA! I work at Reka and organized our AMA last month. Some of y'all have asked for llama.cpp support - this is a follow-up to let you know that Reka Edge 2603 is now supported upstream in llama.cpp.

To get started:
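A rough sketch of the flow, assuming the standard llama.cpp build steps; `RekaAI/reka-edge-2603-GGUF` is a placeholder repo name, so check Reka's Hugging Face page for the actual GGUF upload:

```bash
# Build llama.cpp from latest master (Edge 2603 support just landed upstream)
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Start the server. -hf pulls the GGUF from Hugging Face and, for multimodal
# models, also fetches the matching mmproj file automatically.
# NOTE: the repo name below is a placeholder -- use Reka's actual GGUF repo.
./build/bin/llama-server \
  -hf RekaAI/reka-edge-2603-GGUF \
  --reasoning off   # per the note below: the model doesn't support reasoning
```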

One note: the model does not currently support reasoning, so run llama-server with `--reasoning off` (already included in the sketch above). A quick multimodal smoke test follows below.
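Once the server is up, llama-server exposes an OpenAI-compatible `/v1/chat/completions` endpoint that accepts image parts as base64 data URIs. 8080 is llama-server's default port, and `<BASE64_IMAGE>` is a placeholder for your base64-encoded image bytes:

```bash
# Send one image plus a text prompt to the local server.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{
      "role": "user",
      "content": [
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url",
         "image_url": {"url": "data:image/jpeg;base64,<BASE64_IMAGE>"}}
      ]
    }]
  }'
```

Happy hacking!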
