Experts/Volunteers needed for Vulkan on ik_llama.cpp

ik_llama.cpp is great on both CPU & CUDA. We need some legends to make its Vulkan back-end just as good.

https://github.com/ikawrakow/ik_llama.cpp/discussions/590#discussioncomment-16357564

Quoting ikawrakow from that discussion:

So, after bringing the Vulkan back-end up to speed some time ago, I felt that I simply don't have the bandwidth to also maintain it. In llama.cpp there are two maintainers who do nothing else but Vulkan.
But if you are willing to do that, we can try to resurrect Vulkan. Of particular interest would be to implement the graph parallel stuff in the Vulkan back-end (after porting quite a few missing ops that have accumulated since my last effort).
I guess the issue will be that I'm a complete beginner when it comes to Vulkan. So, unlike your CPU changes prepared with the help of Claude, where I was able to quickly spot a problem, with Vulkan we will be left at Claude's mercy, which may turn into a complete disaster over time. So I think, if you want to become a Vulkan maintainer for ik_llama.cpp, you need to become significantly more knowledgeable than me.

https://github.com/ikawrakow/ik_llama.cpp/pull/608

https://github.com/ikawrakow/ik_llama.cpp/discussions/562

Thanks in advance!

submitted by /u/pmttyji
