Compare GPUs for running LLMs
Hey everyone,

People keep asking me to compare GPUs, so I put together a simple static site to speed up the research process.

đź”— Link: https://lucam185.github.io/GPU-comparison-website/

You can:

  • Search and filter
  • Compare speeds side-by-side based on active parameters

Quick note: The speed estimates are theoretical, derived from memory bandwidth and TFLOPS, and I'm guesstimating an efficiency factor based on age and other factors. Obviously, real-world performance depends heavily on offloading, drivers, tensor cores, and specific optimizations, but it should work as a decent starting point for your research.
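For anyone curious how a bandwidth-based estimate like this works, here's a rough back-of-the-envelope sketch. It assumes memory-bound token generation (each decoded token reads every active parameter from VRAM once); the function name, the default efficiency factor, and the bytes-per-parameter value are my own illustrative assumptions, not the site's actual formula:

```python
# Rough decode-speed estimate, assuming memory-bound generation:
# each generated token must stream all active parameters from VRAM once.
def estimate_tokens_per_sec(bandwidth_gb_s, active_params_b,
                            bytes_per_param=2.0, efficiency=0.6):
    """bandwidth_gb_s: memory bandwidth in GB/s.
    active_params_b: active parameter count in billions.
    bytes_per_param: 2.0 for FP16, ~0.55 for Q4 quantization.
    efficiency: fudge factor for real-world overhead (a guess)."""
    model_gb = active_params_b * bytes_per_param
    return bandwidth_gb_s / model_gb * efficiency

# Example: a GPU with 1000 GB/s running a 7B model at FP16:
# 1000 / (7 * 2.0) * 0.6 ≈ 42.9 tokens/s
print(round(estimate_tokens_per_sec(1000, 7), 1))
```

Prompt processing (prefill) is compute-bound instead, which is where TFLOPS matters more than bandwidth.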

Let me know if you have any questions or specific requests! 🚀

submitted by /u/LucaM185