Do cheap 32GB V100s still make sense for homelab AI?
I already have an RTX 5060 Ti 16GB and a 5070 Ti, but I’m wondering whether picking up a couple of Tesla V100 32GB cards would actually be good value, specifically for running larger local models. I know the V100 is old, power-hungry, and mi…