LocalLLaMA

Suggest a GPU cloud provider with reasonable costs to host an open-source model for personal use

I suspect that I am priced out of buying hardware until at least 2027/28, and that's as a serious hobbyist. The alternative for me is to just say fuck it all and sign up with a Chinese AI service.

submitted by /u/Mental-At-ThirtyFive