LocalLLaMA

I can't decide on a local OCR model for most of my tasks. I'd prefer hearing individual experiences over published reviews.

I have a 16GB VRAM GPU and I'm looking for a reliable local OCR model. Ideally it should stay under ~60% VRAM usage, so around 9–10GB max, because I want to keep it available on-demand rather than loading a huge model only for occasional batch jobs…
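For anyone curious how I'm sizing this, here's the rough budget math I'm doing: weight footprint is parameters × bytes per parameter, plus some headroom for activations/KV cache. This is just a back-of-the-envelope sketch, not a measurement — the 20% overhead factor and the bits-per-param figures (quantized formats carry some metadata beyond the nominal bit width) are my own assumptions:

```python
def vram_estimate_gb(params_b: float, bits_per_param: float, overhead: float = 0.2) -> float:
    """Rough VRAM estimate: weights plus a fudge factor for activations/KV cache.

    params_b: parameter count in billions
    bits_per_param: effective bits per weight (quantized formats are a bit
                    above the nominal width due to scales/metadata -- assumption)
    overhead: extra fraction for activations/cache (assumed, not measured)
    """
    weights_gb = params_b * bits_per_param / 8  # billions of params -> GB
    return weights_gb * (1 + overhead)

budget_gb = 16 * 0.60  # my ~60% target on a 16GB card, i.e. ~9.6 GB

candidates = [
    ("7B @ ~4-bit", 7, 4.5),
    ("7B @ ~8-bit", 7, 8.5),
    ("13B @ ~8-bit", 13, 8.5),
]
for name, params_b, bits in candidates:
    est = vram_estimate_gb(params_b, bits)
    verdict = "fits" if est <= budget_gb else "too big"
    print(f"{name}: ~{est:.1f} GB -> {verdict}")
```

By this napkin math a 7B model fits my budget even at 8-bit, while anything 13B-class at 8-bit blows past it, so I'm mostly looking at the ≤7B OCR/VLM range.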