Are there any local LLM models that run in the browser and are currently deployed in a real project?

I'm asking because a local LLM running within the browser could be brilliant for a lot of applications. Has anything been built around this yet? And are any LLM models capable enough at this stage that a browser application could use the person's own device to generate responses?

submitted by /u/10c70377
