Built-in AI in your browser

Pranav Tiwari
2 min read · Jun 25, 2024


Day 177 / 366

The good thing about open-source LLMs is that they come in all shapes and sizes. That means you get really small models that may not be as accurate as GPT, but they can run on very limited hardware. People have already managed to run small LLMs on Android phones. And soon Google Chrome will be shipping with its own built-in LLM as well!

This is really interesting for developers. Right now they rely on API calls to hosted models to get their GenAI fix. That is costly, adds latency, and creates a dependency on an external service like OpenAI.

On the other hand, with an on-device model, running an LLM will be as easy as using a built-in JS API like console.log or localStorage. All the computation will happen right there in the user's browser. It will be totally free, and there will be no privacy concerns either, since the user's data will be processed locally and never sent to a server.
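As a rough idea of what that could look like, here is a sketch based on the experimental Prompt API Chrome previewed in 2024 (a `window.ai` namespace exposing text sessions). The method names here (`createTextSession`, `prompt`) reflect that early preview and may well change before the API stabilizes, so treat this as an illustration rather than a stable interface:

```javascript
// Sketch of calling an on-device browser LLM, modeled on Chrome's
// experimental window.ai Prompt API (names may change; this is an
// illustration, not a stable API).
async function summarize(aiNamespace, text) {
  // Spins up a session backed by the local model --
  // no API key, no network round trip.
  const session = await aiNamespace.createTextSession();

  // prompt() runs inference entirely on the user's machine.
  return session.prompt(`Summarize in one sentence: ${text}`);
}

// In the browser, you would call it with the built-in namespace:
//   const summary = await summarize(window.ai, longArticleText);
```

The function takes the AI namespace as a parameter rather than reading `window.ai` directly, which makes it easy to feature-detect (the API will only exist in supporting browsers) and to stub out in tests.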

I think a lot of use cases do not need an extremely powerful model, and something like this will be perfect for them. It also seems they will provide an option to fine-tune the local model. I cannot wait to try this!

