It may take a tremendous amount of computing resources to train AI models, but tinkerers are discovering that they can be run on significantly less sophisticated hardware.
A researcher has managed to run GPT4All, a lightweight open-source alternative to ChatGPT, on a TI-84 calculator released in 2004. In a demo, researcher Brian Roemmele showed how one could enter a brief prompt on the calculator and, after a bit of a wait, have it return a result. “You will own your own Personal AI and it could run on a: TI-84 calculator released in 2004,” he wrote on Twitter.
GPT4All is a 7-billion-parameter language model fine-tuned on a curated set of 400k GPT-3.5-Turbo assistant-style generations. It is open source, and people have been running it on relatively lightweight systems, including Apple M1 laptops and even older machines.
Other tinkerers have also managed to get lightweight LLMs running on personal devices. One researcher got LLaMA running on a Google Pixel, while another got it running on a Raspberry Pi. The Raspberry Pi version could generate only one token every 10 seconds, but it’s a start.
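To put that rate in perspective, a quick back-of-the-envelope sketch (the 100-token reply length is a hypothetical figure for illustration, not one from the demo):

```python
# One token every 10 seconds on the Raspberry Pi build.
seconds_per_token = 10
reply_tokens = 100  # hypothetical length of a short answer

total_seconds = seconds_per_token * reply_tokens
print(f"A {reply_tokens}-token reply would take about {total_seconds // 60} minutes")
```

At that pace a short paragraph of output takes roughly a quarter of an hour, which is why such ports are proofs of concept rather than practical assistants, at least for now.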
This opens up the tantalizing possibility that high-quality LLMs could one day fit on personal devices. Currently, users need internet access: they send a query to a company like OpenAI, which performs the computation on its servers, and then receive an answer, again over the internet. If high-quality, open-source LLMs can run on personal devices, they could eliminate the need for an intermediary entirely, giving anyone with a basic device the power of LLMs at their fingertips. These models would not only work without the internet, but could handle an unlimited number of queries at no cost per query. It remains to be seen how these developments shape up, but we need to buckle up for a world that’s going to be very different from the one we’ve gotten used to over the last few decades.