
Our chatty AI man

Our first test involves running the LLaMA payload discussed in LXF304, which creates a ChatGPT-like bot that runs locally. Due to the sheer size of the model, which has to be obtained via BitTorrent, this task cannot be completed on the Raspberry Pi alone – a workstation is required to handle some of the more computationally intensive preparation work.
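
Broadly, the workstation side consists of building the inference toolchain and converting the downloaded weights into its native format. As a minimal sketch – assuming the popular llama.cpp port is used, with an illustrative 7B model directory, and noting that the conversion script has been renamed between releases – the preparation looks something like this:

$ git clone https://github.com/ggerganov/llama.cpp
$ cd llama.cpp
$ make
# place the weights obtained via BitTorrent under ./models, then convert them
$ python3 convert.py models/7B/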

Furthermore, be aware that the quantisation steps have to be repeated even if you have performed them in the past. The LLaMA framework receives frequent updates, and trying to use a recent version with an old quantisation leads to errors.
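
To illustrate, re-quantising with the tools from your current checkout and then running the result might look like the following sketch – the filenames here are assumptions, and recent llama.cpp releases expect the GGUF file format rather than the older GGML one:

$ ./quantize models/7B/ggml-model-f16.gguf models/7B/ggml-model-q4_0.gguf q4_0
# copy the quantised model to the Pi and start the chat bot there
$ ./main -m models/7B/ggml-model-q4_0.gguf -p "Hello!" -n 128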

Excursus: swap it

Since advanced operating systems became available on all kinds of workstations, expanding the working memory
