
RE: GPT4All: How to run "ChatGPT" locally on your PC, Facebook/Meta has ignited the open-source uncensored GPT community, what an irony 🚀

in STEMGeeks last year

The cool thing is that you can run the models on the GPU, on the CPU, or split the inference between the two. The GPU is preferable because it's much faster, but it's entirely possible to run the whole model from CPU RAM alone.
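As a rough sketch of what splitting looks like in practice: with llama.cpp-based runners (which GPT4All builds on), you choose how many transformer layers to offload to the GPU and the rest stay in CPU RAM. The `choose_gpu_layers` helper below is hypothetical, just to illustrate the trade-off; the commented-out lines show the assumed llama-cpp-python API, where `n_gpu_layers=0` means CPU only. The model path and per-layer sizes are placeholders.

```python
def choose_gpu_layers(total_layers: int, vram_mb: int, mb_per_layer: int) -> int:
    """Hypothetical helper: offload as many layers as fit in the VRAM budget.

    Returns a value between 0 (pure CPU inference) and total_layers
    (entire model on the GPU); anything in between is a split.
    """
    if mb_per_layer <= 0:
        raise ValueError("mb_per_layer must be positive")
    return max(0, min(total_layers, vram_mb // mb_per_layer))


if __name__ == "__main__":
    # e.g. a 32-layer 7B model, ~400 MB per quantized layer, 8 GB of VRAM:
    n_gpu = choose_gpu_layers(total_layers=32, vram_mb=8000, mb_per_layer=400)
    print(n_gpu)  # 20 layers on the GPU, the remaining 12 in CPU RAM

    # Assumed usage with llama-cpp-python (pip install llama-cpp-python):
    # from llama_cpp import Llama
    # llm = Llama(model_path="model.gguf", n_gpu_layers=n_gpu)  # 0 = CPU only
    # print(llm("Q: What is the capital of France? A:", max_tokens=8))
```

The same idea appears in the llama.cpp CLI as the `-ngl` / `--n-gpu-layers` flag: set it to 0 for CPU-only inference, or to the full layer count to keep everything on the GPU.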