Much like Stable Diffusion last year, advances in large language models (LLMs) mean that you can now run your very own chatbot on your local PC. (https://tinyurl.com/local-chatbot) Caveats apply.
This isn’t ChatGPT. You’ll want a GPU with as much VRAM as possible, plus plenty of system memory—32GB is a realistic minimum. Nvidia GPUs are also vastly preferable, since most local-LLM tooling targets CUDA.
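To see why VRAM is the bottleneck, a rough back-of-the-envelope sketch helps: just storing a model's weights takes roughly (parameter count × bits per weight ÷ 8) bytes, before any overhead for activations or context. The function name and the overhead-free simplification below are illustrative assumptions, not part of any particular tool.

```python
def weights_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough lower bound on memory (in GB) needed just to hold model weights.

    This is a simplification: real usage adds overhead for the KV cache,
    activations, and framework buffers, so treat it as a floor, not a budget.
    """
    return params_billions * bits_per_weight / 8


# A 7B-parameter model in 16-bit floats needs about 14 GB of VRAM...
print(f"7B @ fp16: {weights_gb(7, 16):.1f} GB")   # → 14.0 GB

# ...which is why 4-bit quantization, at roughly 3.5 GB, is so popular
# for consumer GPUs.
print(f"7B @ 4-bit: {weights_gb(7, 4):.1f} GB")   # → 3.5 GB
```

This arithmetic explains the "as much VRAM as possible" advice: an unquantized 7B model already exceeds the memory of many consumer cards, and anything that spills out of VRAM into system RAM runs far slower, which is also why 32GB of system memory is a realistic minimum.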