Since OpenAI has not open-sourced the code for ChatGPT, replicating the chatbot is a herculean task, and even big-tech companies are struggling. But AI startup Colossal-AI has found a way to build your own ChatGPT with far fewer computing resources.
Towards this goal, the company has released a PyTorch-based implementation that covers all three training stages: pre-training, reward model training, and reinforcement learning. It offers a demo version of the training process that requires only 1.62 GB of GPU memory and runs on a single consumer-grade GPU, with up to a 10.3x increase in the model capacity that fits on one GPU.
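To make the three-stage pipeline concrete, here is a minimal, self-contained sketch of its structure in plain PyTorch. Everything in it (the TinyLM and TinyRewardModel classes, the sizes, and the random data) is an illustrative placeholder, not the Colossal-AI API; the reinforcement learning step is also simplified to a bare policy-gradient update rather than the full PPO-style training a production pipeline would use.

```python
# Toy sketch of the three-stage pipeline: (1) supervised pre-training,
# (2) reward model training, (3) RL fine-tuning against the reward model.
# All names, shapes, and data are hypothetical stand-ins, not Colossal-AI code.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 100, 32

class TinyLM(nn.Module):
    """Toy causal LM standing in for the base chatbot model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.head = nn.Linear(DIM, VOCAB)
    def forward(self, tokens):            # (batch, seq) -> (batch, seq, vocab)
        return self.head(self.embed(tokens))

class TinyRewardModel(nn.Module):
    """Toy reward model: scores a whole response with a single scalar."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.score = nn.Linear(DIM, 1)
    def forward(self, tokens):            # (batch, seq) -> (batch,)
        return self.score(self.embed(tokens).mean(dim=1)).squeeze(-1)

policy = TinyLM()
reward_model = TinyRewardModel()

# Stage 1: supervised training on next-token prediction over text data.
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
tokens = torch.randint(0, VOCAB, (4, 16))       # fake training batch
logits = policy(tokens[:, :-1])
loss = F.cross_entropy(logits.reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
loss.backward(); opt.step(); opt.zero_grad()

# Stage 2: reward model training with a pairwise ranking loss on
# human-labeled (chosen, rejected) response pairs.
rm_opt = torch.optim.Adam(reward_model.parameters(), lr=1e-3)
chosen = torch.randint(0, VOCAB, (4, 16))
rejected = torch.randint(0, VOCAB, (4, 16))
rm_loss = -F.logsigmoid(reward_model(chosen) - reward_model(rejected)).mean()
rm_loss.backward(); rm_opt.step(); rm_opt.zero_grad()

# Stage 3: RL fine-tuning -- generate responses with the policy, score them
# with the frozen reward model, and raise the log-probability of high-reward
# outputs (a REINFORCE-style stand-in for the usual PPO step).
prompts = torch.randint(0, VOCAB, (4, 8))
with torch.no_grad():
    samples = policy(prompts).argmax(dim=-1)    # greedy "sampling" for brevity
    rewards = reward_model(samples)
logp = F.log_softmax(policy(prompts), dim=-1)
sample_logp = logp.gather(-1, samples.unsqueeze(-1)).squeeze(-1).sum(dim=1)
pg_loss = -(rewards * sample_logp).mean()
pg_loss.backward(); opt.step(); opt.zero_grad()
```

A real implementation would add PPO clipping and a KL penalty against the initial policy in stage 3, plus the memory optimizations (such as offloading and mixed precision) that make the 1.62 GB single-GPU demo figure possible; this sketch only shows how the three stages hand off to one another.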
Check out the GitHub repository here.