Tag: gpt
Fine-tuning Tutorials • GPT Fine-tuning
GPT Fine-tuning
This tutorial shows how to fine-tune GPT-based models open-sourced by Hugging Face on the MoAI Platform.
1. Preparing for Fine-tuning
Preparing the PyTorch script execution environment on the MoAI Platform is similar to doing so on a typical GPU server.
2. Understanding Training Code
Once you have prepared the training data, let's look at the contents of the train_gpt.py script used to run the actual fine-tuning process.
3. Model Fine-tuning
Now, we will train the model through the following process.
4. Checking Training Results
As in the previous chapter, when you run the train_gpt.py script, the resulting model will be saved in the …
5. Changing the Number of GPUs
Let's rerun the fine-tuning task with a different number of GPUs.
6. Conclusion
So far, we have looked at the process of fine-tuning a GPT-based model from Hugging Face on the MoAI Platform.