The following tutorial will take you through the steps required to fine-tune the Baichuan2 13B model with an example dataset, using the MoAI Platform.
Preparing the PyTorch script execution environment on the MoAI Platform is similar to doing so on a typical GPU server.
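The exact package set depends on the MoAI Platform image you are using, but a typical setup mirrors what you would do on any GPU server. A minimal sketch (the environment name and the Hugging Face package list are illustrative, not taken from the tutorial; check the platform documentation for exact requirements):

```shell
# Create an isolated Python environment for the tutorial
# (the directory name "baichuan-env" is arbitrary).
python3 -m venv baichuan-env
. baichuan-env/bin/activate

# The training script is assumed to need the usual Hugging Face stack;
# uncomment after confirming the versions your platform image expects:
# pip install torch transformers datasets accelerate

python --version
```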
If you've prepared all the training data, let's now take a look at the contents of train_baichuan2_13b.py.
Now, we will train the model through the following process.
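The full train_baichuan2_13b.py script is not reproduced here, but a causal-language-model fine-tuning loop generally follows the shape below. This sketch uses a tiny stand-in model so it runs anywhere; the real script would instead load the 13B checkpoint, e.g. via `transformers.AutoModelForCausalLM.from_pretrained`:

```python
import torch
import torch.nn as nn

# Tiny stand-in for the causal LM; the real script would load
# Baichuan2-13B through the Hugging Face transformers library.
vocab_size, hidden = 100, 32

class ToyCausalLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.lm_head = nn.Linear(hidden, vocab_size)

    def forward(self, input_ids):
        return self.lm_head(self.embed(input_ids))

model = ToyCausalLM()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One dummy batch of token ids; the objective is next-token prediction,
# so the labels are the inputs shifted by one position.
batch = torch.randint(0, vocab_size, (4, 16))
inputs, labels = batch[:, :-1], batch[:, 1:]

for step in range(3):  # the real script iterates over the whole dataset
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, vocab_size), labels.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

On the MoAI Platform this same PyTorch code runs unmodified; the platform handles device placement and parallelism behind the standard APIs.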
As in the previous chapter, when you execute the train_baichuan2_13b.py script, the resulting model will be saved in the
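In Hugging Face-based scripts, saving is typically done with `save_pretrained`; the plain-PyTorch equivalent is to persist the state dict. A minimal, runnable sketch (the small `nn.Linear` and the file name stand in for the fine-tuned model and its checkpoint):

```python
import os
import tempfile
import torch
import torch.nn as nn

model = nn.Linear(8, 8)           # stand-in for the fine-tuned model
save_dir = tempfile.mkdtemp()     # the real script writes to a directory you choose
path = os.path.join(save_dir, "pytorch_model.bin")

torch.save(model.state_dict(), path)   # persist the trained weights

# Later, restore the weights into a fresh module for inference/evaluation.
reloaded = nn.Linear(8, 8)
reloaded.load_state_dict(torch.load(path))
```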
Let's rerun the fine-tuning task with a different number of GPUs.
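On the MoAI Platform the accelerator size is changed through the platform rather than the script, so the training code can stay agnostic to the GPU count. One common pattern (WORLD_SIZE is the standard torch.distributed convention; it is an assumption here that the platform launcher sets it the same way) is to derive the per-device batch size from the process count:

```python
import os

# Number of participating processes; defaults to 1 when run standalone.
world_size = int(os.environ.get("WORLD_SIZE", "1"))

# Keep the global batch size constant regardless of GPU count (illustrative values).
global_batch = 64
per_device_batch = max(1, global_batch // world_size)
print(world_size, per_device_batch)
```

With this pattern, rerunning the same script on more GPUs changes only the degree of data parallelism, not the effective training configuration.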
So far, we've explored the process of fine-tuning the Baichuan2 13B model, which is publicly available on Hugging Face, using the MoAI Platform.