
# Tag: baichuan


Tutorials • Fine-tuning Tutorials
Baichuan2 Fine-tuning

This tutorial introduces an example of fine-tuning the open-source Baichuan2-13B model on the MoAI Platform.

Tutorials • Fine-tuning Tutorials • Baichuan2 Fine-tuning
1. Preparing for Fine-tuning

Preparing the PyTorch script execution environment on the MoAI Platform is similar to doing so on a typical GPU server.
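As a quick sanity check once the environment is prepared, a generic PyTorch snippet like the one below confirms that the framework is importable and that an accelerator backend is visible. This is an illustrative assumption about a typical setup, not a MoAI-specific command:

```python
# Generic sanity check for a prepared PyTorch environment (illustrative, not MoAI-specific).
import torch

print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # whether an accelerator backend is visible
```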

Tutorials • Fine-tuning Tutorials • Baichuan2 Fine-tuning
2. Understanding Training Code

If you've prepared all the training data, let's now take a look at the contents of train_baichuan2.py.
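For orientation, the sketch below shows the general shape such a script typically takes when fine-tuning a causal language model with the Hugging Face transformers API. The checkpoint name, hyperparameters, and the training_step helper are illustrative assumptions, not the actual contents of train_baichuan2.py:

```python
# Minimal fine-tuning sketch; names and hyperparameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "baichuan-inc/Baichuan2-13B-Base"  # public checkpoint on Hugging Face
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token   # make padding well-defined
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
model.train()

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def training_step(batch_texts):
    """One optimizer step of causal-LM fine-tuning on a batch of raw strings."""
    inputs = tokenizer(batch_texts, return_tensors="pt",
                       padding=True, truncation=True, max_length=512)
    # For causal-LM fine-tuning, the labels are the input ids themselves.
    outputs = model(**inputs, labels=inputs["input_ids"])
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return outputs.loss.item()
```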

Tutorials • Fine-tuning Tutorials • Baichuan2 Fine-tuning
3. Model Fine-tuning

Now, we will train the model through the following process.

Tutorials • Fine-tuning Tutorials • Baichuan2 Fine-tuning
4. Checking Training Results

Similar to the previous chapter, when you execute the train_baichuan2_13b.py script, the resulting model will be saved in the …
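The actual output directory is defined in the tutorial itself. Purely as an illustration, saving and later reloading a fine-tuned model usually follows the standard Hugging Face pattern below; the path is a placeholder, and model and tokenizer are assumed to come from a fine-tuning script like the sketch above:

```python
# Placeholder path; the tutorial defines the real save location.
save_dir = "./baichuan2-13b-finetuned"

model.save_pretrained(save_dir)      # writes config and weights
tokenizer.save_pretrained(save_dir)  # writes tokenizer files alongside

# Reload later for evaluation or inference.
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained(save_dir, trust_remote_code=True)
```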

Tutorials • Fine-tuning Tutorials • Baichuan2 Fine-tuning
5. Changing the Number of GPUs

Let's rerun the fine-tuning task with a different number of GPUs.
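If, as the tutorial suggests by rerunning the same task, the device count is changed at the platform level rather than in the script, the training code itself stays the same. A generic check such as the one below (plain PyTorch, assumed here rather than a MoAI API) reports how many devices the process currently sees:

```python
import torch

# Reports the number of visible accelerator devices in the current environment.
print(f"visible devices: {torch.cuda.device_count()}")
```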

Tutorials • Fine-tuning Tutorials • Baichuan2 Fine-tuning
6. Conclusion

So far, we've explored the process of fine-tuning the Baichuan2 13B model, which is publicly available on Hugging Face, using the MoAI Platform.

© Copyright Moreh 2024. All rights reserved.