
# Tag: mistral


## Mistral Fine-tuning

This tutorial walks you through fine-tuning the open-source Mistral 7B model on the MoAI Platform.

## 1. Preparing for Fine-tuning

Preparing the PyTorch script execution environment on the MoAI Platform is similar to doing so on a typical GPU server.
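As a quick illustration of that similarity, a plain PyTorch sanity check such as the following (a minimal sketch, assuming PyTorch is already installed in the environment) should behave the same way on the MoAI Platform as on an ordinary GPU server:

```python
import torch

# Confirm the environment: PyTorch version and visible accelerators.
# On the MoAI Platform the standard CUDA-style device API is expected
# to work just as it does on a typical GPU server.
print(torch.__version__)
print(torch.cuda.is_available())   # True if an accelerator is visible
print(torch.cuda.device_count())   # number of visible devices
```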

## 2. Understanding Training Code

Once all the training data is prepared, let's look into the contents of the train_mistral.py script that runs the actual fine-tuning.
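For orientation, here is a rough sketch of the kind of logic a script like train_mistral.py typically contains. The checkpoint name, training text, and hyperparameters below are illustrative assumptions, not the tutorial's actual values:

```python
import torch
from torch.optim import AdamW
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; substitute the one the tutorial actually uses.
model_name = "mistralai/Mistral-7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # Mistral's tokenizer ships without a pad token

model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16).cuda()
optimizer = AdamW(model.parameters(), lr=1e-5)

# Placeholder training batch; a real script would iterate over a dataset.
texts = ["def add(a, b):\n    return a + b"]
batch = tokenizer(texts, return_tensors="pt", padding=True).to("cuda")

model.train()
loss = model(**batch, labels=batch["input_ids"]).loss  # causal-LM loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```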

## 3. Model Fine-tuning

Now, we will train the model through the following process.

## 4. Checking Training Results

Running the train_mistral.py script, as in the previous section, will save the resulting model in the save directory configured in the script.
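A quick way to smoke-test such a checkpoint is to load it back and generate from a prompt. The directory name below is an assumption; use whatever path train_mistral.py actually saved to:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

ckpt_dir = "./mistral_output"  # hypothetical save directory

tokenizer = AutoTokenizer.from_pretrained(ckpt_dir)
model = AutoModelForCausalLM.from_pretrained(ckpt_dir).eval()

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```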

## 5. Changing the Number of GPUs

Let's rerun the fine-tuning task with a different number of GPUs.
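For context, on a conventional PyTorch cluster the GPU count is usually set by the launcher rather than in the training code; the tutorial covers the MoAI-specific procedure. A sketch of the conventional idiom only:

```python
# Conventional launch, for comparison (not the MoAI procedure):
#   torchrun --nproc_per_node=8 train_mistral.py
# The script can read the resulting process count back like this.
import os
import torch

world_size = int(os.environ.get("WORLD_SIZE", "1"))
print(f"{world_size} process(es), {torch.cuda.device_count()} visible device(s)")
```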

## 6. Conclusion

In this tutorial, we have seen how to fine-tune the Mistral 7B model on the MoAI Platform.

© Copyright Moreh 2024. All rights reserved.