# 6. Conclusion

So far, we've explored the process of fine-tuning the Baichuan2 13B model, which is publicly available on Hugging Face, using the MoAI Platform. With the MoAI Platform, you can easily fine-tune PyTorch-based open-source LLMs on GPU clusters while keeping your existing training code intact. You can also configure the number of GPUs you need without changing a single line of code. So please dive in and develop new models quickly and effortlessly with your data!
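As a reminder of what this looks like in practice, the sketch below recaps the scaling workflow covered earlier in this tutorial. It assumes the `moreh-switch-model` and `moreh-smi` utilities introduced in the previous sections; the training script name and its flags are illustrative placeholders, not fixed names.

```bash
# Illustrative recap: scale from a smaller to a larger GPU configuration
# without modifying the training script.

# Select a different accelerator configuration interactively
# (utility introduced earlier in this tutorial)
moreh-switch-model

# Confirm the currently selected configuration
moreh-smi

# Re-run the exact same training script -- no code changes required
# (script name and flags are placeholders for your own training command)
python train_baichuan2_13b.py --epochs 1 --batch-size 16
```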

If you still have any questions about this tutorial, feel free to contact Moreh (support@moreh.io).

# Learn More