Open Universal Machine Intelligence

Everything you need to build state-of-the-art foundation models, end-to-end.

Get Started →

What is Oumi?#

Oumi is an open-source platform designed for ML engineers and researchers who want to train, fine-tune, evaluate, and deploy foundation models. Whether you’re fine-tuning a small language model on a single GPU or training a 405B-parameter model across a cluster, Oumi provides a unified interface that scales with your needs.

Who is Oumi for?

  • ML Engineers building production AI systems who need reliable training pipelines and deployment options

  • Researchers experimenting with new training methods, architectures, or datasets

  • Teams who want a consistent workflow from local development to cloud-scale training

What problems does Oumi solve?

  • Fragmented tooling: Instead of stitching together different libraries for training, evaluation, and deployment, Oumi provides one cohesive platform

  • Scaling complexity: The same configuration works locally and on cloud infrastructure (AWS, GCP, Azure, Lambda Labs)

  • Reproducibility: YAML-based configs make experiments easy to track, share, and reproduce
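
As a rough illustration of the reproducibility point, here is a minimal sketch of what an Oumi training config can look like. The key names mirror the quickstart recipes, but treat the exact fields and values as assumptions and check the configuration reference for the authoritative schema:

# Illustrative SFT config sketch (verify exact keys against the config reference)
model:
  model_name: "HuggingFaceTB/SmolLM-135M"

data:
  train:
    datasets:
      - dataset_name: "yahma/alpaca-cleaned"

training:
  trainer_type: "TRL_SFT"   # assumed trainer identifier; see the training docs
  max_steps: 10
  output_dir: "output/smollm-135m-sft"

Because the entire run is captured in one file, re-running or sharing an experiment is just a matter of passing the same config to `oumi train -c`.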

Quick Start#

Prerequisites: Python 3.10+ and pip. A GPU is recommended for larger models; CPU works for small models like SmolLM-135M.

Install Oumi and start training in minutes:

# Install with GPU support (or use `pip install oumi` for CPU-only)
pip install "oumi[gpu]"

# Train a model
oumi train -c configs/recipes/smollm/sft/135m/quickstart_train.yaml

# Run inference
oumi infer -c configs/recipes/smollm/inference/135m_infer.yaml --interactive

For detailed setup instructions including virtual environments and cloud setup, see the installation guide.

What will you build?#

Oumi provides a unified interface across the entire model development lifecycle. The workflows below cover training, evaluation, inference, data synthesis, hyperparameter tuning, and cloud deployment—all driven by YAML configs that work identically on your laptop or a multi-node cluster.

Fine-tune a model on my data

Start with a pre-trained model and customize it for your task using SFT, LoRA, DPO, GRPO, and more.

Training
Evaluate my model’s performance

Run benchmarks and compare against baselines using standard evaluation suites and LLM judges.

Evaluation
Deploy a model for inference

Run inference anywhere—vLLM and llama.cpp locally, or OpenAI and Anthropic remotely—with a unified interface.

Inference
Generate synthetic training data

Create high-quality training data with LLM-powered synthesis pipelines.

Data Synthesis
Optimize my hyperparameters

Find the best learning rate, batch size, and other settings automatically using Bayesian optimization.

Hyperparameter Tuning
Run training on cloud GPUs

Launch jobs on AWS, GCP, Azure, or Lambda Labs with a single command.

Running Jobs on Clusters
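
Each of these workflows follows the same pattern: a YAML config plus a single CLI command, whether it runs on your laptop or on a cluster. As a rough sketch of the core loop (the config paths below are placeholders; data synthesis and hyperparameter tuning have their own guides linked above):

# Train, evaluate, and chat with a model locally (placeholder config paths)
oumi train -c my_train.yaml
oumi evaluate -c my_eval.yaml
oumi infer -c my_infer.yaml --interactive

# Launch the same training job on cloud GPUs (AWS, GCP, Azure, Lambda Labs)
oumi launch up -c my_cloud_job.yaml

Swapping local for cloud execution changes the job config, not the training setup.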

Hands-on Notebooks#

Explore the most common Oumi workflows hands-on. These notebooks run in Google Colab with pre-configured environments—just click and start experimenting. Try “A Tour” for a high-level overview, or dive straight into a specific topic.

Getting Started: A Tour

Quick tour of core features: training, evaluation, inference, and job management

https://colab.research.google.com/github/oumi-ai/oumi/blob/main/notebooks/Oumi%20-%20A%20Tour.ipynb
Model Finetuning Guide

End-to-end guide to LoRA tuning with data prep, training, and evaluation

https://colab.research.google.com/github/oumi-ai/oumi/blob/main/notebooks/Oumi%20-%20Finetuning%20Tutorial.ipynb
Model Distillation

Guide to distilling large models into smaller, efficient ones

https://colab.research.google.com/github/oumi-ai/oumi/blob/main/notebooks/Oumi%20-%20Distill%20a%20Large%20Model.ipynb
Model Evaluation

Comprehensive model evaluation using Oumi’s evaluation framework

https://colab.research.google.com/github/oumi-ai/oumi/blob/main/notebooks/Oumi%20-%20Evaluation%20with%20Oumi.ipynb
Remote Training

Launch and monitor training jobs on cloud platforms (AWS, Azure, GCP, Lambda)

https://colab.research.google.com/github/oumi-ai/oumi/blob/main/notebooks/Oumi%20-%20Running%20Jobs%20Remotely.ipynb

Community & Support#

Oumi is a community-first effort. Whether you are a developer, a researcher, or a non-technical user, your contributions are very welcome!

  • Join our Discord community to get help, share your experiences, and chat with the team

  • Check the FAQ for common questions and troubleshooting

  • Open an issue on GitHub for bug reports or feature requests

  • Read CONTRIBUTING.md to learn how to submit your first Pull Request

  • Explore our open collaboration page to join community research efforts