Overview of Kubeflow Fairing

Beta

This Kubeflow component has beta status. See the Kubeflow versioning policies. The Kubeflow team is interested in your feedback about the usability of the feature.

Kubeflow Fairing streamlines the process of building, training, and deploying machine learning (ML) training jobs in a hybrid cloud environment. By using Kubeflow Fairing and adding a few lines of code, you can run your ML training job locally or in the cloud, directly from Python code or a Jupyter notebook. After your training job is complete, you can use Kubeflow Fairing to deploy your trained model as a prediction endpoint.
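
For example, the following is a minimal sketch of what "a few lines of code" might look like with the Kubeflow Fairing API. The registry, base image, and training logic are placeholders, and exact call names can vary between Fairing versions:

```python
from kubeflow import fairing

# Hypothetical container registry that your Kubeflow cluster can pull from.
DOCKER_REGISTRY = 'gcr.io/my-project/fairing-jobs'

def train():
    # Your normal model-training code goes here.
    print('Training the model...')

if __name__ == '__main__':
    # Build an image by appending this code to a base image that already
    # contains the dependencies, then run the function as a job on Kubeflow.
    fairing.config.set_builder('append',
                               base_image='gcr.io/my-project/base:latest',
                               registry=DOCKER_REGISTRY)
    fairing.config.set_deployer('job')

    remote_train = fairing.config.fn(train)
    remote_train()  # Runs on the cluster instead of locally.
```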

  • To set up your development environment, follow the guide to installing Kubeflow Fairing.
  • To ensure that Kubeflow Fairing can access your Kubeflow cluster, follow the guide to configuring Kubeflow Fairing.

Kubeflow Fairing is a Python package that makes it easy to train and deploy ML models on Kubeflow. Kubeflow Fairing can also be extended to train or deploy on other platforms. Currently, Kubeflow Fairing has been extended to train on AI Platform.
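
As an illustration, switching platforms amounts to swapping the backend you hand to Fairing's high-level API. The backend class names, registry, and file path below are assumptions to check against your installed Fairing version:

```python
from kubeflow.fairing import TrainJob
from kubeflow.fairing.backends import KubeflowGKEBackend, GCPManagedBackend

# Hypothetical registry and training script.
DOCKER_REGISTRY = 'gcr.io/my-project/fairing-jobs'

# Run train.py as a job on the Kubeflow cluster your kubeconfig points at ...
TrainJob('train.py', docker_registry=DOCKER_REGISTRY,
         backend=KubeflowGKEBackend()).submit()

# ... or submit the same code to AI Platform Training instead.
TrainJob('train.py', docker_registry=DOCKER_REGISTRY,
         backend=GCPManagedBackend()).submit()
```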

Kubeflow Fairing packages your Jupyter notebook, Python function, or Python file as a Docker image, then deploys and runs the training job on Kubeflow or AI Platform. After your training job is complete, you can use Kubeflow Fairing to deploy your trained model as a prediction endpoint on Kubeflow.

The goals of the Kubeflow Fairing project are to:

  • Easily package ML training jobs: Enable ML practitioners to easily package their ML model training code, and their code’s dependencies, as a Docker image.
  • Easily train ML models in a hybrid cloud environment: Provide a high-level API for training ML models to make it easy to run training jobs in the cloud, without needing to understand the underlying infrastructure.
  • Streamline the process of deploying a trained model: Make it easy for ML practitioners to deploy trained ML models to a hybrid cloud environment, as sketched after this list.
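
The sketch below illustrates that last goal: serving a trained model from a Kubeflow cluster. The `HousingServe` class, file names, and registry are hypothetical, and the exact `PredictionEndpoint` signature may differ across Fairing versions:

```python
from kubeflow.fairing import PredictionEndpoint
from kubeflow.fairing.backends import KubeflowGKEBackend

# Hypothetical registry; replace with one your cluster can pull from.
DOCKER_REGISTRY = 'gcr.io/my-project/fairing-jobs'

class HousingServe:
    """Stand-in serving class: loads a trained model and answers predictions."""

    def __init__(self):
        self.model = None  # Load your trained model artifact here.

    def predict(self, X, feature_names=None):
        # Return predictions for a batch of inputs.
        return self.model.predict(X)

# Package the serving class and the model artifact as an image, then expose
# it as a prediction endpoint on the Kubeflow cluster.
endpoint = PredictionEndpoint(HousingServe,
                              input_files=['trained_model.dat', 'requirements.txt'],
                              docker_registry=DOCKER_REGISTRY,
                              backend=KubeflowGKEBackend())
endpoint.create()
```

Once the endpoint is up, clients can send prediction requests to it with feature data over HTTP.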
