Operationalizing Machine Learning
Operationalizing Machine Learning refers broadly to the process of deploying predictive models to a production environment, together with the ongoing measurement, monitoring, and improvement of those models.

Kubeflow
Kubeflow - Pipelines and Components
A pipeline is a description of a machine learning (ML) workflow: it specifies all of the components in the workflow and how they relate to each other in the form of a graph.
A pipeline component is a self-contained set of user code, packaged as a container, that performs one step in the pipeline. For example, a component can be responsible for data preprocessing, data transformation, model training, etc.
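The relationship between components and a pipeline can be sketched in plain Python. This is a conceptual illustration only, not the Kubeflow Pipelines SDK: in Kubeflow each step below would be packaged as its own container, and the pipeline definition would wire their inputs and outputs into a graph. All function names and data here are made up for illustration.

```python
# Conceptual sketch (plain Python, NOT the Kubeflow SDK): each function
# stands in for one self-contained pipeline component, and the pipeline
# is the graph that wires their outputs to the next component's inputs.

def preprocess(raw):
    """Component 1: data preprocessing — drop missing values."""
    return [x for x in raw if x is not None]

def transform(clean):
    """Component 2: data transformation — scale values into [0, 1]."""
    lo, hi = min(clean), max(clean)
    return [(x - lo) / (hi - lo) for x in clean]

def train(features):
    """Component 3: stand-in for model training — returns the mean."""
    return sum(features) / len(features)

def pipeline(raw):
    """The pipeline: components executed in graph (topological) order,
    preprocess -> transform -> train."""
    clean = preprocess(raw)
    features = transform(clean)
    return train(features)

result = pipeline([2.0, None, 4.0, 6.0])
```

Because each component only communicates through its declared inputs and outputs, an orchestrator can run the steps in separate containers and cache or retry them independently, which is what the container packaging in Kubeflow makes possible.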

Kubeflow with AI Platform