You can use SageMaker Model Registry to catalog models, manage model versions, associate metadata (such as training metrics) with a model, and manage the approval status of a model before deployment.
SageMaker Experiments
is a feature of SageMaker Studio that you can use to create and track ML experiments that use different combinations of data, algorithms, and parameters.
Amazon SageMaker Model Monitor
monitors the quality of Amazon SageMaker AI machine learning models in production. With Model Monitor, you can set up monitoring for:
- data quality drift
- model quality (performance) drift
- bias drift in a model's predictions
- feature attribution drift
Amazon SageMaker Model Monitor Data Capture
is a feature of SageMaker endpoints that you can use to:
- record inference request and response data, which you can then use for training, debugging, and monitoring.
- re-train the model on the new data that Data Capture records.
- run asynchronously without impacting production traffic.
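As an illustration of where Data Capture is configured, the following is a sketch of the `DataCaptureConfig` structure passed to the `CreateEndpointConfig` API (boto3 `sagemaker` client); the bucket name and sampling percentage are made-up assumptions:

```python
# Sketch of a DataCaptureConfig request fragment; values are illustrative.
data_capture_config = {
    "EnableCapture": True,
    # Capture this percentage of requests; 100 records everything.
    "InitialSamplingPercentage": 100,
    # S3 prefix where captured request/response payloads are written.
    "DestinationS3Uri": "s3://example-bucket/datacapture/",
    # Record both the inference input and the model's output.
    "CaptureOptions": [
        {"CaptureMode": "Input"},
        {"CaptureMode": "Output"},
    ],
}
```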
SageMaker Clarify
provides tools to help explain how machine learning (ML) models make predictions.
- you can use it to check datasets and models for bias and explainability.
- checks for bias both in training data (pre-training) and by analyzing model predictions after deployment (post-training).
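To make the idea of a post-training bias check concrete, here is a pure-Python sketch (not the Clarify API) of one metric Clarify reports: the difference in positive prediction proportions between two facets (DPPL). The prediction data is made up.

```python
def positive_proportion(predictions):
    """Fraction of predictions that are the positive class (1)."""
    return sum(predictions) / len(predictions)

def dppl(facet_a_preds, facet_b_preds):
    """Difference in positive proportions between two facets; 0 means parity."""
    return positive_proportion(facet_a_preds) - positive_proportion(facet_b_preds)

# Made-up post-deployment predictions for two demographic facets.
facet_a = [1, 1, 1, 0]   # 75% positive
facet_b = [1, 0, 0, 0]   # 25% positive
bias = dppl(facet_a, facet_b)  # 0.5 -> facet A is favored
```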
TensorBoard is a capability of SageMaker that you can use to visualize and analyze the training metrics (such as loss and accuracy) of SageMaker training jobs and debug model convergence issues.
SageMaker Pipelines
is a purpose-built CI/CD service for ML that you can use to create, automate, and manage end-to-end ML workflows as a series of interconnected steps.
SageMaker batch transform jobs are
the most cost-effective inference method for models that are called only on a periodic basis.
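A batch transform job can be requested with a fragment like the following (a sketch of the `CreateTransformJob` API shape in the boto3 `sagemaker` client; the job name, model name, and S3 paths are made-up assumptions):

```python
# Sketch of a CreateTransformJob request; names and paths are illustrative.
transform_request = {
    "TransformJobName": "example-periodic-inference",
    "ModelName": "example-model",
    "TransformInput": {
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",
                # Every object under this prefix is scored in one job.
                "S3Uri": "s3://example-bucket/batch-input/",
            }
        },
        "ContentType": "text/csv",
    },
    # Predictions are written to S3; no persistent endpoint is provisioned.
    "TransformOutput": {"S3OutputPath": "s3://example-bucket/batch-output/"},
    "TransformResources": {"InstanceType": "ml.m5.xlarge", "InstanceCount": 1},
}
```

Because the instances exist only for the duration of the job, you pay nothing between runs, which is what makes this cost-effective for periodic workloads.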
Choose a SageMaker asynchronous endpoint when you have large payload sizes (up to 1 GB) or long processing times (up to one hour); requests are queued and processed asynchronously, so no requests are lost when traffic spikes.
SageMaker real-time endpoint
is best for workloads that need persistent, low-latency, interactive inference with sustained traffic.
Use a SageMaker batch transform job to run inference when you need predictions for an entire dataset up front and do not need a persistent endpoint.
SageMaker serverless endpoint
is best for workloads with intermittent or unpredictable traffic: SageMaker provisions and scales compute automatically, you pay only while requests are processed, and cold starts can add latency.
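A serverless endpoint is selected through the endpoint config rather than a special API. The following is a sketch of a production-variant fragment with a `ServerlessConfig` (the `CreateEndpointConfig` API, boto3 `sagemaker` client); the variant and model names plus sizing values are made-up assumptions:

```python
# Sketch of one ProductionVariants entry for a serverless endpoint config.
production_variant = {
    "VariantName": "AllTraffic",
    "ModelName": "example-model",
    "ServerlessConfig": {
        # Memory allocated per invocation environment.
        "MemorySizeInMB": 2048,
        # Maximum concurrent invocations before requests are throttled.
        "MaxConcurrency": 5,
    },
}
```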
SageMaker Canvas
is a no-code, visual interface that you can use to build ML models and generate predictions without writing code.
SageMaker network isolation
blocks all outbound network access from training and inference containers, including access to the internet and to external/customer VPC networks.
SageMaker input modes
determine how training data reaches the training container: File mode downloads the full dataset before training starts, FastFile mode streams data from Amazon S3 on demand while presenting a file-system interface, and Pipe mode streams data directly into the container.
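The input mode is set per channel in a training job's input configuration. Here is a sketch of an `InputDataConfig` fragment for the `CreateTrainingJob` API (boto3 `sagemaker` client); the channel name and S3 path are made-up assumptions:

```python
# Sketch of an InputDataConfig channel; paths are illustrative.
input_data_config = [{
    "ChannelName": "training",
    # "File" downloads everything up front, "FastFile" streams on demand,
    # "Pipe" streams records directly to the container.
    "InputMode": "FastFile",
    "DataSource": {
        "S3DataSource": {
            "S3DataType": "S3Prefix",
            "S3Uri": "s3://example-bucket/train/",
        }
    },
}]
```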
SageMaker FrameworkProcessor provides premade containers for the following machine learning frameworks: Hugging Face, MXNet, PyTorch, TensorFlow, and XGBoost.
SageMaker automatic model tuning (AMT)
searches for the best version of a model by running many training jobs with the algorithm and hyperparameter ranges that you specify, then choosing the hyperparameter values that perform best against your objective metric.
You can use a SageMaker AMT warm start tuning job to
reuse the results of previous tuning jobs as a starting point, instead of searching from scratch.
SageMaker AMT can use early stopping to
compare the running average of the current training job's objective metric, epoch by epoch, against the median of the running averages of previous training jobs at the same epoch. If the current job's value is worse than that median, early stopping stops the current training job.
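The median rule behind early stopping can be sketched in pure Python (a simplification of what AMT does server-side; it assumes a higher-is-better metric, and the numbers are made up):

```python
import statistics

def should_stop(current_running_avg, prior_running_avgs):
    """Stop if the current job's running-average objective metric is worse
    than the median of prior jobs' running averages at the same epoch."""
    if not prior_running_avgs:
        return False
    return current_running_avg < statistics.median(prior_running_avgs)

# Running averages of accuracy for earlier training jobs at the same epoch.
prior = [0.81, 0.84, 0.86, 0.88]          # median = 0.85
assert should_stop(0.80, prior) is True   # below the median -> stop early
assert should_stop(0.87, prior) is False  # above the median -> keep training
```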
SageMaker AMT IDENTICAL_DATA_AND_ALGORITHM setting
assumes the same input data and training image as the previous tuning jobs
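A warm start tuning job with the IDENTICAL_DATA_AND_ALGORITHM setting could be requested with a fragment like the following (a sketch of the `WarmStartConfig` structure in the `CreateHyperParameterTuningJob` API, boto3 `sagemaker` client; the parent job name is a made-up assumption):

```python
# Sketch of a WarmStartConfig; the parent tuning job name is illustrative.
warm_start_config = {
    # Completed tuning jobs whose results seed the new search.
    "ParentHyperParameterTuningJobs": [
        {"HyperParameterTuningJobName": "example-parent-tuning-job"}
    ],
    # Requires the same input data and training image as the parents.
    "WarmStartType": "IdenticalDataAndAlgorithm",
}
```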
Hyperparameter tuning can
accelerate your productivity by trying many variations of a model.
AMT MaxNumberOfTrainingJobs
The maximum number of training jobs to be run before tuning is stopped.
AMT MaxNumberOfTrainingJobsNotImproving
The maximum number of training jobs that are allowed to complete without improving on the objective metric of the current best training job before tuning stops. For example, if the best training job returned an objective metric of 90% accuracy and MaxNumberOfTrainingJobsNotImproving is set to 10, tuning stops after 10 training jobs fail to return an accuracy higher than 90%.
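The stopping criterion above can be sketched in pure Python (a simplification of what AMT evaluates server-side; the accuracy sequence is made up):

```python
def jobs_completed_before_stop(accuracies, max_not_improving):
    """Count training jobs run until max_not_improving jobs in a row
    fail to beat the best objective metric seen so far."""
    best = float("-inf")
    not_improving = 0
    for i, acc in enumerate(accuracies, start=1):
        if acc > best:
            best = acc
            not_improving = 0
        else:
            not_improving += 1
            if not_improving >= max_not_improving:
                return i
    return len(accuracies)

# Best job hits 90%, then ten jobs in a row stay at 88% -> tuning stops.
accuracies = [0.85, 0.90] + [0.88] * 10
stopped_after = jobs_completed_before_stop(accuracies, 10)  # 12 jobs total
```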
SageMaker ModelBiasMonitor class
creates a bias baseline and deploys a monitoring mechanism that evaluates whether the model's bias deviates from that baseline.