Machine Learning Operations: 3 Challenges Evaluating Machine Learning Models

Machine learning operations present a unique set of challenges compared with conventional software development operations (DevOps). One of these challenges is that machine learning models in production may experience "model drift". As the world changes relative to the data used to train the model, prediction performance may deteriorate. 
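
To make "model drift" concrete, one common detection approach is to compare the distribution of a model input in recent production traffic against its distribution in the training data. The sketch below is illustrative only (the feature name, data windows, and threshold are assumptions, not details from any company mentioned here) and uses a two-sample Kolmogorov-Smirnov test from SciPy:

```python
# Minimal sketch: flag input drift when a feature's production distribution
# diverges from its training distribution (two-sample KS test).
# The feature name and p-value threshold are illustrative assumptions.
import pandas as pd
from scipy.stats import ks_2samp

def feature_drifted(train_values: pd.Series, production_values: pd.Series,
                    p_threshold: float = 0.01) -> bool:
    """Return True if recent production values look drawn from a different
    distribution than the training values."""
    _statistic, p_value = ks_2samp(train_values.dropna(), production_values.dropna())
    return p_value < p_threshold

# Example usage with a hypothetical feature column:
# if feature_drifted(train_df["days_since_last_order"], last_week_df["days_since_last_order"]):
#     print("Input drift detected: investigate and consider retraining.")
```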

A change in model performance can directly impact business KPIs and the bottom line. Instacart provides a recent operational example of the impact of model drift. The coronavirus pandemic changed consumer shopping habits, which degraded its machine learning model's ability to predict the availability of items in stores. Model accuracy dropped from 93% to 61%.

This decrease in model accuracy could result in frustrated customers who expected delivery of items that appeared to be available via the Instacart platform. Mismatched expectations could lead to reduced engagement and customer churn.

Fortunately, Instacart had a system in place to evaluate its machine learning models against performance benchmarks. When model accuracy dropped to 61%, the system prompted Instacart engineers to intervene and retrain the model to account for the sudden change in consumer shopping habits. Organizations that don't have a system to notify them of model drift often notice the problem only after it has already harmed the business.
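
The article doesn't describe Instacart's internal tooling, but the core idea of benchmark-based evaluation is simple: join recent predictions with their eventual ground-truth outcomes, compute the live metric, and alert when it falls below an agreed benchmark. A minimal sketch, with the benchmark value and the alert hook as assumptions:

```python
# Minimal sketch of benchmark-based monitoring, assuming predictions can be
# joined with ground-truth outcomes once they are known. The 90% benchmark
# and the alert hook are illustrative, not Instacart's actual setup.
from sklearn.metrics import accuracy_score

ACCURACY_BENCHMARK = 0.90

def send_alert(message: str) -> None:
    # Placeholder: in practice this might post to Slack, PagerDuty, email, etc.
    print(f"ALERT: {message}")

def check_model_health(y_true, y_pred) -> bool:
    """Return True if live accuracy meets the benchmark; alert otherwise so
    engineers can intervene and retrain."""
    live_accuracy = accuracy_score(y_true, y_pred)
    if live_accuracy < ACCURACY_BENCHMARK:
        send_alert(f"Live accuracy {live_accuracy:.0%} is below the "
                   f"{ACCURACY_BENCHMARK:.0%} benchmark.")
        return False
    return True
```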

This incident highlights the importance of ongoing monitoring and evaluation of machine learning models. It's not enough to deploy machine learning models to production; there must be a plan to manage the entire machine learning life cycle.

From an operational standpoint, enterprises must determine how data teams should evaluate machine learning models on an ongoing basis. It can be challenging to create and execute this plan efficiently.

I recently spoke with Ofer Razon, Co-Founder & CEO at Superwise.ai, who shared the challenges a Superwise.ai customer, Monday.com, faced with the operational plan it had implemented to properly manage and evaluate machine learning models. Superwise.ai provides a solution in the category of machine learning operations ("MLOps"), also commonly referred to as "AIOps" or, more specifically, model performance management ("MPM").

The Monday.com team relies on machine learning models to quickly convert free-trial customers into paying customers. We'll look at the challenges the Monday.com data team faced from three perspectives: data scientists, data engineers, and business analysts.

We'll take a look at:

  1. Machine Learning Operations Challenges

  2. Machine Learning Operations Solutions

  3. Machine Learning Operations Solution Outcomes

Machine Learning Operations Challenges

  1. Data Scientists: "It's hard for us to build machine learning models to grow the business if we also have to monitor and manage performance changes once ML models get to production."

  2. Data Engineers: "The retraining strategy is not data-driven and consumes a lot of our resources."

  3. Business Analysts: "If model performance drops below a certain threshold, it may very quickly begin to have a direct negative impact on our bottom line. We can't understand when or why this happens without help from the data science team, and this won't scale."

The Monday.com team had built an internal solution to evaluate machine learning models made with DataRobot, but they were still struggling. It would take at least 21 days to detect model drift and another 7 to 14 days to troubleshoot the model and fix the issue.

Machine Learning Operations Solutions

  1. Data Scientists: "I wish we had a way for the business analysts to easily monitor and evaluate machine learning models in production in a self-service way. That would free up our time to build more ML models to grow the business."

  2. Data Engineers: "We could be more efficient if we could use insights from production to retrain machine learning models automatically."

  3. Business Analysts: "If only there were a visual dashboard to evaluate ML models in production, we could track performance on our own and make business-critical decisions much faster."

Superwise.ai provided a solution to meet these needs. It was set up in a single day without any manual configuration, retrieved data from DataRobot via a REST API, and immediately began to deliver actionable insights to the data scientists, data engineers, and business analysts.

The Superwise.ai solution provides a visual interface for business analysts to effectively monitor and evaluate machine learning models on their own. This self-service capability freed up the data science team to create new models that deliver more value to the business. And the data engineering team was delighted to discover that when model drift caused prediction performance to fall below acceptable levels, the solution automatically triggered model retraining.
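
The article doesn't spell out how that retraining trigger works internally, but conceptually it wires monitoring output to an existing training pipeline. A rough sketch, where the evaluation function, the retraining hook, and the threshold all stand in for whatever your team already runs:

```python
# Rough sketch of performance-triggered retraining. The callables and the
# threshold are stand-ins for your own pipeline, not Superwise.ai's mechanism.
from typing import Callable

def retrain_if_degraded(evaluate_recent_window: Callable[[], float],
                        retrain_and_deploy: Callable[[], None],
                        acceptable_accuracy: float = 0.85) -> bool:
    """Trigger retraining when live accuracy on the most recent labeled
    window falls below the acceptable level. Returns True if triggered."""
    if evaluate_recent_window() < acceptable_accuracy:
        retrain_and_deploy()
        return True
    return False

# Example: run this check on a schedule (cron, Airflow, etc.):
# retrain_if_degraded(my_live_accuracy_fn, my_training_pipeline_trigger)
```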

Machine Learning Operations Solution Outcomes

Once this customer implemented the Superwise.ai solution, the business impact was clear. Time to detect model drift was reduced by 96%, down from 21 days to only one day. Time to troubleshoot and fix the model was reduced by 93%, down from weeks to minutes. 

The benefits of a solution like Superwise.ai might seem obvious. Razon explained that "The biggest challenge we face when talking to customers is showing them how monitoring can help them be more data-driven to create better processes across the whole ML flow."

As we've seen with both Instacart and this Superwise.ai customer, it's clear that you must implement a solution to track and evaluate machine learning models on an ongoing basis. With this solution, you will be able to scale your machine learning operations and take on more machine learning projects to deliver more value to your customers.

