GCP – Retailers find flexible demand forecasting models in BigQuery ML
Retail businesses understand the value of demand forecasting: using their intuition, product and market experience, and seasonal patterns and cycles to plan for future demand. Beyond the need for forecasts that are as accurate as possible, modern retailers also face the challenge of performing demand planning at scale. Product assortments that span tens of thousands of items across hundreds of individual selling locations or designated marketing areas produce a number of time series that cannot be managed without big data platforms and time series modeling solutions that scale accordingly.
So far, there have been two ways to address this challenge:
- Purchase a full end-to-end demand forecasting solution, which takes significant time and resources to implement and maintain.
- Leverage an all-purpose machine learning platform to run your own time series models, which requires deep experience in both modeling and data engineering.
To help retailers with an easier, more flexible solution for demand planning, we’ve published a Smart Analytics reference pattern for performing time series forecasting with BigQuery ML using autoregressive integrated moving average (ARIMA) as a basis. This ARIMA model follows the BigQuery ML low-code design principle, allowing for accurate forecasts without advanced knowledge of time series models. Moreover, the BigQuery ML ARIMA model provides several innovations over the original ARIMA models that many are familiar with, including the ability to capture multiple seasonal patterns, automated model selection, a no-hassle preprocessing pipeline, and most of all, the ability to effortlessly generate thousands of forecasts at scale with nothing but a few lines of SQL.
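To make this concrete, here is a minimal sketch of what fitting such a model can look like. The dataset, table, and column names (`mydataset.daily_item_sales`, `sale_date`, `item_id`, `units_sold`) are hypothetical placeholders, and the options shown use the ARIMA_PLUS model type, the current name for BigQuery ML's ARIMA-based forecasting model; adapt everything to your own data:

```sql
-- Fit one ARIMA model per item_id in a single statement.
-- BigQuery ML handles frequency detection, seasonality, holidays, and
-- (p, d, q) order selection automatically when auto_arima is enabled.
CREATE OR REPLACE MODEL `mydataset.item_demand_arima`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'sale_date',
  time_series_data_col = 'units_sold',
  time_series_id_col = 'item_id',       -- one series (and one model) per item
  auto_arima = TRUE,                    -- automated model selection
  data_frequency = 'AUTO_FREQUENCY',
  holiday_region = 'US'
) AS
SELECT
  sale_date,
  item_id,
  units_sold
FROM `mydataset.daily_item_sales`;
```

A single statement like this trains a separate model for every distinct `item_id`, which is how the "thousands of forecasts at scale" claim plays out in practice.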
In this blog, we’ll take a look at the two most common ways demand forecasting teams have been organized and how BigQuery ML fills the gap between them, and we’ll discuss how BigQuery ML can help your demand planning recover from unforeseen events like COVID-19.
To see the end-to-end process to implement the demand forecasting design pattern, check out this video:
Two types of demand forecasting teams
Historically, large organizations have had two types of demand forecasting teams. We’ll call them the Business Forecasting team and the Science Forecasting team.
The Business Forecasting team typically uses a full enterprise resource planning (ERP) or software as a service (SaaS) forecasting solution (or occasionally a homegrown one) that doesn’t require advanced data science skills to use. These systems produce entirely automated forecasts. Team members often come from the business side of the organization and, instead of deep technical skills, bring extensive domain and business knowledge to their role. Many large brick-and-mortar organizations use this approach. These solutions may scale well, but they require significant time and resources to implement and support, typically including large implementation and DevOps teams, multiple dedicated compute and data storage instances, and fixed-schedule, hours-long batch cycles to refresh the forecasts.
The Science Forecasting team typically features PhD or MSc-level practitioners working within a data science or a tech organization, who are fluent in Python or R. They work with a Cloud AI platform and perform all of the end-to-end forecasting themselves: choosing, building, training, and evaluating a model. Then they deploy the model to production and communicate results to business stakeholders and leadership. This type of team is often found in digital-native organizations.
A new type of forecasting team
Recently, a new hybrid type of forecasting team has emerged. These teams are often in businesses looking to become more data- and model-driven, but that don’t have the resources to invest in an expensive ERP or hire PhD-level data scientists. They may have decent knowledge of forecasting and demand planning, but not enough experience or organizational resources to deploy custom models at scale. Still, this type of team, given the right tools, has the potential to merge the best of both worlds: the advanced modeling of the Science Forecaster and the deep domain knowledge of the Business Forecaster.
Responding to the unforeseen
As nearly every business experienced firsthand in 2020, certain events like the COVID-19 pandemic throw a wrench into demand forecasting signals, making existing models questionable.
With an ERP forecasting solution, even a small change to the supply chain and store network configuration will result in a change in demand patterns that requires extensive reconfiguration of the demand planning solution, along with the help of a large support team. BigQuery ML reduces the complexity of making such adjustments for both expected and unexpected events, and because it’s serverless, it autoscales and saves on DevOps time and effort. Regenerating forecasts to adapt to a change in the supply chain network configuration is now a matter of hours, not weeks.
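As an illustrative sketch, reusing the hypothetical table from the earlier example, adapting to a network change can be as simple as re-running the training statement over a window of history that reflects the new configuration:

```sql
-- Re-fit the same model, keeping only history from after the network change.
-- The cutoff date is illustrative; choose it to match your own disruption.
CREATE OR REPLACE MODEL `mydataset.item_demand_arima`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'sale_date',
  time_series_data_col = 'units_sold',
  time_series_id_col = 'item_id'
) AS
SELECT sale_date, item_id, units_sold
FROM `mydataset.daily_item_sales`
WHERE sale_date >= DATE '2020-06-01';
```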
Getting started with a BigQuery ML reference pattern
To make it easier to get up and running with Google Cloud tools like BigQuery ML, we recently introduced Smart Analytics reference patterns—technical reference guides with sample code for common analytics use cases. We’ve heard that you want easy ways to put analytics tools into practice, and previous reference patterns cover use cases like predicting customer lifetime value, propensity to purchase, product recommendation systems, and more.
Our newest reference pattern on GitHub will help you get a head start on generating time series forecasts at scale. The pattern shows you how to use historical sales data to train a demand forecasting model using BigQuery ML, and then visualize the forecasts in a dashboard.
For more details and a walkthrough of this process, which uses historical Iowa liquor sales transaction data to forecast the next 30 days, check out our technical explainer. In that blog, you’ll learn how to:
- Pre-process data into the correct format needed to create a demand forecasting model using BigQuery ML
- Fit multiple ARIMA time series models in BigQuery ML
- Evaluate the models and generate forward-looking forecasts for the desired forecast horizon (see the sketch after this list)
- Create a dashboard in Data Studio to visualize the projected demand
- Set up scheduled queries to automatically re-fit the models on a regular basis
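The sketch below shows what the evaluate-and-forecast steps can look like against the hypothetical model trained earlier; ML.ARIMA_EVALUATE and ML.FORECAST are the relevant BigQuery ML functions, and the 30-day horizon matches the explainer’s example:

```sql
-- Inspect the automatically selected orders and fit statistics for each series
SELECT *
FROM ML.ARIMA_EVALUATE(MODEL `mydataset.item_demand_arima`);

-- Generate a 30-day forecast, with prediction intervals, for every item at once
SELECT
  item_id,
  forecast_timestamp,
  forecast_value,
  prediction_interval_lower_bound,
  prediction_interval_upper_bound
FROM ML.FORECAST(
  MODEL `mydataset.item_demand_arima`,
  STRUCT(30 AS horizon, 0.9 AS confidence_level));
```

The forecast output can be written to a table that a Data Studio dashboard reads from, and the whole script can be registered as a scheduled query so the models re-fit automatically.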
Let’s take a deeper dive into the concepts we just introduced.
BigQuery ML bridges the gap between the Business Forecaster and the Science Forecaster
Given the features we just described, we can see how BigQuery ML helps fill the gap between the two current approaches to forecasting at scale, allowing you to build your own demand forecasting platform without the need for highly specialized time series data scientists. It’s an ideal solution for hybrid forecasters, featuring tools you can use to generate forecasts at scale on the fly.
Since BigQuery ML lets you train and deploy ML models using SQL, it democratizes your data modeling challenges, opening up your demand forecasting tools and business insights to a larger pool of your organizational talent.
For example, the BigQuery ML ARIMA model helps retailers recover from unexpected events by generating thousands of forecasts from fresh data in a shorter amount of time. You can recalibrate demand forecasts more cost effectively, detect changes in trends, and perform multiple iterations that capture new patterns as they emerge, without mobilizing an entire DevOps team to do so.
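One way to see those shifts, assuming an ARIMA_PLUS model like the hypothetical one above, is ML.EXPLAIN_FORECAST, which decomposes each series into trend, seasonal, holiday, and step-change components:

```sql
-- Decompose history and forecast into interpretable components,
-- which makes trend breaks and level shifts easier to spot per item.
SELECT
  item_id,
  time_series_timestamp,
  time_series_type,     -- 'history' or 'forecast'
  trend,
  step_changes,
  holiday_effect
FROM ML.EXPLAIN_FORECAST(
  MODEL `mydataset.item_demand_arima`,
  STRUCT(30 AS horizon));
```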
Using BigQuery ML as your forecast engine allows you to bridge the gap between your business or hybrid forecasting teams and advanced data science teams. For example, your forecast analysts will own the task of generating baseline statistical forecasts with BigQuery and reviewing them, but they will loop in a senior data scientist to perform a more advanced causal impact analysis on some of their demand data as needed, or to measure the effect of COVID-19 on shifting demand patterns. Think of it as “DemandOps” instead of “DevOps.”
This is possible even if you already have ERP demand planning tools: simply export your forecasts and sales actuals into BigQuery whenever they are refreshed, or as needed. Chances are, a retail organization actually has multiple time series forecasts being run by separate business functions. Your merchandising team will be running tactical and operational demand forecasts, finance is performing top-line revenue forecasts, and supply chain is running its own forecasts for capacity planning at the distribution center level, each using their own specific tool set. These forecasts are generated in isolation, but reconciling them would improve accuracy and provide the organization with valuable holistic insights into the business that siloed forecasts and analysis can’t provide.
For example, based on market and product signals, merchandising may forecast an increase in demand for a certain product. Separately, supply chain will be aware of various manufacturing and logistics stressors that project a decrease in product shipments. Typically this discrepancy won’t be caught for several weeks, and will then be resolved via emails and meetings. By then it’s too late, since conflicting planning decisions have already been made by the separate teams and the proverbial damage is done. Using BigQuery as a centralized forecast analysis platform allows a retailer to detect such a discrepancy in a matter of hours or days and react accordingly, instead of having to roll back planning decisions several weeks after the fact.
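Here is a sketch of what that detection could look like, using two hypothetical forecast tables exported into BigQuery (`merch_demand_forecast` from merchandising and `supply_shipment_plan` from supply chain) and an arbitrary 20% divergence threshold:

```sql
-- Flag item-weeks where planned shipments diverge from forecast demand by >20%
SELECT
  m.item_id,
  m.week_start,
  m.forecast_units AS merch_demand,
  s.planned_units  AS planned_shipments,
  SAFE_DIVIDE(s.planned_units - m.forecast_units, m.forecast_units) AS gap_pct
FROM `mydataset.merch_demand_forecast` AS m
JOIN `mydataset.supply_shipment_plan`  AS s
  USING (item_id, week_start)
WHERE ABS(SAFE_DIVIDE(s.planned_units - m.forecast_units, m.forecast_units)) > 0.20
ORDER BY gap_pct;
```

Run as a scheduled query, a check like this can surface conflicts within hours of either plan being refreshed.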
Beyond the powerful modeling capabilities of the BigQuery ML ARIMA model, BigQuery and BigQuery ML provide the perfect platform for collaboration between disparate and diverse forecasting teams.
Google Cloud offers several solutions to help you enhance your demand forecasting capabilities and optimize inventory levels amidst changing times. Besides the BigQuery ML tools described in this blog, you can also:
- Build your own time series models, either statistical or ML-based, using your preferred open source frameworks on Cloud AI Platform JupyterLab instances
- Use AutoML Forecast to automatically select and train cutting-edge deep learning time series models
- Use our upcoming fully managed forecasting solution, Demand AI (currently in experimental status)
- Work with a partner like o9 to implement their retail planning platform with forecasting capabilities on Google Cloud
For more examples of data analytics reference patterns, check out the predictive forecasting section in our catalog. Ready to get started with BigQuery ML? Read more in our product introduction.
Want to dig deeper into BigQuery ML capabilities? Sign up here for free training on how to train, evaluate and forecast inventory demand on retail sales data with BigQuery ML.