The Snowy Big Data Supply Chain Event of January 2016

Prologue

Did you see it coming? Somewhere amid the droning of The Weather Channel announcers describing Winter Storm Jonas, I realized, “Holy cow! There is a gigantic big data event going on.” Not just gigantically big, but galaxy-like big.

How much snow will fall? What will the distribution of snow depth be? What is our projected absolute error range? How many plows do we need? Where? What is their capacity? Do we have enough fuel in all of the stations in the target range? Where will all the snow go? Can we calculate which roadways should be cleared first in the city so that assets can be deployed early and be ready (the grid of Manhattan has to be very easy to optimize)? And so on. We just need all of the data.

While the people in the target storm zone were fretting over bread and milk, the folks in Buffalo, Syracuse, and Rochester were saying, “Let’s get some brie, baguettes, smoked meat, and winter lager and watch those folks out East deal with this event!” While many of us remain skeptical about weather forecasts, communities, cities, and governments cannot afford to underestimate the potential impact. Readiness is paramount.

The amount of “things” required to prepare for and respond to storms or major events is enormous. Some of these things are already part of the data network needed to manage the response. Many elements are not in place yet (for example, distributed sensors informing the agencies about the current state) and should be part of future planning. The more “things” that can provide data, the more agile we can be. We all know the Internet of Things (IoT) is the next Big Thing! Some say it’s bigger than any other next biggest thing. Ever.

Well, we have our weather history. We have great models that modern-day media love to show off – despite historical accuracy issues (I wish I got paid to be wrong!). We have supply and capacity in place – but how much can we handle?

 

Winter Storm Jonas

The storm that is now in the record books provided a tremendous vehicle for assessing the full breadth and depth of what IoT could make possible: unprecedented volumes of real-time data, rich with the informational nutrients of how to prepare and respond, that could be truly transformational to society.

This information is essential to the supply chain part of the discussion. Supply chains thrive on timely, accurate data and information to remain both agile and focused. The supply chain elements are the “things” in the Internet of Things aspect of this big data opportunity.

There are big data lakes everywhere along the infrastructure waiting for deeper analysis: transportation, food & beverage, energy distribution, emergency services, consumption trends, and more. These all need to be mined for the requisite information to determine readiness (historical analysis vs. current capacity) and deploy the assets (snow trucks, salt, sand, water, emergency services) when needed – even days ahead! Supply planning starts with historical data about consumption rates and performance and the impending storm event.

“Data is the new science. Big Data holds the answers. Are you asking the right questions?” Patrick P. Gelsinger, Senior High Technology Executive

At one point on TV, one official said, “We cannot handle snow above 3 inches per hour!” That’s the capacity of the current system. Is there additional flexible capacity nearby that could have been deployed much earlier? How much more is actually needed? If we had moved all of the cars off the roadways earlier, would our plowing efficiency have gone up and increased effective capacity? If public officials had the tools to ensure, well ahead of time, that sufficient supply and capacity were available, the impact of these storms could be measurably less.
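To make that capacity question concrete, here is a minimal back-of-the-envelope sketch in Python. The fleet size and the linear-scaling assumption are hypothetical; only the 3-inches-per-hour limit comes from the quote above.

```python
# Hypothetical back-of-the-envelope: if clearance capacity scales roughly linearly
# with the plow fleet, how many extra trucks does a faster snowfall rate imply?

CURRENT_PLOWS = 1_600        # assumed fleet size
CURRENT_MAX_RATE = 3.0       # inches/hour the current system can keep up with (quoted above)

def extra_plows_needed(forecast_rate_in_per_hr: float) -> int:
    """Plows to borrow from neighboring regions to keep pace with the forecast rate."""
    if forecast_rate_in_per_hr <= CURRENT_MAX_RATE:
        return 0
    required_fleet = CURRENT_PLOWS * forecast_rate_in_per_hr / CURRENT_MAX_RATE
    return round(required_fleet - CURRENT_PLOWS)

print(extra_plows_needed(4.0))   # a 4 in/hr forecast -> roughly 533 additional plows
```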

If you look at the map above, you can see that areas of the region usually well prepared and equipped for large snowfalls went untouched (the Great Lakes snow belt, northern New England). Was a formal assessment done to determine whether those assets (plows, salt, and emergency crews) were available and mobile enough to help out in the storm area? If not, this is an opportunity that the right analytics tools could identify.

[Image: Snow in Minnesota]

With the right system in place to model and determine what is required, the new demand (projected storm size) is loaded into the supply network, and artificial intelligence (AI) tools manipulate the data to suggest likely outcomes. The AI engine recommends a full deployment map of all essential infrastructure (capacity) and consumable assets (supply) based on confidence models from the weather predictions. If there are gaps, the model can reach out to other assets close to the region at risk. Ready. Set. Go!
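As a rough illustration of that gap-filling step, here is a minimal Python sketch. The `Region` type, the `deployment_plan` helper, and every region name and number are hypothetical; a real engine would also weigh distance, cycle time, and forecast confidence rather than using this simple greedy match.

```python
# A minimal sketch: compare projected demand with local capacity per region,
# then cover shortfalls with slack capacity from neighboring regions.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    plow_capacity: int        # lane-miles/hour the local fleet can clear (assumed)
    projected_demand: int     # lane-miles/hour the forecast implies (assumed)

def deployment_plan(regions: list[Region]) -> list[tuple[str, str, int]]:
    """Return (from_region, to_region, lane_miles_per_hour) transfers covering shortfalls."""
    gaps = [(r, r.projected_demand - r.plow_capacity) for r in regions]
    donors = [[r, -gap] for r, gap in gaps if gap < 0]    # regions with slack capacity
    needers = [[r, gap] for r, gap in gaps if gap > 0]    # regions with unmet demand
    transfers = []
    for needer in needers:
        for donor in donors:
            if needer[1] == 0:
                break
            moved = min(needer[1], donor[1])
            if moved > 0:
                transfers.append((donor[0].name, needer[0].name, moved))
                needer[1] -= moved
                donor[1] -= moved
    return transfers

regions = [
    Region("NYC metro", plow_capacity=900, projected_demand=1400),
    Region("Great Lakes snow belt", plow_capacity=1200, projected_demand=400),
    Region("Northern New England", plow_capacity=700, projected_demand=300),
]
print(deployment_plan(regions))
# [('Great Lakes snow belt', 'NYC metro', 500)]
```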

The models that do the weather prediction (demand generation) will eventually feed the emergency system directly, and the scenarios played out will look at redistribution of supply (assets) within range (cycle time) in advance of the storm. Back to the “things” part of this: adaptive sensors throughout the affected area will keep pace with storm accumulation and clearance rates to adjust both capacity and supply – an agile supply chain with real-time data. True, there might be occasional errors of over-commission that leave assets underutilized, but we know this is a less risky and costly situation than not having assets in place because of under-forecasting.
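That asymmetry between over-committing and under-forecasting can be shown with a minimal sketch; the `deployment_cost` helper and both cost figures below are hypothetical placeholders, not real figures.

```python
# Hypothetical asymmetric penalties: idle pre-deployed assets are cheap
# compared with being caught short when the storm hits.

COST_IDLE_PER_PLOW = 2_000      # assumed cost of staging a plow that isn't needed
COST_SHORT_PER_PLOW = 25_000    # assumed cost per missing plow (stranded roads, rescues)

def deployment_cost(deployed: int, actually_needed: int) -> int:
    """Total penalty for a given pre-deployment decision."""
    if deployed >= actually_needed:
        return (deployed - actually_needed) * COST_IDLE_PER_PLOW
    return (actually_needed - deployed) * COST_SHORT_PER_PLOW

# Over-committing by 100 plows vs. under-committing by 100 plows:
print(deployment_cost(1_700, 1_600))   # 200,000   (idle assets)
print(deployment_cost(1_500, 1_600))   # 2,500,000 (caught short)
```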

With the correct supply, projected demand, and historical consumption data available from the aforementioned things, cities will also be able to know in advance where all transportation assets (planes, trains, and automobiles) are sitting and which ones need to get out of Dodge; which ones to stop from coming in; which ones to store; and which ones are needed. Traffic flows can be adjusted to streamline exit paths, and security assets (police and/or National Guard) can be sent out to monitor and guide. They can be redeployed later in the day as the snow begins to accumulate and, eventually, as the storm fades into the sunset. Once it is all said and done, everybody goes home, safe and sound.

Why should anyone ever be stranded on a highway again? The entire system in the affected area can be prepared for the event and take preventive action much earlier. While the chances exist that we over-forecast, the alternative is much worse – which we’ve seen many times in the past.

“Get the milk! Get the bread! Get the toilet paper! It’s gonna snow a ton!”

Once the storm and clean-up are over, there will be a brand-new set of data available about the entire event, from the original warning to the last bit of snow moved. These data go into the existing pools of data and help set up the supply/capacity response for the next storm. Of course, record storms reset all of the averages and distributions of potential impacts – future events will benefit from the learning achieved. Simulation models will have new parameters to use during the next event’s planning.
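In code, that feedback loop can be as simple as appending the new event to the historical record and recomputing the planning percentiles. A minimal sketch, with made-up snowfall totals:

```python
import statistics

# Past event totals plus the new record storm (all figures hypothetical).
event_totals_in = [8.5, 12.0, 6.3, 19.8, 11.2, 14.7]
event_totals_in.append(27.5)   # a Jonas-scale record event resets the distribution

# Recompute the planning percentile that sizes the next response.
p90 = statistics.quantiles(event_totals_in, n=10, method="inclusive")[-1]
print(f"Plan supply and capacity for a {p90:.1f}-inch event (90th percentile)")
```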

Our ability to respond to widespread severe weather events must be proportional to the degree of recovery needed. This may mean that readiness spreads out geographically. With some capabilities (electricity, for example) the risk is higher that a storm can affect millions of people. Do we have the data and information necessary to help assess and plan for the impact of outages? With information such as this, government and private services can be deployed at the right levels and at the right cost.

Until the next biggest weather-driven supply chain event!

The discussion can really go on longer.

What do you think about this?

Well, time to check out The Weather Channel once more before going down for a long winter’s nap. I’m rooting for NYC to beat its all-time record and watch people stranded helplessly on TV. That’s winning, right?

“Let it snow, let it snow, let it snow!” Sammy Cahn (writer)

 

Epilogue

After a successful flight from Seattle to New York City’s LaGuardia Airport, the reality of the problem became very evident. Hundreds of people were landing at the NYC airports once the flights were released. Unfortunately, the roads were not quite as ready. Accidents, slow-moving traffic, and atypical congestion clogged up LaGuardia from late Monday afternoon through the night. Clearly, the incoming traffic exceeded a reasonable level given the overall situation. Buses, shuttles, limos, and taxis could not get through. Rental cars were quite depleted, and emergency crews were dealing with the mess. One hour and a few dollars later, the driver from Avis accepted my “offer he couldn’t refuse” and took us to the Hertz terminal. This is yet another opportunity to minimize the disruption in this type of event with better data, information, analysis, and integration with the services in the region.

 

Michael Massetti is an Executive Partner with Gartner who really does enjoy being a supply chain professional! Seriously. All opinions are my own.

Dealing with Big Data: From Quants to Smarts

For those tracking the world of Big Data and Analytics, you’ve probably heard of “Quants.” Just to be sure, “quant” is the moniker for a person who is an expert in analytics and “quant”-itative analysis – thus the name. Quants are the people whom businesses rely upon to illuminate the gnarliest analytical, mathematical, and numerical problems.

Quants play a crucial role in many industries and functional areas, from health care and manufacturing to banking and retail, supporting supply chain, finance, marketing, and more. As long as there are streams of data to understand, quants are here to stay. As those streams flow into larger data lakes, especially with the Internet of Things (IoT) promising to generate gazillions more bytes of data, quants have unparalleled job security. My former CEO at AMD, Rory Read, used to say that management’s role is to “torture data until the truth surfaces.” That role falls to the data savants, a.k.a. The Quants!

“Torture data until the truth surfaces!”  Rory Read

One important area of focus for a company’s Business Intelligence (BI) activity is to identify the areas where more quantitative data analysis is needed. Many companies find their high-performing employees mired in the laborious and unproductive practice of aggregating, rationalizing, cleaning, and reporting on data. BI and analytics programs aim to reduce the time needed to get information while giving employees additional time to dig deeper into fundamental, root-cause analysis, develop data models, simulate scenarios, and provide decision support.

Once the business achieves data cleanliness and stability, the focus changes from reporting and informing about past performance to anticipating and predicting future potential outcomes. Driving the business by looking exclusively through the rear-view mirror does not allow companies to avoid dangers lurking ahead. Once the ability to anticipate is in place, business discussions move from “what happened” to “what may happen” – scenario planning and modeling.
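A minimal sketch of that shift, using entirely hypothetical weekly demand figures: instead of only reporting last quarter, resample history to see the range of quarters that may happen next.

```python
import random

random.seed(42)
weekly_demand_history = [120, 135, 110, 160, 142, 128, 151, 138]   # assumed units shipped

def simulate_quarter(n_scenarios: int = 10_000, weeks: int = 13) -> list[int]:
    """Bootstrap-style simulation: each scenario is a quarter built from resampled past weeks."""
    return [sum(random.choices(weekly_demand_history, k=weeks)) for _ in range(n_scenarios)]

scenarios = sorted(simulate_quarter())
print("Median quarter:", scenarios[len(scenarios) // 2])
print("95th-percentile quarter:", scenarios[int(0.95 * len(scenarios))])
```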

The migration to a forward-looking analytics model compels organizations to ask, “Are we ready for the ramifications that these BI development programs have on individual skills, management expectations, organization structure, and communications across the enterprise?” As the company evolves toward more anticipation-oriented and proactive decision-making based on data, the skills and disciplines required on the teams change. The questions management should be asking change, too. How will this actually work?

Teams must frame the decision scope and ensure clarity about the business problem(s) or opportunity(ies) being addressed. The quantitative analysis that follows will include reviewing historical data, establishing a hypothesis, gathering data, and performing the analysis. At the back end of the process, decision makers must evaluate the story that the data analysis conveys.
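A minimal sketch of that loop, with made-up lead-time data and a standard two-sample t-test standing in for whatever analysis the actual business question calls for:

```python
from scipy import stats

# Hypothesis: the new carrier delivers faster than the incumbent (hypothetical data, days).
incumbent_lead_times = [5.1, 4.8, 6.0, 5.5, 5.9, 5.2, 6.3, 5.7]
new_carrier_lead_times = [4.6, 4.9, 4.2, 5.0, 4.4, 4.7, 4.3, 4.8]

t_stat, p_value = stats.ttest_ind(incumbent_lead_times, new_carrier_lead_times)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference is unlikely to be coincidence; dig into why before acting.")
else:
    print("No convincing difference yet; gather more data.")
```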

Trust is vitally important in both the data and the individuals and teams doing the quantitative work. Verification never ceases. Critical statistical validation and thorough inquiry should always be the norm. Transparency of the analytical approach and assumptions is central to successful decision-making. If the models are used by upper management to guide and inform decisions, the underlying assumptions and algorithms that provide the core information must be visible to and understood by all involved.

“In the end, the science of analysis must be married with the art of intuition and experience to make BI programs bring the anticipated results.”

In “Keep Up with Your Quants,” a Harvard Business Review article from the July–August 2013 issue, Thomas H. Davenport highlighted a number of key questions that everyone involved in the process should be asking. These questions included:

  • What was/were the source(s) of the data?
  • How well does the sample represent the population?
  • Were there outliers and did they affect the results?
  • What assumptions are in your analysis and models? Do any conditions render them invalid?
  • Why did you choose the specific analytical approach? What options did you consider?
  • How are you certain of causality vs. coincidental correlation of the outcomes with the variables used?

This type of inquiry has not been common or pervasive until recently. Management must understand the fundamentals and assumptions used to establish institutional confidence in the information and guidance that comes out of the analytics and decision support models.

In the end, the science of analysis must be married with the art of intuition and experience to make BI programs bring the anticipated results. Artificial Intelligence (AI) will play a more significant role in the future. In the meantime, the behaviors identified above are fundamental to the success of any analytics and BI program. It is necessary to keep strong and open relationships between the quants and the decision makers.

Do you want to be a quant? There are myriad opportunities across all industries and functional areas for individuals to delve into the world of quantitative analysis. As the data streams continue to grow exponentially with IoT and become tsunamis, the need will only increase. Quants will no longer be a luxury for business success; they will be a necessity. The future will bring more AI into play – the people working on data and models now are likely to grow and evolve to develop new AI algorithms. Their role will expand from helping evaluate data with known models to creating and maintaining models that are intelligent and adaptive enough to update themselves. It’s an exciting future in Big Data, BI, Analytics, and AI!

Who’s your quant?

 

“I never guess. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.” Sir Arthur Conan Doyle (Sherlock Holmes author)

 

Michael Massetti is an Executive Partner with Gartner who really does enjoy being a supply chain professional! Seriously.

All opinions are my own.