The Snowy Big Data Supply Chain Event of January 2016


Did you see it coming? Sometime while The Weather Channel announcers droned on about Winter Storm Jonas, I realized, “Holy cow! There is a gigantic big data event going on.” Not just gigantically big, but galaxy-like big.

How much snow was going to fall? What will the distribution of snow depth be? What is our projected absolute error range? How many plows do we need? Where? What is their capacity? Do we have enough fuel in all of the stations in the target range? Where will the snow all go? Can we calculate which roadways should be cleared first in the city so that assets can be deployed early and be ready (the grid of Manhattan has to be very easy to optimize)? And so on. We just need all of the data.

While the people in the target storm zone were fretting over bread and milk, the folks in Buffalo, Syracuse, and Rochester were saying “Let’s get some brie, baguettes, smoked meat, and winter lager and watch those folks out East deal with this event!” While many of us remain skeptical about weather forecasts, communities, cities, and governments cannot afford to underestimate the potential impact. Readiness is paramount.

The amount of “things” required to prepare for and respond to storms or major events is enormous. Some of these things are already part of the data network needed to manage the response. Many elements are not in place yet (for example, distributed sensors informing the agencies about current conditions) and should be part of future planning. The more “things” that can provide data, the more agile we can be. We all know the Internet of Things (IoT) is the next Big Thing! Some say it’s bigger than any other next biggest thing. Ever.

Well, we have our history about weather. We have great models that modern-day media love to show off – despite historical accuracy issues (I wish I got paid to be wrong!). We have supply and capacity in place – but how much can we handle?


Winter Storm Jonas

The storm that is now in the record books provided a tremendous vehicle to assess the full breadth and depth of what IoT can deliver: unprecedented volumes of real-time data, rich with the informational nutrients of how to prepare and respond, that are truly transformational to society.

This information is essential to the supply chain part of the discussion. Supply chains thrive on timely, accurate data and information to remain both agile and focused. The supply chain elements are the “things” in the Internet of Things aspect of this big data opportunity.

There are big data lakes everywhere along the infrastructure waiting for deeper analysis: transportation, food & beverage, energy distribution, emergency services, consumption trends, and more. These all need to be mined for the requisite information to determine readiness (historical analysis vs. current capacity) and deploy the assets (snow trucks, salt, sand, water, emergency services) when needed – even days ahead! Supply planning starts with historical data about consumption rates and performance and the impending storm event.

“Data is the new science. Big Data holds the answers. Are you asking the right questions?” Patrick P. Gelsinger, Senior High Technology Executive

At one point on TV, one official said “We cannot handle snow above 3 inches per hour!” That’s the capacity of the current system. Is there additional flexible capacity nearby that could have been deployed much earlier? How much more is actually needed? If we moved all of the cars off the roadways earlier, would our plowing efficiency go up and increase effective capacity? If public officials were given the tools to ensure well ahead of time that sufficient supply and capacity were available, the impact of the storms could be measurably less.

Looking at the map above, one can see that areas of the region usually well-prepared and equipped for large snowfalls went untouched (the Great Lakes snow belt, northern New England). Was a formal assessment done to determine whether those assets (plows, snow-removal equipment, and emergency crews) were available and mobile enough to help out in the storm area? If not, this is an opportunity the right analytics tools can address.

Snow in Minnesota

With the right system in place to model and determine what is required, the new demand (projected storm size) is loaded into the supply network and artificial intelligence (AI) tools manipulate the data to suggest likely outcomes. The AI engine(s) recommends a full deployment map of all essential infrastructure (capacity) and consumable assets (supply) based on confidence models from the weather predictions. If there are gaps, the model can reach out to other assets close to the region at risk. Ready. Set. Go!
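The gap analysis described above can be illustrated with a minimal sketch. This is hypothetical code, not any agency’s actual system; the function name, regions, and numbers are all invented for illustration. The idea is simply to compare projected snowfall (demand) against local clearing capacity and flag the shortfalls that nearby assets could cover.

```python
# Hypothetical sketch: compare projected snowfall against regional plow
# capacity and flag the gaps that assets from nearby regions could cover.

def capacity_gaps(projected_inches_per_hour, plow_capacity_inches_per_hour):
    """Return the clearing shortfall (inches/hour) for each region at risk."""
    gaps = {}
    for region, demand in projected_inches_per_hour.items():
        capacity = plow_capacity_inches_per_hour.get(region, 0.0)
        shortfall = demand - capacity
        if shortfall > 0:
            gaps[region] = shortfall
    return gaps

forecast = {"Manhattan": 4.0, "Buffalo": 1.0}   # projected inches/hour
capacity = {"Manhattan": 3.0, "Buffalo": 5.0}   # clearing capacity, inches/hour
print(capacity_gaps(forecast, capacity))        # Manhattan needs 1.0 more
```

A real system would layer confidence intervals from the weather models onto the demand side, but the core recommendation logic starts from exactly this kind of comparison.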

The models that do the weather predicting (demand generation) will eventually feed the emergency system directly, and scenarios played out will look at redistribution of supply (assets) within range (cycle time) in advance of the storm. Back to the things part of this: adaptive sensors throughout the affected area will keep pace with storm accumulation and clearance rates to adjust both capacity and supply – an agile supply chain with real-time data. True, there might be occasional errors of over-commission that leave assets under-utilized, but we know this is a less risky and costly situation than not having assets in place due to under-forecasting.

With the correct supply, projected demand, and historical consumption data available from the aforementioned things, cities will also be able to know in advance where all transportation assets (planes, trains, and automobiles) are sitting and which ones need to get out of Dodge; which ones to stop from coming in; which ones to store; and which ones are needed. Traffic flows can be adjusted to streamline exit paths, and security assets (police and/or National Guard) can be sent out to monitor and guide. They can be redeployed later in the day as the snow begins to accumulate, and again as the storm eventually fades. Once it is all said and done, everybody goes home, safe and sound.

Why should anyone ever be stranded on a highway again? The entire system in the affected area can be prepared for the event and take preventive action much earlier. While the chances exist that we over-forecast, the alternative is much worse – which we’ve seen many times in the past.

“Get the milk! Get the bread! Get the toilet paper! It’s gonna snow a ton!”

Once the storm and clean-up are over, there will be a brand new set of data available about the entire event, from the original warning to the last bit of snow moved. These data go into the existing pools of data and help set up the supply/capacity response for the next storm. Of course, record storms reset all of the averages and distribution of potential impacts – future events will benefit from the learning achieved. Simulation models will have new parameters to use during the next event’s planning.

Our ability to respond to widespread severe weather events must be proportional to the degree of recovery needed. This may mean that readiness spreads out geographically. With some capabilities (electricity, for example) the risk is higher that a storm can affect millions of people. Do we have the data and information necessary to help assess and plan for the impact of outages? With information such as this, government and private services can be deployed at the right levels and at the right cost.

Until the next biggest weather-driven supply chain event!

The discussion can really go on longer.

What do you think about this?

Well, time to check out The Weather Channel once more before going down for a long winter’s nap. I’m rooting for NYC to beat its all-time record and watch people stranded helplessly on TV. That’s winning, right?

“Let it snow, let it snow, let it snow!” Sammy Cahn (writer)



After flying successfully from Seattle to New York City’s LaGuardia Airport, the reality of the problem became very evident. Hundreds of people were landing in the NYC airports once the flights were released. Unfortunately, the roads were not quite as ready. Accidents, slow-moving traffic, and atypical congestion clogged up LaGuardia late Monday afternoon through the night. Clearly, the incoming traffic exceeded a reasonable level given the overall situation. Buses, shuttles, limos, and taxis could not get through. Rental cars were quite depleted and emergency crews were dealing with the mess. One hour and a few dollars later, the driver from Avis accepted my “offer he couldn’t refuse” and took us to the Hertz terminal. Clearly, this is yet another opportunity for minimizing the disruption in this type of event with better data, information, analysis, and integration with the services in the region.


Michael Massetti is an Executive Partner with Gartner who really does enjoy being a supply chain professional! Seriously. All opinions are my own.


What does your supplier relationship management program look like?

Supplier Management Picture

How To Differentiate Between Your Suppliers With SRM

Supplier Relationship Management – SRM

Democratic institutions around the world promote the core doctrine that all people are created equal. This is clearly not the case in the world of supplier management. All suppliers are not created equal, and they should be managed with this distinction in mind.

This article will look at a model for stratifying and segmenting your supply base within the confines of a structured and formal supplier relationship program (SRP).

Basic Construct

The development of supplier relationship management (SRM) accompanied the stratification of the supply base into a number of different hierarchical models. The most tactical suppliers are fundamentally transactional and receive the lowest level of relationship management. The highest level is for those that are the most strategic, with the largest and most critical spend allocation and the deepest relationships. There are considerably more suppliers at the base than at the top.

For this SRP discussion, we’ll use the following three-tiered model. Supplier management organizations can add levels above “strategic” in this construct for situations where specific partnerships or alliances exist. In some cases, purely transactional suppliers in regional or out-sourced models may be placed below the level of “approved” in this model.

  1. Strategic
  2. Preferred
  3. Approved

The following establishes some essential differences between each level of supplier.


Strategic

•      Long-term relationship & business commitment

•      Formal supply agreement/contract in place

•      Value-based interdependency

•      Strong executive-level engagement

•      Cooperative strategic planning

•      Detailed business reviews


Preferred

•      On-going supplier-customer relationship

•      Formal supply agreement/contract in place

•      Steady supplier management & sales relationship

•      Some executive relationships

•      Regular business reviews


Approved

•      Transactional buy-sell relationship

•      Part of category approved supplier list (ASL)

•      Formal supply agreement/contract optional

Supplier Relationship Program (SRP)

Supplier Management Pyramid

Supplier relationship management is a process for developing and managing collaborative relationships with suppliers at all levels to achieve the required business objectives and overall supply chain strategy. SRP allows the supply team to determine the criteria for establishing the appropriate relationship levels needed to drive supplier engagement.

The process should be managed across the enterprise consistently to ensure alignment with all corporate and supply chain objectives. An annual process of planning, evaluation, and feedback is recommended to adjust for changes in business conditions, supplier performance/capability, new suppliers, and more.

It is critical that this type of program be aligned across the entire corporate enterprise to present a single face to the suppliers. Additionally, including product development, finance, marketing, and other key organizations helps guarantee the success of the program and the integration of all activities in the company. Successful supplier programs depend on cross-functional integration and alignment.

All Aboard!

In any SRP approach, there are multiple levels of management engaged in the process. All levels of the organization must be involved, from the procurement person responsible for the supplier up through the CEO, depending on the value of the supplier. Further, multiple organizations will touch and interact with various peer organizations at the supplier.

Thomas A. Stewart described a classical single focal point model and contrasted it with a more enlightened multi-point model for supplier-customer relationships in his book “Intellectual Capital.” The classical model is called the “Bow-Tie Model” because all engagement between the two companies funnels through the sales and procurement representatives – a single-control-point structure that resembles a bow-tie. The newer model is called the “Diamond Model” and more accurately reflects the reality of multi-touch, multi-functional interactions with suppliers.

SRP Diamond Model

To be successful, the SRP must include all of the key organizations in the planning, selection, review, and decision-making processes. The sourcing or procurement function will own the relationship and manage the overall process. Without the engagement across the enterprise, the supplier may get mixed messages from different organizational agendas.


SRP incorporates multiple levels of communications with the supplier throughout the year. Tactical communications include daily interactions and updates on supply. Weekly and/or monthly communications address demand, purchase orders, inventory levels, deliveries, quality, and more. Of course, issue management is critical and will involve all critical parties until resolution. Many organizations have developed web-based portals or use other cloud-based services to tie the supplier and buyer together.

Structured meetings for review of the relationship, performance, and progress are another integral part of a successful SRP. These meetings will look at all aspects of the engagement with less focus on tactical issues and more attention to the long-term. The procurement team needs to include all of the essential organizations for each review, per the model above. Successful SRM relies on cross-functional engagement and alignment.

Regular, formal business reviews (whether monthly or quarterly) include all aspects of the procurement scorecard to ensure the supplier is performing up to the required and agreed-to levels. Typical metrics will include on-time delivery, cost savings/reductions, quality performance, innovation contributions, and more. Higher-level management from both sides of the relationship should attend.
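The scorecard roll-up used in these business reviews can be sketched in a few lines. This is a hypothetical illustration: the metric names and weights below are invented for the example, and a real program would define its own metrics, weights, and targets.

```python
# Hypothetical supplier scorecard roll-up. Each metric is scored 0-100
# and combined into one weighted number for the business review.

WEIGHTS = {
    "on_time_delivery": 0.35,   # illustrative weights; must sum to 1.0
    "quality": 0.30,
    "cost_reduction": 0.20,
    "innovation": 0.15,
}

def scorecard(scores):
    """Combine per-metric scores (0-100) into one weighted score."""
    return sum(WEIGHTS[metric] * scores[metric] for metric in WEIGHTS)

supplier = {"on_time_delivery": 95, "quality": 90,
            "cost_reduction": 70, "innovation": 60}
print(scorecard(supplier))   # weighted overall score for the review
```

The weighted number is only the starting point for the conversation; the review itself digs into the trend behind each metric and the corrective actions where the supplier falls short of the agreed levels.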

The more strategic the supplier is, the more important it is for executive management to participate. Of course, the most strategic suppliers may have senior executive involvement and commitment up to the CEO level. Many companies hold annual supplier events to roll out strategic plans, recognize top suppliers, and share the direction of the company and supply chain. Some companies run quarterly meetings with all key suppliers, presenting updates about the business, product strategy, and other subjects. This allows a single message to be delivered more efficiently than covering the material one supplier at a time.

Many organizations have multiple locations and manufacturing sites associated with their business. In addition to the meetings that take place at procurement’s location, supplier engagement at higher levels is important for the rest of the corporate footprint. Structured meetings with engineering, manufacturing, marketing, and other groups that are spread out around the world are important to the overall communications hierarchy.

Executive commitment to the process is critical to its success. Managing the relationship at higher levels requires an appropriate investment in time from senior executives. The more strategic the supplier, the more imperative it is that executives from the buying side invest time and effort into the relationship. Suppliers need to understand how important they are to the business all the time, not just when a crisis occurs. Senior management’s full commitment to the process demonstrates that.

One cornerstone of any SRP is transparency from both sides of the relationship. If your suppliers are not intimately aware of their status in your hierarchy, the program needs work. You do not want a supplier to believe they are more important to you than how they are classified in your SRP hierarchy.


Relationship Investment 

SRP requires investments by both the supplier and the customer to be effective. There are many ways to describe the attributes of a successful engagement. Suppliers will receive different levels of investment depending on where they sit in the hierarchy – strategic, preferred, or approved.

The first attribute is access to technology, knowledge, and capability. The supplier ensures executive engagement in critical functions, active participation in business reviews, and early access to product development roadmaps. In return, the receiving organization provides access to key decision makers across the enterprise and feedback to the supplier for wins and losses of competitive bids.

Second is the commitment to world-class quality and execution. The supplier must consistently achieve quality levels agreed to with procurement. Appropriate quality standards must be realized and adhered to as they evolve. Immediate attention to any excursions by the supplier is vital. If the supplier performs to expectations, or better, they will improve or maintain their SRP rating.

The third area of importance is the speed and agility of the supplier. They must deliver or realize that the competition may very well beat them out. Delivery performance relies upon typical supply chain tools – buffer stock, supplier-managed inventory (SMI or VMI), just-in-time (JIT) delivery, lead-time improvement projects, improved forecasts, etc. Additionally, new product introduction (NPI) must be prioritized by the supplier to enable the customer to meet its market requirements. In return, the customer provides early engagement in the design process. Finally, the procurement team will participate closely with the supplier in collaborative planning and forecasting to support agility.

Cost is next on the list. In the world of never-ending cost reduction expectations, the supplier-customer relationship must address this directly. There needs to be an intimate understanding of the cost needs and drivers on both sides. The supplier must demonstrate commitment to on-going cost reduction and engage closely with the procurement and engineering teams to identify ways to lower costs, not just reduce prices. Procurement must communicate long-term cost requirements clearly. Active engagement by both parties in cost reduction workshops and planning sessions helps to manifest the mutual investment. Procurement will allocate more spend to a supplier who, all else being equal, helps drive cost down.

Finally, the more important the supplier is, the more resources must be visible to the procurement and other teams at the customer. Suppliers must share their technology and operational roadmaps to gain alignment. The customer design teams require access to technical expertise from the supplier to achieve their goals. The supplier who does this best will gain access to early product design requirements and closer engagement with the design team itself. Innovation is required to drive long-term success of both parties. The supplier has to strive to be a market leader to remain atop the SRM hierarchy.

The table below summarizes these investment attributes.

SRP Investments 

Pulling It All Together

Supplier relationship management is a core process for a successful procurement or supply chain organization. We’ve touched on a number of essential elements of SRM and SRP in this article. The table below highlights a more comprehensive view of supplier relationship programs across many of the dimensions discussed above. It also highlights the differentiation across the three levels of SRP hierarchy in our model – Strategic, Preferred, and Approved.

SRP Attributes


Effective supplier relationship management is an important element of a successful procurement program. The concepts highlighted here can be expanded or reduced to fit any company’s supply management objectives. Once you decide on the scope and complexity of the program appropriate to manage your supply base, then you must drive integration of the process across your organization and with the suppliers.

Suppliers should know where they stand in the SRP hierarchy, why, and how to improve. Similarly, the supplier needs to communicate to procurement where they stand in the supplier’s customer relationship hierarchy. Clarity in how the process works, the roles of each party, defining success, and commitment to make the process work are the key components of the SRP foundation. Without these, SRP will not deliver the expected results.


Michael Massetti is a global high-tech supply chain executive who really does enjoy being a supply chain professional! Seriously. Thanks to the editorial feedback from Antonio Braga, Alex Brown, Joe Carson, John Fredette, and Richard Porcaro.

Michael Massetti LinkedIn Profile

Tech Support During Computing’s Jurassic Era

This is a true story. These events actually happened during June of 1977. There’s little (no) chance that Steven Spielberg will find the storyline worthy of a Hollywood blockbuster, though. Please enjoy!

To set the time and place better, let’s explore what was going on in the summer of 1977. Rod Stewart had Billboard’s #1 song of the year, “Tonight’s the Night.” “Margaritaville” was brand new that year, too! Annie Hall, Star Wars, and Smokey and the Bandit were big summer movie hits. Saturday Night Fever would not come out until December. The world record in the one-mile run was 3:49.4 by John Walker of New Zealand. The New York Yankees were on their way to their first World Series victory in 15 years.

The personal computing world was in its embryonic state. The Apple II came out in 1977. Radio Shack and Commodore also introduced new computers. Mind you, these were not of the “turn them on and run your application” variety. You turned them on and got a command prompt. The cursor after the prompt would blink endlessly in some shade of green. It was time to write a program or work on one you had already written. Video game images looked like Lego blocks!

Atari Combat 2


This story took place at the University of Notre Dame. I had just finished my sophomore year in EE there.

It Started Out As Any Other Day

I spent the entire summer on campus working to pay for college. June in northern Indiana is beautiful and the campus was glorious, albeit not nearly as crowded as when school was in session. For me it was an 8am to 4pm shift in the Electrical Engineering (EE) office doing odds and ends to support the professors while making a bit more than minimum wage – but it all went towards tuition. Evenings were spent at an automobile window factory between 5pm and 1am the next morning. The days were long.

This day started out just as any other would. I arrived on time and asked Nettie (the secretary) what she needed today. “Not too much, actually, Michael. Dr. (James) Melsa (Chairman of the EE Department) and Dr. (David) Cohn (EE Professor) are in Chicago doing their microprocessor seminar.” For me, I’d do the normal daily chores, visit one or two professors to make sure they were OK, talk with a few graduate students and try to keep out of the way.

The microprocessor that was beginning to rock the nation and kick open doors for computing around the world was the 8-bit Intel 8080. If you were not working as a university professor or in one of the handful of computing companies in the industry, this was a non-event. On this day, Dr. Melsa and Dr. Cohn were meeting with over a hundred very interested engineering people in Chicago to tell them all about programming the 8080 based on the book they had written that year.

South Bend, We’ve Got a Problem!

The phone rang. It was Nettie’s job to answer and she always did so with the utmost in professionalism. “Electrical Engineering office, this is Nettie.” I was not paying attention. I never got a call.

“Michael, it’s Dr. Melsa, he needs your help.”

“What could he possibly need my help with?” was all I could think. He never calls me!

I picked up the phone and listened intently. “The 8080 development system is not working. We do not have the boot code. Michael, can you go to the lab and write down the boot code for us? Then, call us back once you are done. If you cannot get this, our seminar has to be cancelled.”

“Sure, I believe I can do that, Dr. Melsa. I just took the Introduction to Microprocessors course in the spring semester.” I’m sure he was very impressed with that resume.

“What did I just get myself into?” Talk about a test!

One Toggle Switch at a Time

So, why did I consider this the “Jurassic Era” of computing? Well, there was no tech support line. There was no Internet. There were no dial-up modems. There was no documentation. There wasn’t even a “boot disk” to restart the system. There was this rudimentary development system with a computing board and a few mechanical interfaces to allow humans to develop software and programs. I wish our system had the interface that the picture here shows. We had nothing close.

8080 Dev System

Getting the boot code meant turning on the system and toggling the “next address” switch to write down the instructions for each step from the LEDs. The code was somewhere between 50 and 70 instructions.

Boot code is better known today as BIOS, the Basic Input/Output System. It was the set of instructions the microprocessor needed to start the system up and get all of the interfaces, like the toggle switches, alive and well.

Of course, the boot code instructions were not FORTRAN or PASCAL or C or anything “object-oriented.” We didn’t even have a cross-assembler to convert 8080 code like “MOV A,B” for the boot code. “The instructions” were single byte (8-bits for those keeping score) octal codes like 352. Hexadecimal was not as widely used at the time.
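For readers who have never worked in octal, a short sketch shows what one of those single-byte codes looks like in the other bases. The value 352 is borrowed from the example above; the conversions are standard, but the code itself is only illustrative of the kind entered one toggle switch per bit.

```python
# One single-byte octal instruction code, as it would have been read off
# the LEDs and toggled back into the front panel one bit at a time.

code = 0o352                  # octal literal, the notation used in 1977

print(code)                   # the same value in decimal: 234
print(format(code, "08b"))    # 11101010 -- one front-panel switch per bit
print(format(code, "02X"))    # EA, in the hexadecimal common today
```

Eight switches per byte, sixty-odd bytes per pass, two verification passes: the tedium of the story above falls straight out of the arithmetic.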

I looked around the building and there was no one around to help. It was up to me. So much for that cushy summer job in the EE office. The task should not be that hard, right? It was 60 lines of 8 bits each. That’s 480 bits, or 60 bytes – about 0.00006 MB. Computers transfer that in fractions of a second today!

I single-stepped my way through the code twice. Thankfully, the second time through I got the same answer as the first time through. It seemed like an eternity.

Fingers Crossed

Almost one hour had passed since Dr. Melsa’s call. Since there were no cell phones or text messages to let him know I was making progress, he and Dr. Cohn plodded their way through the seminar with written materials only. I imagine they were not so patiently waiting for my return call.

I called the hotel where he was presenting and they got him on the line. We walked through each instruction line carefully. I’d say, “Instruction 17, 271” and he’d respond with “Instruction 17, 271.” We went back and forth through the instructions twice.

“OK, Michael thanks. We’ll call you back in a few minutes to let you know if it worked.” They sent the attendees off for a break and began to reprogram the system … one byte at a time … one toggle switch at a time!

The clock did not appear to move at all. I waited for what seemed like forever. “Did it work?” I wondered. Finally, the phone rang. “It worked!” Dr. Melsa exclaimed. “Thank you so much. See you tomorrow.”

Disaster averted. Maybe they’ll give me a full-ride scholarship for these heroics. Nope.

So What?

For any of us who grew up in the days when even the word “application” was rarely used in the computing community (they were “programs” back then!), it’s as though we’ve lived through computing’s full Mesozoic timeline, from the Triassic through the Jurassic up to the Cretaceous period of the dinosaurs.

Computing has come a very long way from toggle switches to paper tape readers to punch cards up to today, where Siri and Cortana take voice commands from us. Today’s microprocessors are light years beyond the basic central processing units (CPU) they were back then. It’s been a great journey. We can only imagine what it will look like 40 years from now.



Michael Massetti is a global high-tech supply chain executive, and sometimes computer geek, who really does enjoy being a supply chain professional and a leader!

Michael Massetti LinkedIn Profile

