Beyond The Control Zone--Part Two: Controlling the Uncontrollable
on Jun 2, 2016
In order to achieve visibility and control, supply chain managers will have to master new types of data and technologies.
In Part One, Confronting a Global Connected World, we talked about the issues confronting supply chain professionals in controlling the business process, as well as today’s risk and security—both physical and cyber—and painted a simple vision of creating a more visible and informed supply chain. Here in Part Two we will look at technology attributes to provide visibility and control.
If you think about it, the supply chain’s action is outside the office. It is in a factory making things, a warehouse packing and shipping things, in a store where customers are shopping, or in a conveyance on the move.
When we think about the technology, then, several problems confront us:
Firstly, there is the disparate nature of the above-mentioned locations. Each system is unique, with its data locked up inside it. Some data flows between systems, but only through a lot of awkward coordination via integration and communications software. Most organizations simply don’t have a way to form a harmonious view of the interweaving processes and actions, so they resort to audits and validations in a round robin of doubt.
Second is the lack of item-level condition data. Manufacturers are digitizing their products (creating smart products), which provides end-to-end visibility and the intelligence to support total lifecycle management—from design through service. Retailers want data about merchandise and also about the shoppers. Most systems today simply do not have the data model required to absorb item-level and people-level data.
Third, this wave of the Industrial Internet of Things (IoT) is based on a philosophy of real-time sensing. Not only do legacy back-end enterprise systems lack the necessary sensor, location and/or geospatial data models, but, more importantly, they lack the real-time capacity required to import the data, then analyze and act within the required response time. (Some tech companies’ solution is to have a separate stack for IoT, but that stack then lacks any blending into the operations of the business.)
Fourth, though we now have a huge number of connected devices, they have to be consistently and dependably connected in order for the system to be reliable. (One way around this limitation is to run multiple data streams whose context and analytics collect enough of the right kind of data to extrapolate and draw insights.)
Finally, there is scalability. All this new granular data needs to be absorbed, analyzed, used, stored and possibly accessed again and again.
Figure 1 - Multiple Data Streams
Internet of Things (IoT): Sensor-rich networks that provide source data about the things—assets, cargo, the carriers, equipment (machines, vehicles, devices), and the people. IoT includes multiple layers of intelligence/software embedded in the devices, in local onsite processors, and up in the cloud.
Context Aware: Live geospatial data, including environmental data that provides context. Placing the thing in a geospatial and business process context creates context awareness.
Complex event processing (CEP): Built on top of this context aware source data, CEPs look for and identify patterns involving multiple events and streams of data. “Under this circumstance, this result occurs.” This allows a system to continually monitor the vast amounts of IoT and related data, which in their raw form will overwhelm any person. CEPs thus enable rapid awareness of problems needing action and decision making on what to do about them. Users can create rules that generate alerts and alter processes to achieve better outcomes.
On Demand: Often talked about but poorly defined, on-demand technology allows users to consume software, one app (or service) at a time. But now users can gain on-demand information—data feeds from multiple information sources—both public and private—thereby reducing upfront data creation and data entry, as well as cost and time to go live.
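The complex event processing pattern described above (“under this circumstance, this result occurs”) can be sketched as a small rule loop over an event stream. This is only an illustration: the event fields, the `cold-chain-breach` rule, and the 8°C threshold are all hypothetical, not taken from any particular CEP product.

```python
# Minimal complex-event-processing sketch: rules scan a stream of events
# and fire alerts when a multi-event pattern appears. Event names, fields,
# and thresholds are hypothetical illustrations.
from collections import deque

class CepRule:
    def __init__(self, name, predicate, window=5):
        self.name = name            # human-readable rule name
        self.predicate = predicate  # function(recent_events) -> bool
        self.window = window        # how many recent events to consider

def run_cep(events, rules):
    """Feed events through every rule; return (rule_name, event) alerts."""
    recent = deque(maxlen=max(r.window for r in rules))
    alerts = []
    for event in events:
        recent.append(event)
        for rule in rules:
            if rule.predicate(list(recent)[-rule.window:]):
                alerts.append((rule.name, event))
    return alerts

# Flag a shipment whose temperature sensor exceeds 8°C twice in a row.
cold_chain_breach = CepRule(
    "cold-chain-breach",
    lambda evts: len(evts) >= 2 and all(e["temp_c"] > 8 for e in evts[-2:]),
    window=2,
)

stream = [
    {"shipment": "S1", "temp_c": 4.0},
    {"shipment": "S1", "temp_c": 9.1},
    {"shipment": "S1", "temp_c": 9.5},  # second consecutive breach -> alert
]
print(run_cep(stream, [cold_chain_breach]))
```

Real CEP engines add time windows, event correlation across streams, and rule languages, but the core is the same: continuously match patterns that no person could spot in the raw feed.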
Digital Supply Chain
In essence we are talking about creating a digital platform that has new rich data, fluid work/process flows, and analytics. What, then, are some of the capabilities of a digital supply chain platform?
Live Streaming—Zero-latency means not only instantaneous, but continuous delivery of data. That is, the ‘picture’ is a continuous data stream rather than an infrequent historical message. This is a new concept for enterprise users. Near-zero continuous information provides a profound time advantage: real-time information, not planned data, providing enough time to allow users to make new decisions to respond to changing conditions as they unfold.
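The contrast between an after-the-fact summary message and a continuous stream can be sketched in a few lines; the sensor readings and the 8.0 threshold below are invented for illustration:

```python
# Contrast a legacy batch "message" with a continuous stream of readings.
# Values are hypothetical sensor temperatures from a shipment in transit.

def batch_report(readings):
    """Legacy style: one summary message after the trip is over."""
    return {"count": len(readings), "max_temp": max(readings)}

def live_stream(readings, threshold):
    """Streaming style: evaluate each reading as it arrives, so a
    decision can be made while conditions are still unfolding."""
    for i, temp in enumerate(readings):
        if temp > threshold:
            yield {"index": i, "temp": temp, "action": "re-route or inspect"}

readings = [3.9, 4.2, 8.7, 9.3, 4.1]
print(batch_report(readings))            # arrives only after the fact
print(list(live_stream(readings, 8.0)))  # alerts while the trip unfolds
```

The batch report can only explain what went wrong; the stream leaves time to do something about it.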
Social—B2B collaboration between trading partners and/or social consumer engagement to interact with people—not just systems.
Location-precision—The market has plenty of systems that provide static maps. However, they don’t relate to a current, real-time picture of what is occurring in those locales. Geographical Information Systems (GIS), GPS, RFID, Mobile and other methods can pinpoint, locate, and track equipment and shipments.
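As an illustration of location precision, here is a sketch of a simple geofence check built on the haversine great-circle formula. The port coordinates, vehicle fix, and 5 km radius are hypothetical.

```python
# Geofence check: is a GPS fix within a radius of a known location?
# Uses the haversine great-circle formula; coordinates are made up.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(fix, center, radius_km):
    return haversine_km(fix[0], fix[1], center[0], center[1]) <= radius_km

port = (51.95, 4.05)       # hypothetical port location
truck_fix = (51.96, 4.07)  # latest GPS fix from the vehicle
print(inside_geofence(truck_fix, port, 5.0))  # True: within 5 km of the port
```

A live GIS layers many such checks over a moving map, but each one reduces to exactly this kind of distance test.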
Context Awareness—The term context is often bandied about in the technology field. But what does it mean? According to Webster’s Dictionary, context is “the circumstances that form the setting for an event.” Typically, systems send lots of data around—a message, a document and so on—but none of the rich data that provides the context.
For example, ports, factories, and logistics routes are particularly vulnerable: they can be the site of social or labor unrest, equipment failures, bad weather or other hazards that cause delays. Data streams that capture these scenarios, and even social feeds, would spot impending or current events. Complex weather analytics and pattern recognition would spot an impending storm. In certain ports around the world this is a critical problem for high-value shipments such as electronics, petroleum, and automobiles. More mundane but everyday occurrences are delays due to poor information, which cause missed or delayed handoffs.
In today’s world of high-stakes global trade, companies are not only concerned about the value of the cargo, but the safety of employees. Context awareness allows users to have great insight so they can make better decisions, change strategies and so on.
These new data now need a platform—a cross-process, cross-site way to see, analyze and act on them. Modern, digital systems have to go beyond the new data and infuse many new capabilities. Rule-driven, complex event processing, analytics, machine learning, predictive, big data—these are just a few of the terms bandied about. But what do they mean, really?
Dynamic Analytics—Most organizations still think of analytics as charting the static data they have accumulated from two-dimensional systems—columns and rows. An example is sales data. Here we are talking about more. Rich data from multiple streams is needed to understand the complex context of events or a stream of events. This can be composed of analog and digital data—big data from the web, or data from your own IT systems, or temporal data from a weather feed, social data, video, sensor data (temperature, vibration, altitude), and sound can all be blended together to sift and discover a chain of events or a key piece of data that is significant. For example, a loyal customer just enters the mall, or a ‘bad actor’ arrives at the airport. A chain of events is the most interesting, such as tracking weather patterns and associating other business data with these events. The analytics would discover that each time a condition exists, we note that something else happens that is important.
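A minimal sketch of blending two streams (shipment delays and a weather feed) to surface the kind of co-occurring condition described above. All the records, port codes, and counts are invented for illustration.

```python
# Blend two data streams and count how often a shipment delay co-occurs
# with a storm at the same port on the same day. Records are hypothetical.
from collections import Counter

delays = [
    {"port": "SGP", "day": 1},
    {"port": "SGP", "day": 2},
    {"port": "RTM", "day": 2},
]
weather = [
    {"port": "SGP", "day": 1, "storm": True},
    {"port": "SGP", "day": 2, "storm": True},
    {"port": "RTM", "day": 2, "storm": False},
]

# Join the streams on (port, day), then count delays that fell on storm days.
storm_days = {(w["port"], w["day"]) for w in weather if w["storm"]}
co_occurrence = Counter(
    d["port"] for d in delays if (d["port"], d["day"]) in storm_days
)
print(co_occurrence)  # every SGP delay coincided with a storm; RTM did not
```

At production scale the same join-and-count logic runs over streaming big-data infrastructure rather than in-memory lists, but the analytic question is identical: each time this condition exists, what else happens?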
Machine Learning—As we or the systems sift and associate data with an event, the system can store cause and effect and identify these events again. This then helps us understand what the impact will be if the event is allowed to unfold. What could be the impact of a change? Where should you focus your attention and efforts? Rather than humans having to discover all this, ‘the machine’ learns and takes over the job for us.
Rules engines are becoming another important feature today. Now that I know about an event and its outcome, I may want to ‘instruct’ the process to act in a certain way. Rules are then developed to take the action systematically and independently. Users can build their own rules, and the system can also recommend a rule. Over time an organization can build up a library of rules that serves as a policy statement on how it does many things, from accounting, inventory management and purchasing policies to how to safely operate equipment and so on.
Rules engines and workflow management or business process management (BPM) tie all the data crunching up in a nice bow; that is, they codify and institutionalize all that learning. Since we are talking about so many SKUs across so many miles being subjected to so many events, gleaning impacts both subtle and dramatic is hard to do without assistance. There are a lot of needles in haystacks to find and understand, and we must then ensure that the proper people are notified with the right data, and that decisions and actions take place quickly and accurately!
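The rules-engine idea above can be sketched in a few lines: each rule pairs a condition with an action, and the accumulated rule library acts as executable policy. The rule names, fields, and thresholds below are hypothetical, not drawn from any particular product.

```python
# Tiny rules engine: rules pair a condition with an action; the growing
# library of rules becomes an executable policy statement. All names,
# fields, and thresholds here are hypothetical.

class Rule:
    def __init__(self, name, condition, action):
        self.name, self.condition, self.action = name, condition, action

class RulesEngine:
    def __init__(self):
        self.rules = []  # the organization's growing rule library

    def add(self, rule):
        self.rules.append(rule)

    def evaluate(self, fact):
        """Apply every matching rule to a fact; return the actions taken."""
        return [r.action(fact) for r in self.rules if r.condition(fact)]

engine = RulesEngine()
engine.add(Rule(
    "reorder-low-stock",
    condition=lambda f: f.get("on_hand", 0) < f.get("reorder_point", 0),
    action=lambda f: f"reorder {f['sku']}",
))
engine.add(Rule(
    "flag-high-value",
    condition=lambda f: f.get("value", 0) > 10000,
    action=lambda f: f"notify security about {f['sku']}",
))

print(engine.evaluate({"sku": "A-100", "on_hand": 3, "reorder_point": 10}))
# -> ['reorder A-100']
```

A BPM layer would then route each returned action to the right person or process, which is the "nice bow" described above.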
Finally, the supply chain platform has to be multi-party. Again, we are out of the office analyzing and interacting with the world. A cloud with modern integration is essential here.
All this is exciting to contemplate, but a lingering question is the practicality of it. What will be the value proposition of adopting these new data and systems, and the modes of behavior that will emerge from this richer vision of reality? In Part Three of this series, Higher Value, we will discuss the benefits.