Executive Briefings

Controlling the Uncontrollable in Supply Chain Management

The action in a supply chain happens outside the office: in a factory making things, in a warehouse packing and shipping things, in a store where customers are shopping, or in a conveyance on the move.

When we think about the technology, then, several problems confront us:

First is the disparate nature of these locations. Each runs its own system, with the data locked inside it. Some data does flow between systems, but only through awkward coordination with integration and communications software. As a result, most organizations have no unified view of the interweaving processes and actions, and they fall back on round after round of audits and validations to resolve their doubts.

Second is the lack of item-level condition data. Manufacturers are digitizing their products (creating smart products) that provide end-to-end visibility and the intelligence to support total lifecycle management, from design through service. Retailers want data not only about their merchandise but also about their shoppers. Yet most systems today simply lack the data model required to absorb this item-level and people-level data.
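To make the gap concrete, here is a minimal sketch of what an item-level condition record might look like. All names and fields here are illustrative assumptions, not drawn from any particular product or system; the point is that the record keys on the individual unit and its conditions, a granularity most SKU-and-quantity data models have no slot for.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ItemReading:
    """One condition reading for a single tagged item.

    Field names are hypothetical, chosen only to illustrate
    item-level granularity.
    """
    item_id: str          # identity of the individual unit, not the product class
    location: str         # site or zone where the reading occurred
    temperature_c: float  # an item-level condition measurement
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# A legacy SKU/quantity row can say "500 units of product X in DC-Memphis";
# it cannot say "this particular unit was at 4.2 °C at this moment."
reading = ItemReading(item_id="ITEM-00042", location="DC-Memphis",
                      temperature_c=4.2)
print(reading.item_id, reading.temperature_c)
```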

Third, this wave of the Industrial Internet of Things is built on a philosophy of real-time sensing. Legacy back-end enterprise systems lack not only the necessary sensor, location, and geospatial data models but, more importantly, the real-time capacity to import the data, analyze it, and act within the required response time.
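The ingest-analyze-act pattern described above can be sketched as a single streaming loop. This is a toy illustration under assumed parameters (the threshold, window size, and readings are invented), showing only the shape of the idea: each reading is analyzed as it arrives, and the response fires in the same pass rather than waiting for a nightly batch load.

```python
from collections import deque

ALERT_THRESHOLD_C = 8.0  # illustrative cold-chain limit (an assumption)
WINDOW = 5               # readings per rolling average

def ingest_and_act(stream, act):
    """Consume sensor readings as they arrive and act immediately,
    instead of batching them into a periodic back-end load."""
    window = deque(maxlen=WINDOW)
    for reading in stream:
        window.append(reading)
        avg = sum(window) / len(window)   # analyze: rolling average
        if avg > ALERT_THRESHOLD_C:
            act(avg)                      # act: e.g., alert or quarantine the lot

# Simulated temperature stream from one conveyance:
alerts = []
ingest_and_act([3.9, 4.1, 9.5, 10.2, 11.0, 12.4], alerts.append)
print(len(alerts))  # the rolling average crosses the limit once here
```

A real deployment would read from a message broker rather than a list, but the point stands: the analysis and the response live inside the data path, not behind it.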

Fourth, although we are reaching a huge number of connected devices, those devices must be consistently and dependably connected for the overall system to be reliable. (One way around this limitation is to combine multiple data streams, collecting enough of the right kind of data to provide context and to extrapolate sound insights.)
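The multiple-data-streams workaround can be illustrated with a small sketch. The stream shapes and names below are assumptions made for the example: when one device stream (say, an RFID reader) drops out, an independent second stream (say, dock-door scans) may still have seen the item, so fusing the streams fills the gap.

```python
def last_known_location(item_id, tag_reads, door_scans):
    """Return the most recent location seen for an item across two
    independent event streams.

    Each stream is assumed (for illustration) to be a list of
    (timestamp, item_id, location) tuples.
    """
    events = [e for e in tag_reads + door_scans if e[1] == item_id]
    if not events:
        return None
    events.sort(key=lambda e: e[0])  # newest event across both streams wins
    return events[-1][2]

# The RFID reader dropped out after t=100, but a second stream
# still saw the item later, at the dock door:
tag_reads = [(100, "ITEM-7", "zone-A")]
door_scans = [(160, "ITEM-7", "dock-3")]
print(last_known_location("ITEM-7", tag_reads, door_scans))  # dock-3
```

Neither stream alone is dependable, but together they bound where the item is, which is the context-and-extrapolation idea in miniature.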

Finally, there is scalability. All this new granular data must be absorbed, analyzed, used, stored, and then accessed again and again.
