Data interoperability is key to successful logistics operations, says Kevin Speers, chief executive officer and co-founder of Splice.
There is a plethora of applications and legacy systems in use, and as data sources change and evolve, a gap develops between the new and the old. Data exchange isn't as accessible as it should be, which often hampers import and export logistics operations, Speers says.
For example, for those looking to ship goods from the U.S., it’s critical to know when a container arrives at a terminal. “But the data to keep you informed comes from multiple places in multiple forms,” Speers says. “Keeping track of that, particularly when those receiving windows change at the last minute, is a real challenge posing problems for exporters. It makes transportation more expensive and raises costs across the supply chain.”
Ensuring information gets to players involved in container shipping is so important, Speers notes, that the Federal Maritime Commission is stepping in. “I think in the end, how data moves, how it gets where it needs to be, making it interoperable between systems, is really critical.”
Speers applauds the FMC’s position on instituting standards, but those will take time. “In the meantime, having systems that allow different data sources to talk to each other in a seamless way is going to be part of the solution to easing congestion and allowing use of the data that really affects the physical supply chain.”
Speers says there is a daily scramble among shippers, forwarders, and trucking and drayage companies to get the information they need to make better decisions. An integration platform that translates any data type and harmonizes it with the ocean of other data available is key to forming a complete and accurate picture. “You're going to need interoperability.”