When connecting supply chain data, what ultimately determines whether your investment delivers the expected benefit is the quality of the inputs you receive.
In practical terms, many companies try to achieve full data visibility by combining different sources of input: EDI connections, Excel reports and data gathered manually from websites (commonly known as screen or web scraping). But when these complex setups are designed, the accuracy and timeliness of the data exchanged are not always a top consideration, not because they are unimportant, but because each company is working with what is available.

In a survey about supply chain visibility published by MIT, nearly half of the respondents were less than moderately satisfied with their current visibility solutions. Timeliness is a particular point of concern: only 40 percent of respondents reported receiving updates within 12 hours across all ocean shipping milestones, which is neither ideal nor efficient.
Working with different files and formats, and without a single source of truth, is problematic for several reasons. It primarily impacts the efficiency of daily operations, which can only be performed based on the information available. On a strategic level, it weakens the reporting and analytical capabilities needed for better decision-making.
Consider the scenario in which a shipper needs access to information to properly handle an exception; their ability to make the best decision will often be constrained by not having real-time information. Depending on information that is gathered manually, or received in batches in an inconsistent and non-standardized way, can mean the data is incomplete, arrives only after a decision has to be made, or both, leading to a potentially suboptimal outcome for the shipper or their end client.
Timeliness: Is the data received on time?
The ability to conduct an effective day-to-day logistics operation depends heavily on receiving critical data on time. Operational risks are reduced when transportation milestones arrive in near real time, especially when managing exceptions. With the common industry means of communication, such as EDI messages or updates shared via reports, information is often delivered in batches. Waiting hours for important milestones to appear in the system, especially when they require immediate action, is costly. The cost is even higher if the information is being gathered manually, e.g., screen-scraping different websites and platforms, making several phone calls, and sending multiple emails.
Today, using traditional “data-aggregator” platforms, when beneficial cargo owners or freight forwarders search for shipping milestones, they will most likely have access to a limited number of container events, and only “actuals”. This information shows when the event occurred (occurrence time). When analyzing historical data, it is indeed valuable to understand whether an event occurred and when. However, to assess the real value of the data for day-to-day operations and future business strategies, another key element needs to be included in the equation: when was this event submitted to the system (submission time) and made available to you (published time)?
This is an extremely important aspect of assessing data quality, as there are often delays between an event occurring and the information actually being received. The difference can vary massively, from hours to, in some cases, days. This can have a significant impact on the decision-making process, as well as on understanding the reliability and comparative performance of your data sources and logistics partners.
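As a rough illustration of why all three timestamps matter, the sketch below models a container milestone that carries occurrence, submission and published times, so the visibility lag can be measured rather than assumed. The Milestone class and its field names are hypothetical, not the schema of TradeLens or any other platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical shape of a container milestone; names are illustrative only.
@dataclass
class Milestone:
    container_id: str
    event_type: str           # e.g. "Gate out", "Vessel departure"
    occurred_at: datetime     # when the event physically happened
    submitted_at: datetime    # when the data provider posted it
    published_at: datetime    # when it became visible to the consumer

    def visibility_lag_hours(self) -> float:
        """Hours between the physical event and its visibility downstream."""
        return (self.published_at - self.occurred_at).total_seconds() / 3600

m = Milestone(
    container_id="MSKU1234567",
    event_type="Gate out",
    occurred_at=datetime(2021, 6, 1, 8, 0, tzinfo=timezone.utc),
    submitted_at=datetime(2021, 6, 1, 8, 20, tzinfo=timezone.utc),
    published_at=datetime(2021, 6, 1, 8, 25, tzinfo=timezone.utc),
)
print(f"Visibility lag: {m.visibility_lag_hours():.2f} h")  # 0.42 h
```

Keeping all three timestamps on every record is what makes the comparison possible; a feed that stores only the occurrence time cannot tell you how late it is.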
TradeLens conducted a study involving a side-by-side comparison of container events for 50 containers with an inland destination in Europe. The comparison covered data from the system currently used by the 4PL provider and the events received by TradeLens, and looked at both the timeliness and the accuracy of the milestones received. One of the key insights from this proof of concept concerned timeliness. Across all the destination milestones tracked by TradeLens, the average time between occurrence and submission was 31 minutes. This means that the milestones submitted by carriers, terminals and inland depots in this study were available and visible around half an hour after they actually occurred. What about the data timeliness of the other setup? The current system used by the 4PL had no functionality to measure data timeliness at all. And this is the reality for the majority of companies operating in global trade.
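For completeness, a figure like the 31-minute average above needs nothing more than the two timestamps on each milestone. The helper below is a minimal sketch, assuming each record exposes occurred_at and submitted_at datetimes as in the illustrative Milestone class above; it is not the methodology of the study itself.

```python
from statistics import mean

def average_submission_lag_minutes(milestones):
    """Average minutes between occurrence and submission across milestones.

    Assumes each milestone exposes `occurred_at` and `submitted_at`
    datetimes, as in the illustrative Milestone class sketched earlier.
    """
    lags = [
        (m.submitted_at - m.occurred_at).total_seconds() / 60
        for m in milestones
    ]
    return mean(lags) if lags else None
```

The point is less the arithmetic than the prerequisite: if the submission time is never captured, as in the 4PL's setup, this metric simply cannot be produced.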
Accuracy: Is the data received accurate?
Understanding the accuracy of the data received or gathered allows companies to have different, more substantive conversations with their business partners about the quality of their input. To fight a common industry problem, popularly known as GIGO (garbage in, garbage out), it is key to consider the new ways of exchanging data that are readily available today. A collaborative operating model cannot be achieved if the way data is collected is not tracked and the data itself is not easily available to companies. The strategic use of supply chain data is only realistic with a change to the traditional model, in which data is often overwritten in systems by new updates, leaving no history to be tracked. That, in turn, limits continuous data and supply chain improvements and the further optimization needed to stay competitive and closer to a company's mission.
Even the most “sophisticated” and complex systems today, involving several peer-to-peer EDI connections across different providers and constant manual input from supply chain partners unable to establish system-to-system integration (for various reasons), are subject to high data inaccuracy. In the same study mentioned above, the 4PL's current system, which includes reports produced by EDI connections and manual updates, did not fully track the source of each update, so it was unable to identify the root cause of inaccurate data. The same milestones on TradeLens are easily auditable. Information about the data provider, submission time and published time is a mandatory part of every milestone, and new updates are visible regardless of how companies access the data (API or user interface). Last but not least, the back end never overwrites existing information, which allows companies to better understand data performance across the wider ecosystem and drive further improvements. As the saying goes, “you can't fix what you can't see (or measure)”.
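To make the contrast with the overwriting model concrete, here is a minimal append-only sketch, not TradeLens' actual implementation: every submission is retained together with its provider and submission time, so both the current view and the full audit trail remain queryable. All class and field names are hypothetical.

```python
from collections import defaultdict

class AppendOnlyMilestoneLog:
    """Toy append-only store: updates are added, never overwritten,
    so every correction stays auditable. Illustrative only."""

    def __init__(self):
        self._events = defaultdict(list)  # container_id -> list of records

    def submit(self, container_id, event_type, value, provider, submitted_at):
        # Each submission is kept, together with who sent it and when.
        self._events[container_id].append({
            "event_type": event_type,
            "value": value,
            "provider": provider,
            "submitted_at": submitted_at,
        })

    def latest(self, container_id, event_type):
        """Current view: the most recently submitted record for an event."""
        records = self.history(container_id, event_type)
        return records[-1] if records else None

    def history(self, container_id, event_type):
        """Full audit trail for an event, in submission order."""
        return [r for r in self._events[container_id]
                if r["event_type"] == event_type]
```

In a model like this, a corrected milestone does not erase the wrong one; the discrepancy itself becomes data that can be traced back to a provider and discussed with them.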
The first step when looking to use data to support supply chain optimization, and ultimately as a competitive advantage, is to operate with a system that allows data errors or discrepancies to be easily located and addressed. Strong feedback and collaboration with data providers need to be backed by evidence and numbers. Keeping track of errors and discrepancies helps companies identify possible risks to their supply chain related to inconsistencies in the information exchanged throughout the full cycle. As described in the previous post on data quality, it's time to trust your data, and more importantly, to truly use your data:
When our airline app tells us the flight has landed, we are confident our friend will soon be ready for us to pick them up at arrival. Not so in international shipping, where data for everything from a cargo’s departure, gate-in at a terminal or release from customs is far less dependable than we would like.
Only when data is properly tracked will shippers and consignees be able to have conversations with their partners to identify and fix the issues that put our industry behind others in terms of data accuracy and transparency. Inaccuracy is often overlooked, with the assumption that it is simply the norm. The good news is that solutions and tools already exist today that can improve data accuracy within the logistics industry without massive investment or process change.
How are accuracy and timeliness issues being addressed?
Understanding data quality and taking accountability for the information that is exchanged can strengthen partnerships and build trust across the logistics industry. The traditional model has often prevented companies from monitoring and tracking aspects such as the accuracy and timeliness of their data. Working through different systems, costly peer-to-peer connections and several different providers translates into a lack of ownership when it comes to acknowledging, and ultimately fixing, data quality issues.
Solving this fundamental industry issue, generated by a fragmented landscape and multiple non-interoperable systems, is a massive task. When setting up your own operational system or working with data providers, it is important to keep in mind the fundamental aspects of tracking data quality, along with the processes in place to flag discrepancies and the feedback loops created to address concerns. Further, adding data quality to the selection criteria for the right partners will only become more critical in the future.
TradeLens is a platform that enables you to build your business on true end-to-end visibility, simplifying the connection and information exchange between you and your partners. TradeLens is driving data validation through standards, proactive data monitoring and issue-detection, and by facilitating a feedback loop across all parties.
Article written by Alexa Rios & Adriele Pradi