Executive Briefings

The Evolution of Delivery Optimization Technology

Like most business processes, the delivery planning process and its supporting technology have evolved slowly. Advances in computing power, algorithm sophistication, global positioning systems, the internet and wireless technology have all played an important part in shaping the state of the art. Numerous companies have succeeded in leveraging these new technologies, while others have made significant investments and been disappointed with the results. Our research and client work in this field suggest companies typically evolve through three distinct stages as they grapple with the alternatives available in the marketplace:

Stage I: "Map on the Wall": Before the 1980s, delivery planning was accomplished manually with a map on the wall, a pencil and brainpower. Customer orders were provided to either a driver or dispatcher to plan the routes. In many cases, the dispatcher was a former driver who "knew" transportation. Using nothing more than experience, rules of thumb and intuition, route plans were generated by circling geographic clusters on a map. As long as customer demand was satisfied (no customer complaints) and the business complied with DOT regulations (no driver complaints), transportation had successfully completed its mission.

Unquestionably, manual route planning is the most ineffective method of minimizing delivery costs. The sheer number of possible load combinations quickly overwhelms the ability of any human to identify the optimum solution (i.e., fewest routes and miles). A single route with 40 different stops has 40 factorial (more than 10^47) possible stop sequences! The number of calculations necessary to derive the optimum solution from the universe of possible solutions is simply beyond human capability.
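The combinatorial explosion is easy to verify: the number of distinct stop sequences for a route with n stops is n factorial. A minimal sketch:

```python
import math

def stop_sequences(n_stops: int) -> int:
    """Number of distinct orderings of n delivery stops on one route."""
    return math.factorial(n_stops)

# Even a modest route is beyond manual enumeration:
print(stop_sequences(15))  # 1,307,674,368,000 -- already over a trillion
print(stop_sequences(40))  # roughly 8.2 x 10^47 for a 40-stop route
```

Note that the count passes a trillion at only 15 stops; a 40-stop route is astronomically beyond hand calculation.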

A misguided strategy often concocted by Stage I companies to improve delivery planning effectiveness is to add "more skilled" planners. Aside from skill level, the number of planners required depends largely on the number of stops per route and customer constraints, but as a general rule one skilled planner can manually handle 10-15 routes per day, assuming an average of 40 stops per route. Hypothetically, then, a company with 20 dispatch points would need at least one dispatcher per location, plus backup for vacation and absentee relief. And while talented transportation professionals are critically important in the planning process, "throwing people" at the planning problem is quite expensive and does not improve planning effectiveness.

Stage II: "Map on the Computer": As the first personal computers and rudimentary GIS tools were introduced in the early 1980s, the map was taken off the wall and put on a computer screen. Pioneering software companies quickly recognized the market opportunity to develop and sell desktop transportation planning packages. Early versions combined lightweight mathematical algorithms with a graphical user interface that produced attractive electronic maps. Certainly, this was an improvement over the manual approach, and many organizations embraced the use of technology as an important step forward in route planning.

Many companies using desktop planning software are under the mistaken impression that the algorithms embedded in the software produce optimized routes. In fact, the vast majority of desktop packages are, by design, heuristic models that produce "a possible answer among many," not an optimized solution.

Although much better than the manual technique, this approach has two significant shortcomings: simplistic algorithms and limited computing power. Depending on the skill of the user and the amount of time available to search for a better answer (i.e., fewer routes and miles), the effort becomes an iterative process of determining the minimum number of routes required to deliver a given workload. In the real world, users simply do not have the time, computing power or motivation to grind through all the iterations necessary to arrive at an optimum solution. Further, the decentralized nature of desktop software deployment ensures "consistent inconsistency" in applying critical planning parameters. As a result, meaningful comparisons of key operational metrics on an enterprise scale are impossible.
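The gap between a heuristic answer and the true optimum can be seen on a toy instance. The depot, stop coordinates and nearest-neighbor rule below are illustrative assumptions, not any vendor's actual algorithm; exhaustive search is only feasible here because the instance is tiny:

```python
import itertools
import math

# Hypothetical depot and customer coordinates (illustrative only).
DEPOT = (0.0, 0.0)
STOPS = [(2, 9), (9, 2), (8, 8), (1, 4), (5, 5), (9, 9), (3, 1)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    """Total distance of depot -> stops in order -> depot."""
    points = [DEPOT] + list(order) + [DEPOT]
    return sum(dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def nearest_neighbor():
    """Greedy heuristic: always drive to the closest unvisited stop."""
    remaining, here, route = set(STOPS), DEPOT, []
    while remaining:
        here = min(remaining, key=lambda p: dist(here, p))
        route.append(here)
        remaining.remove(here)
    return route

def exhaustive_optimum():
    """Enumerate every permutation -- only possible on tiny instances."""
    return min(itertools.permutations(STOPS), key=tour_length)

greedy = tour_length(nearest_neighbor())
best = tour_length(exhaustive_optimum())
print(f"heuristic: {greedy:.1f}  optimum: {best:.1f}")
```

The heuristic returns one feasible tour quickly; only enumeration guarantees the shortest, which is exactly why desktop heuristics yield "a possible answer among many."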

Stage III: "Centralized Optimization": As the internet era matured in the early 2000s, the next generation of delivery planning technology began to emerge. Fundamentally different from desktop software, state-of-the-art optimization challenges the conventional wisdom of decentralized deployment.
Centralizing the planning process has several benefits:

1. All critical planning parameters are uniformly and consistently applied by the optimization engine, enabling enterprise-wide KPI visibility;
2. Advanced mathematical techniques such as integer programming produce far superior results compared to desktop heuristics; and
3. The technology runs in a powerful parallel server-computing environment, which reduces the time to find the "best" answer to seconds, without human intervention.

The operating concept is simple: companies transmit shipment requirements to the centralized data center via the internet; the optimization engine determines the best solution and transmits back digital maps and stop-by-stop driving directions for each route. New customers are added, scheduled for the appropriate delivery day(s), assigned to the correct route, and inserted into the optimum stop sequence.
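One common way to cast route selection as an integer program is set partitioning: choose the cheapest set of candidate routes that serves every customer exactly once. The customers, routes and costs below are hypothetical, and brute-force enumeration stands in for a real MIP solver since the instance is tiny:

```python
import itertools

# Toy set-partitioning model: each candidate route covers a subset of
# customers at a cost; pick the cheapest combination of routes that
# serves every customer exactly once (no gaps, no overlaps).
CUSTOMERS = {"A", "B", "C", "D", "E"}
ROUTES = {  # route name -> (customers covered, annual cost in $k)
    "r1": ({"A", "B"}, 10),
    "r2": ({"C", "D"}, 9),
    "r3": ({"E"}, 4),
    "r4": ({"A", "B", "C"}, 14),
    "r5": ({"D", "E"}, 8),
}

def best_plan():
    """Exhaustively check every subset of routes; keep the cheapest
    exact partition of the customer set."""
    best, best_cost = None, float("inf")
    for k in range(1, len(ROUTES) + 1):
        for combo in itertools.combinations(ROUTES, k):
            covered = set().union(*(ROUTES[r][0] for r in combo))
            overlaps = sum(len(ROUTES[r][0]) for r in combo) != len(covered)
            cost = sum(ROUTES[r][1] for r in combo)
            if covered == CUSTOMERS and not overlaps and cost < best_cost:
                best, best_cost = combo, cost
    return best, best_cost

plan, cost = best_plan()
print(plan, cost)  # two broad routes beat three narrow ones here
```

A production engine solves the same structure with integer programming rather than enumeration, which is what makes enterprise-scale instances tractable in seconds.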

As one would imagine, applying these fundamentally different approaches to the same delivery problem should produce vastly different answers. To measure the magnitude of the difference, each approach was tested with a leading consumer products company delivering product to residential customers. Company personnel were already manually planning daily routes, so that approach became the baseline for the analysis. To ensure an "apples-to-apples" comparison, volumes, customer locations and delivery constraints were held constant across the three scenarios.

In the United States, a typical route costs $125,000-$150,000 annually to operate (vehicle, driver, fuel and insurance). Therefore, the business benefit of the centralized approach is compelling.
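The arithmetic behind that claim is straightforward: every route eliminated saves its full annual operating cost. The per-route cost range comes from the figure above; the number of routes trimmed is a hypothetical example:

```python
# Per-route annual operating cost range from the text above.
ANNUAL_COST_LOW, ANNUAL_COST_HIGH = 125_000, 150_000

def annual_savings(routes_cut: int) -> tuple[int, int]:
    """Low and high estimates of yearly savings from eliminating routes."""
    return routes_cut * ANNUAL_COST_LOW, routes_cut * ANNUAL_COST_HIGH

# Hypothetical example: optimization trims 5 routes from a fleet.
low, high = annual_savings(5)
print(f"${low:,} - ${high:,} per year")  # $625,000 - $750,000 per year
```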

The cost-containment challenge for distribution-intensive companies never really ends. Today the hot topic is fuel costs; tomorrow it may be insurance premiums, GPS devices or driver compensation.

However, the heart of transportation effectiveness remains constant--minimizing routes and miles.

Centralized optimization coupled with next generation technology enables many companies to reduce transportation costs.
http://www.scientific-logistics.com
