Big Data Analytics at UPS — A Case Study

Sagar Sahoo
Feb 10, 2021 · 5 min read


Introduction — About UPS

United Parcel Service (UPS) is an American multinational package delivery and supply chain management company founded in 1907. With annual revenues of about $71 billion, the company offers forwarding and logistics services in more than 175 countries, including worldwide supply chain management, freight forwarding and distribution, customs brokerage, mail services, and consulting.

According to UPS's 2019 press release, global delivery volume reached 5.5 billion packages and documents that year, serving 11 million customers. The delivery fleet includes about 125,000 package cars, vans, and other advanced-technology vehicles, plus close to 500 owned and leased jets. UPS also handles nearly 300 million tracking requests per day. How does the company manage operations at such a scale?

The Challenge — Data Storage & Route Optimization

In 2013, UPS claimed to house the world's largest DB2 relational database, spread across two US-based data centers, to handle its global operations. As those operations grew, and the volume and variety of data kept increasing, the company needed its data infrastructure to keep pace with the business and its customers. Juan Perez, Chief Information and Engineering Officer, said UPS is using technology to improve its flexibility, capability, and efficiency. The aim is to take all the various forms of data it collects and leverage the information in them to make much better business decisions.

One of the key problems UPS wanted to solve was route optimization. According to an interview with UPS CTO Dave Barnes, shaving just one mile per driver per day could save about 1.5 million gallons of fuel per year, worth roughly $50 million annually.
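As a rough sanity check on that claim, the back-of-envelope arithmetic below assumes about 55,000 daily delivery routes, 250 working days per year, and roughly 10 miles per gallon for a package car. These inputs are illustrative assumptions, not figures from UPS.

```python
# Back-of-envelope check of the "one mile per driver per day" claim.
# All inputs are illustrative assumptions, not official UPS figures.
drivers = 55_000          # assumed number of daily delivery routes
working_days = 250        # assumed working days per year
miles_saved_per_day = 1   # the one-mile-per-driver figure from the claim
mpg = 10                  # assumed fuel economy of a package car

miles_saved = drivers * working_days * miles_saved_per_day
gallons_saved = miles_saved / mpg

print(f"miles saved per year:   {miles_saved:,}")       # ~13.8 million miles
print(f"gallons saved per year: {gallons_saved:,.0f}")  # ~1.4 million gallons
```

The gallons line up with the quoted 1.5 million; the $50 million figure presumably also folds in driver time and vehicle wear rather than fuel alone.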

The average UPS driver makes about 120 to 140 stops per day. If you do the math, the number of possible stop orderings is astronomical: 120! alone is roughly 6.7 × 10^198, a 199-digit number. No driver could possibly think through all of those combinations.
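A two-line check makes that combinatorial explosion concrete: even ignoring traffic, time windows, and every other real-world constraint, 120 stops can be ordered in 120! different ways.

```python
import math

stops = 120
orderings = math.factorial(stops)  # number of ways to order 120 stops
print(f"120! has {len(str(orderings))} digits")  # 199 digits
print(f"120! ~= {orderings:.2e}")                # about 6.69e+198
```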

The setback, however, was the sheer volume of data UPS generates, along with the variety and velocity of data collected from various sources. The data was spread across DB2, corporate repositories, local systems, and spreadsheets. UPS needed to solve this data infrastructure problem first before it could work on route optimization.

The Solution — Big Data & ORION

A big data ecosystem handles the four V's of data effectively: volume, veracity, velocity, and variety. UPS began its transition into this ecosystem by experimenting with Hadoop clusters and integrating its storage and computing systems into them. The company upgraded its storage and computing capacity to process petabytes of data, one of the biggest technological advancements at UPS.

A Hadoop ecosystem spanning its various business operations would have given UPS access to a large amount of previously opaque data. Several Hadoop components, such as HDFS for storage, MapReduce for distributed processing, Kafka for streaming, Sqoop (SQL-to-Hadoop) for ingestion, Hive and Pig for structured querying over semi-structured data, and monitoring systems for the DataNodes and NameNode, would have opened up multiple ways to improve operational efficiency and better understand customer needs. Because of confidentiality, UPS has not disclosed the exact tools and technologies in its big data ecosystem, but given the time frame we can make a fair guess at what they might be using; a small sketch of one such component follows below.
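As a hedged illustration of how such a pipeline might look, here is a minimal Hadoop Streaming style mapper and reducer in Python that counts package scans per delivery route. The record layout (tab-separated fields with the route ID in the second column) is an assumption for the sketch, not UPS's actual schema.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming sketch: count package scans per route.

Assumes tab-separated scan records with the route ID in the second
column; this schema is illustrative, not UPS's actual data layout.
"""
import sys
from itertools import groupby


def mapper(lines):
    """Emit (route_id, 1) for every scan record."""
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) > 1:
            yield fields[1], 1


def reducer(pairs):
    """Sum counts per route_id; input must arrive sorted by key,
    as Hadoop's shuffle phase guarantees."""
    for route_id, group in groupby(pairs, key=lambda kv: kv[0]):
        yield route_id, sum(count for _, count in group)


if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    if mode == "map":
        for key, value in mapper(sys.stdin):
            print(f"{key}\t{value}")
    else:  # reduce
        pairs = (line.rstrip("\n").split("\t") for line in sys.stdin)
        for key, total in reducer((k, int(v)) for k, v in pairs):
            print(f"{key}\t{total}")
```

On a real cluster, a script like this would be launched through the Hadoop Streaming jar with its -mapper and -reducer options, with Sqoop pulling the source tables out of DB2 into HDFS beforehand.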

Jack Levis, a 36-year UPS veteran, said: "Insight that doesn't lead to a better decision is trivial, so you need to think through what's the decision you're going to make with data." His goal was to use bigger-picture analytics to analyze operational efficiency, spelling out how descriptive, predictive, and prescriptive analytics power the company's daily operations.

The answer to the route optimization problem was ORION, which stands for On-Road Integrated Optimization and Navigation. This was a four-year project whose rollout began in 2013, with approximate expenses of $1 billion per year. Scheduled to be fully deployed in 2016–17, it was viewed as the company's largest operations project and was expected to take route optimization to the next level.
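UPS has not published ORION's internals, but a toy nearest-neighbor heuristic gives a feel for the kind of problem it attacks: ordering a driver's stops to shorten the total route. The coordinates and straight-line distance below are illustrative assumptions, not UPS data, and this is emphatically not ORION's actual algorithm.

```python
"""Toy route-ordering sketch (NOT ORION): nearest-neighbor heuristic.

Given a depot and a set of stops as (x, y) points, repeatedly drive to
the closest unvisited stop. Real systems like ORION use far richer
models (time windows, traffic, business rules); this only illustrates
why ordering 120+ stops is hard and why heuristics are needed.
"""
import math
import random


def distance(a, b):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def nearest_neighbor_route(depot, stops):
    """Return the stops ordered greedily by nearest unvisited stop."""
    route, current = [], depot
    remaining = list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: distance(current, s))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route


def route_length(depot, route):
    """Total distance: depot -> stops in order -> back to depot."""
    points = [depot] + route + [depot]
    return sum(distance(a, b) for a, b in zip(points, points[1:]))


if __name__ == "__main__":
    random.seed(42)
    depot = (0.0, 0.0)
    stops = [(random.uniform(-10, 10), random.uniform(-10, 10))
             for _ in range(120)]  # ~120 stops, like an average UPS route
    greedy = nearest_neighbor_route(depot, stops)
    print(f"greedy route length:   {route_length(depot, greedy):.1f}")
    print(f"original order length: {route_length(depot, list(stops)):.1f}")
```

Even this crude heuristic usually beats visiting stops in an arbitrary order, which hints at how much value a far more sophisticated optimizer like ORION can extract at UPS's scale.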

Results

ORION leveraged the storage and computing capacity of the big data platform, running analytics across more than 300 million data points to deliver tens of thousands of route optimizations per minute based on real-time information. Beyond the economic benefits, ORION has led to a reduction of about 100 million delivery miles and, with them, roughly 100,000 metric tons of carbon emissions.

UPS does not apply its big data ecosystem to route optimization alone. The company is extending it into predictive and prescriptive analytics as well: according to online articles, predictive analytics has saved millions by avoiding unnecessary and unplanned maintenance. Recent developments show UPS using artificial intelligence too, with AI-powered chatbots helping customers find rates and tracking information across a range of platforms. Thanks to all the data collected, the decision to leverage that data, and massive investments in big data, UPS has become one of the largest and most efficient package and logistics companies in the world.

To summarize,

Which company identified the problem and felt the need to migrate its in-house systems to a big data ecosystem? Ans — UPS

Who are the target users directly benefiting from the adoption? Ans — The UPS operations and business teams

Which (functional) target areas benefited? Ans — Operations, specifically planning, scheduling, maintenance, and the delivery fleet

What value do they get from the adoption? Ans — Faster access to large amounts of data, which helps them understand it, make data-driven business decisions, and thereby improve the customer experience and grow the customer base. Implementing ORION on top of the ecosystem helped the delivery fleet save more than 100 million delivery miles.

How did they integrate the data, and which tools were used? Ans — Hadoop components such as HDFS for storage, MapReduce for distributed processing, Kafka for streaming, Sqoop (SQL-to-Hadoop) for ingestion, Hive and Pig for structured querying over semi-structured data, and cluster monitoring systems would have been integral parts of the ecosystem.

What could be future extensions of the implementation? Ans — UPS has already branched into multiple directions since implementing the Hadoop ecosystem, be it ORION, AI-powered chatbots, the UPS My Choice alert system, or network planning tools.

Conclusion — “Big data is a how. It’s not a what”

Levis advised, "It's the business decision that matters. Big data is a how. It's not a what." This rings true: UPS has shown how big data analytics can leverage operational and customer data to improve operations, increase profits, support better decision making, bring out new offerings for customers, and help the organization grow. The leadership team, together with the technology team, should collectively analyze the data at hand, check feasibility and infrastructure, and weigh the risks, budget, and benefits before implementing a big data analytics ecosystem.

References

https://www.datanami.com/2015/10/26/why-big-data-is-a-how-at-ups-not-a-what/

https://www.datanami.com/2013/02/23/ups_delivers_on_prescriptive_analytics/

https://www.ciodive.com/news/ups-unpacks-big-data-to-deliver-insight/530295/

https://erpinnews.com/big-data-case-study-ups-using-analytics-improve-performance
