
How we have delivered value by leveraging our platform and solutions capabilities

Case Studies

IDC estimates that the amount of useful data worldwide will increase 20x by 2020, and that 77% of enterprise data will be unstructured. Over 40% of organizations have more than 10 data sources, spanning applications, data warehouses, and customer data systems, that must be integrated to run the business competitively.

Big data use cases that deliver measurable results for companies include:

Data Warehouse optimization and modernization

  • ELT/ETL on Hadoop
  • Replication on Hadoop

Enhanced view of Customer

  • Correlating social data with CRM data for real-time sentiment analysis

Internet of Things (IoT) applications

  • Harnessing sensor data and building new applications

ETL on Hadoop

Increasing subscriber revenue for telecom operators

Need

A leading provider of value-added software solutions to telecom operators, serving more than 450 million mobile subscribers, wanted to improve their processes and leverage real-time information to drive innovative campaigns for operators.

Technology Context

The existing EDW was implemented using databases running on expensive RAID-enabled infrastructure, and the architecture inherently limited ETL performance. Given the cost and performance of that infrastructure, they chose to limit the amount of historical information stored, so the metrics they were generating did not offer the best insights for effective and timely decision-making.

Solutions

We complemented the existing EDW with Hadoop and Storm. Using Apervi Conflux, we developed and deployed workflows to offload processing from the existing EDW to Hadoop, and implemented real-time workflows leveraging Storm.

Approach
Hadoop Use Case

Apervi installed a multi-node Hadoop cluster and a Storm cluster to complement the existing EDW. Using Apervi Conflux, we created and deployed over 25 data integration workflows in two months to calculate over 70 KPIs, including ARPU (average revenue per user), GPRS usage, voice usage, and SMS usage. Some of these workflows run daily, others weekly or monthly. We also created real-time workflows, for instance to target roaming customers.
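
The batch side of this work is easiest to picture as a classic MapReduce aggregation. The sketch below shows how a per-subscriber revenue figure (the input to ARPU) could be computed on Hadoop; the comma-separated CDR layout, with the subscriber ID in the first field and the charged amount in the fourth, is a hypothetical example. Apervi Conflux assembles equivalent workflows rather than requiring hand-written jobs, so this is purely illustrative.

    // Illustrative MapReduce job: total charges per subscriber from CDR files.
    // Field positions are assumptions, not the operator's actual schema.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class ArpuJob {

      // Map each call-detail record to (subscriberId, chargedAmount).
      public static class CdrMapper extends Mapper<LongWritable, Text, Text, DoubleWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
            throws IOException, InterruptedException {
          String[] fields = value.toString().split(",");
          // Assumed layout: fields[0] = subscriberId, fields[3] = charged amount.
          ctx.write(new Text(fields[0]), new DoubleWritable(Double.parseDouble(fields[3])));
        }
      }

      // Sum charges per subscriber; averaging over the subscriber base yields ARPU.
      public static class RevenueReducer extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text subscriber, Iterable<DoubleWritable> charges, Context ctx)
            throws IOException, InterruptedException {
          double total = 0;
          for (DoubleWritable c : charges) total += c.get();
          ctx.write(subscriber, new DoubleWritable(total));
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "arpu-per-subscriber");
        job.setJarByClass(ArpuJob.class);
        job.setMapperClass(CdrMapper.class);
        job.setReducerClass(RevenueReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }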

Outcomes and Value-add
  • The project was completed with a 5X reduction in implementation cost
  • The entire ETL process now completes in 15 minutes, down from 8 hours
  • Business users are empowered to create new KPIs and change targeting conditions without IT involvement
  • Apervi Conflux’s real-time streaming support allows dynamic KPI updates, so the company can target subscribers with innovative campaigns in real time (a sketch of this real-time path follows this list)
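
To illustrate the real-time path, the sketch below shows a Storm bolt (written against the Storm 2.x org.apache.storm API) that flags roaming subscribers as usage events stream in, of the kind a roaming campaign workflow would use. The field names subscriberId, homeNetwork, and servingNetwork are assumptions for illustration, not the operator's actual schema.

    // Sketch of a Storm bolt that flags roaming subscribers in real time so
    // campaign messages can be triggered immediately downstream.
    import java.util.Map;
    import org.apache.storm.task.OutputCollector;
    import org.apache.storm.task.TopologyContext;
    import org.apache.storm.topology.OutputFieldsDeclarer;
    import org.apache.storm.topology.base.BaseRichBolt;
    import org.apache.storm.tuple.Fields;
    import org.apache.storm.tuple.Tuple;
    import org.apache.storm.tuple.Values;

    public class RoamingDetectionBolt extends BaseRichBolt {
      private OutputCollector collector;

      @Override
      public void prepare(Map<String, Object> conf, TopologyContext ctx, OutputCollector collector) {
        this.collector = collector;
      }

      @Override
      public void execute(Tuple event) {
        String subscriber = event.getStringByField("subscriberId");
        String home = event.getStringByField("homeNetwork");
        String serving = event.getStringByField("servingNetwork");
        // A subscriber attached to a foreign network is roaming: emit them
        // as a candidate for a targeted roaming campaign downstream.
        if (!home.equals(serving)) {
          collector.emit(new Values(subscriber, serving));
        }
        collector.ack(event);
      }

      @Override
      public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("subscriberId", "servingNetwork"));
      }
    }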

Event Processing on Storm

Improving asset utilization in hospitals

Need

An established company that provides intelligent wireless technology wanted to develop big data solutions to help hospitals maximize their investments and minimize their operational costs. They wanted to leverage Hadoop and Storm to build an asset-tracking and asset-utilization solution.

Technology Context

The company already had wireless infrastructure in place at a hospital chosen to pilot the project. There, movable and fixed medical assets were tagged with sensors that emit messages, which are picked up by access points monitoring the different zones of the hospital; a zone could be, for instance, a hospital room. The access points send specific messages to a controller, such as the current location of an asset or the loss of a signal.
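
To make the later sketches concrete, the minimal event model below captures the message flow just described. The type and field names are our assumptions, not the vendor's actual wire format.

    // Illustrative event model for the sensor traffic described above.
    public final class AssetEvent {
      public enum Type { LOCATION_UPDATE, SIGNAL_LOST, LOCATION_QUERY_RESPONSE }

      public final Type type;
      public final String assetId;   // tag attached to the medical asset
      public final String zoneId;    // e.g. a hospital room covered by an access point
      public final long timestampMillis;

      public AssetEvent(Type type, String assetId, String zoneId, long timestampMillis) {
        this.type = type;
        this.assetId = assetId;
        this.zoneId = zoneId;
        this.timestampMillis = timestampMillis;
      }
    }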

Solutions

Apervi was brought in as a partner with expertise in developing solutions on Storm. Our solution extended the architecture to seamlessly support both real-time and batch data processing, leveraging message-specific queues to capture events and taking different processing paths based on business objectives.
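
A minimal sketch of that routing idea appears below: each event type lands on its own queue, so time-sensitive types can be drained by the real-time (Storm) path while the rest feed the batch path. The in-process BlockingQueues here stand in for whatever messaging system is actually deployed, and the AssetEvent type is the one sketched above.

    // Sketch of message-specific queues: events are enqueued per type so the
    // real-time and batch consumers can process them on different schedules.
    import java.util.EnumMap;
    import java.util.Map;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class EventRouter {
      private final Map<AssetEvent.Type, BlockingQueue<AssetEvent>> queues =
          new EnumMap<>(AssetEvent.Type.class);

      public EventRouter() {
        for (AssetEvent.Type t : AssetEvent.Type.values()) {
          queues.put(t, new LinkedBlockingQueue<>());
        }
      }

      // The controller hands every event here to be queued by type.
      public void route(AssetEvent event) throws InterruptedException {
        queues.get(event.type).put(event);
      }

      // Consumers on the real-time or batch path subscribe to the types they need.
      public BlockingQueue<AssetEvent> queueFor(AssetEvent.Type type) {
        return queues.get(type);
      }
    }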

Approach
Storm Use Case

The core solution architecture included several technologies: Storm, Cassandra, messaging systems, and relational databases. Our Storm topology correlated different types of events and messages from the sensors to determine the real-time location of assets, while also identifying assets needing maintenance. We leveraged Cassandra’s columnar architecture, scalability, and low latency for objectives that required real-time data. Sensor events were augmented with related data retrieved from relational databases to determine asset utilization and possible asset loss, and to reduce downtime.
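
The correlation logic itself can be pictured as simple state keyed by asset: remember the last zone reported for each asset, and raise an alert when its signal is lost. The plain-Java sketch below illustrates this; in the deployed solution this state lives in Cassandra rather than in memory, and the logic runs inside a Storm bolt.

    // Plain-Java sketch of the correlation the Storm topology performs:
    // track the last known zone per asset and flag assets whose signal is
    // lost, so staff can check for misplacement or maintenance needs.
    import java.util.HashMap;
    import java.util.Map;

    public class AssetTracker {
      private final Map<String, String> lastZone = new HashMap<>();

      // Returns a human-readable alert when an event needs attention, else null.
      public String correlate(AssetEvent event) {
        switch (event.type) {
          case LOCATION_UPDATE:
            lastZone.put(event.assetId, event.zoneId);   // near-real-time location
            return null;
          case SIGNAL_LOST:
            String zone = lastZone.getOrDefault(event.assetId, "unknown");
            return "Asset " + event.assetId + " lost signal; last seen in zone " + zone;
          default:
            return null;
        }
      }
    }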

Outcomes and Value-add
  • We helped create a value-added solution that the company can market to healthcare organizations to generate incremental revenue
  • Assets can now be tracked in near real time, enabling better utilization
  • Proactive notifications help improve asset maintenance and extend useful life