Delivering customer value faster through data and analytics with Hewlett Packard Enterprise
NashTech helps HPE not only build customer value, but also gain momentum for analytics transformation
HPE's mission is to tell customers how to prevent problems and optimise their experience, drawing on the 20 billion sensors deployed in data centers all around the globe.
Learn how HPE built real-time data pipelines to power the customer experience with AI and predict and prevent problems.
- How we bridged the gap between data and insight
- How we handled tradeoffs between data latency, volume, transformation, and integration
- How we evolved a classic, batch-mode big data architecture to deliver value faster
- The increased elasticity and resilient self-healing required to deliver big data at speed
As a recognised industry innovator, HPE is paving the path for faster, smarter data center infrastructure solutions by adding near real-time insights to the InfoSight predictive analytics platform.
The Data Platform Architect for HPE InfoSight is busier than ever these days. “I’m swamped with phone calls,” Jeff Dutton, InfoSight Data Platform Architect at HPE, says excitedly. “Everyone wants to know more about getting value from their data in real-time.”
This surge in demand for real-time analytics across HPE products is coming not only from InfoSight customers; it is also coming from other business units inside HPE. Customers, both external and internal, are hungry to learn how to leverage streaming edge-to-core-to-cloud data pipelines for their use cases. Streaming data pipelines allow almost any organisation to deliver higher value, faster, to customers.
In this case, InfoSight is transforming the customer experience with AI for the data center to predict and prevent problems.
Delivering on the real-time element of the customer experience, however, necessitated a major overhaul to the InfoSight architecture, which was one of the most critical lessons learned by Dutton and his team.
“Moving to a near real-time user experience requires an infrastructure that self-heals, scales massively and is always available to process streaming workloads, no matter what,” says Dutton.
A worthy challenge: Delivering customer value faster
HPE’s vision for helping customers is to slash the time it takes to turn ideas into value, which is transforming industries, markets and lives. For InfoSight, delivering on this vision means delivering customer value in seconds and minutes, rather than hours or days.
HPE InfoSight was already recognised as a category leader based on its ability to monitor infrastructure, predict possible problems and recommend ways to enhance performance.
It works by receiving infrastructure information from “call home” sensors and running analytics against the massive amount of usage data it has accumulated over the years as well as incoming sensor data. Located in the storage devices themselves, these sensors also collect network, compute, and hypervisor data.
HPE has over 20 billion sensors deployed in data centers all around the globe sending trillions of metrics each day to InfoSight, providing analytics on petabytes of telemetry data. “We’re watching what’s going on in the environment and putting the data through expert systems using various AI and machine learning techniques,” says Dutton. “The goal is to tell the customer what to do to prevent problems and optimise their experience.”
The next challenge: delivering insights on all of this, faster.
The IT Challenge: Introducing stream processing to close the gap between data and insight
The speed at which Dutton and the team could deliver insights to customers was gated by the time it took to run batch processes against very large, extremely complex data sets.
As an initial experiment, the team migrated from an enterprise database to Apache Spark to see if faster batch was the answer. They quickly understood they would need to transition to work with the data in real time to realise their business vision. This meant introducing stream processing.
“Batch analytics is something nearly every developer and architect understands fundamentally, regardless of their tech stack,” notes Dutton. “The streaming part of the journey has been the most challenging.”
Stream processing solutions compute on data directly as it is produced or received. This inverts the batch-mode paradigm of storing data in a database, a file system, or another form of mass storage, where applications query or compute over the data at rest. With stream processing, the application logic, analytics and queries exist continuously, and data flows through them continuously.
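The contrast can be sketched in a few lines of Scala (the language the case study names for the InfoSight pipeline). Everything here is a hypothetical illustration of the two paradigms, not InfoSight code: the metric shape, the alert threshold, and the running-average logic are all assumptions.

```scala
// Minimal sketch: batch vs. stream processing of device metrics.
// All names and thresholds are illustrative, not InfoSight internals.
object BatchVsStream {

  case class Metric(deviceId: String, latencyMs: Double)

  // Batch mode: data sits at rest; a query runs over the whole set later.
  def batchAverage(storedMetrics: Seq[Metric]): Double =
    storedMetrics.map(_.latencyMs).sum / storedMetrics.size

  // Stream mode: the logic lives continuously; each metric is processed
  // on arrival, so an alert can fire the moment a reading crosses the
  // threshold instead of waiting for the next batch run.
  def streamProcess(metrics: Iterator[Metric], alertAbove: Double): List[String] = {
    var count = 0
    var runningAvg = 0.0
    val alerts = scala.collection.mutable.ListBuffer.empty[String]
    for (m <- metrics) {
      count += 1
      runningAvg += (m.latencyMs - runningAvg) / count // incremental mean
      if (m.latencyMs > alertAbove)
        alerts += s"ALERT ${m.deviceId}: ${m.latencyMs} ms"
    }
    alerts.toList
  }

  def main(args: Array[String]): Unit = {
    val data = Seq(Metric("array-1", 4.0), Metric("array-2", 95.0), Metric("array-1", 6.0))
    println(f"batch average: ${batchAverage(data)}%.1f ms")
    streamProcess(data.iterator, alertAbove = 50.0).foreach(println)
  }
}
```

The batch function only answers after the whole data set is available; the streaming function carries its state forward and reacts per event, which is the "logic exists continuously, data flows through it" idea in miniature.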
“Batch continues to play an important role in the architecture, leveraging several petabytes of historical data,” says Dutton. “The source for near real-time insights, however, becomes the stream.”
The Solution: The next leap forward: Big Data at speed
HPE is collaborating with NashTech to design, implement, and operate a reference architecture based on InfoSight, which is fast becoming a blueprint for other HPE product lines.
Delivering insights on streaming datasets necessitated a new approach for Dutton and the team. Unlike batch, data services in a dynamic, streaming application must run forever and be able to scale up, as well as down, on demand. Here, up and down are measures of size: the actual scaling is horizontal, across multiple nodes, rather than vertical, within a single machine.
In addition to scaling elastically, the streaming application must be able to recover from failures quickly. A logical data service may fail over to backup instances, and failed service instances may be restarted automatically. This means that failure must be a first-class, expected concept in the software: failures are normal, anticipated, handled, and, ideally, self-healed within the application.
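The "failure as a first-class concept" idea can be sketched as a tiny supervisor that restarts a failed task a bounded number of times. This is a minimal illustration of the pattern only; the names and the bare retry policy are assumptions, not NashTech's or HPE's framework, which would add backoff, logging, and failover to backup instances:

```scala
// Sketch of failure as a first-class concept: a minimal supervisor that
// restarts a flaky task until it succeeds or the restart budget runs out.
// Illustrative only; real frameworks add backoff, logging, and failover.
object Supervisor {

  // Run `task`; on failure, restart it, up to `maxRestarts` times.
  // The result is an explicit value: success or the final failure.
  def supervise[A](maxRestarts: Int)(task: () => A): Either[Throwable, A] = {
    def attempt(restartsLeft: Int): Either[Throwable, A] =
      try Right(task())
      catch {
        case _: Exception if restartsLeft > 0 => attempt(restartsLeft - 1)
        case e: Exception                     => Left(e)
      }
    attempt(maxRestarts)
  }

  def main(args: Array[String]): Unit = {
    var calls = 0
    val result = Supervisor.supervise(maxRestarts = 3) { () =>
      calls += 1
      if (calls < 3) throw new RuntimeException("transient failure")
      "processed"
    }
    println(result) // recovered transparently after two automatic restarts
  }
}
```

The key design point is that failure is part of the return type rather than an unhandled exception: callers are forced to treat a crash as a normal, anticipated outcome.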
Dutton turned to NashTech for the elasticity and resilient self-healing required to deliver big data at speed.
NashTech includes microservices frameworks for processing continuous application logic; multiple streaming engines for handling tradeoffs between data latency, volume, transformation, and integration; machine learning and deep learning expertise for applying algorithms; and intelligent management and monitoring for reducing the risk of running this always-on system in production.
“NashTech has helped Hewlett Packard Enterprise migrate InfoSight IoT analytics to a streaming architecture based on Scala and the SMACK stack,” shares Dutton. “Very few companies have trained teams focused on these cutting-edge technologies like NashTech, and none can match the value that NashTech provides. I have tremendous trust and respect for Vikas (Global CTO, NashTech), and my team is excited to continue developing our partnership with the NashTech team.”
Back to Dutton’s phone ringing off the hook: NashTech’s Fast Data services, leveraging the Lightbend Fast Data Platform, together with Dutton’s approach, are fast becoming a reference for other HPE business units looking to deliver similar real-time customer value. It makes such sound business sense that HPE Storage and HPE Pointnext are teaming up to bring the streaming solution to HPE’s top customers.
“NashTech has helped Hewlett Packard Enterprise migrate InfoSight IoT analytics to a streaming architecture based on Scala and the SMACK stack. Very few companies have trained teams focused on these cutting-edge technologies like NashTech, and none can match the value that NashTech provides.”
Jeff Dutton, InfoSight Data Platform Architect – Hewlett Packard Enterprise
Read more case studies
Explore how NashTech helped deliver digital shelf analytics and unlock growth with a world-leading data insights and eCommerce solutions provider.
By working closely and collaboratively with the NashTech development team in Vietnam, they were able to build a high-quality, digital-first luxury rental car service. Looking ahead into the future,...