HPE plus MapR: Too much Hadoop, not enough cloud

Cloud killed the fortunes of the Hadoop trinity – Cloudera, Hortonworks, and MapR – and that same cloud likely won't rain success down on HPE, which recently acquired the business assets of MapR. Though the deal promises to marry "MapR's technology, intellectual property, and domain expertise in artificial intelligence and machine learning (AI/ML) and analytics data management" with HPE's "Intelligent Data Platform capabilities," it is missing the one ingredient both companies need most: cloud.

The problem, in other words, is not that MapR lacked smart people and great technology, as Wikibon analyst James Kobielus notes. No, the problem is that MapR remained far too Hadoop-y and not nearly cloudy enough in a world of "fully integrated [cloud-first] offerings that have a lower cost of acquisition and are cheaper to scale," as Diffblue CEO Mathew Lodge has said. In short, MapR may expand HPE's data assets, but it doesn't make HPE a cloud contender.

Why cloud matters

Yes, hybrid is still a thing, and will remain so for years to come. However much enterprises may want to move workloads to a cloudy future, 95 percent of IT remains firmly planted in private data centers. New workloads tend to go cloud, but there are literally decades of workloads still running on-premises.

But this hybrid world, which HPE pitches so loudly ("innovation with hybrid," "from edge to cloud," "harness the power of data wherever it lives," etc.), has not been as big a deal in big data workloads. Part of the reason comes down to a reliance on old-school models like Hadoop, "built to be a giant single source of data," as Amalgam Insights CEO Hyoun Park has said. That is a cumbersome model, especially in a world where big data is born in the cloud and wants to stay there, rather than being shipped to on-premises servers. Can you run Hadoop in the cloud? Of course. Companies like AWS do exactly that (Elastic MapReduce, anyone?). But arguably even Hadoop in the cloud is a losing strategy for many big data workloads, because it simply doesn't fit the streaming-data world we now live in.

And then there is the on-premises problem. As AWS data science leader Matt Wood has explained, cloud elasticity is essential to doing data science right:

Those that go out and buy expensive infrastructure find that the problem scope and domain shift really quickly. By the time they get around to answering the original question, the business has moved on. You need an environment that is flexible and allows you to quickly respond to changing big data requirements. Your resource mix is continually evolving – if you buy infrastructure, it's almost immediately irrelevant to your business because it's frozen in time. It's solving a problem you may not have or care about any more.

MapR had made attempts to move beyond its on-premises Hadoop past, but arguably too little, too late.

Brother, can you spare a cloud?

That brings us back to HPE. In 2015 the company dropped its public cloud offering, choosing instead to "double-down on our private and managed cloud capabilities." That may have seemed reasonable back when OpenStack was still breathing, but it pigeonholed HPE as a mostly on-premises vendor trying to partner its way to public cloud relevance. It isn't enough.

Whereas Red Hat, for example, can credibly claim deep assets in Kubernetes (Red Hat OpenShift) that help enterprises build for hybrid and multi-cloud scenarios, HPE cannot. It has tried to get there through acquisition (e.g., BlueData for containers), but it simply lacks a cohesive product portfolio.

More worryingly, every major public cloud vendor now has a solid hybrid offering, and enterprises looking to modernize will often choose the cloud-first vendor that also has experience in private data centers, rather than betting on legacy vendors with aspirations of public cloud relevance. For Google, it's Anthos. For Microsoft Azure, hybrid was central to the company's product offering and marketing from the start. And AWS, which at one time eschewed private data centers, has built a raft of hybrid services (e.g., Snowball) and partnerships (VMware) to help enterprises have their cloud cake and eat their private data centers, too.

Enter MapR, with its contrarian, proprietary approach to the open source Hadoop market. That approach won it some key converts, but it never gained a broad-based following. Great tech? Sure. Cloudy DNA and products? Nope.

In sum, although I hope the union of HPE and MapR will yield happy, cloudy enterprise customers, this "doubling-down" by HPE on technology assets that keep it firmly grounded on-premises doesn't hold much promise. Big data belongs in the cloud, and cloud isn't something you can buy. It's a different way of operating, a different way of thinking. HPE didn't get that DNA with MapR.