By Charles King, Pund-IT, Inc. February 15, 2017
The increasing momentum around augmented and artificial intelligence has underscored the value of related machine learning processes and deep learning frameworks. That’s natural enough given the important roles those technologies play in developing analytical models and delivering cogent insights that organizations can use to their benefit.
However, less attention has been paid to some of the practical processes required to fuel machine learning and deliver its insights. For example, like other analytically intensive processes, the value of machine learning solutions depends in large part on the volume of information they can access and use.
In other words, the more, the better; and we are not talking about mere tens or hundreds of gigabytes of data, but multiple hundreds of terabytes and even petabytes. Similarly, it is seldom immediately apparent which algorithm is right for a given machine learning project. Disagreements may also arise over which language or framework to use, resulting in lengthy, potentially costly delays.
These issues have significant, real-world effects that can impact a project’s value and viability, but so do others. Moving massive data files into analytics systems takes time and effort, and can also introduce inadvertent human errors and security risks. Similarly, the days or weeks that data scientists spend developing, testing and retooling an analytic model could be wasted if a better model already exists.
For these reasons, IBM’s new Machine Learning platform and the decision to initially deliver it for the company’s zSystem mainframe solutions should be greeted warmly by enterprise customers and the data science community.
Go where the data lives
Why is that the case? First and foremost because IBM Machine Learning (which was extracted from the company’s Watson cognitive platform) can utilize the vast majority of the world’s business information that already resides in zSystem infrastructures. That means that the banks, government agencies, insurers, retailers and transportation companies that already depend on IBM’s zSystem to perform billions of transactions daily can get a jump on machine learning projects quickly and without needing to make significant additional investments.
Those same companies can use IBM Machine Learning to seamlessly train models on historical and current information, then perform real-time transaction scoring. That makes IBM Machine Learning valuable for and applicable across a wide range of scenarios, including dynamic retail sales forecasting, financial analysis and recommendations, complex supply chain management and personally tailored healthcare services.
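To make the train-then-score pattern described above concrete, here is a minimal, hypothetical sketch in pure Python. It is not IBM's API (the platform itself targets Scala on Spark); it simply illustrates fitting a simple risk model on historical transactions, then scoring new transactions as they arrive. All names and the toy data are invented for illustration.

```python
# Hypothetical illustration of "train on history, then score in real time."
# A tiny logistic-regression model fit by stochastic gradient descent;
# a stand-in for the kind of model a machine learning platform would train.
import math

def train(history, epochs=2000, lr=0.1):
    """Fit logistic-regression weights on (features, label) pairs."""
    n = len(history[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in history:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # predicted probability
            err = p - y                          # gradient of log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def score(model, x):
    """Return a 0-1 risk score for a single incoming transaction."""
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy history: [amount in thousands, overseas flag], label = was flagged.
history = [([0.2, 0], 0), ([0.5, 0], 0), ([9.0, 1], 1), ([7.5, 1], 1)]
model = train(model := None) if False else train(history)

print(score(model, [8.0, 1]) > 0.5)   # large overseas transaction: high risk
print(score(model, [0.3, 0]) < 0.5)   # small domestic transaction: low risk
```

In a production setting the training step would run periodically against the full transaction history, while the scoring step would be invoked inline with each transaction, which is precisely the split that co-locating the platform with the data is meant to accelerate.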
As an example, IBM noted that Argus Health (one of DST System’s service companies) is evaluating how the technology can better manage pharmacy costs by leveraging up-to-the-minute pricing and predictive data in point-of-sale transactions. Over time, Argus Health hopes to use IBM Machine Learning in various scenarios to enable real-time results that benefit members, their caregivers and physicians.
IBM has also incorporated features that underscore the new platform’s usability and value, including current support for the popular Scala language on Spark (with other languages and frameworks to follow), as well as any transactional data type. In addition, IBM Machine Learning will support Cognitive Automation for Data Scientists, a new solution from IBM Research that scores a customer’s data and project requirements against available algorithms, then recommends the best match.
IBM Machine Learning will be available in March as a private cloud offering on zSystem mainframe infrastructures, and will come to IBM’s Power Systems next. A Machine Learning-based offering called the IBM Data Science Experience will come to IBM Cloud in the near future and support hybrid implementations.
Why is any of this important? For three reasons. First, the new platform demonstrates the kinds of unique synergies that IBM is able to capture through its continuing strategic investments. Those include the company’s internal organic R&D efforts and external acquisitions in advanced analytics, as well as the ongoing evolution of the IBM zSystem mainframe platform.
Second, IBM Machine Learning should deliver significant benefits to, and help create new business opportunities for, both IBM and its enterprise customers, which is no small matter. Finally, it highlights the continuing evolution and undiminished enterprise value of IBM’s zSystem mainframe and Power Systems solutions. That is an important point at a time when so many IT vendors have abandoned homegrown innovation for off-the-shelf, industry-standard components.
In essence, IBM Machine Learning appears to be a solid offering that should prove valuable for many of the company’s customers. Given the growing interest in leveraging machine learning for advanced analytics in commercial business scenarios, along with the platform’s initial affinity for zSystem environments and its eventual availability on Power Systems and IBM Cloud, uptake among the company’s enterprise clientele is likely to be brisk.
Moreover, while other machine learning platforms exist and other vendors are focusing considerable energies on this market, none is likely to be able to emulate, let alone replicate, IBM Machine Learning.
© 2017 Pund-IT, Inc. All rights reserved.