IBM’s “Explainable” AI – Building Transparency and Trust into Artificial Intelligence

By Charles King, Pund-IT, Inc.  April 10, 2019

Issues of trust seldom arise in discussions about modern computing systems. It’s not that hardware and software are perfect. In fact, publications and online forums contain tens of thousands of posts hashing out the relative merits of various PCs, workstations and servers. But those products have been so commonplace for so long that their essential “rightness” as well as the results they provide are hardly ever questioned.

However, that wasn’t always the case, and a similar dynamic applies to nearly all emerging technical and scientific breakthroughs, including commercial artificial intelligence (AI) solutions designed for businesses and other organizations. Considering the inherent complexity of machine learning, neural networks and other AI-related processes, customers’ confusion about AI isn’t all that surprising. But what can be done to assuage their misgivings and bring AI into the mainstream?

Vendors, including IBM, are tackling the problem with solutions designed to make AI processes and results more explainable, understandable and trustworthy. That should satisfy clients’ doubts and accelerate the adoption of commercial offerings, but explainable AI also yields other significant benefits. Let’s consider why explainable AI is so important and how IBM’s innovations are impacting its customers.

The problem of “black box” AI

A lack of clarity or understanding is usually problematic. When it comes to inexplicable artificial intelligence, three potential issues can arise:

  1. Most importantly, a lack of transparency leaves users uncertain about the validity and accuracy of results. That is, the essential value of AI projects and processes is undermined.
  2. In addition, if AI projects are inexplicable, it’s possible that their results might be contaminated by bias or inaccuracies. Call this a problem that you can’t really be sure you have.
  3. Finally, when AI processes are not explainable, troubleshooting anomalous results is difficult or even impossible. That is, a lack of transparency leaves organizations unable to fix what’s broken.

How are AI-focused vendors addressing these issues? Unfortunately, often with fixes that worsen the situation, including “black box” solutions. These purport to deliver all the benefits of AI but fail to provide adequate transparency into how they work, how customers can determine the accuracy of results or how problematic issues can be addressed.

These solutions also encourage perceptions of AI as a mystery whose capabilities can’t be understood by mere mortals. In other words, whatever modest benefits “black box” AI may offer, leaving customers in the dark is detrimental to their work and goals.

The benefits of explainable AI

Is there a better way to proceed? Absolutely. How can organizations explain AI successfully? With a holistic approach that addresses several stages of the AI lifecycle:

  • Central to making AI projects more explainable is making AI models explainable instead of the black boxes many currently are.
  • Organizations must clearly and transparently articulate where and how AI is being used in business processes and for what ends.
  • They must also allow for analysis of AI outcomes and provide hooks to alter and override those outcomes as necessary.

To these ends, expert users and managers should employ technologies and solutions that vendors have designed to enhance AI transparency. These methodologies can speed the understanding of AI, which benefits any adopter. Transparency is also a critical issue for IT, marketing, sales and customer care organizations, especially those in highly regulated industries, such as banking and insurance.

This process can occur organically, as people experience AI and come to understand how it affects the business and them personally, thus impacting the very culture of an organization. Or it can be pursued proactively with the best tools and solutions currently available. Whichever way a company proceeds, people need to keep in mind the vast potential of AI. Why so? Because a time will come when AI is as essential to an organization’s success as the business technologies that are commonplace today.

The business benefits of explainable AI

Why is explainable artificial intelligence such an important issue and undertaking? It goes to the practical roots of how organizations do business. If they are to adopt and adapt to AI processes, they need to know that results are accurate. Otherwise, how can they assure customers and partners that AI-impacted decisions are valid and dependable? Consider two examples:

  1. In financial services, accurate results are obviously critical for maximizing business outcomes and customer interactions. However, like other businesses in highly regulated industries, banks and other financial organizations must be able to prove that AI-impacted processes comply with government and industry rules or risk significant sanctions and penalties. That’s bad enough, but inexplicable AI might also damage client relationships. If customers seeking loans, credit cards or other services are denied by an AI-related system, company officials must be able to explain why that determination was made and how the client might address or correct problematic issues.
  2. Global supply chain management is another promising area for AI because the complexity, volume and diversity of supply chain data make it extremely difficult for people to effectively track and adjust for real-time changes in demand. AI can enhance Forecast Value Added (FVA) metrics—learning from past successful and unsuccessful forecasts to help planners make better adjustments. But unless supply chain teams can easily monitor the accuracy of AI models, they can’t be certain that systems are really delivering the benefits they promise. (A simple FVA calculation is sketched after this list.)
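
For illustration, here is a minimal sketch of how an FVA check might be computed in practice. The numbers, and the use of MAPE as the error measure, are illustrative assumptions rather than details of IBM’s offering:

    # Minimal sketch of a Forecast Value Added (FVA) check: does a process
    # step (here, an AI-adjusted forecast) beat a naive baseline forecast?
    # All figures are illustrative; MAPE is one common error measure.

    def mape(actuals, forecasts):
        """Mean absolute percentage error, in percent."""
        return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

    actual_demand  = [100, 120, 90, 110]
    naive_forecast = [105, 105, 105, 105]  # e.g., a simple average of past demand
    ai_forecast    = [102, 117, 93, 108]   # the AI-adjusted forecast

    # Positive FVA means the AI step reduced error versus the naive baseline.
    fva = mape(actual_demand, naive_forecast) - mape(actual_demand, ai_forecast)
    print(f"FVA: {fva:+.1f} percentage points")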

In light of these and other points, it’s difficult to see why vendors would develop or customers would consider inexplicable AI solutions.

What IBM is doing to open “black box” AI

IBM is working in numerous areas to develop and deliver explainable AI and advanced analytics solutions. The impetus for the company’s efforts was underscored in a recent blog post by Ritika Gunnar, VP of IBM’s Watson Data and AI organization: “As humans, we’re used to the idea that decisions are based on a chain of evidence and logical reasoning anyone can follow. But if an AI system makes recommendations based on different or unknown criteria, it’s much more difficult to trust and explain the outcomes.”

Central to the company’s efforts is the Watson OpenScale platform that IBM launched in 2018. Designed to “break open the ‘black box’ at the heart” of AI models, Watson OpenScale simplifies AI processes, including detailing how recommendations are being made and automatically detecting and mitigating bias to ensure that fair, trusted outcomes are produced.
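
To make automated bias detection concrete, here is a toy sketch of one widely used fairness check, the disparate impact ratio. This is a generic metric offered for illustration; it is not OpenScale’s internal implementation:

    # Toy illustration of a common bias check: the "disparate impact" ratio,
    # i.e., the favorable-outcome rate for an unprivileged group divided by
    # the rate for the privileged group. Generic metric, not OpenScale code.

    def disparate_impact(outcomes, groups):
        """outcomes: 1 = favorable; groups: 1 = privileged, 0 = unprivileged."""
        unpriv = [o for o, g in zip(outcomes, groups) if g == 0]
        priv = [o for o, g in zip(outcomes, groups) if g == 1]
        return (sum(unpriv) / len(unpriv)) / (sum(priv) / len(priv))

    # A ratio well below 1.0 (a common rule of thumb is < 0.8) flags
    # potential bias against the unprivileged group.
    outcomes = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    groups   = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
    print(f"Disparate impact ratio: {disparate_impact(outcomes, groups):.2f}")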

IBM is leveraging both existing open source technologies and proprietary algorithms developed at IBM Research to explore, enhance and explicate AI decision-making.

  • LIME (Local Interpretable Model-Agnostic Explanations) is a widely used open source algorithm designed to explain individual predictions made by AI systems by approximating the model’s behavior around a given prediction with a simpler, easily interpretable model. (A brief LIME illustration follows this list.)
  • Developed by IBM Research, MACEM (Model Agnostic Contrastive Explanations Method) goes well beyond the capabilities of LIME by identifying both pertinent features that are present in a piece of data and those that are absent, enabling the construction of “contrastive explanations”.
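
As a concrete illustration of the LIME approach, the sketch below uses the open source lime package to explain a single prediction from a stand-in “black box” classifier. The model, feature names and data are hypothetical placeholders:

    # Minimal sketch of explaining one tabular prediction with the open
    # source "lime" package. Model, features and data are placeholders.
    import numpy as np
    from lime.lime_tabular import LimeTabularExplainer
    from sklearn.ensemble import RandomForestClassifier

    # Train a stand-in "black box" model on synthetic loan-style data.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 3))
    y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    explainer = LimeTabularExplainer(
        X_train,
        feature_names=["credit_score", "income", "debt_ratio"],  # hypothetical
        class_names=["deny", "approve"],
        mode="classification",
    )

    # Explain one applicant's prediction: which features pushed it which way?
    explanation = explainer.explain_instance(
        X_train[0], model.predict_proba, num_features=3
    )
    print(explanation.as_list())  # e.g., [('credit_score > 0.68', 0.21), ...]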

One scenario for contrastive explanations is in banking, where they could be used to analyze loan application data. The system would alert the bank to issues, including poor credit ratings, but it could also spot and highlight gaps, such as an incomplete credit report or missing documents. The bank could then notify the customer about the reasons for its decision and provide constructive advice.

In essence, solutions that deliver more accurate, transparent and trustworthy AI results, such as IBM Watson OpenScale, can help businesses make better decisions and enhance their services for and relationships with customers.

Final analysis

People are often concerned about new technologies, especially those that are highly complex or difficult to understand. Overcoming those doubts is central to technologies becoming widely trusted and commercially successful. In fact, without fostering understanding of and insights into emerging technologies, it’s unlikely that new solutions will find a place among the people and organizations they might otherwise benefit.

By making technologies like artificial intelligence and AI-based solutions and services clearly explainable, vendors can reduce the time required for new offerings to enter the mainstream. That’s why explainable AI offerings, like IBM Watson OpenScale, are so important.

By breaking open the “black box at the heart of AI” to make processes and results fully explainable, IBM is aiding its customers and partners and furthering its own market strategies. More importantly, IBM’s explainable AI efforts should help establish the essential “rightness” of these solutions as entirely valid and wholly valuable business technologies.

Overall, IBM’s work in explainable AI should improve the mainstream understanding, acceptance and adoption of artificial intelligence among individuals and organizations worldwide.

© 2019 Pund-IT, Inc. All rights reserved.

Intel Defines “Data-Centric” Leadership with Cascade Lake

By Charles King, Pund-IT, Inc.  April 10, 2019

The tech industry so reflexively worships shiny new startups that folks often forget or even disparage larger, well-established vendors. That’s unfortunate for any number of reasons, but high on the list is that such attitudes miss a critically salient point: Self-reinvention, not stasis, is the key to long-term survival in technology.

PR-minded venture firms continually trumpet the “unique” innovations of whatever IPO-bound company stands to fill their pockets. In contrast, established firms develop and deliver more substantial innovations month after month, year after year, decade after decade. At least the successful ones do.

That point was in clear view during Intel’s recent launch of its 2nd generation Xeon Scalable processors (aka Cascade Lake) and other new “data-centric” solutions. Let’s consider these Intel offerings, why they are important and how they will help keep Intel and its OEM customers riding high in data center markets.

The view from Cascade Lake

So, what exactly did Intel reveal during its launch?

  • General availability of its new 2nd-Generation Xeon Scalable processors (aka Cascade Lake), including over 50 workload-optimized solutions and dozens of custom processors
  • The new offerings include the Xeon Platinum 9200 processor, which sports 56 cores and 12 memory channels. According to Intel, the Xeon Platinum 9200 is designed to deliver the socket-level performance and memory bandwidth required for high performance computing (HPC), artificial intelligence (AI) and high-density data center infrastructures. Both IBM Cloud and Google Cloud announced plans to deliver services based on Xeon Platinum 9200 silicon.
  • Among the custom silicon offerings are network-optimized processors built in collaboration with communications service providers (SPs). These solutions are designed to support more subscriber capacity and reduce bottlenecks in network function virtualized (NFV) infrastructures, including 5G-ready networks.
  • The Xeon D-1600 processor, a system on chip (SoC) designed for dense environments, including edge computing, security and storage applications.
  • New features built into the 2nd-Gen Xeon Scalable processors include integrated Deep Learning Boost (Intel DL Boost) for AI deep learning inferencing acceleration, and hardware-enhanced security features, including protections against side channel attacks, such as Spectre and Foreshadow
  • Next generation 10nm Agilex FPGAs that will support application-specific optimization and customization for edge computing, networking (5G/NFV) and data center applications
  • Support for new memory and storage technologies, including Optane DC persistent (storage-class) memory, Optane SSD DC D4800X (Dual Port) for mission-critical enterprise applications and SSD D5-P4326 (QLC 3D NAND) for read-intensive cloud workloads.

Pricing for the new solutions was not revealed during the launch. Product availability details can be found at intel.com.

Why it matters

Intel’s launch of its 2nd-Generation Xeon Scalable processors and other data center solutions comes at an odd time for the company. Conventional wisdom often depicts Intel as an overly complacent behemoth harassed and bloodied by swifter, more agile (often far smaller) foes. Critics claim the company is reacting too slowly to marketplace shifts and is being surpassed by more innovative technologies and vendors.

In some cases, Intel didn’t do itself any favors. The company unexpectedly dismissed its CEO, Brian Krzanich, then took months to find a new chief executive within its own ranks (Robert Swan, who joined Intel in 2016 as its CFO and led the company on an interim basis after Krzanich departed).

But in other cases, the company was at the top of its game. For example, while discovery of the Spectre, Foreshadow and other side channel vulnerabilities could have been a public relations nightmare, Intel’s willingness to take responsibility and transparently detail its efforts to repair the issues kept the situation from blowing out of proportion.

Despite these challenges, Intel continued to deliver steady, often impressive financial performance. While the company was no investor’s idea of a high flyer, it also had less distance to fall, as well as considerable padding to help mitigate unplanned impacts. The value of Intel’s conservative approach became apparent when GPU-dependent NVIDIA and AMD saw their fortunes swoon and share prices plummet when crypto-currency markets unexpectedly tumbled.

So, what do the new 2nd gen Xeon Scalable processors and other solutions say about Intel and how it sees its customers and competitors? First, consider the sheer breadth of technologies involved. Along with the latest/greatest features and functions that you’d expect in a next gen Xeon announcement, the inclusion of new SoC, FPGA and Optane memory and storage solutions, as well as workload-specific technologies like Intel DL Boost, demonstrates how, and how effectively, Intel is spreading its data center bets.

In addition, the depth of available solutions is impressive. That’s apparent in the 50+ SKUs that feature 2nd gen Xeon Scalable silicon and is also highlighted by the 56 core/12 memory channel Xeon Platinum 9200. It’s interesting that both IBM Cloud and Google Cloud proactively announced plans to develop offerings based on the chips since both focus on the needs of enterprise cloud customers and are sticklers for top-end hardware performance. Their support suggests that Intel’s claims about the Xeon Platinum 9200’s HPC, AI and high-density capabilities are fully justified.

Finally, the flexibility and customizability of the new chips are worth considering. Intel’s focus on workload-optimized solutions is one example of this, since it reflects OEMs’ (and their customers’) increasing focus on integrated, optimized systems. In fact, the new network-optimized processors that resulted from Intel’s collaboration with communications service providers offer intriguing insights into how Intel can and likely will continue to add discrete new value to its silicon portfolio.

Final analysis

Overall, the 2nd-Generation Xeon Scalable processors and other new data center technologies demonstrate both Intel’s ability to deliver substantial new innovations and its lack of complacency. That’s sensible from a tactical standpoint since the company’s competitors aren’t standing idly by – for one, AMD’s launch of its new Epyc “Rome” data center chips is just weeks away. However, it’s also strategically important for Intel to show how it became a leader in data center solutions in the first place and why it deserves to remain there.

The tech industry’s passion for perky start-ups and shiny new objects isn’t likely to fade any time soon. However, it would be a mistake to assume that leading-edge imagination and innovation reside solely in smaller organizations. Intel’s new “data-centric” 2nd-Generation Xeon Scalable, Agilex and Optane solutions prove that oftentimes, the most innovative vendor you can find is the one you’re already working with.

© 2019 Pund-IT, Inc. All rights reserved.

Lenovo DCG – A Refreshing Journey to Cascade Lake

By Charles King, Pund-IT, Inc.  April 3, 2019

A little-discussed benefit of industry standard microprocessors and related components is the predictability they provide system vendors and enterprise customers. By leveraging the “tick-tock” of Moore’s Law-derived innovations that Intel (and, less frequently, AMD) provided, vendors could focus their attention and investments on enhancing server design, operational functions and facilities issues.

But that predictability also created room for doubt: What would happen to the industry and Intel when steady improvements hit the brick wall of material limitations? Intel answered those concerns pretty clearly in this week’s launch of its new 2nd-Gen Intel Xeon Scalable platform (aka Cascade Lake). I’ll have more to say about that subject in next week’s Pund-IT Review. For now, let’s consider what 2nd Gen Xeon Scalable means to Lenovo’s Data Center Group (DCG) and its customers.

Refreshed/enhanced

Like other vendors, Lenovo announced that it will use Intel’s new Xeon solutions to refresh its DCG portfolio, including 15 ThinkSystem servers and five ThinkAgile appliances. As a result, Lenovo can offer clients the incremental-to-significant performance, power efficiency and workload-specific enhancements that are a predictable part of Intel’s next-gen silicon launches.

Those are important for supporting traditional business applications and conventional processes. However, they will also facilitate the development and adoption of new, rapidly evolving workloads, including advanced analytics, artificial intelligence and high-density compute infrastructures.

In addition, one specific feature of 2nd-Gen Intel Xeon Scalable chips—their support for Intel’s Optane DC persistent memory technology—could be particularly critical for and valuable to Lenovo customers. Why so?

According to Intel, Optane DC will enable customers to transform critical data workloads – from cloud and databases to in-memory analytics and content delivery networks – by:

  • Reducing system restarts from minutes to seconds
  • Supporting up to 36 percent more virtual machines per system
  • Increasing system memory capacity by up to 2X, to as much as 36TB in an eight-socket system (i.e., up to 4.5TB per socket)

That last point is particularly important for Lenovo due to the company’s longstanding leadership in memory-intensive computing. The company has long provided the reference architecture for SAP’s HANA in-memory database solutions and technologies, and Lenovo is a leading provider of SAP HANA solutions. The importance of this point was reflected in Lenovo’s announcement that its ThinkSystem SR950 will be the industry’s first eight-socket server to support Optane DC, with a 36TB memory capacity option that makes it ideal for demanding SAP HANA environments.

Lenovo is also using 2nd Gen Xeon Scalable silicon to develop new engineered systems for key workloads, including SAP HANA, Microsoft SQL Server and Red Hat OpenShift Containers. These solutions will be verified as Intel Select Solutions, signifying their ability to support superior workload performance, ease of deployment and simplified evaluation. The company also expects to introduce new Intel Select Solutions for workloads including VMware vSAN, Network Function Virtualization Infrastructure (NFVI), Blockchain Hyperledger Fabric and Microsoft Azure Stack HCI.

Compliments of Lenovo

This is not to suggest that the benefits of Lenovo’s refreshed portfolio are due entirely to Intel. The new solutions all benefit from the company’s ThinkShield, which secures Lenovo devices with oversight of development, supply chain and lifecycle processes. Along with having unique control over its own global supply chain, the company is also aligned with Intel’s Transparent Supply Chain, allowing customers to locate the source of components in their new systems. In addition, Lenovo oversees the security of suppliers that build intelligent components, making sure they conform to Trusted Supplier Program guidelines and best practices.

Finally, while the new and refreshed solutions can be purchased directly, they are also available through Lenovo’s TruScale, the company’s recently announced consumption-based as-a-service program. TruScale enables customers to use and pay for Lenovo data center hardware solutions without having to purchase the equipment. (A generic illustration of consumption-based pricing appears below.)
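
As a generic illustration of how consumption-based infrastructure pricing works, consider the sketch below. The metering basis (utilization hours) and the rates are hypothetical, not Lenovo’s actual TruScale formula:

    # Generic sketch of consumption-based ("pay as you go") infrastructure
    # billing: the customer pays for metered usage rather than buying the
    # hardware outright. The metric and rates here are hypothetical.

    def monthly_bill(metered_hours, rate_per_hour, minimum_commit=0.0):
        """Bill = metered usage x rate, subject to an optional monthly minimum."""
        return max(metered_hours * rate_per_hour, minimum_commit)

    # A system metered at 600 busy hours this month at a $0.90/hour rate,
    # against a $250 monthly minimum commitment.
    print(f"${monthly_bill(600, 0.90, minimum_commit=250.0):,.2f}")  # $540.00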

Final analysis

To some folks in IT, predictability is a mundane topic that is easily superseded by whatever shiny new object falls off the industry turnip truck. That attitude ignores the fact that for customers, especially businesses that depend on data center systems and other solutions, IT predictability can mean the difference between succeeding, faltering or failing.

Knowledgeable vendors deeply understand that point and do their best to utilize their own and their strategic partners’ innovations to ensure that their products are fully, predictably capable of supporting both existing applications and emerging workloads. Lenovo obviously isn’t the only vendor benefitting from Intel’s 2nd-Gen Xeon Scalable chips. However, Lenovo’s new/refreshed ThinkSystem and ThinkAgile offerings, and the company’s creative use of Intel’s Cascade Lake enhancements, provide excellent examples of how this process works and will deliver often profound benefits to Lenovo customers.

© 2019 Pund-IT, Inc. All rights reserved.

Apple Presses Hard into Digital Services

By Charles King, Pund-IT, Inc.  March 27, 2019

In a keynote event hosted at its headquarters facility in Cupertino, CA, Apple announced four new and refreshed digital services aimed at the company’s current customers and other consumers. The service offerings included:

  • Apple News+, a subscription service attached to the Apple News app that provides access to over 300 magazines, newspapers and digital publishers. Apple News+ is currently available in the U.S. for $9.99 a month and in Canada for $12.99 a month.
  • Apple Arcade, a game subscription service that will feature over 100 new games exclusive to Apple, including original releases from creators, including Hironobu Sakaguchi, Ken Wong and Will Wright. Apple Arcade will launch globally in fall 2019. No pricing details were provided.
  • Apple Card, a new credit card integrated with the Apple Wallet app. Apple Card will not charge annual, late, international or over-the-limit fees, or penalty interest rates on missed payments. Apple said its goal is to provide interest rates that are among the lowest in the industry. Apple is partnering with Goldman Sachs and Mastercard to provide issuing bank and global payment support. Apple Card will be available in the U.S. this summer.
  • Apple TV+ includes a new Apple TV app and a new subscription service featuring content and programming from creative artists, including Oprah Winfrey, Steven Spielberg, Jennifer Aniston, Reese Witherspoon, Octavia Spencer and others. Customers can subscribe à la carte to Apple TV channels, including services such as HBO, SHOWTIME, CBS All Access, EPIX and Noggin. Shows can be watched in the Apple TV app, with no additional apps, accounts or passwords required. Pricing and availability for Apple TV+ will be announced in the fall.

The good, the not so great and the potentially ugly

Client-focused services have a long and mostly successful history in the tech industry. When technologies and products are initially finding their way into the market, services offer great ways for vendors to reach out to and stay engaged with customers. More importantly, as products inevitably mature and improvements become increasingly incremental, services are critical to driving new value for customers and additional revenues for vendors.

Apple’s key iPhone and iPad solutions clearly belong in this second group, and with sales growth slowing (and no explosive new products or categories on the horizon), it behooves the company to find new ways to commercially leverage its customer relationships. That is, to get them to spend more money with Apple. With those thoughts in mind, let’s review the event by considering what might be loosely called the good, the not so great and the potentially ugly aspects of the new Apple services.

On the good side, the company’s clear focus on delivering unified features and functions across the services, including ease of use, user privacy and security, customer personalization and family sharing was both gratifying and impressive. Those are all areas where Apple has the experience, technologies and scope to add value to its new offerings.

Additionally, focusing on these areas, particularly security and privacy, should help differentiate and separate the company from its competitors, especially given the hapless bumbling and self-inflicted wounds that Facebook and (though to a considerably lesser degree) Google have lately suffered.

That said, Apple’s optimistic view of the value of its “curation” capabilities is a bit more complicated. Curating will open the company to criticism over what content it chooses to deliver. Some will bleat about relatively minor variety and quality issues. But others are likely to home in on more substantial issues, like how the curation process might be used to undercut competitors, or whether the company’s fee structure is fair to its content partners.

That subject is already a point of discussion about Apple News+ (where the company will reportedly garner about half of subscription fees) and allegedly contributed to high profile publications, including the New York Times and Washington Post, declining to participate. Subscribers should also remember that some publications, including the Wall Street Journal, are limiting the content they provide to Apple News+.

Competitive disruption

The new Apple Card and Apple Arcade could be disruptive competitively, though for different reasons. First, Apple Card offers some features (“lower” [though not clearly specified] interest rates, no annual, international or late payment fees, and integrated cash back on purchases) that many consumers will find compelling. The elimination of international transaction and late fees also differentiates the Apple Card from many other charge cards.

Then again, Apple Card also suggests that the adoption of Apple Pay is not ramping as quickly among consumers, retailers and markets as Apple hoped. The company’s attempt to insert itself into credit/debit transactions is understandable and could eventually make a solid impact on the company’s bottom line. Whether Apple can significantly alter the credit/debit landscape remains to be seen. It’s probably best to reserve judgement until credit/debit competitors’ responses to Apple Card become clear.

Apple Arcade emphasizes new content from some well-known game developers. Plus, the ability to seamlessly play games across Apple devices should appeal to the company’s most tried and true customers. Will it be enough to disrupt markets or attract new clients to Apple devices? It’s hard to say at this point.

Apple TV+’s sheer star power

Finally, the sheer star power present in the Apple TV+ reveal underscores its importance to the company. For weeks prior to the launch, orchestrated leaks focused on celebrity projects and surprise appearances. That the event would include big names in entertainment was a given. Plus, though Apple has been nibbling around the edges of video content and subscription services for years, it has failed to make actual, substantial commitments. That, in turn, has allowed competitors including Netflix, Amazon and YouTube (Google) to grab and form huge swaths of the market in their own images. So how does Apple TV+ compare?

Unfortunately, that’s difficult or impossible to say. While the TV+ service is designed to compete directly with behemoth streaming services, like Netflix and Amazon Prime Video, few details were shared in Cupertino. Testimonials came from some powerhouse entertainment figures but what and how good or compelling their contributions will be is up in the air.

A few, like Reese Witherspoon, Jennifer Aniston and Steve Carell’s new drama, and the documentaries Oprah Winfrey is planning to produce, are likely to click. Others, such as Steven Spielberg’s revival of his long-gone (the series originally ran 30+ years ago) Amazing Stories anthology, seem iffier. No samples of the new shows were shared and pricing for the service won’t be revealed for months, so it’s all a crapshoot at this point.

The all-star guest list in Cupertino suggests that Apple is committing a good portion of its substantial cash hoard to TV+. But throwing money around hasn’t always helped highly touted Apple products and services succeed in the past. Plus, Hollywood has never been a great place for guaranteeing that sizable budgetary outlays will turn big profits.

Final analysis

Apple’s new digital and refreshed content services all qualify as decent bets that should prove attractive to many of the company’s customers. But how do they qualify on the good/not so great and potentially ugly scale? Let’s consider that in reverse order.

Of the four, Apple News+ seems the most superfluous, and is designed to do little more than massage additional cash out of the Apple News app. With 300+ magazines and newspapers included, there should be something to appeal to most customers. Then again, consumers interested in higher value publications like the Wall Street Journal would be better off with standalone subscriptions. If the reports about Apple’s 50% fee structure are true, the Apple News+ story could turn pretty ugly.

Calling Apple Arcade a “gateway” service may seem unfair but consider that Apple has never been a powerhouse in traditional gaming tech. Instead, the company’s focus is mainly on the games available through the App Store. Apple Arcade is clearly a generational play aimed at younger iPhone owners but whether or how well Apple can entice them from the free/low fee game apps they love to a monthly subscription is uncertain. Call this one not so great.

Though Apple focused most of its firepower on the Apple TV+ subscription service launch, the prevalence of Baby Boomer and Gen X media celebrities made the proceedings seem a little tired. Sure, there were intriguing surprises, but they were mostly “safe” surprises that emphasized Apple’s deep pockets more than innovative thinking. Whether these efforts will result in compelling new content or produce significant threats to Netflix, Amazon Prime or other streaming services is anything but certain. Call it “good” but with reservations.

Of the new services, Apple Card seems to be the most potentially disruptive to the market and, perhaps, lucrative for Apple. Some critics have noted that plenty of credit cards offer better reward programs or forgive annual fees, but those offerings are typically aimed at higher income clients and those with solid credit histories. Providing those benefits across the board to Apple Wallet users could substantially level the credit/debit playing field and force other card issuers to follow Apple’s lead. Goldman Sachs’ involvement is also intriguing since the company has not previously been involved in consumer banking. If Apple Card develops as planned, it could be as good as gold.

Finally, Apple’s new services build on existing products and platforms, meaning they pose far less risk to the company than launching all-new offerings. In some cases, Apple is working with talented, knowledgeable partners who should add substantial value. While it’s unlikely that all of the new services will be as game-changing as Tim Cook and company suggest, there appears to be more upside than down. For Apple, its shareholders and its customers, that’s all to the good.

© 2019 Pund-IT, Inc. All rights reserved.

IBM SDU Refines and Redefines Enterprise Search

By Charles King, Pund-IT, Inc.  March 20, 2019

“Time is money” has been a central tenet of business technology for decades, from the mechanical calculators ubiquitous in office environments during the first half of the twentieth century to the servers and systems that became central to transaction processing and other applications in the latter half. Speeding and automating both simple and complex labor-intensive tasks enabled companies to decrease costs and increase efficiencies while becoming more competitive and profitable.

But as once exceptional technologies become increasingly commonplace and commoditized, it’s easy to forget a central point: Even amazing technologies don’t fix every problem organizations can and will confront. That’s as true for traditional solutions as it is for more recent developments, including eCommerce and customer relationship management (CRM) applications, as well as wide-ranging, broadly available technologies, like search.

That last point – search – is central to a new offering IBM recently added to its Watson Discovery portfolio: Smart Document Understanding (SDU). Let’s consider what SDU is and does and why that will be welcomed by numerous enterprises.

The problem with search

Search is a settled technology, right? I mean, search engines have been around for decades, were key to the Internet’s development and evolution, have sparked high-profile antitrust battles and drive billions of dollars in advertising and other revenues. So, what could IBM or anyone else do to make search different or better than it already is?

The problem isn’t with search so much as it is with what information is being searched. That is, traditional search engines are great for crawling, indexing and querying the relatively homogenous information that constitutes web sites and online data. However, they’re less effective at dealing with the masses of heterogeneous semi-structured (documents) and unstructured (image, video and sound) information that businesses store in various on-premises and cloud locations.

But what about the “big data” platforms and products everyone was talking about a few years back? Those can be great for managing and searching certain kinds of data and data repositories, but complex processes and enterprise information infrastructures sometimes require more hands-on efforts that impact the effectiveness of conventional solutions. In other words, the more diverse and dispersed an organization’s data resources are, the less likely they can be fully managed or exploited with existing search tools.

The IBM SDU solution

In a blog post introducing IBM’s SDU, Donna Romer, VP of Watson Platform, Offering Management, noted a pair of interesting challenges where Smart Document Understanding can be applied. The first was a situation that an IBM customer, U.S. Bank, encountered: creating pricing schemas for credit card and debit card transaction services that can be easily and transparently customized for business customers. The second was to find ways to improve and speed the ways that business documents are prepared for training artificial intelligence (AI) solutions.

How did IBM help U.S. Bank? The company and Elavon, one of its subsidiaries, decided to develop a pilot and test program for a statement analysis offering capable of analyzing prospect billing statements in real-time and generating optimized pricing proposals. Using Watson Discovery with SDU, the team cut the time required for proposal creation from 10 days to 2 minutes, radically improving sales processes for both U.S. Bank sales reps and the merchants they serve.

What about applying Watson Discovery with SDU to documents used for machine learning for AI training? Consider that AI training often requires thousands of documents that must be ingested and annotated, and those enrichments tested before they can be used to support successful machine learning.

Smart Document Understanding leverages advances from IBM Research, as well as the company’s recently introduced Corpus Conversion System, an AI-based cloud service that can ingest 100,000 PDF pages per day (with accuracy above 97 percent) and then train and apply advanced machine learning models to extract content from the documents at scale.

SDU allows Watson Discovery customers to visually train AI to understand documents, to distinguish textual elements, to extract valuable information and to exclude “noise” like headers and footers. That’s impressive but, in addition, no technical training is required to use SDU. Instead, a visual interface allows workers to point and click on elements such as titles, subtitles, headers and footers in training documents. The Watson system then displays how it understands the fields so staff can correct and resubmit documents if necessary.

In essence, Watson Discovery with SDU can be used to significantly speed document-based machine learning preparation for AI training. Plus, SDU’s point and click classification can also be applied to images, spreadsheets, PDFs and optical character recognition (OCR) content. As a result, Watson Discovery with SDU can also be used to train AI systems to recognize and ferret out valuable “small data” information assets contained in and typically obscured by massive volumes of case files, internal reporting documents, historical customer data, past transaction and interaction files and other business documents.

Final analysis

IBM’s addition of Smart Document Understanding to Watson Discovery highlights a pair of interesting points. First, that within IT few things are ever really finished or settled. That squares with the fact that technologies are tools that, with evolutionary refinement, can be successfully applied to a growing number and variety of problems.

The second is that time is still, and probably always will be, money when it comes to business. A notable point to consider about Watson Discovery with SDU is how it can demonstrably benefit both old school processes, like sales proposal creation for U.S. Bank, and emerging efforts, including document-based machine learning for AI and searching for valuable “small data” assets.

Those are the kinds of problems that IBM’s new solution is solving today. It won’t be surprising if organizations find new ways to use IBM’s Watson Discovery with SDU in the months and years ahead.

© 2019 Pund-IT, Inc. All rights reserved.

Lenovo Launches Edge Computing Portfolio and Expands IoT Partnerships/Investments

By Charles King, Pund-IT, Inc.  February 27, 2019

A substantial, beneficial side effect of Moore’s Law and commodity computing is what might be called data center decentralization. As little as a decade ago, the vast majority of business computing efforts and workloads were concentrated in conventional on-premises IT facilities owned and operated by the organizations they served.

The balance shifted with the rise of public cloud platforms but is likely to be impacted even more dramatically by computing at the edges of networks. That trend, in turn, is being driven by robust mobile and Internet of Things (IoT) solutions and supported by emerging technologies, including 5G.

At the MWC (Mobile World Congress) Barcelona this week, Lenovo’s Data Center Group (DCG) introduced its new ThinkSystem SE350, the first of a new family of edge servers. The company also highlighted new partnerships and developments that will support IoT and edge computing as part of Lenovo’s long term IoT growth plan. Let’s consider these announcements in greater detail.

Think 2019 – IBM Opens a New Chapter in Digital Transformation

By Charles King, Pund-IT, Inc.  February 20, 2019

Central to all tech vendor conferences is brand reinforcement, which companies hope to achieve by explaining themselves in public. The process itself varies widely from vendor to vendor, with some opting for squishiness over substance and others spouting vagaries rather than concrete points. But others use these events to clarify their current positions, explicate core strategies and detail how they intend to help the organizations they serve successfully achieve desired goals.

Good examples of this latter approach were plentiful at IBM’s second annual Think conference last week in San Francisco. Over the course of 4+ jam-packed days, the company’s senior executives and product group leaders offered a clinic on presenting (with minimal jargon) IBM’s plans and why those efforts are meaningful to its customers and partners.

Let’s consider some of Think 2019’s key happenings and what the event said about the current and future state of IBM.

Lenovo TruScale – Where Infrastructure-as-a-Service Customers Come First

By Charles King, Pund-IT, Inc.  February 13, 2019

As-a-Service (aaS) solutions are nearly ubiquitous in the IT industry and commercial markets. The aaS model largely defines public cloud platforms and solutions and is central to a range of other hosted IT services. Indeed, the “pay as you go” model is one of the most compelling approaches to IT that has arisen during the past two decades.

Why so? Because it significantly eases or eliminates two of the biggest headaches that enterprises and other IT customers face – the capital investments required for IT equipment and the continual operational expenditures required to staff, run and manage on-premises IT infrastructures. However, it would be a mistake to assume that aaS offerings are perfect or a panacea for all IT challenges.

These points are germane when considering Lenovo’s TruScale Infrastructure Services, a new subscription-based offering the company says provides customers the precise hardware, software and services they need, whenever they require it, without onerous investment or commitment requirements. Let’s take a look at Lenovo’s TruScale and what the company is offering customers and broader markets.

Dell EMC’s OEM Partnership Survey and the Value of Strategic Collaboration

By Charles King, Pund-IT, Inc.  February 6, 2019

“Collaboration” is a popular concept in the tech industry. It classifies an entire subset of business processes and related software and is used to indicate close relationships between vendors and their customers and strategic partners. But I believe that how collaboration fully impacts and can significantly enhance strategic partnerships is less discussed and understood than it should be.

That’s often due to companies’ natural hesitancy to highlight efforts that provide them substantial benefits lest competitors get wind of it and try to copy and peel off some of that success themselves. At the same time, individual partnerships are typically unique in how they are structured, the goals they pursue and the partners’ various responsibilities. Even relationships between a vendor and partners that use the same technological tools for the same processes or use cases can be wildly different.

So, it’s worth looking closely when vendors peel back the curtain on strategic partnerships, as Dell EMC has done with the new OEM Partnership Survey it conducted with Intel through Futurum Research. The study compiled responses from over 1,000 senior decision makers in OEM-type businesses worldwide, asking for their thoughts on how OEM and third-party relationships can drive innovation, improve time to market and increase competitive capabilities and value.

Dell EMC OEM – More than just digital intelligence

As background, the work Dell EMC’s OEM organization engages in is worth a quick look. Like other IT vendors in the business, Dell EMC OEM works with customers that develop and build products that depend on digital intelligence provided by the company’s compute, storage and other components.

Early on, Dell EMC OEM focused on conventional computerized OEM products and systems, including arcade games and bank ATMs. Today the company helps over 3,500 OEM customers in more than 40 industries deliver commercial solutions for applications, including telecom switching, industrial automation, cybersecurity, factory floor management and operations, and video security and surveillance.

Far from simply supplying technology components, Dell EMC OEM also offers customers a range of product design and development services, as well as manufacturing, sales, distribution, service and maintenance options. For companies working to explore or expand into new markets, being able to leverage Dell EMC’s global supply chain, distribution centers and service personnel can be hugely beneficial.

Along with familiar solutions and conventional market plays, Dell EMC OEM is also on the leading edge of emerging markets and technologies, including Dell’s Internet of Things (IoT) efforts. As Bryan Jones, SVP and GM of Dell EMC OEM and IoT noted in a recent conversation, many of the company’s longstanding OEM customers “have been doing IoT since before it was called IoT.”

That is, along with supporting specific compute functions, customers were using Dell EMC-enabled products to collect data, share it with other networked devices and use that information to gain insight into their businesses and solve larger problems. The journey for those companies to business-class IoT solutions is far shorter than it is for many organizations.

Let’s consider the findings of Dell EMC’s OEM Partnership Survey.

The OEM Partnership Survey

So, what exactly does the study conducted by Futurum Research consider? In short, it explores issues related to the business value and economic benefits that can be achieved with strategic OEM partnerships.

For example, only a tiny minority (just 16 of the survey’s 1,000+ respondents) said they derive no business transformational value from OEM products and solutions. Given that vote of confidence, it’s hardly surprising that over three quarters of the survey participants expect to increase their use of OEM partnerships, and 26.7% said they anticipate increasing their partnership efforts dramatically.

What about economic benefits? An overwhelming majority (93%) of respondents said that OEMs can accelerate innovation in the products and services they develop. Another large majority (81%) noted that OEM partners are helping them embrace emerging technologies, including artificial intelligence (AI), multi-cloud and IoT. In addition, survey participants noted that their OEM partnerships are helping reduce costs on average by over 40%. Those are numbers you can take to the bank.

A blog post about the study by Ethan Wood, VP of marketing for Dell EMC OEM and IoT, explored several OEM partner success stories that are worth considering. Those included:

  • Bionivid – says that it reduced product development costs by at least half by collaborating with Dell EMC OEM.
  • Tracewell Systems – noted that its collaboration with Dell EMC OEM has enabled it to scale rapidly and get products to market faster.
  • Olivetti – collaborated with Dell EMC OEM and a member of Intel’s IoT Alliance, Alleantia, to create and bring to market a turnkey solution for industrial floor and plant operations.

Survey responses like these led Futurum to predict that OEM partnerships have the potential to achieve a compound annual growth rate (CAGR) of 20% to 25% during the coming decade.

Final analysis

What can we conclude from Futurum’s OEM Partnership Survey? First, that OEM-focused businesses and solutions are more diverse and dynamic today than they have ever been. Additionally, when one considers the continuing digitization of businesses and business processes in industries worldwide, it seems likely that the number of OEM-enabled solutions and corresponding commercial opportunities will continue to grow dynamically for years to come.

In the case of Dell EMC, the company’s OEM organization helps thousands of clients develop, deliver and maintain new products more speedily, more effectively and more profitably than they could on their own. In many cases, Dell OEM also helps customers accomplish what they could never have done on their own. That is part of the practical magic of strategic partnerships, and something that Dell EMC OEM practices successfully day after day.

© 2019 Pund-IT, Inc. All rights reserved.

Virtustream and Smithfield Partnership Highlights Multi-Cloud Benefits

By Charles King, Pund-IT, Inc.  January 30, 2019

Over the past 2-3 years, the ascendance of multi-cloud as a primary cloud consumption model for enterprises has become increasingly evident. However, why that’s the case is often muddled with IT jargon and PR clichés. Thankfully, that’s not the case with the recently announced multi-cloud partnership between food processor Smithfield Foods and cloud service and software provider Virtustream. The details of their successful multi-cloud effort are worth a closer look.

Smithfield by the numbers

A subsidiary of WH Group, Smithfield Foods (based in Smithfield, VA) is a $15B business best known for brands, including Armour, Nathan’s Famous, John Morrell and Farmer John. The company is the world’s largest pork processor and hog producer, with some 40,000 workers employed at 50 U.S. facilities. Smithfield is also recognized as the #1 supplier of pork products for retail, food service and export markets, making it a global enterprise by any definition.

Not surprisingly, Smithfield is also a major consumer of IT products and services, including SAP solutions and systems supporting core business processes. Like many other organizations, the company has been exploring ways to more efficiently integrate its on-premises IT infrastructure and cloud-based data and applications with the aims of improving performance and lowering costs. As noted in the companies’ press release, that puts Smithfield in line with over half of the enterprise respondents in a recent Forrester study on multi-cloud trends.

Virtustream multi-cloud trims the fat from corporate IT

Smithfield determined that Virtustream offered the multi-cloud expertise its strategy required. After several months of what the pair describe as “meticulous preparation”, they launched a “One SAP” project in July 2018 that was designed to move all of Smithfield’s operations on SAP to a single, unified SAP S/4HANA platform on Virtustream Enterprise Cloud. Virtustream and Smithfield announced the project’s successful completion on January 24, 2019.

Will significant benefits result from the finished project? Absolutely. The most practical effect is that by having access to Virtustream’s dynamic, scalable multi-cloud resources, Smithfield will only pay for the services it consumes. In other words, partnering with Virtustream has enabled the company to embrace multi-cloud enabled, on-demand IT services and pricing schemas, improving IT consumption and performance efficiencies. That, in turn, will result in substantial savings. In fact, Smithfield estimates it will save $3 million in IT costs over the next three years, a tidy sum by most any measure.

Final analysis

There are numerous points to take away from Virtustream and Smithfield’s One SAP project. First and foremost, it’s critically important for an organization to clarify the goals it hopes to attain, and meticulously prepare prior to embarking on so large and significant an effort. Just as important is engaging expert partners that understand your goals, possess the skills your project requires and have the means to deliver the results you seek.

Like any other trending IT service or solution, the fine details and requirements of multi-cloud are often hard to separate from promotional pronouncements and technical jargon. But successes do exist, particularly when they involve the efforts of knowledgeable, experienced multi-cloud vendors and businesses that understand the importance of careful, rigorous preparation.

Smithfield’s One SAP project stands out today as an excellent example of what an enterprise can accomplish with multi-cloud solutions and services. But it is neither the first nor is it likely to be the last such success announced by a Virtustream partner.

© 2019 Pund-IT, Inc. All rights reserved.