
Enabling AI from the Edge to the Cloud: Gyrfalcon Technologies Offers New IP Licensing Model

By Charles King, Pund-IT, Inc.  April 24, 2019

Artificial Intelligence (AI) is a cause célèbre inside and outside the IT industry, inspiring often heated debate. However, a point that many—especially AI-focused vendors—make is that cloud-based computing offers the best model for supporting AI frameworks, like Caffe, PyTorch and TensorFlow, and related machine learning processes.

But is that actually the case?

Gyrfalcon Technology (GTI) would argue that delivering robust AI at the far edges of networks and in individual devices is both workable and desirable for many applications and workloads. In fact, the company offers a host of AI inference accelerator chips that can be used for those scenarios, as well as cloud-based server solutions for AI applications.

Now GTI is licensing its proprietary circuitry and intellectual property (IP) for use in System on Chip (SoC) designs. As a result, silicon vendors will be able to enhance and customize their own offerings with GTI’s innovations.

Let’s take a closer look at what Gyrfalcon Technology is up to.

AI in the cloud

Why do most AI solutions focus on cloud-based approaches and architectures? You could call it an extreme case of “When all you have is a bulldozer, everything looks like a dirt pile” syndrome. The fact is that until fairly recently, the cost of AI far outweighed any practical benefits. That changed with new innovations, including cost-effective technologies like GPUs and FPGAs.

Some of the most intriguing and ambitious AI projects and commercial offerings, like human language processing, were undertaken by cloud vendors and infrastructure owners, including Amazon, Google and IBM, supported on the silicon side by NVIDIA, Intel and other chipmakers. They had the compute and brain power to take on large-scale efforts where data accumulated by edge devices, like smart phone conversations and commands, is relayed to cloud data centers.

There, the data is used for training and enabling AI-based services, such as language translation and transcription, and products like smart home speakers.

Are there any problems with this approach? Absolutely, with data privacy and security leading the charge. AI vendors uniformly claim that they are sensitive to their customers’ concerns about privacy and have tools and mechanisms in place to ensure that data is anonymized and safe. But Facebook, Google and others have been regularly dinged for mishandling or cavalierly maintaining customer data.

Cloud-based AI can also suffer latency issues, especially if network traffic is snarled. That might not be a big deal when you’re asking Alexa to recommend a good restaurant but it’s more problematic if it involves AI-enabled self-driving cars. There’s also the matter of using energy wisely. With the percentage of electricity consumed by data centers continuing to rise globally, building more IT facilities to support occasionally frivolous services seems like a literal waste.

AI at the edge

Gyrfalcon Technology would argue that while cloud-based AI has an important role, it isn’t needed for every application or use case. Instead of a bulldozer, some jobs require a shovel or even a garden trowel. To that end, GTI offers a range of AI inference accelerator chips that support AI Processing in Memory (APiM) via ultra-small and energy-efficient cores running GTI’s Matrix Processing Engine (MPE).

As a result, GTI’s solutions, like its Lightspeeur 2801 AI Accelerator, can deliver 2.8 TOPS while using only 300mW of power. That makes it a great choice for edge-of-network devices, including security cameras and home smart locks. After being set up, chip adaptive training functions allow devices to learn from their surroundings. For example, a smart lock might use arrival and departure patterns to identify the residents of a home.

Enabling AI at the edge means that devices will be able to perform many functions autonomously or, if cloud connectivity is required, will be capable of vastly reducing the amount of data that needs to be transmitted. That lowers the costs, complexity and network traffic of AI implementations.

For cloud-based applications, GTI offers the Lightspeeur 2803 AI Accelerator, which is used in concert with GTI’s GAINBOARD 2803 PCIe card. A single GAINBOARD card delivers up to 270 TOPS at 28 Watts, or 9.6 TOPS/Watt, roughly 3X the efficiency of competitors’ solutions.
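Those throughput and power figures are worth a quick sanity check. The short calculation below uses only the numbers quoted above; it confirms that the edge chip and the PCIe card land in the same efficiency ballpark:

```python
# Back-of-the-envelope check of the cited efficiency figures.
edge_tops_per_watt = 2.8 / 0.3        # Lightspeeur 2801: 2.8 TOPS at 300 mW
board_tops_per_watt = 270 / 28        # GAINBOARD 2803: 270 TOPS at 28 Watts

print(round(edge_tops_per_watt, 1))   # ≈ 9.3 TOPS/Watt at the edge
print(round(board_tops_per_watt, 1))  # ≈ 9.6 TOPS/Watt, matching the cited figure
```

Notably, the tiny edge chip and the full PCIe card deliver nearly identical efficiency, which is consistent with GTI’s claim that the same core design scales from device to data center.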

Final analysis

The IT industry rightfully focuses on the value that innovative technologies and products provide to both consumers and businesses. Such solutions regularly come from massive Tier 1 vendors with decades of experience and billions of dollars in annual R&D funding. But oftentimes, innovative products and approaches are the brainchildren of smaller vendors like Gyrfalcon Technology that are unawed by conventional wisdom.

With its AI Processing in Memory (APiM) and Matrix Processing Engine (MPE) technologies, GTI has enabled clients, including LG, Fujitsu and Samsung, to reimagine how artificial intelligence can be incorporated into new consumer and business offerings. By licensing its Lightspeeur 2801 and 2803 AI Accelerator circuitry and intellectual property (IP) for use in System on Chip (SoC) designs, GTI is offering existing and future clients remarkable autonomy in determining how AI can best serve their own organizations and their end customers.

© 2019 Pund-IT, Inc. All rights reserved.

IBM Storage Enhances Mid-Market and Channel Solutions

By Charles King, Pund-IT, Inc.  April 17, 2019

Continual product evolution is one of the tech industry’s best and longest running selling points. It’s the foundational truth underlying technical chestnuts like Moore’s Law, and provides the subtext for innumerable marketing and promotional campaigns. But an often unaddressed yet valuable point to consider is the top-down way in which this evolution usually proceeds.

Developing new products costs money – lots, in fact, when it comes to business solutions. So not surprisingly, new products are initially designed to address the needs of large enterprises and other organizations that can afford to foot the bill and are willing to pay a premium for the new features, capabilities and benefits those solutions provide.

But eventually – often, fairly quickly – what were once enterprise-specific technologies find their way into more affordable, yet still innovative products designed for smaller companies and the channel/business partners that serve them. These points are clear in the new and updated additions IBM recently made to its Storwize V5000 family of solutions.

IBM’s “Explainable” AI – Building Transparency and Trust into Artificial Intelligence

By Charles King, Pund-IT, Inc.  April 10, 2019

Issues of trust seldom arise in discussions about modern computing systems. It’s not that hardware and software are perfect. In fact, publications and online forums contain tens of thousands of posts hashing out the relative merits of various PCs, workstations and servers. But those products have been so commonplace for so long that their essential “rightness” as well as the results they provide are hardly ever questioned.

However, that wasn’t always the case, and a similar dynamic applies to almost all emerging technical and scientific breakthroughs, including commercial artificial intelligence (AI) solutions designed for businesses and other organizations. Considering the inherent complexity of machine learning, neural networks and other AI-related processes, customers’ confusion about AI isn’t all that surprising. But what can be done to assuage their misgivings and bring AI into the mainstream?

Vendors, including IBM, are tackling the problem with solutions designed to make AI processes and results more explainable, understandable and trustworthy. That should satisfy clients’ doubts and accelerate the adoption of commercial offerings, but explainable AI also yields other significant benefits. Let’s consider why explainable AI is so important and how IBM’s innovations are impacting its customers.

The problem of “black box” AI

A lack of clarity or understanding is usually problematic. When it comes to inexplicable artificial intelligence, three potential issues can arise:

  1. Most importantly, a lack of transparency leaves users uncertain about the validity and accuracy of results. That is, the essential value of AI projects and processes is undermined.
  2. In addition, if AI projects are inexplicable, it’s possible that their results might be contaminated by bias or inaccuracies. Call this a problem that you can’t really be sure you have.
  3. Finally, when AI processes are not explainable, troubleshooting anomalous results is difficult or even impossible. That is, a lack of transparency leaves organizations unable to fix what’s broken.

How are AI-focused vendors addressing these issues? Unfortunately, often with fixes that worsen the situation, including “black box” solutions. These purport to deliver all the benefits of AI but fail to provide adequate transparency into how they work, how customers can determine the accuracy of results or how problematic issues can be addressed.

These solutions also encourage perceptions of AI as a mystery whose capabilities can’t be understood by mere mortals. In other words, whatever modest benefits “black box” AI may offer, leaving customers in the dark is detrimental to their work and goals.

The benefits of explainable AI

Is there a better way to proceed? Absolutely. How can organizations explain AI successfully? With a holistic approach that addresses several stages of the AI lifecycle:

  • Central to making AI projects more explainable is making AI models explainable instead of the black boxes many currently are.
  • Organizations must clearly and transparently articulate where and how AI is being used in business processes and for what ends.
  • They must also allow for analysis of AI outcomes and provide hooks to alter and override those outcomes as necessary.

To these ends, expert users and managers should employ technologies and solutions that vendors have designed to enhance AI transparency. These methodologies can speed the understanding of AI, which is great. It is also a critical issue for IT, marketing, sales and customer care organizations, especially those in highly regulated industries, such as banking and insurance.

This process can occur organically, as people experience AI and come to understand how it affects the business and them personally, thus impacting the very culture of an organization. Or it can be pursued proactively with the best tools and solutions currently available. Whichever way a company proceeds, people need to keep in mind the vast potential of AI. Why so? Because a time will come when AI is as essential to an organization’s success as the business technologies that are commonplace today.

The business benefits of explainable AI

Why is explainable artificial intelligence such an important issue and undertaking? It goes to the practical roots of how organizations do business. If they are to adopt and adapt to AI processes, they need to know that results are accurate. Otherwise, how can they assure customers and partners that AI-impacted decisions are valid and dependable? Consider two examples:

  1. In financial services, accurate results are obviously critical for maximizing business outcomes and customer interactions. However, like other businesses in highly regulated industries, banks and other financial organizations must be able to prove that AI-impacted processes comply with government and industry rules or risk significant sanctions and penalties. That’s bad enough, but inexplicable AI might also damage client relationships. If customers seeking loans, credit cards or other services are denied by an AI-related system, company officials must be able to explain why that determination was made and how the client might address or correct problematic issues.
  2. Global supply chain management is another promising area for AI because the complexity, volume and diversity of supply chain data make it extremely difficult for people to effectively track and adjust for real-time changes in demand. AI can enhance Forecast Value Added (FVA) metrics—learning from past successful and unsuccessful forecasts to help planners make better adjustments. But unless supply chain teams can easily monitor the accuracy of AI models, they can’t be certain that systems are really delivering the benefits they promise.

In light of these and other points, it’s difficult to see why vendors would develop or customers would consider inexplicable AI solutions.

What IBM is doing to open “black box” AI

IBM is working in numerous areas to develop and deliver explainable AI and advanced analytics solutions. The impetus for the company’s efforts was underscored in a recent blog by Ritika Gunnar, VP of IBM’s Watson Data and AI organization. “As humans, we’re used to the idea that decisions are based on a chain of evidence and logical reasoning anyone can follow. But if an AI system makes recommendations based on different or unknown criteria, it’s much more difficult to trust and explain the outcomes.”

Central to the company’s efforts is the Watson OpenScale platform that IBM launched in 2018. Designed to “break open the ‘black box’ at the heart” of AI models, Watson OpenScale simplifies AI processes, including detailing how recommendations are being made and automatically detecting and mitigating bias to ensure that fair, trusted outcomes are produced.

IBM is leveraging both existing open source technologies and proprietary algorithms developed at IBM Research to explore, enhance and explicate AI decision-making.

  • LIME (Locally Interpretable Model-Agnostic Explanations) is a widely used open source algorithm designed to explain predictions made by AI systems by comparing an explanation to an easily interpretable model.
  • Developed by IBM Research, MACEM (Model Agnostic Contrastive Explanations Method) goes well beyond the capabilities of LIME by identifying both pertinent features that are present in a piece of data and those that are absent, enabling the construction of “contrastive explanations”.
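To make LIME’s approach concrete, the sketch below shows its core idea: perturb one instance, query the black-box model, and fit a proximity-weighted linear surrogate whose coefficients serve as a local explanation. The black-box model and feature values here are hypothetical stand-ins, and this toy uses plain NumPy rather than the actual `lime` package:

```python
import numpy as np

# Hypothetical opaque model (stands in for any black-box AI system):
# returns a "risk score" from an undisclosed combination of features.
def black_box_predict(X):
    return 1 / (1 + np.exp(-(2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2])))

def lime_style_explanation(instance, predict_fn, n_samples=5000, width=0.75):
    """Fit a weighted linear surrogate around one instance (the core LIME idea)."""
    rng = np.random.default_rng(0)
    # 1. Perturb the instance with Gaussian noise.
    X = instance + rng.normal(scale=0.5, size=(n_samples, instance.size))
    # 2. Query the black box on the perturbed points.
    y = predict_fn(X)
    # 3. Weight samples by proximity to the original instance.
    d = np.linalg.norm(X - instance, axis=1)
    w = np.exp(-(d ** 2) / width ** 2)
    # 4. Weighted least squares: fit an easily interpretable linear model locally.
    Xb = np.hstack([X, np.ones((n_samples, 1))])  # add intercept column
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(Xb * sw, y * sw.ravel(), rcond=None)
    return coef[:-1]  # per-feature local weights (intercept dropped)

instance = np.array([0.2, 0.4, 0.1])
weights = lime_style_explanation(instance, black_box_predict)
```

The surrogate’s coefficients recover the local direction of each feature’s influence (here, the first feature pushes the score up and the second pushes it down), which is exactly the kind of human-readable evidence chain that “black box” systems fail to provide.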

One scenario for contrastive explanations is in banking, where it could be used to analyze loan application data. The system would alert the bank to issues, including poor credit ratings, but it could also spot and highlight missing documents, like an incomplete credit report. The bank could then notify the customer about the reasons for its decision and provide constructive advice.

In essence, solutions that deliver more accurate, transparent and trustworthy AI results, such as IBM Watson OpenScale, can help businesses make better decisions and enhance their services for and relationships with customers.

Final analysis

People are often concerned about new technologies, especially those that are highly complex or difficult to understand. Overcoming those doubts is central to technologies becoming widely trusted and commercially successful. In fact, without fostering understanding of and insights into emerging technologies, it’s unlikely that new solutions will find a place among the people and organizations they might otherwise benefit.

By making technologies like artificial intelligence and AI-based solutions and services clearly explainable, vendors can reduce the time required for new offerings to enter the mainstream. That’s why explainable AI offerings, like IBM Watson OpenScale are so important.

By breaking open the “black box at the heart of AI” to make processes and results fully explainable, IBM is aiding its customers and partners and furthering its own market strategies. More importantly, IBM’s explainable AI efforts should help establish the essential “rightness” of these solutions as entirely valid and wholly valuable business technologies.

Overall, IBM’s work in explainable AI should improve the mainstream understanding, acceptance and adoption of artificial intelligence among individuals and organizations worldwide.

© 2019 Pund-IT, Inc. All rights reserved.

Intel Defines “Data-Centric” Leadership with Cascade Lake

By Charles King, Pund-IT, Inc.  April 10, 2019

The tech industry so reflexively worships shiny new startups that folks often forget or even disparage larger, well-established vendors. That’s unfortunate for any number of reasons but high on the list is that such attitudes miss a critically salient point: Self-reinvention, not stasis, is the key to long-term survival in technology.

PR-minded venture firms continually trumpet the “unique” innovations of whatever IPO-bound company stands to fill their pockets. In contrast, established firms develop and deliver more substantial innovations month after month, year after year, decade after decade. At least the successful ones do.

That point was in clear view during Intel’s recent launch of its 2nd generation Xeon Scalable processors (aka Cascade Lake) and other new “data-centric” solutions. Let’s consider these Intel offerings, why they are important and how they will help keep Intel and its OEM customers riding high in data center markets.

The view from Cascade Lake

So, what exactly did Intel reveal during its launch?

  • General availability of its new 2nd-Generation Xeon Scalable processors (aka Cascade Lake), including over 50 workload-optimized solutions and dozens of custom processors
  • The new offerings include the Xeon Platinum 9200 processor, which sports 56 cores and 12 memory channels. According to Intel, the Xeon Platinum 9200 is designed to deliver the socket-level performance and memory bandwidth required for high performance computing (HPC), artificial intelligence (AI) and high-density data center infrastructures. Both IBM Cloud and Google Cloud announced plans to deliver services based on Xeon Platinum 9200 silicon.
  • Among the custom silicon offerings are network-optimized processors built in collaboration with communications service providers (SPs). These solutions are designed to support more subscriber capacity and reduce bottlenecks in network function virtualized (NFV) infrastructures, including 5G-ready networks.
  • The Xeon D-1600 processor, a system on chip (SoC) designed for dense environments, including edge computing, security and storage applications.
  • New features built into the 2nd-Gen Xeon Scalable processors include integrated Deep Learning Boost (Intel DL Boost) for AI deep learning inferencing acceleration, and hardware-enhanced security features, including protections against side channel attacks, such as Spectre and Foreshadow
  • Next generation 10nm Agilex FPGAs that will support application-specific optimization and customization for edge computing, networking (5G/NFV) and data center applications
  • Support for new memory and storage technologies, including Optane DC persistent (storage-class) memory, Optane SSD DC D4800X (Dual Port) for mission-critical enterprise applications and SSD D5-P4326 (QLC 3D NAND) for read-intensive cloud workloads.

Pricing for the new solutions was not revealed during the launch. Product availability details can be found at

Why it matters

Intel’s launch of its 2nd-Generation Xeon Scalable processors and other data center solutions comes at an odd time for the company. Conventional wisdom often depicts Intel as an overly complacent behemoth harassed and bloodied by swifter, more agile (often far smaller) foes. Critics claim the company is reacting too slowly to marketplace shifts and is being surpassed by more innovative technologies and vendors.

In some cases, Intel didn’t do itself any favors. The company unexpectedly dismissed its CEO, Brian Krzanich, then took months to find a new chief executive within its own ranks (Robert Swan, who joined Intel in 2016 as its CFO and led the company on an interim basis after Krzanich departed).

But in other cases, the company was at the top of its game. For example, while discovery of the Spectre, Foreshadow and other side channel vulnerabilities could have been a public relations nightmare, Intel’s willingness to take responsibility and transparently detail its efforts to repair the issues kept the situation from blowing out of proportion.

Despite these challenges, Intel continued to deliver steady, often impressive financial performance. While the company was no investor’s idea of a high flyer, it also had less distance to fall, as well as considerable padding to help mitigate unplanned impacts. The value of Intel’s conservative approach became apparent when GPU-dependent NVIDIA and AMD saw their fortunes swoon and share prices plummet when crypto-currency markets unexpectedly tumbled.

So, what do the new 2nd gen Xeon Scalable processors and other solutions say about Intel and how it sees its customers and competitors? First, consider the sheer breadth of technologies involved. Along with the latest/greatest features and functions that you’d expect in a next gen Xeon announcement, the inclusion of new SoC, FPGA and Optane memory and storage solutions, as well as workload-specific technologies like Intel DL Boost, demonstrates how effectively Intel is spreading its data center bets.

In addition, the depth of available solutions is impressive. That’s apparent in the 50+ SKUs that feature 2nd gen Xeon Scalable silicon and is also highlighted by the 56 core/12 memory channel Xeon Platinum 9200. It’s interesting that both IBM Cloud and Google Cloud proactively announced plans to develop offerings based on the chips since both focus on the needs of enterprise cloud customers and are sticklers for top-end hardware performance. Their support suggests that Intel’s claims about the Xeon Platinum 9200’s HPC, AI and high-density capabilities are fully justified.

Finally, the flexibility and customizability of the new chips is worth considering. Intel’s focus on workload-optimized solutions is one example of this since it reflects OEMs’ (and their customers’) increasing focus on integrated, optimized systems. In fact, the new network-optimized processors that resulted from Intel’s collaboration with communications service providers offer intriguing insights into how Intel can and likely will continue to add discrete new value to its silicon portfolio.

Final analysis

Overall, the 2nd-Generation Xeon Scalable processors and other new data center technologies demonstrate both Intel’s ability to deliver substantial new innovations and its lack of complacency. That’s sensible from a tactical standpoint since the company’s competitors aren’t standing idly by – for one, AMD’s launch of its new Epyc “Rome” data center chips is just weeks away. However, it’s also strategically important for Intel to show how it became a leader in data center solutions in the first place and why it deserves to remain there.

The tech industry’s passion for perky start-ups and shiny new objects isn’t likely to fade any time soon. However, it would be a mistake to assume that leading-edge imagination and innovation reside solely in smaller organizations. Intel’s new “data-centric” 2nd-Generation Xeon Scalable, Agilex and Optane solutions prove that oftentimes, the most innovative vendor you can find is the one you’re already working with.

© 2019 Pund-IT, Inc. All rights reserved.

Lenovo DCG – A Refreshing Journey to Cascade Lake

By Charles King, Pund-IT, Inc.  April 3, 2019

A little-discussed benefit of industry standard microprocessors and related components is the predictability they provided system vendors and enterprise customers. By leveraging the “tick-tock” of Moore’s Law-derived innovations that Intel (and, less frequently, AMD) provided, vendors could focus their attention and investments on enhancing server design, operational functions and facilities issues.

But that predictability also created room for doubt: What would happen to the industry and Intel when steady improvements hit the brick wall of material limitations? Intel answered those concerns pretty clearly in this week’s launch of its new 2nd-Gen Intel Xeon Scalable platform (aka Cascade Lake). I’ll have more to say about that subject in next week’s Pund-IT Review. For now, let’s consider what 2nd Gen Xeon Scalable means to Lenovo’s Data Center Group (DCG) and its customers.


Like other vendors, Lenovo announced that it will use Intel’s new Xeon solutions to refresh its DCG portfolio, including 15 ThinkSystem servers and five ThinkAgile appliances. As a result, Lenovo can offer clients the incremental-to-significant performance, power efficiency and workload-specific enhancements that are a predictable part of Intel’s next-gen silicon launches.

Those are important for supporting traditional business applications and conventional processes. However, they will also facilitate the development and adoption of new, rapidly evolving workloads, including advanced analytics, artificial intelligence and high-density compute infrastructures.

In addition, one specific feature of 2nd-Gen Intel Xeon Scalable chips—their support for Intel’s Optane DC persistent memory technology—could be particularly critical for and valuable to Lenovo customers. Why so?

According to Intel, Optane DC will enable customers to transform critical data workloads – from cloud and databases to in-memory analytics and content delivery networks – by:

  • Reducing system restarts from minutes to seconds
  • Supporting up to 36 percent more virtual machines per system
  • Increasing system memory capacity by up to 2X, or as much as 36TB of memory in an eight-socket system
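The capacity figure in that last bullet is straightforward to unpack. The per-socket division below is my own arithmetic, not a number Intel or Lenovo quoted:

```python
# Rough arithmetic behind the memory-capacity claim above.
total_tb = 36              # cited maximum for an eight-socket system
sockets = 8
per_socket_tb = total_tb / sockets
print(per_socket_tb)       # 4.5 TB of combined memory per socket
```

That 4.5TB-per-socket figure helps explain why Optane DC matters most in memory-hungry workloads like in-memory databases, where total capacity, not just bandwidth, is the binding constraint.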

That last point is particularly important for Lenovo due to the company’s longstanding leadership in memory-intensive computing. The company has long provided the reference architecture for SAP’s HANA in-memory database technologies and is a leading provider of SAP HANA solutions. The importance of this point was reflected in Lenovo’s announcement that its ThinkSystem SR950 will be the industry’s first eight-socket server to support Optane DC and its 36TB memory capacity option, making it ideal for demanding SAP HANA environments.

Lenovo is also using 2nd Gen Xeon Scalable silicon to develop new engineered systems for key workloads, including SAP HANA, Microsoft SQL Server and Red Hat OpenShift Containers. These solutions will be verified as Intel Select Solutions, signifying their ability to support superior workload performance, ease of deployment and simplified evaluation. The company also expects to introduce new Intel Select Solutions for workloads, including VMware vSAN, Network Function Virtualization Infrastructure (NFVI), Blockchain Hyperledger Fabric and Microsoft Azure Stack HCI.

Compliments of Lenovo

This is not to suggest that the benefits of Lenovo’s refreshed portfolio are due entirely to Intel. The new solutions all benefit from the company’s ThinkShield, which secures Lenovo devices with oversight of development, supply chain and lifecycle processes. Along with having unique control over its own global supply chain, the company is also aligned with Intel’s Transparent Supply Chain, allowing customers to locate the source of components in their new systems. In addition, Lenovo oversees the security of suppliers that build intelligent components, making sure they conform to Trusted Supplier Program guidelines and best practices.

Finally, while the new and refreshed solutions can be purchased directly, they are also available through Lenovo’s TruScale, the company’s recently announced consumption-based as-a-service program. TruScale enables customers to use and pay for Lenovo data center hardware solutions without having to purchase the equipment.

Final analysis

To some folks in IT, predictability is a mundane topic that is easily superseded by whatever shiny new object falls off the industry turnip truck. That attitude ignores the fact that for customers, especially businesses that depend on data center systems and other solutions, IT predictability can mean the difference between succeeding, faltering or failing.

Knowledgeable vendors deeply understand that point and do their best to utilize their own and their strategic partners’ innovations to ensure that their products are fully, predictably capable of supporting both existing applications and emerging workloads. Lenovo obviously isn’t the only vendor benefitting from Intel’s 2nd-Gen Xeon Scalable chips. However, Lenovo’s new/refreshed ThinkSystem and ThinkAgile offerings, and the company’s creative use of Intel’s Cascade Lake enhancements, provide excellent examples of how this process works and will deliver often profound benefits to Lenovo customers.

© 2019 Pund-IT, Inc. All rights reserved.

Apple Presses Hard into Digital Services

By Charles King, Pund-IT, Inc.  March 27, 2019

In a keynote event hosted at its headquarters facility in Cupertino, CA, Apple announced four new and refreshed digital services aimed at the company’s current customers and other consumers. The service offerings included:

  • Apple News+, a subscription service attached to the Apple News app that provides access to over 300 magazines, newspapers and digital publishers. Apple News+ is currently available in the U.S. for $9.99 a month and in Canada for $12.99 a month.
  • Apple Arcade, a game subscription service that will feature over 100 new games exclusive to Apple, including original releases from creators, including Hironobu Sakaguchi, Ken Wong and Will Wright. Apple Arcade will launch globally in fall 2019. No pricing details were provided.
  • Apple Card, a new credit card integrated with the Apple Wallet app. Apple Card will not charge annual, late, international or over-the-limit fees, or penalty interest rates on missed payments. Apple said its goal is to provide interest rates that are among the lowest in the industry. Apple is partnering with Goldman Sachs and Mastercard to provide issuing bank and global payment support. Apple Card will be available in the U.S. this summer.
  • Apple TV+ includes a new Apple TV app and a new subscription service featuring content and programming from creative artists, including Oprah Winfrey, Steven Spielberg, Jennifer Aniston, Reese Witherspoon, Octavia Spencer and others. Customers can subscribe à la carte to Apple TV channels, including services such as HBO, SHOWTIME, CBS All Access, EPIX and Noggin. Shows can be watched in the Apple TV app, with no additional apps, accounts or passwords required. Pricing and availability for Apple TV+ will be announced in the fall.

The good, the not so great and the potentially ugly

Client-focused services have a long and mostly successful history in the tech industry. When technologies and products are initially finding their way into the market, services offer great ways for vendors to reach out to and stay engaged with customers. More importantly, as products inevitably mature and improvements become increasingly incremental, services are critical to driving new value for customers and additional revenues for vendors.

Apple’s key iPhone and iPad solutions clearly belong in this second group, and with sales growth slowing (and no explosive new products or categories on the horizon), it behooves the company to find new ways to commercially leverage its customer relationships. That is, to get them to spend more money with Apple. With those thoughts in mind, let’s review the event by considering what might be loosely called the good, the not so great and the potentially ugly aspects of the new Apple services.

On the good side, the company’s clear focus on delivering unified features and functions across the services, including ease of use, user privacy and security, customer personalization and family sharing was both gratifying and impressive. Those are all areas where Apple has the experience, technologies and scope to add value to its new offerings.

Additionally, focusing on these areas, particularly security and privacy, should help differentiate the company from its competitors, especially given the hapless bumbling and self-inflicted wounds that Facebook and, to a considerably lesser degree, Google have lately suffered.

That said, Apple’s optimistic view of the value of its “curation” capabilities is a bit more complicated. Curating will open the company to criticism over what content it chooses to deliver. Some will bleat about relatively minor variety and quality issues. But others are likely to home in on more substantial issues, like how the curation process might be used to undercut competitors, or whether the company’s fee structure is fair to its content partners.

That subject is already a point of discussion about Apple News+ (where the company will reportedly garner about half of subscription fees) and allegedly contributed to high-profile publications, including the New York Times and the Washington Post, declining to participate. Subscribers should also remember that some publications, including the Wall Street Journal, are limiting the content they provide to Apple News+.

Competitive disruption

The new Apple Card and Apple Arcade could be disruptive competitively, though for different reasons. First, Apple Card offers some features (“lower” [though not clearly specified] interest rates, no annual, international or late payment fees, and integrated cash back on purchases) that many consumers will find compelling. The elimination of international transaction and late fees also differentiates the Apple Card from many other charge cards.

Then again, Apple Card also suggests that adoption of Apple Pay is not ramping up as quickly among consumers, retailers and markets as Apple hoped. The company’s attempt to insert itself into credit/debit transactions is understandable and could eventually make a solid impact on its bottom line. Whether Apple can significantly alter the credit/debit landscape remains to be seen. It’s probably best to reserve judgment until credit/debit competitors’ responses to Apple Card become clear.

Apple Arcade emphasizes new content from some well-known game developers. Plus, the ability to seamlessly play games across Apple devices should appeal to the company’s most tried and true customers. Will it be enough to disrupt markets or attract new clients to Apple devices? It’s hard to say at this point.

Apple TV+’s sheer star power

Finally, the sheer star power present in the Apple TV+ reveal underscores its importance to the company. For weeks prior to the launch, orchestrated leaks focused on celebrity projects and surprise appearances. That the event would include big names in entertainment was a given. Plus, though Apple has been nibbling around the edges of video content and subscription services for years, it has failed to make actual, substantial commitments. That, in turn, has allowed competitors including Netflix, Amazon and YouTube (Google) to grab and form huge swaths of the market in their own images. So how does Apple TV+ compare?

Unfortunately, that’s difficult, if not impossible, to say. While the TV+ service is designed to compete directly with behemoth streaming services, like Netflix and Amazon Prime Video, few details were shared in Cupertino. Testimonials came from some powerhouse entertainment figures, but what their contributions will be, and how good or compelling, is up in the air.

A few, like the new drama starring Reese Witherspoon, Jennifer Aniston and Steve Carell, and the documentaries Oprah Winfrey is planning to produce, are likely to click. Others, such as Steven Spielberg’s revival of his long-gone Amazing Stories anthology (the series originally ran 30+ years ago), seem iffier. No samples of the new shows were shared and pricing for the service won’t be revealed for months, so it’s all a crapshoot at this point.

The all-star guest list in Cupertino suggests that Apple is committing a good portion of its substantial cash hoard to TV+. But throwing money around hasn’t helped highly touted past Apple products and services succeed. Plus, Hollywood has never been a great place for guaranteeing that sizable budgetary outlays will turn big profits.

Final analysis

Apple’s new digital and refreshed content services all qualify as decent bets that should prove attractive to many of the company’s customers. But how do they qualify on the good/not so great and potentially ugly scale? Let’s consider that in reverse order.

Of the four, Apple News+ seems the most superfluous, and is designed to do little more than massage additional cash out of the Apple News app. With 300+ magazines and newspapers included, there should be something to appeal to most customers. Then again, consumers interested in higher value publications like the Wall Street Journal would be better off with standalone subscriptions. If the reports about Apple’s 50% fee structure are true, the Apple News+ story could turn pretty ugly.

Calling Apple Arcade a “gateway” service may seem unfair, but consider that Apple has never been a powerhouse in traditional gaming tech. Instead, the company’s focus is mainly on the games available through the App Store. Apple Arcade is clearly a generational play aimed at younger iPhone owners, but whether or how well Apple can entice them from the free/low-fee game apps they love to a monthly subscription is uncertain. Call this one not so great.

Though Apple focused most of its firepower on the Apple TV+ subscription service launch, the prevalence of Baby Boomer and Gen X media celebrities made the proceedings seem a little tired. Sure, there were intriguing surprises, but they were mostly “safe” surprises that emphasized Apple’s deep pockets more than innovative thinking. Whether these efforts will result in compelling new content or produce significant threats to Netflix, Amazon Prime or other streaming services is anything but certain. Call it “good” but with reservations.

Of the new services, Apple Card seems to be the most potentially disruptive to the market and, perhaps, lucrative for Apple. Some critics have noted that plenty of credit cards offer better reward programs or forgive annual fees, but those offerings are typically aimed at higher income clients and those with solid credit histories. Providing those benefits across the board to Apple Wallet users could substantially level the credit/debit playing field and force other card issuers to follow Apple’s lead. Goldman Sachs’ involvement is also intriguing, since the company has not previously been involved in consumer banking. If Apple Card develops as planned, it could be as good as gold.

Finally, Apple’s new services build on existing products and platforms, meaning they pose far less risk to the company than launching all-new offerings. In some cases, Apple is working with talented, knowledgeable partners who should add substantial value. While it’s unlikely that all of the new services will be as game changing as Tim Cook and company suggest, there appears to be more upside than down. For Apple, its shareholders and its customers, that’s all to the good.

© 2019 Pund-IT, Inc. All rights reserved.

IBM SDU Refines and Redefines Enterprise Search

By Charles King, Pund-IT, Inc.  March 20, 2019

“Time is money” has been a central tenet of business technology for decades, from the mechanical calculators ubiquitous in office environments during the first half of the twentieth century to the servers and systems that became central to transaction processing and other applications in the latter half. Speeding and automating both simple and complex labor-intensive tasks enabled companies to decrease costs and increase efficiencies while becoming more competitive and profitable.

But as once exceptional technologies become increasingly commonplace and commoditized, it’s easy to forget a central point: That even amazing technologies don’t fix every problem organizations can and will confront. That’s as true for traditional solutions as it is for more recent developments, including eCommerce and customer relationship management (CRM) applications, as well as wide ranging, broadly available technologies, like search.

That last point – search – is central to a new offering IBM recently added to its Watson Discovery portfolio: Smart Document Understanding (SDU). Let’s consider what SDU is and does and why that will be welcomed by numerous enterprises.

The problem with search

Search is a settled technology, right? I mean, search engines have been around for decades, were key to the Internet’s development and evolution, have landed major vendors in antitrust court and drive billions of dollars in advertising and other revenues. So, what could IBM or anyone else do to make search different or better than it already is?

The problem isn’t with search so much as it is with what information is being searched. That is, traditional search engines are great for crawling, indexing and querying the relatively homogenous information that constitutes web sites and online data. However, they’re less effective at dealing with the masses of heterogeneous semi-structured (documents) and unstructured (image, video and sound file) information that businesses store in various on-premises and cloud locations.

But what about the “big data” platforms and products everyone was talking about a few years back? Those can be great for managing and searching certain kinds of data and data repositories, but complex processes and enterprise information infrastructures sometimes require more hands-on efforts that impact the effectiveness of conventional solutions. In other words, the more diverse and dispersed an organization’s data resources are, the less likely they can be fully managed or exploited with existing search tools.

The IBM SDU solution

In a blog post introducing IBM’s SDU, Donna Romer, VP of Watson Platform, Offering Management, noted a pair of interesting challenges where Smart Document Understanding can be applied. The first was a situation that an IBM customer, U.S. Bank, encountered: creating pricing schemas for credit card and debit card transaction services that can be easily and transparently customized for business customers. The second was to find ways to improve and speed the ways that business documents are prepared for training artificial intelligence (AI) solutions.

How did IBM help U.S. Bank? The company and Elavon, one of its subsidiaries, decided to develop a pilot and test program for a statement analysis offering capable of analyzing prospect billing statements in real time and generating optimized pricing proposals. Using Watson Discovery with SDU, the team cut the time required for proposal creation from 10 days to 2 minutes, radically improving sales processes for both U.S. Bank sales reps and the merchants they serve.

What about applying Watson Discovery with SDU to documents used for machine learning for AI training? Consider that AI training often requires thousands of documents that must be ingested and annotated, and those enrichments tested before they can be used to support successful machine learning.

Smart Document Understanding leverages advances from IBM Research, as well as the company’s recently introduced Corpus Conversion System, an AI-based cloud service that can ingest 100,000 PDF pages per day (with accuracy above 97 percent) and then train and apply advanced machine learning models to extract content from the documents at scale.

SDU allows Watson Discovery customers to visually train AI to understand documents, to distinguish textual elements, to extract valuable information and to exclude “noise” like headers and footers. That’s impressive, but in addition, no technical training is required to use SDU. Instead, a visual interface allows workers to point and click on elements such as titles, subtitles, headers and footers in training documents. The Watson system then displays how it understands the fields so staff can correct and resubmit documents if necessary.

In essence, Watson Discovery with SDU can be used to significantly speed document-based machine learning preparation for AI training. Plus, SDU’s point and click classification can also be applied to images, spreadsheets, PDFs and optical character recognition (OCR) content. As a result, Watson Discovery with SDU can also be used to train AI systems to recognize and ferret out valuable “small data” information assets contained in and typically obscured by massive volumes of case files, internal reporting documents, historical customer data, past transaction and interaction files and other business documents.

Final analysis

IBM’s addition of Smart Document Understanding to Watson Discovery highlights a pair of interesting points. First, that within IT, few things are ever really finished or settled. That squares with the fact that technologies are tools that, with evolutionary refinement, can be successfully applied to a growing number and variety of problems.

The second is that time is still, and probably always will be, money when it comes to business. A notable point to consider about Watson Discovery with SDU is how it can demonstrably benefit both old school processes, like sales proposal creation for U.S. Bank, and emerging efforts, including document-based machine learning for AI and searching for valuable “small data” assets.

Those are the kinds of problems that IBM’s new solution is solving today. It won’t be surprising if organizations find new ways to use IBM’s Watson Discovery with SDU in the months and years ahead.

© 2019 Pund-IT, Inc. All rights reserved.

Lenovo Launches Edge Computing Portfolio and Expands IoT Partnerships/Investments

By Charles King, Pund-IT, Inc.  February 27, 2019

A substantial, beneficial side effect of Moore’s Law and commodity computing is what might be called data center decentralization. As little as a decade ago, the vast majority of business computing efforts and workloads were concentrated in conventional on-premises IT facilities owned and operated by the organizations they served.

The balance shifted with the rise of public cloud platforms but is likely to be impacted even more dramatically by computing at the edges of networks. That trend, in turn, is being driven by the rise of the Internet of Things (IoT) and supported by emerging technologies, including robust mobile connectivity such as 5G.

At MWC (Mobile World Congress) Barcelona this week, Lenovo’s Data Center Group (DCG) introduced its new ThinkSystem SE350, the first of a new family of edge servers. The company also highlighted new partnerships and developments that will support IoT and edge computing as part of Lenovo’s long-term IoT growth plan. Let’s consider these announcements in greater detail.

Think 2019 – IBM Opens a New Chapter in Digital Transformation

By Charles King, Pund-IT, Inc.  February 20, 2019

Central to all tech vendor conferences is brand reinforcement, which companies hope to achieve by explaining themselves in public. The process itself varies widely from vendor to vendor, with some opting for squishiness over substance and others spouting vague generalities rather than concrete points. But others use these events to clarify their current positions, explicate core strategies and detail how they intend to help the organizations they serve successfully achieve desired goals.

Good examples of this latter approach were plentiful at IBM’s second annual Think conference last week in San Francisco. Over the course of 4+ jam-packed days, the company’s senior executives and product group leaders offered a clinic on presenting (with minimal jargon) IBM’s plans and why those efforts are meaningful to its customers and partners.

Let’s consider some of Think 2019’s key happenings and what the event said about the current and future state of IBM.

Lenovo TruScale – Where Infrastructure-as-a-Service Customers Come First

By Charles King, Pund-IT, Inc.  February 13, 2019

As-a-Service (aaS) solutions are nearly ubiquitous in the IT industry and commercial markets. The aaS model largely defines public cloud platforms and solutions and is central to a range of other hosted IT services. Indeed, the “pay as you go” model is one of the most compelling approaches to IT that has arisen during the past two decades.

Why so? Because it significantly eases or eliminates two of the biggest headaches that enterprises and other IT customers face – the capital investments required for IT equipment and the continual operational expenditures required to staff, run and manage on-premises IT infrastructures. However, it would be a mistake to assume that aaS offerings are perfect or a panacea for all IT challenges.

These points are germane when considering Lenovo’s TruScale Infrastructure Services, a new subscription-based offering the company says provides customers the precise hardware, software and services they need, whenever they require them, but without onerous investment or commitment requirements. Let’s take a look at Lenovo’s TruScale and what the company is offering customers and broader markets.