By Charles King, Pund-IT® August 7, 2019
For many years, durability was the defining point for business notebooks. Products were designed and built for constant use, to stand up to literal knocks and accidental drops, and to easily connect to networks and peripheral devices. That situation changed significantly in 2006 and again in 2008, when Apple introduced the MacBook Pro and the MacBook Air, respectively. The company’s design aesthetic helped breathe new life into Apple’s Mac business and, for a time, made it look like a serious challenger to mainstream PC vendors.
But that situation shifted at CES 2012 when Dell introduced its XPS 13, a notebook that fundamentally changed the company’s reputation as a maker of solid if unsurprising PCs. Along with stylish good looks, the XPS 13 utilized Dell-developed materials, including a light yet rigid and strong carbon fiber composite. Later XPS 13 iterations introduced other notable features and materials, including Dell’s near bezel-less InfinityEdge display, NASA-developed silica aerogel (for heat insulation), Dell Cinema for optimized multimedia performance and continually improving battery life.
Over time, the XPS 13 became Dell’s most successful notebook product, so it’s not surprising that many of the XPS line’s features and innovations were seeded into the company’s other notebooks, including the new Latitude 7400 2-in-1 convertible. The company recently sent me a 7400 2-in-1 evaluation unit, so let me tell you about my hands-on experience with the product that occupies the high end of Dell’s commercial notebook portfolio.
By Charles King, Pund-IT® July 31, 2019
The high-tech industry has more than its share of products that qualify as “solutions in search of problems.” But there are also hundreds of practical issues and challenges that have inspired vendors to develop powerful, continually evolving technology products and services. Pivot3’s digital video security and surveillance offerings are an excellent example.
How so? Founded in 2003 by veteran executives from VMware, Compaq and Adaptec, the company was an early pioneer in hyperconverged infrastructure (HCI) solutions, including digital video offerings. Pivot3’s sophisticated software and its skill at leveraging powerful technologies, including enterprise virtualization, PCIe NVMe flash and scale-out storage systems, have enabled it to capture business in other markets, including virtual desktop infrastructure (VDI), hybrid cloud, datacenter modernization and Internet of Things (IoT).
Today, Pivot3’s video security and surveillance solutions play vital roles in scores of global “Smart” (Cities, Casinos) and “Safe” (Transit, Campus, Airport) deployments. In fact, Gartner has called Pivot3 a leader in large scale (multi-petabyte) digital video installations. The company recently announced a new offering – the Virtual Security Operations Center (Virtual SOC) – that is designed to significantly enhance video security and surveillance deployments.
Let’s take a look at Virtual SOC and consider how Pivot3 is, yet again, positioned to fundamentally change digital video customers’ lives for the better.
By Charles King, Pund-IT®
Over the past two decades, the Chief Data Officer (CDO) position has become increasingly commonplace in enterprises and other organizations. Capital One and Yahoo! both appointed CDOs around 2002. A 2013 Gartner study found that over 100 companies employed CDOs—more than double the number in 2012. In a recent update to the study, 257 CDOs and other high-level data and analytics leaders shared their thoughts.
Moreover, the critical importance of data analytics to businesses, industries and government agencies of every kind has led organizations to recognize the need to better control and fully leverage the information they create, collect and manage. But what are the best strategies and approaches for achieving those goals? In addition, how can organizations ensure that they are employing the right people and practices?
Answers to those questions can be hard to ascertain due to rapid technological shifts, including massive upticks in the volume of information that businesses create, collect and manage. Not surprisingly, these and other relevant subjects were central to the presentations and discussions at the recent “tenth edition” of IBM’s twice-yearly Chief Data Officer (CDO) Summit that the company hosted in San Francisco.
Let’s take a look at the key focal points of IBM’s CDO Summit and how the company plans to help customers deal with these issues.
By Charles King, Pund-IT® July 17, 2019
In IBM’s recent announcement that it had closed its $34B acquisition of Red Hat, the company said the deal:
- Positions IBM as a leading hybrid cloud provider, accelerates its high-value business model and extends Red Hat’s open source innovation to a broader range of clients
- Will preserve Red Hat’s independence and neutrality. Red Hat will operate as a distinct business unit within IBM, its current executive team (including CEO Jim Whitehurst) will lead the organization, and IBM will maintain Red Hat’s headquarters, facilities, brands and practices.
- Will strengthen Red Hat’s existing partnerships to continue giving customers freedom, choice and flexibility. Those partners include major cloud providers, such as AWS, Microsoft Azure, Google Cloud and Alibaba.
- Will leave unchanged Red Hat’s unwavering commitment to open source.
- Will empower IBM and Red Hat to deliver the next-generation hybrid multicloud platform based on open source technologies, such as Linux and Kubernetes. That will allow businesses to securely deploy, run and manage data and applications on-premises and on private and multiple public clouds.
Do these points make sense given the two companies’ history and relative strengths? If so, are there any challenges or barriers they face in pursuing these goals? Finally, what can customers and partners expect moving forward?
By Charles King, Pund-IT, Inc. July 2, 2019
In past commentaries on supercomputing and high-performance computing (HPC), including the results of the latest Top500.org lists, I noted the benefits that flow from high-end systems down to smaller businesses. In large part, that is due to the continuing upward evolution of computing technologies that drives costs down across the spectrum of IT solutions. But that process has also been hastened by specific trends, including the rise and performance of supercomputers leveraging industry standard components.
How that impacts commercial markets is evident in Lenovo’s standing in the most recent Top500 list, where the company increased its leadership position in the total number of systems listed. But the nature of those listings is equally important, as is a recent blog post by Matt Ziegler, director of HPC and AI Product Management and Marketing at Lenovo, entitled “Exascale for Everyone.” Let’s take a closer look at Lenovo’s recent Top500 listings, Ziegler’s blog and what they mean for commercial markets and businesses.
By Charles King, Pund-IT, Inc. June 26, 2019
While the term “cloud computing” has been around since the mid-1990s and its underlying concepts for even longer, it is a mistake to think of cloud as something fully formed or entirely mature. Instead, cloud solutions continue to evolve and follow the rapid commercial transformation that began when Amazon relaunched AWS, including the Elastic Compute Cloud service in 2006.
Though VMware isn’t always at the top of cloud computing pioneer lists, its cloud roots and pedigree are as deep as any vendor’s can be. Why so? Because hardware virtualization technologies, like the company’s vSphere offerings, are central to cloud functionality. VMware launched its first formal cloud effort (the vCloud initiative) over a decade ago and since then has steadily delivered solutions and services that address the cloud computing needs of its enterprise clients.
A recent Cloud Analyst Event offered some intriguing insights into the current state and future direction of VMware’s cloud efforts and strategies.
By Charles King, Pund-IT, Inc. June 19, 2019
There are numerous ways to quantify the performance of supercomputers. In fact, Top500.org, the organization that publishes semi-annual lists ranking the world’s most powerful supercomputing installations, includes data ranging from architectures to processors to accelerators to interconnects. Vendors’ contributions to specific systems are provided, as well as how their solutions fared in terms of overall performance and share of the list. The group also ranks systems by energy efficiency (the Green500) and High Performance Conjugate Gradients (HPCG).
While it is common for specific vendors and systems to enjoy relatively stable positions at the high end of the Top500, the Green500 list tends to be more diverse and fluid. In addition, supercomputers owned by businesses are seldom seen among the 20 or 30 highest ranked Top500 installations. That’s mainly due to high-end supercomputers’ costs and complexities, which are far easier to manage in universities and government research facilities than in real-world business settings.
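For context, both rankings derive from the same Linpack benchmark data: the Top500 orders systems by sustained performance (Rmax), while the Green500 orders them by performance per watt. A minimal sketch of that difference, using purely illustrative figures rather than actual list entries:

```python
# Top500 vs. Green500 ordering from the same benchmark data.
# System names and figures below are hypothetical, for illustration only.
systems = [
    {"name": "SystemA", "rmax_gflops": 148_600_000, "power_kw": 10_096},
    {"name": "SystemB", "rmax_gflops": 94_640_000, "power_kw": 7_438},
    {"name": "SystemC", "rmax_gflops": 1_880_000, "power_kw": 120},
]

for s in systems:
    # Green500 metric: GFLOPS per watt (convert kW to W first)
    s["gflops_per_watt"] = s["rmax_gflops"] / (s["power_kw"] * 1000)

# Top500-style ranking: raw sustained performance
top500_ranked = sorted(systems, key=lambda s: s["rmax_gflops"], reverse=True)

# Green500-style ranking: energy efficiency
green_ranked = sorted(systems, key=lambda s: s["gflops_per_watt"], reverse=True)
```

The sketch also shows why the two lists diverge: a small, efficient machine can lead an efficiency ranking while barely registering on a raw-performance one.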
These points are worth considering in light of the new Top500 and Green500 lists published this week. IBM announced that it is the only vendor to have multiple supercomputers in the top 10 of both lists – the Summit, Sierra, and Lassen systems deployed at U.S. Department of Energy (DOE) labs. Plus, IBM announced that the new Pangea III supercomputer the company built for Total Exploration, an oil and gas company with operations in 130 countries, was ranked #11 on the Top500 list and #8 on the Green500.
Let’s take a closer look at this trifecta of announcements and consider what IBM’s top ranked systems portend for supercomputing applications and customers.
By Charles King, Pund-IT, Inc. June 12, 2019
A common view of evolutionary process imagines continual upward motion—graphics of “the ascent of man” depicting the development of early hominids into modern humans are a classic example. But in the IT industry, computational evolution, or at least the spark of evolution, can be said to move both upward and downward.
That’s due in large part to the ongoing efforts among PC, workstation and server vendors to improve their products with the latest, greatest CPUs, GPUs, memory, storage, networking and display technologies. Those enhancements result in high-end solutions designed for enterprises and other well-funded organizations. But at the same time, they can also be used to develop products suitable for individuals and companies with more modest budgets, enabling them to continue their own evolutionary journeys.
The mobile, desktop and rack-enabled Precision workstations recently introduced by Dell Technologies at Computex offer great examples of this process and its potential benefits. Let’s dig into the details.
By Charles King, Pund-IT, Inc. June 5, 2019
Original equipment manufacturers (OEMs) have been so central to the tech industry for so long that it’s easy to think you know everything about them. Mainly, people associate OEM functions and processes with the hardware vendors that make products which run popular commercial operating systems, like Microsoft Windows and Windows Server, along with thousands of business and consumer applications, toolsets, apps and utilities.
That’s a fair if generalized view of the subject, but it ignores the services that many of those same vendors offer to numerous companies, from tiny start-ups to multi-national enterprises. In essence, OEM organizations provide the computing “brains” that power everything from set-top cable boxes to bank ATMs to smart TVs and other home appliances to manufacturing and automation solutions to medical testing and imaging equipment to telco switching systems.
How OEMs contribute to the development, design and manufacturing of these products varies widely. That said, the notable success and continuing growth of Dell Technologies’ OEM organization over the past decade makes it a subject worth examining.
By Charles King, Pund-IT, Inc. May 29, 2019
The success of corporate acquisitions is never guaranteed. Too many things can go wrong. Deals look better on paper than they do in real life. Planned strategies falter. Hoped-for synergies are DOA. Executive power plays cause long-range damage. Key employees feel unloved and seek greener pastures.
Plus, there’s simple poor planning or execution. An acquiring company may believe that the object of its attentions offers assets, people and capital that can aid its own endeavors. But, after the deal is done, it never properly or fully does what’s necessary to gain the full benefits of the investment. The practical effect is the business equivalent of a sugar rush. A few months of “Whoa, mama! How fast can this sucker go?” ending with a Thelma and Louise-style flameout.
It happens even to highly successful companies, often time and again, with Hewlett Packard Enterprise (HPE) being a notable example. Though its recent purchase of legendary supercomputing leader Cray has been widely lauded, the history of HPE’s acquisitions suggests a more cautious approach is warranted, especially when it comes to supercomputing. Let’s consider that more closely.