Fellow execs - do you know anyone who is successfully running active/active workloads across two cloud service providers today?

Top Answer : There are many examples of customers successfully running active/active workloads across cloud providers using VMware Tanzu Application Service as an abstraction layer. Here is an example of one customer demoing how they swung live e-commerce traffic from one provider to another during their mainstage presentation. Feel free to reach out to me if you want to talk more.

Which Hyper Converged Infrastructure Product do you recommend?

Top Answer : VMware is highly recommended.

If these companies were affected, then the foundation of computing could be at risk. If a threat actor could manipulate the hardware layer via the firmware, BIOS, etc., they could weaponize well below the operating system, which calls into question the integrity of the entire computing stack and everything above it. Firmware and BIOS are like the rebar and concrete of a building: if that foundation is weak, the entire structure and anything dependent on it is at risk. We should not underestimate the potential severity of these companies being affected by the SolarWinds hack and what that means for the foundational computing hardware they provide to the world. What do others think? How could this impact your organization? (Big tech companies including Intel, Nvidia, and Cisco were all infected during the SolarWinds hack - The Verge)

Pulse Flash Read

Edge computing: a vendor’s dream in a catastrophizing world?

Edge Computing, as a category, sounds like a bad marketing attempt to create intrigue around yet another 'paradigm-shifting' technology that's 'the future of the enterprise'. The name actually refers to its function -- pushing the IT perimeter right up to the edge of where the data is collected, in the form of smart sensors or Internet-of-Things (IoT) devices.

Edge Computing (EC) isn't just about gathering data, though, because gathering too much data eventually becomes a problem in itself. Smart sensors mostly collect unusable 'noise'. Think of all the images a smart camera picks up throughout a day on a busy street: if the cloud has to process hours of nothing but the occasional pigeon wandering into shot, that's a ton of wasted compute and storage. What EC does is place computing power inside those sensors. That computing is done locally, in isolation from the rest of the network, after which only the clean, usable data is sent on to the cloud or data center.

While EC solves some big problems in principle, the whole process involves a lot of discrete steps and factors. Imagine the data as a game of hot potato. Each step of this game requires entrenched security -- it's bad news if someone can simply hack into the data being processed in the sensor (and there are many anecdotal reports of this happening), or corrupt it before it's passed on to the data center. That means containerization. There's also latency and power consumption to consider. What's the point of collecting data out in the world if it takes too long to make it to the cloud and back (a potentially fatal problem in industrial safety settings), or if devices die in remote locations? That means utilizing up-to-date 5G networks, LoRa (Long Range, low-energy radio), BLE (Bluetooth Low Energy), etc. And all the computing happening at source needs machine learning expertise thrown at it to figure out what's useful.
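The filter-at-source pattern described above -- run inference on-device, forward only the usable data -- can be sketched in a few lines. This is purely illustrative, not any vendor's API; the `motion_score` field and the 0.5 threshold are hypothetical stand-ins for whatever local inference a real sensor would run.

```python
def filter_at_edge(frames, threshold=0.5):
    """Keep only frames whose local inference score exceeds the
    threshold; everything else is discarded on-device and never
    reaches the cloud."""
    return [f for f in frames if f["motion_score"] > threshold]

# A day of mostly-empty street footage: only the two interesting
# frames get forwarded upstream.
frames = [
    {"id": 1, "motion_score": 0.05},  # pigeon
    {"id": 2, "motion_score": 0.91},  # actual event
    {"id": 3, "motion_score": 0.02},  # nothing
    {"id": 4, "motion_score": 0.77},  # actual event
]
to_cloud = filter_at_edge(frames)
```

The bandwidth saving is the point: four frames captured, two sent.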
This hot potato situation creates complexity as myriad, unproven vendors step in to offer ‘expertise’ and single point solutions to intrigued but overwhelmed IT teams.
Gartner predicts a 'volatile' situation over the next few years as quick adoption is met with insufficient solutions, and accordingly advises organizations to take an agile mindset into any EC projects.

But we live in a world where climate-driven catastrophes and social unrest are becoming the norm. That could spell disaster for all of us tech companies for whom the internet is the lungs breathing life into our organizations. On current infrastructure, large-scale outages will become a (bigger) problem. EC, however, enables a situation where devices don't depend on a constant connection. Data can be stored on-device until a network can be re-established, preserving it during downtime. Edge devices can also be architected into mesh networks, so that even if one node goes down, the rest of the network picks up the slack and finds alternative pathways for upload. If the cloud is the organization's brain, edge devices can act as miniature, extra, failsafe memory banks.

That said, given that these sensors are physical objects, they won't ever be totally safe from harm either. Just look at how conspiracy theorists have torched 5G towers (or whatever someone on Facebook thought was a 5G tower). So physical device safety is another aspect to consider.

As part of a movement made up of several overlapping fields of innovation -- IoT networks, 5G, virtual machines, cloud computing, Kubernetes, to name a few -- wide-scale adoption might take a while to come to fruition. Perhaps over time more generalized, established, expert vendors will take control of the field. IBM is already pretty well positioned for that. ClearBlade boldly offers a scalable, managed solution for a 'common software stack across the board'. Red Hat offers a 'Virtualization Manager' for centralized oversight with a graphical user interface and an open-source library to refer to, but that still leaves IT a lot of work to do.
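The store-and-forward behavior described above -- buffering readings on-device until a network can be re-established -- boils down to a bounded queue. A minimal sketch, assuming a hypothetical `send` uplink callable; all names here are illustrative, not any real edge SDK:

```python
from collections import deque

class EdgeBuffer:
    """Buffer readings locally; flush to the cloud only when a
    network link is available, preserving data through outages."""

    def __init__(self, send, maxlen=10_000):
        self._send = send                    # uplink callable: send(reading)
        self._queue = deque(maxlen=maxlen)   # bounded: oldest readings drop first

    def record(self, reading):
        """Always succeeds locally, online or not."""
        self._queue.append(reading)

    def flush(self, online):
        """Drain the backlog if a link is up; return how many were sent."""
        sent = 0
        while online and self._queue:
            self._send(self._queue.popleft())
            sent += 1
        return sent

# Usage: readings taken during an outage survive until the link returns.
uploaded = []
buf = EdgeBuffer(send=uploaded.append)
for reading in [21.5, 21.7, 22.0]:
    buf.record(reading)
buf.flush(online=False)   # outage: nothing leaves the device
buf.flush(online=True)    # link restored: the backlog drains
```

The bounded `maxlen` is a deliberate trade-off: a device with finite storage must eventually shed its oldest data rather than crash during a long outage.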
For now, EC seems to be living up to its name more by existing on the edges of conversations rather than in IT infrastructure. But as data collection turns increasingly into data deluge, Edge Computing might form the necessary dam that saves the IT ecosystem from disaster.

Are there any vendors you’re excited about in the Edge Computing space? Or is Edge Computing not there yet for you?

Top Answer : AWS