What factors constrain edge computing? What factors enable/empower it?

Top Answer: The two things I believe in are economics and physics, and I think those two things will drive a huge amount of edge computing work. That is going to completely change the balance of power simply through the makeup of where all the different compute, storage, and networking pieces live. And physics is the thing that keeps you from sending data to a central data center and back fast enough.
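The "physics" constraint is easy to quantify with a back-of-the-envelope sketch (the fiber speed and distances below are illustrative round numbers, not from the answer): even at the speed of light in fiber, a round trip to a distant data center adds latency that no protocol can remove.

```python
# Minimum physical round-trip latency, ignoring routing, queuing,
# and processing overhead entirely.

SPEED_IN_FIBER_KM_S = 200_000  # light in optical fiber, roughly 2/3 of c

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over fiber."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

# A data center 3,000 km away costs at least 30 ms per round trip --
# already too slow for tight control loops or AR frame budgets.
for km in (10, 300, 3_000):
    print(f"{km:>6} km -> at least {round_trip_ms(km):.1f} ms round trip")
```

Moving the computation 10 km away instead of 3,000 km cuts the floor from roughly 30 ms to roughly 0.1 ms, which is the whole economic case for putting compute at the edge.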

Pink USB Stick
Software
Here’s an awkward sci-fi reference: in Altered Carbon, they talk about "needlecast," presumably a faster-than-light communication broadcast. Even then, they're still bounded. The amount of data they need to send has grown so large that even with a faster way to transmit it, there still wouldn't be enough bandwidth. So there are always latencies in the system.

The processors in mobile devices like smartwatches have become orders of magnitude better than anything we had 10 years ago in a desktop machine, a laptop, or even a mainframe, for that matter. But the fact of the matter is that the problem expands to take up all the available resources in that processor, no matter how great it is. You may have a storage limitation, a battery limitation, or a bandwidth limitation on the watch. Edge computing is a lot trickier than "where are my workloads located?" or "what's the security of the workloads?" It's about understanding the envelope you can operate that workload in. At the edge, it becomes constrained by so many different factors. That's another argument for making it ubiquitous and spreading it out as much as possible.

Nobody talks about the cross-platform, highly ubiquitous scheduler that would have to be available on all platforms and would recognize power limitations, bandwidth limitations, network latency issues, availability of storage, compute power, and memory availability. Nobody really talks about that, except for the researchers at UC Berkeley, who are doing things like InferLine. They're actually thinking about this stuff, and they've built a cross-platform scheduler for edge computing. They did it as an AI training scheduler so they could do sectional training of ML models on different devices to speed it all up. That way they could do inference at the edge, close to the edge, and in the core, and move those workloads around.
It's really a super interesting project. It's a couple of years old now.