Edge and Fog Computing: Data Preprocessing to Avoid Latency and Bandwidth Problems
Data centers are dying. They are being shut down, downsized or overlooked as new IT expenditures are earmarked for the cloud. Gartner predicts that by 2025, 80 percent of today's data centers will have been shut down. Cloud infrastructure provides advantages that today's data centers can't match.
The processors found in today's devices run rings around supercomputers from previous generations. That processing power allows data to be processed at the source where it is collected, with only summarized results communicated back to a central cloud. This kind of preprocessing at a system's peripheral nodes is called edge computing.
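To make the idea concrete, here is a minimal sketch of that pattern: a hypothetical edge node buffers raw sensor readings locally and forwards only a compact summary upstream. The `EdgeNode` class, its window size, and the summary fields are illustrative assumptions, not a reference to any particular product.

```python
import statistics

class EdgeNode:
    """Hypothetical edge node: buffers raw readings locally and
    forwards only a compact summary, never the raw stream."""

    def __init__(self, node_id, window=100):
        self.node_id = node_id
        self.window = window   # readings collected per summary
        self.buffer = []

    def ingest(self, reading):
        """Collect one raw reading; return a summary when the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) < self.window:
            return None
        summary = {
            "node": self.node_id,
            "count": len(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
            "mean": statistics.fmean(self.buffer),
        }
        self.buffer = []   # raw data is discarded after summarizing
        return summary

# 100 raw readings collapse into one small message for the cloud
node = EdgeNode("sensor-7", window=100)
summaries = [s for s in (node.ingest(float(i)) for i in range(100)) if s]
print(summaries[0])
```

The bandwidth saving is the whole point: the uplink carries one summary dictionary instead of 100 raw readings, at the cost of losing per-sample detail.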
When a collection of device nodes is coordinated to perform a workflow, avoiding the latency and bandwidth problems of communicating with a centralized cloud, it is called Fog Computing.
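A sketch of that coordination, under the same illustrative assumptions as above: a hypothetical fog gateway merges per-node summaries from several edge devices into a single fleet-level report, so the cloud receives one upload instead of one per device. The summary schema and the weighted-mean combination are assumptions for this example.

```python
def aggregate(summaries):
    """Combine per-node summaries into one fleet-level report
    before a single upload to the central cloud."""
    total = sum(s["count"] for s in summaries)
    return {
        "nodes": len(summaries),
        "count": total,
        "min": min(s["min"] for s in summaries),
        "max": max(s["max"] for s in summaries),
        # mean of per-node means, weighted by each node's sample count
        "mean": sum(s["mean"] * s["count"] for s in summaries) / total,
    }

# Example reports from two edge nodes (hypothetical values)
reports = [
    {"node": "a", "count": 100, "min": 1.0, "max": 9.0, "mean": 5.0},
    {"node": "b", "count": 300, "min": 0.0, "max": 8.0, "mean": 3.0},
]
fleet = aggregate(reports)
print(fleet)
```

Note that the weighting matters: a plain average of the two means (4.0) would overstate node a's contribution, since it supplied only a quarter of the samples.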
Ray Bernard explains that “Fog Computing nodes are physical equipment (such as gateways, switches, routers, servers, etc.) or virtual components (such as virtualized switches and virtual machines) that are tightly coupled with the intelligent end devices and provide computing resources to the edge devices.”
Lip-Bu Tan, president and CEO of Cadence, said that “the edge is becoming more and more intelligent. Sending everything to the cloud is too slow, so you’re going to see the edge starting to take off. The hyperscale cloud will continue to explode, but for automotive and industrial the activity will be at the edge. The next big thing is the edge. The edge is between the IoT device and the cloud. It’s a mini-cloud, but it’s not so massive and it will be energy-efficient. There will be an automotive cloud and different vertical clouds.”