Edge AI refers to local hardware devices that run AI algorithms independently, processing sensor data on the device itself. The device doesn’t need to be connected to a network and can process data and make decisions without outside intervention.
Edge AI can make real-time decisions, which is useful in applications like robotics and autonomous vehicles. Because data is processed locally, communication bandwidth and cloud data storage requirements are reduced.
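The local-processing idea above can be illustrated with a minimal sketch. This is a toy stand-in, not a real edge framework: a hypothetical threshold-based anomaly detector plays the role of the on-device model, and the sensor readings are simulated. The point is that every step runs on the device, with no network call.

```python
# Toy edge-style inference loop: score sensor readings entirely on-device.
# The "model" is a hypothetical z-score anomaly detector standing in for
# the quantized neural network that real edge hardware would run.

from statistics import mean, stdev

def detect_anomalies(readings, z_threshold=3.0):
    """Flag readings more than z_threshold standard deviations from the mean.

    All computation happens locally; nothing is sent to a cloud service.
    """
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [r for r in readings if abs(r - mu) / sigma > z_threshold]

# Simulated sensor stream: a steady temperature with one spike.
sensor_readings = [20.1, 20.3, 19.9, 20.2, 20.0, 35.7, 20.1, 20.2]
anomalies = detect_anomalies(sensor_readings, z_threshold=2.0)
print(anomalies)  # the spike at 35.7 is flagged locally, in real time
```

In a deployed system, the decision (e.g. shutting a valve, braking a vehicle) would be taken immediately from this local result, and only summaries, if anything, would be uploaded later, which is where the bandwidth and storage savings come from.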
Dave McCarthy, research director at IDC, said that “AI is the most common workload in edge computing. As IoT implementations have matured, there has been an increased interest in applying AI at the point of generation for real-time event detection.”
Today most enterprise-generated data is created and processed centrally. Only 10 percent is processed outside of the data center, but Gartner predicts that this will change to nearly 75 percent by 2025.
Thomas Bittman, analyst at Gartner, wrote that “the agility of cloud computing is great – but it simply isn’t enough. Massive centralization, economies of scale, self-service and full automation get us most of the way there – but it doesn’t overcome physics – the weight of data, the speed of light. As people need to interact with their digitally-assisted realities in real-time, waiting on a data center miles (or many miles) away isn’t going to work. Latency matters. I’m here, right now, and I’m gone in seconds.”
Research company Omdia forecasts that the market for AI chips will grow from $7.7 billion in 2019 to $52 billion in 2025.