Artificial Intelligence: Applying Calculus to Advance Neural Network Calculations

By Dick Weisinger

Calculus ruled for hundreds of years after it was introduced in the seventeenth century as a primary tool for explaining physics and the world around us. But while the underlying math remains important, calculus has taken a back seat over the last few decades to numerical analysis and computation.

One example is Finite Element Analysis, a numerical technique that simulates the stresses, strains, and interactions of highly complex objects. The boundary conditions and geometries of physical things are typically too complex to capture in a single equation, or even a few. Finite Element Analysis instead breaks a problem down into hundreds or thousands of smaller pieces, solves the physics individually for each of the small, more regularly shaped subcomponents, and then determines how the system of pieces interacts. There are many more problems to solve, but each one is much easier. Finite Elements offers a consistent and reliable methodology for solving problems without requiring a deep understanding of the underlying math.
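As a rough illustration of that divide-and-assemble approach (a minimal sketch, not tied to any real FEA package), the Python snippet below solves the one-dimensional boundary-value problem -u''(x) = 1 on [0, 1] with piecewise-linear finite elements. The mesh size and uniform spacing are arbitrary choices for the example; each element contributes one simple local equation, and the assembled system is solved in one shot.

    import numpy as np

    # 1D finite elements for -u''(x) = 1 on [0, 1] with u(0) = u(1) = 0.
    n = 50                        # number of interior mesh nodes (arbitrary)
    h = 1.0 / (n + 1)             # element size on a uniform mesh
    x = np.linspace(h, 1 - h, n)  # interior node positions

    # Assemble the global stiffness matrix from identical local pieces:
    # each piecewise-linear element contributes the same tridiagonal stencil.
    K = (np.diag(np.full(n, 2.0))
         + np.diag(np.full(n - 1, -1.0), 1)
         + np.diag(np.full(n - 1, -1.0), -1)) / h

    b = h * np.ones(n)            # load vector for f(x) = 1
    u = np.linalg.solve(K, b)     # solve the assembled system of small pieces

    # For this load the exact answer is u(x) = x(1 - x)/2, and the
    # computed nodal values match it to machine precision.
    print(np.max(np.abs(u - x * (1 - x) / 2)))

Many simple, nearly identical local computations stand in for a single equation that would be hard or impossible to write down for an irregular real-world geometry.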

Machine Learning and Neural Networks approach problem solving in a way similar to Finite Elements. They break the problem down into smaller steps and then look for simple transformations that describe each of the smaller pieces.
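The sketch below (a toy example with made-up layer sizes and random weights) shows the idea: each layer of a network is a simple affine map followed by a nonlinearity, and stacking these easy pieces lets the whole approximate a complicated mapping.

    import numpy as np

    # A tiny feed-forward network: simple pieces composed into a complex map.
    rng = np.random.default_rng(1)
    layers = [(rng.normal(scale=0.5, size=(8, 2)), np.zeros(8)),   # hidden layer
              (rng.normal(scale=0.5, size=(1, 8)), np.zeros(1))]   # output layer

    def forward(x):
        """Apply each small, easily described step in sequence."""
        h = x
        for W, b in layers[:-1]:
            h = np.tanh(W @ h + b)  # one simple sub-transformation per layer
        W, b = layers[-1]
        return W @ h + b            # linear read-out

    print(forward(np.array([0.3, -0.7])))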

The idea of breaking problems down into ever smaller and more granular increments is also a key concept of Calculus, which calculates the changes that occur across infinitesimal steps and compresses the many small changes into a single equation.
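A quick numerical illustration of that compression (the function and step counts here are arbitrary choices): summing the small changes of f(x) = 2x over finer and finer steps converges to the single closed-form answer that calculus gives directly, the integral of 2x from 0 to 1, which equals 1.

    import numpy as np

    # Accumulate many small changes of f(x) = 2x over [0, 1]; the sums
    # approach the single exact value 1 that the integral compresses them into.
    for n in (10, 1000, 100000):
        dx = 1.0 / n
        xs = np.arange(n) * dx         # left endpoints of the small steps
        print(n, np.sum(2 * xs * dx))  # 0.9, 0.999, 0.99999, ... -> 1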

Researchers at the University of Toronto have applied principles from Calculus to improve the performance of modeling with step-wise Neural Networks. The researchers created something that they call an ODE network, built on Ordinary Differential Equations.
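The key observation, sketched below in simplified form (the dynamics function and its random weights are illustrative stand-ins, not the researchers' code), is that a residual network's layer update h ← h + f(h) is one Euler step of the differential equation dh/dt = f(h). Shrinking the step size trades a fixed stack of discrete layers for a continuous-time solve of that equation.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(4, 4))   # toy weights for illustration

    def f(h):
        """Stand-in for a learned dynamics function."""
        return np.tanh(W @ h)

    def euler_odenet(h0, depth):
        """Integrate dh/dt = f(h) from t = 0 to t = 1 in `depth` Euler steps.
        depth = 1 is a single residual-style block; larger depths approach
        the continuous ODE-network limit."""
        h, dt = h0.copy(), 1.0 / depth
        for _ in range(depth):
            h = h + dt * f(h)            # one residual-style update
        return h

    h0 = rng.normal(size=4)
    for depth in (1, 10, 1000):
        print(depth, euler_odenet(h0, depth))

In the researchers' approach, an off-the-shelf ODE solver with adaptive step sizes replaces the fixed loop above, so the network's effective depth can adapt to the problem.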

The MIT Technology Review commented that “like any initial technique proposed in the field, it still needs to be fleshed out, experimented on, and improved until it can be put into production. But the method has the potential to shake up the field—in the same way that Ian Goodfellow did when he published his paper on GANs.”
