Engineers have been chasing a form of artificial intelligence (AI) that could drastically lower the energy required for typical AI tasks such as recognizing words and images. This analog form of machine learning performs one of the key mathematical operations of neural networks using the physics of a circuit instead of digital logic. But a major limitation of the approach is that deep learning's training algorithm, backpropagation, must still be run on GPUs or other separate digital systems.
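The key operation referred to here is the multiply-accumulate step at the heart of a neural-network layer. In an analog crossbar, weights are stored as conductances, inputs arrive as voltages, and Ohm's and Kirchhoff's laws produce the matrix-vector product as output currents. The NumPy sketch below simulates that physics with made-up conductance and voltage values, purely to illustrate the idea:

```python
import numpy as np

# Hypothetical conductance matrix (siemens): each cell models a programmable
# resistive element encoding one synaptic weight.
G = np.array([[1.0e-6, 2.0e-6, 0.5e-6],
              [3.0e-6, 1.5e-6, 2.5e-6]])

# Input activations encoded as voltages (volts) applied to the crossbar lines.
V = np.array([0.2, 0.5, 0.1])

# Ohm's law (I = G * V) gives each cell's current, and Kirchhoff's current
# law sums the currents along each output wire, so the circuit computes the
# matrix-vector product "for free" in its physics.
I = G @ V  # one output current per row of weights

print(I)
```

A digital chip would perform the same multiply-accumulate with explicit arithmetic; the analog circuit gets it from the device physics, which is where the energy savings come from.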
University of Montreal AI expert Yoshua Bengio, his student Benjamin Scellier, and colleagues at Rain Neuromorphics, a UF startup and UF Innovate | The Hub alum building artificial intelligence processors inspired by the brain, have come up with a way for analog AIs to train themselves. That method, called equilibrium propagation, could lead to continuously learning, low-power analog systems with far greater computational ability than most in the industry now consider possible, according to Rain Neuromorphics CTO Jack Kendall.
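Equilibrium propagation, as published by Scellier and Bengio, trains a network using two physical relaxation phases rather than a separate digital backpropagation pass: a "free" phase where the circuit settles on its own, and a "nudged" phase where the output is weakly pulled toward the target; weights are then updated from the difference in local correlations between the two phases. The following is a minimal NumPy sketch of that two-phase loop on a tiny made-up network (sizes, rates, the logistic activation, and the single training pair are all illustrative assumptions, not Rain's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layered network: 2 clamped inputs, 3 hidden units, 1 output,
# with symmetric connections as in equilibrium propagation.
n_in, n_hid, n_out = 2, 3, 1
W1 = rng.normal(0.0, 0.5, (n_in, n_hid))   # input -> hidden
W2 = rng.normal(0.0, 0.5, (n_hid, n_out))  # hidden -> output

def rho(s):
    """Smooth firing-rate nonlinearity (logistic sigmoid, an assumption)."""
    return 1.0 / (1.0 + np.exp(-s))

def relax(x, y, beta, steps=200, dt=0.2):
    """Let the free units settle to a fixed point of the (nudged) dynamics."""
    h = np.zeros(n_hid)
    o = np.zeros(n_out)
    for _ in range(steps):
        dh = -h + rho(x) @ W1 + rho(o) @ W2.T
        do = -o + rho(h) @ W2 + beta * (y - o)  # beta = 0: free phase
        h += dt * dh
        o += dt * do
    return h, o

# One illustrative training pair and hyperparameters (assumptions).
x, y = np.array([0.5, 0.9]), np.array([0.2])
beta, lr = 0.5, 0.1

errors = []
for _ in range(60):
    h0, o0 = relax(x, y, beta=0.0)    # free phase: circuit settles unaided
    errors.append(float((o0 - y) @ (o0 - y)))
    hb, ob = relax(x, y, beta=beta)   # nudged phase: output pulled toward y
    # Contrastive, local update: difference of pre/post correlations
    # between the nudged and free equilibria approximates the gradient.
    W1 += (lr / beta) * (np.outer(rho(x), rho(hb)) - np.outer(rho(x), rho(h0)))
    W2 += (lr / beta) * (np.outer(rho(hb), rho(ob)) - np.outer(rho(h0), rho(o0)))

print(f"squared error: {errors[0]:.4f} -> {errors[-1]:.4f}")
```

The appeal for analog hardware is that both phases are just the circuit settling under its own physics, and each weight update depends only on locally measurable quantities, so no separate digital processor has to run backpropagation.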