April 9, 2024

Although some deep learning algorithms adopt a local form of backpropagation of error, passed essentially between neighboring neurons, this still requires that the forward and backward connections between those neurons be symmetrical, and in the synapses of the brain this almost never happens. More recent algorithms adopt a somewhat different strategy, implementing a separate feedback path that helps neurons find their errors locally. Although this is more biologically plausible, the brain has no separate computational network dedicated to assigning blame for errors. What it does have is neurons with complex structures, unlike the homogeneous "balls" currently used in deep learning.

Branching networks

Scientists draw inspiration from the pyramidal cells that fill the human cerebral cortex. "Most of these neurons are shaped like trees, with their 'roots' reaching deep into the brain and their 'branches' extending toward its surface," Richards said. "What is remarkable is that the roots receive one set of inputs and the branches a different one." Curiously, the structure of a neuron often turns out to be just the right one for solving a computational problem efficiently.
