posted by
Dopefish
on Saturday February 22 2014, @08:00PM
from the knowledge-is-power dept.
dyslexic writes "An Equation For Intelligence?
It is something like the philosopher's stone: a sort of E=mc² that would put intelligence, and more particularly artificial intelligence, on a sound theoretical footing. But could it be as simple as this TED talk video (available at the link, alongside the article) suggests? The video explains some of the idea and gives examples of the principle in action, where it is claimed to replicate a number of "human-like" intelligent behaviors, including cooperation and tool use."
(Score: 3, Interesting) by TGV on Sunday February 23 2014, @03:37AM
The relation between information and entropy is that they can be expressed similarly. But information is something in our heads, not in the state of a gas. Claiming that information and entropy are intertwined doesn't help us understand intelligence.
The real problem with the equation, though, is computing the entropy. How do you know which path is going to yield the best options? You'd still need an evaluation function for the states, and that's precisely where the intelligence is. Not in some steepest-gradient search.
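The commenter's point can be made concrete with a toy sketch. What follows is a hypothetical illustration, not the implementation from the talk: a greedy "entropy-maximizing" agent on a 1-D line with walls, which scores each candidate move by how many distinct states remain reachable within a short horizon. Even in this tiny setting, all of the actual work lies in enumerating and evaluating the future states.

```python
# Toy sketch (hypothetical, not the talk's implementation): an agent that
# greedily maximizes its future options, a crude stand-in for maximizing
# "causal entropy". The grid, walls, and horizon are invented for illustration.

from itertools import product

WALLS = {-3, 5}          # blocked positions on the line
MOVES = (-1, 0, 1)       # step left, stay, step right

def step(pos, move):
    """Apply a move; bumping into a wall leaves the agent in place."""
    nxt = pos + move
    return pos if nxt in WALLS else nxt

def reachable_states(pos, horizon):
    """All distinct positions reachable within `horizon` steps."""
    seen = {pos}
    for seq in product(MOVES, repeat=horizon):
        p = pos
        for m in seq:
            p = step(p, m)
            seen.add(p)
    return seen

def entropy_greedy_move(pos, horizon=3):
    # Score each move by the number of reachable future states; this count
    # (its log would be an entropy-like quantity) is the evaluation function
    # the comment says is doing the real work.
    return max(MOVES, key=lambda m: len(reachable_states(step(pos, m), horizon)))

# An agent next to a wall moves away from it, toward open space.
print(entropy_greedy_move(4))   # position 4 sits beside the wall at 5 → -1
```

Note that the state evaluation here is brute-force enumeration, which is exponential in the horizon; any practical version has to replace it with exactly the kind of smart evaluation function the comment argues is where the intelligence actually resides.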