
posted by Dopefish on Saturday February 22 2014, @08:00PM   Printer-friendly
from the knowledge-is-power dept.
dyslexic writes "An Equation For Intelligence? It is something like the philosopher's stone: a sort of E=mc² that would put intelligence, and more particularly artificial intelligence, on a sound theoretical footing. But could it be as simple as this TED talk video (linked alongside the article) suggests? The video explains some of the theory and shows the principle in action, where it is claimed to replicate a number of 'human-like' intelligent behaviors, including cooperation and tool use."
 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Interesting) by TGV (2838) on Sunday February 23 2014, @03:37AM (#5111)

    The relation between information and entropy is that they can be expressed similarly. But information is something in our heads, not in the state of a gas. Claiming that information and entropy are intertwined doesn't help us understand intelligence.

    But the real problem with the equation is computing the entropy. How do you know which path will yield the best options? You'd still need an evaluation function for the states, and that's precisely where the intelligence lies — not in some steepest-gradient search.
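    To make the objection concrete, here is a minimal sketch (my own toy example, not code from the talk or article) of the "keep your options open" idea the equation formalizes: an agent on a bounded 1-D grid scores each move by how many distinct cells it could still reach within a horizon tau, a crude stand-in for the path entropy being maximized. All names (reachable, best_move) and the grid setup are hypothetical.

    ```python
    # Toy sketch of entropy-seeking behavior on a 1-D grid [lo, hi].
    # The "entropy" proxy is just the count of distinct cells reachable
    # within tau steps from a candidate position.

    def reachable(pos, tau, lo, hi):
        """Count distinct cells reachable from pos in at most tau steps."""
        frontier = {pos}
        seen = {pos}
        for _ in range(tau):
            nxt = set()
            for p in frontier:
                for q in (p - 1, p + 1):
                    if lo <= q <= hi and q not in seen:
                        nxt.add(q)
            seen |= nxt
            frontier = nxt
        return len(seen)

    def best_move(pos, tau, lo, hi):
        """Pick the move whose successor keeps the most future options open."""
        moves = [m for m in (-1, 1) if lo <= pos + m <= hi]
        return max(moves, key=lambda m: reachable(pos + m, tau, lo, hi))

    # An agent starting near the left wall of [0, 10] drifts toward the
    # middle, where more future paths remain open.
    pos = 1
    for _ in range(3):
        pos += best_move(pos, tau=3, lo=0, hi=10)
    print(pos)  # → 4
    ```

    Note that the "evaluation function" the comment asks about is smuggled in here as reachable(): the toy only works because counting reachable states is trivial on a grid, which is exactly the hard part in any realistic state space.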

    Starting Score:    1  point
    Moderation   +2  
       Interesting=1, Underrated=1, Total=2
    Extra 'Interesting' Modifier   0  

    Total Score:   3