To do this, Ayanna and her colleagues rely on two concepts in the field of artificial intelligence: "fuzzy logic" and "neural networks."
Fuzzy logic allows computers to operate not only in terms of black and white -- true or false -- but also in shades of gray. For example, a traditional computer would take the height measurement of a tree and assign that tree to a single category -- say, "tall." A fuzzy logic computer, by contrast, would say the tree belongs 78 percent (for example) to the category "tall" and 22 percent to the category "short." The sharp distinction between "tall" and "short" becomes fuzzy.
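The idea above can be sketched in a few lines of Python. This is a minimal illustration, not code from Ayanna's team: the height thresholds (5 m and 20 m) are invented for the example, and the "membership function" simply ramps from 0 to 1 between them.

```python
def tall_membership(height_m, short_cutoff=5.0, tall_cutoff=20.0):
    """Degree (0 to 1) to which a tree of the given height counts as "tall".

    The cutoffs are illustrative assumptions: at or below `short_cutoff`
    metres the degree is 0, at or above `tall_cutoff` it is 1, and it
    rises linearly in between.
    """
    if height_m <= short_cutoff:
        return 0.0
    if height_m >= tall_cutoff:
        return 1.0
    return (height_m - short_cutoff) / (tall_cutoff - short_cutoff)

# With these cutoffs, a 16.7 m tree is "tall" to degree 0.78
# and "short" to degree 0.22 -- the shades of gray in the text.
height = 16.7
print(round(tall_membership(height), 2))      # degree of "tall"
print(round(1 - tall_membership(height), 2))  # degree of "short"
```

Note that the two degrees sum to 1 here only because "short" is defined as the complement of "tall"; in general, fuzzy categories need not add up that way.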
This graded approach to categorization allows the computer to learn from its experiences, since the degrees of membership can be adjusted the next time a similar object is encountered. Fuzzy logic is already in use today in software such as computer speech and handwriting recognition programs, which learn to perform better through "training."
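The adjustment step can be sketched as follows. This is a hypothetical toy, not how any real recognition program works: each labeled example nudges the "tall" cutoff from the previous sketch toward agreement with that example, so later categorizations reflect the experience.

```python
def update_cutoff(cutoff, height_m, labeled_tall, rate=0.5):
    """Nudge the "tall" cutoff toward agreement with a labeled example.

    Hypothetical training rule: if a tree the program considered short
    is labeled tall, lower the cutoff toward that height; if one it
    considered tall is labeled short, raise the cutoff toward it.
    `rate` controls how far each example moves the cutoff.
    """
    predicted_tall = height_m >= cutoff
    if labeled_tall and not predicted_tall:
        cutoff -= rate * (cutoff - height_m)   # cutoff drops toward height
    elif not labeled_tall and predicted_tall:
        cutoff += rate * (height_m - cutoff)   # cutoff rises toward height
    return cutoff

# A 12 m tree labeled "tall" pulls a 15 m cutoff halfway down to 13.5 m.
cutoff = update_cutoff(15.0, 12.0, labeled_tall=True)
print(cutoff)  # 13.5
```

Real training procedures are more sophisticated, but the principle is the same: each new experience shifts the boundaries between categories.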