Artificial intelligence systems for face recognition have made major advances in the past decade, to the point that they now perform as well as, if not better than, humans. Despite this, they remain (for now) far from the cognitive mechanisms that are activated in our brains when we look at a person's moving face. This is shown by the results of a study conducted by the University of Bologna. By comparing artificial neural networks with humans, the researchers showed that artificial intelligences are not a good model for understanding how the brain analyzes moving faces. When we observe a face, we do not just analyze its identity: we automatically acquire a range of other information about the person's attitude and emotional state, which automatic recognition systems do not currently take into account.

"Once the neural network has determined whether one face is different from another, its task is done," explains Maria Ida Gobbini, senior author of the study. "For humans, on the other hand, recognizing a person's identity is just the starting point for a series of other mental processes that artificial intelligence systems are currently still unable to mimic."