Men and women have different patterns of smiling, new research reports — and this, the authors add, can allow AI to easily distinguish between the genders.
Many a man has been enraptured by the right smile, and many more will probably follow — although the opposite doesn’t seem to hold true. Regardless, while romance unfolds across the world, one team of researchers from the University of Bradford is working to bring this subtle yet powerful gesture to bear in our interactions with artificial intelligence (AI). According to them, computers can learn to differentiate between men and women simply by observing a smile.
Led by Professor Hassan Ugail, the team mapped 49 distinct points (or ‘landmarks’) on smiling human faces — mainly around the eyes, mouth, and down the nose. These points were then used to measure how underlying muscle movements changed the participants’ faces while smiling. These recorded changes included both the distance between different points and the ‘flow’ of the smile, i.e. how much, how far, and how fast the different landmarks moved as a person was smiling.
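The paper itself doesn’t publish its feature-extraction code, but the idea described above — tracking how far each landmark travels across the frames of a smile, plus how key distances change — can be sketched roughly like this (the mouth-corner indices here are placeholders, not the study’s actual landmark numbering):

```python
import math

def smile_dynamics(frames, corner_a=0, corner_b=1):
    """Summarise landmark motion across a smile clip.

    `frames` is a list of video frames; each frame is a list of (x, y)
    landmark coordinates (49 per face in the study). Returns the total
    travel ("flow") of each landmark across the clip, plus the net change
    in mouth width, assuming `corner_a` and `corner_b` index the mouth
    corners (hypothetical indices for illustration).
    """
    n = len(frames[0])
    travel = [0.0] * n
    for prev, cur in zip(frames, frames[1:]):
        for i, ((x0, y0), (x1, y1)) in enumerate(zip(prev, cur)):
            travel[i] += math.hypot(x1 - x0, y1 - y0)

    def mouth_width(frame):
        (xa, ya), (xb, yb) = frame[corner_a], frame[corner_b]
        return math.hypot(xb - xa, yb - ya)

    width_change = mouth_width(frames[-1]) - mouth_width(frames[0])
    return travel, width_change
```

A clip where the mouth corners spread far apart would yield a large positive `width_change` — the “broader smile” the researchers report seeing more often in women.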
Birds of a gender smile together
Then, the team crunched the data to determine if ladies smile differently than gents — and they do. The team notes that there are ‘noticeable differences’ in smile-patterns between the genders, with women able to boast having the more expressive ones.
“Anecdotally, women are thought to be more expressive in how they smile,” says Ugail. “Our research has borne this out.”
“Women definitely have broader smiles, expanding their mouth and lip area far more than men.”
Based on their findings, the team created an algorithm to analyze smile patterns and tested it against video footage of 109 people as they smiled. They report that the algorithm correctly determined the gender of the smilers in 86% of the cases — and they believe that this accuracy can be easily improved. Ugail says the algorithm relied on “fairly simple machine classification” as they were just testing the validity of the concept; a more sophisticated AI could easily improve the recognition rates, he adds.
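To give a flavour of what “fairly simple machine classification” can mean, here is one of the simplest possible approaches: a nearest-centroid classifier that averages the smile-dynamics feature vectors for each gender label and assigns new smiles to the closest average. This is only an illustrative stand-in — the paper doesn’t specify this exact classifier:

```python
def train_centroids(features, labels):
    """Average the feature vectors (e.g. landmark-travel summaries)
    for each label, yielding one centroid per class."""
    sums, counts = {}, {}
    for vec, lab in zip(features, labels):
        acc = sums.setdefault(lab, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [s / counts[lab] for s in acc] for lab, acc in sums.items()}

def predict(centroids, vec):
    """Assign `vec` to the label whose centroid is nearest
    (squared Euclidean distance)."""
    def sq_dist(lab):
        return sum((a - b) ** 2 for a, b in zip(centroids[lab], vec))
    return min(centroids, key=sq_dist)
```

Even a baseline this crude can separate classes when the feature distributions differ clearly — which is why the authors expect a more sophisticated model to push the 86% figure higher.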
Automatic gender recognition is already available and in use today. But existing methods rely on static images and compare fixed facial features. This is the first software to use dynamic movement to distinguish between men and women, and the team hopes that their work will help enhance machine learning capabilities in the long run.
However, their research has also raised some intriguing questions that they’re planning on pursuing — for example, how would their software respond to the smile of a transgender person, and how would plastic surgery impact the smiling patterns of a subject?
“Because this system measures the underlying muscle movement of the face during a smile, we believe these dynamics will remain the same even if external physical features change, following surgery for example,” said Professor Ugail.
“This kind of facial recognition could become a next-generation biometric, as it’s not dependent on one feature, but on a dynamic that’s unique to an individual and would be very difficult to mimic or alter.”
The paper “Is gender encoded in the smile? A computational framework for the analysis of the smile driven dynamic face for gender recognition” has been published in the journal The Visual Computer.