“The market is driven by human emotions,” says Marios Savvides, chief scientist on the project. “What occurred to us is: can we abstract things like expressions or movements as early indicators of volatility? Is everyone excited, is everyone just shrugging their shoulders or scratching their heads or leaning forward… Did everyone have a reaction within the same five-second window?”
The main phase of the study will run over 12 months starting in the third quarter of 2023 and will include about 70 traders at investment firms, mostly in the United States. They will all have cameras installed on their computers to record their faces and gestures throughout the day, according to Savvides. The cameras will be linked to software from Oosto, an Israeli company formerly known as AnyVision Interactive Technologies Ltd., which hopes to develop a system that alerts traders to trends, or a volatility indicator that can be sold to investment firms.
Oosto, which makes facial recognition scanners for airports and workplaces, declined to name the firms in the study, but said those firms will get early access to any tool that comes out of the research. Everyone’s images will remain on their own computers or at their physical locations; only the numerical data representing their expressions and gestures will be uploaded to the researchers.
A person’s face can be mapped with 68 distinct landmark points that shift frequently, according to Savvides, who co-authored a study on facial “landmarks” in 2017.
The system will also track a trader’s gaze to see whether they are talking to a colleague or looking at their screen, and notice whether their peers are doing the same. “We have a whole bunch of algorithms that we’ll test to see if they correlate with a market signal,” Savvides said. “We are looking for needles in a haystack.”
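The researchers' approach, as described, boils down to reducing each face to a stream of landmark-derived numbers and screening those streams for correlation with a market signal. A minimal sketch of that screening step, using made-up feature names ("brow_furrow") and synthetic data since the study's actual features and signals are not public:

```python
import numpy as np

# Hypothetical illustration only: "brow_furrow" stands in for a per-minute
# feature derived from facial-landmark distances; "volatility" is a toy
# market series. Neither reflects the study's real data.
rng = np.random.default_rng(0)
volatility = rng.normal(size=390)  # one value per minute of a trading day
brow_furrow = 0.6 * volatility + rng.normal(scale=0.8, size=390)
noise_feature = rng.normal(size=390)  # a feature with no real relationship

def correlates(feature, signal, threshold=0.3):
    """Flag a feature whose Pearson correlation with the signal exceeds
    a (hypothetical) screening threshold -- the 'needle in a haystack' test."""
    r = np.corrcoef(feature, signal)[0, 1]
    return bool(abs(r) > threshold)

print(correlates(brow_furrow, volatility))    # built to correlate: a hit
print(correlates(noise_feature, volatility))  # pure noise: a miss
```

With many candidate features and one noisy signal, screening like this is prone to false positives, which is presumably why the study needs a year of data from dozens of traders.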
Advertisers already use facial analysis to gauge how engaging an ad is; retailers use it to see how bored customers are; and, somewhat unnervingly, hiring managers use it to judge whether a job candidate is motivated enough.
At first glance, studying the stock market this way seems more benign. Trading algorithms have for years tried to profit from information gleaned from weather, social media or satellites, but there is something unsettling about the traders themselves being mined for data. It could also be argued that the researchers risk putting traders into a never-ending feedback loop, as their actions and decisions become derivative and their lemming-like behavior is amplified. If you think the market is already driven by a herd mentality, a tool like this could well make it worse – but then, that is how markets work too.
“Everyone on the street talks,” says a London trader (not part of the study), who said such alerts about peers’ reactions might be helpful. “A whole part of doing what we do is discussing ideas and sharing information… Nonverbal communication is huge.” Years ago, trading floors were noisy places where people would often talk on three or four phone lines at once; now many communicate through chat rooms, and talk is minimal.
But the study also points to another inconvenient reality: facial recognition is here to stay, and its more controversial cousin, facial analysis, may be as well. For all the concerns raised about facial recognition, including the mistakes it can make as a surveillance tool, tens of millions of us still use it without hesitation to unlock our phones.
Facial analysis of the kind used by Carnegie Mellon opens an even bigger can of worms. Last summer, Microsoft said it would retire facial analysis tools that estimate a person’s gender, age and emotional state, acknowledging that such systems can be unreliable and invasive.(1) Investment firms, though, will take any data they can get to gain an edge. And this study – if successful – could encourage research into analyzing faces for other purposes, such as assessing a person’s emotional state during a business meeting.
“If you’re doing a business deal over Zoom, could you have an AI that reads faces tell you if someone is trying to scam you, or if they’re a tough negotiator?” Savvides asks. “It’s possible. Why not?”
Zoom Video Communications Inc. last year launched a feature that tracks sentiment in recorded business meetings. Called Zoom IQ, the software, aimed at sales professionals, gives meeting participants a score between 0 and 100, with anything over 50 indicating greater engagement in the conversation. The system does not use facial analysis but tracks how the speakers interact, such as how long one waits to respond, and delivers the score at the end of the meeting.
More than two dozen rights groups have called on Zoom to stop working on the feature, arguing that sentiment analysis is backed by pseudoscience and is “inherently biased.” A Zoom spokesperson said the company is still selling the software, and that it “transforms customer interactions into meaningful insights.”
You could argue that the Carnegie Mellon researchers shouldn’t care what a face-analysis tool tells them about traders’ sentiments; they just need to identify patterns that point to correlations and turn those numbers into an algorithm. But the downside of turning feelings into numbers is this: it risks diminishing one of the essential attributes of being human. It might be better if the practice didn’t spread.
More from Bloomberg Opinion:
• Why Casinos Spy on Their Very Wealthy Clients: Parmy Olson
• Careful, Here Come the Predictions for 2023: John Authers
• Magnus Carlsen’s Strongest Future Opponent Is the AI: Tyler Cowen
(1) Amazon continues to sell facial analysis software that estimates a person’s gender and guesses whether they are happy, confused or disgusted, among other things.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is the author of “We Are Anonymous.”
More stories like this are available at bloomberg.com/opinion