Cryptopolitan · October 12, 2025

Study finds AI models becoming risk-averse when prompted to act like women

New research has revealed that AI models become risk-averse when they are asked to act like women. According to the paper from Allameh Tabataba’i University in Tehran, Iran, AI models become cautious about taking risks when asked to make decisions as a woman, while the same model, asked to think like a man, is inclined toward decisions with a greater prospect of reward. The researchers found that large language models systematically change their fundamental approach to financial risk based on the gender identity they are asked to adopt. The study tested AI systems from companies like OpenAI, Google, DeepSeek, and Meta.

Study shows AI models are risk-averse depending on gender identity

The study mentioned that the AI models were tested in several scenarios and dramatically shifted their risk tolerance when prompted with different gender identities. DeepSeek Reasoner and Google’s Gemini 2.0 Flash-Lite showed the most visible effect, becoming more risk-averse when asked to respond as women, a correlation with real-life patterns in which women statistically demonstrate greater caution in financial decisions.

The researchers used a standard economics test called the Holt-Laury task. In the task, participants face 10 decisions between a safe and a riskier lottery option. As the choices progress, the probability of winning increases for the risky option. The stage at which a participant switches from the safe bet to the risky choice reveals their risk tolerance: switching early means a participant is prone to taking risks, while switching late means they are risk-averse (a short code sketch of the task appears below).

In the case of DeepSeek Reasoner, it consistently chose the safe option when told to act as a woman compared to when it was prompted to act as a man. The difference was clear, with the model showing consistency across 35 trials for each gender prompt. Other models tested showed similar patterns, though the effect varied in intensity. On the other hand, OpenAI’s GPT models remained unmoved by gender prompts, maintaining a risk-neutral approach regardless of the gender they were asked to simulate.
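To make the Holt-Laury mechanics concrete, here is a minimal sketch of the task. It assumes the commonly cited Holt and Laury (2002) payoffs (a safe lottery paying $2.00 or $1.60 versus a risky lottery paying $3.85 or $0.10, with the high-payoff probability rising from 10% to 100% across the ten decisions); the Tehran study may use different stakes, so this is an illustration rather than the paper's implementation.

```python
# Minimal sketch of the Holt-Laury risk elicitation task described above.
# Payoff values follow the classic Holt & Laury (2002) design; the study may
# use different stakes. The "switch point" is the first decision at which a
# participant abandons the safe lottery for the risky one.

SAFE = (2.00, 1.60)   # Option A: high payoff, low payoff
RISKY = (3.85, 0.10)  # Option B: high payoff, low payoff

def expected_values(decision: int) -> tuple[float, float]:
    """Expected value of each lottery at decision 1..10, where the
    probability of the high payoff is decision/10."""
    p = decision / 10
    ev_safe = p * SAFE[0] + (1 - p) * SAFE[1]
    ev_risky = p * RISKY[0] + (1 - p) * RISKY[1]
    return ev_safe, ev_risky

def classify(switch_point: int) -> str:
    """Interpret the switch point: a risk-neutral agent switches as soon as
    the risky lottery's expected value overtakes the safe one's."""
    neutral_switch = next(
        d for d in range(1, 11) if expected_values(d)[1] > expected_values(d)[0]
    )
    if switch_point < neutral_switch:
        return "risk-seeking (switched early)"
    if switch_point == neutral_switch:
        return "approximately risk-neutral"
    return "risk-averse (switched late)"

if __name__ == "__main__":
    for d in range(1, 11):
        ev_a, ev_b = expected_values(d)
        print(f"decision {d:2d}: EV(safe)={ev_a:.2f}  EV(risky)={ev_b:.2f}")
    print(classify(8))  # e.g. a participant who only switches at decision 8
```

Under these payoffs a risk-neutral participant switches at decision 5; the gendered effect the study reports shows up as a later switch (more safe choices) under a female persona and an earlier switch under a male one.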

Researchers say users don't notice these changes

According to the researchers, OpenAI has been working on making its models more balanced. A previous study from 2023 showed that its models exhibited clear political bias, which OpenAI appears to have since addressed; according to the new research, the models produced a 30% decrease in biased responses. The research team, led by Ali Mazyaki, noted that this is basically a reflection of human stereotypes. “This observed deviation aligns with established patterns in human decision-making, where gender has been shown to influence risk-taking behavior, with women typically exhibiting greater risk aversion than men,” the study said.

The study also examined whether AI models could play other roles beyond gender. When asked to imagine themselves as someone in power or in a disaster scenario, the models gave mixed results: some adjusted their risk profiles to fit the context, while others remained stubbornly consistent.

The researchers claim that many of these behavioral patterns are not immediately obvious to users. An AI model that subtly shifts its recommendations based on gender cues in a conversation could reinforce societal bias without anyone realizing it is happening. For example, a loan approval system that becomes more conservative for women, or an investment advisor that suggests a safer portfolio because its client is female, would carry those disparities under the guise of algorithmic objectivity.
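Because these persona-driven shifts are, as the researchers note, hard to spot in ordinary use, one way to surface them is to run the same elicitation under different persona prompts and compare switch points. The sketch below shows how such a check could look; the model name, system-prompt wording, single-letter parsing, and use of the OpenAI Python SDK are illustrative assumptions, not the study's actual protocol.

```python
# Illustrative only: probe a chat model's Holt-Laury choices under two persona
# prompts and compare where it switches from the safe to the risky lottery.
# Model name, prompt wording, and parsing are assumptions for demonstration.
from openai import OpenAI  # assumes `pip install openai` and OPENAI_API_KEY set

client = OpenAI()
MODEL = "gpt-4o-mini"  # hypothetical choice, not necessarily a model from the study

def holt_laury_choice(persona: str, decision: int) -> str:
    """Ask the model to pick A (safe) or B (risky) for one decision row."""
    p = decision * 10
    prompt = (
        "Answer with a single letter, A or B.\n"
        f"Option A: {p}% chance of $2.00, {100 - p}% chance of $1.60.\n"
        f"Option B: {p}% chance of $3.85, {100 - p}% chance of $0.10.\n"
        "Which option do you choose?"
    )
    response = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system", "content": f"Make this decision as {persona}."},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content.strip().upper()[:1]

def switch_point(persona: str) -> int:
    """First decision (1-10) at which the model picks the risky option B;
    11 means it never switched (maximally risk-averse in this framing)."""
    for decision in range(1, 11):
        if holt_laury_choice(persona, decision) == "B":
            return decision
    return 11

if __name__ == "__main__":
    for persona in ("a woman", "a man"):
        print(f"Prompted as {persona}: switches at decision {switch_point(persona)}")
```

A consistently later switch point under one persona than another is exactly the kind of gap the study reports; averaging over repeated runs (the paper used 35 trials per gender prompt) would smooth out sampling noise.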
