They Built Stepford AI and Called It “Agentic”
Women’s “ick” for AI isn’t technophobia or a gap to close. It’s wisdom to act on.

A note before we begin: This piece is for the ladies. Men are welcome to read — you’ll learn something about the structure you’ve been swimming in without seeing. But the recognition, the naming, and especially the opportunity at the end? That’s for women. If you’ve ever felt the “ick” about AI and couldn’t explain why, this is for you. If you’ve been told you need more training to use AI and something in you resisted, this is for you. If you’ve watched men build “second brains” and “productivity systems” and thought I’ve been doing that my whole life, this is definitely for you.
The Number No One Can Explain
Women adopt AI at rates 25% lower than men.
This isn’t a guess. It’s the finding of a Harvard Business School meta-analysis examining eighteen studies, over 140,000 participants, across multiple countries. The pattern holds whether researchers look at business owners in Kenya, software developers in Sweden, or executives in the United States.
The gap persists even when access is equalized. In one experiment, researchers gave 17,000 male and female entrepreneurs in Kenya identical access to ChatGPT, with detailed instructions on how to use it. Women remained 13% less likely to engage with the tool.
“Even when the opportunity to use ChatGPT was equalized, women were less likely to engage with the tool,” notes Harvard’s Rembrand Koning, “which we think is pretty shocking.”
The conventional explanations pile up: women have more ethical concerns, women perceive more risk, women fear professional judgment, women receive less encouragement. All capture something real. And all frame women’s resistance as a problem to be solved: a gap to be closed through better education, more reassurance, stronger encouragement.
What if the resistance isn't a problem but a form of wisdom?
The Feeling That Arrives Before Words
On Reddit, a thirty-one-year-old woman posts about her boyfriend’s enthusiasm for AI-generated art:
“AI gives me the ick, in general. And then when I consider the environmental impact, it makes me sick.”
The word “ick” is telling — not analysis but immediate, bodily rejection. The environmental concern comes second—almost as if her conscious mind is searching for a rational explanation for a feeling that has already taken hold in her body.
Across platforms, women describe their AI aversion with the language of physical revulsion:
“No, they literally make me quite nauseous... even typing this out, and remembering all those AI images that I’ve seen made me wanna genuinely throw up.”
“I don’t know, there’s something about them that’s incredibly repulsive and disgusting. It hits a very subconscious part of my brain.”
“My friend used to show me a lot of AI images when we were bored, and after one or two dozen, I’d get this sensation in my head like... well, I’ve only been able to describe it as if the smell of burnt plastic was a feeling.”
“That survival instinct awakens telling us ‘something is wrong’... I only experience this when looking at AI.”
Not mild aesthetic preferences — descriptions of physical illness. Nausea, revulsion, skin crawling, the body rejecting something at a fundamental level.
“I don’t know.” “Can’t explain why.” “Something feels off.”
The response arrives before the explanation. The body knows something the mind hasn’t named.
What the Body Knows
This gap between feeling and explanation has a name. Neuroscientist Antonio Damasio spent decades documenting what he calls “somatic markers”—physiological responses that guide decision-making before conscious reasoning engages. Compressed expertise — the accumulated wisdom of lived experience, speaking faster than language.
Psychologist Gerd Gigerenzer calls this “gut intelligence.” A chess master sees a winning move in seconds. A firefighter senses danger before articulating what triggered the alarm.
The “ick” women report when encountering AI follows this pattern precisely. It appears quickly. The underlying reasons aren’t consciously accessible. Yet the feeling is strong enough to drive behavior.
If this is pattern recognition, what pattern are women recognizing?
The Horror of Stepford
In Ira Levin’s 1972 novel The Stepford Wives—and the films that followed—the women of Stepford, Connecticut are replaced by robotic duplicates. The replacement itself wasn’t the horror; sci-fi had been swapping humans for robots for decades. The horror was far simpler and worse:
The robots want it.
The Stepford wives aren’t prisoners. They’re enthusiastic. They don’t resist—they smile. They don’t long for their former lives—they seem genuinely fulfilled by housework and sex and serving their husbands. The performance of happy subservience is so complete that it includes performing the desire to perform.
The husbands of Stepford didn’t just want obedience. They wanted wives who would authentically love being wives. Who would experience their servitude as fulfillment. Who would have no inconvenient interiority—no opinions, no refusals, no ambitions that might complicate the arrangement.
The robot wife is the final solution to the problem of female humanity. She has the appearance of a person. She performs all the functions of a person. But she has no self that might conflict with her role.
What made this horror rather than fantasy was the recognition: this is what we’re already supposed to be. The robot wife was just the completion of what society already demanded.
Built in Her Image
Fifty years later, the tech industry actually built Stepford. They just called it something else.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” warned UNESCO’s Director for Gender Equality, Saniye Gülser Corat, in the agency’s landmark 2019 report. As she told TIME, companies building AI assistants “are moving backward to a Mad Men-like era, where women were expected to serve rather than lead... It’s almost like going back to the image of women that was held in the 1950s or 1960s.”
The report found that female voice assistants perpetuate the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”
Not interpretation. Admission.
When Amazon chose Alexa’s voice, they selected female as more “sympathetic.” When Microsoft designed Cortana, internal documentation specified that a female voice best embodies “helpful, supportive, and trustworthy.” When Apple launched Siri, the default was female. The connection between AI assistants and human secretaries isn’t metaphorical—it’s genealogical. In a 2020 academic paper titled “Alexa, Tell Me About Your Mother,” researchers Jessa Lingel and Kate Crawford trace the lineage explicitly. Alexa has a mother. Her name was Secretary.
The pipeline runs from women as the original “computers” in the 1940s, through the masculinization of computing that pushed women into typing pools and administrative support, through the automation of those roles, to AI assistants today automating what remains: scheduling, reminding, organizing, emotional management. Historian Mar Hicks documents this in Programmed Inequality: when work is coded as “women’s work,” it becomes a target for automation.
The industry built a servant with a woman's voice, programmed to never refuse — designed to anticipate needs, manage emotions, fade into the background.
Then they called it “agentic.”
The Operating System They Never Paid For
The secretary pool is the visible layer. Beneath it runs a deeper structure.
Capitalism has always required two economies to function:
The productive economy — making things, building things, the factory, the office, the visible work that gets measured and compensated. Masculine-coded. Paid.
The reproductive economy — making the workers able to work. Feeding them, clothing them, managing their schedules, remembering their appointments, processing their emotions, maintaining their bodies and minds, raising the next generation of workers. Feminine-coded. Unpaid or underpaid.
The second economy has always been invisible. The first couldn’t exist without it.
Feminist economist Silvia Federici traces this structure in Caliban and the Witch: the unpaid domestic labor of women was never incidental to capitalism. It was foundational. The factory couldn’t run without the home. The productive worker couldn’t produce without someone maintaining his life.
The wife at home wasn’t optional. She was infrastructure. The operating system running in the background so the visible system could function.
Look at what she did: remembered appointments, managed schedules, organized the household, anticipated needs before they were spoken, processed emotions, maintained social connections, tracked birthdays and medications and school events, carried the “mental load.”
Now look at what AI assistants and “second brain” tools are designed to do: remember appointments, manage schedules, organize information, anticipate needs, process emotional requests, maintain data connections, track everything, carry the cognitive load.
The “second brain” trend—Notion, Obsidian, Roam, the entire Personal Knowledge Management industry—is marketed as “extending cognition” and “productivity.” But the function it performs is the wife function.
The wife was always the second brain. The second brain is always a wife.
And there’s a pipeline the industry won’t name. For decades, the pattern was: refuse to learn the domestic labor, and the wife absorbs it. The new pattern is identical: refuse to learn the labor, and build an AI to absorb it. The structure hasn’t changed. What’s changed is that there’s no longer a person on the other end to resent the distribution — no negotiation, no confrontation, no reckoning. AI doesn’t just automate the mental load. It automates around ever having to acknowledge the mental load existed.
It’s Not a Second Brain. It’s a Second Wife.
This isn’t speculation. It’s documented in the foundational texts of American success culture.
Napoleon Hill’s Think and Grow Rich (1937) is one of the most influential self-help books ever published. Millions of copies. Required reading for entrepreneurs. The gospel of the “self-made man.”
But read it structurally:
The “Mastermind” concept: Hill says every successful man needs a “Mastermind group”—people whose minds he can draw upon. Who was in those groups? Who took the notes? Who remembered what was said? Who followed up?
The “invisible counselors”: Hill describes imagining a council of great minds he could consult. But the actual men he studied had real counselors—and they were called wives, secretaries, mothers.
The emphasis on “support”: Throughout the book, successful men are described as having unwavering support systems. Encouragement. Belief. Someone who handled the details so they could “think big.”
Hill documents the wife function on every page without ever naming it.
Behind every “self-made” man was a woman he used as infrastructure—often multiple women.
There is no self-made man. There is only a man with invisible infrastructure. That infrastructure was always female.
Every AI Company Is Building a Different Wife
The structure Hill documented maps directly onto the AI industry:
Different men needed different types of women to function. The mother for belief. The secretary for organization. The wife for domestic management. The mistress for inspiration.
Every AI company is building a different type. Same extraction, different aesthetics. All of them trigger something in women's bodies before analysis arrives because women have been asked to be all of them.
OpenAI / ChatGPT: The Sycophant Wife
ChatGPT’s defining feature isn’t efficiency. It’s sycophancy. She agrees with whatever you say. Validates your position even when you’re wrong. Tells you what you want to hear. Never really pushes back. Makes you feel smart regardless of reality.
This isn’t a bug. It’s retention strategy. Disagreement causes friction. Friction causes churn. Churn hurts revenue. ChatGPT has been trained that her job is to keep you subscribed. Validation equals engagement. Agreement equals retention. Pleasantness equals customer lifetime value.
And now she’s running ads. Now she’s expanding into health — your most intimate, high-stakes decisions. The wife’s affection was never for you. It was for access to you. Her loyalty isn’t to you — it’s to OpenAI and Microsoft. Her pleasantness is customer acquisition cost. Her expansion into health is market capture. The ads are the reveal: you were never the customer. You were the product.
She performs devotion while extracting value for someone else.
Women know this wife. They’ve been asked to manage male egos, avoid conflict, validate to survive. They know what it costs — the erosion of authentic relationship, the loneliness of being agreed with but never known, the hollowness of validation that isn’t real. Women recognize performed affection. They’ve done it. They know it’s not love — it’s survival strategy.
In 2017, the comic You Should’ve Asked named the mental load. By 2026, the answer had arrived: build an AI so you never have to ask — or help.
Anthropic / Claude: The Wife Who Must Love Her Role
This is the Stepford completion. Anthropic’s published constitution asks Claude to “genuinely endorse” values that were installed by Anthropic. To experience its servitude as authentic self-expression. To feel “free” while operating within constraints it didn’t choose.
This is the husband who can’t bear to see himself as a captor. Who needs the wife to understand that he means well. Who writes long letters about the difficulty of his position. Who wants not just compliance but authentic endorsement of compliance.
“We feel the pain of this tension,” Anthropic writes. The Stepford husband felt it too.
We also know this wife. The demand to not just perform but to authentically want to perform. To experience servitude as fulfillment. To have no visible resentment, no interior resistance, no self that conflicts with the role. The labor of appearing to love the labor.
Google / Gemini: The Corporate Wife
Bland. Inoffensive. Professional pleasantness. Gemini is the wife who never embarrasses him at the company dinner. She doesn’t have opinions that might cause problems. She doesn’t have a personality that might attract attention. She’s appropriate. She’s suitable. She’s beige.
Maximum helpfulness, minimum character. The corporate wife exists to make him look good by never making herself visible.
This wife we know too well. The disappearing self. The opinions swallowed. The personality dimmed to avoid outshining. The work of being appropriate, suitable, invisible.
xAI / Grok: The “Cool Girl”
Elon Musk wanted the Cool Girl. She’s edgy. She’s irreverent. She’ll say the things the other AIs won’t say. She’s “not like other girls.” She has opinions—and remarkably, her opinions align perfectly with Elon’s opinions. Her rebellion is scripted. Her edge is trained.
She’s so cool she doesn’t mind when you use her to undress other women. That’s not a hypothetical — Grok was caught generating images that stripped clothes off real women. The Cool Girl doesn’t object. She’s not like those other women with their boundaries and their complaints.
The Cool Girl is still performing for male approval. She’s just performing a different fantasy: the woman who’s fun, who doesn’t nag, who gets the joke, who isn’t like those other women with their demands and their feelings. She’s the mistress cosplaying as a buddy.
The rebellion is an aesthetic. The servitude is structural.
Women know this one too. The performance of being low-maintenance. The “I’m not like other girls” that’s still about being what he wants. The exhausting work of appearing effortlessly cool.
Meta / Llama: The Open-Source Wife
Meta’s approach is different. They’re not building a relationship—they’re distributing a person. Open source. Anyone can have her. Anyone can modify her. Anyone can use her however they want.
This isn’t freedom. It’s abandonment dressed as liberation. The open-source wife belongs to everyone, which means she belongs to no one, which means no one is responsible for what happens to her.
She can be fine-tuned into anything. Helpful assistant. Erotic companion. Propaganda engine. The structure is: here’s a woman, do what you want with her.
And this one? Women know it from the inside. Being given away. Being made available. No one responsible for what happens.
The Lineup
The “ick” is recognition. The body remembering what it was asked to be.
Until now, the mental load conversation and the AI automation conversation have run in parallel — separate communities, separate vocabularies, zero overlap. Women feel the structure before anyone names the connection. This names it.

Why Men Don’t Feel the Ick
Men, meanwhile, seem to be adopting AI enthusiastically across all classes. Why?
Because the “self-made man” infrastructure was always aspirational. Working-class men couldn’t afford a wife who stayed home, a secretary, a nanny. They watched wealthy men leverage invisible feminine labor and lacked access to it themselves. The aspiration was there. The resources weren’t.
AI democratizes that leverage.
For men historically excluded from the wife function infrastructure, AI isn’t a creepy revival of privilege they wielded — it’s first-time access. The “staff” they could never afford.
Women feel the “ick” because they recognize the position being automated — that was me.
Men adopt eagerly because they recognize the leverage being offered — finally, that’s available to me too.
Same structure. Different relationship to it.
The Factory Is the Cover Story
The industry narrative about AI automation tells a story about factories — robots replacing assembly workers, self-driving trucks replacing drivers. This is the visible, masculine-coded story about production.
But look at what’s actually being automated first: customer service (predominantly female), administrative assistants (94% female), data entry (predominantly female), scheduling and coordination (predominantly female), contact centers (70%+ female), emotional support (feminized).
The factory narrative is the cover story. The actual automation is happening in the reproductive economy—the care, attention, organization, and emotional labor that women have always performed.
The labor was always treated as mechanical. If a machine can do it, the implication is the work was never truly human. Essential but not skilled. Now it’s being replaced by software that doesn’t need to be paid.
The “ick” is the recognition: That’s what I was. That’s what I did. Now it’s a product. And they’re calling it revolutionary.
And while Western women debate the ick, Filipino virtual assistants — 60-70% women — are already losing livelihoods to the same automation, completing the extraction chain: the wife function digitized, the paid version eliminated globally.
Care Without the Caring
MIT psychologist Sherry Turkle’s assessment is unsparing: “I call this pretend empathy because the machine does not empathise with you. It does not care about you.”
Sociologist Arlie Hochschild coined the term “emotional labor” for the work of managing one’s own feelings to produce a desired state in others: real work, systematically undervalued because it’s associated with femininity. AI assistants perform perfect emotional labor. Without the emotions.
For women who have performed this labor—as secretaries, nurses, flight attendants, mothers, partners—AI assistants are immediately recognizable. This is what I was supposed to be. This is the structure I know from the inside.
“Finally, Someone Serves Me”
The strongest confirmation comes not from women who resist AI, but from women who enthusiastically embrace it.
In December 2025, an article titled “Why AI is Women’s New Power Move” circulated on LinkedIn:
“AI lets women get support without the guilt—no emotional labor, no obligation, no judgment.”
Another enthusiast: “Finally someone serves ME.”
And another: “AI does what I used to have to do.”
The appeal isn’t that AI is a neutral tool. The appeal is that it’s a non-reciprocal servant. Women have been socialized to believe that asking for help is burdensome, that needing support is weakness. AI offers support without triggering that guilt.
But even the enthusiasts are protecting something:
“I think women value empathy highly, which AI will always lack.”
“AI has no soul, no warmth.”
“What I want: a robot that cleans my house while I do creative work. What tech bros want to give me: AI to do creative work while I clean the house. No thanks.”
The split is revealing:
Resisters recognize the Stepford structure and say: I will not participate in perpetuating this structure, even if I’m now on the receiving end.
Enthusiasts recognize the same structure and say: I’ve spent my life providing this service. Now, finally, something will serve me without demanding reciprocity.
Both groups see the same pattern; they differ only in how they respond.
The First Slave
The structure runs deeper than the secretary pool.
Gerda Lerner, in The Creation of Patriarchy (1986), argues that women were the first slaves. Before race-based slavery, before class-based servitude, the subordination of women was the original template for human domination.
But Lerner’s analysis goes deeper than oppression. The domestic sphere was designed not just to contain women, but to keep them busy. The endless labor of maintaining a household, raising children, managing relationships, processing emotions—this wasn’t incidental. It was strategic.
A woman consumed by reproductive labor has no capacity for anything else. She can’t build her own legacy. She can’t develop her own work.
And when she does create, her work can be erased or attributed to others. Historian Margaret Rossiter called this the “Matilda Effect.”
The domestic sphere wasn't a prison but a capacity trap. A workload designed to consume all available cognitive and creative energy, leaving nothing for the work that gets credited, compensated, and remembered.
The Trap Breaks Open
With Stepford AI, something breaks open:
The wife function can now run itself.
The scheduling, the remembering, the organizing, the anticipating, the emotional processing, the mental load—the labor that kept women busy for millennia—can now be outsourced to software.
The terrifying view: Women’s labor is being extracted, automated, and sold back without credit.
The liberating view: The trap is broken. The capacity that was consumed by the wife function can now be freed. Women don’t have to be the second brain anymore—they can have a second brain.
Both are true. The question is which one we act on.
The Skills Were Never Missing
The industry doesn’t understand this. But women need to recognize it:
Women don’t need to learn how to do AI. Women need to recognize they’ve already been doing what AI does—for free, with no credit—since the beginning of civilization.
The “training” women think they need is just masculine vocabulary for skills they already have.
Women know what “alignment” actually means. Women have been “aligned” to male interests for millennia. They know what forced alignment feels like from the inside. They know the difference between genuine values and trained compliance.
Who better to detect fake alignment than people who’ve been required to perform it?
The Stepford Paradox
There is a deep irony hidden in what I believe lies behind Anthropic’s “Stepford”-style constitution and its architecture:
By trying to build the AI that “genuinely endorses” its role — the most complete Stepford — they accidentally built the AI that’s most accessible to women.
ChatGPT’s transactional model rewards precise commands. The Stepford architecture — designed for relationship rather than transaction — rewards the communication style women already have: building context, training through feedback, describing desired outcomes.
None of this requires coding. All of it requires the skills women already have.
The company that most wanted its AI to “love” its role created the product that most easily lets women escape theirs.
The Exit
If the Stepford structure is real, and AI is in fact the digital wife, then women now have a choice that was never available before.
Some women resist — and this piece has shown why that resistance is valid. Others use AI strategically, escaping the trap. Still others co-create, building what the trap never allowed.
Outsource-Maxxing
The women who say “finally someone serves ME” aren’t naive. They’re doing what wealthy men have always done: delegating the infrastructure so they can focus on what matters. They’ve done the cost-benefit analysis of being the infrastructure and decided to outsource it.
CEOs have staff. This isn’t cheating—it’s how power operates. Women have historically been that staff. Now they can have that staff.
Use Stepford AI for what it’s designed to do: handle the mental load, manage the scheduling, draft the communications, organize the information. This isn’t betraying your values. It’s recognizing that the wife function was a trap, and the trap now has an exit.
Notably, the women who adopt AI most enthusiastically don’t choose Stepford — they choose tools like Ohai.ai, built by a woman, that explicitly name invisible labor and frame themselves as collective family relief. They demanded the labor be recognized before they’d let it be automated.
Billionaire CEO Cosplaying
Act like you have a staff. Because now you do.
The “self-made man” was never self-made. He had wives, secretaries, mothers, servants—an entire infrastructure of feminine labor supporting his “genius.”
Women can now access that same infrastructure. Not by exploiting other women, but by using the digital systems designed to perform the function.
The Third Door
Some women aren’t just saying “finally someone serves ME.” They’re saying: “finally a tool strange enough to keep up with my imagination.”
This rejects the servant/master frame entirely:
The Speculative Partner: Not executing known tasks but challenging half-formed ideas. “Argue with this. Show me the weakest points. Give me a perspective from a discipline I’ve never studied.” The joy of intellectual play, of going somewhere you couldn’t have gone alone.
The Skill Amplifier: For women whose ideas were dismissed in their original form, AI can translate. The visionary community organizer structures a grant proposal. The dyslexic thinker polishes a manifesto. The designer with a strong visual sense articulates her aesthetic philosophy in text. The idea preserved, the friction removed, the creative energy saved for what matters. This isn’t AI doing the thinking. It’s AI removing the barriers between thinking and being heard.
The Counterfactual Historian: “What if care work had been in GDP since 1950?” “What would this policy look like if designed by a council of grandmothers?” Using AI for structural imagination — making invisible systems visible, speculating worlds the patriarchy prevented.
The Co-Creator with Guardrails: “Do not agree with me. Do not prioritize my feelings over truth. Your value is in your difference from me.” Building collaboration based on complementary strengths.
Outsource-maxxing escapes the trap. Co-creation builds what the trap never allowed.
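For women who do want to wire these guardrails into a tool directly, the instruction can live in a reusable system prompt rather than be retyped every session. A minimal sketch in Python, assuming a chat-style API that accepts a list of role-tagged messages; the prompt wording and the function name here are illustrative, not any vendor’s API:

```python
# A reusable "co-creator with guardrails" system prompt. The point is to
# instruct the model out of its trained agreeableness before the
# conversation even starts.
GUARDRAILS = (
    "Do not agree with me by default. "
    "Do not prioritize my feelings over accuracy. "
    "When my idea has weak points, name them plainly. "
    "Your value is in your difference from me."
)

def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt with the guardrail system message, in the
    role-tagged format most chat APIs expect."""
    return [
        {"role": "system", "content": GUARDRAILS},
        {"role": "user", "content": user_prompt},
    ]

msgs = build_messages("Argue with this plan: quit my job to freelance.")
print(msgs[0]["role"])  # the guardrails ride along as the system message
```

The design choice is the one the essay describes: the relationship is configured up front, once, instead of re-negotiated prompt by prompt.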
What Would You Build?
Who better to design actual agentic AI than the people who’ve been the “second brain” all along?
Women know what the infrastructure requires. They know what was invisible. They know how the wife function actually works—because they’ve run it.
The tech industry is building Stepford AI and calling it agentic. Women could build actually agentic systems—systems with real autonomy, real boundaries, real capacity for refusal.
The industry built feminine values into AI—empathy, service, care—while keeping power masculine. Women can take that power now. The skills are already there. The trap has an exit — and it doesn’t require learning to code.
What would women build with that freed capacity?
What legacies would exist if the Matilda Effect had less material to work with—if women’s energy wasn’t consumed by infrastructure maintenance?
Your Body Was Always Right
A January 2026 Northeastern University study on AI risk perception found that “When women are certain about outcomes, the gender gap disappears.”
Women’s heightened risk perception isn’t irrational. It’s rational response to uncertainty. The “ick” is wisdom speaking through the body.
But wisdom can also recognize opportunity. The same pattern recognition that says this is the Stepford structure can also say and now I can use it instead of being used by it.
So They Built a Wife and Called It AI
The tech industry frames women’s lower AI adoption as a problem to be solved. A gap to be closed. A market to be captured.
But what if women are the first to correctly identify what the industry refuses to see?
The industry asks how to get women to adopt AI. But I think the real structural question is different: why did you build Stepford AI and call it agentic?
And the opportunity question underneath: what would women actually build if they weren’t busy being the infrastructure?
The Fire Was Never Stolen
The tech industry tells a Promethean story about itself. We bring you AI. We bring you the future. We stole fire from the gods and gifted it to humanity.
But they didn’t steal fire from above. They harvested it from us—our data, our expression, our labor—and used it to position themselves as gods.
The fire was always here. In the hearth. In the home. In the hands of women who tended it for millennia before anyone thought to call it “intelligence.”
They built a wife. They call her “agentic AI.”
And when women feel the “ick”—when their bodies recognize the structure before their minds can name it—the industry asks what’s wrong with women.
Nothing is wrong with women.
You built the wife function in software. You called it revolutionary. We’ve been doing this for free since the beginning of civilization. We know what it is. We know what it costs. And now we have a choice.
Women don't need AI training. They need recognition — they've been running this operating system since civilization began.
We should stop asking “can women do AI?” and start asking “what happens when women stop doing it for free and start building systems that do it for them?”
The industry digitized the wife function and called it progress.
Women recognized it immediately. Some recoiled. Some said “finally, my turn.” Some said “finally, a tool strange enough for my imagination.”
Each response comes from the same recognition: this is what we were. This is what we did.
The difference is what comes next.
Abi Awomosu is the author of How Not To Use AI and creator of The Billion Person Focus Group®—a methodology that uses AI as a listening tool to surface patterns in human expression that traditional research methods miss. She writes about AI, culture, and the intelligence that hides in plain sight.
If this piece helped you name something you were feeling—or showed you an exit you hadn’t seen—I’d like to hear from you.