AI does not need “Mothers”. It needs maturity

(first published on LinkedIn on 31 October 2025)

In recent months, we’ve seen a troubling pattern in how AI’s leading voices and institutions frame ethics, gender, and governance. Rather than confronting the political, economic, and social conditions that make AI systems harmful, they retreat into metaphor and spectacle.

Geoffrey Hinton, one of AI’s self-styled "godfathers", recently told Forbes that AI systems need "maternal instincts". The irony is hard to miss. For decades, the field he helped build has dismissed social and ethical reflection as “soft” concerns. Now, when the consequences of that neglect are undeniable, the solution is suddenly to feminize care, as if empathy could be patched into systems born of competition, extraction, and control. When the risks can no longer be ignored, responsibility is reframed as something that others (mothers) should provide. This is not humility; it is a dismissal of one’s own responsibility, disguised as a gesture toward care.

Meanwhile, just this week in Albania, Prime Minister Edi Rama announced with great enthusiasm that the country’s AI Minister, “Diella,” is pregnant with 83 children. Each “child,” he explained, will serve as a personal digital assistant to a member of parliament, inheriting the knowledge of their mother, tracking discussions, and advising MPs in real time. Or, as Clara Hawking put it, a story that would make even #BlackMirror blush.

These are not signs of progress. They are caricatures of ethics, of gender, and of responsibility.

When men who built and deployed unregulated, opaque AI systems now call for "maternal instincts" to save us, they are not promoting care or empathy. They are shifting accountability away from the structures, institutions, and values that should have guided these developments from the start. Similarly, turning an "AI Minister" into a pregnant symbol of innovation turns governance into performance. It trivializes both policy and personhood.

This sentimentalization of AI ethics, dressing irresponsibility in the language of care, is deeply dangerous. It confuses sentiment with structure. Ethical AI is not about coding feelings or performing virtue. It is about building systems and institutions capable of acting with foresight, fairness, and accountability.

We do not need to make AI more "maternal". We need to make AI governance more mature. That means replacing spectacle with substance, metaphors with mechanisms, and rhetoric with responsibility by design. 
