AI is a choice, not a solution
There is no technology fix for inequality, sustainability, or justice.
These are political and social challenges. AI only amplifies existing structures, fair or unfair, depending on how we design and deploy it. The direction is ours to choose.
AI is neither intelligent nor neutral.
It reflects human decisions, values, and blind spots. Every dataset excludes someone. Every model prioritizes something. Every system consumes energy, materials, money, and human labor that could have been used elsewhere. These are trade-offs, not inevitabilities. If we want better AI, we must first build fairer societies.
AI is not inevitable.
National strategies, research agendas, and innovation paths are political choices. Societies can decide when, why, and whether AI should be used at all. They have the right, and even more the responsibility, to shape AI according to their own values rather than import someone else's priorities by default.
AI is not virtual.
It has a footprint: data centers, energy use, rare minerals, invisible workers. Sustainability is not an add-on. If AI is not sustainable and fair by design, it is a failure, no matter how “innovative” it looks.
AI does not arrive like the weather.
We build it. We deploy it. We benefit from it, or suffer its consequences. Responsibility cannot be outsourced to algorithms.
Governance is not the enemy of innovation.
It is what makes innovation trustworthy, durable, and legitimate. Sometimes the most responsible and innovative decision is not to automate, but to protect human judgment where it matters most.
Clear rules do not slow progress.
They create the trust that progress depends on.
Responsible AI starts with Question Zero:
Not "What can we do with AI?" but "Should we use AI here at all, and why?"
The real challenge is not building more AI. It is building more trust, inclusion, and shared understanding, and ensuring AI serves humanity, not the other way around.
AI will not determine our future. The choices we make about where to deploy it, where to refuse it, and what costs we are willing to accept will. Responsibility for those choices lies with people and institutions, not with technology.
