We live in an era where artificial intelligence is no longer an abstract future: it sits at the crossroads of capital, power, hardware and policy. On January 15, 2026, five stories - a technical trade group's sober assessment of energy needs for AI data centres, a major vendor's blunt warning about enterprise "buyer's remorse," a continental human-rights body's briefing on algorithmic discrimination, a curious local experiment to build an "Imam AI" app in Kazakhstan, and a large tranche of EU AI investment - together sketch the forces shaping 2026. These stories are not isolated headlines; they are connected chapters in one broader narrative: the AI ecosystem is maturing, and maturity brings overhead - literal kilowatts, human oversight, predictable capital, and regulatory accountability.
In this longform dispatch I'll summarize each story, draw practical implications for operators and investors, and argue why the dominant questions for this phase of AI are no longer purely "can we build models?" but "how do we power them, govern them, measure harm, and deploy them responsibly at scale?" Each section is written with an op-ed sensibility: evidence-backed, opinionated, and aimed at helping technologists, business leaders, policymakers and informed readers turn news into advantage.
Table of contents
The energy problem: why gas - and planning - matter for AI's growing appetite
AI buyer's remorse: enterprise hardware mistakes and the shift to private AI inference-first architectures
Algorithmic discrimination in Europe: policy gaps, rights, and where governance must go next
Imam AI: the cultural contours and risks of AI in religious guidance
EU's 307M AI investment: what it funds, what it signals, and who benefits
Cross-cutting themes - from power to policy
Practical checklist for leaders: what to do this quarter
Conclusion: building durable AI franchises in 2026
A comprehensive report from the International Gas Union (IGU) frames an essential reality: data centres - the physical lungs of modern AI - are becoming the new "industrial load" as AI workloads proliferate. Electricity consumption from data centres is projected to roughly double to between 800 and 1,000 TWh by 2030, driven primarily by large-scale model training and massively expanded inference fleets. While renewables will provide a growing share of that electricity, their variability collides with data centres' flat, 24/7 demand profile and need for dispatchable capacity. The IGU argues that this mismatch makes natural gas and other dispatchable resources an important component of realistic energy planning for the AI economy.
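To make that scale concrete, here is a minimal back-of-envelope sketch. It simply converts the IGU's projected annual-consumption range into the continuous power draw it implies; the flat-load assumption and the midpoint figure are illustrative simplifications of mine, not numbers from the report.

```python
# Back-of-envelope: what 800-1,000 TWh/year of data-centre demand implies
# as a continuous ("flat") load, assuming a roughly 24/7 demand profile.
# The range is the IGU projection; treating it as perfectly flat is a
# simplification for illustration.

HOURS_PER_YEAR = 8760  # 365 days x 24 hours


def implied_continuous_load_gw(annual_twh: float) -> float:
    """Convert annual energy consumption (TWh) into average power (GW)."""
    return annual_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then divide by hours


for annual_twh in (800, 900, 1000):
    gw = implied_continuous_load_gw(annual_twh)
    print(f"{annual_twh} TWh/year  ~  {gw:.0f} GW of round-the-clock demand")

# Approximate output: 800 -> 91 GW, 900 -> 103 GW, 1000 -> 114 GW.
# A flat load of roughly 100 GW is what grid planners must cover with
# dispatchable capacity whenever variable renewables under-deliver - the
# mismatch the IGU report highlights.
```

For scale, a single large power station delivers on the order of 1 GW, so a flat ~100 GW of data-centre demand is genuinely a new class of industrial load, not a rounding error in national grid planning.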