# AI Weekly Issue #483: 100 Years From Now
AI Weekly's speculative series "100 Years From Now" imagines how society adapts to emerging technologies over a century. This week's installment explores a troubling scenario: what accountability looks like when the most powerful systems on Earth operate without clear responsibility chains.
The piece doesn't predict outcomes. Instead, it honestly speculates on the consequences of choices we make today. As AI systems grow more autonomous and influential, tracing decisions back to individual humans becomes nearly impossible: a contract enforced by an algorithm, a credit decision made by a neural network, a medical diagnosis delivered by a machine. Who bears responsibility when something goes wrong?
The core tension: powerful systems require accountability to function in society, yet their complexity and scale create what the series calls "the ghost in the contract," an absence where responsibility should exist. No one person built the system entirely. No one person understands all its workings. No one person can fully explain why it decided what it did.
The series frames this not as prediction but as honest speculation about where current choices lead. The implicit question: what systems are we building now that will force us to grapple with accountability gaps a hundred years ahead?