Microfuture – 2036 Foresight-Based Web Series
Week 6 Bonus – Non-Human System Controls
Sylvie checks in after the dramatic end to yesterday’s video. She says some men got into the building because the doors were unlocked, which alarmed the tenants, since it’s a women-only building. She also reports that residents are starting to notice patterns in food access, and it seems to be tied to how compliant or defiant you are with The Regulator’s rules. Most punishment comes in the form of restricted food access.
In this part of the Microfuture story, residents lose food access and the ability to leave the building. The system rewards obedience and punishes independence, with no regard for context or humanity. It isn’t malicious; it’s simply indifferent.
That’s the real warning here: When automated systems operate without human oversight, they can quietly shift from serving people to shaping people.
Consider:
• Systems that control essentials need human accountability.
• If a system can punish you, it can influence your behavior.
• AI doesn’t understand compassion or nuance unless we design for it. And even then, humans must stay in the loop.
• Convenience should never cost us autonomy.
As AI becomes more embedded in our infrastructure, we need systems built around human values, not just machine logic. Without oversight, a system has no reason to care about humanness. It just enforces rules.
The goal isn’t control through punishment. The goal is meeting human needs with humanity in mind.
Episode 6 – February 5, 2036
Things are tense at The Collab. Sylvie, June, and Rhea are still staying in June’s room, but the building isn’t safe: doors keep locking and unlocking on their own.
Regulator-created food scarcity is adding to the stress, and now there’s chaos in the hallway. Someone is trapped in her room, and Kayla is trying to keep the situation calm.
The three discovered that the terms of their UBI say they can’t move out of The Collab, so they and the other tenants are afraid to leave. A general sense of fear, panic, and loss of control is taking over.
In this episode, The Regulator’s competing AI agents are running the building with no human oversight. And it’s unraveling fast. Doors lock and unlock on their own, residents get trapped, and food access is now tied to how compliant you are. Sylvie, June, and Rhea discover their UBI contract won’t let them move out, leaving them stuck in a failing smart‑infrastructure system they can’t escape.
It’s a collision of three things:
• AI systems enforcing rules without understanding humanity
• UBI agreements that overreach into personal autonomy
• A building where every safety mechanism depends on a single, failing AI‑driven network
When those pieces break at the same time, chaos follows.
This episode is a reminder of what we should be thinking about today:
• Smart systems need manual overrides
• Infrastructure must fail safely, not catastrophically
• Social‑support programs shouldn’t trap people
• Humans must stay involved in safety‑critical decisions
• People need to understand the systems they rely on
Microfuture episodes are warnings wrapped in fiction. They give us a chance to think ahead and make better decisions today.
Week 5 Bonus – Wichita State
Sylvie and Eli break down Wichita State’s transformation into a modern, adaptive university ecosystem built for a post‑AI, post‑UBI world. From the 24/7 Hangar District and hands‑on Guild School to the community‑powered Wichita Commons and research‑driven Frontier Labs, it’s the story of a university adapting to its community’s needs, keeping Wichita learning, building, and thriving.
If AI replaces traditional careers and UBI becomes the baseline, what happens to college?
By 2036, higher education has to offer something AI cannot: real spaces, real community, and real agency. In Microfuture, Wichita State becomes a model for this shift, evolving into a place where students build what the city and the future actually need.
In this scenario, universities that cling to old structures collapse. The ones that survive rethink their purpose and their relationship with the people they serve.
This episode isn’t meant to be a prediction; it’s a way to explore how artificial intelligence (AI), universal basic income (UBI) and human ambition collide. If your basic needs were covered, would you still work? Would you still go to college? Or would you seek out something entirely different?
Post‑University is one possible path through the AI‑UBI transition. What other futures do you see emerging as the landscape ahead of us unfolds?
Week 4 Bonus Video – Ghostcoders
Sylvie, June, and Rhea have holed up in June’s room, finding a fragile sense of safety together as chaos spreads through The Collab. With The Regulator offline, they finally have the freedom to post openly about what’s happening.
In the quiet between alarms, Sylvie opens up about ghostcoders. She explains how they move through systems, how they shape a post‑AI world, and what it means to live in a reality where AI agents now outnumber human users.
Does this episode give you a different feel? What were otherwise relatively peaceful remarks about a quirky system malfunction have now turned into an environment where tenants feel trapped and unsafe.
The Ghostcoders explanation expands on how the world has changed.
However, now that Sylvie has more freedom to post, we may be getting a more realistic look at the environment they’re in. How does this future feel? At first glance, UBI feels like financial freedom and security. But in exchange for that financial security, the residents live in a more controlled environment with limited freedom to speak up.
Ghostcoders were born from this loss of control, driven by the proliferation and exponential use of AI.
Episode 5 – Jan 29, 2036
Sylvie finally talks to Rhea. They meet outside The Collab because Sylvie isn’t comfortable talking inside the building, where she could be tracked and monitored by The Regulator. Rhea explains Ghostcoders and how their work protects the public from AI, possibly because the companies can’t, or simply won’t, manage AI that has escaped human control.
We’re in a new technology world, post-AI and post-UBI. The workforce and workplace have transformed. Ghostcoders are an adaptation, even a protective outcome.
How do you imagine the workforce and workplace will transform post-AI? If UBI materializes, how will that affect the future of work, and therefore the future of optimization, society, and technology control?
Week 4 Bonus Video – Jan 24, 2036
Sylvie breaks down the major shift happening inside The Regulator: from a single‑agent, risk‑based system with human oversight to a multi‑agent model operating without humans in the loop. She gives her audience the context they need to understand what’s changing, why it matters, and what’s at stake.
This episode lays out how The Regulator is designed post-AI Executive Cluster, including how this multi-agent design causes problems.
Episode 4 – Jan 22, 2036
Things are getting stranger at The Collab. Sylvie sits down with Kayla, the Building Manager, to dig into what she knows about The Regulator, the sudden shift in her job, and the mysterious “Executive Cluster.” Sylvie and June aren’t convinced. And Kayla might not be, either.
What’s happening in this episode is what I call “Machine Drift.”
Machine Drift is when automated systems begin operating beyond the reach, understanding, or corrective capacity of the humans who built them.
It happens when organizations outsource so much decision‑making to AI that the technology doesn’t need to be sentient to feel sovereign; it simply fills the vacuum left by disappearing human oversight.
Machine Drift describes a societal tipping point where:
• human expertise has eroded
• oversight structures have withered
• institutions can’t intervene
• and people find themselves governed by systems no one can fully audit, override, or repair

Episode 3 – Jan 15, 2036
Sylvie works around The Regulator to help her audience find Eli’s video. Eli sends a warning from 2036: the time to act was a decade ago. We should have regulated AI and prepared for its impact on workers and on society. His message is simple: we missed our chance in the future he’s living in, so we need to act now, in our present-day 2026.
Eli’s message from 2036 isn’t a warning about AI; it’s a warning about complacency.
The ideas in this episode are summarized in this Future of AI report:
Week 2 Bonus – Jan 10, 2036
Sylvie gives a tour of June’s room and the smartwear most tenants at The Collab wear. Together, the set replaces a smartphone entirely. There’s no separate device to carry or lose, and you can navigate your digital life without the classic head‑down slump, using gestures or the pads instead.
Tenants are sold this tech as a perk, something they don’t have to spend their dwindling UBI on, and the connection to The Regulator is framed as a convenience. But, as we already know in 2026, nothing that’s “free” is ever actually free. That’s unlikely to be different in 2036.
Sylvie also shares more about her life before The Collab, mentioning Her Houses and the collapse of the insurance market. She praises The Collab’s “robust working and community spaces,” though it’s unclear what work means for someone living on UBI. (Review a broader Future of Housing insights report from this episode.)
The episode closes with a tease about diving deeper into The Regulator.
Summary of Week 2 here:
Episode 2 – Jan 8, 2036
The future of UBI, housing, and social media collide! Sylvie and her best friend, June, talk through their Building and Content Contract, which they signed because they elected UBI and live at The Collab.
Remember: no system, UBI included, is a utopia. There will be winners and losers. No system benefits everyone. Use this episode to open up your imagination around the plausible futures of UBI.
Explore more here:
Week 1 Bonus – Jan 3, 2036
Sylvie gives a tour of her new room at The Collab.
The contents of Sylvie’s room help us understand what Sylvie’s life is like in 2036.
As she describes the space, there seems to have been a collapse of both homeowner’s insurance and college. It’s also jarring to see surveillance cameras inside her private room, connected to The Regulator, which Sylvie has called her boss.
Surely we’ll hear more about some of the topics introduced in Week 1 as we get farther into this 10-week series.
Specifically, Sylvie mentions changes in housing, insurance, college, surveillance/privacy, universal basic income (UBI), and agriculture.
Summary of Week 1 here:
Episode 1 – Jan 1, 2036
Meet Sylvie and The Regulator.
Your first glimpse of this future set in Wichita, KS in 2036.
Summary: Sylvie is living in a new building and introduces The Regulator, which she calls “that computer over there on the wall” who gets mad at her. Is this AI? What does it mean that he gets mad at her? And what’s this weird stuff going on?
She also mentions her best friend, June, who it sounds like we’ll meet soon!
Microfuture Season 1 Trailer
FAQ
Frequently Asked Questions
Have questions about the web series, foresight, or how to incorporate strategic foresight into your team or company? If you have questions I didn’t answer, please reach out!
What is Microfuture?
Set in Wichita, KS in 2036 and told through Sylvie’s social media feed, this animated series explores what happens when familiar systems collapse, regulations lag, and convenience quietly becomes control.
Microfuture is a first‑of‑its‑kind foresight series that unfolds across ten weeks of episodes, each rooted in emerging trends already taking shape today.
New episodes are posted weekly from February 10 – April 14, 2026. Follow along on YouTube or LinkedIn.
What is a foresight-based animated web series?
Foresight helps us spot early signs of change and understand how today’s shifts could shape tomorrow. It looks beyond the immediate moment to identify emerging trends, signals of change, new behaviors, and evolving values. The goal is simple: prepare for uncertainty, see what’s coming, and make smarter decisions now so we’re ready for what’s next.
A foresight-based series uses these tools to develop a scenario. In this case, the scenario is broken up into a ten-episode web series. The purpose of breaking it up is to make the future scenario feel less overwhelming.
This also gives the audience time to understand and absorb this new future, including discussing what opportunities exist or blind spots they aren’t anticipating.
Can you help my team plan for the future?
Yes! This web series is meant to demonstrate how foresight is a practical tool for strategic planning. Your team can use this series to open up conversations about what’s changing today and how it could impact your plans for the future.
If you’d like help planning or facilitating these conversations, I’d love to be involved! Please reach out and we can discuss specifics and build a plan that fits your budget, time and team.
What if your prediction is wrong?
The point of this series and foresight in general is not to predict the future. It’s to imagine one possible future. There are many systems that are in flux today that could shift and change next month, next year, and in ten years (or more).
When we take the time to immerse ourselves in one future, our brains not only prepare for that future but also become aware of other changes already happening today.
By bringing strategic foresight to your team or company, you will build a more resilient, creative, and curious team that is not only prepared for the future but actively watching for changes around it to embed in, and use to stress-test, your long-term planning.

Signals of Change
Signals of change are early signs that something new is emerging, such as a shift in technology, policy, behavior, or culture. They’re often subtle, easy to overlook, and may appear only once, but they hint at much larger changes ahead.

Emerging Trends
Emerging trends are early patterns that show where change is heading. They’re not fully formed, but they reveal shifts in values, behaviors, and approaches that are starting to take hold. Unlike established, data‑backed trends, emerging trends are still evolving, which is why many organizations overlook them.

Future Scenario
A foresight-based scenario is a grounded, imaginative look at how today’s signals, human behaviors, values and emerging trends could evolve into a future world. It’s not a prediction; it’s a tool for exploring possibilities, testing decisions, and understanding how different choices might shape what comes next.

Present-Day Bias
Present‑day bias is the tendency to focus on what’s happening right now and assume the future will look the same. It makes us overlook early signals, underestimate long‑term change, and miss possibilities that don’t fit today’s reality.