
The Times They Are A-Changin’
Posted on 21 May 2025
By Adele Stowe-Lindner
I was brought up by a Dylan-loving Boomer, marching against war and nuclear bombs, worried about bad choices humans would make for sure—but those humans were usually the government.
I recall the first VHS video machine arriving at home (the only two videos we owned were The Blues Brothers and The Big Chill, but Saturday night trips to Blockbuster were thrilling). Even better, I remember our first family holiday with a giant video camera that had to rest on your shoulder—far too heavy for an eight-year-old.
Home phones (my youngest recently asked, in response to a medical form, “What’s a landline?”) became cordless, and the freedom to curl up in a bedroom chatting—while blocking anyone else from making or receiving calls—was delicious. My dad, a solicitor, suffered the time-wastage of waiting around courtrooms for hours just to hear sentencing results, so when the car phone arrived, it gave him the same sense of freedom that the cordless phone gave me. His clients already had phones in briefcases, but that’s another story.
Our first computer was a Commodore 64. It was big and sat in the middle of the house, where my poor mum wrote her master’s thesis, surrounded by children. It had Space Invaders.
Now, my parents sit in front of their laptops, FaceTime their grandchildren, and boss Siri around for everything from weather checks to directions. They had hybrid cars before me.
All of this is to say—I don’t think those I know in my generation or my parents’ generation are particularly Luddite.
So I was alarmed when my own teenager accused me of being conservative and conspiracy-minded for voicing concerns about the privacy risks of large-scale AI. Being called conservative by your child is a painful parent-initiation ceremony, not for the faint-hearted, and an invitation to look in the mirror. It’s the moment you hum to yourself (if you had a Boomer parent), “get out of the way if you can’t lend a hand…”
I hope that government accountability for big corporations’ control of the web escapes partisan politics. Yet, traditional divides between free market advocates and social responsibility suggest the current crisis—who is responsible for what’s online?—is the perfect partisan battleground.
I fear the pact we’ve made—freely handing over our privacy and our attention for convenience, in a transaction that benefits someone else more than it benefits us. In the playground, that’s called unfair. Why aren’t we calling it out now? What’s so seductive that we allow someone, somewhere, to know everything about us while we know nothing about them—or how they’ll use our information?
Artificial intelligence has the potential for great good but also for incredible harm—to social cohesion, to individual rights, to privacy itself. As community organisations, we work directly with people, and we determine how we use the technology available to us. I don’t warn against using AI; in fact, I’ve written about its potential to increase NFP efficiency, freeing up time for the beneficiaries we serve. But I do warn against complacency.
A year ago we might have accused Boomers of lacking the digital literacy to see the risks, but they have put in the effort to learn and stay aware, partly because smartphones have kept people digitally literate. My parents recently lectured me about AI risks in legal and medical contexts. Gen Z and Alpha, meanwhile, may have normalised the innovative possibilities of the technology but don’t always recognise the value of privacy in a democracy, or what they’re giving up for free. That leaves the responsibility to us, whoever “us” might be.
The online polarisation that abounds directly affects many of the boards and beneficiaries we work with at Community Directors. Misinformation, outrage-driven algorithms, and echo chambers make it harder for NFP boards to navigate complex issues with nuance—and harder to stay true to their mission. Beneficiaries suffer when debate turns to division, trust erodes, and decision-making is clouded by online noise. As leaders, we must cut through the polarisation, uphold critical thinking, and ensure that technology serves our missions—not the other way around. Ethical leadership has never been more urgent.
We are offered a binary by Mr Dylan: either get out of the way or lend a hand. But lending a hand should not mean following blindly.
We embrace technology, but somewhere along the way, we stopped questioning who controls it. AI isn’t the problem; human complacency is. If we teach kids coding, we must also teach them the value of privacy, the cost of convenience, and their rights and responsibilities in a democracy. The next revolution should be about informed choices. The times are changing, but that change need not happen without human agency. If we ask the right questions, we can shape the use of AI in ways that are ethical and human-centred. That requires us, as community leaders and individuals, to step up and demand better.