
Posted on 21 May 2025
By Matthew Schulz, journalist, Institute of Community Directors Australia
Lenka Brazda, a youth team leader at Wombat Housing Support Services in North Melbourne, has been recognised as Australia’s Best Accidental IT Person for creating WomBot, an artificial intelligence-powered chatbot that helps young people at risk of homelessness.
Brazda, who has no formal background in technology, built the AI assistant to provide round-the-clock housing advice drawn from a carefully curated knowledge base.
She was recognised for the impressive effort at this month’s 2025 Infoxchange Australian Not-for-profit Technology Awards. (Read more about other winners here.)
WomBot has emerged as a practical tool helping users understand housing options, rights, and services—particularly outside business hours, when face-to-face support is limited.
The award category is a sector favourite, celebrating non-IT staff who have made a significant impact by improving their organisation’s digital systems. Judges commended Brazda for prioritising client needs at every step of the bot’s development.
Brazda in turn praised her managers for backing “what seemed like a very whacky idea at the time.”
The idea for WomBot grew from Brazda’s interest in technology and her desire to improve early access to housing support.
“I love technology,” she said. “I don’t think we use it enough in this sector. I wanted a way for people to easily access information before they reach a crisis point.”
The chatbot’s development was iterative, beginning with an online referral form, followed by a scripted bot, and eventually evolving into the AI-powered version launched in October 2024. Since then, WomBot has been used by more than 500 people—entirely through organic traffic, without promotional campaigns.
Brazda said the slow rollout allowed the organisation to stress-test the tool against real scenarios, which helped refine the system. The organisation is now planning to scale up.
WomBot is underpinned by strict ethical “guardrails”, designed to protect vulnerable users and ensure the accuracy of its advice. The system is trained exclusively on internal documents produced by caseworkers and does not access the internet.
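The article does not describe how WomBot is built, but a chatbot restricted to a curated internal knowledge base is commonly implemented as a retrieval step over those documents, with the bot declining anything it cannot ground in them. The sketch below is purely illustrative: the sample documents, wording and crude keyword matching are assumptions for this example, not details of Wombat Housing’s actual system.

```python
import re

# Illustrative sketch only: a bot that answers solely from a curated,
# internal knowledge base and never reaches out to the internet, in the
# spirit described for WomBot. The documents and retrieval rule below
# are invented for illustration.

KNOWLEDGE_BASE = [
    "If you need somewhere to stay tonight, call the statewide crisis "
    "accommodation line. Outside business hours, leave your details and "
    "a caseworker will follow up the next working day.",
    "You cannot be evicted without written notice. If you have received "
    "a notice to vacate, contact a tenancy advocacy service before your "
    "move-out date.",
]

def tokens(text: str) -> set[str]:
    """Lower-case word set, ignoring punctuation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str) -> list[str]:
    """Return knowledge-base entries that share enough words with the question."""
    q = tokens(question)
    return [doc for doc in KNOWLEDGE_BASE if len(q & tokens(doc)) >= 4]  # crude threshold

def answer(question: str) -> str:
    """Answer only from retrieved internal content; otherwise decline."""
    context = retrieve(question)
    if not context:
        # Guardrail: never improvise outside the curated material.
        return ("I don't have reliable information on that. "
                "Please contact our office during business hours.")
    return " ".join(context)

if __name__ == "__main__":
    print(answer("What are my rights if I received a notice to vacate?"))
```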
Brazda said there were early challenges with the system going off track.
“Sometimes it would hallucinate, and it was a bit naughty actually,” she said.
“So, I had to ‘tame’ it, so to speak, put in guardrails and think about the information we would allow it to say, and not to say.”
Both Wombat and WomBot follow the same ethical framework.
This was tested in one instance, in which a user asked the bot for advice about “Cyclone Alfred”, a topic outside WomBot’s training data. Although the bot was not programmed to respond to such a request, it provided basic safety advice.
“So the bot shouldn't really be answering that question, but the caveat is the bot does prioritise safety,” Brazda said. “So in order to prioritise safety, it said, look … I don't really have that information, but because I want you to be safe, here's how you stay safe in that situation.”
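The article doesn’t explain how this safety-first behaviour is implemented, but the pattern Brazda describes, declining an out-of-scope question while still offering generic safety guidance, can be sketched as a simple rule applied when nothing relevant is found in the knowledge base. The keyword list and advice text below are invented for illustration.

```python
import re

# Illustrative only: a "safety first" fallback for questions outside the
# knowledge base. Keywords and wording are assumptions for this sketch,
# not WomBot's actual rules.

SAFETY_KEYWORDS = {"cyclone", "flood", "fire", "storm", "unsafe", "danger"}

GENERIC_SAFETY_ADVICE = (
    "I don't have specific information about that, but your safety comes "
    "first: follow the directions of emergency services, move to safer "
    "ground if advised, and call 000 if you are in immediate danger."
)

def fallback_reply(question: str, retrieved: list[str]) -> str:
    """Decide what to say when the knowledge base may not cover a question."""
    if retrieved:
        return " ".join(retrieved)      # normal, grounded answer
    words = set(re.findall(r"[a-z]+", question.lower()))
    if SAFETY_KEYWORDS & words:
        return GENERIC_SAFETY_ADVICE    # out of scope, but safety wins
    return ("I don't have reliable information on that. "
            "Please contact our office during business hours.")

if __name__ == "__main__":
    # No documents match "Cyclone Alfred", so the bot falls back to safety advice.
    print(fallback_reply("What should I do about Cyclone Alfred?", retrieved=[]))
```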
The WomBot project was made possible through a grant from the Lord Mayor’s Charitable Foundation. Wombat Housing pitched a prototype to demonstrate its potential, helping secure the funding.
Looking ahead, Brazda hopes to scale the project across the broader community services sector and is actively seeking funding and collaboration opportunities.
“I’d love the bot to be upscaled for the whole community sector, so that is something we’re looking into,” she said. “We’re looking for funding partners and partnering agencies as well.”
Brazda has a message for other not-for-profit (NFP) organisations considering experimenting with digital solutions: “Start small. It’s what we did.”
Wombat adopted an “action-learning framework” during the pilot phase, trialling different methods to create the “building blocks” of the tool, which were then incorporated into the model. Testing also provided valuable insights into what worked and what didn’t.
She said the sector must not fear technology.
“AI is not going away. It’s only gaining traction. Why should we be left behind? I think sometimes we're paralysed by the fear of technology and we're so worried about the vulnerable cohort that we work with, but they deserve an excellent service as well.”
“Why wouldn’t we use tools that help our workers and the people we support?”