First published June 14, updated Thursday, July 11, 2019
Community directors know better than most that not-for-profits must do as much as they can with the limited resources they’ve got.
This means properly measuring what you’re doing to see whether you’re having the impact you really want.
This month, ICDA hosted the Practical Impact conference, bringing together top talent in data, evaluation and impact measurement from Australia, New Zealand and the United States. The event took place at Our Community House, ICDA’s new base in North Melbourne, and the new hub for the social sector.
Delegates toured the facility that raises the bar for community co-working and data science sharing, all in an art- and plant-filled space designed for up to 400 workers.
ICDA’s diversity and leadership director, Kylie Cirak, kicked off the event by launching the ICDA Spotlight Report: NFP Impact and Data.
The eight-page report – online now – summarises the findings on data and impact from ICDA’s national survey of nearly 1900 community leaders, and finds that the sector still has a long way to go.
Setting the tone for the event, keynote speaker and eminent economist Nicholas Gruen spilled the beans on the failings of successive governments.
He pointed to a cavernous gap between governments’ intentions and their outcomes, and their frequent failure to pursue the most effective interventions even in the face of overwhelming evidence.
As he arrestingly illustrated with a gruesome image of a foot suffering from poor circulation, "the arteries are willing, but the capillaries are weak".
Dr Gruen said organisations in the NFP sector must be prepared to engage in a more sophisticated, deeper response to the social challenges they are set up to address, and to build the argument for better evaluation.
He is credited with introducing the concept of a federal Evaluator-General to bridge the gap between evaluators and the people running government programs, a notion taken up and championed by the Labor Opposition during the recent federal election campaign.
Andrew Means, the founder of Data Analysts for Social Good and the Impact Lab, based in Chicago, addressed community directors at the Practical Impact conference one week into his Australian visit, hosted by Our Community.
What he’d seen during the week – in which he coached not-for-profits and funders to do better with data – had shown him that Australia was no slouch in taking up evaluation and impact measurement methods, yet faced many of the same hurdles as organisations in the US and elsewhere.
Kicking off with the question: “Why do our organisations exist?”, Mr Means quickly narrowed delegates’ focus with this answer: “We exist to create change”.
He stressed that data had little value without purpose, saying that while there was no lack of data in the sector, there was a significant lack of insight into how to use it effectively to drive change.
Mr Means said not-for-profits and others operated in a “change economy” in which buyers and sellers were able to trade in that rare and valuable commodity.
A key to understanding change, he said, was being able to tell a truthful story about the change any organisation was making. Leaders needed to be able to back up promises with facts and data that demonstrated impact, he said.
Mr Means warned of the trap some organisations fell into of wanting to always tell the “best” stories, buying into the illusion that they were generating a much bigger impact than they really were, while adding to the problem of funding flowing to ineffective programs.
“We look for the most extreme version of our stories and then we tell them. We’re caught in a stories ‘arms race’ [in which] the only way we can get funding is to tell more and more outrageous stories,” Mr Means told delegates.
Instead, evidence of impact must be realistic, unarguable, standardised and fair, he said.
The bottom line: “The money should go to where it’s going to do the most good.”
And that, he said, could mean making difficult decisions about your future, asking, perhaps, whether your clients and members might be better served by other programs.
The straight-talking Robyn Mildon, executive director of the Centre for Evidence and Implementation, warned delegates about bad research that only told you what you wanted to hear and did little to advance your cause.
She urged community directors to cite and use good research, including work that may have already been conducted, to demonstrate the effectiveness of an intervention.
“Your big win is to cite and use good research upfront,” Dr Mildon told delegates.
She said not-for-profits needed to be strategic with their dollars and time and avoid collecting vast amounts of unnecessary data or leaping into full-blown evaluations.
They should be realistic about the kind of impact measures small budgets allowed. The key to success, she said, was “Better data, less of it”.
Dr Mildon echoed the sentiments of Andrew Means, suggesting that organisations must be prepared to follow their own findings, even to the extent of abandoning programs that aren’t working.
Our Community’s Data Lab leader, Sarah Barker, outlined how ICDA’s parent company, Our Community, was doing its bit to help community organisations become data savvy.
She said many organisations were already doing a lot with data, but not necessarily in the most systematic and powerful way.
“It's really about trying to develop a more detailed understanding of what’s going on.”
Ms Barker said Our Community was fine-tuning a framework designed to help organisations better understand their data needs and capacities, but organisations could start by reflecting on their existing work, and by simply asking important questions such as “How many volunteers do we have?” or “Why have people stopped donating?”
Methodist Mission Southern’s Laura Black, in an entertaining address, described how the Dunedin-based organisation had become data driven, to the point where several other New Zealand organisations were now seeking to duplicate its methods.
She was upfront about the challenges faced by an organisation formed in 1890 as it overhauled its social service and education programs across Otago and Southland.
Ms Black accepted that the worst early failures were “self-inflicted mistakes”, such as an early reliance on paper-based reporting, and large staff losses that resulted from workers struggling to cope with new expectations related to outcomes assessments.
As a result of numerous changes to its approach, the service was now considered a model for others, while the shift to better measurement had coincided with revenue more than doubling in four years, as the organisation introduced new services.
Ms Black highlighted three factors that she said were at the heart of the shift: “attitude”, “personnel” and “tools”.
She said “attitude” referred to the organisation’s focus on how its clients could benefit, independent of service providers. “Personnel” referred to the people the organisation regarded as most important when implementing change, and to the strategy of “leading with reassurance”. “Tools” – mostly feedback-informed treatment – were fine-tuned so they could be employed more quickly and effectively.
And her tips for “avoiding the gravel rash we experienced”?
Have tenacity, clarity and patience, and don’t over-rely on intuition.
Community Hubs chief Sonja Hood, who heads a network of welcoming places for migrant and refugee families in 71 locations across four states, said evaluation didn’t have to be overly complicated.
In fact, her view was that collecting too much data could be counterproductive. That’s why Community Hubs compiled multiple surveys into a single smaller survey that now boasts a completion rate of close to 100%.
Dr Hood, a member of the Community Directors Council, which advises ICDA on its direction, said that for too many not-for-profits, data was just "something we aspire to".
“We spend a lot of time tying ourselves in little knots trying to understand the difference between outputs, outcomes and impacts. I mix them up all the time. And we feel a great sense of inadequacy. But I’m here to tell you that you know a lot more than you think you do.”
Economist Nicholas Gruen is one of Australia's top thinkers on evaluation in government. Pictures: Matthew Schulz
Delegates heard from some of the foremost minds in data from Australia and beyond.
Andrew Means stressed the need for organisations to focus on their mission when considering data.
Robyn Mildon from the Centre for Evidence and Implementation had practical advice for community directors.
ICDA’s diversity and leadership director Kylie Cirak emceed the event.
Laura Black told a warts-and-all story of Methodist Mission Southern's transformation.
Sonja Hood of Community Hubs believes not-for-profits are more capable with data than they give themselves credit for.
Jocelyn Bignold of McAuley Community Services for Women has learnt to adapt to the changing expectations of funders and the lessons that come from evaluators.
Ms Bignold said data was not “a given good, just because it’s there”; instead it needed to be put to work, whether in a program targeting marginalised women or in pushing community hubs into new areas.
Dr Hood’s on-the-ground experience had also reinforced the fact that “stories are a data set too” (and often an under-estimated one).
She said data sets also included timetables, time sheets and attendance lists, which could help in answering a crucial basic question: “What do I need to know?”
She stressed that collecting data needn’t be a “compliance exercise” to satisfy funders, urging delegates to “push back against stupid requests from governments”.
Rounding out the event were Jaime de Loma-Osorio Ricon of Banksia Gardens Community Services and McAuley Community Services for Women’s Jocelyn Bignold, who spoke about how evaluation had supercharged their work with vulnerable children and women.
Ms Bignold alerted delegates to the need to be open to evaluation “mavericks”, such as those McAuley had welcomed into the organisation as part of a push to better understand data while helping at-risk women deal with domestic violence and homelessness.
Those experts, she said, brought a fresh perspective to understanding the evaluation needs of the organisation.
She described the journey from simple beginnings – recording information in an Access database – to employing increasingly sophisticated models, professional evaluators and corporate-level consultants to support the organisation’s practice and satisfy the demands of funders.
To nods around the room, Ms Bignold said it remained a challenge for McAuley to secure financial support to fund an evaluation for “a project that everyone loves, but no-one wants to fund”.
“Mostly, though, it’s all about the information that we want. We think we do a good job. Our clients tell us we do a good job, but is it really so?” Ms Bignold said.
Mr de Loma-Osorio Ricon said it was critical that organisations set aside time for the reflection that preceded good evaluation.
This approach had paid dividends for his organisation through its ground-breaking Project REAL (Reengagement in Education and Learning) targeting challenging students affected by trauma, abuse and neglect.
He said “spectacular” results were helping it to create capacity-building programs for schools to better work with those students, using sophisticated analysis to rate and track children daily across 30 categories, cross-referencing that information against schools’ and teachers’ data and feedback.
Mr de Loma-Osorio Ricon said evaluation could be very expensive, and suggested that those thinking of taking the plunge shop around for the best options.
The benefits, though, could be large.
“Evaluators can help you raise your head above the reeds and everyday reality, to look at what you’re doing … then you can start doing better informed work,” he said.
Mr de Loma-Osorio Ricon said his organisation’s work across many schools had now extended to creating a “common rationale”, including common targets and collective impact across 15 partner schools in Melbourne’s north. All of this, he said, began with measuring impact.
Delegates will receive a link to download all presentations. Look out for more reports and video interviews in the August 2019 edition of Community Directors Intelligence.
Tailored training programs can also be designed and delivered to meet your needs, location and budget.