Charles Darwin University

CDU eSpace
Institutional Repository

 
Adapting Evaluation Materials for Remote Indigenous Communities and Low-Literacy Participants

Guenther, John Ch. and Boonstra, Mark (2009). Adapting Evaluation Materials for Remote Indigenous Communities and Low-Literacy Participants. In: 8th ISPCAN Asia-Pacific Regional Conference on Child Abuse and Neglect Incorporating the 12th Australasian Conference on Child Abuse and Neglect, Perth, WA, 15-18 November 2009.

Document type: Conference Paper

Author Guenther, John Ch.
Boonstra, Mark
Title Adapting Evaluation Materials for Remote Indigenous Communities and Low-Literacy Participants
Conference Name 8th ISPCAN Asia-Pacific Regional Conference on Child Abuse and Neglect Incorporating the 12th Australasian Conference on Child Abuse and Neglect
Conference Location Perth, WA
Conference Dates 15-18 November 2009
Conference Publication Title APCCAN 2009 Conference Proceedings
Place of Publication Australia
Publisher Cat Conatus
Publication Year 2009
Total Pages 15
HERDC Category E2 - Conference Publication - Full written paper, non-refereed proceedings (internal)
Abstract Funders (quite rightly) expect that the money invested in social programs will produce results. Increasingly, they also expect evidence of these outcomes to be produced. However, these outcomes, sometimes described in terms of performance indicators, are often prescribed from a western frame of reference and, consequently, there is an expectation that evidence is reported within the same frame of reference. This is of course not a problem when the funder, the service providers and the service users all share the same frame of reference, but what might happen when service users and service providers have a different frame of reference from the funder? Put another way, what happens when the social norms, values and expectations of service users differ so much from those of the funder that performance indicators determined by the funder are effectively meaningless to the program's intended targets? Unfortunately, what is likely to happen is that the outcomes expected of a program will not be achieved. This may not be because the program is not working, but rather because 1) the program objectives are reinterpreted to suit the local context, or 2) the instruments used to measure outcomes are inadequate for the task.

The latter issue is the challenge faced by Families and Schools Together (FAST), a program designed to build capacity and resilience in families, strengthen family relationships and build connections between families and communities. FAST is an international program with a strong evidence base, and its emphasis on evaluation is a strong selling point for funders who are seeking to make a difference. FAST has been delivered in remote communities of the Northern Territory for a number of years. However, what FAST found was that in remote contexts the evaluation tools simply did not work: the language of the psychometric tools and the abstract concepts they used were complex and at odds with the way Indigenous people would describe things. They felt that a different tool was required, one that reflected both the outcomes anticipated by the evidence base and the language and frame of reference, or worldview, of local Indigenous people.

This paper considers approaches to evaluation in this context. It also describes a process used by evaluators and program staff to produce a tool that met the dual needs of satisfying the funder and representing the outcomes as they were perceived by participants.
 
Created: Mon, 29 Mar 2010, 17:37:20 CST by Sarena Wegener