A Case Study on the Use of Developmental Evaluation for Innovating: Navigating Uncertainty and Unpacking Complexity

Lam, Chi Yan
Twitter , Systems Thinking , Evaluation , Social Media , Complexity Thinking , Classroom Assessment , Teacher Education , Program Evaluation , Assessment Pedagogy , Innovation , Development , Educational Technology
Developmental evaluation (Patton, 1994, 2011) is one of the latest approaches to be introduced into evaluation practice. It purports to support the development of social innovation by infusing evaluative thinking through collaboration between program clients and the developmental evaluator (Patton, 2011). In an attempt to build “practical knowledge” (Schwandt, 2008) about this emerging approach, this research investigates the capacity of developmental evaluation to support innovation. This thesis reports on a case study of the Assessment Pilot Initiative (API), in which developmental evaluation was used to support the development of a novel approach to teacher education. Charged with a vision to innovate their own teaching practices and the learning of teacher candidates, the instructors in the case invited a developmental evaluator on board for a yearlong collaboration. Although the instructors and the developmental evaluator were uncertain about the outcome of the initiative and about how best to proceed, the engagement resulted in a novel adaptation of a microblogging web technology (Twitter) that was piloted with a group of teacher candidates. This thesis presents an analysis of the development process and of the contributions developmental evaluation made in enabling the development of the API. The analysis is anchored in the records of the program's development and in the perspectives of the program clients and the developmental evaluator. Analyzing the program development records for developmental moments revealed trends and patterns that, when triangulated with interview data from program clients and with reflections from the developmental evaluator, provided intricate insights into how the development came about and into the contributions developmental evaluation made in this case. Development of the API proceeded in a highly nonlinear, emergent process through six foci of development.
Critical to addressing the uncertainty and complexity that might otherwise have inhibited development, developmental evaluation enabled a data-informed approach that lent a quality of responsiveness to the emergent, evolving nature of the initiative. The developmental evaluator was instrumental in identifying activities that helped make explicit the values and assumptions underpinning the initiative, and in structuring a learning framework to engage program clients in sense-making. The notion of design emerged from the analysis as an important function of developmental evaluation. Implications of the findings are discussed.