GND: 133179672
Information on the author is retrieved from Entity Facts (a data service of the DNB, the German National Library), DBPedia, and Wikidata.
Michael Quinn Patton
Alternative spellings: Michael Q. Patton; Michael Quinn Patton; Michael Quinn-Patton
Born: 1945. Biographical note: Independent consultant with more than 40 years' experience conducting applied research and program evaluations. Place of activity: Minnesota
Michael Quinn Patton (born 1945) is an independent organizational development and program evaluation consultant and former president of the American Evaluation Association. He is the founder and director of Utilization-Focused Evaluation. After receiving his doctorate in sociology from the University of Wisconsin–Madison, he spent 18 years on the faculty of the University of Minnesota (1973–1991), including five years as Director of the Minnesota Center for Social Research and ten years with the Minnesota Extension Service.

Patton has written many books on the art and science of program evaluation, including Utilization-Focused Evaluation (4th ed., 2008), in which he emphasizes the importance of designing evaluations to ensure their usefulness, rather than simply producing long reports that may never be read or never lead to practical changes. He has written about evaluation, and worked in the field, since the 1970s, when evaluation in the non-profit sector was a relatively new development.

In "Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use," Patton makes a convincing case that evaluation can also be useful when there is no fixed model being improved (as in formative evaluation) or tested (as in summative evaluation). In cases where there is not yet a clear model, or where the environment is too complex and changing too fast for the model of practice ever to be fixed, developmental evaluators can be of great assistance by helping people articulate their hunches and hopes, do "vision-directed reality testing," track emergent and changing realities, and "feed back meaningful findings in real time so that reality testing facilitates and supports the dynamics of innovation." (p. 7)

This type of evaluation is particularly helpful in the context of social innovation, where "goals are emergent and changing rather than predetermined and fixed, time periods are fluid and forward-looking rather than artificially imposed by external deadlines, and the purposes are innovation, change, and learning rather than external accountability (summative evaluation) or getting ready for external accountability (formative evaluation)." (p. viii) Instead of evaluating a program to determine whether resources are being spent on what they are supposed to be spent on, developmental evaluation helps answer questions like, "Are we walking the talk? Are we being true to our vision? Are we dealing with reality? Are we connecting the dots between here-and-now reality and our vision? And how do we know? What are we observing that's different, that's emerging?" (p. 13) (Source: DBPedia)