Public Art: A Guide to Evaluation
Public Art: A Guide to Evaluation is the culmination of ixia’s work on public art evaluation as seen in the other items. Published in January 2009, it is ixia’s most recent publication. It is best used by professional evaluators: the authors assume a certain knowledge base, and novices may find it challenging. The guide has four main sections: one on why evaluation is valuable, the next on ixia’s Matrix tool, the third on ixia’s Personal Project Analysis tool, and the fourth on indicators. In making the case for conducting evaluation, the authors offer reasons for evaluating and explain why people often resist it. They also present evaluation as part of a learning cycle, with a clarifying graphic. They then define two types of evaluation: outcome/summative evaluation – which includes identifying outputs, outcomes, and impacts – and process/formative evaluation. The second section is designed around ixia’s Matrix tool (which is also presented in Evaluation Toolkit; Research on Public Art: Assessing Impact and Quality; and Public Art Evaluation Tools: Matrix and Personal Project Analysis); the guide walks readers through the Matrix. The Matrix encourages collaborators to have facilitated conversations in order to better understand their values (both shared and contrasting) as they embark on public art collaborations. 
The Matrix has four main sections and asks users to score the project on each subset dimension: Artistic Values (visual/aesthetic enjoyment, design quality, social activation, innovation/risk, host participation, and challenge/critical debate); Social Values (community development, poverty and social inclusion, health and well-being, crime and safety, interpersonal development, travel/access, and skills acquisition); Environmental Values (vegetation and wildlife, physical environment improvement, conservation, pollution and waste management – air, water, and ground quality – and climate change and energy); and Economic Values (marketing/place identity, regeneration, tourism, economic investment and output, resource use and recycling, education, employment, project management/sustainability, and value for money). There is a nice, brief discussion of each of the “values” on page nine.

The third section covers ixia’s Personal Project Analysis tool, used for process evaluation of a particular project or program. It walks users through the purpose of the tool, when and how to use it, and its objective – to create a dynamic internal picture. The tool itself is two pages long and asks users to score, on a scale of zero to five, their views of the project they wish to assess on various dimensions: importance, enjoyment, difficulty, visibility, control, initiation, stress, time adequacy, outcome, self-identity, others’ views, value congruency, progress, risk, absorption, competence, autonomy, and legacy. Users are then asked to score their support for the project. Next, users answer a series of general, open-ended questions about the project, followed by open-ended questions about its context.

The last main section of the guide discusses indicators. While other ixia publications had little to say on the topic, this guide defines indicators, presents the challenges of selecting them, and gives advice on identifying them. 
The authors delve into this discussion, though they do not offer examples. They direct users to government performance indicators as a baseline source, but that is where the discussion ends. As the Matrix tool is designed around “values” that seem quite similar to indicators, there may be an issue of British versus American terminology at hand: the authors may use “indicators” and “values” interchangeably, but American readers may be left guessing. At the conclusion of the guide is a helpful summary graphic/table that presents the various stages of a project – project development, start of project, project duration, and project completion – and the elements that should be addressed at each stage: the what, when, who, and how.

Appendix 1 presents explanatory notes on the four values (and their subsets) in the Matrix tool, elaborating on each subset item with brief definitions. Appendix 2 is a table of ways to collect qualitative evidence – such as questionnaires, interviews, discussions, media, diaries, participatory techniques, and observation – briefly noting the advantages and disadvantages of each; it is a handy summary of qualitative design approaches. Appendix 3 is a lengthy section on government performance indicators. As ixia is based in England, this section may offer American readers some useful ideas about indicators, but parts may not be relevant, as the governmental information available differs by country.

Ixia, the public art think tank, is a registered charity and a regularly funded organization of Arts Council England. Its aim is to provide an independent and objective view of the factors that affect the quality of artists’ work in the public realm by undertaking research, enabling debate, and effecting change at a strategic level. Its corporate strategies are the pursuit of objectivity and the building and transfer of knowledge and competencies. 
Ixia sees the spectrum of artistic practice represented by the term “public art” as encompassing art commissioned as a response to the notion of place, art commissioned as part of the designed environment, and process-based artistic practice that does not rely on the production of an art object. It defines public art as a process of engaging artists’ creative ideas in the public realm. In 2004, ixia commissioned OPENspace, the research center for inclusive access to outdoor environments based at the Edinburgh College of Art and Heriot-Watt University, to research ways of evaluating public art. Much of the content of this guide has been informed by that academic research. However, the guide’s emphasis and content have also been shaped by feedback from ixia’s Evaluation Seminars and by fieldwork conducted by ixia and consultants who have used its Evaluation Toolkit.