1 Introduction

This paper aims to contribute to new ways of exploring human-computer interaction by positioning an HCI Case Study within an artistic environment, thus transforming the engagement experience and generating new knowledge that can be applied across disciplines. By using an art exhibition as a platform for the Case Study, instead of confining it solely to a laboratory or gallery environment, certain constraints are lifted and data more specific to the research can be collected, with the aim of revealing new behavioural patterns. Some tendencies can be clearly identified and improved upon, while others fall into a more exploratory vein. The scope ranges from working with emerging art practitioners, their early work, and experimental spaces, to established practitioners in later stages of their work, in more finished spaces. This exploration sets the groundwork for a second Case Study, as indicated in the methodology section (Fig. 1). In revealing the demands of art designed, prototyped and evaluated within an HCI context, the authors have been able to test an articulated methodology for experimental making and exhibiting, which resulted in engaging an audience and evaluating their experience at the prototype phase of the work of emerging practitioners. In collecting, collating and analysing the data, and then presenting it here, the authors aim to contribute new techniques for evaluating interactive systems by staging experimental curatorial interventions away from the laboratory, in socially engaged spaces.

The curatorial intervention discussed in this paper is Denouement, the first of two experimental interventions into curating and evaluating digital interactive art. This study is situated within a larger inquiry into detecting and understanding human activity in human-computer interaction. The methodology and engagement frameworks sit within a practice-based research approach to creative practice. Here, art and technology lead the inquiry, highlighting the importance of HCI methods such as evaluation, reflection and iterative approaches to both artistic practice and the refinement of the curatorial appreciative system. The research process is then situated within the scope of HCI interactive scenarios and the experimental user interface. These experimental interventions contribute to a PhD inquiry identifying emergent techniques for evaluating participation in interactive art, and which variables might contribute to the audience’s experience of it.

2 Process

The process employed for the case studies is practice-based, drawing upon the action research approach made famous by Lucy Suchman in the 1980s, relating to the way that humans and machines communicate with each other [9]. Relevant interdisciplinary models for engagement include E.T. Stringer’s approach to educational action research, using terminology like LOOK \(\rightarrow \) THINK \(\rightarrow \) ACT. His work in 2003 is based in social reform and draws on Kurt Lewin’s Spiral of Steps model from 1947. Here Stringer attempts to depict comparative research on the conditions and effects of various forms of social action (such as art), and research leading to social action (art research) [8]. Social scientist Richard Johnson revised Lewin’s Spiral model in 2008, allowing for responsive engagement and utilising words such as UNFREEZING \(\rightarrow \) CHANGING \(\rightarrow \) REFREEZING [7]. Between 2006 and 2010, artists and social scientists responded to a call for more distinct language frameworks describing how audiences experience interactive art, specifically within the context of HCI. As detailed by Edmonds in [3], Brigid Costello developed what she calls a ‘pleasure framework’ for play. This taxonomy articulates 13 different descriptions of pleasure that people experience when engaging with gamified or interactive art forms; dialogue emerged around descriptions of exploration, competition, captivation, sympathy, camaraderie, and subversion, to name a few. In related research, Zafer Bilda has developed a model of engagement, specifying interaction modes and phases that a user might move through when experiencing interactive art. Modes are described with words like unintended, deliberate, intended/in control, intended/uncertain, and unexpected; while phases are described in stages of learning, adaptation, deeper understanding and anticipation.
In developing interactive systems for expert users rather than general audiences, Andrew Johnston has identified language around digital music, namely engagement that is instrumental (the musician plays the system), ornamental (the system adds something to the original sound), and conversational (where the expert and the system respond to each other) [3].

Around the same time, a specifically curatorial take on these approaches is outlined in Lizzie Muller’s close reading and application of Donald Schön’s reflection-in-action approach to curating [5]. She developed Beta_space, a public lab for testing interactive art systems and processes at the Powerhouse Museum, Sydney, between 2004 and 2011; in fact, much of the foundational work on the above frameworks was also tested in Beta_space. Having learned to curate under Muller at the Beta_space public laboratory, Turnbull Tillman works from Muller’s experimental approach to curating interactive art, which applies HCI methods in design and implementation via her documented process of designing \(\rightarrow \) experimenting \(\rightarrow \) reflecting \(\rightarrow \) prototyping \(\rightarrow \) iterating \(\rightarrow \) publishing [5]. The experimental curatorial frameworks established by Muller and tested in Beta_space are what the researchers are working from, extending and experimenting with here.

In measuring the experience of interactive art across disciplines and groups, conceptual development takes place with the curator offering a brief and the practitioners responding to it. This is not to say the curator is the catalyst of an artist’s creative idea or a technologist’s animation of that idea, but the curator is certainly becoming a participant in new media and interactive practice. Meeting with, evaluating, iterating and exhibiting prototypes with an emerging or established practitioner feeds the reflective curatorial process, as does the evaluation of audience experience. Each of the steps below in the curatorial, exhibition and evaluation process represents the creative and professional progression of Turnbull Tillman’s practice-based research, incorporating the design functions of art experience through human-computer interaction [1].

Fig. 1. New Media Curation research methodology/process. (2015)

While detailed accounts of the theoretical and philosophical frameworks for this study can be found in earlier publications by these authors [10], this paper presents the pragmatic results of the examination of responsive systems, where media artists, curators and audience members engage critical and creative spaces via speculative design, experience and evaluation, as with [6]. Reflection on the part of the researchers is presented here, in analysis of the data collected. The exhibition under examination was produced by PhD researcher Turnbull Tillman through the research initiative New Media Curation (NMC)Footnote 1. Finally, in working through the reflection-in-action practice-based research methodology (Fig. 1), this study is an opportunity for Turnbull Tillman to reflect on the way her appreciative system has developed over 8 years of independent curation, taking note of any behaviours or critical language that may have emerged around the audience’s experience of interactive art as her practice developedFootnote 2. The methodology for presentation is well situated in the Creative Robotics LabFootnote 3, where the authors’ studies are based on responsive systems and human interaction.

3 Results from Intervention #1: Denouement

3.1 Intervention #1 Exploratory: Denouement

SOMA3412 and Musify+Gamify (26 May–6 June 2015) www.musifygamify.info

Denouement is a curated set of gamified film clips created by emergent practitioners, namely students of the undergraduate media arts course SOMA3412Footnote 4. This curatorial intervention was staged from February to May 2015, and its design occurred in four stages:

  1. Working with the students in SOMA3412 in Semester 1 of the 2015 school year to bring viable ideation to a set of working prototypes suitable for exhibition;

  2. Securing partners, exhibition space, and ethics approval to promote and support a study that disrupts the audience’s experience of interactive art in a public space;

  3. Staging the exhibition and performing evaluation on willing audience participants; and

  4. Producing a reflective practice exercise, as per Muller and Schön.

3.2 Methodology (Stages 1 & 2)

Turnbull Tillman worked with Instructor Tom Ellard and the students from SOMA3412 to develop prototype artworks. This happened by way of a preliminary meeting between Turnbull Tillman and Ellard, a pitch to the students during a lecture followed by a call to participate as an extracurricular activity (i.e. their grade in the course would not be affected), and visits to the media studios to check progress on the developing artworks in both ideation and functionality. She also worked on securing a site and exhibition to experiment within. As part of the VIVID Music Festival, fellow media arts colleagues Oliver Bown and Lian Loke were curating an exhibition at the Seymour Centre Theatre called Musify+GamifyFootnote 5. NMC was hired to produce the exhibition component of the project, and after surveying the site, Turnbull Tillman became aware of an outdoor screen and courtyard that might be used to display the prototype interactive works for the curatorial intervention. With all of the other interactive artworks and the performances situated inside, curators Bown and Loke agreed that the courtyard needed an attractor [3] and were amenable to the proposed intervention.

3.3 The Artworks and Designed Engagement Experience (Stage 3)

While 10–12 students showed interest in exhibiting prototype works, only two from the SOMA3412 class were selected for this experimental setting: Puppet Boyfriend, by Lauren Wenham, and Apollo Vs/..., by Seunghyun Kim. This was largely because these were the only two students who followed through on their interest to exhibit and test their prototypes, which were both robust enough for exhibition with the public and sophisticated enough, in both ideation and functionality, to engage that public. Wenham’s work consisted of a series of sonified and animated photographs that the audience engaged with through a track-ball mouse. By moving the track-ball and clicking the mouse buttons, a simple switch patch created in IsadoraV2 gave the audience control over Lauren’s Puppet Boyfriend for as long as they chose to choreograph his movement (Fig. 2).

Fig. 2. Lauren Wenham, Puppet Boyfriend, with IsadoraV2 patch visible. (2015)

Kim’s work was slightly more complex: a game comprising tiny moving squares meant to represent cells. Here the audience member (the offence) engaged with the work to battle a disease (the defence). Designed in Adobe Flash, it had an engagement timeline of no longer than 5 min. The input mechanism for Apollo Vs/... required a keyboard with a trackpad, for both movement through the gamespace and the generation of cells on either side of the battle (Fig. 3).

Fig. 3. Installation shot of Denouement, consisting of Seunghyun Kim’s Apollo Vs/... (2015) and Lauren Wenham’s Puppet Boyfriend (2015). Plinth and engagement mechanisms also shown.

The overall engagement was speculatively designed by Turnbull Tillman as a single disruptive experience prior to entering the main exhibition space. The screens displaying the works served as the attractors, while the plinths displaying the input mechanisms (trackpad/keyboard and track-ball mouse) served as sustaining mechanisms [3]. As such, the questions on the audience evaluation survey queried the two artworks as one experience (Fig. 4). Surveys were selected over semi-structured interviews or video-cued recall because of the location, the limited time the audience could commit before viewing a ticketed concert, and the suggested minimum sample size of 100 people for the study.

Fig. 4. Engagement (foreground) and Evaluation (background) in situ.

Table 1. Age distribution.

Immediately after the participants interacted with the artworks, they were asked to complete a simple survey designed specifically for this experiment. The survey consisted of a set of 16 questions divided into three groups: Demographic, General, and Experiential (Appendix). The data was then captured, categorised and analysed to pinpoint any emergent trends. In the next section, interdisciplinary collaborators Forseck and Turnbull Tillman present the findings most relevant to their investigation into curating digital interactive art, beginning with the audience’s experience of and engagement with it.

3.4 Data

Demographic: During the six days of data collection, 96 subjects were surveyed, with ages ranging from 18 to 65, a mean of 30.62, and a standard deviation of 10.85; the age distribution is shown in Table 1.
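Descriptive statistics of this kind can be reproduced with a few lines of Python. The age values below are illustrative placeholders only, not the actual survey records (which are not published here):

```python
import statistics

# Hypothetical ages for illustration; the real study surveyed 96 subjects
# aged 18-65 (mean 30.62, standard deviation 10.85).
ages = [18, 22, 24, 27, 29, 31, 35, 42, 55, 65]

mean_age = statistics.mean(ages)
# statistics.stdev computes the sample standard deviation (n - 1 in the
# denominator), the usual choice when subjects are a sample of a
# larger population.
sd_age = statistics.stdev(ages)

print(f"n = {len(ages)}, mean = {mean_age:.2f}, sd = {sd_age:.2f}")
```

Whether the reported 10.85 is a sample or population standard deviation is not stated in the paper; the sketch assumes the sample form.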

The first two indicators on the survey queried general audience demographics: gender (Table 2) and profession. We also tallied how many people appeared in each category. The male/female ratio was 1.07:1, slightly higher on the male side, with only a very small number of people preferring not to reveal their gender. Regarding profession, a high number of participants were the kinds of professionals one would expect to attend an interactive exhibition: students at the nearby university, or people working in relevant academic fields, the arts, computing & IT, or engineering. A few financiers and retirees attended, and the ‘other’ category was made up of independent subject types such as travellers, baristas, and massage therapists.

Table 2. People in attendance per day, by gender.

General (Questions 1–3): Questions 1–3 were included for the venue’s marketing team, to monitor what effectively attracts people to the site. The results showed that the festival branding and word of mouth (or friends) were the main reasons people had heard about the program and attended, and also that the majority of those who did so had been to the site previously.

Experiential (Questions 4–13): The remaining questions centred on what we, the researchers, wanted the subjects to explore regarding their experience, by recording what they remembered about it immediately afterwards (Fig. 4). They were asked general questions, mostly to determine in a “yes/no” way whether the prototype works created by emergent practitioners were effectively recognised as artworks (quantifying) and, if they were enjoyable, why (qualifying). In several of the questions we offered a “why” qualifier to a “yes/no” response, prompting the subject to think about why they answered yes or no. Sometimes very detailed responses were obtained, but if the question was too direct or a bit confronting, we tended not to receive an answer at all, as indicated by the “no comment” section in Q4a being the largest percentage (Table 3). As the results show, the audience members were experiencing a humanities definition of authenticity in relation to the artworks: one that recognised the experience as related to art and as enjoyable enough to maintain engagement [4, 12, 13].

Table 3. Reasons for lack of awareness ranked by percentage.

3.5 Results

This Case Study was designed to record and analyse, in an exploratory way, how and why people engage with interactive art. This was done through applying a curatorial practice-based research approach to working with emerging practitioners and partnering with a public festival where participating artists and curators were interested in learning more about the medium they are working in. The survey used to evaluate the audience’s experience engaging with the work asked a series of open or closed questions. As the results will inform the next Case Study, this section contains a brief report of the data generated by these responses.

For the next experiment, which will be designed around a more specific interactive experience, we suggest asking fewer open questions, resulting in clearer data. Q4a, for example, was so open that a third of the participants could not articulate the qualifying aspect of it (Table 3). In the instances of “no comment” answers, if we wanted to know why, we would have to contact the subjects and perform a follow-up interview. Alternatively, in the next experiment, participants will not be permitted to leave until all questions are answered. As this first experiment was an exploratory study, we left issues like this open to the subject.

We see an example of this in Q5, where a “yes/no” question was asked and received a largely positive response of 5.4:1. When again followed with a qualifying question, the largest percentage of subjects didn’t know or couldn’t verbalise how they knew the work was interactive, and didn’t understand until they were told by the researchers.

The authors’ proposed explanation for these results is that there is a difference between the two fields from which we approach the data. From a design/engineering point of view, quantitative data is more useful than the qualitative, subjective data collected from Question 5a. This is, again, largely because the question is too open, and more than half of the participants who were unable to identify the intervention as interactive couldn’t articulate an answer (Table 4). However, a possible way to fix this situation is for the researchers to follow up and interview those who responded with no clarification, asking them to explain or to explore the grey areas of their experience.

Table 4. Responses of subjects who couldn’t identify the Intervention as interactive.

The trend changed for Q6 and Q6a. While more people answered no to the closed question, people were a little more specific in the open question. Still, the largest share of answers was “not applicable”, and “no comment” was tied with understanding that the work was interactive within a timeframe of under 10 s or between 1–2 min (Fig. 5). Though Q8 was an open question, we received a variety of answers, the largest group of which was NOT a “no” or “I don’t know” response. This leads us to conclude that not only are the works recognisable and effective, but that they are so in relation to art or design related subject matter, largely gaming (Table 5).

Fig. 5. Survey question 6a

Table 5. What the artworks made the subjects think of.

Q9 concerns engaging specifically with digital interactive art, with a response ratio of 1:1.77 leaning towards a lack of engagement. But considering those who responded ‘yes’, we can see again that the works were largely recognised as digital art and/or media, by 59.4% (games, art, dance, media, movies, music, emotions, and people; as shown in Table 5). Those who associated the works with these also clarified that they had seen similar work at university or school, which speaks to a generic sophistication, with the emergent practitioners successfully producing at a formal university level.

Q10 is an equaliser. Where Q9 asked about interactive art, Q10 asked how many of the subjects engaged with art in general, with a positive response of 2.8:1, and then where (Table 6). Here the “no” response and “no comment” qualifiers are about even. Beyond that, most subjects engage with art at art galleries, through music, or through digital media, as would be expected from this audience.

Table 6. Responses to where the subjects engage with art.

The responses to Q11 are quite encouraging. The majority of people could conceive of the works being exhibited in other spaces, though they largely didn’t comment on where. Where they did comment, the two biggest categories were a gallery or any public space. Some of the strongest answers in this section were a “Private/Dark Room”, a “3-D facade projection”, “shopping malls”, “train stations”, and even an “e-birthday card”.

In Q12 and Q13 we asked some of the more pointed questions that are hard to posit: whether subjects would revisit this kind of work and, if so, why. 91% of the subjects said YES; 36% of them said because they had fun, 21% because it was interesting, and curiosity and uniqueness tied at 9%.
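Tallying coded free-text answers into percentages like these is straightforward to sketch. The counts below are hypothetical, chosen only to mirror the proportions reported above rather than taken from the actual survey records:

```python
from collections import Counter

# Hypothetical coded answers to "would you revisit, and why?".
# The real survey coded free-text answers into categories; these counts
# simply reproduce the reported proportions (36% fun, 21% interesting,
# 9% curiosity, 9% uniqueness) out of an illustrative 100 responses.
responses = (["fun"] * 36 + ["interesting"] * 21 +
             ["curiosity"] * 9 + ["unique"] * 9 + ["no comment"] * 25)

counts = Counter(responses)
total = sum(counts.values())

# Percentage of subjects per category, largest category first.
percentages = {cat: 100 * n / total for cat, n in counts.most_common()}
for cat, pct in percentages.items():
    print(f"{cat}: {pct:.0f}%")
```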

We felt positive about these responses, even about the more specific Q13, on how the experience might be improved. The largest number of subjects wanted improvements relating to the design. As a prototype and experimental exercise for the makers and the producer, this makes sense. After this, 13% wanted a better interface, 11% a better sound/audio integration, and 8% wanted clearer exhibition design.

3.6 Detailed Analysis

Interactivity and Engagement: Overall, this first experiment is considered successful as an intervention within a larger exhibition environment, largely because people recognised it as art that was interactive, and because they were willing to both acknowledge and participate in realising this art form. As the data show, people came along to the larger event, noticed our intervention, used it and had fun, and were interested in and engaged with both the systems and the research.

Further analysis of the collated data shows that people were confident of the interactivity present in both pieces and engaged with them quickly, at a ratio of 5.4:1. Furthermore, this happened even when it was not completely clear to people how the interface worked, since far fewer (2:1) of the surveyed people claimed to have understood it.
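Ratios such as 5.4:1 fall directly out of the raw yes/no tallies. The counts here are hypothetical, back-calculated to reproduce the reported figure rather than drawn from the survey records:

```python
# Hypothetical yes/no tallies consistent with the reported 5.4:1 ratio;
# 96 subjects were surveyed, though not every subject answered
# every question.
yes_count = 81
no_count = 15

# Ratio of "yes" to "no" responses, conventionally written as X:1.
ratio = yes_count / no_count
print(f"{ratio:.1f}:1")
```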

Other indications of success include that the works were recognisable in their prototype form as interactive artworks, and that professionals in similar and related fields recognised this, though they wanted more from the experience once they hit the limitations of the systems. Table 7 below shows the percentages to which subjects were aware of the works and how they understood that the works were meant to be interactive, ranging from didactics to visual cues to actually being told as much by the researchers. From this data we are comfortable concluding that, with 91% of the audience willing to return and 2/3 of this total qualified with a positive response, our audience experienced the works in an authentic manner.

Table 7. Reasons given for awareness of an interactive piece.
Table 8. Percent of people that would engage with this kind of work in the future.
Table 9. Reasons given for willingness to engage with this kind of work in the future.

Repetition and Improvement: Another important factor the researchers noticed in the audience’s response is their willingness to engage with interactive art: more than 90% (Table 8). The reasons provided for this willingness are varied, with ‘Fun’ and ‘Interesting’ the most common at more than 55% combined (Table 9). Because of these results, we (the art and technology collaborators who delivered the experiment and analysed the data) see value in the qualitative/quantitative analysis, and can better advise each other on how to formulate more focused questions in the next survey, relating to more established practitioners and more finished works.

Design for the next experiment in this curatorial study relies on what the audience felt could be improved in the design of this intervention. Table 10 shows the percentages of what the audience felt could be improved in the overall design and the engagement factors.

Table 10. Suggestions given for improvement of the interactive piece.

4 Discussion

In conclusion, the researchers are satisfied with the intervention inasmuch as it is meant to be exploratory, to gather general information about how audiences experience interactive art within the scope of HCI user scenarios and an experimental user interface. In this early stage, the results remain general, with not all the data being usable without further investigation. Important findings that the authors did take away from this intervention, and which will be incorporated into the design of the second Case Study are:

  1. When there are multiple artworks with different engagement mechanisms, they should be treated as independent interactions within the survey. Though both works were presented as a single disruption and share the same fundamental principles, they operate in different ways, and therefore the experience differed for the participant. This is articulated in the participants’ desire for a cleaner, better design, largely in regard to the interface (Table 10).

  2. We found that some of the questions asked were too open even for an exploratory study, which confused the participants and made it difficult for them to articulate the hows and whys of their engagement. This rendered the analysis of certain data too broad to accurately identify any possible pattern. While this doesn’t necessarily mean the questions were badly designed for the experiment, there is an opportunity in the second intervention to focus the questions more closely on the exhibition space (and place) as part of the experience (Table 10).

  3. Both the curatorial methodology and process call for a reflective practice Case Study (stage 4 of Denouement’s design). At the time of this initial exploratory Case Study, Turnbull Tillman was scheduled to produce ISEA2015: Disruption in Vancouver, British Columbia. Learning from the disruptive nature of Denouement, Turnbull Tillman was able to iterate her methodology, again in a public art setting, this time analysing her own process and honing her appreciative system rather than the audience’s experience of the work. Her journey and findings are detailed in the co-authored book chapter Disruption and Reflection: A Curatorial Case Study in [11]. The new Case Study will again focus on the audience’s experience.

5 Conclusion

This paper reports the results of the first experimental intervention made by the collaborators to identify emergent techniques for curating, evaluating and speaking about interactive art across disciplines. In addition, this paper aims to contribute to new ways of exploring human-computer interaction by positioning the Case Study in an artistic environment, thus transforming the engagement experience and incorporating new terminology. There are some tendencies that can be clearly identified and improved upon, and others that fall into a more exploratory vein. This exploration sets the groundwork for a second intervention as per Turnbull Tillman’s iterative methodology. We can now incorporate more details based on what we have discovered here. In collecting, collating and analysing the data, and then presenting it here, we are hoping to contribute to new techniques in evaluating interactive systems through staging experimental interventions away from the laboratory and into social spaces.