Publicly available. Published by Oldenbourg Wissenschaftsverlag, April 1, 2022

XR Art and Culture: Successful Collaborations in Interdisciplinary Development Processes

Maja Stark

    Maja Stark has coordinated the HTW Berlin’s INKA project AURORA School for ARtists since 2018 and XR_Unites since 2020. She lectures on topics related to XR in art and culture, publishes, conceives and organizes events at the intersection of culture and computer science. Maja Stark studied art history with minors in French and Italian at the Freie Universität Berlin and at the Sorbonne IV in Paris. Her focus as an art scholar has been on art in extended reality (XR) for several years. After a traineeship at Deutscher Kunstverlag, she was Editorial Coordinator at the Cluster of Excellence Image Knowledge Gestaltung at Humboldt-Universität zu Berlin from 2015 to 2018.

Elisabeth Thielen

    Elisabeth Thielen studied Applied Computer Science and works as a research assistant at the HTW Berlin. As part of the INKA research group and the EU-funded APOLLO project, she develops and manages various interactive augmented and virtual reality developments in collaboration with the Konzerthaus Berlin that combine classical music with modern mediation technology. These include, for example, the first virtual string quartet in augmented reality (2018) and the interactive composition Umwelten (2020). From 2018 to 2020, she was also a staff member of the EU-funded AURORA School for ARtists at HTW Berlin, where she trained artists in augmented reality and supported them in the implementation of creative applications in the AURORA production lab.

Christoph Holtmann

    Christoph Holtmann is a research assistant and an experience and software developer at the HTW Berlin focusing on virtual and mixed reality, gamification, education, and culture. He is particularly interested in innovative ideas, concepts, and technologies. He also teaches and enjoys passing on his knowledge.

André Selmanagić

André Selmanagić studied computer science and media at the HTW Berlin, where he has been working as a research assistant in the research groups Creative Media and INKA since 2012. From 2018 to 2021 he was one of the coordinators of the AURORA School for ARtists, where he taught augmented reality to arts and culture professionals. He also supported them in developing their own creative applications. He finds it particularly exciting that extended reality applications require entirely new input/output modalities and thus novel approaches to interaction design.

Michael Droste

Michael Droste studied computer science at the HTW Berlin. He first encountered immersive technologies like virtual and augmented reality during his employment as a research associate at the research group INKA. His work as a developer focused on human-computer interaction and the implementation of interactive applications that are easily used and understood by users. The field of extended reality intrigues him because of the immersive experiences and the effect they have on users.

Leonid Barsht

Leonid Barsht is a developer and research assistant at the HTW Berlin’s INKA project XR_Unites. From 2018 to 2021, he worked at the AURORA School for ARtists. There he developed augmented reality applications in cooperation with creatives from various fields, such as the Red Dot Award-winning application New York, New York by designer Sarah Müller. He also teaches the basics of augmented reality as a lecturer – from 2019 to 2021 as part of AURORA and from 2020 to 2021 also in seminars for students of Applied Computer Science at HTW Berlin.

From the journal i-com

Abstract

In this paper we present different approaches for interdisciplinary collaborations between art and culture professionals or cultural institutions on the one hand and the interdisciplinary XR experts of our research group on the other. We discuss lessons learned and best practices derived from multiple years of experience in interdisciplinary collaborations. Different project settings need different solutions: with our contribution, we hope to show exemplary ways toward successful interdisciplinary collaborations at the intersection of culture and computer science – and to highlight the potential of such projects for the development of new ways and tools of collaboration.


1 Introduction

Founded in 1999, the research group INKA (information and communication applications) at the research center for culture and computer science (FKI), HTW Berlin – University of Applied Sciences, has realized many research projects at the intersection of culture and computer science. Together with art and culture professionals (ACPs) and cultural institutions (CIs) such as museums and concert halls, the interdisciplinary project teams develop mobile applications with a focus on extended reality (XR) and explore innovative ways of connecting culture and state-of-the-art technologies.

In the context of a lively exchange within the INKA research group, this paper is a joint initiative of interdisciplinary researchers in three ongoing INKA projects – APOLLO (2016–2022), AURORA (2018–2022), and XR_Unites (2020–2023) – to share their experiences based on six years of close collaborations with ACPs and CIs. The paper will provide an overview and analysis of collaborations with experts at the CI Konzerthaus Berlin (APOLLO) and with ACPs from the independent art and culture scene in Berlin (AURORA and XR_Unites). This main part will lead to recommendations for different collaboration settings, but also to new ways and tools that could facilitate the collaborations between ACPs, CIs, and XR experts in the INKA project XR_Unites and beyond.

Thus, the goal of the paper is to show solutions for different project settings to support developers and other XR technology experts in comparable settings in their decision on how to involve ACPs and CIs in the technological development – and how to be involved in their part of the common creative process.

2 Related Works

In order to establish a frame of reference for the experiences and best practices gathered during our collaborative projects with CIs and ACPs, we will review the development process and prototyping tools for XR applications in general and discuss publications that describe interdisciplinary collaborative development of XR applications.

Figure 1: Iterative process.

Modern software development is often a dynamic process in which developers, designers, and other stakeholders iteratively improve a product through prototypes and mockups (Figure 1). While there are different strategies for these agile approaches (e. g., Scrum, Extreme Programming or Feature Driven Development), the importance of early and iterative prototyping is a shared property. The same general idea can be applied to collaborative development with ACPs and CIs: implementing a proof of concept as early as possible ensures that the creative vision of the artist or CI is realized.

These early prototypes are often of low fidelity and focus more on specific functionalities than on an overall consistent experience. Classifying prototypes based on their fidelity is a common approach in HCI, and an extensive analysis of this classification can be found in [1]. When working iteratively, low-fidelity prototypes increase in complexity or are extended in their general functionality towards a finished product over time.

However, prototyping for XR applications comes with additional challenges. As the user experience is heavily influenced by the device through which the application is experienced, many prototyping or mockup tools available to non-technical experts (e. g., Adobe XD or Figma) are insufficient to communicate the planned application, as they are designed for classical two-dimensional screen-based applications. Yet a crucial part of a virtual reality (VR) experience is the immersive surrounding that separates the user from the real world, while the spatial positioning of the digital content in the real world is a key property of every augmented reality (AR) application.

The authors of [2] describe a joint concept of participatory design and model-based prototyping approaches to create improved prototypes for mixed reality (MR) applications. They argue that the combined approaches lead to the development of fewer prototypes, as creative concepts and technical feasibility can be assessed mostly in parallel. Approaches like [2] highlight the inclination to try innovative approaches for prototyping MR applications, as established concepts appear ill-suited for these use cases.

Additionally, to implement a functional prototype for AR or VR, knowledge of programming is often a key necessity, which is also discussed in [3]. In [4] the necessity of prior programming knowledge is also listed as a challenge regarding most AR and VR authoring tools. Most XR experiences are implemented using a combination of an appropriate XR library (e. g., SteamVR, ARKit or ARCore) and a game engine (e. g., Unity).
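To illustrate this typical combination of an XR library and a game engine, the following minimal sketch shows a Unity C# script that uses AR Foundation to place a prefab on a detected plane when the user taps the screen. It is an illustrative example only, not code from our projects; the class and field names are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative example: places a prefab where the user taps on a detected plane.
[RequireComponent(typeof(ARRaycastManager))]
public class PlaceOnPlane : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;   // assigned in the Unity inspector

    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against planes detected by the AR library (ARCore/ARKit via AR Foundation).
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```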

There are, of course, exceptions: Authoring tools like Artivive or Adobe Aero simplify the creation of AR applications by limiting the possibilities for content and interactions. In return, these tools can be used without any programming knowledge. For VR, similar tools exist. For example, Veer Experience allows its users to turn any 360-degree video sphere or image into a VR experience. The idea to build authoring tools to allow non-developers to contribute to the development process of MR applications is also the core concept in [5]. The authors present an authoring tool for industrial augmented reality applications, which utilizes interaction stories to add interactivity to digital content in AR without writing code. Interaction stories are described as “UI sequences modeled by end users within an AR authoring tool” and are built from a combination of actions, interactions, and conditions. Altogether, these building blocks allow non-developers to implement simple interactions for their AR applications.
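The building blocks described in [5] could, for example, be modeled as plain data that end users assemble in an authoring interface instead of writing code. The following sketch is only a hypothetical illustration of such a structure in Unity C#, not the tool presented by the authors; all type and field names are invented.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical data model in the spirit of "interaction stories":
// end users chain triggers and actions instead of writing code.
public enum TriggerType { Tap, Gaze, MarkerFound }
public enum ActionType { PlayAnimation, PlaySound, ShowObject, HideObject }

[Serializable]
public class StoryStep
{
    public TriggerType trigger;       // what the user does
    public string targetObjectName;   // which scene object reacts
    public ActionType action;         // what happens
    public float delaySeconds;        // optional condition: wait before executing
}

[CreateAssetMenu(menuName = "Authoring/InteractionStory")]
public class InteractionStory : ScriptableObject
{
    // Edited in the inspector or a custom authoring UI, no code required.
    public List<StoryStep> steps = new List<StoryStep>();
}
```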

However, especially for more ambitious artistic projects, the limitations introduced by authoring tools that allow the realization of XR applications without programming can quickly become a creative restriction. Therefore, collaboration between technical experts and artists holds enormous potential. In [3], different collaborative AR projects between artists and developers are presented, with a focus not only on the projects themselves but also, and heavily, on the mutually beneficial collaboration.

In [6] the authors describe the implementation of an innovative ballet with AR elements and go into detail about how they collaborated and where major challenges in their collaboration were encountered. The authors suggest that an important part of a successful collaboration lies in understanding the language of the collaborating partners, where the “goal […] is not to master the art but to grasp some of the partner’s work culture and processes, which are then much better understood […].” A similar sentiment is also found in the retrospective assessment of the collaborations in [3], where the author states that “the key issues to successfully collaborating with experts in other disciplines are fluent communication, a common interest to evolve, and an open-minded predisposition.” The project presented by Fischer et al. in [7] describes a comparable experience of a collaborative project that uses AR to allow dancers to interact with virtual content while performing on stage. The authors reminisce that “this kind of collaboration has radiated a special fascination in both fields and has quickly attracted the interest of over a dozen students and professors” and that both fields contributed differently to the project, as “the project was guided by the artistic vision, that is science provided a service to art in form of new technology.”

At this point, it can be added that the collaboration can be much more diverse than the artistic idea on one side and the technical implementation on the other. Rather, the ideas and requirements of the artists can spur the scientists and programmers to new developments, and in turn, technical possibilities can lead to new artistic inspiration. Tilt Brush by Google[1] for example is a tool that enables artists to sketch and draw in VR. Special pen controllers like the Logitech VR Ink[2] were developed for such tools. Moreover, the process creates the potential for technical ideas from the artists and artistic contributions from the programmers. Thus, a project can emerge that grows beyond the sum of the artists’ and scientists’ ideas and possibilities.

In [8], another perspective on the beneficial effect of interdisciplinary collaboration in AR development is presented. The authors describe the positive influence two art students had during their internship in an interdisciplinary research team of mostly technological experts that was working on a mobile AR game. As the art students were less aware of the technical possibilities, they could think more freely about concepts and contributed creative ideas.

In an empirical study from 2020 [9], AURORA developers and twelve artists were interviewed about the interplay between creativity and technical realization during the one-to-one collaborations. Almost all artists answered that their concepts and ideas for artistic AR applications changed with their growing knowledge about the AR technology. On the one hand, this could lead to restrictions – on the other hand, the deepened knowledge revealed new possibilities, e. g., in terms of user interaction options. While the developers were challenged to find creative solutions for at times quite unusual concepts, they saw no real back-channel from art to science: the collaborations did not result in new technical solutions in the form of digital tools or new ways of programming. Still, the results would not have been possible without the interdisciplinary collaborations.

These beneficial collaborations can not only be found in AR: for example, [10] describes a VR lecture that was realized as an interdisciplinary project in order to combine the expertise of the computer science curriculum and the digital humanities curriculum, allowing a unique dive into the topic of VR from a multi-faceted viewpoint.

However, regardless of whether the collaboration focuses on AR or VR, the challenges of interdisciplinary collaborations between non-technical and technical experts overlap and are largely similar. The authors of [10] conducted a series of 26 interviews with AR/VR designers and developers focusing on their experiences in interdisciplinary collaborative projects. Besides the technical difficulties mostly discussed above, the authors also found that most technical experts are constantly faced with general misconceptions about the possibilities of AR and VR and that especially “creators who do not have a technical background often have unrealistic expectations about what AR/VR can or cannot provide.”

The authors of [11] present similar observations: for their research they interviewed 21 AR/VR creators grouped into three different levels of expertise (hobbyists, domain experts, professional designers). The general consensus was that the threshold for starting and becoming an expert in AR/VR development is still relatively high. The authors identified eight barriers that stem from various points of the development cycle, namely (1) not knowing where to start, (2) difficulties in making use of online learning resources, (3) lacking concrete design guidelines and examples, (4) difficulties in designing for the physical aspects of immersive experiences, (5) problems in planning and simulating motion in AR, (6) not knowing how to start designing story-driven immersive experiences, (7) encountering many unknowns in development, testing, and debugging, as well as (8) challenges in user testing and evaluation. Some of these barriers result from the novelty of the technologies, while others are a result of a dynamically changing environment of libraries, frameworks, and vendors.

3 Own Approaches 2016–2021

3.1 Interdisciplinary Collaboration with the Konzerthaus Berlin in the Project APOLLO

The research project APOLLO is a collaboration between the research group INKA and the Konzerthaus Berlin, aiming to create XR applications for classical music communication. Throughout the project different approaches of collaboration were used.

Figure 2: The feedback iteration cycle used for most of the projects in APOLLO.

Ways of Collaboration with the Konzerthaus Berlin

Generally, there were a few different stakeholders in any XR project with the Konzerthaus. The main stakeholders were:

  1. The external project lead is responsible for the final decisions about the project.

  2. The internal project lead is responsible for leading the concept phases as well as for feedback and decisions on project iterations.

  3. The developer(s) is/are in charge of software development.

  4. The designers and artists are responsible for the visual appearance of projects and applications.

  5. The music expert gives feedback on the musical knowledge that should be conveyed by an application and on its correctness.

  6. The pedagogue[3] gives feedback on the quality of the educational aspect of an application, especially if the application is geared towards minors.

Figure 2 displays a typical feedback iteration cycle used in projects within APOLLO. The general concept for an application was mostly developed with all parties present during an initial brainstorming session. After the goal and outline for an application were set, the work continued in smaller groups. One or two developers worked closely with a designer to construct the first few prototypes of an application. More advanced iterations were then shown to the internal lead as well as an external project employee at the Konzerthaus Berlin and reviewed by them. Their feedback was incorporated in additional iterations before a more presentable version of a prototype or an application was shown to the external project lead.

In the following, two projects will be further analyzed regarding how those stakeholders appeared in and worked on the projects.

The first example is the development of the exhibition in the vestibule of the Konzerthaus Berlin [14]. The exhibition consisted of four different AR applications, each of which had its own physical stele in the vestibule.

The topics of the applications centered around the history and the orchestra of the Konzerthaus and aimed to give an insight into the cultural institution.

One of the applications focuses on showing the inside of the Konzerthaus and the different halls, which can only be viewed when attending a concert (Figure 3, top left). The application features six different three-dimensional models: the exterior of the building and its five halls.

Figure 3: The four different applications of the AR exhibition (top images: P. Castagnola).

The next application (Figure 3, top right) shows important dates from the history of the concert hall on a timeline with users able to select the dates and get more information about them.

For the third application a small game was created (Figure 3, bottom left). The game allows users to allocate an instrument to its respective place in the orchestra and awards points for correct placements.

The final application brings a virtual string quartet to life by placing musicians on four different playing card AR markers as soon as they are recognized in the camera view (Figure 3, bottom right).

The development of those applications was split up, with each app having its own developer. A designer was closely involved in the process for two of those applications (time ring and quartet). Due to the active involvement of the designer, those projects had smaller and quicker iterations, while the other two applications had longer ones. Generally, the iterations first included the developer and designer before requesting feedback and approval from the internal project lead as well as the music expert. Only advanced and more polished versions were shown to the external project lead. While this was done to only show ‘better’ versions to the external project lead, it also meant that their feedback came very late in the development process. It also resulted in complete changes in the direction of the project and, in one case, even a redevelopment of the project. In order to prevent this, earlier versions should have been shown to external stakeholders and their feedback incorporated during the concept phase of the project.

Another example shows a different approach: during the development of the VR application Umwelten [15], the constellation of stakeholders differed.

Figure 4: Screenshot of Umwelten (Visual Artist: Julian Bonequi).

The application lets the user be the conductor of a fantastical, almost alien world of sounds (Figure 4). The playing field consists of unique 3D models, each of which has multiple audio sources that users can activate by interacting with the models. The sound files used in the project were recorded by the orchestra of the Konzerthaus, but featured unusual techniques such as rolling marbles on the surface of a drum in order to create sound.

The project was heavily influenced by the vision of the visual artist Julian Bonequi and the composer Mark Barden, who were external stakeholders in the project, with one of the artists being actively involved in the software development process. The concept was developed during weekly meetings with most of the stakeholders. Development already began during the concept phase, and multiple different versions could be tested by all participants. The artist who was also involved in the software development collaborated especially closely with the programmers. While this sometimes led to a decrease in the quality of the code, it did mean the artist could understand the steps between the iterations and plan his work on the 3D scene better.

Analysis

The main difference between the AR exhibition’s applications and Umwelten was the concept, which in one case was finished before the development started, while in the other case it evolved and grew with the development process. This came with mixed advantages and disadvantages.

The first approach made it easier to differentiate between the stakeholder roles and meant that only trained software developers worked on the Unity projects. However, the late involvement of feedback from the external project lead resulted in more work than would have been necessary with earlier feedback. Complete or partial rehauls of projects also led to a decrease in the quality of the internal project structures in Unity. Earlier user tests might also have helped to either change the direction of the projects earlier or increase the confidence of the external project lead in the projects, thus swaying them to accept different versions.

Especially due to the unique nature of the Umwelten application, there were multiple user tests in order to gauge how users would receive the application. One major point of disagreement between artists and developers was often usability vs. creativity. While the artists came up with very creative interaction designs, they did not always follow typical usability rules that software developers try to adhere to [4]. Some of these interaction concepts had to be changed or cut from the final application, especially after several user tests, where users could not grasp these concepts.

The final application was once again tested and user surveys were conducted. The application was received quite well and the interaction concepts were understood more easily by users. One example is the button allocation: the first version used nearly all of the controller buttons of the Oculus Quest device, leading to users struggling to find interactive functions because they could not recall the necessary button. In the final application, only two buttons are used: one for moving throughout the world and one for triggering interactive content throughout the world. Users remarked that it was easier to remember those buttons and to find more of the interaction possibilities inside Umwelten. Simplifying the button assignment proved to be a good solution for both the developers (greater adherence to usability concepts) and the artists (users ended up spending more time exploring the virtual world, as opposed to struggling to understand what to do and, in the worst case, giving up quickly).
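For illustration, such a two-button scheme could be implemented roughly as in the following sketch using the Oculus integration’s OVRInput API. This is a hypothetical, simplified example, not the actual Umwelten code; all class and field names are invented.

```csharp
using UnityEngine;

// Illustrative two-button scheme: one button for locomotion, one for interaction.
public class TwoButtonInput : MonoBehaviour
{
    [SerializeField] Transform rig;       // the VR camera rig to move
    [SerializeField] Transform pointer;   // e.g., the right controller
    [SerializeField] float moveSpeed = 2f;

    void Update()
    {
        // Button 1: hold the trigger to glide forward in the pointing direction.
        if (OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger))
            rig.position += pointer.forward * moveSpeed * Time.deltaTime;

        // Button 2: press the face button to activate whatever interactive object is pointed at.
        if (OVRInput.GetDown(OVRInput.Button.One) &&
            Physics.Raycast(pointer.position, pointer.forward, out RaycastHit hit))
        {
            // Interactive objects expose audio sources (see above); here they are simply toggled.
            foreach (var source in hit.collider.GetComponentsInChildren<AudioSource>())
            {
                if (source.isPlaying) source.Stop(); else source.Play();
            }
        }
    }
}
```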

The active involvement of the artists also meant smaller and faster iterations. Overall, the outcome of this project fits more closely with the goals of all the stakeholders since they were actively involved in the process. Sometimes the active involvement of all the stakeholders meant that the role distribution was not always clear, especially with one of the artists also taking on a development role.

Table 1

Overview – Recommendations for XR collaborations with cultural institutions and artists retrieved from multiple years of experience in the APOLLO project.

AR Exhibition (long iterations / clearly separated roles)

  Advantages: Easy to differentiate between the stakeholder roles; only trained software developers work on the Unity projects.

  Disadvantages: Long iteration cycles can lead to late feedback in the development process; this can result in complete changes in the direction of the project or even in a redevelopment of the project.

  Learnings: Shorter iterations and clear role division (–> Umwelten); early feedback from all participants in order to prevent project reworks.

Umwelten (short iterations / partially mixed roles)

  Advantages: Smaller and faster iterations; the outcome fits more closely with the goals of all the stakeholders since they are actively involved in the whole process; the ACP understands the steps between the iterations and plans his/her work on the 3D scene better.

  Disadvantages: An active involvement of all the stakeholders can mean that the role distribution is not always clear, especially with an ACP also taking on a development role; the result can be a decrease in the quality of the code / neglecting of common HCI concepts.

  Learnings: Big creative potential when directly involving artists (–> AURORA); developers are needed for finishing touches and for ensuring HCI concepts.

Table 1 shows an overview of the learnings from the two collaborations described above. Within the AR exhibition project, it was an advantage to have clearly defined stakeholder roles, but a disadvantage to have long iteration cycles with late feedback from the external stakeholders. Therefore, shorter iterations and a clearer role division were used in the following projects.

During the Umwelten project, those smaller and faster iterations led to an outcome that fit better with the goals of the stakeholders. Being closely involved in those steps also meant the visual artist could plan his work on the 3D scene better and more in line with the developers. However, the artist working on some of the scripts himself sometimes led to a decrease in code quality. It was therefore necessary for the developers to rewrite some of the code at the end of the project, but it also led to more creative ideas within the interactions with the Umwelten environment.

3.2 Further Augmented Reality Trainings and One-to-One Collaborations with Artists in the Project AURORA

The INKA project AURORA offers five one- or two-day modular trainings in AR and digital media production to ACPs from the independent scene of Berlin. Furthermore, some of the creative AR application concepts developed during these courses are implemented in the AURORA Production Lab in close collaboration with the project researchers.

Ways of Involving Art and Culture Professionals

At the AURORA Production Lab, collaboration on creative AR applications was done in an iterative fashion, with intermingled phases of ideation, development, and evaluation. In most cases one ACP collaborated with one AURORA developer, supported by a digital media production specialist.

During the ideation phases, the concepts of the ACPs, which were vague in the beginning, were refined collaboratively. Ideas were prototyped with the help of sketching, (digital) whiteboards (e. g., Miro), paper prototyping, and role-playing using mockup devices in physical space. Building on the ACPs’ initial knowledge about AR application creation from the AURORA trainings, the AR technologies to be used, the assets to be acquired, and the user experience (UX) of the app were discussed. This gave both sides, the developers and the ACPs, the opportunity to learn about the respective possibilities and limitations in the spectrum between creative concept and technological feasibility. Furthermore, this exchange made it possible to find new approaches to problems that would not have been considered originally [9]. Quite analogous to the challenges described in [4], these could concern hardware and software performance, hardware-specific limitations, or the robustness of sensory input with regard to surrounding environmental factors.

Figure 5: Collaboration between two artists and an AURORA developer in the AURORA Production Lab, 2019. Photo: AURORA, HTW Berlin.

During the development phases, the collaboration took place within the scope of pair programming and joint editing of scenes in the game engine Unity (Figure 5). This happened both in person and later, due to the Corona pandemic, via the Internet (video chat/remote control). In order to simplify the workflow for the ACPs for defining content and logic, the developers prepared easy-to-use Unity components, prefabs, and (external) libraries. There was a tendency that the logic of the AR applications was implemented by the developers, whereas the production of the analog/digital media and arrangements (scenes) was done by the ACPs. For her two-sided interactive augmented reality card Heidenrose, for example, the artist Ariane Stamatescu created the 3D models of the fleshy roses using the software Blender with weekly support by Dagmar Schürrer, whereas the user interactions with these models as foreseen in the artistic concept – changing appearance and/or sounds through touch gestures on the screen – were coded in C# by the AURORA programmer Leonid Barsht. The AR artwork uses Johann Wolfgang von Goethe’s famous poem Heidenröslein (published in 1789) to reveal the problematic acceptance of sexualized violence against women in our cultural history. The virtual roses on the card’s scenic front side react with a metamorphosis into menacing zombie flowers as soon as the user tries to touch them via the screen (Figure 6). The reverse side offers more examples and background information researched by the artist and transformed into an interactive collage of texts, images, and videos by the programmer. The AR project is part of the free INKA AR application, which can be downloaded from the common app stores.

Figure 6: Heidenrose by Ariane Stamatescu in collaboration with Dagmar Schürrer and Leonid Barsht, 2021. Photo: AURORA, HTW Berlin.
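Reduced to its core, the kind of screen-touch interaction described for Heidenrose could be sketched as follows. This is a hypothetical illustration rather than the artwork’s actual code; the class and field names are invented for the example.

```csharp
using UnityEngine;

// Illustrative sketch: touching a virtual rose on the screen triggers its metamorphosis.
public class TouchMetamorphosis : MonoBehaviour
{
    [SerializeField] Camera arCamera;            // the AR camera rendering the scene
    [SerializeField] Material zombieMaterial;    // alternative appearance after the touch
    [SerializeField] AudioSource reactionSound;  // sound played on touch

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        // Cast a ray from the touch position into the AR scene.
        Ray ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit) && hit.transform == transform)
        {
            // Swap the material and play the sound: the rose "reacts" to being touched.
            GetComponent<Renderer>().material = zombieMaterial;
            if (reactionSound != null) reactionSound.Play();
        }
    }
}
```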

A lot of this often challenging work was done individually between the collaborative sessions (“homework”). The distribution and recording of the respective tasks and bugs (during the evaluation phases) with the help of tools such as GitLab was also an important part of the process. GitLab was also used for versioning and sharing the projects. In this context, the ACPs had to learn how to work with the version control system Git at the beginning of the project.

The evaluation of the projects took place between the iterations of the development. In this way, bugs could be found and fixed quickly thanks to the rapid process model. Sometimes the debugging itself posed challenges: for example, with Unity’s AR Foundation framework, it is not possible to test the application in the editor without detours, which is why it has to be rebuilt and deployed to the smartphone every time changes are made. In addition, the AURORA team had to consider that not every tester has a high-performance device with the desired operating system version. The main challenge, especially with external testers, was communicating the application’s functionality: especially at the beginning of development, user interfaces or tutorials are usually not yet defined and implemented, with the result that testers do not know what to do and what the desired result should be.
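Because every change has to be rebuilt and tested on the device, small helpers that make on-device debugging traceable pay off quickly. The following on-screen log overlay is just one possible helper of this kind, not AURORA’s actual tooling:

```csharp
using UnityEngine;

// Minimal on-screen log mirror for builds where no debugger is attached.
public class DebugOverlay : MonoBehaviour
{
    string log = "";

    void OnEnable()  => Application.logMessageReceived += HandleLog;
    void OnDisable() => Application.logMessageReceived -= HandleLog;

    void HandleLog(string message, string stackTrace, LogType type)
    {
        log = $"[{type}] {message}\n" + log;
        if (log.Length > 4000) log = log.Substring(0, 4000);  // keep the overlay small
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, Screen.width - 20, Screen.height - 20), log);
    }
}
```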

Analysis

Due to the combination of collaborative sessions and individual homework, the ACPs quickly gained knowledge about and confidence in using the technologies and toolchains required to build AR applications (mainly Git, Unity, and Blender). They became proficient in scene editing and creating testable builds of their apps for iOS (using Xcode) or Android and thus were able to do most of the non-programming work completely on their own. The knowledge about the technologies involved, especially their limitations, also helped in iteratively polishing their concepts.

Indeed, the fact that the artists came into the Production Lab to deepen their knowledge from the AURORA courses was a new approach that was a real chance for the ACPs to dive into the technology. A few of the ACPs with strong motivation even became acquainted with C# programming and consequently only needed help with more complicated computer science related problems or deeper Unity related topics. For instance, the artist duo Banz & Bowinkel profited from previous knowledge in digital media production and AR development when they came to the AURORA Production Lab for the start of their Bot series in 2019 (Figure 7): a patterned carpet on the floor, realized as a vinyl print of 3 × 3 meters in size, is used as a marker on which multiple avatars can be made visible through a mobile device using the artists’ AR app. For their critical reflection on the relation between humans and computers, the artists were not only interested in AR, but also in artificial intelligence programming: the avatars’ behaviors are determined by a utility AI – potential actions like “seek company” or “avoid more than three neighbors” have scores that are constantly recalculated based on the avatars’ ever-changing needs and environmental factors.

Figure 7: Bot 03 by Banz & Bowinkel in collaboration with André Selmanagić and Michael Droste, 2019. Photo: DAM gallery.
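As a rough illustration of how such a utility AI selects behavior – every potential action computes a score from the avatar’s current needs, and the highest-scoring action wins – consider the following simplified sketch. It is not the Bot series’ actual code; all names and scoring rules are invented for the example.

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Simplified utility-based action selection for an avatar.
public class AvatarNeeds
{
    public float loneliness;   // grows while the avatar is alone
    public float crowding;     // grows with the number of close neighbors
}

public abstract class UtilityAction
{
    public abstract float Score(AvatarNeeds needs);   // recalculated every decision tick
    public abstract void Execute(GameObject avatar);
}

public class SeekCompany : UtilityAction
{
    public override float Score(AvatarNeeds needs) => needs.loneliness;
    public override void Execute(GameObject avatar) { /* steer towards other avatars */ }
}

public class AvoidNeighbors : UtilityAction
{
    // Only scores above zero once more than three neighbors crowd the avatar.
    public override float Score(AvatarNeeds needs) => Mathf.Max(0f, needs.crowding - 3f);
    public override void Execute(GameObject avatar) { /* steer away from the crowd */ }
}

public class UtilityBrain
{
    readonly List<UtilityAction> actions =
        new List<UtilityAction> { new SeekCompany(), new AvoidNeighbors() };

    public void Tick(GameObject avatar, AvatarNeeds needs)
    {
        // Pick and execute the action with the currently highest utility score.
        actions.OrderByDescending(a => a.Score(needs)).First().Execute(avatar);
    }
}
```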

The summer courses of 2021 were the first to teach the visual programming language Bolt instead of C#. An evaluation of whether the ACPs in the coming production lab phase will feel confident enough to tackle more (visual) programming tasks will have to follow at the end of the AURORA project.

It is also the focus on the AR training of the ACPs that led to the dismantling of some key barriers for AR development beginners described by Ashtari et al. [11]: thanks to the support of the AR experts, the ACPs knew where to start and were given necessary up-to-date information and well-designed AR examples both in the AURORA courses and during the development. The AR experts also shared their knowledge on testing and evaluation methods. On the other hand, problems such as the difficulty of planning and simulating motion in AR or the lack of concrete design guidelines, as identified by the authors of [11], also challenged the AURORA project.

Of course, the focus on the training also had its price in terms of quality: as with most inexperienced developers, the code and assets that were produced by the ACPs individually may have done what they were supposed to, but they did not meet the quality standards of professional developers. In the domain of programming, this included obscure software architecture, inconsistent naming, and inefficient functions causing performance bottlenecks. Unity-related programming/component design tasks were also often solved in a very problem-specific way by the ACPs, so that they cannot easily be reused in projects tackling similar problems (keyword: generality). In the domain of content production and scene editing for game engines, almost all ACPs also made understandable beginner mistakes like creating assets with too great a performance impact (high-resolution videos and textures, high-polygon meshes, particle systems with huge amounts of particles). This often led to severe performance implications, especially for the mobile devices that were targeted. Consequently, these parts had to be partially cleaned up by the AURORA developers. This was sometimes also done during the sessions in order to teach these quality standards.

Figure 8: Señor Pulpo by Christina Sarli in collaboration with André Selmanagić, 2021. Photo: AURORA, HTW Berlin.

To give an example, one of the most challenging projects in terms of performance impact was a prototype for the hybrid artist’s book Señor Pulpo created in 2020/21 by the artist Christina Sarli in collaboration with the AURORA developer André Selmanagić. The book about a common octopus is organized in two parts – visual interpretations of haikus and a comic (Figure 8) – and enriched by an astonishing multitude of 2D/3D animations and sound. These digital elements are triggered depending on local time and/or the distance of the mobile device (keyword: close-up function). As the programmer describes in a Technical Insight into the development process [12], loading the book’s complete AR content at once (as Unity usually does) led to crashes on some mobile devices due to insufficient memory capacity. As not only the size but also the amount of the assets led to this problem, Unity’s new Addressables system was used, because it allows content to be loaded and unloaded dynamically. Due to this huge advantage, the system has meanwhile been implemented for the complete INKA AR application including the Señor Pulpo project. In this respect, one could state that the challenge posed by the artist’s project finally led to a better solution for the whole research group.
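The dynamic loading pattern can be pictured roughly as in the following simplified sketch, which loads the Addressable content for one book page on demand and releases it again when the page is left. It is an illustrative example, not the actual Señor Pulpo implementation; the class name and address keys are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.AddressableAssets;
using UnityEngine.ResourceManagement.AsyncOperations;

// Loads the AR content for one book page on demand and releases it again,
// so that only the currently visible content occupies memory.
public class PageContentLoader : MonoBehaviour
{
    AsyncOperationHandle<GameObject> handle;
    GameObject currentContent;

    public void ShowPage(string addressKey, Transform anchor)
    {
        HidePage();  // unload whatever was shown before

        handle = Addressables.LoadAssetAsync<GameObject>(addressKey);
        handle.Completed += op =>
        {
            if (op.Status == AsyncOperationStatus.Succeeded)
                currentContent = Instantiate(op.Result, anchor);
        };
    }

    public void HidePage()
    {
        if (currentContent != null) Destroy(currentContent);
        if (handle.IsValid()) Addressables.Release(handle);
    }
}
```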

Regarding the three major challenges for AR creation in interdisciplinary teams as identified by Krauß et al. [6], team-internal misconceptions about the medium (1) – such as unrealistic expectations of what AR can provide – could be avoided thanks to the described discussions with the technical creators during the ideation phase.

The same applies to the lack of tool support and appropriate methods (2). For instance, it was minimized by providing extensive insight into finished best-practice AR projects for a better understanding of the subject. Moreover, the simultaneous collaboration and thus real-time exchange facilitated the understanding of each other’s ideas, progress, and barriers. If necessary, workaround tools were used – for example, if ACPs were not yet able to build a UI in Unity, tools like Adobe XD were resorted to. In turn, developers were not afraid to learn the basics of Blender or Photoshop, for instance for faster prototyping. Nevertheless, for an optimal result, a sensible distribution of tasks was essential to avoid major reworking of program code or assets afterwards.

Concerning the absence of a common language (3), the AURORA Production Lab was (and is) not only a place of learning for the ACPs. Both developers and ACPs had to find a common language to be able to communicate their ideas and perspectives without misconceptions. While ACPs gained technological knowledge, developers were motivated to think more creatively in artistic ways. Being involved so deeply in the ideation, though, it was often hard for the developers to keep the projects compact and within the time limits, resulting in projects that often took longer and had many more features and content than previously planned. Since the ACPs had to work in parallel to earn their living, the time factor was even more relevant. The resulting flexible collaboration mode would not have been possible under free market conditions. But compared to a commercial (free market) development setting, it also gave the teams a lot of freedom, e. g., in testing different technologies such as AR content on a world scale or the use of 3D AR markers, e. g., within Dagmar Schürrer’s AR project Virtualized described in detail in [9].

Table 2

Overview – Recommendations for AR collaborations with freelance ACPs retrieved from 3 years of experience in the AURORA project.

All team members
  Concept/Ideation: Find a common language to be able to communicate ideas; find a good compromise between vision and feasibility.
  Prototyping and implementation: Document how things work and what you have learned.
  Evaluation and testing: Make use of issue tracking software.

Developers
  Concept/Ideation: Communicate the opportunities and limitations; provide alternative solutions if possible; bring your own ideas.
  Prototyping and implementation: Participate in creative prototyping techniques like role-playing; develop tools and components that artists can use and edit from the editor UI; use visual programming languages to involve ACPs; be patient, accept lower-quality and problem-specific code of artists, and use it as a teaching opportunity for them.
  Evaluation and testing: Make testing and debugging of apps fast, easy, and traceable (e. g., debug logging, shortcuts, editor debugging opportunities).

Artists
  Concept/Ideation: Learn the technologies involved, especially their limitations and unique possibilities.
  Prototyping and implementation: Learn the basics of Unity (e. g., by pair-programming/-editing); learn visual programming languages like Bolt; learn the use of version control software like Git.
  Evaluation and testing: Learn how to build an application for the target devices.

If we were asked to recommend a best practice based on our AURORA experience, we would summarize it as follows (Table 2):

To start with the concept and ideation process, all team members should be aware that a common language between the disciplines is not a given but must be found – the same goes for a good compromise between the artistic vision and the technical feasibility. Developers should communicate technical opportunities and limitations transparently and provide alternative solutions where possible. Finally, they should not hesitate to bring in their own ideas. Artists are recommended to work towards a deep understanding of the technologies involved.

During the process of prototyping and implementation, documentation is a shared responsibility. Developers should build tools and components that artists can use and edit themselves using the editor UI. Lower code quality on the artists’ side can always be taken as an educational opportunity. Artists are recommended to learn the basics of Unity and should get acquainted with the use of version control software like Git.

Finally, all steps from concept to evaluation should involve issue tracking software. Making the testing and debugging of the created apps fast, easy, and traceable is a developers’ task whereas the artists are recommended to learn how to build their AR application for the target devices.

3.3 Collaboration with an Artists’ Team in the Project XR_Unites

New Ways and Tools of Collaboration

The INKA project XR_Unites started mid-2020 and focuses on the collaborative development of ambitious cultural offerings with the XR technologies VR, AR, and MR. With the help of three successive open calls, three artistic teams with promising concepts are selected by a jury until mid-2023. Each team is advised and supported in the areas of concept and aesthetics, XR development and digital media production by a team of XR specialists.

Figure 9: Impressions of Transient Exposure by XR_Unites in collaboration with reverb and the Chitrasena Dance Company. Dancers (from left to right): Vajira Chitrasena, Upeka Chitrasena. Collage: Chitrasena Dance Company/XR_Unites, HTW Berlin.

Transient Exposure is an MR dance installation for Microsoft’s head-mounted display HoloLens 2 and was the first of the three concepts to be realized (Figure 9). It was developed from February to August 2021 in collaboration with the artists’ collective reverb – who won the first open call – and the Chitrasena Dance Company in Colombo (Sri Lanka).

Based on the archive of the Chitrasena Dance Company – especially videos and photographs –, the installation examines the contemporary in dance. Combining digitalized artifacts with 2D/3D content in 3D space, the artists want to make traditional Sri Lankan dance accessible through an interactive walk-in installation – with the long-term goal of combining live performances of the Chitrasena dancers with the HoloLens experience. In terms of interaction, the user can beat a digital drum and thus contribute to the 3D music and sound collage that plays an important role in the installation. Furthermore, he/she can swipe through photographs documenting some of the highlights of the Dance Company’s history. Interactive elements are made recognizable by shiny particle systems (Figure 10).

Figure 10: Scene with the 3D modeled interactive drum in Transient Exposure. Dancer: Thaji Chitrasena. Screenshot: XR_Unites, HTW Berlin.
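On the HoloLens 2, such direct hand interactions are typically realized with the Mixed Reality Toolkit (MRTK). The following simplified sketch – a hypothetical illustration, not the installation’s actual code – plays a drum sample when the user’s hand touches the virtual drum; it assumes the drum object carries a collider and an MRTK NearInteractionTouchable component.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative sketch: a near-interaction touch on the drum triggers a sample.
public class DrumHit : MonoBehaviour, IMixedRealityTouchHandler
{
    [SerializeField] AudioSource drumSample;   // one voice of the 3D sound collage

    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        // The beat contributes to the surrounding sound collage.
        drumSample.Play();
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
    public void OnTouchCompleted(HandTrackingInputEventData eventData) { }
}
```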

Due to the final deadline of the XR exhibition opening in September 2021, the Transient Exposure team and the researchers of XR_Unites combined parts of the agile development methodology – which had been planned for the project from the very beginning – with a strict schedule. Furthermore, they decided early on to develop a specific editor for MR devices like the Microsoft HoloLens.

This approach took into account the above-mentioned fact that interdisciplinary XR development can be simplified through tools that allow ACPs to contribute without coding. It also considered the observation of the AURORA researchers that most of the ACPs had a greater affinity for digital media production than for the implementation of their application’s logic. In addition, the larger an ACP team is, the more a digital tool eases the collaboration.

Figure 11: Editor – Placing an image object to the scene and changing settings for an audio asset. Screenshots: XR_Unites, HTW Berlin.

The idea was to allow ACPs to realize their visions as accurately as possible, but to provide a simpler and faster development process better tailored to their needs and to the technology used. While it was clear that the editor had to start from a basic toolset and small features, the ACPs were strongly involved in its development process. Even when the ACPs did not directly work on, but rather with the editor’s functions, it should grow according to their ideas and wishes. At the same time, it must not be too customized to the current project. In contrast to the AURORA project where the parts developed by the ACPs were highly individual, this time the aim was to take a more generalizing approach instead. On the one hand, this could make the editor usable in various projects and on the other hand, it should become more and more accessible without the support of developers.

Creating an editor for MR devices tailored to the artists’ ideas put the focus on editable virtual content that can be arranged within the physical room. Position, rotation, and scale (Figure 11, left) are saved, as well as contextual dependencies like image markers. The editor allows the placement of various content, such as 3D objects, scans, audio, image and video content, as well as effects. For this purpose, a user interface is displayed next to the placed content. More detailed settings are displayed in a hand menu that appears when the user raises his/her hand. Specific adjustments can be made for the different types of placed content: for example, colors or image sections, volume, viewing and listening radii, or trigger interactions can be set (Figure 11, right). A timeline function can also be used to control time-defined events so that, for example, objects are only displayed at a specific point in time (Figure 12). With the help of the menu attached to the hand, scene settings can be changed, or intermediate states can be saved and loaded.

Figure 12: Editor – Set up a trigger to show object after time. Screenshot: XR_Unites, HTW Berlin.
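The state that such an editor persists – which asset is placed where, with which settings and timeline triggers – can be imagined as a small serializable description per placed object. The following sketch is a hypothetical illustration of such a data model and of saving/loading intermediate states; it is not the actual XR_Unites editor code, and all names are invented.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Illustrative data model for content placed with the MR editor.
[Serializable]
public class PlacedContent
{
    public string assetId;          // which 3D object, image, audio or video is placed
    public Vector3 position;
    public Quaternion rotation;
    public Vector3 scale;
    public float volume;            // for audio content
    public float showAtSeconds;     // timeline trigger: when the object becomes visible
    public string imageMarkerId;    // optional contextual dependency
}

[Serializable]
public class SceneState
{
    public List<PlacedContent> contents = new List<PlacedContent>();
}

public static class SceneStatePersistence
{
    // Saving and loading intermediate states, e.g. from the hand menu.
    public static void Save(SceneState state, string fileName) =>
        File.WriteAllText(Path.Combine(Application.persistentDataPath, fileName),
                          JsonUtility.ToJson(state, true));

    public static SceneState Load(string fileName) =>
        JsonUtility.FromJson<SceneState>(
            File.ReadAllText(Path.Combine(Application.persistentDataPath, fileName)));
}
```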

In the future the editor should also support multi-user editing and cloud synchronization to enable artist groups to work on their artworks simultaneously – within the scope of XR_Unites and future INKA projects.

Analysis

The approach of developing a toolbox in the form of an editor that is based on the artists’ requirements and continuously extended during the project presented a challenge. The developers did not implement the artists’ ideas directly, but provided the tailored tools with the help of which the artists created prototypes of the technical artwork on their own. This approach can bring both benefits and challenges: on the one hand, it could lead to delays and to the developers misinterpreting the artistic ideas. On the other hand, the resulting need for communication created an even closer collaboration between the artists and the programmers – and, in the end, not only a work of art, but also a tool with which the artists can prototype further developments on their own.

Table 3

Overview – Recommendations for MR collaborations with teams of freelance ACPs retrieved from 7 months of experience in the XR_Unites project.

All team members
  Concept/Ideation: Understand the possibilities of mixed reality experiences; find ways and compromises for presenting art content (like videos) with the HoloLens device.
  Prototyping and implementation: Document how things work and what you have learned.
  Evaluation and testing: Observe and question testees about their experiences.

Developers
  Concept/Ideation: Communicate the opportunities and limitations; bring your own ideas, especially for interaction concepts.
  Prototyping and implementation: Participate in creative prototyping techniques like role-playing and prototyping; provide opportunities to experience and try out mixed reality at an early stage (HoloLens Editor).
  Evaluation and testing: Make testing of the experience fast.

Digital media production specialists
  Concept/Ideation: Understand the limitations of the hardware capabilities (performance, screen size); test options to virtualize physical objects (3D scanning, Kinect recordings, modelling and animation).
  Prototyping and implementation: Start early enough to produce digital media.
  Evaluation and testing: Find performance issues.

Artists
  Concept/Ideation: Learn the technologies involved, especially their limitations and unique possibilities.
  Prototyping and implementation: Learn the basics of Unity; use the HoloLens Editor to experiment with content and interactions.
  Evaluation and testing: Learn how the designed experience can differ from the actual user experience with the HoloLens.

As shown in Table 3, for good collaboration in MR projects, we recommend first talking about the possibilities, but also about the limitations. For a good understanding, example content should be provided to all participants so that they can familiarize themselves with the interaction possibilities and the presentation forms, among other things. In addition, the asset creation pipeline should be defined, for example, to generate 3D content in a way that the MR hardware can handle. Continuous testing with new content is important both for developers, to ensure functionality and performance, and for artists, to understand how their content looks and feels in MR environments.

The idea for upcoming projects is to establish the following process on different levels, like prototyping or implementation: “Collaboration – Results & Tools – Analysis” (Figure 13). This process cycle extends the well-known iterative process and complements it with a cross-project approach. Processes such as prototyping and evaluation (see Figure 1) remain part of the collaboration but should increasingly influence subsequent projects. This also makes it possible to work on tools in parallel projects, so that ideas and resources are shared. Indirectly or directly, this can also lead to an exchange between artists.

Figure 13: Cross-project iterative process. Graphic: C. Holtmann, HTW Berlin.

4 The Interactive Exhibition Extended Reality – Code and Materiality in Art and Culture

The MR dance installation Transient Exposure was one of the highlights of Extended Reality: Code and Materiality in Art and Culture in Berlin, an interactive exhibition of the INKA research group in cooperation with the Humboldt-Universität (HU) zu Berlin’s Faculty of Humanities and Social Sciences, and Cluster of Excellence Matters of Activity (03.09.–04.10.2021).[4] The exhibition was curated by the INKA members Dagmar Schürrer and Maja Stark and accompanied by two open access publications: a catalog [13] and Technical Insights [12].

Figure 14: XR exhibition visitors and guides in the Atrium of the HU’s Faculty of Humanities and Social Sciences, 03.09.2021. Photo: AURORA, HTW Berlin.

With a selection of sixteen exhibits, the curators put the focus on the question of what materiality means for art and culture in our post-digital age, where (amongst others) VR hides the analog environment, whereas AR and MR allow the interactive combination of physical and programmed realities in real time.

As can be seen in the photograph (Figure 14), visitors were equipped with tablets and encouraged to interact with diverse artistic applications from the INKA projects AURORA, APOLLO, and AURA as well as with an AR installation created by researchers of the HU Cluster Matters of Activity.

To give an example, the AR posters in the photo are creations by the designer Sarah Müller in collaboration with the AURORA developer Leonid Barsht, which received a Red Dot Award in 2021. The motifs and colors are inspired by a journey to New York. As part of the INKA AR application, New York, New York allows different interactions – e. g., resizing and moving the digital 3D models.

VR head-mounted displays were provided for two interactive installations: the previously described Umwelten application by the APOLLO project in collaboration with the Konzerthaus Berlin, and the Immersive Slideshow by the Matters of Activity members Mareike Stoll and Maxime Le Calvé in collaboration with Dionysios Zamplaras from the EnsadLab at the ESAD Paris.

Even though there was no empirical evaluation, many visitors expressed their fascination with the presented exhibits. Some disappointment was voiced regarding the small field of view of the HoloLens 2, which most of the visitors put on for the very first time. All the visitors appreciated the extensive personal support, which helped enormously to understand both the content and the functioning of the presented XR applications: due to the pandemic, only six persons were allowed to visit the exhibition at the same time for 1.5 hours – during each timeslot, at least two INKA members were present to explain everything. At first glance, this one-to-three ratio of guides to visitors might seem a lot, but given the diversity of the applications, the interaction possibilities, and the XR hardware, it increased the quality of the exhibition and contributed to the very positive response.

5 Conclusion

This paper presents the previous work of the HTW Berlin’s research projects APOLLO, AURORA, and XR_Unites with a special focus on the lessons learned concerning interdisciplinary, collaborative development of XR applications.

Regarding the collaboration with cultural institutions, APOLLO comprised two different projects that involved different cultural stakeholders. The first, the AR exhibition project, involved the development of several AR applications around the topics of the Konzerthaus Berlin and featured a classic setup of stakeholders, with two project leads, multiple developers, and a designer. The second project, the interactive VR experience Umwelten, additionally included a visual artist and a composer as artistic stakeholders.

The chapter showed how the lessons learned from the first project (the AR exhibition) directly influenced the work on the following project (Umwelten). The tighter iteration cycle led to better inclusion of the stakeholders, and the influence of the artistic stakeholders resulted in a more creative product. Based on multiple years of experience in the APOLLO project, our summarized recommendations for interdisciplinary collaborations on XR applications with cultural institutions and artists can be found in Table 1.

For rather small collaboration teams – usually one developer, one expert for digital media production, and one art and culture professional (ACP) with basic knowledge of AR development – we discussed our experience in the project AURORA. Due to the project’s educational concept of a very close one-to-one collaboration between the AR specialists and the artist (or artist duo), the three major challenges for AR creation in interdisciplinary teams identified in [10] – team-internal misconceptions about the medium (1), the lack of tool support and appropriate methods (2), and the absence of a common language (3) – were mastered without major problems, but at the cost of time. The time factor was even more relevant as the ACPs had to earn their living in parallel with the development. The collaboration mode was therefore highly flexible and surely would not have been possible under free-market conditions, but it also gave the teams a lot of freedom – e. g., in testing and discussing different technologies. So far, these favorable conditions have led to the successful publication of 20 artistic AR applications. Three years of interdisciplinary collaborations on AR applications with freelance ACPs in the project AURORA led to our recommendations, which are bundled in Table 2.

Regarding larger collaboration teams of – at the time of writing – three computer scientists (two of them students) and three artists working toward an artistic MR installation, the experiences in the project XR_Unites led to the idea of a multi-level process derived from the well-known iterative process (Figure 1), modified to “Collaboration – Results & Tools – Analysis” (Figure 13) and complemented by a cross-project approach in order to share ideas and resources between different research projects. Furthermore, the prototype of an editor was developed that can be used and extended in future projects. With or without an editor, our suggestions for interdisciplinary collaborations with teams of freelance ACPs on MR applications are summarized in Table 3.

Finally, the exhibition Extended Reality – Code and Materiality in Art and Culture was described to show how we successfully presented artistic XR applications developed in our research projects to a wider public.

We are convinced that there will be an increasing number of interdisciplinary collaborations comparable to the described projects – and we hope that our experience and recommendations will be useful for their stakeholders. In our eyes there can only be winners when culture and computer science come together and collaborate.

Funding statement: AURORA and XR_Unites are funded by the European Regional Development Fund’s (ERDF) special program “Strengthening Innovation Potential in Culture II” (INP-II) with kind support from the Berlin Senate Administration for Culture and Europe. APOLLO is also funded by the ERDF with the support of the Senate Chancellery for Higher Education and Research.

References

[1] Michael McCurdy, Christopher Connors, Guy Pyrzak, Bob Kanefsky, and Alonso H. Vera. 2006. Breaking the fidelity barrier: an examination of our current characterization of prototypes and an example of a mixed-fidelity success. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1233–1242. DOI: https://doi.org/10.1145/1124772.1124959.

[2] Emmanuel Dubois, Guillaume Gauffre, Cédric Bach, and Pascal Salembier. 2007. Participatory Design Meets Mixed Reality Design Models. In Computer-Aided Design of User Interfaces V. Gaëlle Calvary, Costin Pribeanu, Giuseppe Santucci, and Jean Vanderdonckt (eds.). Springer Netherlands, Dordrecht, 71–84. DOI: https://doi.org/10.1007/978-1-4020-5820-2_6.

[3] Cristina Portalés. 2018. When Augmented Reality Met Art: Lessons Learned from Researcher–Artist Interdisciplinary Work. Multimodal Technologies and Interaction 2, 2 (April 2018), 17. DOI: https://doi.org/10.3390/mti2020017.

[4] Ole Molvig and Bobby Bodenheimer. 2020. Interdisciplinarity and Teamwork in Virtual Reality Design. The Journal of Interactive Technology and Pedagogy 18 (December 2020). Retrieved August 16, 2021 from: https://jitp.commons.gc.cuny.edu/interdisciplinarity-and-teamwork-in-virtual-reality-design/.

[5] Ingo Börsting and Volker Gruhn. 2020. Towards Efficient Interdisciplinary Authoring of Industrial Augmented Reality Applications. In Conference Companion of the 4th International Conference on Art, Science, and Engineering of Programming (Porto, Portugal) (<programming> ’20). Association for Computing Machinery, New York, NY, USA, 65–68. DOI: https://doi.org/10.1145/3397537.3398474.

[6] Alexis Clay, Gaol Domenger, Julien Conan, Axel Domenger, and Nadine Couture. 2014. Integrating augmented reality to enhance expression, interaction & collaboration in live performances: A ballet dance case study. In 2014 IEEE International Symposium on Mixed and Augmented Reality – Media, Art, Social Science, Humanities and Design (ISMAR-MASH’D), 21–29. DOI: https://doi.org/10.1109/ISMAR-AMH.2014.6935434.

[7] Andreas Fischer, Sara Grimm, Valentine Bernasconi, Angelika Garz, Pascal Buchs, et al. 2016. Nautilus: Real-Time Interaction Between Dancers and Augmented Reality with Pixel-Cloud Avatars. In 28ième conférence francophone sur l’Interaction Homme-Machine, Oct 2016, Fribourg, Switzerland, 50–57. HAL ID: 01386445.

[8] Raymond Koon Chuan Koh, Henry Been-Lirn Duh, Cheng-Ho Chen, and Yun-Ting Wong. 2012. Co-creativity fusions in interdisciplinary augmented reality game developments. In 2012 IEEE International Symposium on Mixed and Augmented Reality – Arts, Media, and Humanities (ISMAR-AMH), 47–56. DOI: https://doi.org/10.1109/ISMAR-AMH.2012.6483988.

[9] Maja Stark, André Selmanagić, Leonid Barsht, Dagmar Schürrer, Elisabeth Thielen, and Denise Bischof. 2021. Between Technical Realization and Creative Process: The Interdisciplinary Development of Augmented Reality Art at the AURORA School for ARtists. In Zeitschrift Kunst Medien Bildung | zkmb. Retrieved 30 January 2022 from: http://zkmb.de/between-technical-realization-and-creative-process-the-interdisciplinary-development-of-augmented-reality-art-at-the-aurora-school-for-artists.

[10] Veronika Krauß, Alexander Boden, Leif Oppermann, and René Reiners. 2021. Current Practices, Challenges, and Design Implications for Collaborative AR/VR Application Development. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 15 pages. DOI: https://doi.org/10.1145/3411764.3445335.

[11] Narges Ashtari, Andrea Bunt, Joanna McGrenere, Michael Nebeling, and Parmit K. Chilana. 2020. Creating Augmented and Virtual Reality Applications: Current Practices, Challenges, and Opportunities. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–13. DOI: https://doi.org/10.1145/3313831.3376722.

[12] Michael Droste, Leonid Barsht, André Selmanagić, Elisabeth Thielen, Sophie Schauer, Christoph Holtmann, and Julien Letellier. 2021. Extended Reality – Code and Materiality in Art and Culture: Technical Insights. HTW Berlin. DOI: https://doi.org/10.5281/zenodo.5714026.

[13] Maja Stark and Dagmar Schürrer (eds.). 2021. Extended Reality – Code and Materiality in Art and Culture: Catalog. HTW/HU Berlin. DOI: https://doi.org/10.5281/zenodo.5714026.

[14] Peter Scholl, Stefan Schöbinger, et al. 2018. Using Cuboid Markers for an Interactive Augmented Reality Exhibition. In Proceedings of the International Conference on Culture and Computer Science: Hybrid Systems 2018. ISBN: 978-3-86488-128-2.

[15] Annette Thoma, Elisabeth Thielen, and Andrey Borisov. 2020. An immersive and interactive composition in Virtual Reality. In Proceedings of the International Conference on Culture and Computer Science: Extended Reality 2020. ISBN: 978-3-86488-169-5.

Published Online: 2022-04-01
Published in Print: 2022-04-26

© 2022 Walter de Gruyter GmbH, Berlin/Boston
