1 On Buildings and Computation

Computational processes regulate temperature, shades, and permission to open doors throughout contemporary buildings. If we adopted the lens of Le Corbusier’s infamous statement that houses are machines for living in, we could easily be fooled into thinking that we simply need to compare houses to computers. But let us look at it more carefully. Le Corbusier also said that an armchair is a machine for sitting in; the street is a traffic machine; cities are machines too. It is this generative power of things that the architect was addressing, not merely a comparison of the elements of house construction to machinery: “Important advance can only come from without, from quarter where no question at all arises of such a thing as inextricably locked machinery.” [9]

Space, not the building, is the machine, claimed Hillier [6]. He considered Le Corbusier’s use of the word metaphorical rather than paradigmatic: machine as a metaphor for the style, not for the floor plan. Function is never fully encoded in built form, as cultural practices tend to be more complex than the possibilities offered by space. At the same time, the form of the building is a mapping of behaviour. Buildings are machine-like in that they produce functional outcomes through their spatial properties; they are at the same time language-like, embodying and transmitting social information. But they are neither language nor machines. “Buildings are thus probabilistic space machines, able to absorb as well as generate social information through their configuration.” [6]. Real space is the continuous space of material objects through which we move; logical space is the discontinuous world of expressive form; the building is the point where real space is converted into logical space. Hillier saw the building’s social configuration as inseparable from its materiality.

In her capacity- and capability-attuned discussion of the urban and the three gestalts of architecture-turned-symbolization, Vera Bühlmann summarized Michel Serres’ thinking on the machine, which he described as the first gestalt [4]. The machine – a tool, ‘means’ (Greek) and ‘enabler’ (Proto-Indo-European) – uses energy found in nature and operates on the principle of the linkage of geometrically continuous circular movements. These machines work with motion: they are rotative machines and as such strictly geometrical, constructed from universal forms, and mostly used for transport. They convert energy from one form to another. The most important decision when designing such a machine is what it is going to do. “A car is always going to move from A to B” [7].

Computers and buildings are coming closer together in the unavoidable proliferation of sensors and chips that contribute to a space of potentiality constituted by this generic infrastructure. But buildings and computers have different purposes; they do different things. Buildings are closer to cars in terms of their specificity of application.

A computer is a universal machine. It is not, however, a mechanical machine that operates geometrically – it operates through logic and processes information. What becomes primary when working with computers is the application [7]. The design and development of an application does not use up any particular physical resources (apart from the developer’s time and the electricity needed to charge a laptop battery).

Computation and communication networks capture and count predetermined parameters about environmental conditions (air temperature and moisture, amount of light, etc.) as well as users’ actions and interactions with the system (presence, movement, use of certain features). Many of these systems have been developed solely for the purpose of optimizing resource use and space management – as Yolande Strengers observed in a text on the imaginaries of the smart home [17]. Scarcity of resources becomes in this way an important driver of technical developments in terms of measuring building performance.

Without pretending to answer the very large question of “what are the drivers of technological development in the building industry?”, I want to present two views: scarcity-driven measurement technology on one side and communication-driven infrastructure technology on the other. With an interest in the interplay between user agency and building automation, I look at the ways we talk about computation in buildings, and particularly at how automation depends on one or the other of these views. I observe a connection between the discourse on scarcity and a particular approach to interaction design that is concerned with establishing causality between measurement and building use. I propose, on the other hand, to appreciate a discourse in a different key, one concerned with capacity and communication, without moralizing on the availability of resources and production.

2 What Buildings Are About: Burning Wood or Making Shelter?

In the first edition of The Architecture of the Well-tempered Environment, Reyner Banham observed two basic methods of exploiting the environmental potential of natural resources, more specifically timber [1]. The first method uses wood to construct a shelter from environmental effects (such as wind or rain): a structural solution. The second uses the material as a non-scarce resource to build a fire; Banham called this the power-operated solution [1]. At the time of writing, he already observed a sort of tipping point between the structural method, largely adopted by Western culture, and the power-operated approach, based on an abundance of electricity and water supply. Banham strongly advocated a proliferation of energy-dependent solutions (heating, ventilation, and air conditioning, or HVAC, systems) to achieve indoor comfort.

The first concerns over access to electrical power were caused by a newly raised perception of energy as scarce following the fuel crisis of 1973–74 in the United States. As a consequence, the building industry and construction engineers began considering energy efficiency as a design constraint.

In the preface to the second edition of the same book, Banham offered a kind of remedy to the severe criticism his book had met with at the time of the 1970s oil crisis [2]. He defended his position towards energy consumption as a progressive argument, one made on the assumption that energy would not be scarce. He did not set out to solve all the problems implied by building electrification and gradual automation. He simply talked about its potentials, from a point where scarcity of resources was not an issue.

The dichotomy in approaches that Banham identified is still important in thinking about building design and energy efficiency. Evaluation of a building’s performance developed into an elaborate measurement-driven discipline. Interaction with buildings became significantly more complex and powerful. But the idea of scarcity still haunts some of these discourses.

2.1 Interactive Monocoque

The two polarities – power-operated environments and structural solutions – persist in an intellectual divide in building design. On one side, we have the interest in designing the skin (independent of the structure, preserving the conditions of a power-operated solution inside); on the other, the interest in the structure itself. A holistic approach, Branko Kolarevic argued in his discussion of the performativity of architecture, would be to “avoid the binary choices of skin or structure and to reunify the two by embedding or subsuming the structure into the skin” [8]. In a few words, he suggested that the structure and the skin conflate in a monocoque (Footnote 1). In this way, the building reacts to environmental conditions while at the same time enclosing a controlled environment within it. Even with its relatively small size, Lars Spuybroek’s D-tower would be a good example of this approach. And as structure and enclosure become one, what Banham called a “man made climate” [1] becomes a way of making shelter: a massively structural method of environmental management, without much attention to power provisions.

2.2 Media-Architecture

In the versatile manifestations of media architecture, the interactive skin transformed static structures into building-events. At night, the city becomes a stage for electrical energy, shifting the focus from space-based to time-based architecture. An observer drawn into this play is neither outside nor inside. Media facades do not operate as shelter – that which is outside is more important than that which is inside.

A large number of public buildings designed in the past 20 years feature some kind of screen – a matrix of interactive elements, be it LEDs or kinetic elements (Footnote 2). The well-known BIX facade designed by realities:united for the Kunsthaus Graz embodies the principle of a communication skin. Some contemporary examples, such as MegaFaces, designed by Asif Khan for the Sochi Olympic Games, exhibit this communicative property further: MegaFaces continuously molded itself into three-dimensional selfies of visitors to the building. What distinguishes such media architecture from architecture in general is that the screen is an integrated and infrastructural element of the facade. It is a power-operated skin whose functioning as a shelter is not affected by the flow of electricity.

This focus on the facade is, in a way, another iteration of the old game: the massive structure rendered into its dynamic representation. At the same time, it makes a difference: it separates the inside from the outside in a different fashion, where communication takes place outside of the building. Through activities such as the Media Facade Festival and other events coordinated by the Connecting Cities network (Footnote 3), media facades show their infrastructural potentiality: they can perform as a communication network. Media architecture should thus be understood more broadly as a potential that is not exhausted by the display of commercials or temporary media art installations.

2.3 Smart Homes

In a recent text, Yolande Strengers made a critical overview of the ways the smart home and its agenda were imagined in past decades and the realities unfolding as these become integrated into everyday lives [17]. One of the mainstream visions she identified is the quantified home: a system attuned to capturing and counting, with the aim of changing users’ consumption patterns. A key task associated with the smart home is better meeting energy demand: decarbonizing and de-peaking energy systems. The narrative of efficiency springs from this, where automation is seen as a relief from everyday chores, with the deceptive neutrality of automated servants. Strengers used these observations to advocate alternatives to the control narrative, in which the user’s agency is the driver of system design. She proposed doing away with the genderless and utilitarian concept of a universal user; engaging with the eclectic composition of households and their human and nonhuman occupants in a messy way; and designing for different types of time and for different understandings of productivity and busyness.

One interesting question raised in smart home and smart office scenarios, in terms of user agency, is the personalization of these systems. In a project by Carlo Ratti Associati for the Agnelli Foundation headquarters, the Office 3.0 proposal conceptualizes shared space as individually tailored environmental bubbles, based on indoor position tracking and profiling of users (Footnote 4). The smart system they developed instructs heating, lighting and cooling systems to follow occupants around the building and adjust the settings to their preferences.
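The control loop behind such occupant-following personalization can be sketched in a few lines. This is a minimal illustration only: the names, preference values and averaging rule below are hypothetical, not details of the actual Office 3.0 system.

```python
# Hypothetical sketch of an occupant-following environmental "bubble":
# each tracked occupant carries a preference profile, and the zone they
# currently occupy is adjusted toward those preferences.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Profile:
    temperature_c: float  # preferred air temperature
    lux: int              # preferred light level

# Assumed example profiles (purely illustrative).
profiles = {
    "anna": Profile(temperature_c=21.0, lux=400),
    "ben": Profile(temperature_c=23.5, lux=250),
}


def settings_for_zone(occupants: List[str]) -> Optional[Profile]:
    """Average the preferences of everyone tracked in a zone;
    return None so the zone can idle when it is empty."""
    if not occupants:
        return None
    prefs = [profiles[name] for name in occupants]
    return Profile(
        temperature_c=sum(p.temperature_c for p in prefs) / len(prefs),
        lux=round(sum(p.lux for p in prefs) / len(prefs)),
    )

# A zone shared by two occupants settles on a compromise setting.
shared = settings_for_zone(["anna", "ben"])
```

Even this toy version makes the design question visible: a shared space forces the system to negotiate between profiles, and averaging is only one of many possible policies.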

3 Measuring Efficiency: The Scarcity Paradigm

There are several ways in which interactive technologies and communication infrastructures have been integrated with buildings. Building automation driven by smart applications or by temporary media art installations opens new spaces of potentiality for an infrastructural role of these technologies. The discourse on scarcity, however, still haunts the discipline. After the oil crisis, the main driver of smart homes, as Strengers reminded us, was the de-peaking of energy systems. Scarcity of resources and our attempts to measure them have been a strong driver behind technological developments. At the same time, each contemporary episteme is influenced by the technology we have at hand.

Banham’s call for the use of power-operated solutions in building design and engineering was met with severe criticism. But the 1970s oil crisis that the United States suffered was caused by several foreign policy moves that preceded it. Oil prices did not go up because resources were suddenly depleted; they rose because of a political situation in which OPEC (Footnote 5) members proclaimed an oil embargo on the US. Several crises have passed since, the most recent one arguably demonstrating once more that the problem of scarcity is not only localizable in resources being finite (although they most probably are) but also in artificially created scarcity.

3.1 Scarcity in the Wireless Spectrum

The same principle enabled the creation of scarcity in the wireless spectrum. The electromagnetic spectrum is difficult to grasp, and we often resort to tangible metaphors of roads or territories when talking about it. Roads can be congested, territories crowded. We tend to say that wireless communication channels are saturated today.

Access to the electromagnetic spectrum is regulated in terms of the available frequencies (both licensed and unlicensed), the maximum signal strength permitted, the geographic region over which the license applies and the designated service provided by an operator. Network operators buy rights to use specific frequencies from national regulatory authorities (the FCC in the United States, RED in Europe, BAKOM in Switzerland). A relatively small portion of radio is reserved for the unlicensed spectrum (Footnote 6). Contemporary wireless communication technologies (Wi-Fi, cellular, Bluetooth, NFC, etc.) make use of different bands in both the licensed and unlicensed spectrum and are based on continuous exchange between networked devices. In this way we are able to both send and receive information over the air.

The spectrum mask – the set of protocols that defines different channels and regulates frequency use – is provided by the Institute of Electrical and Electronics Engineers (IEEE). The spectrum mask is an international standard which ensures device interoperability while minimizing interference between devices that share the same frequency range – amongst them microwave ovens, Bluetooth gadgets, Zigbee devices, baby monitors and wireless surveillance cameras.

In the contemporary discourse on overcrowding, wireless networks are often seen as something scarce, something we need more and more of. This view conflates the capacity of communication equipment to transmit information with the efficiency of the protocols and techniques that encode this information onto a signal. It is the spectrum mask that renders networks scarce. Rachel O’Dwyer, a researcher in the cultural and economic aspects of networking technology, noted that spectrum policy is broadly emblematic of the prohibitions operating over what she calls the substrate infrastructure [13]. She saw ownership of infrastructure, which implies both the right to use specific frequency bands and access to telecommunication cables, as a central asset in the valorization of wireless communication technologies. Management of interference through spectrum masks renders unlicensed frequency bands scarce.

3.2 Scarcity of Human Attention

Today, we often talk about the scarcity of another resource: human attention. Malcolm McCullough discussed this at length in his book about attention and architectural atmospheres [11]. He claimed that we constantly confront an overabundance of information in the environment, be it the movement of a shadow over a wall (persistent high resolution) or the digitally enabled communication networks that drive complex systems of sensors and actuators.

The idea of attention scarcity traces back to Herbert Simon’s observation of a counterbalance between information and attention – namely, that an abundance of information necessarily leads to a scarcity of the attention it consumes [16]. Following this line, McCullough set the combined task of architects and interaction designers (a pairing that delimits the field of human-building interaction) as reducing information overload to something we can meaningfully consume and process. He saw interaction design as analogous to architecture in terms of organizing flows of people, resources, and ideas [12]. Both address how contexts shape action. With embedded microprocessors and communication networks, architecture, he said, acquired a digital layer. The ambient processes of shaping intention through architecture and relating it to opportunities in space through interaction design are driven by the aim to control the flow of information, to emancipate the user from its overabundance.

3.3 Scarcity as the Key to Measurement and Causality

The discussions on energy, wireless networks and human attention point to scarcity of resources as the driver of technological developments. When we want to reduce the use of something, we first need to measure it. This is a long-used technique in sustainable design, and it has been successfully applied to reduce energy use through different kinds of feedback (how much CO2 emitted, how much money saved) [10]. What we can conceive of measuring is that which we consider needs saving. Once a causal relationship is established between a resource and its environmental impact (such as the use of oil or coal and their CO2 footprint), it is only a matter of optimizing measurement instruments and finding the best way to incentivize users to save.
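At its core, this kind of feedback is a unit conversion from a metered quantity into the figures shown to the user. The sketch below illustrates the pattern; the emission factor and tariff are placeholder assumptions, not real grid or pricing data.

```python
# Minimal sketch of measurement-driven feedback: convert a metered
# quantity (kWh) into the CO2 and money figures presented to the user.
# Both conversion factors are illustrative assumptions.

CO2_KG_PER_KWH = 0.4   # assumed grid emission factor
PRICE_PER_KWH = 0.25   # assumed electricity tariff


def feedback(kwh_baseline: float, kwh_actual: float) -> dict:
    """Express savings relative to a baseline as kWh, CO2 and money."""
    saved_kwh = kwh_baseline - kwh_actual
    return {
        "kwh_saved": saved_kwh,
        "co2_kg_saved": saved_kwh * CO2_KG_PER_KWH,
        "money_saved": saved_kwh * PRICE_PER_KWH,
    }

# A household that used 260 kWh against a 300 kWh baseline.
report = feedback(kwh_baseline=300.0, kwh_actual=260.0)
```

The incentive design question then reduces to which of these derived figures (energy, emissions, or money) is shown, and against which baseline.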

4 The Internet of Everything: Abundance and Balance

The communication paradigm sets itself apart from the production paradigm, in which we measure and count resources, products created, labor costs, and the accumulation of goods and wealth. Moving away from the idea of scarcity of resources, it is about emitting, receiving, storing and processing information. What becomes important is the application – the programming of software routines that computers will compute and communicate – and the network infrastructure, not as a resource but as a space of potentiality. Hovestadt et al. saw virality, the ability to spread information virally, as one of the most important properties of the network. Virality is inseparable from communication: media theorist Jussi Parikka pointed out the decisive role of computer viruses in the generation of novel ideas in the new science of networks – such as viral marketing or experimental vaccine software (Footnote 7) [14]. He repeated the observation of Fred Cohen – one of the most important names in computer virus defense techniques – that in order to secure communication from viruses, one would have to block it entirely. In their discussion on the abundance of energy, Hovestadt et al. propose to imagine this virality applied to energy, not information.

4.1 Energy Is Essential, but Not Necessarily Scarce

Energy is today central to almost all activities – from working on computers, communicating over wireless devices, cooking and washing, transportation and distribution, to money transfers. Without it, nothing goes: no person, no business and no infrastructure. This centrality of energy is too quickly translated into the call to use less of it – simply because of its importance. Questioning the amount of energy we use leads to the question of where the energy is coming from. If we regard our fossil fuel resources as finite (or too slowly renewed), then it is clear that they should be saved for as long as possible. However, relying on oil and coal is certainly not the only option, and it has been challenged repeatedly in the past 40 years. Numerous other techniques of producing electricity (water, wind, sun) have been invented and practiced. The recent trend for sustainability and self-sufficiency of buildings partly came out of concerns for the finiteness of some of the above-mentioned resources, as well as the environmental impact electricity production has had on Earth.

While most sustainability advocates see renewable resources (wind, sun, temperature difference) as equally interesting, Hovestadt et al. make it very clear how abundant and independent from Earth’s conditions the Sun’s energy is [7]. And we are already able to capture up to 22% of it with today’s mass-produced silicon-based solar panels (Footnote 8). New technologies and techniques are being developed, such as the transparent dye-sensitized cells patented by Michael Grätzel, who won a Millennium Prize for this invention and was able to increase their efficiency up to 19% [3, 5].

In Genius Planet, Hovestadt et al. present a detailed calculation of the number of solar panels that would be needed to satisfy current world energy needs, based on the total solar emission received per square meter on Earth [7]. The required surface would be, for example, that of our current road network, which takes up about 10% of the land surface. Distributing energy’s abundance is a question of logistics, a question of making it accessible to everyone. The structure is that of the Internet: channeling information (or energy) from where it is to where it is needed. Much to Banham’s possible pleasure, Hovestadt et al. recognize the importance of power for operating buildings, positing a similar relationship between energy and matter. The difference is that the former advocated the burning of fossil fuels, while the latter authors articulated a sophisticated proposal for capturing and sharing solar energy. Once it is converted to electricity, it becomes universally available.
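The shape of such a calculation can be reproduced as a back-of-envelope estimate. The figures below are rounded assumptions for illustration (average world demand, usable irradiance, panel efficiency), not the values used in Genius Planet, and the result is highly sensitive to each input.

```python
# Back-of-envelope estimate of the panel area needed to cover world
# energy demand. All figures are rounded assumptions for illustration.

WORLD_DEMAND_W = 18e12       # assumed average world energy demand (~18 TW)
INSOLATION_W_PER_M2 = 200    # assumed average usable solar irradiance at ground
PANEL_EFFICIENCY = 0.20      # roughly today's mass-produced silicon panels

# Area whose captured power matches the demand.
area_m2 = WORLD_DEMAND_W / (INSOLATION_W_PER_M2 * PANEL_EFFICIENCY)
area_km2 = area_m2 / 1e6

LAND_AREA_KM2 = 1.49e8       # Earth's land surface
share_of_land = area_km2 / LAND_AREA_KM2
```

With these particular assumptions the required area comes out in the hundreds of thousands of square kilometers, a small fraction of the land surface; changing the irradiance or efficiency assumptions moves the result considerably, which is why the logistics of distribution, rather than the raw area, is the interesting problem.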

The challenge we are presented with is thus not merely to capture the source of energy and store it (which is still a significant challenge today). It is in communication: how to get the energy from one place to another. This is a challenge for the networking infrastructure. It is also a challenge for the way we will relate to it, the way energy will be balanced throughout the network.

Bühlmann observed that electricity is the only way that energy can be medial, meaning that it can carry information about itself while carrying its own energy [7]. With electricity, it is possible to send energy not only from the source to consumers but also from one user to another, networking the distribution. In such an ‘intelligent’ energy network, a device may be used to provide, store, and consume energy. This would be the Internet of energy.

5 Consequences for Human-Building Interaction

Interaction refers to something happening between and among. The first part of the word, inter-, comes from the verb “to enter” (Proto-Indo-European) and “entera” (Old Greek for intestines), with the root en- meaning “in”: something that is inside, interior to. Action draws from the Latin actio-, referring to something being put in motion, doing, performing. In order to critically reflect upon new trends in ambient computing, or what I analyze here from a human-building interaction perspective, we need to see interaction with buildings as something more profound than cause-and-effect reactions between human users and decisions taken by the infrastructure (turn on the heating, roll down the blinds). It is a question of orchestrating infrastructures.

Mainstream interaction design has largely adopted the disappearing interface metaphor as its goal, epitomized in the study of contemporary trends presented in The Age of Context [15]. At the same time, designers have pointed towards the loss of agency inherent in such disappearance. Timo Arnall, whose work consistently explored invisible infrastructures (Immaterial: Light Painting Wi-Fi, 2010; Robot Readable World, 2012; The Internet Machine, 2014), articulated such concerns in his No to NoUI manifesto (Footnote 9). Arnall opposed the myth of immateriality and childish mythologies like “the cloud” in favor of design that integrates the actual qualities of the interface and increases our ability to become proficient at using technical systems. He illustrated this discussion with the example of the Nest thermostat interface, which gives all necessary information to the user while seamlessly “learning” the user’s habits. Arnall argued for focusing on legibility and readability instead of seamless invisibility and the removal of the interface. Strengers had a similar take on the matter: in her account of alternatives to the mainstream smart-home vision, she talked about the coordinated home and the DIY home. The first is centered on creating increasing degrees of flexibility around different sites of activity and new opportunities for energy to be consumed. The second is a vision in which users are able to adapt technology to their own needs – be it because it failed to work properly or simply because it allowed certain degrees of freedom to be modified by design. The question becomes how to create enough space for choices to be made on the go by people, rather than predefining the states of these infrastructures in terms of causalities (user not at home: switch heating off). Users’ attention is not scarce; it can be focused meaningfully on intentional interactions.

5.1 Designing with Abundance

Some design principles can be articulated that stem from the perception of energy and communication networks as abundant, and from the interest in balancing them.

Rather than interaction design, Hovestadt sees computer and information science and technology as directly relevant and somewhat analogous to architecture. Architecture and infrastructure engineering, coupled through computer science, give rise to truly intelligent buildings. This would mean that our homes and our places of work and leisure become, in effect, applications [7]. The application is the potentiality of computational technology. Resources are not being used up and exploited – the application, independent of the hardware on which it runs, is about the ability to create potentiality [7]. This is easy to understand when we consider that a computer does not get heavier from running more software applications. The fundamental shift is in the potentiality of virtuality.

In general, this principle of interaction can be described as the orchestration of infrastructures. Matched to our own eyes, the brightness of light used to be measured by the size of a candle flame. With the introduction of electrical infrastructure, brightness was determined by the available force of the power turbine and the incandescence of the wire in the bulb [7]. With the invention of different lighting technologies and the normalization of artificial lighting, the question transforms into what kind of light we want: reading light, relaxing light, at which time of the day, and so on. It becomes a question of orchestrating the quality of light.
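Orchestration in this sense can be sketched as choosing among named qualities of light rather than flipping a single source on and off. The scene names, lux levels and the late-night rule below are hypothetical illustrations, not a real lighting standard.

```python
# Hypothetical sketch of light orchestration: the system selects a
# named scene -- a quality of light -- for an activity and time of day,
# instead of operating a binary on/off switch.

SCENES = {
    "reading": {"lux": 500, "color_temp_k": 4000},
    "relaxing": {"lux": 150, "color_temp_k": 2700},
    "night": {"lux": 30, "color_temp_k": 2200},
}


def orchestrate_light(activity: str, hour: int) -> dict:
    """Pick a light quality for the requested activity, capping
    brightness late at night regardless of what was asked for."""
    scene = dict(SCENES.get(activity, SCENES["relaxing"]))
    if hour >= 23 or hour < 6:
        scene["lux"] = min(scene["lux"], SCENES["night"]["lux"])
    return scene

# The same request yields different qualities of light at 21:00 and 23:00.
evening_reading = orchestrate_light("reading", hour=21)
late_reading = orchestrate_light("reading", hour=23)
```

The point of the sketch is that the vocabulary shifts from quantities (on, off, watts) to qualities (reading, relaxing), with time and context mediating between them.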

The non-scarce approach to resources liberates one’s thinking from a centralized network paradigm and from questions of control: it requires thinking about a proliferation of sources and a dispersal of directions; it is a thinking about choices. Rather than deciding to burn wood because it is cold, we look at conditions and states: we ask which, of the many things that could be done, we want to do when it gets cold.

The interplay between user agency and building automation is then not driven by control and the attempt to decarbonize and de-peak energy consumption, as Strengers identified [17]. Rather, it is driven by a user’s emotion, their momentary preference for the kind of ambience and atmosphere they wish to be surrounded with. The discreteness of automated systems (heating, blinds control, management of appliances) can pass into the continuity of an ambient apparatus, whose states and changes are communicated through the energy supply network. This constant communication should be something that a human might want to look at at any point, but does not need to monitor. At the same time, the actions of the inhabitant remain discreet – they are not seen as a repetitive pattern. The hypothetical system described here is not attuned to predicting the user’s behavior but to balancing momentary tendencies. It is in this way that we can work towards turning mere automation into sophisticated orchestration.