1 Introduction

As the cost of computers has fallen, people increasingly have the opportunity to use multiple devices simultaneously. Users typically own and operate various PCs, tablets, and smartphones, and occasionally encounter public access terminals such as information kiosks, ATMs, and digital signage. It is desirable for such devices to communicate, but enabling them to function cooperatively is often difficult and requires tedious configuration. More user-friendly interfaces are needed for connecting multiple devices and using them in cooperation.

This paper proposes a method to enable the straightforward transfer of information between a computer with a large touch-panel display, such as a tabletop PC, and another device, typically one worn by the user. The technique uses the human body communication protocols proposed by Zimmerman [1], which utilize the human body as part of an electronic circuit. With our technique, the user touches an item on the display to select it and transfer it to his or her own device. Upon touching the panel, the two devices are automatically connected, and the selected information is transmitted directly through the user's body. The user can thus transfer data easily, without configuring the connection between the devices in advance.

2 Selection and Transfer by Touch

2.1 Basic Interaction Technique

We propose a method for transferring information between multiple touch-panel devices, utilizing human body communication. Figure 1 illustrates the proposed technique. Icons representing information such as files or images are displayed on the large touch screen of a tabletop PC. The user wears a device on his or her wrist that has a small touch screen, much like a smartphone. To transfer information from the tabletop PC to the wearable, the user selects an icon by touching it with a finger. The system automatically connects the two devices and transmits the data represented by the selected icon. Conversely, to transfer information from the wearable device to the tabletop PC, the user first selects an icon on the wearable and then touches the screen of the tabletop PC.

Fig. 1. Overview of our proposed method: transfer by touching the interactive screen.

2.2 Example Use of the Interaction Technique

  • Combining with Multi-touch and Gesture Operations. Various types of touch interaction can be used with our technique. For example, multiple pieces of information can be selected and transferred simultaneously by touching multiple icons. Multi-touch operations can also be used: the user designates a region with his or her fingers, and the items within that region are selected and transferred together. Images transferred to a device can likewise be scaled with multi-touch gestures.

  • Receiving Streaming Data. In our method, devices can communicate only while the user is touching the screen, and we can take advantage of this characteristic. For example, a video can be streamed only while the user maintains contact. This may be useful when presenting promotional material to specific customers, or when offering coupons that correspond to how long the visitor watched the video.

  • Transferring a User's Profile to Customize Interactions. With our technique, a touch operation can transfer user profiles from a consumer device to a public terminal, and vice versa, enabling customized interaction. Information can then be recommended based on the user's personal preferences. For example, a user searching for a restaurant at an information kiosk can provide his or her preferences by touching the screen. Once a restaurant is chosen, a map can be transferred to the user's device by touching the restaurant's icon.

3 Prototype Implementation

3.1 Overview

Figure 2 shows an overview of a prototype implementation. The system consists of two parts: a device worn by the user (Fig. 2 (right)) and a tabletop PC with a touch panel screen (Fig. 2 (left)). When the user interacts with the tabletop PC, these two parts are connected via the user’s body, and a communication link is established. For example, when an image selected by the user is transferred from the tabletop PC to the wearable device, it is sent via the transmit/receive (TX/RX) module of the tabletop PC, through the transparent conductive sheet, and into the user’s body. Copper foil attached to the wearable couples the signal into the device via a second TX/RX module. This path can be used for bidirectional communication between the wearable device and the tabletop PC.

Fig. 2. Overview of the data communication path established when a user touches the tabletop PC display.

3.2 Wearable Devices

Figure 3 shows a prototype wearable device designed to be worn on the wrist (Fig. 3 (a)). A smartphone (Samsung Galaxy S II LTE) was used for interaction with the user. An external TX/RX circuit was connected to the smartphone via copper foil fitted to the back of the wearable device that allowed signals to be coupled into the human body (Fig. 3 (b)).

Fig. 3. The prototype wearable device.

3.3 TX/RX Circuit Board

Figure 4 shows a schematic of the TX/RX circuit board, which was used for both the tabletop PC and the wearable device. The transmission (TX) circuit comprised a crystal oscillator and an AND logic gate: data were sent using amplitude shift keying (ASK) modulation of a 1 MHz carrier generated by the oscillator. The receive (RX) circuit comprised a band-pass filter, two operational amplifiers, an envelope-detection circuit, and a comparator; received signals were amplified, demodulated, and read by an Arduino microcontroller. Because the signal path was half-duplex, only one circuit could operate in TX mode at any one time, a condition enforced by relays. For example, when data were transmitted from the tabletop PC to the wearable device, the TX module of the PC and the RX module of the wearable were activated. The prototype TX/RX circuits were built on a breadboard; printed circuit boards were being fabricated at the time of writing.
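To make the modulation concrete, the following Python sketch generates the on-off-keyed waveform that the AND gate produces in hardware: the 1 MHz carrier is gated by the data bits. Only the carrier frequency and the approximate bit rate come from the paper; the sample rate and all names are our own illustrative assumptions, not part of the prototype.

```python
# Illustrative sketch of the ASK (on-off keying) scheme described above.
import numpy as np

CARRIER_HZ = 1_000_000   # 1 MHz carrier from the crystal oscillator
BIT_RATE = 17_000        # ~17 kbps, as measured in Sect. 3.7
SAMPLE_RATE = 8_000_000  # 8 samples per carrier cycle (assumed)

def ask_modulate(bits: list[int]) -> np.ndarray:
    """Return the ASK waveform: carrier present for 1-bits, silence for 0-bits."""
    samples_per_bit = SAMPLE_RATE // BIT_RATE
    t = np.arange(len(bits) * samples_per_bit) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    gate = np.repeat(np.asarray(bits, dtype=float), samples_per_bit)
    return carrier * gate  # the software analogue of the hardware AND gate

waveform = ask_modulate([1, 0, 1, 1, 0])  # example frame
```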

Fig. 4. A schematic circuit diagram of the TX/RX modules.

3.4 Tabletop PC with a Touch Panel

The tabletop PC, fitted with a touch panel screen (Fig. 5), communicated with the wearable device via the screen’s surface while the user was touching it. The touch panel screen was covered with a transparent conductive sheet, connected to the PC via the TX/RX circuit board.

A PQ Labs multi-touch sensor frame was used to detect the user's touch gestures on the \(60^{\prime \prime }\) display. Since infrared sensors were used to detect the point of contact, interference between the display unit and the touch operation was avoided.

Fig. 5. A tabletop PC with a touch panel screen, and the TX/RX circuit board.

3.5 Data Transmission Protocol

Upon user contact with the screen, an electrical connection between the tabletop PC and wearable device was established. At this moment, the system attempted to establish a logical data connection using the following handshake method (Fig. 6 (a)).

First, a synchronization packet (SYN) was sent by the TX module. Upon reception of the SYN packet, the RX module returned a synchronization acknowledgement packet (SYN/ACK), which the TX module replied to with an ACK packet. Upon receiving the ACK at the RX module, a logical connection was established and the TX module began transmission.

If the TX module did not receive a SYN/ACK after transmission of a SYN packet, the SYN was resent several times. This situation was typically caused by insufficient contact between the user and the screen. Transmission was successful in most cases where a firm contact was achieved.
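As a minimal sketch, the setup handshake and its retry behaviour can be expressed as the following Python state machine. The packet names (SYN, SYN/ACK, ACK) follow the paper; the Channel abstraction, the timeout, and the retry count are illustrative assumptions.

```python
class Channel:
    """Hypothetical half-duplex link through the user's body; a real
    implementation would drive the TX/RX circuit board of Sect. 3.3."""
    def send(self, packet: str) -> None: ...
    def receive(self, timeout_s: float) -> str | None: ...

MAX_RETRIES = 5   # assumed; the paper only says the SYN was resent "several times"
TIMEOUT_S = 0.1   # assumed timeout per attempt, in seconds

def establish_connection(channel: Channel) -> bool:
    """TX side: SYN -> SYN/ACK -> ACK handshake, as in Fig. 6 (a)."""
    for _ in range(MAX_RETRIES):
        channel.send("SYN")
        if channel.receive(TIMEOUT_S) == "SYN/ACK":
            channel.send("ACK")
            return True  # logical connection established; data transfer begins
        # No SYN/ACK: typically insufficient contact with the screen; resend SYN.
    return False
```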

The protocol for termination of the communication link was similar to the initiation protocol (Fig. 6 (b)). First, the TX module sent a final packet (FIN). When the RX module received the FIN packet, it returned a FIN/ACK packet, to which the TX module replied with an ACK. Upon receiving the ACK packet at the RX module, the data transmission was complete and the connection was closed.

The above sequence assumes that the TX module knows when the user has touched the screen, making transmission possible. The tabletop PC could sense contact via the touch panel; the wearable device, however, did not have this information. Therefore, when the tabletop PC detected user input but had no data to transmit, it sent a connection packet (CNT) to the wearable device. When the wearable detected a CNT packet, it initiated the transmission protocol described above.
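The teardown and the CNT-initiated case can be sketched in the same style, reusing the hypothetical Channel stub above; transmit() is a placeholder for the actual data-transfer routine, not a function named in the paper.

```python
def close_connection(channel: Channel) -> bool:
    """TX side: FIN -> FIN/ACK -> ACK, mirroring the setup handshake (Fig. 6 (b))."""
    channel.send("FIN")
    if channel.receive(TIMEOUT_S) == "FIN/ACK":
        channel.send("ACK")  # the RX side closes once this ACK arrives
        return True
    return False

def on_touch_detected(channel: Channel, outgoing_data: bytes | None) -> None:
    """Tabletop PC side: react to a touch reported by the panel sensor."""
    if outgoing_data is None:
        # Nothing to send: invite the wearable to transmit instead.
        channel.send("CNT")  # the wearable starts the handshake on seeing CNT
    elif establish_connection(channel):
        transmit(channel, outgoing_data)  # hypothetical data-transfer helper
```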

Fig. 6. The data transmission protocol.

3.6 An Application Example: Exchanging Image Data

Software was written and installed on a tabletop PC and a wearable device to enable the exchange of image data.

To transfer the image data from the tabletop PC to the wearable device, the user was required to touch the image file icon on the screen. This operation was recognized by the touch panel sensor, triggering transmission of the selected file. The user was required to maintain contact until the transmission process was completed. During transmission, a progress indicator was displayed on the screen of the wearable device. After transmission had been completed, the received image was displayed on the screen of the wearable device.

To send image data from the wearable device to the tabletop PC, the user first selected an image by touching the relevant icon on the screen of the wearable. The device then waited for a connection with the tabletop PC to be established before transmitting the data. When the user subsequently touched the screen of the tabletop PC, the behaviour depended on the touch location: if he or she touched the background rather than an image icon, the wearable sent the selected image data to the PC; if he or she touched an image icon, the transmission from the wearable was cancelled. If the user maintained contact and the transmission succeeded, the transmitted image was displayed on the screen of the tabletop PC, and a progress indicator was again displayed on the screen of the wearable device.
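The tabletop-side dispatch described above can be sketched as a simple check on where the user touched, reusing the Channel stub from Sect. 3.5. Both the touched_icon flag and the CANCEL message are our own placeholders: the paper states only that the wearable's transmission was cancelled, without naming the mechanism.

```python
def on_touch_with_pending_image(channel: Channel, touched_icon: bool) -> None:
    """Tabletop PC side: dispatch for a touch while the wearable holds a
    selected image awaiting transfer."""
    if touched_icon:
        channel.send("CANCEL")  # assumed control message, not from the paper
    else:
        channel.send("CNT")     # background touch: invite the wearable to send
```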

In Fig. 7, the user has downloaded image data from the tabletop PC. Since the transmission has been successfully completed, the image selected by the user is displayed on the screen of the device. The user can then browse the image on the wearable.

Fig. 7. Transfer of an image file with our prototype system.

3.7 Preliminary Evaluation

We conducted a preliminary experiment to evaluate the system. We recruited four students, two undergraduates and two graduates, as subjects, and asked them to use our system. The system operated at a data rate of approximately 17 kbps, and transferring an image took 4–6 s in this experiment. The transmission was generally successful, but failed in some cases because of insufficient contact between the user and the screen, or because the user's finger was released before the transmission was complete.
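As a rough consistency check (our arithmetic, not a figure reported in the paper), at 17 kbps a 4–6 s transfer corresponds to images of roughly 8.5–12.8 kB:

\[ \frac{17\,\mathrm{kbit/s} \times 4\,\mathrm{s}}{8\,\mathrm{bit/byte}} = 8.5\,\mathrm{kB}, \qquad \frac{17\,\mathrm{kbit/s} \times 6\,\mathrm{s}}{8\,\mathrm{bit/byte}} \approx 12.8\,\mathrm{kB} \]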

We conducted interviews about the usability of the system after each subject completed the experiment. Most subjects responded that they could easily transfer image data between the wearable device and the tabletop PC. For example, some of their positive comments were: “The process of information transfer is very simple”, “It is intuitive because we can visually recognize a target image, and select it with a direct physical operation”, “It seems secure because information is not sent without touch”. On the other hand, they made some negative comments: “Maintaining touch during the transmission is troublesome”, “I felt the transmission time was too long”, “Sometimes, the transmission was not stable”, “The wearable device is a little large”. Improvements to the speed and stability of the transmission, and reductions in the size of the device are therefore necessary. We are planning to improve these factors in future implementations.

4 Related Work

Pick-and-Drop [2] allows users to transfer files between computers: the user picks a file from one computer by touching its icon with a pen, and drops it onto another by touching its display with the same pen. In Memory Stones [3], a user can ‘pick up’ a file displayed on a device screen, carry it to another device, and place the data using his or her fingers. During this operation, the user pantomimes carrying a tangible object (the “stone”), keeping his or her fingertip positions unchanged; the system identifies the source and target devices by matching the shape of the polygon formed by the fingertips when touching the respective screens.

Using these techniques, a user can transfer information in an intuitive way, but because the actual data transmission occurs over a wireless network, the user must configure the connection in advance. With our method, no configuration is necessary, even with public access terminals.

Hinckley [4] proposed interaction techniques that connect multiple computers when they are physically bumped together. A user can bump a tablet into another resting on a desk; the software recognizes the gesture by synchronizing the two accelerometers over a wireless network, and the tablet moved by the user annexes the display of the stationary one, allowing a panoramic image to span both displays. Seifert et al. [5] presented a system that allowed users to extend the display of their mobile phones with external screens. Users connected their mobile phone to a screen by holding the phone against the screen's border; when the connection was established, the GUI of the active mobile application was distributed across the phone and the external display. With our technique, the user only has to touch an item on the touch-panel display to select and transfer data between the two devices.

DiamondTouch [6] is a multi-user touch technology that uses human body communication for tabletop front-projected displays. It works by transmitting a different electrical signal to each part of the table surface that is to be uniquely identified. When a user touches the table, signals are capacitively coupled from directly beneath the touch point, through the user, and into a receiver unit associated with that user. The receiver can then determine which parts of the table surface are being touched by that user. Our method also communicates between wearable devices and tabletop PCs through the human body; however, whereas DiamondTouch uses this technology to identify which person is touching where, our study aims to transfer information easily to or from the user's device by automatically connecting it to a tabletop PC.

5 Conclusion

We proposed a technique for information transfer between multiple devices, utilizing human body communication with a touch panel. We also developed a prototype system that could send and receive image data between a tabletop PC and a wearable device, and performed a preliminary experiment to evaluate it. Users responded mostly positively, but some comments indicated that the prototype should be refined to improve transmission speed and stability and to reduce the size of the device. In future work, we aim to improve our design to achieve more stable information transfer between devices.