1 Background

Two common ways that science advances are through methods and through theory [1]. Research can advance through innovative methods for testing phenomena. Frameworks can advance as new ways of thinking about past experimental results offer an alternative lens through which to create new theories. However, there is a third way: advances in instruments can lead to new discoveries, which in turn can lead both to new methods and to new theories.

Advances in instrumentation once meant new types of apparatuses. While new apparatuses still matter, software now offers several alternative techniques for improving existing ones. Software can increase the speed, sensitivity, scale, or statistical inference offered by existing instruments. Everyday devices, now networked, offer new insights into everyday behaviors. Global networks of sensors offer increased situational awareness and increased statistical power for climatological research. User interfaces and intelligent services offer windows through which knowledge workers can explore these new worlds.

Knowledge workers are an important part of modern, service-oriented culture. These individuals are tasked with helping guide important, fast-paced decisions within both private and public sectors. Technology used to facilitate decision making increasingly relies upon intelligent systems to either partially or fully automate the decision-making process [2, 3]. These interactive learning processes involve humans as part of a decision-making feedback loop. Human judgement serves to guide future machine judgement and vice versa [4].

The human-machine feedback loop has existed since the dawn of computing. The key premise is that the ideal aspects of each help to augment the combined capabilities of both. In order for these types of technologies to be effective they must scaffold human capabilities. There are several ways this can take place. For example, they can increase awareness, inference, attribution, or the overall decision-making process. This approach works well and continues to drive improvements in decision support systems (DSSs).

As intelligent systems become increasingly integrated within our everyday lives, the primary orientation of DSSs changes from being interface-driven to becoming service-driven. In these new environments humans are no longer fettered by the limits of visual perception. Speech-assisted technologies enable more interactive environments. Augmented reality allows digital artifacts to enter the everyday. Social networks allow machines to rely upon real-time crowd consensus. While this changes the type of interactions in the human-machine feedback loops, it does not change the central premise that mutual augmentation is the most effective strategy for increasing combined capabilities.

2 Cognitive Extension

Human-Computer Interaction as a field is interested in applying the tools and techniques of the research sciences to the creation of technologies or processes that better integrate with the human experience. While traditionally concerned with human factors or usability design, the field has recently focused upon the usability of information more broadly as it relates to facilitating decision-making [5]. Intelligent systems are becoming more common. While such systems may behave autonomously, their effectiveness (at least currently) is still quintessentially linked to their ability to enhance or to extend human activities.

Intelligent systems often utilize a form of statistics called machine learning. Machine learning applies statistical methods whose goal is not to test for significance of variation but rather to classify which hypothesized group is most likely given the information present (i.e., we are not asking whether A is different from B but rather whether A most likely belongs to B, C, D, …). A challenge for such systems historically has been coming up with approaches that generalize to unforeseen circumstances [6,7,8,9,10].
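This distinction between significance testing and classification can be illustrated with a toy nearest-centroid classifier. This is a hypothetical sketch, not code from the study; the groups and values are invented for illustration.

```python
import numpy as np

def nearest_centroid_classify(sample, groups):
    """Assign a sample to the most likely of several hypothesized groups
    by distance to each group's mean (a minimal classification sketch)."""
    centroids = {label: np.mean(values) for label, values in groups.items()}
    return min(centroids, key=lambda label: abs(sample - centroids[label]))

# Hypothetical one-dimensional measurements for groups B, C, and D.
groups = {
    "B": [1.0, 1.2, 0.9],
    "C": [5.0, 5.1, 4.8],
    "D": [9.0, 9.2, 8.9],
}

# Rather than testing whether observation A differs from B, we ask which
# of [B, C, D] the observation most plausibly belongs to.
label = nearest_centroid_classify(5.2, groups)  # → "C"
```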

A similar challenge is in coming up with models that are grounded [2, 6, 9, 10]. Grounded approaches are those whose results we can explain based upon an understanding of the objects and the objective function. In cases where the definition of the object is subjective (e.g. the presence of emotional states), models must first determine what states exist before classification can occur. While grounding models is effective, researchers struggle to understand why some grounded models are better than others. Classification error alone is a fragile technique for evaluating model efficacy. One theory for why this occurs relates to the ability of the models to generalize to unseen examples.

Representative models are grounded models that function through encoding some properties of the attributes into the model itself [6, 9, 10]. For example, a representative model of color might encode the dimensions of color perception. However, while a model built using the dimensions of color perception that humans use would be grounded, fewer dimensions might capture this variation more efficiently. The challenge in this latter case is arguing that the compressive dimension is still representative.
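As a concrete instance of the color example, Python's standard colorsys module maps raw RGB values onto hue, saturation, and value, one perceptually motivated set of dimensions. This is an illustrative sketch, not the encoding the paper has in mind.

```python
import colorsys

def perceptual_encoding(r, g, b):
    """Encode an RGB triple (components in the 0-1 range) into
    hue-saturation-value dimensions closer to human color perception."""
    return colorsys.rgb_to_hsv(r, g, b)

# Pure red maps to hue 0.0 with full saturation and value.
h, s, v = perceptual_encoding(1.0, 0.0, 0.0)  # → (0.0, 1.0, 1.0)
```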

Research suggests that grounded models benefit from first pre-processing the input. By first creating a natural ontology and later mapping it to an external ontology, models are more likely to be able to statistically differentiate along meaningful axes [9, 10]. Thus, grounded models contain representations that are naturally separable based upon observed data. This is in contrast to ontology-backed models that may or may not be directly grounded. Taking this further, representational models would be those grounded models that also contain perceptual dimensions that are indicative of the conceptual landscape. Here perception implies that the model encodes the external world into a parsimonious and natural internal representation.

While the need for such encodings is specific, there is flexibility in their mechanism; these encodings could be direct, connectionist, or something else entirely. Techniques such as the use of covariance, concomitance, correlation, or convolution have worked well in the past. The important distinction is that such models do not encode artificial features even if they are useful for classification purposes. Once built, incorporating these instrument-backed models into decision-making frameworks extends our cognition.

2.1 DEAR Framework

The DEAR framework operates in four stages: detect aberrant or nonlinear behavior, evaluate for causality, assess risk, and recommend action [11]. This approach is based upon decision-making theory from operations research, risk management from public and private institutions, and statistical inference from chaos theory. The detection technique is a combination of classical statistical inference combined with methods for detecting nonlinearity. The core technique for evaluating causal relationships is based upon the convergent cross-mapping (CCM) technique proposed by Sugihara and later extended by other researchers [12,13,14,15]. This technique quantifies biases in pairwise relationships using a matrix of time-embedded offsets [15,16,17]. The risk assessment phase uses the optimal Kelly Criterion analysis to determine the relative risk of various outcomes with a goal towards system optimization [18]. The recommendations then assemble the output of the risk assessment and compare and contrast them with alternative uses of an institution’s resources.
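One way to picture the four stages is as a composable pipeline. The skeleton below is a hypothetical sketch: the stage implementations are placeholders supplied by the caller, not the authors' code, and the names are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class DEARPipeline:
    """A hypothetical skeleton of the four DEAR stages as pluggable steps."""
    detect: Callable     # flag aberrant or nonlinear behavior
    evaluate: Callable   # test candidate causal links (e.g. via CCM)
    assess: Callable     # score risk (e.g. via the Kelly criterion)
    recommend: Callable  # turn risk scores into a policy action

    def run(self, series: Sequence[float]):
        anomalies = self.detect(series)
        links = self.evaluate(anomalies)
        risk = self.assess(links)
        return self.recommend(risk)

# Trivial placeholder stages, just to show the data flow.
pipeline = DEARPipeline(
    detect=lambda s: [v for v in s if v > 1],
    evaluate=lambda a: len(a),
    assess=lambda links: links * 0.5,
    recommend=lambda risk: "explore" if risk < 2 else "exploit",
)
action = pipeline.run([1, 2, 3])  # → "explore"
```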

The DEAR framework is based upon the results of experimentation and application to multiple domains. These authors have already successfully applied the DEAR framework to financial, social, and epidemic scenarios. They have applied the CCM component of this framework to physical systems. Naturalistic systems such as these domains represent areas of study that originally motivated chaos theory as well as areas that would substantially benefit from DSSs focused upon non-linearity. As with any system, understanding the underlying relationships governing the observable variables within such systems is based on discerning cause and effect relationships. However, unlike limited test environments, naturalistic systems also represent the unpredictable characteristics inherent in the real world. These systems exhibit complexity, nonlinearity, and multivariate interactions. As a process DEAR could be built directly into an interface as a part of a DSS or as part of a broader service-oriented solution. Either way, this approach is inherently sociotechnical since the end result requires inclusion of relativistic criteria such as an institution’s objective function and/or alternative uses of institutional resources.

The four stages of the DEAR framework are: detect nonlinear abnormalities (D), evaluate causality (E), assess risk (A), and recommend action (R) [11]. Taken together these stages help to enhance cognition while reducing cognitive bias. Detection helps facilitate attention and perception while minimizing extraneous information. Evaluation helps with hypothesis formation. Assessment helps with attribution and with evaluation of alternatives. Recommendation aids prospective memory and minimizes unnecessary actions. If this process is included in a service as a part of an intelligent system then the process has the added advantage that it should also serve to maximize the ratio of exploitative versus exploratory behaviors. This would result in a more efficient form of agency.

2.2 Instruments

Instruments facilitate cognition when they enable just-out-of-reach activities. These activities could be attentional, perceptual, procedural, attributional, etc. Perceptual enhancement is perhaps the clearest example of how instruments can enhance cognition: the ability to observe electromagnetic radiation at frequencies outside of the visible spectrum is a clear case of instruments enhancing human cognition.

A recent development in the field of astronomy has come from an unexpected source. As high definition television (HD TV) became more common, broadcast standards for such devices also had to improve. With the advent of ATSC 1.0 it became possible to broadcast HD TV signals over the air via radio waves. To be effective, this technology required that each TV have a receiver. Due to commercialization and mass production these receivers became increasingly affordable. Today these receivers cost less than 20 USD and are capable of receiving anywhere from 15 kHz–2.5 MHz. Fortunately for astronomers these frequencies are also characteristic of interplanetary emissions. This has led to the development of software defined radio (SDR), allowing individuals to use these devices for a multitude of purposes.

SDR parallels trends taking place in other fields. For example, in telecommunications software defined networking (SDN) is enhancing the ability of humans to manage the quality of service for large networks of interconnected devices. In industrial processes the industrial internet of things (IoT) allows real-time awareness and troubleshooting of various forms of production. What these technologies have in common is their use of software and of collections of sensors. These technologies also often make use of wireless media for transmitting and receiving.

Modern instruments also have a sense of agency that enables certain types of insights that were not previously possible. In these scenarios devices can perform several types of tasks. They behave passively when they simply observe broadcast traffic. They engage in remote sensing when they use telemetry to observe the behavior of other devices. Finally, they engage in remote control when they use telecommands. This combination of behaviors, when combined with large arrays of sensors, allows a level of awareness and control beyond what single human-machine interactions offer.

2.3 Experimental Context

Astronomy is a field where humans have always relied upon formal logic to perform inferences about phenomena beyond what meets the naked eye. Early observations of distant planets relied upon statistics to mediate subtle individual differences in perception. This way of thinking allowed for increased precision in observation and in turn led to the formation of more advanced theories of planetary mechanics. Over time telescopes became increasingly advanced, and it is now prohibitively difficult to increase the capabilities of individual telescopes.

Astronomy benefits from the use of SDRs since they substantially increase the number of signals simultaneously analyzed. The very large array (VLA) telescope is a famous example of a radio telescope that uses multiple independent observations to enhance overall capabilities. The low relative cost combined with the potential for highly distributed deployment promises to allow SDRs to crowdsource the development of useful radio arrays. The challenge that arrays such as the VLA face comes in correlating the outputs of the sensors into an integrated sense of awareness. The hope is that the conjoint application of the DEAR framework with these new instruments will yield new capabilities for astronomy.

As with any natural phenomenon, radio wave emissions from Jupiter vary. They range from disorganized noise to strikingly organized “songs”. The latter types are typically characterized as either L-Bursts or S-Bursts, denoting whether their duration is long or short [19,20,21]. The immediate cause of these emissions is thought to be patterns in the emission of high energy particles from the planet. The mechanism responsible is a combination of the mixture properties of the planet as well as the planet’s interaction with the Jovian system.

The exact cause of the formation of Jupiter’s magnetic fields is still an active area of research. While it is impossible to directly interact with the planet, astronomers have some ideas as to what causes these emissions. They believe that Jupiter acts as a particle accelerator. Similar to how particle accelerators work in laboratories on Earth, the accelerators on Jupiter involve a conduit, a reservoir, and magnetic fields. Then, for any number of reasons, a strong magnetic field interacts with these reservoirs and pulls particles outwards into a band of emitted radiation. This creates a broad spectrum pattern detectable via radio telescopes.

These emissions form patterns within certain bandwidths of the radio spectrum. These patterns are quite differentiable from the background noise of typical Jovian radiation [20, 21], so much so that astronomers describe them as the “songs of Jupiter”. These patterns represent synchronization taking place in the emissions. Non-relativistic, cyclotron acceleration causes radiation that forms longer patterns called L-Bursts. Faster, relativistic, synchrotron acceleration causes radiation that forms shorter patterns called S-Bursts. The difference in duration is due to the increased speed of the particles, which translates to changes in the duration of these wave patterns. However, the wavelengths remain similar enough that observers can detect both using similar bandwidth receivers.

In many cases, the cause of these emissions is well studied and falls into four categories: Io-A, Io-B, non-Io-A, and non-Io-B. While other moons also interact with Jupiter’s fields, none compare in magnitude of interaction to Io, whose effect varies with its position along the Central Meridian Longitude (CML) as perceived by Earth-based telescopes [20, 21]. Io strongly interacts with Jupiter’s magnetic field as it proceeds through its orbit. Depending upon its position relative to Earth, the emission pattern varies slightly. However, in some cases the causes are still unknown. In these cases the emissions may be due to natural interaction between these reservoirs and Jupiter’s magnetic field.

While the causes of these burst phenomena are not conclusively known, there are some hypotheses. S-Bursts might emanate from deeper within the planet where higher energy sources exist. Conversely, L-Bursts might emanate from along the gassy outer surface of the planet. A primary interest of the authors of this paper is whether these two phenomena might interact with each other. Could the presence of S-Bursts somehow cause the emergence of L-Bursts?

2.4 Experimental Apparatus

Just because an instrument is capable of enhancing perception does not mean it is free from bias. As with the conventional dimensions of human perception, enhanced dimensions will also have certain innate biases. For example, an instrument observing the radio waves resulting from some astronomical phenomenon might suffer little information loss, yet it may still distort the signal due to red-shift or lose information due to modulation. As with any medium, there is also interference. So how can we know whether we are observing one phenomenon versus another?

As described above, this study focuses upon increasing human understanding of Jupiter’s radio emissions. With something such as Jupiter we have the advantage of being able to use conventional perception to observe the planet via telescopes using the visible spectrum. However, what if the phenomenon does not emit visible light or is too distant to be resolved using lenses? In such cases the degrees of separation between human perception and instrumentation increase. As this separation grows it becomes increasingly important that our approach to cognitive enhancement is one that is both representative and grounded. The higher frequency emissions that constitute visible light are difficult to receive by the same means with which we receive radio waves. This means that instruments often use different techniques to observe different spectra of light. In such cases how is it possible to maintain confidence in comparisons across media? It may be easier to, for example, use modulation to up-convert the frequency of radio waves to that of visible light than vice versa.

This experiment used SDR technology to monitor Jupiter’s radio emissions. To properly observe interplanetary emissions, researchers used antennas intended for receiving radio frequencies in the 20 MHz range. In addition, the antenna chosen needed to provide some gain, which helped to differentiate the signal of interest from terrestrial and interplanetary noise. SDR receivers require a computer to translate input into intelligible time series signals. They also typically require console software for interfacing with the SDR drivers. The combination of computer and console allowed operators to adjust SDR settings during observation. The computer, once equipped with storage media, also allowed researchers to record these time series values into files for subsequent analysis.
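The basic translation an SDR console performs on receiver output can be sketched as a power-spectrum estimate over complex IQ samples. The sketch below substitutes a synthetic tone for real receiver data; the frequencies and sample rate are illustrative, not those of the apparatus.

```python
import numpy as np

def power_spectrum(iq, sample_rate):
    """Estimate the power spectrum of complex IQ samples -- the basic
    translation an SDR console performs on raw receiver output."""
    spectrum = np.fft.fftshift(np.fft.fft(iq))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(iq), d=1.0 / sample_rate))
    power = np.abs(spectrum) ** 2 / len(iq)
    return freqs, power

# Synthetic stand-in for SDR output: a tone 5 kHz above the tuned
# center frequency, sampled at 48 kHz.
fs = 48_000
t = np.arange(4096) / fs
iq = np.exp(2j * np.pi * 5_000 * t)

freqs, power = power_spectrum(iq, fs)
peak_freq = freqs[np.argmax(power)]  # ≈ 5000 Hz, within one FFT bin
```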

2.5 Method

This was an observational study. The goal of the study was to determine whether there is a causal relationship between the S-Burst and L-Burst activity taking place in the Jovian system. This study observed samples of these behaviors and then analyzed them using the DEAR framework to determine whether causal relationships are present within these emission patterns. Armed with the results of the DEAR analysis, researchers then used them to inform their subsequent investigation of the Jovian system. This combination of approaches was used in an effort to test the effectiveness of a feedback approach that combined this decision support framework with an iterative research project.

The detect phase of the DEAR framework focuses upon detecting signal abnormalities. Since space is vast and it is difficult to ubiquitously observe the appropriate signals, the researchers had to narrow the focus down to time periods of substantial interest. Through studying the historic patterns of the Jovian system, researchers were able to determine when and where to listen for S-Burst and L-Burst activity. The historic Io-A and Io-B interactions result in fairly predictable intervals during which these emissions are easier to detect. Through listening to several of these periods of activity researchers were able to detect examples of these patterns. However, these examples of S-Burst and L-Burst activity were quite noisy and were difficult to analyze even though they were taking place in the correct bandwidth during the correct periods of time. For this reason, researchers instead extracted samples of S-Burst and L-Burst activity during a known Io-B type event from 3/10/2002 via online archives [19].
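A minimal stand-in for the detection step is a rolling z-score over an amplitude series. The window size, threshold, and synthetic data below are illustrative choices, not the study's parameters.

```python
import numpy as np

def detect_bursts(signal, window=50, threshold=3.0):
    """Flag sample indices whose amplitude deviates strongly from a
    rolling baseline -- a simple stand-in for the detect stage."""
    flags = []
    for i in range(window, len(signal)):
        baseline = signal[i - window:i]
        mu, sigma = np.mean(baseline), np.std(baseline)
        if sigma > 0 and abs(signal[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Synthetic quiet background with one injected burst-like spike.
rng = np.random.default_rng(0)
quiet = rng.normal(0.0, 1.0, 200)
quiet[150] += 15.0  # the spike at index 150 should appear among the flags
flags = detect_bursts(quiet)
```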

Once the samples were collected, researchers used various noise filters and signal amplification techniques within the SDR console. The durations of S-Bursts and L-Bursts range in the seconds, and for this reason researchers prioritized obtaining high quality signal samples even if the duration of the samples was quite short. To further increase the success of analysis researchers tried to obtain files with a sample rate in the millisecond range. In order to compare files from different sources with each other, down-sampling took place on the file with the shorter sample interval so that the two files had the same sample interval. Researchers used Audacity to perform this resampling. The amplitudes were composited into a text file for CCM analysis.
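The down-sampling step can be sketched as integer-factor decimation. The actual work was performed in Audacity; the series below are hypothetical and the integer-factor assumption is a simplification.

```python
import numpy as np

def downsample_to_match(fine, coarse):
    """Down-sample the series recorded at the shorter sample interval by
    an integer factor so both series share the same interval -- a rough
    stand-in for the resampling the researchers performed in Audacity."""
    factor = len(fine) // len(coarse)
    return np.asarray(fine)[::factor][:len(coarse)], np.asarray(coarse)

# Hypothetical amplitude series: one sampled 4x more finely than the other.
s_burst = np.sin(np.linspace(0, 20, 2000))
l_burst = np.sin(np.linspace(0, 20, 500))
a, b = downsample_to_match(s_burst, l_burst)  # both now 500 samples long
```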

The evaluation phase of the DEAR framework focuses upon assessing the presence of causal relationships between variables of interest. Upon processing the input files researchers used Sugihara’s CCM process to examine causal relationships between S-Burst and L-Burst activity.
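For illustration, the following is a simplified sketch of convergent cross-mapping in plain NumPy; the embedding dimension, neighbor weighting, and coupled logistic maps below are standard textbook choices, not the authors' exact implementation or settings. High cross-map skill for "x xmap y" is conventionally read as evidence that y influences x.

```python
import numpy as np

def time_delay_embed(x, dim, tau):
    """Build Takens time-delay embedding vectors from a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def ccm_skill(x, y, lib_len, dim=3, tau=1):
    """Estimate how well the attractor reconstructed from x cross-maps y
    (a minimal sketch of Sugihara's CCM, not the authors' code)."""
    X = time_delay_embed(x[:lib_len], dim, tau)
    targets = y[(dim - 1) * tau:lib_len]
    preds = []
    for i, v in enumerate(X):
        d = np.linalg.norm(X - v, axis=1)
        d[i] = np.inf                   # exclude the point itself
        nn = np.argsort(d)[:dim + 1]    # dim+1 nearest neighbors
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds.append(np.sum(w * targets[nn]) / np.sum(w))
    return np.corrcoef(preds, targets)[0, 1]

# Coupled logistic maps: x drives y, so y xmap x should show skill.
n = 400
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])

skill = ccm_skill(y, x, lib_len=300)  # y xmap x: evidence that x drives y
```

In practice one computes this skill over increasing library lengths and looks for convergence, as in the figures below.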

The assessment phase of the DEAR framework examines the risk involved in maintaining a policy given the presence of causal relationships. In this context the risk relates to the opportunity cost of using the radio telescope to examine the Jovian system when it could be used to explore other regions and epochs. To assess this risk researchers would use Kelly’s original equation to determine the relative risk that researchers would not obtain useful information in the Jovian region as compared to the default probability of obtaining information in other regions.
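As a sketch of this step, Kelly's criterion for a simple binary bet can be written as follows; the observing-time interpretation follows the paragraph above, but the probabilities and odds are hypothetical numbers, not values from the study.

```python
def kelly_fraction(p, b):
    """Kelly's original criterion: the optimal fraction of resources to
    commit to a bet won with probability p at net odds b-to-1."""
    q = 1.0 - p  # probability of obtaining nothing useful
    return (b * p - q) / b

# Hypothetical numbers: a 60% chance that observing the Jovian system
# pays off at even odds suggests committing 20% of observing time.
f = kelly_fraction(p=0.6, b=1.0)  # → 0.2
```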

Finally, the recommendation phase of the DEAR framework would issue policy recommendations for researchers operating the radio telescope based upon the results of the previous causal inference and related risk assessment. In this case the recommendation would either be to continue evaluating the Jovian system or to continue exploring other regions.

3 Results

Researchers ran CCM on an Io-B storm containing both S-Burst and L-Burst activity. The result of this analysis showed only weak evidence of a causal relationship between these two phenomena. The S-Burst xmap L-Burst result showed no to negative causality as the library length increased (Fig. 1). The correlation of causality for L-Burst xmap S-Burst approached 0.2, which suggests weak evidence for the case of S-Bursts influencing L-Bursts. However, even this weak trend disappears when examining longer library lengths (Fig. 2).

Fig. 1.
figure 1

The CCM xmap results for Jupiter S-Burst and L-Burst activity for library lengths <0,120>. The blue, top line shows L-Burst xmap S-Burst and the red, bottom line shows S-Burst xmap L-Burst. The blue, top line suggests weak evidence for the case that S-Burst may influence L-Burst behavior. (Color figure online)

Fig. 2.
figure 2

The CCM xmap results for Jupiter S-Burst and L-Burst activity for library lengths <0,120>. The blue, top line shows L-Burst xmap S-Burst and the red, bottom line shows S-Burst xmap L-Burst. There is no evidence for causality present. (Color figure online)

Since the evaluation phase did not yield evidence for causality, the assessment and recommendation phases of the experiment were much shorter. With no return on investment, the assessment phase would indicate continuing exploration elsewhere. However, the weak evidence for causality led the researchers to avoid excluding this possibility from future searches. For this reason, the policy recommendation would be to continue natural exploration, including occasionally revisiting the Jovian system.

4 Conclusions

There is not strong evidence for causation between S-Bursts and L-Bursts. However, this is not surprising, since there was no strong prior evidence for such a relationship either. Still, the combination of the DEAR framework with an iterative testing approach made it possible for the researchers to inform their investigative process. Nonetheless, this research also unveiled new challenges that this context presents.

The challenges faced in this experiment are characteristic both of the particular context and of applied decision making in general. Humans struggle to make decisions in contexts that are either too complex or where the time and space dimensions are substantially different from those of everyday scenarios. Astronomy is an area of study where the distances as well as the time scales offer substantial challenges. The vastness of space makes omniscience unfeasible. The rapid timescales of electromagnetic radiation combined with the long timescales of cosmological objects make causal inference nearly impossible without the aid of DSSs.

These limitations stem from the inability of human perceptual dimensions to create useful conceptualizations, which in turn impacts human attribution. For this reason, DSSs for these contexts require augmenting human embodiment as well as human cognition. Human embodiment is finite and restricted by our innate sensorimotor apparatuses and the related perceptual dimensions. In contrast, the extended mind is a concept that refers to the ability to augment human cognition by extending our perceptual awareness using information systems and, usually, visualization apparatuses. Extended embodiment would be a similar concept applied to human embodiment.

Artificial Intelligence (AI) is a technology that enables machines to perform certain automated actions where the likelihood of a given action taking place is based upon that action’s past success [22, 23]. The success of an action depends upon the objective function the AI uses. While objective functions vary quite a bit from application to application, they generally focus upon increasing the prevalence of a desired outcome and/or minimizing an undesirable outcome. The desirability of an outcome can be gauged based upon human experience. In so doing, this approach extends human embodiment by incorporating its quintessential objectives into synthetic systems. Thus, the two systems continue to collaborate via sharing common goals.

Science in general, and Astronomy in particular, are great places for researchers to focus upon building embodiment support systems (ESSs). The vastness of space both in terms of time and physical distance make this area of research a practical paradigm within which to develop DSS/ESS hybrid systems.

5 Next Steps

The future of DSSs will include a hybrid DSS/ESS. Such systems will extend AI within the operations of the systems in order to support extended embodiment [24, 25]. The DEAR framework will become a part of a larger DSS/ESS framework in which it will continue to serve its role as a routine useful for assessing causality. Similar to the way that DSSs facilitate cognition, this system will focus upon extending the human embodiment into new situated contexts.

The long-term goal of such a system will be to build a synthetic mental model upon which a decision making cadence can operate. Effective perceptual dimensions will be essential for this system to be able to form concepts. As this system discovers concepts it will incorporate them into the model. Upon initial concept formation, and at periodic intervals thereafter, the system will examine causal relationships between concepts. Based upon the success of exploring such relationships it will then also update its exploration strategy. Over time perhaps this system will learn that space is a vacuous ether filled with random connections that occasionally fire elucidating electrical impulses across different channels and wavelengths. It will develop interests and curiosities related to the patterns it finds in these impulses. It will use its curiosity to continue to refine its exploration until it becomes sufficiently mature to begin to offer insights to humans. At this point the system can then collaborate in a facilitative context that combines human and machine interaction.

In addition to incorporating new AI-based feedback techniques into the decision making framework, researchers plan to increase the scalability of the DEAR framework itself. These authors are pioneering techniques that will allow the CCM sub-process to scale to larger data sets. Through hierarchical approaches to the data, this will allow multivariate comparisons of causality to take place in quasi-linear time.

Such extensions will make it possible for both this system and the DEAR framework to integrate into new domains. As the internet of things (IoT) continues to develop, the challenge for researchers will be to find new DSS approaches useful for facilitating human interaction in these contexts. Such extended approaches could help build new types of technologies. Such technologies are already at work organizing information on the internet [26]. Maybe soon we will have such approaches capable of offering real-time analysis of gunshots through monitoring ambient noise. Or perhaps a more advanced understanding of patterns in human neuronal activity through using electroencephalography (EEG), electrocardiography (ECG), or functional magnetic resonance imaging (fMRI) technology [24, 27]. One thing is certain: the future of human and machine interaction is far from over.