Abstract
We explore technological bases for combining argumentation with information fusion techniques to improve intelligence analysis. We review various tools, framed by several examples of modern intelligence analyses drawn from different environments. Current tools fail to support the computational association of relations among entities that is needed to assemble an integrated situational picture. Most tools operate on single-source entity streams, automatically linking analyses between bounded entity pairs and enabling limited levels of "data fusion," but with limited rigor. These tools also often accept pre-processed entity extractions as correct, and they identify intuitive associations among entities largely as if uncertainty did not exist. Because they discover only those relations with little uncertainty and few entity associations, the remaining complexities are left to human analysts to resolve. This leads to cognitive overload, as analysts must manually assemble the selected situational interpretations into a comprehensive narrative. Our goal is to automate the integration of complex hypotheses. We review the literature on computational support for argumentation and, as part of a combined approach toward an integrated functional design, nominate a unique belief- and story-based subsystem designed to support hybrid argumentation. To deal with the largely textual data foundation of these intelligence analyses, we describe how a previously author-developed 'hard plus soft' information fusion system (combining sensor/hard and textual/soft information) could be integrated into that functional design. Combining these two unique capabilities yields a scheme that arguably overcomes many of the deficiencies we cite and offers considerable improvement in the efficiency and effectiveness of intelligence analysis.
Notes
- 1.
- 2.
- 3.
- 4.
We like Stanford’s definition here (http://plato.stanford.edu/entries/reasoning-defeasible/): “Reasoning is defeasible when the corresponding argument is rationally compelling but not deductively valid. The truth of the premises of a good defeasible argument provides support for the conclusion, even though it is possible for the premises to be true and the conclusion false. In other words, the relationship of support between premises and conclusion is a tentative one, potentially defeated by additional information.”
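The defeat relation in this definition can be sketched computationally. The following is our own minimal illustration (not from the chapter, and not a real argumentation library): an argument is accepted as long as no accepted defeater attacks it, so a rationally compelling conclusion can be overturned by additional information.

```python
# Minimal sketch of defeasible acceptance: an argument stands
# unless some accepted defeater attacks it.
class Argument:
    def __init__(self, claim):
        self.claim = claim
        self.defeaters = []  # arguments that attack this one

    def accepted(self):
        # Accepted iff no defeater is itself accepted.
        return not any(d.accepted() for d in self.defeaters)

flies = Argument("Tweety flies (birds normally fly)")
print(flies.accepted())   # True: the premises support the conclusion

# New information arrives and defeats the argument.
penguin = Argument("Tweety is a penguin")
flies.defeaters.append(penguin)
print(flies.accepted())   # False: the support was only tentative
```

The point is the tentativeness: the premises never became false; the additional premise simply defeated the inference.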
- 5.
We also see the (necessary) balancing of Pro and Contra arguments as another strength of these argumentation methods; to some degree it is a built-in preventative against the human foible of confirmation bias.
- 6.
Credal will be seen to mean belief; in the context of conducting analysis, the term denotes a (human's) conviction of the truth of some statement or of the reality of some entity or phenomenon, especially when based on examination of evidence.
- 7.
Pignistic is a term coined by Smets, drawn from the Latin pignus ("bet"); it refers to the probability that a rational person would assign to an option when required to make a decision.
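Smets' pignistic transformation maps a belief mass function to a probability distribution by spreading each focal set's mass evenly over its elements (renormalising if any mass sits on the empty set). A short sketch, using an illustrative mass function of our own devising:

```python
def pignistic(masses):
    """Smets' pignistic transformation: BetP(x) = sum over focal sets A
    containing x of m(A) / (|A| * (1 - m(empty set)))."""
    conflict = masses.get(frozenset(), 0.0)
    betp = {}
    for focal, m in masses.items():
        if not focal:
            continue  # mass on the empty set only renormalises
        share = m / (len(focal) * (1.0 - conflict))
        for x in focal:
            betp[x] = betp.get(x, 0.0) + share
    return betp

# Illustrative evidence: 0.6 committed to 'a', 0.4 ambiguous between a and b.
m = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
print(pignistic(m))   # {'a': 0.8, 'b': 0.2}
```

The ambiguous mass is split evenly, which is exactly the "betting" probability a rational decision-maker would use.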
- 8.
An argument mapping tool developed at the University of Dundee; see http://www.arg-tech.org/index.php/projects/.
- 9.
For the Reader: our reviews in the next section are running commentaries on selected papers from the literature that address each reviewed topic; in various places any emphasis provided is our own. Some excerpts from the original papers are included without quotation marks.
- 10.
The F measure is the harmonic mean of precision and recall, and can be viewed as a compromise between recall and precision. It is high only when both recall and precision are high.
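The harmonic-mean property described above is easy to verify directly; a brief sketch with illustrative values of our own:

```python
def f_measure(precision, recall):
    """F measure: harmonic mean of precision and recall.
    High only when both inputs are high."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f_measure(0.9, 0.9))   # 0.9  -- both high, F is high
print(f_measure(0.9, 0.1))   # 0.18 -- one low value drags F down
```

Unlike the arithmetic mean (which would give 0.5 in the second case), the harmonic mean punishes imbalance, which is why it serves as a compromise between recall and precision.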
- 11.
See the website listed in Table 2.3 for further details on these systems.
- 12.
Lockheed's Advanced Technology Laboratories; see http://www.lockheedmartin.com/us/atl.html.
References
Schlacter, J., et al. (2015), Leveraging Topic Models to Develop Metrics for Evaluating the Quality of Narrative Threads Extracted from News Stories, 6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015) and the Affiliated Conferences, Procedia Manufacturing, Volume 3.
Allen, J. (1995), Natural Language Understanding (2nd ed.). Benjamin-Cummings Publishing Co., Inc., Redwood City, CA, USA
Andrews, C. and North, C. (2012), “Analyst’s Workspace: An Embodied Sensemaking Environment For Large, High-Resolution Displays”, Proc. 2012 IEEE Conference on Visual Analytics Science and Technology (VAST), Seattle, WA.
Bex, F., S. van den Braak, H. van Oostendorp, H. Prakken, B. Verheij, and G. Vreeswijk (2007a), “Sense-making software for crime investigation: how to combine stories and arguments?,” Law, Probability and Risk, vol. 6, iss. 1-4, pp. 145-168.
Bex, F., et al. (2007b), Sense-making software for crime investigation: how to combine stories and arguments?, Law, Probability and Risk.
Bex, F. (2013) Abductive Argumentation with Stories. ICAIL-2013, in: Workshop on Formal Aspects of Evidential Inference, 2013
Bier, E.A., Ishak, E.W., and Chi, E. (2006), “Entity Workspace: An Evidence File That Aids Memory, Inference, and Reading”, In ISI, San Diego, CA, 2006, pp. 466-472.
Blei, D.M., Ng, A. Y., and Jordan, M. I. (2003), “Latent Dirichlet allocation,” Journal of Machine Learning Research, vol. 3, pp. 993–1022.
Corner, A. and Hahn, U. (2009)., Evaluating Science Arguments: Evidence, Uncertainty, and Argument Strength, Journal of Experimental Psychology Applied 15(3):199-212.
Croskerry, P. (2009), A Universal Model of Diagnostic Reasoning, Academic Medicine, Vol 84, No 8, pp1022–8.
Dahl, E. (2013), Intelligence and Surprise Attack: Failure and Success from Pearl Harbor to 9/11 and Beyond, Georgetown University Press.
Date, K., Gross, G. A., Khopkar, S, Nagi, R. and K. Sambhoos (2013a), “Data association and graph analytical processing of hard and soft intelligence data”, Proceedings of the 16th International Conference on Information Fusion (Fusion 2013), Istanbul, Turkey
Date, K., Gross, G.A., and Nagi, R. (2013b), “Test and Evaluation of Data Association Algorithms in Hard+Soft Data Fusion,” Proc. of the 17th International Conference on Information Fusion, Salamanca, Spain.
Djulbegovic, B., et al. (2012), Dual processing model of medical decision-making, BMC Medical Informatics and Decision Making, Vol. 12
Faloutsos, C., McCurley, K.S., and Tomkins, A. (2004), “Fast discovery of connection subgraphs,” Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 118-127.
Feng, V.W. and Hirst, G. (2011), Classifying Arguments by Scheme, Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, pages 987–996, Portland, Oregon.
Gordon, T.F. (1996), Computational Dialectics, In Hoschka, P., editor, Computers as Assistants - A New Generation of Support Systems, pages 186–203., Lawrence Erlbaum Associates.
Gross, G., et al. (2014), Systemic Test and Evaluation of a Hard+Soft Information Fusion Framework: Challenges and Current Approaches, in: Fusion 2014, International Conference on Information Fusion.
Haenni, R. (2001) Cost-bounded argumentation, International Journal of Approximate Reasoning, 26(2):101–127.
Hastings, A.C. (1963), A Reformulation of the Modes of Reasoning in Argumentation, Ph.D. dissertation, Northwestern University, Evanston, Illinois.
Headquarters, Dept of Army (2010), Army Field Manual 5-0, The Operations Process
Hossain, M.S., Akbar, M., and Polys, N.F. (2012a), “Narratives in the network: interactive methods for mining cell signaling networks,” Journal of Computational Biology 19(9):1043-1059.
Hossain, M.S., et al. (2012b), Connecting the dots between PubMed abstracts, PLoS ONE 7(1).
Hossain, M. S., Butler, P., Boedihardjo, A. P., and Ramakrishnan, N. (2012c). Storytelling in entity networks to support intelligence analysts, Proceedings of the 18th ACM SIGKDD international conference on Knowledge discovery and data mining.
Klein, G., et al. (2006), Making Sense of Sensemaking 2: A Macrocognitive Model, IEEE Intelligent Systems, Volume:21, Issue: 5.
Kumar, D., Ramakrishnan, N., Helm, R. F., and Potts, M. (2008), Algorithms for storytelling, IEEE Transactions on Knowledge and Data Engineering, 20(6), 736-751.
Llinas, J. (2014a), Reexamining Information Fusion--Decision Making Inter-dependencies, in Proc. of the IEEE Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), San Antonio, TX.
Llinas, J, Nagi, R., Hall, D.L., and Lavery, J. (2010), “A Multi-Disciplinary University Research Initiative in Hard and Soft Information Fusion: Overview, Research Strategies and Initial Results”, Proc. of the International Conference on Information Fusion, Edinburgh, UK.
Llinas, J. (2014b), A Survey of Automated Methods for Sensemaking Support, Proc of the SPIE Conf on Next-Generation Analyst, Baltimore, MD
Mani, I. and Klein, G.L. (2005), Evaluating Intelligence Analysis Arguments in Open-ended Situations, Proc of the Intl Conf on Intelligence Analysis, McLean Va.
Mochales, R. and Moens, M. (2008), Study on the Structure of Argumentation in Case Law, Proceedings of the 2008 Twenty-First Annual Conference on Legal Knowledge and Information Systems: JURIX 2008
Mochales-Palau, R. and Moens, M. (2007), Study on Sentence Relations in the Automatic Detection of Argumentation in Legal Cases, Proceedings of the 2007 Twentieth Annual Conference on Legal Knowledge and Information Systems: JURIX 2007
Moens, M., et al (2007), Automatic Detection of Arguments in Legal Texts, Proceedings of the 11th international conference on Artificial intelligence and Law.
Moens, M. (2013), Argumentation Mining: Where are we now, where do we want to be and how do we get there?, FIRE '13 Proceedings of the 5th 2013 Forum on Information Retrieval Evaluation.
Ng, H.T., and Mooney, R.J. (1990), On the Role of Coherence in Abductive Explanation, in Proceedings of the Eighth National Conference on Artificial Intelligence (AAAI-90)
Pirolli, P. and Card, S. (2005), The Sensemaking Process and Leverage Points for Analyst Technology as Identified Through Cognitive Task Analysis, In Proceedings of 2005 International Conference on Intelligence Analysis (McLean, VA, USA, May, 2005). pp.337-342, Boston, MA.
Reed, C. and Rowe, G. (2004), ARAUCARIA: Software for Argument Analysis, Diagramming and Representation, International Journal on AI Tools 13 (4) 961–980.
Schum, D. (2005), Narratives in Intelligence Analysis: Necessary but Often Dangerous, University College London Studies in Evidence Science.
Shahaf, D., and Guestrin, C., (2010), Connecting the dots between news articles, Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining: 623-632.
Shahaf, D., Guestrin, C., and Horvitz, E. (2012), “Trains of thought: Generating information maps,” Proceedings of the 21st International Conference on World Wide Web: 899-908.
Shahaf, D. et al (2013), Information cartography: creating zoomable, large-scale maps of information. Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining: 1097-1105.
Shapiro, S. (2012), Tractor: Toward Deep Understanding of Short Intelligence Messages, University Seminar, available at: http://studylib.net/doc/10515245/tractor-toward-deep-understanding-of-short-intelligence-m., 2012
Simari, G and Rahwan, I (2009) Argumentation in artificial intelligence, Springer.
Smets, P. (1994), The transferable belief model, Artificial Intelligence, Volume 66, Issue 2, Pages 191-234.
Stasko, J., Gorg, C., Liu, Z., and Singhal, K. (2013), “Jigsaw: Supporting Investigative Analysis through Interactive Visualization”, Proc. 2007 IEEE Conference on Visual Analytics Science and Technology (VAST), Sacramento, CA.
Suthers, D. et al. (1995), ‘Belvedere: Engaging students in critical discussion of science and public policy issues’, in AI-Ed 95, the 7th World Conference on Artificial Intelligence in Education, pp. 266–273.
Thagard, P. (2000), Probabilistic Networks and Explanatory Coherence, Cognitive Science Quarterly 1, 93-116
Toniolo, A., Ouyang RW, Dropps T, Allen JA, Johnson DP, de Mel G, Norman TJ, (2014), Argumentation-based collaborative intelligence analysis in CISpaces, in Frontiers in Artificial Intelligence and Applications; Vol. 266, IOS Press
Twardy, C. (2004): Argument maps improve critical thinking. Teaching Philosophy 27 (2):95--116
van den Braak, S.W., et al. (2007), AVERs: an argument visualization tool for representing stories about evidence, Proceedings of the 11th International Conference on Artificial Intelligence and Law, Stanford, CA.
van den Braak, S.W. (2010), Sensemaking software for crime analysis, Dissertation, University of Utrecht, The Netherlands.
van den Braak, S.W., et al. (2006), A critical review of argument visualization tools: do users become better reasoners?, ECAI-06 CMNA Workshop.
Walton, D., Reed, C., and Macagno, F. (2008), Argumentation Schemes, Cambridge University Press.
Walton, D. and Gordon, T.F. (2012), The Carneades Model of Argument Invention, Pragmatics & Cognition, 20(1), Amsterdam, The Netherlands.
Acknowledgement
This publication results from research supported by the Naval Postgraduate School Assistance Grant No. N00244-15-1-0051 awarded by the NAVSUP Fleet Logistics Center San Diego (NAVSUP FLC San Diego). The views expressed in written materials or publications, and/or made by speakers, moderators, and presenters, do not necessarily reflect the official policies of the Naval Postgraduate School nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. Government.
Copyright information
© 2017 Springer International Publishing AG
Cite this chapter
Llinas, J., Rogova, G., Barry, K., Hingst, R., Gerken, P., Ruvinsky, A. (2017). Reexamining Computational Support for Intelligence Analysis: A Functional Design for a Future Capability. In: Lawless, W., Mittu, R., Sofge, D., Russell, S. (eds) Autonomy and Artificial Intelligence: A Threat or Savior?. Springer, Cham. https://doi.org/10.1007/978-3-319-59719-5_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-59718-8
Online ISBN: 978-3-319-59719-5