Detecting Cybersecurity Threats: The Role of the Recency and Risk Compensating Effects

Published in: Information Systems Frontiers

Abstract

Detecting and responding to information security threats quickly and effectively is becoming increasingly crucial as modern attackers engineer their attacks to operate covertly, maintaining long-term access to victims’ systems after the initial penetration. We conducted an experiment to investigate various aspects of decision makers’ behavior in monitoring for threats in systems that may have been compromised by intrusions. In checking for threats, decision makers showed a recency effect: they deviated from optimal monitoring behavior by altering their checking pattern in response to recent random incidents. Decision makers’ monitoring behavior was also adversely affected by an increase in security, exhibiting risk compensating behavior through which heightened security leads to weakened security behaviors. Although the magnitude of the risk compensating behavior was significant, it was not enough to fully offset the benefits of the added security. We discuss implications for the theory and practice of information security.

Figs. 1–7 (images not shown)

Notes

  1. As we discuss later, from a probability perspective, the occurrence of security incidents is not completely without pattern. Rather, in most everyday situations the number of incidents in a unit of time (e.g., in a day) can be modelled using the Poisson distribution and the time between successive incidents can be modelled using the exponential distribution.
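As a minimal sketch of this model (the 30-second mean interval and 600-second horizon are illustrative values, not parameters from the paper), incident times with exponential inter-arrival gaps can be simulated as follows:

```python
import random

def simulate_incidents(mean_gap, horizon, seed=42):
    """Return incident timestamps in [0, horizon] when inter-incident
    times are exponential with the given mean, so that the number of
    incidents per unit of time is Poisson-distributed."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_gap)  # exponential gap to next incident
        if t > horizon:
            return times
        times.append(t)

incidents = simulate_incidents(mean_gap=30, horizon=600)
```

Because the exponential distribution is memoryless, the time already elapsed since the last incident carries no information about when the next one will occur, which is what makes reacting to recent outcomes suboptimal.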

  2. In fact, the NIST Cybersecurity Framework, one of the most widely used security frameworks (Cieslak, 2016), is voluntary to adopt and is deliberately designed not to be restrictive in its guidelines (National Institute of Standards and Technology, 2018).

  3. Denoting the average time between incidents by α, the inspection cost by C1, and the cost of every undetected unit of time by C2, the optimum inspection time x is approximately \(x = \sqrt{\frac{2\alpha C_1}{C_2}}\)

  4. In fact, this is a special case in which the optimum inspection time equals the average time between incidents. We intentionally chose C1 and C2 such that subjects’ intuitive answer to the problem coincided with the correct inspection time (i.e., “the incident happens once every 30 seconds, so I might as well inspect once every 30 seconds”). Note that this study is not particularly concerned with how accurately subjects assess the objectively correct inspection interval; rather, our goal is to understand differences in checking patterns in response to factors such as the outcomes of previous checks.
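The approximation in note 3 can be sketched in code. The values below (α = 30, C1 = 15, C2 = 1) are illustrative choices satisfying C1/C2 = α/2, which produces the special case where the optimum equals the mean incident interval; they are not necessarily the parameters used in the experiment:

```python
import math

def optimal_inspection_interval(alpha, c1, c2):
    """Approximate optimal inspection interval x = sqrt(2*alpha*c1/c2)
    (Barlow et al., 1963): alpha is the mean time between incidents,
    c1 the cost of one inspection, c2 the cost per undetected time unit."""
    return math.sqrt(2 * alpha * c1 / c2)

# When c1/c2 == alpha/2, the optimum coincides with alpha itself:
x = optimal_inspection_interval(alpha=30, c1=15, c2=1)  # → 30.0
```

The square-root form captures the underlying trade-off: more frequent checks raise total inspection cost linearly, while less frequent checks raise expected undetected time.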

  5. Prior research on risk compensating behavior shows that those who voluntarily adopt safety measures tend to behave more cautiously despite being safer (Scott et al., 2007). Although we do not test this hypothesis here, the voluntary nature of security in this task can therefore be expected to make the observed results more conservative than if all participants purchased security.

  6. Since mean check intervals varied across conditions and rounds, directly comparing standard deviations was not justified; thus, we used the coefficients of variation.
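The coefficient of variation is simply the standard deviation scaled by the mean, which makes dispersion comparable across conditions whose mean check intervals differ. A minimal sketch (the interval values are made up for illustration):

```python
import statistics

def coefficient_of_variation(intervals):
    """Standard deviation of check intervals divided by their mean,
    a unit-free dispersion measure."""
    return statistics.stdev(intervals) / statistics.mean(intervals)

cv = coefficient_of_variation([10, 20, 30])  # stdev 10 / mean 20 → 0.5
```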

  7. The Payment Card Industry Data Security Standard (2018), Requirement 10: Security Tracking and Monitoring (https://www.pcisecuritystandards.org/documents/PCI_DSS-QRG-v3_2_1.pdf).

  8. Readers are referred to Warkentin et al. (2012) for a discussion of the benefits and drawbacks of various research methodologies for studying risk compensating behaviors in information security.

References

  • Adams, J. G. (1988). Risk homeostasis and the purpose of safety regulation. Ergonomics, 31(4), 407–428

  • Balozian, P., & Leidner, D. (2017). Review of IS security policy compliance: Toward the building blocks of an IS security theory. ACM SIGMIS Database: The DATABASE for Advances in Information Systems, 48(3), 11–43

  • Barlow, R., Hunter, L., & Proschan, F. (1963). Optimum Checking Procedures. Journal of the Society for Industrial and Applied Mathematics, 11(4), 1078–1095. https://doi.org/10.1137/0111080

  • Baskerville, R. (1993). Information systems security design methods: Implications for information systems development. ACM Comput. Surv, 25(4), 375–414. https://doi.org/10.1145/162124.162127

  • Bazerman, M. H., & Moore, D. A. (2013). Judgment in Managerial Decision Making (8th ed.). Wiley

  • Bijttebier, P., Vertommen, H., & Steene, G. V. (2001). Assessment of cognitive coping styles. Clinical Psychology Review, 21(1), 85–104. https://doi.org/10.1016/S0272-7358(99)00041-0

  • Blakley, B., McDermott, E., & Geer, D. (2001). Information security is information risk management. Proceedings of the 2001 Workshop on New Security Paradigms - NSPW ’01, 97. https://doi.org/10.1145/508171.508187

  • Bodin, L. D., Gordon, L. A., & Loeb, M. P. (2008). Information security and risk management. Communications of the ACM, 51(4), 64–68. https://doi.org/10.1145/1330311.1330325

  • Boss, S., Kirsch, L. J., Angermeier, I., Shingler, R. A., & Boss, W. (2009). If someone is watching, I’ll do what I’m asked: Mandatoriness, control, and information security. European Journal of Information Systems, 18(2), 151–164

  • Brandimarte, L., Acquisti, A., & Loewenstein, G. (2013). Misplaced confidences: Privacy and the control paradox. Social Psychological and Personality Science, 4(3), 340–347

  • Butow, T., Kehoe, M., Holler, J., Lester, R., Keene, R., & Pritchard, J. (2018). Reducing MTTD for High-Severity Incidents: A How-To Guide for SREs (V. Wilson, Ed.; 1st ed.). O’Reilly Media. https://www.gremlin.com/oreilly-reducing-mttd-for-high-severity-incidents/

  • Cassell, M. M., Halperin, D. T., Shelton, J. D., & Stanton, D. (2006). Risk compensation: The Achilles’ heel of innovations in HIV prevention? BMJ, 332(7541), 605–607

  • Cerullo, V., & Cerullo, M. J. (2004). Business Continuity Planning: A Comprehensive Approach. Information Systems Management, 21(3), 70–78. https://doi.org/10.1201/1078/44432.21.3.20040601/82480.11

  • Chen, M., Qian, C., & Nakagawa, T. (2011). Periodic and Random Inspection Policies for Computer Systems. In T. Kim, H. Adeli, H. Kim, H. Kang, K. J. Kim, A. Kiumi, & B. H. Kang (Eds.), Software Engineering, Business Continuity, and Education (pp. 346–353). Berlin Heidelberg: Springer

  • Chen, P., Desmet, L., & Huygens, C. (2014). A study on advanced persistent threats. Communications and Multimedia Security, 63–72. https://doi.org/10.1007/978-3-662-44885-4_5

  • Chong, A., & Restrepo, P. (2017). Regulatory protective measures and risky behavior: Evidence from ice hockey. Journal of Public Economics, 151, 1–11

  • Christin, N., Egelman, S., Vidas, T., & Grossklags, J. (2012). It’s All about the Benjamins: An Empirical Study on Incentivizing Users to Ignore Security Advice. In G. Danezis (Ed.), Financial Cryptography and Data Security (pp. 16–30). Berlin Heidelberg: Springer

  • Cieslak, N. (2016, March 29). NIST cybersecurity framework adoption on the rise. Tenable Blog. https://www.tenable.com/blog/nist-cybersecurity-framework-adoption-on-the-rise

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). L. Erlbaum Associates

  • Croson, R., & Sundali, J. (2005). The Gambler’s Fallacy and the Hot Hand: Empirical Data from Casinos. Journal of Risk and Uncertainty, 30(3), 195–209. https://doi.org/10.1007/s11166-005-1153-2

  • Crossler, R. E., Johnston, A. C., Lowry, P. B., Hu, Q., Warkentin, M., & Baskerville, R. (2013). Future directions for behavioral information security research. Computers & Security, 32, 90–101. https://doi.org/10.1016/j.cose.2012.09.010

  • Evans, L. (1986). Risk Homeostasis Theory and Traffic Accident Data. Risk Analysis, 6(1), 81–94

  • Ezhei, M., & Tork Ladani, B. (2020). Interdependency Analysis in Security Investment against Strategic Attacks. Information Systems Frontiers, 22(1), 187–201. https://doi.org/10.1007/s10796-018-9845-8

  • FireEye (2019). M-Trends Cyber Security Trends. FireEye Mandiant Services. https://www.fireeye.com/current-threats/annual-threat-report/mtrends.html

  • Fox, C. R., & Ülkümen, G. (2011). Distinguishing two dimensions of uncertainty. In W. Brun, G. Keren, G. Kirkebøen, & H. Montgomery (Eds.), Perspectives on thinking, judging, and decision making. Universitetsforlaget

  • Fritz, C. O., Morris, P. E., & Richler, J. J. (2012). Effect size estimates: Current use, calculations, and interpretation. Journal of Experimental Psychology: General, 141(1), 2–18. https://doi.org/10.1037/a0024338

  • Galletta, D. F., & Zhang, P. (2009). Introducing AIS Transactions on Human-Computer Interaction. AIS Transactions on Human-Computer Interaction, 1(1), 7–12

  • Glendon, A. I., Hoyes, T., Haigney, D., & Taylor, R. (1996). A review of risk homeostasis theory in simulated environments. Safety Science, 22(1–3), 15–25

  • Gutzwiller, R. S., Fugate, S., Sawyer, B. D., & Hancock, P. (2015). The human factors of cyber network defense. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 59, 322–326

  • Hedlund, J. (2000). Risky business: Safety regulations, risk compensation, and individual behavior. Injury Prevention, 6(2), 82–90. https://doi.org/10.1136/ip.6.2.82

  • Herath, H., & Herath, T. C. (2008). Investments in Information Security: A Real Options Perspective with Bayesian Postaudit. Journal of Management Information Systems, 25(3), 337–375. https://doi.org/10.2753/MIS0742-1222250310

  • Herath, T., & Rao, H. R. (2009). Encouraging information security behaviors in organizations: Role of penalties, pressures and perceived effectiveness. Decision Support Systems, 47(2), 154–165

  • Ho, S. M., & Warkentin, M. (2017). Leader’s dilemma game: An experimental design for cyber insider threat research. Information Systems Frontiers, 19(2), 377–396. https://doi.org/10.1007/s10796-015-9599-5

  • Jarvik, M. (1951). Probability learning and a negative recency effect in the serial anticipation of alternative symbols. Journal of Experimental Psychology, 41(4), 291–297

  • Johnson, C. K., Gutzwiller, R. S., Ferguson-Walter, K., & Fugate, S. (2020). A Cyber-Relevant Table of Decision Making Biases and their Definitions (Version 1). Arizona State University. https://doi.org/10.13140/RG.2.2.14891.87846/1

  • Johnston, A. C., Warkentin, M., McBride, M., & Carter, L. (2016). Dispositional and situational factors: Influences on information security policy violations. European Journal of Information Systems, 25(3), 231–251

  • Kabacoff, R. I. (2015). R in Action: Data Analysis and Graphics with R (2nd ed.). Manning

  • Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux

  • Kam, H. J., Mattson, T., & Goel, S. (2020). A Cross Industry Study of Institutional Pressures on Organizational Effort to Raise Information Security Awareness. Information Systems Frontiers, 22(5), 1241–1264. https://doi.org/10.1007/s10796-019-09927-9

  • Kuo, H. C., & Varki, S. (2014). Are Firms Perceived As Safer After an Information Breach? ACR North American Advances, NA-42. http://acrwebsite.org/volumes/1017691/volumes/v42/NA-42

  • Kweon, E., Lee, H., Chai, S., & Yoo, K. (2021). The Utility of Information Security Training and Education on Cybersecurity Incidents: An empirical evidence. Information Systems Frontiers, 23(2), 361–373. https://doi.org/10.1007/s10796-019-09977-z

  • Laury, S. K., McInnes, M. M., & Swarthout, J. T. (2009). Insurance decisions for low-probability losses. Journal of Risk and Uncertainty, 39(1), 17–44. https://doi.org/10.1007/s11166-009-9072-2

  • Lee, R., & Lee, R. (2016). The Who, What, Where, When, Why and How of Effective Threat Hunting (SANS Institute: Reading Room - Analyst Papers). SANS. https://www.sans.org/reading-room/whitepapers/analyst/membership/36785

  • Maloney, S. (2018, September 1). What is an Advanced Persistent Threat (APT)? https://www.cybereason.com/blog/advanced-persistent-threat-apt

  • McNeil, B. J., Pauker, S. G., Sox, H. C., & Tversky, A. (1982). On the elicitation of preferences for alternative therapies. The New England Journal of Medicine, 306(21), 1259–1262. https://doi.org/10.1056/NEJM198205273062103

  • Meyer-Delius, J., & Liebl, L. (1976). Evaluation of Vigilance Related to Visual Perception. In T. B. Sheridan & G. Johannsen (Eds.), Monitoring Behavior and Supervisory Control (pp. 97–106). Springer US. https://doi.org/10.1007/978-1-4684-2523-9_9

  • National Institute of Standards and Technology (2018). Framework for Improving Critical Infrastructure Cybersecurity, Version 1.1 (NIST Cybersecurity White Paper). National Institute of Standards and Technology. https://doi.org/10.6028/NIST.CSWP.04162018

  • Northcraft, G. B., & Neale, M. A. (1987). Experts, amateurs, and real estate: An anchoring-and-adjustment perspective on property pricing decisions. Organizational Behavior and Human Decision Processes, 39(1), 84–97. https://doi.org/10.1016/0749-5978(87)90046-X

  • Peltzman, S. (1975). The Effects of Automobile Safety Regulation. Journal of Political Economy, 83(4), 677–725. https://doi.org/10.1086/260352

  • Posey, C., Roberts, T., Lowry, P. B., Bennett, B., & Courtney, J. (2013). Insiders’ protection of organizational information assets: Development of a systematics-based taxonomy and theory of diversity for protection-motivated behaviors (SSRN Scholarly Paper ID 2173642). Social Science Research Network. https://papers.ssrn.com/abstract=2173642

  • Rabin, M. (2002). Inference by Believers in the Law of Small Numbers. The Quarterly Journal of Economics, 117(3), 775–816

  • Rajivan, P., Aharonov-Majar, E., & Gonzalez, C. (2020). Update now or later? Effects of experience, cost, and risk preference on update decisions. Journal of Cybersecurity, 6(1), tyaa002. https://doi.org/10.1093/cybsec/tyaa002

  • Renaud, K., & Warkentin, M. (2017). Risk Homeostasis in Information Security: Challenges in Confirming Existence and Verifying Impact. Proceedings of the 2017 New Security Paradigms Workshop, 57–69. https://doi.org/10.1145/3171533.3171534

  • Reyna, V. F., Chick, C. F., Corbin, J. C., & Hsia, A. N. (2014). Developmental reversals in risky decision making: Intelligence agents show larger decision biases than college students. Psychological Science, 25(1), 76–84. https://doi.org/10.1177/0956797613497022

  • Sagberg, F., Fosser, S., & Saetermo, I. A. (1997). An investigation of behavioural adaptation to airbags and antilock brakes among taxi drivers. Accident; Analysis and Prevention, 29(3), 293–302. https://doi.org/10.1016/S0001-4575(96)00083-8

  • Saltzer, J. H., & Schroeder, M. D. (1975). The protection of information in computer systems. Proceedings of the IEEE, 63(9), 1278–1308

  • Scott, M. D., Buller, D. B., Andersen, P. A., Walkosz, B. J., Voeks, J. H., Dignan, M. B., & Cutter, G. R. (2007). Testing the risk compensation hypothesis for safety helmets in alpine skiing and snowboarding. Injury Prevention, 13(3), 173–177. https://doi.org/10.1136/ip.2006.014142

  • Sheridan, T. B., & Johannsen, G. (Eds.). (1976). Monitoring Behavior and Supervisory Control (1st ed.). Springer

  • Shimao, H., Khern-am-nuai, W., & Kannan, K. N. (2019). So You Think You Are Safe: Implications of Quality Uncertainty in Security Software (SSRN Scholarly Paper ID 2621846). Social Science Research Network. https://papers.ssrn.com/abstract=2621846

  • Siponen, M., & Willison, R. (2009). Information security management standards: Problems and solutions. Information & Management, 46(5), 267–270. https://doi.org/10.1016/j.im.2008.12.007

  • Slovic, P. (2010). The Feeling of Risk: New Perspectives on Risk Perception. Earthscan

  • Smith, V. L. (1994). Economics in the laboratory. Journal of Economic Perspectives, 8(1), 113–131. https://doi.org/10.1257/jep.8.1.113

  • Stafford, T., Deitz, G., & Li, Y. (2018). The role of internal audit and user training in information security policy compliance. Managerial Auditing Journal, 33(4), 410–424

  • Straub, D. W., & Welke, R. J. (1998). Coping with systems risk: Security planning models for management decision making. MIS Quarterly, 441–469

  • Streff, F. M., & Geller, E. S. (1988). An experimental test of risk compensation: Between-subject versus within-subject analyses. Accident; Analysis and Prevention, 20(4), 277–287. https://doi.org/10.1016/0001-4575(88)90055-3

  • Tola, B., Jiang, Y., & Helvik, B. E. (2017). Failure process characteristics of cloud-enabled services. 2017 9th International Workshop on Resilient Networks Design and Modeling (RNDM), 1–7. https://doi.org/10.1109/RNDM.2017.8093033

  • Trimpop, R. M. (1994). The Psychology of Risk Taking Behavior. Elsevier

  • Tversky, A., & Kahneman, D. (1971). Belief in the law of small numbers. Psychological Bulletin, 76(2), 105–110. https://doi.org/10.1037/h0031322

  • Verizon (2018). 2018 data breach investigations report (11th ed.). Verizon. verizonenterprise.com/DBIR2018

  • Wang, J., Gupta, M., & Rao, H. R. (2015). Insider threats in a financial institution: Analysis of attack-proneness of information systems applications. MIS Q, 39(1), 91–112. https://doi.org/10.25300/MISQ/2015/39.1.05

  • Warkentin, M., Crossler, R. E., & Malimage, N. (2012). Are You Sure You’re Safe? Perceived Security Protection as an Enabler of Risky IT Behavior. Proceedings of the 2012 International Federation of Information Processing (IFIP) International Workshop on Information Systems Security Research, Dewald Roode Information Security Workshop

  • Warkentin, M., Goel, S., Williams, K. J., & Renaud, K. (2018). Are we Predisposed to Behave Securely? Influence of Risk Disposition on Individual Security Behaviours. ECIS, 25

  • Weeger, A., Wang, X., Gewald, H., Raisinghani, M., Sanchez, O., Grant, G., & Pittayachawan, S. (2020). Determinants of Intention to Participate in Corporate BYOD-Programs: The Case of Digital Natives. Information Systems Frontiers, 22(1), 203–219. https://doi.org/10.1007/s10796-018-9857-4

  • Wickens, C. D., Gordon, S. E., Liu, Y., et al. (1998). An introduction to human factors engineering

  • Wilde, G. (1994). Target risk: Dealing with the danger of death, disease and damage in everyday decisions. Castor & Columba

  • Wilde, G. (1998). Risk homeostasis theory: An overview. Injury Prevention, 4(2), 89–91

  • Zhang, P., Li, N., Scialdone, M., & Carey, J. (2009). The intellectual advancement of human-computer interaction research: A critical assessment of the MIS literature (1990–2008). AIS Transactions on Human-Computer Interaction, 1(3), 55–107

Author information

Corresponding author

Correspondence to Roozmehr Safi.

Ethics declarations

Conflict of interest

The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Safi, R., Browne, G.J. Detecting Cybersecurity Threats: The Role of the Recency and Risk Compensating Effects. Inf Syst Front 25, 1277–1292 (2023). https://doi.org/10.1007/s10796-022-10274-5
