DOI: 10.1145/3584931.3607001

Investigating Users' Inclination of Leveraging Mobile Crowdsourcing to Obtain Verifying vs. Supplemental Information when Facing Inconsistent Smart-city Sensor Information

Published: 14 October 2023

Abstract

Smart cities leverage sensor technology to monitor urban spaces in real time. Yet discrepancies in sensor data can leave users uncertain about actual city conditions. Mobile crowdsourcing, in which on-site individuals offer real-time details, is a potential remedy. It is unclear, however, whether users would prefer to use the on-site mobile crowd to verify inconsistent sensor data or to provide supplementary explanations for it. We conducted an online experiment with 100 participants to explore this question. Our results revealed a negative correlation between the perceived plausibility of sensor information and the inclination to use mobile crowdsourcing to obtain information. However, only around 80% of participants relied on crowdsourcing when they perceived the sensor information as implausible. Interestingly, even when participants believed the sensor data to be plausible, they still sought further details through crowdsourcing about half of the time. We also found that, when encountering sensor information they perceived as implausible, participants leaned more towards using the crowd for explanations (48% and 49% of instances) than for verification (38% and 32% of instances).




      Published In

      CSCW '23 Companion: Companion Publication of the 2023 Conference on Computer Supported Cooperative Work and Social Computing
      October 2023
      596 pages
ISBN: 9798400701290
DOI: 10.1145/3584931


      Publisher

Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. information consistency
      2. information seeking
      3. mobile crowdsourcing
      4. plausibility
      5. sense-making
      6. sensor plausibility
      7. smart city

      Qualifiers

      • Research-article
      • Research
      • Refereed limited


      Conference

      CSCW '23

      Acceptance Rates

      Overall Acceptance Rate 2,235 of 8,521 submissions, 26%


