DOI: 10.1145/3508396.3512886 · HotMobile Conference Proceedings
short-paper
Open Access

A neural-based bandit approach to mobile crowdsourcing

Published: 9 March 2022

ABSTRACT

Mobile crowdsourcing has long promised to utilize the power of mobile crowds to reduce the time and monetary cost required to perform large-scale location-dependent tasks, e.g., environmental sensing. Assigning the right tasks to the right users, however, is a longstanding challenge: different users will be better suited for different tasks, which in turn will have different contributions to the overall crowdsourcing goal. Even worse, these relationships are generally unknown a priori and may change over time, particularly in mobile settings. The diversity of devices in the Internet of Things and diversity of new application tasks that they may run exacerbate these challenges. Thus, in this paper, we formulate the mobile crowdsourcing problem as a Contextual Combinatorial Volatile Multi-armed Bandit problem. Although prior work has attempted to learn the optimal user-task assignment based on user-specific side information, such formulations assume known structure in the relationships between contextual information, user suitability for each task, and the overall crowdsourcing goal. To relax these assumptions, we propose a Neural-MAB algorithm that can learn these relationships. We show that in a simulated mobile crowdsourcing application, our algorithm significantly outperforms existing multi-armed bandit baselines in settings with both known and unknown reward structures.
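The setting the abstract describes — a volatile pool of candidate user-task pairs each round, an unknown (possibly nonlinear) mapping from context to reward, and a neural estimator that guides which K assignments to make — can be sketched in a few lines. Everything below (the network sizes, the epsilon-greedy exploration rule, and the synthetic quadratic reward) is an illustrative assumption for intuition only, not the paper's Neural-MAB algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, K, N_ARMS, ROUNDS = 8, 16, 3, 20, 300  # context dim, hidden width, picks/round

class NeuralEstimator:
    """Tiny two-layer MLP trained online to map a context to a reward guess."""
    def __init__(self, d, h, rng):
        self.W1 = rng.normal(0, 0.5, (h, d)); self.b1 = np.zeros(h)
        self.W2 = rng.normal(0, 0.5, h);      self.b2 = 0.0
    def forward(self, x):
        h = np.maximum(0.0, self.W1 @ x + self.b1)   # ReLU hidden layer
        return self.W2 @ h + self.b2, h
    def update(self, x, r, lr=0.05):
        pred, h = self.forward(x)                    # one SGD step on squared error
        g = pred - r
        grad_h = g * self.W2 * (h > 0.0)
        self.W2 -= lr * g * h;               self.b2 -= lr * g
        self.W1 -= lr * np.outer(grad_h, x); self.b1 -= lr * grad_h

theta = rng.normal(size=D) / np.sqrt(D)
def true_reward(x):               # hidden nonlinear reward the learner must discover
    return float(x @ theta) ** 2

est = NeuralEstimator(D, H, rng)
total = oracle = 0.0
for t in range(ROUNDS):
    arms = rng.normal(size=(N_ARMS, D))          # volatile arms: a fresh set each round
    eps = max(0.05, 1.0 / (1.0 + 0.05 * t))      # decaying exploration rate
    if rng.random() < eps:
        chosen = rng.choice(N_ARMS, size=K, replace=False)
    else:
        scores = np.array([est.forward(x)[0] for x in arms])
        chosen = np.argsort(scores)[-K:]         # greedy top-K user-task assignment
    for i in chosen:
        r = true_reward(arms[i])
        est.update(arms[i], r)
        total += r
    oracle += np.sort([true_reward(x) for x in arms])[-K:].sum()

print(f"avg reward/round: learner {total / ROUNDS:.3f}, oracle {oracle / ROUNDS:.3f}")
```

Because the arms are redrawn every round, no per-arm statistics can be kept; the network must generalize across contexts, which is the motivation for a neural rather than tabular or linear bandit in this setting.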


• Published in

  HotMobile '22: Proceedings of the 23rd Annual International Workshop on Mobile Computing Systems and Applications
  March 2022, 137 pages
  ISBN: 978-1-4503-9218-1
  DOI: 10.1145/3508396

  Copyright © 2022 Owner/Author. This work is licensed under a Creative Commons Attribution 4.0 International License.

  Publisher

  Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • short-paper

  Acceptance Rates

  Overall acceptance rate: 96 of 345 submissions, 28%
• Article Metrics

  • Downloads (last 12 months): 159
  • Downloads (last 6 weeks): 17
