
Trust & Self-Organising Socio-technical Systems

Trustworthy Open Self-Organising Systems

Part of the book series: Autonomic Systems ((ASYS))

Abstract

We present our theory of trust, its components and dimensions, and apply it to trust in complex, dynamic socio-technical systems and in their self-organising, emergent outcomes. Specifically, we apply our theory to ICT-based systems, where a “Social Order” is no longer fully “spontaneous”, the result of an invisible hand impinging on individual and selfish decisions; rather, it is based on programmed interactions, algorithmic procedures and big data. Since trust cannot be fully programmable and predictable, how can we build it in such a complex and dynamic system? Our research questions include: is it necessary that people “understand” the underlying mechanisms they are relying on? What kind of information about forecasts or future projections should be provided and adjusted? What role do simulation and serious games play in learning to understand and to expect? Will there be algorithms working on the micro-processes and producing the emergent organisation, and if so, how effective and reliable will they be? There are at least two different levels of trust in complex systems and in their functioning processes: trust in the emergent order and trust in the micro-layer rules. Are the system’s rules and resulting equilibria fair and equity-inspired, relative to the interests of the groups and subjects involved? A complex, cognitive model of trust is needed for this analysis.
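The two levels of trust distinguished above can be given a minimal computational reading. The sketch below is purely illustrative and not from the chapter; the class name, the two attributes, and the min-aggregation rule are all our own assumptions.

```python
from dataclasses import dataclass

@dataclass
class TrustInSystem:
    """Illustrative two-level trust judgement (all names hypothetical).

    Separates trust in the micro-layer rules (the programmed interactions
    and algorithmic procedures) from trust in the emergent global order.
    """
    trust_in_micro_rules: float      # confidence in the micro-layer rules, in [0, 1]
    trust_in_emergent_order: float   # confidence in the emergent order, in [0, 1]

    def overall(self) -> float:
        # One simple (hypothetical) aggregation: trust in the whole system
        # is bounded by its weakest level, so distrust at either level
        # dominates the judgement.
        return min(self.trust_in_micro_rules, self.trust_in_emergent_order)

t = TrustInSystem(trust_in_micro_rules=0.9, trust_in_emergent_order=0.4)
print(t.overall())  # 0.4
```

The min-rule is only one possible choice; it encodes the intuition that confidence in well-designed micro-rules does not compensate for distrust of the order they produce, and vice versa.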

Notes

  1.

    Consider for example the excellent definition of “Organic Computing System” in Müller-Schloer’s document on “Organic Computing Initiative” (2004): an “Organic Computing system is a technical system which adapts dynamically to the current conditions of its environment. It is self-organising, self-configuring, self-optimising, self-healing, self-protecting, self-explaining, and context-aware”. See also [38].

  2.

    We also agree with the main claims there: [what matters is] “the human as the user of self-organising and self-adaptive systems and the usability of such systems”; “Functional correctness, security, safety, and reliability are facets that have to be ensured for the system’s components as well as for the system as a whole. The classical notions of trust and reputation in MAS also apply to this relationship between system components. The relationship between the system and the user is influenced by the transparency and consistency of the system towards the user and most importantly by its usability, i.e. the way the user is informed about self-organising processes and is allowed to interact with the system.” However, we see additional problems here, such as the participatory and hybrid nature of the system, or the hidden interests behind the “spontaneous” order.

  3.

    Consider the celebrated sentence of Epicurus: “It is not our friends’ help that helps us, it is the confidence of their help.”

  4.

    To see how strong such identification was, see for example [21], a review of “Trust on the Internet”, a book on Internet security which focuses solely on the topic of security.

  5.

    Not only is design never neutral – it favours the interests of one party over another, with conflicting interests [23] – but the algorithms managing multi-agent equilibria and dynamics have the same, though more hidden, feature.

  6.

    “Kripta” is the apt term – introduced by Bacharach and Gambetta [24] – to express that trust presupposes, and is ascribed to, some non-observable, hidden ‘quality’ of the trustee. We can observe her/his/its behaviour (“manifesta”), but we rely on its control-devices. In our model, the “internal” attribution of trust and its ascription to ‘inner’ qualities are particularly important: they can be motivational (e.g. honesty, values, or friendship), cognitive (e.g. expertise, or competence), and also performative (e.g. skills).
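To make the kripta/manifesta distinction concrete, here is a minimal sketch, not from the chapter: all names, the three quality labels, and the running-average update rule are hypothetical. A trustor estimates the trustee’s hidden qualities from observed behaviour and grounds trust in those estimates.

```python
from dataclasses import dataclass, field

# Hypothetical labels for the three families of 'inner' qualities
# named in the note: motivational, cognitive, performative.
KRIPTA_KINDS = ("motivational", "cognitive", "performative")

@dataclass
class TrusteeModel:
    """Trustor's model of a trustee: hidden qualities ('kripta')
    estimated from observable behaviour ('manifesta')."""
    # Each hidden quality starts at an uncommitted estimate of 0.5.
    kripta: dict = field(default_factory=lambda: {k: 0.5 for k in KRIPTA_KINDS})

    def observe(self, kind: str, success: bool, rate: float = 0.2) -> None:
        # Each observed behaviour (a manifestum) nudges the estimate of
        # the corresponding hidden quality towards 1.0 (success) or 0.0.
        target = 1.0 if success else 0.0
        self.kripta[kind] += rate * (target - self.kripta[kind])

    def trust(self) -> float:
        # Trust as the weakest estimated inner quality (one possible
        # choice): e.g. high competence does not offset doubted honesty.
        return min(self.kripta.values())

m = TrusteeModel()
m.observe("cognitive", True)   # observed competent behaviour
print(m.kripta["cognitive"])   # 0.6
print(m.trust())               # 0.5 (motivational/performative still uncommitted)
```

The point of the sketch is only the direction of inference: behaviour is observable, but what trust is ascribed to are the estimated inner qualities behind it.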

  7.

    See also the so-called “Algorithmic Economy”; e.g. http://www.forbes.com/forbes/welcome/; http://blogs.gartner.com/peter-sondergaard/the-internet-of-things-will-give-rise-to-the-algorithm-economy/

  8.

    On the possible usefulness of violations in any organisation, see [25].

  9.

    In the paper “Making visible the invisible hand” [27], a section title reads “The pseudo-spontaneous order”; despite its rhetorical efficacy, the text makes no claim that the emerging social order is fully “manoeuvred”.

  10.

    In our view, the authors are somewhat optimistic about: (i) the computational governance of complex, self-organising hybrid systems; (ii) the role of cooperation, common interests, social capital, etc., whereby they ignore the crucial and positive role of ‘conflicts’.

  11.

    For the use of simulation to test collective decision-making models, see Chap. 17 by Lucas and Payne in this book.

References

  1. Omicini, A., Contucci, P.: Complexity and interaction: blurring borders between physical, computational, and social systems. In: Badica, C., Nguyen, N., Brezovan, M. (eds.) Computational Collective Intelligence. Technologies and Applications, vol. 8083, pp. 1–10. Springer, Berlin/Heidelberg (2013). ISBN:978-3-642-40494-8

  2. Pezzulo, G., Hoffmann, J., Falcone, R.: Anticipation and anticipatory behavior. Cognit. Process. 8, 67–70 (2007)

  3. Anders, G., Siefert, F., Steghöfer, J.-P., Reif, W.: Self-Organizing Systems, pp. 90–102. Springer, Berlin/Heidelberg (2014)

  4. Nafz, F., Ortmeier, F., Seebach, H., Steghöfer, J.-P., Reif, W.: A universal self-organization mechanism for role-based organic computing systems. In: Nieto, J.G., Reif, W., Wang, G., Indulska, J. (eds.) Autonomic and Trusted Computing, pp. 17–31. Springer, Berlin/Heidelberg (2009)

  5. Noriega, P., Padget, J., Verhagen, H., d’Inverno, M.: The challenge of artificial socio-cognitive systems. In: COIN@AAMAS, Paris (2014)

  6. Söllner, M., Pavlou, P., Leimeister, J.M.: Understanding trust in IT artifacts – a new conceptual approach. In: Academy of Management Annual Meeting, New York (2013)

  7. Steghöfer, J.-P., Kiefhaber, R., Leichtenstern, K., Bernard, Y., Klejnowski, L.: Trustworthy organic computing systems: challenges and perspectives. In: Xie, B., Branke, J., Sadjadi, S.M., Zhang, D., Zhou, X. (eds.) Autonomic and Trusted Computing, pp. 62–76. Springer, Berlin/Heidelberg (2010)

  8. Steghöfer, J.-P., Behrmann, P., Anders, G., Siefert, F., Reif, W.: HiSPADA: self-organising hierarchies for large-scale multi-agent systems. In: IARIA’13, Lisbon (2013)

  9. Aslanyan, Z., Ivanova, M.G., Nielson, F., Probst, C.: Modeling and analysing socio-technical systems. In: 1st International Workshop on Socio-Technical Perspective in IS Development (STPIS), Stockholm, vol. 1374, pp. 121–124 (2015)

  10. Castelfranchi, C.: For a Pessimistic Theory of the Invisible Hand and Spontaneous Order. https://www.academia.edu/823483/For_a_Pessimistic_Theory_of_the_Invisible_Hand_and_Spontaneous_Order (2001)

  11. Castelfranchi, C., Falcone, R.: Trust Theory: A Socio-Cognitive and Computational Model. Wiley, Chichester (2010)

  12. Falcone, R., Castelfranchi, C.: Social trust: a cognitive approach. In: Castelfranchi, C., Yao-Hua, T. (eds.) Trust and Deception in Virtual Societies, pp. 55–90. Kluwer Academic Publishers, Dordrecht (2001)

  13. Garfinkel, H.: A conception of, and experiments with, ‘trust’ as a condition of stable concerted actions. In: Harvey, O.J. (ed.) Motivation and Social Interaction, pp. 187–238. Ronald Press Co., New York (1963)

  14. Pitt, J., Artikis, A.: The open agent society: a retrospective and future perspective. In: COIN@AAMAS, Istanbul (2015)

  15. Rutter, J.: Sociology of trust: towards a sociology of e-trust. Int. J. New Prod. Dev. Innov. Manag. 3, 371–385 (2001)

  16. Pelligra, V.: Under trusting eyes: the responsive nature of trust. In: Gui, B., Sugden, R. (eds.) Economics and Social Interaction: Accounting for Interpersonal Relations. Cambridge University Press, Cambridge (2005)

  17. Falcone, R., Castelfranchi, C.: Socio-cognitive model of trust. In: Encyclopedia of Information Science and Technology. IGI Global, Hershey (2005)

  18. Falcone, R., Piunti, M., Venanzi, M., Castelfranchi, C.: From Manifesta to Krypta: the relevance of categories for trusting others. ACM Trans. Intell. Syst. Technol. 4, 1–24 (2013)

  19. Giddens, A.: Modernity and Self-Identity: Self and Society in the Late Modern Age. Stanford University Press, Stanford (1991)

  20. Gollmann, D.: Security in Socio-technical Systems (2011)

  21. Clegg, A.: Trust on the Internet. Telecommun. Policy 22, 159–160 (1998)

  22. Hassas, S., Marzo-Serugendo, G.D., Karageorgos, A., Castelfranchi, C.: On Self-Organising Mechanisms from Social, Business and Economic Domains (2006)

  23. Fry, T.: Design as Politics. Berg Publishers, New York (2010)

  24. Bacharach, M., Gambetta, D.: Trust as type detection. In: Castelfranchi, C., Tan, Y.-H. (eds.) Trust and Deception in Virtual Societies, pp. 1–16. Springer, Netherlands (2001)

  25. Castelfranchi, C.: Engineering social order. In: Omicini, A., Tolksdorf, R., Zambonelli, F. (eds.) Engineering Societies in the Agents World, pp. 1–18. Springer, Berlin/Heidelberg (2000)

  26. Cofta, P.: Trust, Complexity and Control: Confidence in a Convergent World. Wiley, Hoboken (2007)

  27. Castelfranchi, C.: Making visible “the Invisible Hand”: the mission of social simulation. In: Adamatti, D., Dimuro, G., Coelho, H. (eds.) Interdisciplinary Applications of Agent-Based Social Simulation and Modeling, pp. 1–19. IGI Global, Hershey (2014)

  28. Castelfranchi, C.: The theory of social functions: challenges for multi-agent-based social simulation and multi-agent learning. J. Cognit. Syst. Res. 2, 5–38 (2001)

  29. Veitas, V.: World Views: The Cognitive Development of the Global Brain. http://www.slideshare.net/vveitas/dsg-short-presentation (2014)

  30. Zia, K., Ferscha, A., Riener, A., Wirz, M., Roggen, D., Kloch, K., Lukowicz, P.: Pervasive computing in the large: the socionical approach. In: Adjunct Proceedings of the Eighth International Conference on Pervasive Computing, Helsinki, p. 6 (2010)

  31. Von Mises, L.: Bureaucracy. Yale University Press, New Haven (1944). https://mises.org/library/bureaucracy

Author information

Correspondence to Cristiano Castelfranchi.



Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Castelfranchi, C., Falcone, R. (2016). Trust & Self-Organising Socio-technical Systems. In: Reif, W., et al. Trustworthy Open Self-Organising Systems. Autonomic Systems. Birkhäuser, Cham. https://doi.org/10.1007/978-3-319-29201-4_8

  • DOI: https://doi.org/10.1007/978-3-319-29201-4_8

  • Publisher Name: Birkhäuser, Cham

  • Print ISBN: 978-3-319-29199-4

  • Online ISBN: 978-3-319-29201-4

  • eBook Packages: Computer Science, Computer Science (R0)
