
Human–robot collisions detection for safe human–robot interaction using one multi-input–output neural network

  • Methodologies and Application
  • Published in: Soft Computing

Abstract

In this paper, a multilayer feedforward neural network-based approach is proposed for human–robot collision detection that takes safety standards into consideration. A single multi-output neural network is designed and trained on data from the coupled dynamics of the manipulator, with and without external contacts, to detect unwanted collisions and to identify the collided link using only the intrinsic joint position and torque sensors of the manipulator. The proposed method is aimed at collaborative robots, which are expected to become widespread in the near future, and is implemented and evaluated for motion in 3D space, taking the effect of gravity into account. The experiments are performed on a KUKA LWR manipulator, a representative collaborative robot. The experimental results show that the developed system is efficient and very fast in detecting collisions within the safe region and in identifying the collided link over the entire workspace of the three-joint motion of the manipulator. Separate, uncoupled neural networks, one for each joint, are also designed and trained on the same data, and their performance is compared with that of the coupled network.




Notes

  1. Any other body could be used for the collision with the robot, since the sole purpose of these experiments is to demonstrate the dynamic coupling between the manipulator joints.

  2. These three variables in this paragraph indicate the position error, actual joint velocity, and measured joint torque.

  3. It should be noted that the hardware/processor used for training the uncoupled NNs differs from the one used for the coupled NN, because the latter was no longer available at that time. This change in hardware does not affect the results or the comparison between the coupled NN and the uncoupled ones, since it only affects the time required for training. Training one independent NN per joint naturally takes less time than training a single NN that couples the three joints. The main point here is not the training time but the performance of the trained NN and its effectiveness.

References

  • Albu-Schaffer A, Haddadin S, Ott C, Stemmer A, Wimbock T, Hirzinger G (2007) The DLR lightweight robot: design and control concepts for robots in human environments. Ind Robot Int J 34:376–385. https://doi.org/10.1108/01439910710774386


  • Anton FD, Anton S, Borangiu T (2013) Human–robot natural interaction with collision avoidance in manufacturing operations. In: Service orientation in Holonic and multi agent manufacturing and robotics. Springer, Berlin, pp 375–388. https://doi.org/10.1007/978-3-642-35852-4

  • Berardi VL, Zhang GP (1999) The effect of misclassification costs on neural network classifiers. Decis Sci 30:659–682. https://doi.org/10.1111/j.1540-5915.1999.tb00902.x


  • Chen S, Luo M, He F (2018) A universal algorithm for sensorless collision detection of robot actuator faults. Adv Mech Eng 10:1–10. https://doi.org/10.1177/1687814017740710


  • Cho C, Kim J, Lee S, Song J (2012a) Collision detection and reaction on 7 DOF service robot arm using residual observer. J Mech Sci Technol 26:1197–1203. https://doi.org/10.1007/s12206-012-0230-0


  • Cho C, Kim J, Kim Y, Song J-B, Kyung J-H (2012b) Collision detection algorithm to distinguish between intended contact and unexpected collision. Adv Robot 26:1825–1840. https://doi.org/10.1080/01691864.2012.685259


  • De Luca A, Albu-Schaffer A, Haddadin S, Hirzinger G (2006) Collision detection and safe reaction with the DLR-III lightweight manipulator arm. In: 2006 IEEE/RSJ international conference on intelligent robots and systems. IEEE, Beijing, China, pp 1623–1630. https://doi.org/10.1109/iros.2006.282053

  • Devijver PA, Kittler J (1982) Pattern recognition: a statistical approach, Englewood cliffs. Prentice-Hall, Upper Saddle River


  • Dimeas F, Avendano-Valencia LD, Nasiopoulou E, Aspragathos N (2013) Robot collision detection based on fuzzy identification and time series modelling. In: Proceedings of the RAAD 2013, 22nd international workshop on robotics in Alpe-Adria-Danube Region, Slovenia

  • Dimeas F, Avendano-Valencia LD, Aspragathos N (2014) Human–robot collision detection and identification based on fuzzy and time series modelling. Robotica. https://doi.org/10.1017/s0263574714001143


  • Du K-L, Swamy MNS (2006) Neural networks in a softcomputing framework. Springer, London


  • Du K, Swamy MNS (2014) Neural networks and statistical learning. Springer, Berlin. https://doi.org/10.1007/978-1-4471-5571-3


  • Ferrari S, Stengel RF (2005) Smooth function approximation using neural networks. IEEE Trans Neural Netw 16:24–38


  • Flacco F, Kroger T, De Luca A, Khatib O (2012) A depth space approach to human–robot collision avoidance. In: 2012 IEEE international conference on robotics and automation, RiverCentre, Saint Paul, Minnesota, USA, 2012, pp 338–345

  • Flacco F, Kroeger T, De Luca A, Khatib O (2014) A depth space approach for evaluating distance to objects. J Intell Robot Syst 80:7–22. https://doi.org/10.1007/s10846-014-0146-2


  • FRANKA EMIKA, Infanteriestraße 19 80797 Munich, Germany (n.d.) http://donar.messe.de/exhibitor/cebit/2017/G679739/broschuere-ger-499315.pdf

  • Fumagalli M, Gijsberts A, Ivaldi S, Jamone L, Metta G, Natale L, Nori F, Sandini G (2010) Learning to exploit proximal force sensing: a comparison approach. In: Sigaud O, Peters J (eds) From motor learning to interaction learning in robots. Studies in computational intelligence. Springer, Berlin, pp 149–167. https://doi.org/10.1007/978-3-642-05181-4_7

  • Gandhi D, Cervera E (2003) Sensor covering of a robot arm for collision avoidance. In: SMC’03 conference proceedings. 2003 IEEE international conference on systems, man and cybernetics. Conference theme—system security and assurance (Cat. No. 03CH37483). IEEE, Washington, DC, USA, 2003, pp 4951–4955. https://doi.org/10.1109/icsmc.2003.1245767

  • Goldberg KY, Pearlmutter BA (1988) Using a neural network to learn the dynamics of the CMU direct-drive arm II. Carnegie Mellon University, Pittsburgh


  • HC10 collaborative robot—Yaskawa Motoman (n.d.). https://www.motoman.com/collaborative/products

  • Haddadin S, Albu-Schäffer A, De Luca A, Hirzinger G (2008) Collision detection and reaction: a contribution to safe physical human–robot interaction. In: 2008 IEEE/RSJ international conference on intelligent robots and systems, IEEE, Nice, France, pp 3356–3363

  • Hagan MT, Menhaj MB (1994) Training feedforward networks with the Marquardt algorithm. IEEE Trans Neural Netw 5:2–6


  • Haykin S (2009) Neural networks and learning machines, 3rd edn. Pearson, London


  • Hornik K, Stinchcombe M, White H (1990) Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks. Neural Netw 3:551–560


  • Indri M, Trapani S, Lazzero I (2017) Development of a virtual collision sensor for industrial robots. Sensors 17:1–23. https://doi.org/10.3390/s17051148


  • ISO 10218-1 (2011a) Robots and robotic devices—safety requirements for industrial robots—part 1: robots

  • ISO 10218-2 (2011b) Robots and robotic devices—safety requirements for industrial robots—part 2: robot systems and integration

  • ISO/TS 15066 (2016) Robots and robotic devices—collaborative robots

  • Jung B, Choi HR, Koo JC, Moon H (2012) Collision detection using band designed disturbance observer. In: 8th IEEE international conference on automation science and engineering, Korea, 2012, pp 1080–1085

  • Jung B, Koo JC, Choi HR, Moon H (2014) Human–robot collision detection under modeling uncertainty using frequency boundary of manipulator dynamics. J Mech Sci Technol 28:4389–4395. https://doi.org/10.1007/s12206-014-1006-5


  • KUKA Roboter GmbH, Lightweight Robot 4+, Specification, D-86165 Augsburg, Germany, 2010

  • Kitaoka M, Yamashita A, Kaneko T (2010) Obstacle avoidance and path planning using color information for a biped robot equipped with a stereo camera system. In: Proceedings of the 4th Asia international symposium mechatronics, pp 38–43. https://doi.org/10.3850/978-981-08-7723-1_p134

  • KUKA.FastResearchInterface 1.0, KUKA System Technology (KST), D-86165 Augsburg, Germany, 2011

  • Lam TL, Yip HW, Qian H, Xu Y (2012) Collision avoidance of industrial robot arms using an invisible sensitive skin. In: 2012 IEEE/RSJ international conference on intelligent robots and systems, Algarve, Portugal, pp 4542–4543

  • LeCun Y, Bengio Y (1995) Pattern recognition and neural networks. In: Arbib MA (ed) The handbook of brain theory and neural networks. MIT Press, Cambridge, pp 1–22


  • Lenser S, Veloso M (2003) Visual sonar: fast obstacle avoidance using monocular vision. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems (IROS 2003). https://doi.org/10.1109/iros.2003.1250741

  • Li Z, Wu H, Yang J, Wang M, Ye J (2018) A position and torque switching control method for robot collision safety. Int J Autom Comput 15:156–168. https://doi.org/10.1007/s11633-017-1104-9


  • Lu S, Chung JH, Velinsky SA (2005) Human–robot collision detection and identification based on wrist and base force/torque sensors. In: Proceedings of 2005 IEEE international conference on robotics and automation, Spain, pp 796–801

  • Marquardt DW (1963) An algorithm for least-squares estimation of nonlinear parameters. J Soc Ind Appl Math 11:431–441


  • Mihelj M, Munih M (2010) Open architecture xPC target based robot controllers for industrial and research manipulators. In: IEEE workshop on innovative robot control architectures for demanding (research) applications how to modify and enhance commercial controllers (ICRA 2010), Anchorage, pp 54–61

  • Min F, Wang G, Liu N (2019) Collision detection and identification on robot manipulators based on vibration analysis. Sensors 19:1–17. https://doi.org/10.3390/s19051080


  • Morinaga S, Kosuge K (2003) Collision detection system for manipulator based on adaptive impedance control law. In: Proceedings of 2003 IEEE international conference on robotics and automation, Taipei, Taiwan, pp 1080–1085

  • Most T (2005) Approximation of complex nonlinear functions by means of neural networks. In: 2nd Weimar optimization and stochastic days 2005, Grand Hotel Russischer Hof, Weimar, Germany

  • Murray RM, Li Z, Sastry SS (1994) A mathematical introduction to robotic manipulation. CRC Press, Boca Raton


  • Nielsen MA (2015) Neural networks and deep learning. Determination Press

  • Patiño HD, Carelli R, Kuchen BR (2002) Neural networks for advanced control of robot manipulators. IEEE Trans Neural Netw 13:343–354. https://doi.org/10.1109/72.991420


  • Peasley B, Birchfield S (2013) Real-time obstacle detection and avoidance in the presence of specular surfaces using an active 3D sensor. In: 2013 IEEE workshop on robot vision, Clearwater Beach, FL, USA, pp 197–202. https://doi.org/10.1109/worv.2013.6521938

  • Schmidhuber J (2015) Deep learning in neural networks: an overview. Neural Netw 61:85–117. https://doi.org/10.1016/j.neunet.2014.09.003


  • Schmidt B, Wang L (2013) Contact-less and programming-less human–robot collaboration. In: Forty sixth CIRP conference on manufacturing systems 2013. Elsevier, pp 545–550. https://doi.org/10.1016/j.procir.2013.06.030

  • Schreiber G, Stemmer A, Bischoff R (2010) The fast research interface for the KUKA lightweight robot. In: IEEE workshop on innovative robot control architectures for demanding (research) applications how to modify and enhance commercial controllers (ICRA 2010), Anchorage, pp 15–21

  • Sharkawy A-N, Aspragathos N (2018) Human–robot collision detection based on neural networks. Int J Mech Eng Robot Res 7:150–157. https://doi.org/10.18178/ijmerr.7.2.150-157


  • Sharkawy AN, Koustoumpardis PN, Aspragathos NA (2019) Manipulator collision detection and collided link identification based on neural networks. In: Aspragathos N, Koustoumpardis P, Moulianitis V (eds) Advances in service and industrial robotics. RAAD 2018. Mechanisms and machine science, vol 67. Springer, Cham


  • Smith AC, Hashtrudi-Zaad K (2005) Application of neural networks in inverse dynamics based contact force estimation. In: Proceedings of 2005 IEEE conference control application, IEEE, Toronto, Canada, pp 1021–1026. https://doi.org/10.1109/cca.2005.1507264

  • Vemuri AT, Polycarpou MM (1997) Neural-network-based robust fault diagnosis in robotic systems. IEEE Trans Neural Netw 8:1410–1420


  • Yamada Y, Hirasawa Y, Huang S, Umetani Y, Suita K (1997) Human–robot contact in the safeguarding space. IEEE/ASME Trans Mechatron 2:230–236


  • Zhang GP (2000) Neural networks for classification: a survey. IEEE Trans Syst MAN Cybern C Appl Rev 30:451–462. https://doi.org/10.1109/5326.897072



Acknowledgements

The authors would like to thank Mr. Nikitas Xanthopoulos, External Expert, Department of Mechanical Engineering and Aeronautics, University of Patras, for his help in programming with C++. Abdel-Nasser Sharkawy is funded by the “Egyptian Cultural Affairs & Missions Sector” and “Hellenic Ministry of Foreign Affairs Scholarship” for PhD study in Greece.

Author information


Corresponding author

Correspondence to Abdel-Nasser Sharkawy.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Ethical approval

This article does not contain any studies with animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Communicated by V. Loia.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1: Discussion of the dynamic coupling

1.1 First part: dynamic parameters of the manipulator

The inertia matrix, Coriolis and centrifugal matrix, and gravity vector of the three-link manipulator from (1) are calculated in Murray et al. (1994) as

$$ M\left( \theta \right) = \begin{bmatrix} M_{11} & 0 & 0 \\ 0 & M_{22} & M_{23} \\ 0 & M_{32} & M_{33} \end{bmatrix} $$
(3)

where

$$ \begin{aligned} M_{11} & = I_{y2} s_{2}^{2} + I_{y3} s_{23}^{2} + I_{z1} + I_{z2} c_{2}^{2} + I_{z3} c_{23}^{2} + m_{2} r_{2}^{2} c_{2}^{2} + m_{3} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)^{2} \\ M_{22} & = I_{x2} + I_{x3} + m_{3} l_{2}^{2} + m_{2} r_{2}^{2} + m_{3} r_{3}^{2} + 2m_{3} l_{2} r_{3} c_{3} \\ M_{23} & = I_{x3} + m_{3} r_{3}^{2} + m_{3} l_{2} r_{3} c_{3} \\ M_{32} & = I_{x3} + m_{3} r_{3}^{2} + m_{3} l_{2} r_{3} c_{3} \\ M_{33} & = I_{x3} + m_{3} r_{3}^{2} \\ \end{aligned} $$
$$ C\left( {\theta ,\dot{\theta }} \right) = \begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & 0 \end{bmatrix} $$
(4)

where

$$ \begin{aligned} C_{11} & = \left( {\left( {I_{y2} - I_{z2} - m_{2} r_{2}^{2} } \right)c_{2} s_{2} + \left( {I_{y3} - I_{z3} } \right)c_{23} s_{23} - m_{3} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)\left( {l_{2} s_{2} + r_{3} s_{23} } \right)} \right)\dot{\theta }_{2} + \left( {\left( {I_{y3} - I_{z3} } \right)c_{23} s_{23} - m_{3} r_{3} s_{23} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)} \right)\dot{\theta }_{3} \\ C_{12} & = \left( {\left( {I_{y2} - I_{z2} - m_{2} r_{2}^{2} } \right)c_{2} s_{2} + \left( {I_{y3} - I_{z3} } \right)c_{23} s_{23} - m_{3} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)\left( {l_{2} s_{2} + r_{3} s_{23} } \right)} \right)\dot{\theta }_{1} \\ C_{13} & = \left( {\left( {I_{y3} - I_{z3} } \right)c_{23} s_{23} - m_{3} r_{3} s_{23} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)} \right)\dot{\theta }_{1} \\ C_{21} & = \left( {\left( {I_{z2} - I_{y2} + m_{2} r_{2}^{2} } \right)c_{2} s_{2} + \left( {I_{z3} - I_{y3} } \right)c_{23} s_{23} + m_{3} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)\left( {l_{2} s_{2} + r_{3} s_{23} } \right)} \right)\dot{\theta }_{1} \\ C_{22} & = - l_{2} m_{3} r_{3} s_{3} \dot{\theta }_{3} \\ C_{23} & = - l_{2} m_{3} r_{3} s_{3} \dot{\theta }_{2} - l_{2} m_{3} r_{3} s_{3} \dot{\theta }_{3} \\ C_{31} & = \left( {\left( {I_{z3} - I_{y3} } \right)c_{23} s_{23} + m_{3} r_{3} s_{23} \left( {l_{2} c_{2} + r_{3} c_{23} } \right)} \right)\dot{\theta }_{1} \\ C_{32} & = l_{2} m_{3} r_{3} s_{3} \dot{\theta }_{2} \\ \end{aligned} $$
$$ G\left( \theta \right) = \begin{bmatrix} 0 \\ { - \left( {m_{2} gr_{2} + m_{3} gl_{2} } \right)\cos \theta_{2} - m_{3} gr_{3} \cos \left( {\theta_{2} + \theta_{3} } \right)} \\ { - m_{3} gr_{3} \cos \left( {\theta_{2} + \theta_{3} } \right)} \end{bmatrix} $$
(5)

Equations (1) and (3)–(5) show the coupling that occurs between the joints. From (3), the inertial force of link 3 affects the measured joint torques and the external torques of joints 2 and 3. The measured joint torque and the external torque of joint 1 depend on the masses of link 2 and link 3 (the "end-effector") (\( m_{2} , m_{3} \)), but not on their accelerations. This holds for this particular choice of joints; in other configurations, the inertial forces of links 2 and 3 might also affect the measured/external torques of joint 1. The inertial force of link 2 affects the measured and external torques of joint 2 but not joint 3, whereas the inertial force of link 1 affects only the measured and external torque of joint 1 and not joints 2 and 3.

The Coriolis and centrifugal terms in (4) show the coupling between the three joints. The Coriolis forces of links 1 and 2 affect the measured and external torques of all three joints, whereas the Coriolis forces of link 3 affect only the torques of joints 1 and 2 and not joint 3. From (5), the gravitational force of link 3 affects the measured and external torques of both joints 2 and 3, which proves the coupling between these two joints. The gravitational force of link 2 affects only the measured and external torques of joint 2, whereas link 1 has no gravitational term.
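To make the coupling pattern of (3) concrete, the inertia matrix can be evaluated numerically. The following sketch is ours (the function name and all parameter values are hypothetical); it only reproduces the closed-form entries listed above and shows that joint 1 is inertially decoupled while joints 2 and 3 are coupled through \( M_{23} = M_{32} \):

```python
import numpy as np

def inertia_matrix(theta, m2, m3, l2, r2, r3,
                   Iy2, Iy3, Iz1, Iz2, Iz3, Ix2, Ix3):
    """Inertia matrix M(theta) of the three-link arm, per Eq. (3).

    theta is (theta1, theta2, theta3); the remaining arguments are the
    link masses, lengths, centre-of-mass offsets, and inertias from the
    appendix (their numerical values in any call are hypothetical).
    """
    _, th2, th3 = theta
    c2, s2 = np.cos(th2), np.sin(th2)
    c23, s23 = np.cos(th2 + th3), np.sin(th2 + th3)
    c3 = np.cos(th3)

    M11 = (Iy2 * s2**2 + Iy3 * s23**2 + Iz1 + Iz2 * c2**2 + Iz3 * c23**2
           + m2 * r2**2 * c2**2 + m3 * (l2 * c2 + r3 * c23)**2)
    M22 = (Ix2 + Ix3 + m3 * l2**2 + m2 * r2**2 + m3 * r3**2
           + 2 * m3 * l2 * r3 * c3)
    M23 = Ix3 + m3 * r3**2 + m3 * l2 * r3 * c3   # couples joints 2 and 3
    M33 = Ix3 + m3 * r3**2

    # Zero off-diagonals in row/column 1: joint 1 is inertially decoupled
    # from joints 2 and 3, as discussed above.
    return np.array([[M11, 0.0, 0.0],
                     [0.0, M22, M23],
                     [0.0, M23, M33]])
```

The returned matrix is symmetric, and the nonzero \( M_{23} \) entry is exactly the inertial coupling between joints 2 and 3 noted in the discussion of (3).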

1.2 Second part: results of the experiments that investigate the coupling

The results of the 18 experiments carried out to demonstrate the dynamic coupling between the joints, as shown in Fig. 2, are presented in Table 7. The table lists the maximum amplitudes of the measured joint torques \( \left( {\tau_{{{\text{msr}}1}} , \tau_{{{\text{msr}}2}} , \tau_{{{\text{msr}}3}} } \right) \) and the external torques \( \left( {\tau_{{{\text{ext}}1}} , \tau_{{{\text{ext}}2}} , \tau_{{{\text{ext}}3}} } \right) \) at the moment the obstacle collides with the manipulator, summarizing the effect on the torques of joints 1, 2, and 3.

Table 7 The results of the 18 experiments to investigate the joint coupling

In Table 7, "Yes" means that the effect of a collision between the obstacle and the manipulator on the measured and external torques of joint \( i \) (\( i = 1, 2, 3 \)) is high or clearly visible, whereas "No" means that it is very small and negligible. For joints 1 and 2, "Yes" is chosen in all cases/configurations, since the effect there is obviously high. For joint 3, "Yes" is chosen only for the first configuration, where the effect of the inertial forces is high. This configuration serves as the reference for choosing "Yes" or "No" in the second and third configurations, where the effect of the inertial forces is very small. In those configurations, the ratio of the external torque of joint 3 to that of joint 1, \( \frac{{\tau_{{{\text{ext}}3}} }}{{\tau_{{{\text{ext}}1}} }} \), or the ratio of the measured torques, \( \frac{{\tau_{{{\text{msr}}3}} }}{{\tau_{{{\text{msr}}1}} }} \), is compared with a threshold: if the ratio is greater than or equal to the threshold, "Yes" is chosen, otherwise "No." The threshold is defined as 80% of \( \frac{{\tau_{{{\text{ext}}3}} }}{{\tau_{{{\text{ext}}1}} }} \) or 80% of \( \frac{{\tau_{{{\text{msr}}3}} }}{{\tau_{{{\text{msr}}1}} }} \) computed from the first (reference) configuration. It is calculated for both weights (1 kg and 2 kg), and the lower value is chosen; the threshold also changes as the speed increases. The measured or external torque of joint 1 is taken as the reference torque because it has the highest value, i.e., it is the joint most affected by the collision. The selected threshold is presented in Table 8.

Table 8 The threshold value (%) to choose “Yes” or “No” for the effect of the collision, between the obstacle and the manipulator, on the torques of joint 3 for the second and the third configurations
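The "Yes"/"No" decision rule described above can be sketched as a small helper. This is our own illustration (the function name and arguments are hypothetical), not code from the paper:

```python
def coupling_effect(tau3, tau1, ref_ratio, threshold_pct=80.0):
    """Decide "Yes"/"No" for the effect of a collision on joint 3.

    ref_ratio is tau3/tau1 measured in the first (reference)
    configuration; the threshold is 80% of that value, as in the
    appendix. Joint 1 serves as the reference because it is the joint
    most affected by the collision.
    """
    threshold = (threshold_pct / 100.0) * ref_ratio
    return "Yes" if abs(tau3) / abs(tau1) >= threshold else "No"
```

For example, with a reference ratio of 0.25, a measured ratio of 0.20 meets the 80% threshold (0.20) and yields "Yes", while a ratio of 0.10 yields "No".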

Appendix 2: Protocol of the experiments in the paper

The protocol for all the experiments performed in this paper is presented and discussed in Table 9. All experiments except experiment 6 are used with the proposed coupled NN. Experiments 1, 2, and 3 are also used for training and evaluating the uncoupled NNs, and experiments 4 and 6 are used for their verification, as discussed in Sect. 7. The desired sinusoidal motion \( \theta_{\text{des}} \left( t \right) \), applied to the three joints simultaneously, is chosen as

$$ \theta_{\text{des}} \left( t \right) = \frac{\pi }{4} - \frac{\pi }{4}\cos \left( {2\pi ft} \right) $$
(6)

where \( f \) is the frequency of the sinusoidal motion.
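Equation (6) can be reproduced with a one-line helper (the function name is ours; the formula is exactly the one above):

```python
import numpy as np

def desired_trajectory(f, t):
    """Desired sinusoidal joint motion of Eq. (6), applied to all three
    joints simultaneously:

        theta_des(t) = pi/4 - (pi/4) * cos(2*pi*f*t)

    f is the motion frequency; t may be a scalar or a NumPy array of
    time stamps.
    """
    return np.pi / 4 - (np.pi / 4) * np.cos(2 * np.pi * f * t)
```

The trajectory starts at 0 rad, peaks at \( \pi/2 \) rad half a period later, and returns to 0, so each joint sweeps a 90° range per cycle.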

Table 9 The steps and protocol for all the discussed experimental work in the paper

Appendix 3: The training process of the NN

After trying different weight initializations with different numbers of hidden neurons (40, 50, 70, 90, 120, 150, and 170), the best performance of the coupled NN and the uncoupled NNs, i.e., the minimum mean squared error (MSE) and proper collision thresholds, is shown in Figs. 20 and 21, respectively. For both the coupled NN and the uncoupled ones, the best performance is obtained with 150 hidden neurons and 1000 iterations.
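The hidden-size selection described above (performed in MATLAB by the authors) can be sketched generically. Here `train_fn` is a hypothetical callable that trains a network with the given number of hidden neurons (e.g., with Levenberg–Marquardt, as in the paper) and returns its validation MSE; the wrapper only performs the search over the candidate sizes:

```python
import math

def select_hidden_size(train_fn, sizes=(40, 50, 70, 90, 120, 150, 170)):
    """Try each candidate hidden-layer size from Appendix 3 and keep
    the one whose trained network achieves the lowest validation MSE.

    train_fn(n) is assumed to train a network with n hidden neurons
    and return its validation MSE (a float).
    """
    best_n, best_mse = None, math.inf
    for n in sizes:
        mse = train_fn(n)
        if mse < best_mse:
            best_n, best_mse = n, mse
    return best_n, best_mse
```

In the paper's case this search settled on 150 hidden neurons for both the coupled and the uncoupled networks.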

Appendix 4: Latencies from the sensors

As discussed in the paper, a MATLAB toolbox is used for training the NN. The training of the coupled NN is performed offline on an AMD Ryzen 5 1600 six-core processor, since the large amount of data requires capable hardware with ample RAM to keep training fast. After training is finished, the coupled trained NN is deployed on the KUKA LWR robot to check its efficiency and generalization online. The collision detection time presented in Table 2 is composed of the following contributions:

  • 800 µs to read the internal joint torque, position, and velocity sensors and to send motion commands to the robot.

  • 145 µs to read the external force sensor (whose latency is 144 µs).

  • 6 µs for the NN calculations.

  • 1 ms (sampling period) − (800 + 145 + 6) µs = 49 µs for the remaining calculations and algorithmic commands.

This process is illustrated in Fig. 22. The total time required for reading from the KUKA sensors is, as shown, less than 1 ms (the sampling period), so the collision detection time presented in Table 2 leaves enough margin for a reaction. Therefore, safe human–robot interaction is achieved.
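The per-cycle budget listed above can be checked with a small helper (the function name and defaults are ours, taken from the listed values):

```python
def loop_remainder_us(sampling_ms=1.0, sensor_read_us=800,
                      force_read_us=145, nn_us=6):
    """Time left in one control cycle for the remaining calculations:
    the 1 ms sampling period minus the internal sensor reads, the
    external force-sensor read, and the NN evaluation, in microseconds.
    """
    return sampling_ms * 1000.0 - (sensor_read_us + force_read_us + nn_us)
```

With the default values this yields 49 µs, matching the last item in the list above.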

Fig. 22 The timing of the loop occurring in the KRC


Cite this article

Sharkawy, AN., Koustoumpardis, P.N. & Aspragathos, N. Human–robot collisions detection for safe human–robot interaction using one multi-input–output neural network. Soft Comput 24, 6687–6719 (2020). https://doi.org/10.1007/s00500-019-04306-7
