
Human-Robot Collaboration

Introduction

• Human-Robot Collaboration (HRC) refers to all processes in which humans and machines work and interact with each other in the same workflow, without barriers between the individual work areas. People and HRC robots therefore share a single workspace, with no separating protective equipment such as safety fences.

• Before a collaborative robot can be used in a production environment, various prerequisites and standards must be fulfilled. After all, a collaborative robot works directly with people across a wide range of production areas and must not harm them in any way. Among other things, this means that the safety systems for human-robot interaction must be suitable and sufficiently dimensioned.

• Because people do not behave as predictably as machines, safety demands that these autonomous robots be able to adapt to changing circumstances. Co-working industrial robots need to be aware of humans in their environment, able to stop or maneuver out of the way when there is a risk of injury, and able to smoothly return to the task once the danger passes. The latest technology is making this easier.


GOALS

1.) Safety: Minimize injuries at the workplace.

2.) Usability: An easy-to-use, compact design.

3.) Cost: A simple, cost-effective autonomous object-detection system built around an ultrasonic sensor.


BACKGROUND

From the OSHA data we collected on accidents caused by human-robot interaction, several major factors contribute to workplace incidents: the design of the work area, coding problems, mechanical problems, and lack of training for employees. Most incidents, however, were caused by unconstrained robot movement while a worker was in close proximity, leading to major injuries that required hospitalization. In one case from the collected data, a robot dropped a weight on a worker's back, fracturing his right shoulder and requiring hospitalization. This indicates that there is a real safety issue during human-robot collaboration.


CONSTRAINTS

• The effective beam angle is less than 15°, so the sector area covered is small.

• Objects can be detected reliably only within 13 feet (about 4 m).

• Time: This project could only span one semester.

• Experimental data, including data from different test subjects, was unavailable due to the stay-at-home orders surrounding the COVID-19 pandemic.


SURVEY/ EXPERIMENTAL RESEARCH

We analyzed the collected data according to:

• Age

• Gender

• Cause of Injury

• Level of injury


PROJECT IDEA


METHOD


LEVEL-1

SAFETY-RATED MONITORED STOP 

The first collaborative mode is “Safety-rated Monitored Stop” – SMS. It is the simplest type of collaboration. The operator performs manual tasks inside a collaborative area, an operative space shared between the human and the robot. Both the human and the robot can work inside this collaborative area, but not at the same time: the robot is not allowed to move while the operator occupies the shared space.
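The SMS rule is simple enough to state as a one-line check; a minimal C++ sketch (the function name is ours, for illustration):

```cpp
#include <cassert>

// Safety-rated Monitored Stop, reduced to its core rule: the robot may
// move only while the shared collaborative area is unoccupied. When the
// operator leaves the area, motion is permitted again automatically.
inline bool robotMayMove(bool humanInCollaborativeArea) {
    return !humanInCollaborativeArea;
}
```

The point of SMS is that this check is safety-rated: the occupancy signal must come from certified sensing, not ordinary application logic.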


LEVEL 2

HAND GUIDING

The second mode is “Hand Guiding” – HG. Also known as “direct teach”, this collaborative mode lets the operator teach the robot positions by moving it directly, without the need for an intermediate interface such as a robot teach pendant.


LEVEL 3

SPEED AND SEPARATION MONITORING

The third mode is “Speed and Separation Monitoring” – SSM. Here the human and the robot may move in the shared workspace at the same time. The robot continuously monitors its separation from the operator, slowing down as the operator approaches and coming to a complete stop if the minimum protective separation distance is violated.


LEVEL 4

POWER AND FORCE LIMITING

The fourth mode is “Power and Force Limiting” – PFL. This collaborative approach prescribes the limitation of motor power and force so that a human worker can work side-by-side with the robot. This level requires dedicated equipment and control models for handling collisions between the robot and the human with no harmful consequences for the latter.
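What power and force limiting means in control code can be sketched as two checks. The numeric limits here (an 80 W power ceiling and a 140 N contact-force limit) are our illustrative assumptions, not values taken from a safety standard:

```cpp
#include <algorithm>
#include <cassert>

// Power and Force Limiting (PFL), sketched. Real systems take these
// limits from a risk assessment; the numbers below are placeholders.

// Commanded motor power is clamped so it never exceeds the ceiling.
inline double clampPower(double requestedW, double maxPowerW = 80.0) {
    return std::min(requestedW, maxPowerW);
}

// A measured contact force above the limit means the robot must stop.
inline bool mustStop(double measuredForceN, double maxForceN = 140.0) {
    return measuredForceN > maxForceN;
}
```

Clamping bounds the energy the robot can put into a collision up front, while the force check handles contact that happens anyway.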


ARDUINO - Ultrasonic Sensor

What You Need to Know

The HC-SR04 ultrasonic sensor uses sonar to determine the distance to an object, much as bats do. It offers non-contact range detection with high accuracy and stable readings in an easy-to-use package, over a range of 2 cm to 400 cm (roughly 1 inch to 13 feet).

Operation is not affected by sunlight or black materials, although acoustically soft materials such as cloth can be difficult to detect. The module includes both an ultrasonic transmitter and a receiver.

The detection distance of the module is adjustable, and two or more modules can be used to guard the whole workspace. If a human user reaches into the workspace, the robot stops.
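The guarding scheme above (several sensors, stop if any one of them sees an object inside the guard distance) can be sketched in plain C++. The 400 cm guard distance, the example pulse readings, and the function names are our assumptions; on an Arduino the pulse width would come from `pulseIn()` on the echo pin:

```cpp
#include <cassert>
#include <vector>

// Convert an HC-SR04 echo pulse width (microseconds) to distance in cm.
// Sound travels ~0.0343 cm/us in air; the pulse covers the round trip,
// so the result is halved.
inline double pulseToDistanceCm(unsigned long pulseUs) {
    return pulseUs * 0.0343 / 2.0;
}

// Guard logic: with two or more sensors covering the workspace, command
// a stop as soon as ANY sensor reports an object inside the guard zone.
inline bool shouldStop(const std::vector<unsigned long>& pulsesUs,
                       double guardDistanceCm = 400.0) {  // ~13 ft max range
    for (unsigned long p : pulsesUs) {
        if (pulseToDistanceCm(p) < guardDistanceCm) return true;
    }
    return false;
}
```

For example, a 5000 us echo corresponds to roughly 86 cm, well inside the guard zone, so the robot would be stopped.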


EXPERIMENT

INTERFACE FOR ARDUINO UNO

RESULT

The developed system was tested by placing obstacles at various distances. Table 1 summarizes the results obtained.

The characteristic profile generated by the ultrasonic sensor is linear and stable, as can be observed in the figure.

This is a result of the smooth, well-reflecting surface of the obstacle used in the experiment. There is also some variation between the calculated and experimental time durations shown in the figure. It is established that as the distance to the obstacle gradually increases, the time taken for the ultrasonic sensor to detect the object also increases gradually.

CALCULATION FORMULA

D = (T × V) / 2

Where,

D = distance between the sensor and the detected object,

T = time between the transmitted wave and the received reflected wave,

V = propagation speed of the ultrasonic wave in air, approximately 344 m/s under normal conditions.
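As a worked instance of the formula, a minimal C++ helper (the function name is ours):

```cpp
#include <cassert>
#include <cmath>

// D = (T * V) / 2 from above, with V = 344 m/s. T is the round-trip
// echo time in seconds; halving converts the round trip into the
// one-way distance to the object.
inline double echoDistanceMeters(double roundTripSeconds) {
    const double V = 344.0;  // ultrasonic propagation speed in air, m/s
    return roundTripSeconds * V / 2.0;
}
```

A 10 ms round trip, for instance, gives 0.010 × 344 / 2 = 1.72 m.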


Conclusion

Our team aimed to reduce overall costs in manufacturing settings where collaborative robots are used by reducing the number of injuries that can occur. We tested an Arduino-based ultrasonic sensor for effectiveness at different distances.
Due to unforeseen circumstances, we were unable to do hands-on robot testing with the sensor to evaluate its effectiveness in a real-life setting.


FUTURE SCOPE

Our project looked to improve human-robot collaboration in cases where the robot operates in safety-rated monitored stop mode. This mode is much simpler than the other three modes because, although the human and the robot share the same space, they do not perform tasks at the same time. Our future scope is to apply this ultrasonic sensor technology to the other modes. In particular, we hope to apply it to the Level 3 mode, speed and separation monitoring. The idea is to slow the robot down when a human is in a dangerous zone and completely freeze it when the human is in the most dangerous zone or touching the robot. To do this, a safety map would have to be created to determine the least and most dangerous areas around the robot.
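The slow-down/freeze policy described above can be sketched as a speed-scaling function. The zone boundaries and the 0.25 scaling factor are our illustrative assumptions; a real system would derive them from the safety map:

```cpp
#include <cassert>

// Speed scaling for Speed and Separation Monitoring: full speed when
// the human is clear, reduced speed in the dangerous zone, and a full
// stop in the most dangerous zone next to the robot.
inline double speedScale(double separationCm) {
    if (separationCm < 50.0)  return 0.0;   // most dangerous: freeze
    if (separationCm < 200.0) return 0.25;  // dangerous: slow down
    return 1.0;                             // clear: full speed
}
```

The returned factor would multiply the robot's commanded joint speeds, so motion degrades gracefully instead of switching abruptly between run and stop.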


Citations

Our work would not have been possible without the developments and knowledge we gained from the following research studies and academic journals.


Alegue, Ela Mvolo Evina. “Human-Robot Collaboration with High-Payload Robots in Industrial Settings.” IECON 2018 - 44th Annual Conference of the IEEE Industrial Electronics Society, 2018, doi:10.1109/iecon.2018.8591069.


Lasota, Przemyslaw A., et al. “A Survey of Methods for Safe Human-Robot Interaction.” Foundations and Trends in Robotics, vol. 5, no. 3, 2017, pp. 261–349., doi:10.1561/2300000052.


Makrini, Ilias El, et al. “Task Allocation for Improved Ergonomics in Human-Robot Collaborative Assembly.” Interaction Studies, vol. 20, no. 1, 2019, pp. 102–133., doi:10.1075/is.18018.mak.


Villani, Valeria, et al. “Survey on Human–Robot Collaboration in Industrial Settings: Safety, Intuitive Interfaces and Applications.” Mechatronics, vol. 55, 2018, pp. 248–266., doi:10.1016/j.mechatronics.2018.02.009.


Xiao, Luyin. “Unmanned Vehicle with Obstacle Avoidance Technology Based on Ultrasonic Ranging.” 2018 Eighth International Conference on Instrumentation & Measurement, Computer, Communication and Control (IMCCC), 2018, doi:10.1109/imccc.2018.00025.

Yılmaz, Esra, and Sibel T. Özyer. “Remote and Autonomous Controlled Robotic Car Based on Arduino with Real Time Obstacle Detection and Avoidance.” Universal Journal of Engineering Science, vol. 7, no. 1, 2019, pp. 1–7., doi:10.13189/ujes.2019.070101.


HCI Design Principles

ATTENTION PRINCIPLES


  • A1: Salience compatibility

  • A4: Avoid resource competition


PERCEPTION PRINCIPLES


  • P5: Make displays legible (or audible)

  • P7: Support top-down processing

  • P9: Make discriminable


MEMORY PRINCIPLES


  • M11: Support visual momentum

  • M13: Be consistent


MENTAL MODEL PRINCIPLES


  • MM14: Pictorial realism

  • MM15: Moving part


Contact Us

We hope you enjoyed reviewing our project - please get in touch if you want to hear more!

Raja Ratan Addo | raddo3@uic.edu

Heba Salem | hsalem4@uic.edu

Brandon Solares | bsolar3@uic.edu

Sauban Farooqui | sfaroo22@uic.edu

842 W Taylor St, Chicago, IL 60607

