Computational models that automate and monitor human-robot interaction, providing humans and robots with an appropriate mental model of how the other party will react to various behaviors, data quality, instructions, and environmental changes, are of central importance for collaborative robotics, referred to here as “co-botics”.
The planned research will focus on applying advanced machine learning and pattern recognition methodologies to facilitate shared intelligent cooperation between robotic units and humans. Advanced multi-modal data analysis, describing cues from the real world (including humans) drawn from multiple information sources, will be developed and applied to this end. Building on that technology, online visual information analysis will be combined with sensor data analysis for decision making, which will be interpreted throughout the system as suggestion-based cooperation through shared intelligent interactions.
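To make the notion of suggestion-based cooperation concrete, the following is a minimal late-fusion sketch, not the project's actual method: each modality (all names, scores, and weights below are hypothetical) reports a confidence score for a candidate action, and the fused score produces a suggestion rather than an autonomous command.

```python
# Illustrative late-fusion sketch (hypothetical modality names and weights):
# each modality reports a confidence score in [0, 1] for a candidate action;
# the fused score drives a suggestion to the human, not a command.

def fuse_cues(cues, weights):
    """Weighted average of per-modality confidence scores."""
    total = sum(weights[name] for name in cues)
    return sum(weights[name] * score for name, score in cues.items()) / total

def suggest(action, cues, weights, threshold=0.6):
    """Return a suggestion dict; below the threshold, defer to the human."""
    score = fuse_cues(cues, weights)
    return {
        "action": action,
        "confidence": round(score, 3),
        "suggested": score >= threshold,
    }

decision = suggest(
    "hand_over_package",
    cues={"vision": 0.9, "force_sensor": 0.7},
    weights={"vision": 0.6, "force_sensor": 0.4},
)
```

In this sketch the threshold is the point at which the system volunteers a suggestion; the actual project would learn such fusion weights from multi-modal data rather than fix them by hand.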
The project will investigate efficient scheduling approaches that involve both robots and humans. An example of such a scenario is the efficient loading and unloading of cargo in, e.g., autonomous shipping applications. We will work on providing accurate descriptions of the objects appearing within the working envelope of each robot, leading to a safe and efficient working environment. We believe that exploiting package information and combining all available sensor cues, instead of relying only on destination information, can lead to intelligent planning. To make the intelligent cooperation between humans and robots more accurate, efficient scheduling will be investigated over labelled data for specific tasks, if such data is provided by the sponsoring industry.
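The loading/unloading scenario can be illustrated with a simple greedy scheduler, a sketch only, using hypothetical task and agent data: packages (described by more than their destination, here weight and handling time) are assigned to whichever agent, robot or human, becomes free first and can handle the package.

```python
# Illustrative greedy scheduling sketch (all task/agent data is hypothetical):
# packages go to the earliest-free agent whose capacity covers the weight.
import heapq

def schedule(packages, agents):
    """packages: list of (name, weight, duration);
    agents: dict agent_name -> weight capacity.
    Returns a list of (package, agent, start_time) assignments."""
    free_at = [(0.0, name) for name in sorted(agents)]  # (time free, agent)
    heapq.heapify(free_at)
    plan = []
    for name, weight, duration in packages:
        parked = []  # agents popped but unable to carry this package
        while free_at:
            t, agent = heapq.heappop(free_at)
            if agents[agent] >= weight:
                plan.append((name, agent, t))
                heapq.heappush(free_at, (t + duration, agent))
                break
            parked.append((t, agent))
        for item in parked:  # restore skipped agents for later packages
            heapq.heappush(free_at, item)
    return plan

plan = schedule(
    packages=[("p1", 5, 2), ("p2", 25, 3), ("p3", 5, 1)],
    agents={"human": 10, "robot_arm": 30},
)
```

Here the heavy package goes to the higher-capacity robot while light packages are shared with the human. In the project itself, such capacity and timing parameters would be derived from sensor cues and, where available, from industry-provided labelled data rather than fixed constants.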