
Challenge / Goal

The scenario of large-scale deployment of robots in public spaces and cities still raises questions about their safety, reliability and acceptance by the general population. Furthermore, current systems have limited autonomy and often require human intervention to handle unexpected situations. In this view, large-scale deployment of robots is likely to require a standard built-in remote-control capability to:

A)    mitigate the risks that early-stage autonomy poses 
B)    open new scenarios through which robots can contribute to safety and resilience as an extension of city operations in case of emergency

The Use Case explored the challenges of enabling remote control at a scale comparable to a hypothetical large-scale deployment of robots in cities. We developed and tested a robotic embodiment system that uses Virtual Reality to give untrained users real-time control, sensing and manipulation of robots. The Use Case examined the ability to operate in a dynamic public environment and the enhancement of human sensing of the surroundings by mixing 3D, 2D and robotic sensor data in virtual reality.

Solution

We developed the Use Case iteratively, scaling up the complexity of the system to address:

1)    Mobility, using 3D reconstruction of the environment in Virtual Reality and mapping of robot movements to the motion of the user's head
2)    Manipulation, using a digital twin representation of the robot, its joints and actuators
3)    Mixing 3D and 2D video feeds to broaden the field of view, and rendering information from the robot's proximity and collision sensors in the user's field of view
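The head-to-robot mapping in point 1 can be sketched in a few lines. This is a hypothetical illustration, not Extend Robotics' actual implementation: the function name, the 45-degree full-command angle and the deadzone threshold are all assumptions chosen for the example.

```python
import math

def head_pose_to_base_cmd(yaw_rad, pitch_rad, max_linear=0.5, max_angular=1.0,
                          deadzone_rad=0.1):
    """Map headset orientation to a differential-drive velocity command.

    Tilting the head forward/back scales linear speed; turning the head
    left/right scales angular speed. A deadzone keeps small, involuntary
    head motion from moving the robot.
    """
    def scaled(angle, limit):
        if abs(angle) < deadzone_rad:
            return 0.0
        # Normalise the remaining range so the command saturates at 45 degrees.
        span = math.pi / 4 - deadzone_rad
        frac = (abs(angle) - deadzone_rad) / span
        return math.copysign(min(frac, 1.0) * limit, angle)

    return scaled(pitch_rad, max_linear), scaled(yaw_rad, max_angular)

# Looking roughly straight ahead: the deadzone keeps the robot still.
print(head_pose_to_base_cmd(0.0, 0.05))  # (0.0, 0.0)
```

The deadzone is the key design choice here: without it, the natural jitter of a human head would translate into constant small base motions, which is both unsafe and disorienting for the operator.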

The solution was built on Extend Robotics' proprietary AMAS Meta Quest application for low-latency 3D video feeds and remote control, adapted to support mobile robotic platforms and extended with new capabilities.


Time period

Planning time: Less than 6 months

Implementation time: 6 months to 1 year

Implementers

Open University

Service providers

Extend Robotics

End users

Health and safety, security, emergency response units, hospitals, care homes and shopping malls
