Robot Avatar Jonny by Dragon Tree Labs for ANA Avatar XPRIZE competition


Our avatar, “Avatar Jonny”, is designed for real-time telepresence communication: the Operator observes the surroundings through VR glasses, drives the robot around on wheels, and interacts with objects using two 6-axis manipulators. We rely on object recognition technology to help the Operator manipulate objects and feel naturally present without haptic feedback; it also gives the Recipient a feeling of safety around the Avatar.

Our demo scenario is based on the well-known “Guess which Cup” game, in which the player must guess under which cup the game leader has hidden the ball. In our case, the Avatar’s Operator acts as the player and the Recipient acts as the game leader. When the game starts, the Recipient and the Operator (via the Avatar) greet each other, and the Recipient invites the Operator to play a round of “Guess which Cup”.

Let us briefly highlight Jonny’s key features:
— The Avatar’s Operator can look around via the VR glasses simply by turning their head (as in real life);
— An on-board neural network is trained to recognize objects in front of the Avatar, allowing the Operator to interact with the real world through the Avatar’s arms;
— The Operator drives the Avatar with a regular gamepad (see picture below). This approach is easy and convenient, and very close to controlling a car in a video game;
— Detected objects are selected with the gamepad’s buttons (see picture below); the objects available for manipulation are detected by a neural network.
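The description does not specify how the gamepad axes map to the wheels; as a minimal sketch of one common approach (differential-drive mixing of the stick’s forward and turn axes; the function name and normalization scheme here are our assumptions, not the team’s actual code), it could look like this:

```python
def stick_to_wheel_speeds(forward, turn, max_speed=1.0):
    """Mix gamepad stick axes (each in -1..1) into left/right wheel speeds.

    forward > 0 drives straight ahead; turn > 0 rotates clockwise.
    """
    left = forward + turn
    right = forward - turn
    # Normalize so that neither wheel command exceeds max_speed.
    scale = max(abs(left), abs(right), 1.0)
    return (left / scale * max_speed, right / scale * max_speed)
```

Full forward stick gives both wheels full speed; full turn with no forward input spins the wheels in opposite directions, rotating the robot in place.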

Despite our initial plan to use a quadruped robotic platform, we settled on a wheel-based one. We were not able to purchase a Boston Dynamics Spot and the set of Universal Robots manipulators as originally planned; instead, we integrated R&D prototypes of the platform and manipulators designed and built by university labs in Russia. We also discovered that placing a manipulator on the back of a quadruped platform complicates the Operator's perception of the body and the overall control of the arms, so we designed the upper part of the Avatar to resemble the human body (with the arms placed slightly below the head and at the sides).

Additionally, we integrated robotics assistance intelligence hardware designed by the Russian robotics start-up Fast Sense Studio to run multiple neural networks on the edge.

The processing unit consists of three computers:
— NUC — controls the manipulators (based on ROS);
— Raspberry Pi — controls the wheels and the head;
— Fast Sense Robotics Assistance Intelligence — runs the neural networks.
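The description does not say how these computers exchange commands; purely as an illustration (the message format and field names below are assumptions, not the team's actual protocol), a wheel command passed from the control side to the Raspberry Pi could be sketched as a small serialized message:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical wheel command; the real on-robot protocol is not documented here.
@dataclass
class WheelCommand:
    left: float   # left wheel speed, normalized to -1..1
    right: float  # right wheel speed, normalized to -1..1

def encode(cmd: WheelCommand) -> bytes:
    """Serialize a command for sending over the wire (e.g. UDP or a serial link)."""
    return json.dumps(asdict(cmd)).encode("utf-8")

def decode(data: bytes) -> WheelCommand:
    """Parse a received command on the Raspberry Pi side."""
    return WheelCommand(**json.loads(data.decode("utf-8")))
```

A plain-text format like this makes it easy to debug traffic between the computers, at the cost of a few bytes of overhead per message.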

A couple of words about Fast Sense Robotics AI (fastsense.readthedocs.io/en/latest/). It is our proprietary platform, designed specifically to run numerous neural networks and collect data from tens of different sensors. The platform is a powerful onboard computer that brings scalable Edge AI capabilities to mobile robotics. It consists of a single-board COM Express module with an Intel CPU, a set of edge AI accelerators for running inference on several neural networks on board in real time, and numerous hardware interfaces to robotic sensors and actuators. The AI accelerators are connected over the M.2 PCIe interface and can be scaled depending on the task. The platform can run up to 6 different neural networks in parallel without performance degradation while receiving data from several sources in real time.
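The accelerator API itself is not shown in this description; to illustrate only the "several nets in parallel on one frame" idea (the function names are placeholders, and a thread pool stands in for the platform's per-accelerator dispatch), one might sketch:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for per-accelerator inference calls; on the real
# platform each network would run on its own M.2 edge-AI accelerator.
def detect_objects(frame):
    return {"net": "object-detector", "frame": frame}

def estimate_depth(frame):
    return {"net": "depth-estimator", "frame": frame}

def run_parallel(frame, nets):
    """Dispatch one input frame to every network concurrently and
    collect the results in the order the networks were given."""
    with ThreadPoolExecutor(max_workers=len(nets)) as pool:
        futures = [pool.submit(net, frame) for net in nets]
        return [f.result() for f in futures]
```

Because each accelerator works independently, adding another network (up to the platform's limit of 6) adds another parallel branch rather than lengthening the pipeline.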

You can find more information about our Avatar Robot Jonny and our team at the ANA Avatar XPRIZE competition on our website: dragontreelabs.tech/

If you have any questions, feel free to contact us by email at hello@dtlabs.tech.
