
Advanced Animatronic Head Capable of Human Motion

robotics, animatronics

The animatronic industry is evolving rapidly, and artisans are experimenting with existing designs. One of their most significant innovations is the advancement of animatronic head design.


To achieve the desired results, researchers are exploring the potential of facial- and head-tracking software.


Noteworthy Research in the Field


Hanson Robotics identifies itself as one of the pioneers of the industry, although many of its A.I. demonstrations are pre-recorded performances. Its extensive research into animatronic heads enabled it to create robots that can analyze facial expressions, such as Zeno, as well as robots that can display and be controlled by an actor's facial expressions, like Jules and Einstein. Entertaining, but is it ethical?


One limitation of these robots is their complexity. That is why they serve mostly as inspirational models for face-tracking and face-detection robots, rather than as directly practical ones.


The MIT Media Lab designed Nexi, a robot that can express human emotions. Unlike the robots created by Hanson Robotics, her face consists only of movable rigid components.


Interestingly, Nexi's mouth is rigid and can only open and close; she has no horizontal control over her mouth with which to form other lip movements.




The problem is that each of this robot's eyebrows is controlled individually, as is each set of eyelids. This complicates both the mechanism and the design procedure, and it compels roboticists to eliminate such technical complexities in future generations of animatronic figures.


Advanced Project Goals


According to new-age researchers, animatronic figures need to meet the following requirements:


  • The animatronic head must simulate the shape of a human head.

  • The robot should have a human-like voice.

  • Where feasible, all movements of the figure should be recorded using face-tracking software.

  • Though the robotic head cannot replicate a human face and head in all their functionality, a wide range of fundamental features still needs to be included.

  • Finally, the gestures of the head should be coordinated with other parts of the figure, such as the animatronic hand and neck; otherwise, the performance will appear bizarre.
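As a rough illustration of how tracked facial parameters might drive the figure, the sketch below maps normalized tracking values onto hobby-servo pulse widths. All channel names and ranges here are illustrative assumptions, not taken from any particular product or tracker:

```python
# Minimal sketch: map normalized face-tracking parameters (0.0-1.0)
# to hobby-servo pulse widths in microseconds. Channel names and
# ranges are illustrative assumptions.

SERVO_RANGES_US = {
    "jaw":        (1000, 2000),  # closed -> fully open
    "brow_left":  (1200, 1800),
    "brow_right": (1200, 1800),
    "neck_pan":   (1000, 2000),
}

def to_pulse(value: float, lo: int, hi: int) -> int:
    """Clamp a normalized tracking value to [0, 1] and scale it to a pulse width."""
    value = max(0.0, min(1.0, value))
    return round(lo + value * (hi - lo))

def map_frame(tracking: dict) -> dict:
    """Convert one frame of tracking data to per-channel pulse widths.

    Missing channels default to the mid position (0.5).
    """
    return {ch: to_pulse(tracking.get(ch, 0.5), lo, hi)
            for ch, (lo, hi) in SERVO_RANGES_US.items()}

frame = {"jaw": 0.25, "brow_left": 1.2, "neck_pan": 0.5}
print(map_frame(frame))
```

Clamping out-of-range tracker values before scaling keeps a noisy detection from slamming a servo against its mechanical stop.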


Software


Roboticists use many different software platforms to develop such a head; these can be roughly divided into two major groups.


The first group is face-tracking software, which obtains data from a human subject performing a series of movements in front of a high-quality webcam.
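Per-frame tracking data from a webcam is inherently noisy, and a common generic technique is to smooth it before it drives the servos. The sketch below uses simple exponential smoothing; the alpha value and parameter names are illustrative assumptions, not part of any specific tracker's API:

```python
# Sketch: exponential smoothing of noisy per-frame tracking values
# before they drive servos. Alpha and the parameter names are
# illustrative assumptions.

class Smoother:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha   # 0 < alpha <= 1; lower means smoother but laggier
        self.state = {}      # last smoothed value per parameter

    def update(self, sample: dict) -> dict:
        """Blend a new frame of tracking values into the smoothed state."""
        for name, value in sample.items():
            prev = self.state.get(name, value)  # first sample passes through
            self.state[name] = prev + self.alpha * (value - prev)
        return dict(self.state)

s = Smoother(alpha=0.5)
s.update({"jaw": 0.0})
print(s.update({"jaw": 1.0}))   # jaw moves halfway toward the new sample
```

The trade-off is latency: a smaller alpha hides jitter but makes the head visibly lag the performer.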


The initial challenge many roboticists faced was the lack of adequate open-source software: the available tools could not control the robot's mouth, eyelids, and eyebrows.


The commercial version of FaceAPI was the only software that could collect data on every required degree of freedom.


Gaze Tracker is another piece of open-source eye-tracking software, intended for gaze studies; on its own, however, it cannot map the robot's eye rotations.


Nevertheless, with certain core adjustments, the eye rotations can be extracted from the pixel information. Note that Gaze Tracker requires an infrared camera.


Conclusion


As technology progresses, the day is not far off when roboticists will be able to design a perfect animatronic figure.


If you would like to know more about animatronic hands and heads, contact the experts at Custom Entertainment Solutions. Call 01.801.410.4869.
