Researchers at the Massachusetts Institute of Technology (MIT) have introduced a new technique for animating characters in movies and video games that gives artists greater flexibility and control. The method works by generating mathematical functions called barycentric coordinates, which define how 2D and 3D shapes can move, stretch, and bend through space.
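In this setting (described here in a standard formulation with illustrative notation, not the paper's own), a barycentric coordinate function assigns every point of a shape a set of weights over a surrounding collection of control points, often called a cage; moving the control points then moves every point of the shape accordingly:

```latex
% Illustrative notation: p is a point of the shape, c_j are control (cage) points,
% and w_j are the barycentric coordinate functions.
\[
  p = \sum_j w_j(p)\,c_j, \qquad \sum_j w_j(p) = 1, \qquad w_j(p) \ge 0 .
\]
% When an artist drags the control points to new positions c_j',
% the shape deforms as
\[
  p' = \sum_j w_j(p)\,c_j' .
\]
```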
Currently available animation techniques can be rigid, offering only a single barycentric coordinate function for a given animated character. Artists who want even a slightly different effect are often forced to build a character's motion again from the ground up.
The new technique developed at MIT, however, lets artists preview shapes and character movements and select the smoothness energy that best matches their preferences. The computations are handled by a neural network designed to generate barycentric coordinate functions, guaranteeing that the functions it produces always satisfy the mathematical constraints that make them valid.
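For context, one common example of a smoothness energy in this line of work is a Dirichlet-style energy, which penalizes rapid spatial variation of the coordinate functions. The notation below is illustrative only; per the description above, the MIT system lets artists choose among a family of such energies rather than fixing a single one:

```latex
% Dirichlet-style smoothness energy over the shape's interior \Omega (illustrative).
\[
  E[w] = \sum_j \int_{\Omega} \left\lVert \nabla w_j(p) \right\rVert^2 \, dp .
\]
```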
The team's approach returns to the basic mathematical construction of barycentric coordinates introduced in 1827 by the German mathematician August Möbius, which describes how a point sits relative to the vertices of a simple enclosing shape such as a triangle. These coordinates allow a surrounding cage of control points to exert influence over the shape within it. For their new technique, the researchers cover complex, modern character cages with overlapping virtual triangles and build the cage's barycentric coordinates out of these simpler pieces.
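Möbius's construction for a single triangle can be written as ratios of areas; this is the classical formula, with notation chosen here for illustration:

```latex
% Classical triangle barycentric coordinates: a point p inside a triangle with
% vertices v_1, v_2, v_3 gets weights proportional to the areas of the
% sub-triangles opposite each vertex.
\[
  w_i(p) = \frac{\mathrm{Area}(p,\, v_{i+1},\, v_{i+2})}{\mathrm{Area}(v_1, v_2, v_3)},
  \qquad i \in \{1, 2, 3\} \ \text{(indices mod 3)},
\]
% so that w_1 + w_2 + w_3 = 1 and p = w_1 v_1 + w_2 v_2 + w_3 v_3.
```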
A neural network model then combines the virtual triangles' barycentric coordinates into a single function that is complex yet smooth. Artists can tweak these functions to produce different motions until they arrive at an animation that matches their vision.
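As a rough sketch of the idea (not the authors' code; the toy cage, function names, and the fixed blend weights standing in for a network's output are all hypothetical), blending each virtual triangle's classical coordinates with nonnegative weights that sum to one automatically preserves the properties that make barycentric coordinates valid:

```python
# Illustrative sketch, not the authors' code: blending the barycentric coordinates
# of simple virtual triangles with nonnegative, normalized weights yields
# coordinates for a more complex cage that stay valid by construction.
# The toy cage, triangle list, function names, and fixed blend weights below are
# all hypothetical; in the MIT method a neural network would produce the blend
# weights as a smooth function of the query point.

import numpy as np

def tri_barycentric(p, a, b, c):
    """Classical (Möbius) barycentric coordinates of p in triangle (a, b, c),
    computed as signed-area ratios."""
    def signed_area(u, v, w):
        return 0.5 * ((v[0] - u[0]) * (w[1] - u[1]) - (w[0] - u[0]) * (v[1] - u[1]))
    total = signed_area(a, b, c)
    return np.array([
        signed_area(p, b, c) / total,
        signed_area(a, p, c) / total,
        signed_area(a, b, p) / total,
    ])

def combine_coordinates(p, cage, triangles, blend):
    """Blend per-triangle coordinates into one coordinate per cage vertex.

    cage      : (n, 2) array of cage vertex positions
    triangles : list of index triples into the cage (the 'virtual triangles')
    blend     : nonnegative weights over the triangles that sum to 1 at p
    """
    w = np.zeros(len(cage))
    for (i, j, k), alpha in zip(triangles, blend):
        lam = tri_barycentric(p, cage[i], cage[j], cage[k])
        w[[i, j, k]] += alpha * lam
    return w  # still sums to 1 and reproduces p, because each triangle's coords do

if __name__ == "__main__":
    # Toy square cage and two overlapping virtual triangles that both contain p.
    cage = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    triangles = [(0, 1, 2), (0, 1, 3)]
    p = np.array([0.4, 0.3])
    blend = np.array([0.7, 0.3])      # stand-in for a network's normalized output
    w = combine_coordinates(p, cage, triangles, blend)
    print("coordinates:", w)          # all nonnegative here
    print("sum:", w.sum())            # -> 1.0 (partition of unity)
    print("reproduced p:", w @ cage)  # -> [0.4, 0.3] (linear reproduction)
```

Because every triangle's coordinates already sum to one and reproduce the query point, any normalized blend of them does too, which is the sense in which the generated functions are always valid; the network's remaining job is to make the blend smooth and adjustable by the artist.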
The technique also shows promise beyond the arts, with potential applications in medical imaging, virtual reality, architecture, and computer vision, where it could help robots understand how objects move in the real world.
Looking ahead, the team aims to speed up the neural network and to incorporate the method into an interactive interface that would let artists iterate on their animations in real time. The research was a collaboration among several institutions and received support from sources including the U.S. Army Research Office, the U.S. Air Force Office of Scientific Research, and the U.S. National Science Foundation.