
Sony’s Mocopi Full-Body VR Motion Trackers Are Small and Affordable



Starting early next year, Sony plans to capitalize on the rise in popularity of VTubers—virtual YouTube (and sometimes Twitch) celebrities—and our slow but steady transition to a life spent entirely in VR. The company will be introducing a new and relatively affordable motion capture system that relies on just six sensors strapped to the body and a smartphone capturing all of the tracked motion data.

Computer-generated characters appearing in movies, TV, and video games were once all animated by hand, a time-consuming and expensive process that can result in a human character’s movements looking less than realistic (which, fair, isn’t always the goal). Motion capture helped solve that problem by recording the nuanced movements of human performers and translating them to virtual characters, right down to facial expressions and even eye movements. Now, it seems, we’re taking steps to do it live.

Hollywood’s approach to motion capture for VFX-heavy productions like She-Hulk involves lots of expensive equipment, cameras strapped to actors’ heads, and often massive purpose-built studios packed with sensors to accurately capture a full-body performance. In other words, a budget that most VTubers don’t have access to.

There are more affordable solutions, such as software-based motion trackers, but they often suffer from accuracy issues. VR systems like the HTC Vive offer robust full-body tracking through wearable trackers and nearby base stations, but the price of that hardware quickly adds up again. And while solutions like SlimeVR offer multi-limb tracking for just a couple hundred bucks, it’s a crowd-funded product, which often means challenges when something goes wrong.

Mobile motion capture: possibilities, unlimited. The small, lightweight mobile motion capture system mocopi [Sony official]

Sony’s mocopi not only comes from one of the most recognizable names in electronics, it also delivers six lightweight wireless tracking sensors (reminiscent of Apple’s AirTags) worn on the ankles, wrists, lower back, and head. The whole package costs 49,500 yen, or about $360. Expensive, but much cheaper than a studio or even a Vive. Although the sensors are held in place with Velcro straps (and a clip for the lower back sensor to attach to pants), the mocopi system looks mostly unobtrusive, as all of the performance capture is handled by a mobile app running on a smartphone.

Mobile motion capture: a detailed guide to using mocopi [Sony official]

Setup, as demonstrated in this how-to video, looks easy and straightforward, with the app providing a live preview of the user’s captured movements as applied to a virtual character. The mocopi system can be used in real time to bring a VTuber character to life, or to drive a virtual reality avatar in apps like VRChat, but through an SDK being released on December 15th, the captured performance data can also be imported into 3D animation programs to create more lifelike characters.
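As an illustration of that last step: Sony hasn’t detailed exactly which export formats the SDK supports, but assuming the captured motion can be saved in a common interchange format such as BVH (an assumption here, not something confirmed in this article), pulling a take into a 3D package like Blender would look roughly like the minimal Python sketch below, run from Blender’s built-in scripting workspace. The file name is hypothetical.

    # Minimal sketch (assumption): import a BVH motion-capture file into Blender.
    # "mocopi_take.bvh" is a hypothetical file name standing in for whatever
    # the capture app actually exports.
    import bpy

    bvh_path = "/path/to/mocopi_take.bvh"  # hypothetical export from the capture app

    # Blender ships with a BVH importer; this creates an armature with the
    # recorded performance applied as keyframes.
    bpy.ops.import_anim.bvh(filepath=bvh_path, global_scale=1.0, frame_start=1)

    # The imported armature becomes the active object and can be retargeted
    # onto a character rig from here.
    print("Imported:", bpy.context.active_object.name)

From there, the recorded motion could be retargeted onto a game character or VTuber rig like any other mocap take.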

Although mocopi won’t provide the same level of accuracy as the hardware that Hollywood relies on, it could be a more affordable motion capture solution for filmmakers or game developers with limited budgets. Pre-orders are expected to open sometime in mid-December, with the system shipping about a month later in January 2023; however, Sony is initially only making mocopi available in Japan, with no details on if or when it will reach other markets.

