"Epic's new motion-capture animation tech has to be seen to be believed"

"MetaHuman Animator" goes from iPhone video to high-fidelity 3D movement in minutes.

Every year at the Game Developers Conference, a handful of competing companies show off their latest motion-capture technology, which transforms human performances into 3D animations that can be used on in-game models. Usually, these technical demonstrations involve a lot of specialized hardware for the performance capture and a good deal of computer processing and manual artist tweaking to get the resulting data into a game-ready state. Epic's upcoming MetaHuman facial animation tool looks set to revolutionize that kind of labor- and time-intensive workflow.

What are the limits, constraints, and processing requirements? Where are the hidden issues? Having been immersed in performance capture for the last few years, and having seen how poor the results from live capture and suit-based capture can be, I always wonder what the hidden compromise is in new systems. It will be very interesting to test this pipeline and determine both the fidelity it actually achieves compared with existing methods and how much noise it introduces, whether that noise is acceptable as-is or requires a post-processing cleanup pass.