U.S. Patent No. 10,930,044: Control system for virtual characters

Issued February 23, 2021, to Mursion Inc.
Filed: October 16, 2019 (claiming priority to November 6, 2015)

Overview:

U.S. Patent No. 10,930,044 (the ’044 patent) relates to a control interface for virtual characters that allows a user to select facial expressions, poses, and behaviors for their virtual character. The ’044 patent describes a control system that includes a stored library of facial expressions, poses, and behaviors for the virtual character. The system displays a menu listing the expressions, poses, and behaviors stored in the library. When the system receives a user's selection of one or more expressions, poses, or behaviors, it displays an animation of the virtual character moving from its prior position to the selected position. The ’044 patent can be used in live human-avatar interactions to let users emote through their virtual characters, and it reads as a precursor to Apple's Animoji, introduced in 2017, which produces a similar result using automatic facial recognition rather than manual selection. A rough sketch of this flow appears below.
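The following is a minimal Python sketch of the general idea described above: a stored library of expressions, poses, and behaviors; a menu listing its contents; and a selection handler that animates the avatar from its prior state to the user-selected state. All class and function names here are illustrative assumptions, not anything disclosed in the ’044 patent.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the control-system concept: a stored library,
# a menu listing its contents, and a selection handler that animates
# the avatar from its prior state to the selected state.

@dataclass
class AvatarLibrary:
    facial_expressions: list = field(default_factory=lambda: ["neutral", "smile", "frown"])
    poses: list = field(default_factory=lambda: ["standing", "seated", "arms_crossed"])
    behaviors: list = field(default_factory=lambda: ["listening", "nodding", "fidgeting"])

@dataclass
class VirtualCharacter:
    expression: str = "neutral"
    pose: str = "standing"
    behavior: str = "listening"

def display_menu(library: AvatarLibrary) -> None:
    # Present the stored selections to the human interactor.
    print("Expressions:", ", ".join(library.facial_expressions))
    print("Poses:      ", ", ".join(library.poses))
    print("Behaviors:  ", ", ".join(library.behaviors))

def apply_selection(character: VirtualCharacter, expression=None, pose=None, behavior=None) -> None:
    # Animate the character from its prior state to the user-selected state.
    prior = (character.expression, character.pose, character.behavior)
    if expression:
        character.expression = expression
    if pose:
        character.pose = pose
    if behavior:
        character.behavior = behavior
    current = (character.expression, character.pose, character.behavior)
    print(f"Animating avatar from {prior} to {current}")

library = AvatarLibrary()
avatar = VirtualCharacter()
display_menu(library)
apply_selection(avatar, expression="smile", behavior="nodding")
```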

Abstract:

A control system provides an interface for virtual characters, or avatars, during live avatar-human interactions. A human interactor can select facial expressions, poses, and behaviors of the virtual character using an input device mapped to menus on a display device.

Illustrative Claim:

  1. A system for controlling a virtual character, comprising:
     one or more processors and a memory located at an interactor node, the memory including a stored library comprising a plurality of facial expressions, a plurality of poses, and a plurality of behaviors associated with the virtual character; and
     wherein the one or more processors are operable, via machine-readable instructions stored in the memory, to:
       display a menu on a first display device located at the interactor node, the menu configured to provide a listing of selections from one of the plurality of facial expressions, the plurality of poses, and the plurality of behaviors associated with the virtual character,
       receive from an input device at the interactor node one or more of a user-selected facial expression, a user-selected pose, and a user-selected behavior for the virtual character selected from the menu,
       provide an animation of movement of the virtual character from a prior position to a position exhibiting the one or more user-selected facial expression, user-selected pose, and user-selected behavior, and
       display on a second display device located at an end user node located remotely from the interactor node, the movement of the virtual character from the prior position to the position exhibiting the one or more user-selected facial expression, user-selected pose, and user-selected behavior.
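The claim adds an element the overview does not emphasize: the animation is shown both on a first display at the interactor node and on a second display at a remote end user node. Below is a minimal sketch, assuming a simple in-process message pass between two hypothetical node classes, of how that two-node arrangement might look; the transport, class names, and state fields are illustrative assumptions, not the patent's implementation.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical two-node sketch: the interactor node holds the library,
# shows the menu, and accepts selections; the remote end user node
# receives the resulting movement and displays it.

@dataclass
class AnimationUpdate:
    prior_state: dict
    new_state: dict

class EndUserNode:
    """Remote node: receives updates and displays the avatar's movement."""
    def receive(self, message: str) -> None:
        update = AnimationUpdate(**json.loads(message))
        print(f"[end user display] avatar moves {update.prior_state} -> {update.new_state}")

class InteractorNode:
    """Local node: stored library, menu on the first display, input handling."""
    def __init__(self, remote: EndUserNode) -> None:
        self.remote = remote
        self.library = {"facial_expressions": ["neutral", "smile"],
                        "poses": ["standing", "seated"],
                        "behaviors": ["listening", "nodding"]}
        self.state = {"expression": "neutral", "pose": "standing", "behavior": "listening"}

    def show_menu(self) -> None:
        for category, options in self.library.items():
            print(f"[interactor display] {category}: {options}")

    def select(self, **choices: str) -> None:
        prior = dict(self.state)
        self.state.update(choices)
        print(f"[interactor display] animating {prior} -> {self.state}")
        # Forward the same movement to the remote end user node for display.
        self.remote.receive(json.dumps(asdict(AnimationUpdate(prior, dict(self.state)))))

end_user = EndUserNode()
interactor = InteractorNode(end_user)
interactor.show_menu()
interactor.select(expression="smile", behavior="nodding")
```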