U.S. Patent No. 10,265,621: Tracking specific gestures related to user movement
Issued April 23, 2019 to Disney Enterprises, Inc.
Priority Date January 20, 2015
U.S. Patent No. 10,265,621 (the ’621 Patent) relates to inter-device communications, and more specifically to distinguishing specific gestures in user movements in an immersive play environment. The ’621 Patent describes a platform providing an immersive play environment in which user movement is captured via tracking devices: multiple tracking devices capture first and second tracking data. This addresses a limitation of non-immersive play environments, where a user must rely on a controller to interact directly with the video game system. For example, when the user performs a particular action, such as swinging a toy sword at an interactive object, the devices can react with various audiovisual effects, creating a feeling of sensory immersion for the user. This enhanced level of realism is not possible with traditional analog controller interfaces, and thus the ’621 Patent could change the immersive video game environment.
Techniques are disclosed for distinguishing user movements in an immersive play environment. A first tracking device measures first tracking data, and a second tracking device measures second tracking data. The second tracking device receives the first tracking data from the first tracking device and determines third tracking data based on a combination of the first tracking data and the second tracking data. A controller device then receives this tracking data from the second tracking device.
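The patent does not specify how the two data streams are combined. A minimal sketch in Python of the data flow described above, assuming timestamped gyroscope samples and a simple nearest-timestamp pairing (the `Sample` type, the pairing rule, and all field names are illustrative assumptions, not taken from the ’621 Patent):

```python
from dataclasses import dataclass


@dataclass
class Sample:
    """One inertial reading: timestamp in seconds and a gyroscope
    angular-velocity vector (x, y, z) in rad/s. Hypothetical format."""
    t: float
    gyro: tuple


def combine_tracking(first: list, second: list) -> list:
    """Combine handheld ('first') and wearable ('second') tracking data.

    For each wearable sample, find the handheld sample nearest in time
    and emit a combined reading (timestamp, handheld gyro, wearable
    gyro). This stands in for the 'third tracking data' the second
    tracking device derives before sending it to the controller device.
    """
    combined = []
    for s in second:
        nearest = min(first, key=lambda f: abs(f.t - s.t))
        combined.append((s.t, nearest.gyro, s.gyro))
    return combined
```

In this sketch the wearable does only the cheap pairing step, leaving gesture classification to the controller device, which matches the division of labor the claim language suggests.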
1. A platform providing an immersive play environment, the platform comprising: a controller device configured to transmit instructions to an interactive toy device in the immersive play environment; a plurality of tracking devices configured to track movement of a user during a play experience in a physical space, the plurality of tracking devices including a handheld device configured to be held by the user and a wearable device configured to be worn by the user, wherein the handheld device and the wearable device include at least one of a gyroscope, an accelerometer, and a magnetometer, wherein each of the handheld device and the wearable device is configured to capture movement of the user in the physical space over time as first tracking data and second tracking data, respectively, wherein a virtual object is extrapolated from the handheld device in a three-dimensional vector space corresponding to the physical space, wherein the virtual object is substantially linear, wherein the wearable device is configured to receive the first tracking data from the handheld device, generate combined tracking data based at least in part on the first tracking data and the second tracking data, and transmit the combined tracking data to the controller device, whereupon the controller device: determines that the combined tracking data reflects a movement sequence that triggers a hit event, based on determining: (i) that the movement sequence is within a predefined range of the interactive toy device and (ii) that the movement sequence defines an arc in the three-dimensional vector space with a rotational velocity exceeding a specified threshold velocity; and upon determining that the movement sequence triggers the hit event, causes the interactive toy device to perform, in the physical space, an action simulating collision between the virtual object and the interactive toy device absent a physical collision therebetween, the action comprising an effect selected from an audio effect, a visual effect, and an audiovisual effect.
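The controller device's two-part hit-event test in claim 1 can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the range limit, the velocity threshold, and the use of peak gyroscope magnitude as "rotational velocity" are all assumptions made for the example.

```python
import math


def rotational_velocity(samples):
    """Peak angular speed (rad/s) over a movement sequence given as
    (timestamp, (gx, gy, gz)) gyroscope readings. A simple proxy for
    the rotational velocity of the arc swept by the virtual object."""
    return max(math.sqrt(gx * gx + gy * gy + gz * gz)
               for _, (gx, gy, gz) in samples)


def triggers_hit_event(samples, distance_to_toy_m,
                       max_range_m=1.0, threshold_rad_s=3.0):
    """Claim 1's two determinations, sketched: (i) the movement
    sequence occurs within a predefined range of the interactive toy
    device, and (ii) the arc's rotational velocity exceeds a specified
    threshold velocity. The numeric defaults are illustrative only."""
    in_range = distance_to_toy_m <= max_range_m
    fast_enough = rotational_velocity(samples) > threshold_rad_s
    return in_range and fast_enough
```

When both conditions hold, the controller would instruct the interactive toy device to perform the audio, visual, or audiovisual effect simulating a collision with the virtual object.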