U.S. Patent No. 10,665,265: Event reel generator for video content
Issued May 26, 2020 to Sony Interactive Entertainment America LLC
Priority Date: February 2, 2018
U.S. Patent No. 10,665,265 (the ’265 Patent) relates to generating an event reel of the best moments from a longer video, such as a recorded video game session. The ’265 Patent details a method of selecting video segments based on spectator reaction data and synchronizing the cuts between segments to a music track, creating an edited compilation of video clips. In the context of video game streaming, for example, spectators may “live react” to video game sessions by inputting emojis, likes, and thumbs up/down that correlate to the video content in real time. This spectator reaction data is used to identify which sections of the video content generated the strongest feedback so that optimal video clips may be compiled into an event reel.
Methods and systems are provided for generating an event reel based on spectator reaction data, where the event reel is temporally synchronized to a music track. A method includes receiving a video file for video content and receiving spectator reaction data related to reactions generated by spectators while viewing the video content. The method includes processing the spectator reaction data to identify video time slices from the video content that correspond to segments of interest of the video content. The method includes processing a music track to identify markers for the music track that correspond to beats of the music track, and generating an event reel having a video component defined by a sequence of the video time slices that is temporally synchronized to the markers of the music track.
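The claimed reaction-intensity step, a summation of timestamped reactions over fixed intervals followed by a threshold test, can be sketched in code. This is a hypothetical illustration, not the patent's implementation: the interval length, threshold value, and function names are assumptions for the example.

```python
from collections import Counter

def reaction_intensity(timestamps, interval=5.0):
    """Sum timestamped spectator reactions into fixed-length intervals.

    timestamps: seconds into the video at which reactions occurred.
    Returns a dict mapping interval index -> reaction count (the
    "summation of reactions during each interval" in the claim).
    """
    return dict(Counter(int(t // interval) for t in timestamps))

def select_time_slices(timestamps, interval=5.0, threshold=3):
    """Return (start, end) video time slices whose reaction intensity
    meets a threshold, identifying the segments of interest."""
    counts = reaction_intensity(timestamps, interval)
    return [(i * interval, (i + 1) * interval)
            for i, n in sorted(counts.items()) if n >= threshold]

# Example: a burst of reactions early in the video and another at ~21s.
reactions = [1.0, 2.0, 3.0, 4.0, 12.0, 21.0, 22.0, 23.0]
print(select_time_slices(reactions))  # [(0.0, 5.0), (20.0, 25.0)]
```

Binning by `t // interval` keeps the pass over the reaction data linear; any segment whose summed count falls below the threshold is simply never emitted as a slice.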
1. A method for creating an event reel for video content that is synchronized to music, the method including:
    receiving a video file for the video content;
    receiving spectator reaction data defined by reactions generated by spectators while viewing the video content, the spectator reaction data is indicative of segments of interest, wherein the reactions of the spectator reaction data are timestamped;
    processing the spectator reaction data for identifying video time slices from the video content that correspond to the segments of interest of the video content having a threshold reaction intensity, each of the video time slices includes a plurality of video frames from the video content, wherein processing the spectator reaction data includes calculating reaction intensity for the spectator reaction data during intervals of time for the video file, the reaction intensity is identified by a summation of reactions during each of the respective intervals of time;
    processing a music track, received for creating the event reel, to identify markers for the music track that correspond to beats associated with the music track; and
    generating the event reel having a video component assembled for a sequence of the video time slices and an audio component defined at least partially by the music track, the generating the event reel includes synchronizing the video time slices to the markers for the music track, wherein the synchronizing causes scene changes between sequential video time slices to correspond in time with at least a portion of the beats associated with the music track.
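The final claimed step, synchronizing scene changes between sequential time slices to beat markers, can also be sketched. This is a hedged illustration of one plausible approach (trimming each slice so its cut lands on the latest beat not past the slice's natural end); the patent does not mandate this particular alignment rule, and all names here are assumptions.

```python
import bisect

def synchronize_to_beats(slice_durations, beat_times):
    """Place video time slices on the reel timeline so the scene
    change into the next slice falls on a beat of the music track.

    slice_durations: natural durations (seconds) of the selected slices.
    beat_times: ascending beat-marker times (seconds) in the track.
    Returns (start, end) of each slice on the reel timeline.
    """
    timeline, pos = [], 0.0
    for dur in slice_durations:
        # Latest beat marker at or before this slice's natural end.
        i = bisect.bisect_right(beat_times, pos + dur) - 1
        # Fall back to the natural cut if no beat fits in the slice.
        end = beat_times[i] if i >= 0 and beat_times[i] > pos else pos + dur
        timeline.append((pos, end))
        pos = end
    return timeline

# Example: beats every 2 s; two 3-second slices get trimmed so each
# cut lands on a beat marker.
print(synchronize_to_beats([3.0, 3.0], [0, 2, 4, 6, 8, 10]))
# [(0.0, 2), (2, 4)]
```

Trimming to the preceding beat (rather than padding to the next one) guarantees no slice runs past its selected segment of interest, at the cost of slightly shortening each clip.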