AI makes captivating movie trailers

River D'Almeida, Ph.D
3 min read · Jun 2, 2023

A new machine learning model uses viewers' brain waves to create trailers that tug at the heartstrings.

Photo by Erik Mclean on Unsplash

In today’s vast digital landscape, where a breathtaking array of video content awaits us at every click, finding that perfect gem amidst the ocean of choices can be an overwhelming and time-consuming task.

Enter the world of video summarization, a process that distills the essence of a video by extracting its most informative moments while losing as little vital information as possible. Read on to learn about EEG-Video Emotion-based Summarization (EVES), a model that fuses neural signals with deep reinforcement learning to produce video summaries that outperform prior approaches both quantitatively and qualitatively.
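To ground the idea, here is a minimal sketch of how a reinforcement-learning summarizer can score and select frames. It is not the EVES implementation; the network shape, the representativeness reward, and every hyperparameter below are illustrative assumptions.

```python
# Sketch: REINFORCE-style frame selection for video summarization.
# The architecture and reward are illustrative, not taken from the EVES paper.
import torch
import torch.nn as nn

class FrameScorer(nn.Module):
    """Scores each video frame's probability of appearing in the summary."""
    def __init__(self, feat_dim=1024, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Sequential(nn.Linear(2 * hidden, 1), nn.Sigmoid())

    def forward(self, frame_feats):           # (1, T, feat_dim)
        h, _ = self.rnn(frame_feats)
        return self.head(h).squeeze(-1)       # (1, T) selection probabilities

def reward(frame_feats, picks):
    """Toy reward: the selected frames should be representative of the whole video."""
    if picks.sum() == 0:
        return torch.tensor(0.0)
    summary_mean = frame_feats[0, picks.bool()].mean(dim=0)
    video_mean = frame_feats[0].mean(dim=0)
    return torch.cosine_similarity(summary_mean, video_mean, dim=0)

# One policy-gradient update on random features standing in for CNN frame embeddings.
feats = torch.randn(1, 120, 1024)             # 120 frames, hypothetical 1024-d features
model = FrameScorer()
optim = torch.optim.Adam(model.parameters(), lr=1e-4)

optim.zero_grad()
probs = model(feats)
dist = torch.distributions.Bernoulli(probs)
picks = dist.sample()                         # binary keep/drop decision per frame
loss = -(dist.log_prob(picks).sum() * reward(feats, picks[0]))
loss.backward()
optim.step()
```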

Traditionally, video summarization has relied heavily on human annotations, a costly and labor-intensive process. EVES sets itself apart by leveraging multimodal signals instead. By tapping into viewers' neural activity, it learns the nuances of visual interestingness, enabling it to craft video summaries that captivate and engage audiences on a deeper level.
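As a rough illustration of how neural signals could stand in for manual labels, the sketch below turns a viewer's EEG into a per-second "interestingness" score using a beta/alpha band-power ratio, a common arousal proxy in affective computing. The band limits, sampling rate, and the proxy itself are assumptions rather than EVES specifics.

```python
# Sketch: an EEG-derived arousal score as a substitute for human importance labels.
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Average power of a 1-D EEG signal within a frequency band."""
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def arousal_proxy(eeg_window, fs=250):
    """Beta/alpha power ratio, a rough proxy for emotional arousal."""
    beta = band_power(eeg_window, fs, 13, 30)
    alpha = band_power(eeg_window, fs, 8, 13)
    return beta / (alpha + 1e-8)

# Score one-second EEG windows recorded while a viewer watches the video;
# higher scores mark segments the summarizer should treat as more interesting.
fs = 250                                   # hypothetical EEG sampling rate (Hz)
eeg = np.random.randn(60 * fs)             # 60 s of one synthetic EEG channel
scores = [arousal_proxy(eeg[i:i + fs], fs) for i in range(0, eeg.size, fs)]
```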

To ensure a seamless alignment between the visual content and the neural signals, EVES incorporates a Time Synchronization Module (TSM). This…
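EVES's TSM is a learned component, but the basic bookkeeping it has to handle, mapping EEG recorded at hundreds of samples per second onto the timeline of individual video frames, can be sketched with simple interpolation. The frame rate and window length below are assumed for illustration only.

```python
# Sketch: aligning window-level EEG scores to per-frame video timestamps.
# This is only the alignment bookkeeping, not EVES's learned Time Synchronization Module.
import numpy as np

def align_eeg_to_frames(eeg_scores, eeg_hop_s, n_frames, fps):
    """Map one score per EEG window onto one score per video frame."""
    eeg_times = np.arange(len(eeg_scores)) * eeg_hop_s     # window start times (s)
    frame_times = np.arange(n_frames) / fps                 # frame timestamps (s)
    return np.interp(frame_times, eeg_times, eeg_scores)

# 60 one-second EEG windows aligned to a 25 fps clip covering the same minute.
frame_scores = align_eeg_to_frames(np.random.rand(60), 1.0, 60 * 25, 25)
```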


Follow me for bite-sized stories on the latest discoveries and innovations in biomedical research.