Temporally_consistent_gradient_based_video_editing_EMBEDDED.pdf (37 MB)

Temporally consistent gradient based video editing

Presentation posted on 2013-12-17, 08:14, authored by Gabriele Facciolo, Pablo Arias, Rida Sadek, Vicent Caselles.

Presentation* of the paper: “A Variational Model for Gradient-Based Video Editing”, Sadek, Facciolo, Arias, and Caselles, IJCV, 2013.

Presented 2013-06-24 at IMNC, Orsay, France.
Presented 2013-10-04 at Technicolor, Rennes, France. 

* The embedded videos are only visible with Adobe Acrobat.

 

In this work we present a gradient-based variational model for video editing, addressing the problem of propagating gradient-domain information along the optical flow of the video. The resulting propagation is temporally consistent and blends seamlessly with its spatial surroundings. In addition, the model copes with additive illumination changes and handles occlusions and disocclusions. The problem of propagation along the optical flow arises in several video editing applications; here we consider the one in which a user edits a single frame by modifying the texture of an object's surface and wishes to propagate this edit throughout the video.
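To make the idea concrete, below is a minimal Python sketch of this kind of gradient-domain propagation, not the variational model of the paper: the edited gradients of the previous frame are warped into the current frame along the optical flow, and the edited region is then reconstructed with a Poisson-style least-squares solve, using the unedited pixels as boundary values. All names (warp_backward, poisson_fill, propagate_edit), the nearest-neighbour warping, the per-frame solve, and the assumption that the mask stays away from the image border are illustrative; the sketch also ignores illumination changes and occlusions, which the actual model handles.

import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve


def warp_backward(img, flow):
    """Warp `img` (H x W) by `flow` (H x W x 2, (dx, dy)) with nearest-neighbour sampling."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xw = np.clip(np.rint(xs + flow[..., 0]).astype(int), 0, w - 1)
    yw = np.clip(np.rint(ys + flow[..., 1]).astype(int), 0, h - 1)
    return img[yw, xw]


def poisson_fill(frame, gx, gy, mask):
    """Recompute the pixels inside `mask` so their gradients match (gx, gy);
    pixels outside the mask keep their original values (Dirichlet boundary).
    Assumes the mask does not touch the image border."""
    h, w = frame.shape
    ids = -np.ones((h, w), dtype=int)
    ids[mask] = np.arange(mask.sum())
    A = lil_matrix((mask.sum(), mask.sum()))
    b = np.zeros(mask.sum())
    for y, x in zip(*np.nonzero(mask)):
        i = ids[y, x]
        A[i, i] = 4.0
        # Target Laplacian = divergence of the propagated gradient field.
        b[i] = gx[y, x - 1] - gx[y, x] + gy[y - 1, x] - gy[y, x]
        for yy, xx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if mask[yy, xx]:
                A[i, ids[yy, xx]] = -1.0
            else:
                b[i] += frame[yy, xx]  # known, unedited boundary pixel
    out = frame.copy()
    out[mask] = spsolve(A.tocsr(), b)
    return out


def propagate_edit(frames, flows_back, edited0, mask0):
    """frames: list of H x W grayscale arrays; flows_back[t]: flow from frame t to t-1.
    edited0 / mask0: the user-edited first frame and the edited region."""
    prev, mask, out = edited0, mask0, [edited0]
    for t in range(1, len(frames)):
        # Move the edited content (and its mask) forward along the flow.
        warped = warp_backward(prev, flows_back[t])
        mask = warp_backward(mask.astype(float), flows_back[t]) > 0.5
        gx = np.diff(warped, axis=1, append=warped[:, -1:])
        gy = np.diff(warped, axis=0, append=warped[-1:, :])
        prev = poisson_fill(frames[t], gx, gy, mask)
        out.append(prev)
    return out

Because only the gradients of the warped edit are kept and the solve is anchored on the surrounding unedited pixels, each reconstructed frame blends with its spatial context while inheriting the edit from the previous frame, which is the behaviour the variational model formalizes.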

 
