This repository contains code I wrote for my bachelor thesis, titled:
Neural Style Transfer (NST), developed by Gatys et al., is a technique to transfer the style of one image onto another.
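In the formulation of Gatys et al., style is matched through Gram matrices of CNN feature maps. A minimal NumPy sketch of that style loss is shown below; the feature extraction by a pretrained network is assumed to happen elsewhere, and the shapes are purely illustrative:

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) feature maps from one CNN layer
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # channel-to-channel correlations capture style while
    # discarding the spatial arrangement of the content
    return f @ f.T / (c * h * w)

def style_loss(gen_features, style_features):
    # mean squared difference between the two Gram matrices
    g_gen = gram_matrix(gen_features)
    g_style = gram_matrix(style_features)
    return float(np.mean((g_gen - g_style) ** 2))
```

In the full method this loss is summed over several layers and minimized together with a content loss by gradient descent on the output image.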
A natural next step is to extend this work to the domain of videos. Applying NST to every video frame individually results in strong flickering, but several approaches have been developed to enforce temporal coherence in the output video.
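Many of these approaches penalize deviation between consecutive stylized frames. A minimal NumPy sketch of such a temporal consistency term follows; the optical-flow warping and the occlusion mask are assumed to be computed elsewhere, and the exact formulation varies between papers:

```python
import numpy as np

def temporal_loss(frame_t, warped_prev, occlusion_mask):
    # frame_t:       current stylized frame
    # warped_prev:   previous stylized frame, warped into the current
    #                frame using optical flow (computed elsewhere)
    # occlusion_mask: 1 where the flow is reliable, 0 at occlusions
    diff = (frame_t - warped_prev) ** 2
    # average the penalty only over non-occluded pixels
    return float(np.sum(occlusion_mask * diff) / max(occlusion_mask.sum(), 1))
```

Adding this term to the per-frame NST objective encourages each stylized frame to stay consistent with its predecessor, which suppresses flicker.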
Our work takes the idea of NST for video one step further. Rather than just applying the style of an image onto a video, we decompose a video into three independent components: Content, Visual Style, and Animation Style (or Temporal Style). This decomposition allows us to freely mix and match components from different videos to generate new ones.
We can "extract" the Animation Style of a video by optimizing a video filled with noise to match the animation style of the target. This reconstruction illustrates that the animation style is independent of both the content and the visual style of the input fire video.
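The extraction step can be pictured as gradient descent on a noise video. The sketch below is a simplified stand-in, not the thesis's actual formulation: it assumes the animation style is summarized by a Gram matrix over frame-to-frame differences, and the video is flattened to a (frames, pixels) array:

```python
import numpy as np

def temporal_gram(video):
    # video: (frames, pixels); frame-to-frame differences capture motion,
    # and their Gram matrix summarizes it independently of static content
    d = np.diff(video, axis=0)
    return d @ d.T / video.shape[1]

def extract_animation_style(target, steps=200, lr=0.05, seed=0):
    # optimize a noise video so its temporal statistics match the target's
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(target.shape)
    g_t = temporal_gram(target)
    n = target.shape[1]
    for _ in range(steps):
        d = np.diff(x, axis=0)
        g = d @ d.T / n
        # gradient of ||G - G_t||_F^2 w.r.t. the frame differences...
        grad_d = 4 * (g - g_t) @ d / n
        # ...propagated back to the frames themselves
        grad_x = np.zeros_like(x)
        grad_x[1:] += grad_d
        grad_x[:-1] -= grad_d
        x -= lr * grad_x
    return x
```

In practice one would optimize over deep feature statistics rather than raw pixels, but the loop above shows the principle: only the motion statistics are matched, so content and visual style of the target never enter the objective.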
Our general approach allows us to stylize videos with the visual style of an image as well, resulting in interesting output videos without any flickering.
We can also stylize an animation using an existing animation. By varying the content weight parameter, we can adjust the strength of the content component in the final video.
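Conceptually, the content weight is one coefficient in a weighted sum of loss terms. The function below is a hypothetical sketch (the names and default magnitudes are illustrative, not the values used in the thesis):

```python
def total_loss(content_l, visual_style_l, anim_style_l,
               w_content=1.0, w_style=1e3, w_anim=1e2):
    # weighted sum of the three component losses; raising w_content
    # strengthens the source video's content in the optimized output
    return (w_content * content_l
            + w_style * visual_style_l
            + w_anim * anim_style_l)
```

Sweeping `w_content` from low to high trades off faithfulness to the content video against the transferred visual and animation styles.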
We can also freely combine visual style, animation style, and content to create entirely new videos.
The possibilities are endless. We can also create animations in multiple steps. The resulting animation below has the animation style and content of a looping pink liquid animation, but the visual style of Van Gogh's Starry Night.







