Everyone Can Dance with iPERCore: a Source Video and One Image

0𝕏koji
1 min read · Dec 11, 2020

You can see the demo here:
https://www.impersonator.org/project_img/impersonator_plus_plus/demo_video/demo_1_512x512.mp4

Impersonator++

Liquid Warping GAN with Attention: a unified framework for human image synthesis, covering human motion imitation, appearance transfer, and novel view synthesis. The paper is currently under review at IEEE TPAMI. It is an extension of the authors' previous ICCV project, impersonator, with stronger generalization and higher-resolution results (512 x 512, 1024 x 1024) than the ICCV version.

Creator's Google Colab
https://colab.research.google.com/drive/1bwUnj-9NnJA2EMr7eWO4I45UuBtKudg_?usp=sharing#scrollTo=hCqM2xQitKXj

Google Colab (Famous propeller dance in Japan)

Requirements

source movie: mp4
source image: one or two images (this Google Colab uses two)
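To see how those two inputs come together, here is a minimal sketch of how the iPERCore demo script is typically invoked. Everything in it is an assumption drawn from the repo's docs at the time of writing: the script name `demo/motion_imitate.py`, the flag names, and the `path?=...,name?=...` argument syntax may have changed, and `build_imitate_cmd` is just a hypothetical helper for this post — check the install guide linked below before running.

```python
# Hedged sketch: assemble the command line for iPERCore's motion imitation
# demo. Flag names and the src_path/ref_path syntax are assumptions based on
# the repo docs; the multi-image src_path format in particular should be
# double-checked against the official examples.

def build_imitate_cmd(src_images, ref_video, model_id="my_model",
                      image_size=512, gpu_ids="0", output_dir="./results"):
    """Build the argument list for iPERCore's motion imitation demo.

    src_images: list of 1 or 2 source image paths (this Colab uses two).
    ref_video:  path to the source mp4 whose motion is imitated.
    """
    src_path = "path?=" + ",".join(src_images) + f",name?={model_id}"
    ref_path = f"path?={ref_video},name?=ref,pose_fc?=300"
    return [
        "python", "demo/motion_imitate.py",
        "--gpu_ids", gpu_ids,
        "--image_size", str(image_size),
        "--num_source", str(len(src_images)),
        "--output_dir", output_dir,
        "--model_id", model_id,
        "--src_path", src_path,
        "--ref_path", ref_path,
    ]

cmd = build_imitate_cmd(["front.png", "back.png"], "dance.mp4")
print(" ".join(cmd))
```

The generated images land under the output directory, and the demo stitches them into an mp4 alongside the source frames.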

Here is the result I got when I tried it.

If you have a GPU, you can also run iPERCore on your local machine.
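Before going through the install guide, a quick sanity check for a usable NVIDIA GPU can save time. This is only a rough proxy (it looks for `nvidia-smi` on your PATH), not iPERCore's own requirement check; `has_nvidia_gpu` is a hypothetical helper for this post.

```python
# Rough local check: is the NVIDIA driver toolchain visible? iPERCore itself
# needs CUDA-capable PyTorch; this only confirms the driver tools are on PATH.
import shutil

def has_nvidia_gpu() -> bool:
    """Return True if `nvidia-smi` (the NVIDIA driver tool) is on PATH."""
    return shutil.which("nvidia-smi") is not None

print("GPU driver found:", has_nvidia_gpu())
```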

Install guide
https://github.com/iPERDance/iPERCore/blob/main/docs/install.md


0𝕏koji

Software engineer working at a biotechnology research startup in Brooklyn. #CreativeCoding #Art #IoT #MachineLearning #python #typescript #javascript #reactjs