
Forked code for Dr. Parikh's ICCC'20 paper, "Feel The Music: Automatically Generating A Dance For An Input Song"


tanishqsandhu/feel-the-music

 
 


Forked from @purvaten

(Not official) Feel The Music: Automatically Generating A Dance For An Input Song

This repository holds my work as an Undergraduate Research Assistant with Dr. Parikh, building on her existing Feel-The-Music code.

Full text available at: https://arxiv.org/abs/2006.11905

Requirements

Create a new Python 3.7 virtual environment, then install the requirements with pip install -r requirements.txt.
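For example (a minimal sketch, assuming a python3.7 interpreter is on your PATH; adjust the activation step for your platform):

python3.7 -m venv venv                # create a Python 3.7 virtual environment
source venv/bin/activate              # activate it (on Windows: venv\Scripts\activate)
pip install -r requirements.txt       # install the project's dependencies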

Steps for Generating Dances

  1. git clone https://github.com/purvaten/feel-the-music.git

  2. cd feel-the-music

  3. Generate a dance (example below):

python generate_dance.py \
--songpath './audio_files/flutesong.mp3' \
--songname 'flutesong' \
--steps 100 \
--type "action" \
--visfolder './vis_num_steps_20/dancing_person_20'

A folder named plots will be created in the current directory containing frames of the dance and the final combined output as <songname>.mp4. The music and dance matrices will be saved as music.png and dance.png in the current directory.
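If ffmpeg is installed, the combined video from the example above can be previewed directly (a usage sketch; any video player works just as well):

ffplay plots/flutesong.mp4            # preview the generated dance video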

NOTE: <visfolder> should contain GRID_SIZE images of the agent that transition smoothly into each other, numbered 1.png, 2.png, ..., <GRID_SIZE>.png. In our experiments, GRID_SIZE=20.
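For example, a quick sanity check that the example vis folder contains all of its numbered frames (a sketch assuming GRID_SIZE=20 and the folder used in the command above):

for i in $(seq 1 20); do
  # report any frame missing from the numbered sequence 1.png ... 20.png
  [ -f "./vis_num_steps_20/dancing_person_20/$i.png" ] || echo "missing $i.png"
done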

Results

Song                             | Type   | Number of Steps | Agent
flutesong                        | action | 100             | Stick figure
Oe oe oe oa                      | action | 50              | Stretchy leaves
It's the time to disco (karaoke) | action | 100             | Floating leaves

For more examples, check https://sites.google.com/view/dancing-agents.
