Emily Kim
I am a PhD student at the Carnegie Mellon University (CMU) Robotics Institute (RI), advised by Professor Jessica Hodgins. My research spans Computer Vision and Graphics, with a focus on generating human datasets for various tasks and on object detection. I obtained my Bachelor of Science in the Joint Major in Computer Science and Mathematics at Harvey Mudd College.
My current research focuses on human datasets for generating universal Codec Avatars and on evaluating avatars generated from cross-identity subjects (one subject driving the appearance and the other driving the expression) using an autoencoder. I am currently collaborating with Julieta Martinez at Meta on a project for generating novel views from control and identity images using a combination of diffusion and GAN models. In my previous research, I focused on RGB video-based human motion tracking for human action recognition (HAR) of everyday gestures and for error diagnosis of exercise repetitions performed during physical therapy. My research also extends to practical adversarial attacks on vehicle detection from satellite images, using both texture-based (2D) and mesh-based (3D) attacks that target both computer vision detection algorithms and the human perceptual system.
News
- (Sep 2024) Our paper on the Codec Avatars dataset was accepted to the NeurIPS 2024 Datasets and Benchmarks track for poster presentation! Check out our GitHub repo here.
- (Mar 2024) The website for CVPR Workshop on Codec Avatars is up!
- (Jan 2024) I am working with Julieta Martinez at Meta Reality Labs on the dataset and code release for Universal Codec Avatars. This dataset will be presented at CVPR 2024 Codec Avatars Workshop.
- (Jan 2024) We presented our paper on activity recognition during the poster session at WACV 2024.
- (Jan 2024) Our paper on activity recognition is released! Check out our REMAG dataset suite here.
- (Oct 2023) Our paper on activity recognition has been accepted to WACV 2024! Check out our video and paper.