
PROJECTS

Latest Projects
Internship: Strategies for Handling Hand Tracking Loss in Social VR

At Facebook Reality Labs, AR/VR group

Social VR uses motion tracking to embody users as animated avatars. Often, only the head and hand poses are measured, and the rest of the body pose is estimated via inverse kinematics. Optical hand tracking frequently fails when the hands are occluded from the VR headset's cameras, causing errors in the rendered motion. In this project, we investigated three amelioration strategies for handling these errors and demonstrated experimentally that each reduces the impact of the errors.

As part of this work, we developed a method for simulating headset-based hand tracking loss on error-free motion-captured data.

Additionally, we explored general issues around study design for motion perception, comparing different strategies for presenting stimuli and soliciting input. For example, adding a simultaneous recall task reduced, but did not eliminate, participants' sensitivity to motion errors.

Finally, we showed that motion errors are interpreted, at least in part, as a shift in the interlocutor's personality.

(Link to paper and video)
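
For illustration, here is a minimal sketch of how such a loss simulation might be set up. This is a hypothetical reconstruction, not the method from the paper: the cone-shaped approximation of the headset camera field of view, the 75-degree default, and the hold-last-pose behavior are all assumptions.

import numpy as np

def simulate_tracking_loss(head_pos, head_fwd, hand_pos, fov_deg=75.0):
    """Degrade clean mocap hand data with simulated tracking loss.

    head_pos, head_fwd, hand_pos: (T, 3) arrays from motion capture.
    A hand counts as lost whenever it falls outside a cone around the
    head's forward axis (a stand-in for the headset camera frustum);
    during loss, the last tracked position is held.
    Returns the degraded trajectory and a per-frame loss mask.
    """
    to_hand = hand_pos - head_pos
    to_hand /= np.linalg.norm(to_hand, axis=1, keepdims=True)
    fwd = head_fwd / np.linalg.norm(head_fwd, axis=1, keepdims=True)
    cos_angle = np.sum(to_hand * fwd, axis=1)         # per-frame view angle
    lost = cos_angle < np.cos(np.radians(fov_deg / 2.0))

    degraded = hand_pos.copy()
    for t in range(1, len(degraded)):
        if lost[t]:
            degraded[t] = degraded[t - 1]             # hold last tracked pose
    return degraded, lost

In a pipeline like this, the degraded segments could then be passed to the inverse-kinematics stage, or to whichever amelioration strategy is being evaluated, in place of the clean hand data.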

PhD Project: Avatar gesture synthesis

At Trinity College Dublin, Department of Graphics, Vision, and Visualisation

My PhD project focuses on automatically generating natural motion for conversing virtual humans. For this, I am exploring machine learning and natural language processing techniques. The project is in collaboration with the ADAPT Centre.

During this project, I created two large multimodal datasets and a number of machine-learned models for tasks such as motion segmentation. These resources are available at https://trinityspeechgesture.scss.tcd.ie/.

My thesis is available here: http://www.tara.tcd.ie/xmlui/handle/2262/96795
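
As a rough illustration of the kind of model this line of work involves (a generic sketch with placeholder feature and output sizes, not one of the models from the thesis), speech-driven gesture generation can be framed as a sequence model mapping per-frame speech features to joint rotations:

import torch
import torch.nn as nn

class SpeechToGesture(nn.Module):
    """Toy sequence model: per-frame audio features -> joint rotations.

    The input size (26, MFCC-like features), output size (45, e.g.
    15 joints x 3 rotation channels), and architecture are
    illustrative assumptions only.
    """
    def __init__(self, n_audio_feats=26, n_joint_channels=45, hidden=256):
        super().__init__()
        self.encoder = nn.GRU(n_audio_feats, hidden, num_layers=2,
                              batch_first=True, bidirectional=True)
        self.decoder = nn.Linear(2 * hidden, n_joint_channels)

    def forward(self, audio):               # audio: (batch, frames, feats)
        frame_states, _ = self.encoder(audio)
        return self.decoder(frame_states)   # (batch, frames, channels)

# Example: a 2-second speech clip at 60 fps.
model = SpeechToGesture()
motion = model(torch.randn(1, 120, 26))     # -> shape (1, 120, 45)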


Internship: Abstract virtual character perception

At Trinity College Dublin, Department of Graphics, Vision, and Visualisation

I worked on a project investigating which visual features influence the perceived personality of virtual characters. This work is part of the EU's Populate project. (See below for visuals.)

Master's thesis project: Person-Dependent Social Action Recognition using VR

At the Max Planck Institute for Biological Cybernetics, Social and Spatial Cognition group

In my master's project, I researched aspects of social action recognition. I used a Virtual Reality setup consisting of 3D shutter glasses, a back-projection screen, and optical motion tracking. To design my VR environment, I used the Unity3D game engine together with a motion capture suit.

Internship: Interactive Social Action Recognition

At the Max Planck Institute for Biological Cybernetics, Social and Spatial Cognition group

During my internship, I used a Virtual Reality setup (with an environment built in Unity 3D) to investigate how social actions are processed when they are merely observed versus when they are observed and executed at the same time.

Bachelor's thesis project: Manipulations of Bodily Self-Perception

At the Max Planck Institute for Biological Cybernetics, Perception and Action in Virtual Environments group

My bachelor's thesis focused on how we perceive ourselves in space and how self-perception can be altered by manipulating perceived self-localization. For this, a full-body "rubber-hand" illusion was induced using a Virtual Reality setup with a head-mounted display and an environment developed in Unity 3D.

Project Visuals

ExpressGesture: Expressive Gesture Generation from Speech through Database Matching

Multi-objective adversarial gesture generation

Understanding the Predictability of Gesture Parameters from Speech and their Perceptual Importance

Human or Robot? Investigating voice, appearance and gesture motion realism of conversational agents

Moral decisions game (Unity Webplayer experiment)

Unity game project (in progress)

OpenGL projects

Simple snake game programmed with Racket

 

© 2017 by Ylva Ferstl
