Notable Projects

Throughout my career, I have had the pleasure of collaborating with artists and machine learning scientists to build data pipelines for a series of models. Here are a few project highlights.

Project:
Synthetic Humans Rendering Pipeline.

Core Tools:

Unreal Engine 5.6 (headless)

Python

Side Tools:

Ray Multiprocessing

Storage Tools:

Internal Cluster

Role: Lead Data Engineer

Company: Cartwheel, 2025

A bit about the work:

The ins and outs (literally) of a BEDLAM-inspired synthetic render pipeline built on headless Unreal Engine 5.6.

There are many moving parts to a synthetic data pipeline: camera intrinsics and extrinsics, ground truth motion capture data, retargeting systems, environment variations, you name it. The name of the game is reliable ground truth variation. Based on the paper “Look Ma, No Markers” released by the Microsoft Cambridge research team, this pipeline produced approximately 1M images every 24 hours.
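
To give a flavor of those moving parts, here is a minimal sketch of how per-job camera and environment variation might be sampled and fanned out across workers with Ray. The job schema, the render_from_spec.py script, and the Unreal command line are illustrative placeholders, not the production tooling.

```python
"""Minimal sketch of per-job randomization and parallel dispatch.
The job fields, script path, and Unreal invocation are hypothetical stand-ins."""
import json
import random
import subprocess

import ray

ray.init()  # connect to the cluster (or start a local one)


def sample_job(seed: int) -> dict:
    """Sample one render job: camera intrinsics/extrinsics plus scene variation."""
    rng = random.Random(seed)
    return {
        "seed": seed,
        "focal_length_mm": rng.uniform(24.0, 85.0),   # intrinsics
        "camera_height_m": rng.uniform(1.2, 2.2),     # extrinsics
        "camera_yaw_deg": rng.uniform(0.0, 360.0),
        "environment": rng.choice(["warehouse", "street", "studio"]),
        "motion_clip": rng.choice(["walk_01", "run_03", "idle_02"]),
    }


@ray.remote
def render_job(job: dict) -> str:
    """Write the job spec and launch a headless Unreal render (placeholder CLI)."""
    spec_path = f"/tmp/job_{job['seed']}.json"
    with open(spec_path, "w") as f:
        json.dump(job, f)
    subprocess.run(
        ["UnrealEditor-Cmd", "Project.uproject",
         "-ExecutePythonScript=render_from_spec.py " + spec_path],
        check=True,
    )
    return spec_path


if __name__ == "__main__":
    jobs = [sample_job(seed) for seed in range(64)]
    ray.get([render_job.remote(j) for j in jobs])  # fan out across workers
```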

Project:
Motion Capture Data Standardization Pipeline.

Core Tools:

Maya (headless)

FBX SDK

Python

Side Tools:

Blender (scripting)

Storage Tools:

Google Cloud Services

Role: Lead Data Engineer

Company: Cartwheel, 2024

A bit about the work:

3D motion data is expensive to get your hands on, complex once you do, and messy once you dig in.

This project focused on unifying all of our acquired motion into our internal data structure for training our generative and stylized motion models.
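
As a flavor of the standardization step, here is a minimal sketch that loads an FBX file with the FBX Python SDK and maps its skeleton and timing into a single clip record. The MotionClip schema and the joint-name remap table are illustrative assumptions, not the internal data structure described above.

```python
"""Minimal standardization sketch using the Autodesk FBX Python SDK.
MotionClip and CANONICAL_NAMES are illustrative, not the internal format."""
from dataclasses import dataclass, field

import fbx

# Hypothetical remap from common vendor joint names to canonical names.
CANONICAL_NAMES = {"mixamorig:Hips": "hips", "pelvis": "hips", "Hips": "hips"}


@dataclass
class MotionClip:
    source_path: str
    fps: float
    joints: list = field(default_factory=list)


def collect_joints(node, out):
    """Walk the node hierarchy and record canonicalized node names."""
    name = node.GetName()
    out.append(CANONICAL_NAMES.get(name, name))
    for i in range(node.GetChildCount()):
        collect_joints(node.GetChild(i), out)


def standardize(path: str) -> MotionClip:
    # Boilerplate FBX scene import.
    manager = fbx.FbxManager.Create()
    manager.SetIOSettings(fbx.FbxIOSettings.Create(manager, fbx.IOSROOT))
    importer = fbx.FbxImporter.Create(manager, "")
    if not importer.Initialize(path, -1, manager.GetIOSettings()):
        raise RuntimeError(importer.GetStatus().GetErrorString())
    scene = fbx.FbxScene.Create(manager, "scene")
    importer.Import(scene)
    importer.Destroy()

    # Source frame rate from the file's global time mode.
    fps = fbx.FbxTime.GetFrameRate(scene.GetGlobalSettings().GetTimeMode())

    joints = []
    collect_joints(scene.GetRootNode(), joints)
    manager.Destroy()
    return MotionClip(source_path=path, fps=fps, joints=joints)
```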

Project:
Extremities Refinement Pipeline.

Core Tools:

Unreal Engine 5.6

Maya 2024

Python

(other internal tooling)

Storage Tools:

Google Cloud Storage

S3/DVC

Role: Lead Data Engineer

Company: Cartwheel, 2025

A bit about the work:

Hands and feet are notoriously problematic and inaccurate in motion capture data, and they can make or break a nuanced performance.

Feet: In collaboration with Levi Harrison (Pixar), we developed a foot processing pipeline for both pre-processing model training data and post-processing monocular motion capture data.
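
At its simplest, foot cleanup comes down to detecting ground contacts and pinning them. The sketch below shows that idea with assumed thresholds, a Y-up coordinate convention, and a naive pinning strategy; it is an illustration, not the pipeline we actually shipped.

```python
"""Illustrative foot-contact detection from joint positions.
Thresholds, frame rate, and the (frames, 3) layout are assumptions."""
import numpy as np


def detect_contacts(foot_pos: np.ndarray,
                    fps: float = 30.0,
                    height_thresh: float = 0.05,   # meters above ground
                    speed_thresh: float = 0.15):   # meters per second
    """Return a boolean mask (frames,) marking frames where the foot is planted."""
    # Per-frame velocity of the foot joint.
    velocity = np.diff(foot_pos, axis=0, prepend=foot_pos[:1]) * fps
    speed = np.linalg.norm(velocity[:, [0, 2]], axis=1)  # Y-up, XZ ground plane
    low = foot_pos[:, 1] < height_thresh
    slow = speed < speed_thresh
    return low & slow


def pin_contacts(foot_pos: np.ndarray, contacts: np.ndarray) -> np.ndarray:
    """Naively lock the foot to its position at the start of each contact run."""
    out = foot_pos.copy()
    anchor = None
    for i, planted in enumerate(contacts):
        if planted:
            if anchor is None:
                anchor = out[i].copy()   # first frame of this contact
            out[i] = anchor              # hold it for the rest of the run
        else:
            anchor = None
    return out
```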

Hands: Using computer vision libraries and internal hand processing modules, the pipeline detects clean, targeted hand poses from user-provided monocular video and converts them into motion capture data.
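
The write-up only names “computer vision libraries,” so the sketch below uses MediaPipe Hands as one representative choice for pulling per-frame landmarks from monocular video; the internal conversion to motion capture data is omitted.

```python
"""Sketch of per-frame hand landmark extraction from monocular video.
MediaPipe Hands stands in for the unnamed CV libraries; the downstream
hand-to-mocap conversion is internal and not shown."""
import cv2
import mediapipe as mp


def extract_hand_landmarks(video_path: str):
    """Yield (frame_index, list of 21-point landmark sets), one set per detected hand."""
    hands = mp.solutions.hands.Hands(
        static_image_mode=False,
        max_num_hands=2,
        min_detection_confidence=0.5,
        min_tracking_confidence=0.5,
    )
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV reads BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            per_hand = [
                [(lm.x, lm.y, lm.z) for lm in hand.landmark]
                for hand in results.multi_hand_landmarks
            ]
            yield frame_idx, per_hand  # feed into the (internal) hand-to-mocap solver
        frame_idx += 1
    cap.release()
    hands.close()
```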

Project:
Video, Audio, and Motion Capture Syncing Pipeline.

Core Tools:

Maya (headless)

Vicon Motion Capture System

Technoprops Head-Mounted Cameras

FFmpeg

BWF MetaEdit (open-source audio metadata editor)

Premiere Pro and EDLs (scripting)

Storage Tools:

AWS S3

Role: Data Engineer

Company: Unity, 2023

A bit about the work:

Building ground truth data for a range of models under active research was an absolute highlight of my time at Unity. With an in-house motion capture stage, I had the opportunity to learn the Vicon system, which enabled our team to run our own internal shoots to expand our data.

We worked with IR facial video, motion capture, and scripted or improvised audio. As all of these data types needed to be edited, synced, and packaged for dataset handoff, building out a batch processing pipeline was crucial.
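
A representative piece of that batch processing is timecode-based alignment. The sketch below reads the embedded SMPTE timecode from a video with ffprobe and trims it to a common start with FFmpeg; the frame rate, the non-drop-frame assumption, and the idea that the audio start timecode comes from the BWF time reference are illustrative assumptions, not the exact production setup.

```python
"""Sketch of timecode-based trimming with ffprobe/FFmpeg.
Assumes SMPTE timecode in the video stream tags and a known audio start timecode."""
import subprocess


def read_video_timecode(path: str) -> str:
    """Read the embedded SMPTE timecode of the first video stream (e.g. '17:23:10:04')."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream_tags=timecode",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()


def timecode_to_seconds(tc: str, fps: float) -> float:
    """Convert non-drop-frame 'HH:MM:SS:FF' to seconds."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / fps


def trim_to_common_start(video_path: str, out_path: str,
                         audio_start_tc: str, fps: float = 30.0) -> None:
    """Cut the head of the video so it starts at the audio's start timecode."""
    offset = timecode_to_seconds(audio_start_tc, fps) - timecode_to_seconds(
        read_video_timecode(video_path), fps)
    if offset < 0:
        raise ValueError("audio starts before video; trim the audio instead")
    # Stream copy snaps to keyframes; re-encode instead for frame-accurate cuts.
    subprocess.run(
        ["ffmpeg", "-y", "-ss", f"{offset:.3f}", "-i", video_path,
         "-c", "copy", out_path],
        check=True,
    )
```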