Lead Unity development for a multi-part, data-driven interactive installation by FIELD.IO for IBM at The Masters Tournament, visualizing nine years of AI-augmented shot data within a guided experience.
Roles
- Lead Unity Developer
- Technical Lead (Interactive Systems)
- System Architecture & Integration
- Performance Optimization
- Interface Prototyping & Hardware Integration
Stack
- Unity (C#)
- Python
- React
- Wwise
- Custom Hardware Integration
- Web API Integration
Outline
I led the Unity development for this high-end interactive installation designed for a guided environment at The Masters Tournament. The experience visualized nine years of tournament shot data—289 players and 266,939 individual shots—transforming complex, AI-augmented datasets collected by IBM into an intuitive, spatial, real-time exploration.
The installation was experienced by invited guests, including former Masters winners and their caddies, and consistently elicited strong reactions through its depth, clarity, and responsiveness.
System Overview
The project was conceived as a modular, synchronized system, consisting of:
- A standalone Unity application acting as the core real-time visualization engine
- A web-based remote control (authored in React), fully synchronized with the Unity runtime
- A custom-built physical control interface driving the Unity application
- A web API delivering raw, up-to-date Masters data from a live database (including 2024 and 2025 events)
- A bespoke audio layer, integrated via Wwise
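To illustrate how a web remote can stay in lockstep with a real-time runtime, here is a minimal sketch of a shared message schema and pure state reducer in TypeScript. All names and the state shape are hypothetical; the actual synchronization protocol used in the installation is not documented here.

```typescript
// Hypothetical sync sketch: the React remote and the Unity app exchange
// small JSON messages (e.g. over a WebSocket) and each applies the same
// deterministic reducer, so both sides converge on identical scene state.
type SceneState = {
  year: number;                 // tournament edition currently shown
  playerId: string | null;      // selected player, if any
  mode: "overview" | "player" | "shot";
};

type SyncMessage =
  | { kind: "selectYear"; year: number }
  | { kind: "selectPlayer"; playerId: string }
  | { kind: "reset" };

// Pure reducer: no side effects, so replaying the same message log on
// either side always yields the same state.
function apply(state: SceneState, msg: SyncMessage): SceneState {
  switch (msg.kind) {
    case "selectYear":
      return { ...state, year: msg.year, mode: "overview", playerId: null };
    case "selectPlayer":
      return { ...state, playerId: msg.playerId, mode: "player" };
    case "reset":
      return { year: 2025, playerId: null, mode: "overview" };
  }
}
```

Keeping the reducer pure also makes the remote trivially testable in isolation from the Unity build, which matters when the runtime only exists on installation hardware.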
Key Contributions
- Designed and owned the Unity application architecture, supporting multiple input modalities and operational modes
- Implemented large-scale, real-time data visualization pipelines for hundreds of thousands of historical golf shots
- Authored a React-based remote control interface, tightly synchronized with the Unity experience
- Prototyped and integrated a custom physical interaction device, including co-designing the device communication protocol
- Integrated Wwise audio middleware, enabling the sound designer to deliver handcrafted, responsive soundscapes directly driven by runtime state
- Collaborated closely with designers, data engineers, hardware specialists, and sound designers to align creative intent with technical feasibility
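The wire protocol for the physical controller is not described here; as a sketch of the kind of checksummed framing such a device protocol might use, here is a hypothetical four-byte frame parser. The header byte, field layout, and checksum rule are all invented for illustration.

```typescript
// Hypothetical frame layout: [0xA5 header][eventId][value][checksum],
// where checksum = (eventId + value) & 0xFF. Malformed or corrupted
// frames are rejected rather than partially applied.
type ControlEvent = { eventId: number; value: number };

function parseFrame(bytes: Uint8Array): ControlEvent | null {
  if (bytes.length !== 4 || bytes[0] !== 0xa5) return null; // wrong size/header
  const [, eventId, value, checksum] = bytes;
  if (((eventId + value) & 0xff) !== checksum) return null; // corrupted frame
  return { eventId, value };
}
```

Rejecting bad frames at the parsing boundary keeps input handling deterministic: the rest of the application only ever sees validated events.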
Technical Characteristics
- Deterministic, production-grade runtime behavior suitable for guided experiences in high-stakes environments
- Clear separation between data ingestion, visualization logic, interaction, and presentation
- Designed for operational stability, maintainability, and future live data updates rather than one-off execution
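One way to keep data ingestion cleanly separated from visualization is to map raw API records into flat, render-ready buffers at the ingestion boundary, so downstream layers never depend on the source schema. A minimal TypeScript sketch, with an assumed record shape that does not reflect the actual Masters data model:

```typescript
// Hypothetical raw shot record as it might arrive from a data API.
type RawShot = { x: number; y: number; year: number; player: string };

// Flatten records into a packed position buffer (x0, y0, x1, y1, ...),
// the kind of structure a GPU-driven visualization can consume directly.
function toPositionBuffer(shots: RawShot[]): Float32Array {
  const buf = new Float32Array(shots.length * 2);
  shots.forEach((s, i) => {
    buf[i * 2] = s.x;
    buf[i * 2 + 1] = s.y;
  });
  return buf;
}
```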
Why this project matters
This project exemplifies my role as a technical lead within agency-driven, high-stakes installations—translating complex datasets into compelling real-time experiences, architecting multi-part systems across software, hardware, and web, and delivering reliable interactive applications for premium audiences.




