Pijin
WebAR / VPS Tracking · PIJIN · 2024–2025

A site-specific WebAR experience built on Niantic's Visual Positioning System - one of the most technically demanding tracking methods available in augmented reality. The High Line's “Dinosaur” pigeon sculpture by Iván Argote was scanned, mapped, and transformed into a persistent digital anchor. Once localized, the AR layer outfits the giant pigeon in full NYC streetwear - North Face puffer, Timberland boots, Yankees fitted, and a single AirPod. No app download. No QR code. Just point your phone and the city comes alive.

<10s Registration Time
VPS Tracking Method
360° Walkable AR
0 App Downloads
Client: PIJIN
Role: Creative Technologist
Platform: 8th Wall WebAR
Tracking: Niantic Lightship VPS
Location: The High Line, NYC
3D Tools: Blender, RealityCapture
Type: Site-Specific AR
Delivery: Mobile Browser

The Challenge

Most AR experiences rely on flat image targets or QR codes - constraints that limit spatial fidelity and break the illusion of digital objects existing in physical space. This project required something fundamentally different: a full 3D sculpture as the tracking anchor, centimeter-level alignment from any viewing angle, and persistent augmentation that holds as users walk a complete 360 degrees around the subject.

The technical bar was high. Visual Positioning System tracking demands a precise 3D map of the physical environment, real-time camera localization against that map, and dynamic occlusion so digital objects appear to exist behind physical geometry. All of this needed to run in a mobile browser with zero friction - no app store, no download, no login. Just a URL and a camera.

The Subject

The “Dinosaur” pigeon sculpture by Iván Argote on the High Line in Hudson Yards - the physical anchor that the entire AR experience is built around. This 4K drone scan captures the sculpture and its surrounding environment for VPS mapping.

How It Works

01

Scan & Localize

The user opens a URL and points their camera at the physical sculpture. Niantic Lightship VPS uses the device's geolocation to select the mapped site, then matches the live camera feed against a pre-built 3D mesh of the environment - localizing the device with centimeter-level precision.
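The localization handoff can be sketched as follows. The commented-out registration follows the shape of 8th Wall's camera-pipeline-module API and its Lightship VPS project-wayspot events as I understand them from the public docs; `applyWayspotPose` and `pigeonRoot` are hypothetical names for this writeup, not the production code.

```javascript
// Hypothetical helper: copy the wayspot pose (position + rotation quaternion)
// onto a scene node once VPS localization succeeds.
function applyWayspotPose(node, detail) {
  node.position = { ...detail.position };    // metres, world space
  node.quaternion = { ...detail.rotation };  // unit quaternion
  node.visible = true;                       // reveal AR content only after localizing
  return node;
}

// In the browser this would be wired to the 8th Wall pipeline, roughly:
// XR8.addCameraPipelineModule({
//   name: 'pijin-vps',
//   listeners: [{
//     event: 'reality.projectwayspotfound',
//     process: ({detail}) => applyWayspotPose(pigeonRoot, detail),
//   }],
// })
```

Keeping the content hidden until the `found` event fires is what makes registration feel instant: the outfit pops in already aligned, never drifting into place.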

02

Anchor Alignment

Digital content is precisely anchored to the physical sculpture's geometry. Unlike flat image tracking, VPS alignment is volumetric - the AR layer wraps around the actual 3D contours of the pigeon, maintaining spatial accuracy from every viewing angle.
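What "volumetric" means in practice: a VPS anchor carries a full 6-DoF pose, so each accessory can be placed by a fixed offset in the sculpture's local frame and transformed into world space. A minimal sketch of that math, with no engine dependency (the helper names are my own, not 8th Wall's):

```javascript
// Rotate vector v by unit quaternion q, using the expanded q * v * q^-1 form.
function rotateByQuaternion(v, q) {
  const { x, y, z, w } = q;
  // t = 2 * cross(q.xyz, v)
  const tx = 2 * (y * v.z - z * v.y);
  const ty = 2 * (z * v.x - x * v.z);
  const tz = 2 * (x * v.y - y * v.x);
  // v' = v + w*t + cross(q.xyz, t)
  return {
    x: v.x + w * tx + (y * tz - z * ty),
    y: v.y + w * ty + (z * tx - x * tz),
    z: v.z + w * tz + (x * ty - y * tx),
  };
}

// World position of an accessory given the anchor's pose and a local offset
// (e.g. the cap sits a fixed distance above the anchor origin).
function accessoryWorldPosition(anchorPos, anchorRot, localOffset) {
  const r = rotateByQuaternion(localOffset, anchorRot);
  return { x: anchorPos.x + r.x, y: anchorPos.y + r.y, z: anchorPos.z + r.z };
}
```

Because every accessory is expressed relative to the anchor, a single pose update from VPS moves the whole outfit coherently as the viewer circles the sculpture.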

03

Sequential Transform

The Pijin is outfitted piece by piece with signature NYC streetwear - Timberland boots first, then the North Face puffer jacket, Yankees fitted cap, and finally the single AirPod. Each accessory is mesh-fitted to the sculpture's body for anatomically correct placement.
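The staged reveal reduces to a small timed sequence. A sketch with illustrative accessory names and timings (not the production values); the scheduler is injectable so the logic stays testable:

```javascript
// Illustrative reveal order and delays - not the shipped timings.
const OUTFIT_SEQUENCE = [
  { name: 'timberland-boots', delayMs: 0 },
  { name: 'north-face-puffer', delayMs: 1200 },
  { name: 'yankees-cap', delayMs: 2400 },
  { name: 'airpod', delayMs: 3600 },
];

// Schedule each reveal; `show` toggles visibility on the named mesh.
// `schedule` defaults to setTimeout but can be swapped out for testing.
function playOutfitSequence(show, schedule = setTimeout) {
  for (const step of OUTFIT_SEQUENCE) {
    schedule(() => show(step.name), step.delayMs);
  }
}
```

Driving the sequence off a single start event (the moment VPS localizes) keeps the reveal deterministic no matter where the viewer is standing.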

04

360-Degree Interaction

Users walk freely around the sculpture while the AR layer holds position. Environmental lighting estimation matches digital shading to real-world conditions, and dynamic occlusion ensures the streetwear appears to wrap naturally around the bird's physical form.

Build Process

Work-in-progress: 3D point cloud scan of the pigeon with digital Yankees cap, AirPod, and Timberland boots overlaid
01

Point Cloud + Accessories

Gaussian splat scan of the physical sculpture with digital accessories positioned in 3D space. Each item - Yankees cap, AirPod, Timberland boots - is anchored relative to the scan geometry so spatial alignment holds regardless of camera angle or distance.

3D model of the pigeon wearing a black North Face puffer jacket in Blender viewport
02

Puffer Jacket Modeling

The North Face puffer jacket modeled in Blender and custom-fitted to the pigeon's body. Cloth simulation combined with manual sculpting achieved the quilted, volumetric silhouette that wraps convincingly around the bird's round form - critical for maintaining believability in AR.

Pijin in 8th Wall editor — pigeon model with North Face puffer, Yankees cap, Timberland boots, and AirPod positioned in 3D scene
03

8th Wall Scene

The fully dressed pigeon staged in the 8th Wall WebAR editor — all accessories positioned, lighting dialed, ready for VPS deployment to the High Line.

3D scan of the pigeon from behind, wearing full NYC outfit - North Face puffer, Yankees cap, Timberland boots
04

Full Outfit Assembly

The complete streetwear look assembled on the 3D scan - black North Face puffer, Yankees fitted cap, and Timberland boots. Every accessory is mesh-anchored to the sculpture's geometry, ensuring the outfit holds spatial coherence from any viewing angle in the final AR experience.

Final Result

AR Demo

Final AR composite - the fully dressed pigeon anchored to the physical sculpture on the High Line, NYC skyline behind it. VPS tracking holds the digital layer in place as the camera moves freely.

Field Test

On-site validation at the High Line - the full AR experience running live on a mobile device with real foot traffic, confirming VPS stability and visual fidelity under real-world conditions.

Technical Architecture

VPS Localization

Niantic's Visual Positioning System builds a persistent 3D map of the physical environment using point cloud data from the sculpture and its surroundings. At runtime, the device's camera feed is matched against this map in real-time, achieving centimeter-level positional accuracy without any physical markers, QR codes, or image targets. This is the same spatial computing infrastructure that powers large-scale AR at Niantic.

Dynamic Occlusion

A transparent “occluder” mesh of the sculpture was generated from the scan data. This invisible geometry masks digital objects at the correct depth - so the puffer jacket appears to wrap around the pigeon rather than floating in front of it. Without this layer, the illusion of physical presence breaks down entirely.
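The occluder principle is just a depth test: the invisible mesh writes depth but no color, so digital fragments behind it are discarded. (In Three.js terms this is typically a material with `colorWrite: false` rendered before the AR content.) A toy software depth buffer demonstrates the idea; a real renderer does this per-pixel on the GPU:

```javascript
// One depth value per pixel; Infinity = nothing drawn yet.
function createDepthBuffer(size) {
  return new Float64Array(size).fill(Infinity);
}

// "Render" the invisible occluder: write depth, write no color.
function renderOccluder(depth, pixel, z) {
  if (z < depth[pixel]) depth[pixel] = z;
}

// "Render" a digital fragment: it only survives the depth test if it sits
// in front of whatever (including the occluder) was drawn there already.
function renderDigital(depth, color, pixel, z, rgb) {
  if (z < depth[pixel]) {
    depth[pixel] = z;
    color[pixel] = rgb;
  }
}
```

Rendering order matters: occluder first, digital content second, so the sculpture's geometry masks the jacket's far side exactly where the real bird would.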

Environmental Lighting

Real-time lighting estimation samples the ambient conditions of the High Line at the moment of viewing - sunlight direction, intensity, color temperature - and applies those parameters to the digital materials. The puffer jacket reflects actual environmental light, making the AR layer indistinguishable from the physical scene at a glance.
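Applying an estimated light to digital materials can be sketched as mapping color temperature and intensity to an RGB light color. The Kelvin endpoints and linear blend below are a deliberate simplification of real color-temperature curves, and the function names are my own; a production pipeline would consume the engine's light-estimation output directly:

```javascript
// Approximate warm (~3000K low sun) and cool (~7500K overcast) white points.
const WARM = { r: 1.0, g: 0.82, b: 0.64 };
const COOL = { r: 0.85, g: 0.93, b: 1.0 };

// Blend between the two endpoints, clamping outside the 3000-7500K range.
function tintForTemperature(kelvin) {
  const t = Math.min(1, Math.max(0, (kelvin - 3000) / (7500 - 3000)));
  return {
    r: WARM.r + (COOL.r - WARM.r) * t,
    g: WARM.g + (COOL.g - WARM.g) * t,
    b: WARM.b + (COOL.b - WARM.b) * t,
  };
}

// Scale the tint by estimated intensity to get the light color applied to
// the digital materials each frame.
function lightColor(kelvin, intensity) {
  const c = tintForTemperature(kelvin);
  return { r: c.r * intensity, g: c.g * intensity, b: c.b * intensity };
}
```

Re-sampling this every frame is what lets the puffer's shading track a passing cloud instead of freezing at launch-time conditions.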

Zero-Friction Delivery

The entire VPS tracking pipeline, 3D rendering engine, and occlusion system runs directly in the mobile browser via 8th Wall - no app download, no account creation, no installation. This is a deliberate architectural choice: friction is the primary enemy of experiential AR, and every barrier between user and experience reduces engagement.
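Zero-friction delivery still benefits from a graceful pre-flight check before handing off to the WebAR engine. A minimal sketch using standard browser APIs (`getUserMedia`, `geolocation`); the helper name is hypothetical, `nav` is injected (normally `window.navigator`), and 8th Wall runs its own, far more thorough compatibility checks:

```javascript
// Report whether the browser can plausibly run the experience, and why not.
function checkWebARSupport(nav) {
  const issues = [];
  if (!nav.mediaDevices || typeof nav.mediaDevices.getUserMedia !== 'function') {
    issues.push('camera access (getUserMedia) unavailable');
  }
  if (!nav.geolocation) {
    issues.push('geolocation unavailable (VPS needs a coarse location hint)');
  }
  return { supported: issues.length === 0, issues };
}
```

Surfacing a specific reason ("enable camera access") instead of a generic failure page is part of the same zero-friction philosophy: every recoverable error should route the user back into the experience.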

Tech Stack

8th Wall · Niantic Lightship VPS · Lens Studio · Three.js · Blender · RealityCapture · WebAR · GLSL Shaders · Gaussian Splatting
