When Snap acquired Vertebrae, the company I was with at the time, our small team suddenly found itself inside one of the biggest names in AR. The goal was clear but massive: bring startup speed and creativity into Snap’s new enterprise AR initiative (ARES).
I joined in July 2021 and jumped right into learning about their 3D/AR product offerings and how they could be reimagined for enterprise use. The team was excited: we were three separate acquisitions coming together to shape a new vision for how AR could live outside social, powering things like live try-ons, 3D product viewers, and AR mirrors for physical spaces.
We built some amazing experiences over 2.5 years, but unfortunately Snap had other priorities and decided to focus on its core offerings, shutting ARES down. I was incredibly grateful for the ride and learned so much about the technology and about how to operate like a startup inside a bigger company.
I met some of the smartest, kindest people I’ve ever worked with. I left with new skills and a renewed sense of why I love building things from scratch.
Live Try-On Eyewear
One of my favorite projects from this period was the Live Try-On Eyewear experience. We wanted to make buying glasses online feel as natural and confident as trying them on in-store. Using Snap’s camera and real-time AR technology, we created a flow where users could open their front camera directly from a product page and see how different frames looked and fit in seconds.
What made this challenging was balancing realism and speed: ensuring the virtual try-on felt authentic, without latency or uncanny distortions. I worked closely with our product and engineering teams to fine-tune the design.
The impact was clear: users who engaged with try-on spent more time on the product page, explored more SKUs, and converted at a higher rate. But the real success was emotional: consumers said it gave them more confidence when making a purchase.
Live Try-On Footwear
What would it take to try on shoes from your phone, no box or mirror required? Using the device’s back-facing camera, we built an experience that tracked users’ feet in real time and rendered shoes over them with realistic lighting, depth, and motion.
It was one of the most technically demanding experiences we worked on. The camera had to accurately recognize and track feet as users moved, even in cluttered environments or varied lighting conditions. My focus was on the product experience: making it intuitive, fun, and fast enough that people actually wanted to use it.
When it worked, it felt almost magical. You’d point your camera down, and a pair of sneakers would instantly appear, perfectly aligned as you moved. That sense of delight, combined with practical utility, changed the perception of AR from a fun gimmick into something that actually builds purchase confidence.
Image Try-On
Where the eyewear and footwear experiences relied on a live camera and consumers in motion, Image Try-On explored product try-ons on uploaded full-body images. The feature allowed shoppers to upload a photo, or select from avatar models, to visualize how garments looked in context, directly on their own image.
Our goal was to bring the personal, emotional part of shopping online: letting users imagine “how would this actually look on me?” without live body scans or dedicated apps. We designed a lightweight web-based flow that rendered clothing onto static images in seconds, using Snap’s underlying computer vision models for segmentation, depth, and texture alignment.
I worked on defining the product flow and UI patterns that guided users through photo upload, garment placement, and visualization.
3D Viewer
The 3D Viewer transformed the traditional e-commerce product image into an interactive experience. Customers could rotate, zoom, and inspect items in high detail, from the stitching on a shoe to the texture of a handbag. Designed to integrate seamlessly within a merchant’s product page, it offered an intuitive, low-latency experience across devices and browsers.
Because 3D assets already had to be built to support the AR experiences, we could reuse the same assets to showcase products on the page, making the viewer the perfect segue into AR. It played a crucial role in building consumer trust, giving customers the sense of “holding” the item virtually and reducing uncertainty and hesitation before purchase.
Digital Asset Manager
The Digital Asset Manager served as the foundation for every experience we built. It allowed merchants to upload, store, and manage 3D and AR assets, then approve and publish them directly to their associated try-on or viewer integrations.
Before the DAM, asset workflows were fragmented. Our goal was to create a single source of truth, one place where teams could track asset readiness, approve visual quality, and push updates live with confidence. The result was faster onboarding, fewer errors, and a scalable content pipeline capable of supporting multiple retail partners simultaneously.