Understand and engage with Art near you in an intuitive, colorful way

Context

Aura is an ambitious, contemporary approach to mapping the Art world. As the Lead UX & Visual Designer, I created an experience that connects observers to the art they are near and displays “quick facts” knowledge cards, community discussions, and a map of suggested and related artists and works.

Behind the scenes, it combines sensor data, location awareness, crowd-sourced content aggregation and data visualization – bundling everything together into a clean and intuitive interface.

Working on a small team of five required me to take the lead on all phases of the product design process. I adapted and learned quickly, prototyped early, and mentored the team in mobile design best practices.

Role

Lead UX & Visual Designer

Team Size

5 Member Team,
Solo Designer

Timeline

6 Months

Impact & Results
  • 1000+ Accounts created by community and artist members within two months of first launch
  • 2x Museum partner pilots, including the internationally known Toledo Museum of Art
  • 32 Beacon sites installed and used during the pilot launch, including Twitter Headquarters and SF Galleries
  • 2x App Store launches of our full app, for which I generated rapid interactive prototypes from low to high fidelity and delivered assets and specifications for our gradual, incremental rollouts
  • 2x pivots in strategy and core use cases, for which I fully executed our core product design work
  • Continuous research testing, which I defined and managed, including pilot launch tests

Ideation & Exploration

Iterating Towards Our MVP

I led our group ideation sessions to define our product’s core use cases, how we could make the best use of mobile capabilities, and which questions or assumptions we could later test in development. From there, I captured our ideas in component- and flow-based wireframes.

Rapid Testing And Prototyping

Two main use cases emerged: Capturing (while viewing the art) and Recalling (retracing steps and diving deeper, after the fact). In both flows, my focus was to enhance each moment in the app with a quick glimpse of detail and an intelligent display of metadata.

I launched our initial testing rounds with fresh interactive prototypes in low and mid fidelity. On the right: Play with the early interactive prototype from your phone or browser.

First-Time Onboarding & OS Permissions Handling

Solution & Impact

Nearby Particles

Creating a new affordance for location awareness

When a user is near a place or artwork, a subtle particle system creates an “aura” around the nearby item. The particle system samples colors from the originating image – aligning with our principle to elevate the interface with the user’s own content.
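
As a rough sketch of how this mechanic could work (assumptions, not Aura’s production code), the Swift snippet below tints a CAEmitterLayer particle cell with an average color sampled from the artwork image. The view name, the “soft-dot” particle asset, and the single-color sampling are illustrative simplifications.

```swift
import UIKit
import CoreImage

/// Hypothetical sketch: a view that renders a soft particle "aura"
/// tinted with a color sampled from the nearby artwork's image.
final class AuraParticleView: UIView {

    func showAura(for artworkImage: UIImage) {
        let emitter = CAEmitterLayer()
        emitter.emitterPosition = CGPoint(x: bounds.midX, y: bounds.midY)
        emitter.emitterShape = .circle
        emitter.emitterSize = bounds.size

        let cell = CAEmitterCell()
        cell.contents = UIImage(named: "soft-dot")?.cgImage   // small blurred dot asset (assumed)
        cell.birthRate = 12
        cell.lifetime = 3.0
        cell.velocity = 20
        cell.scale = 0.05
        cell.alphaSpeed = -0.3
        cell.color = averageColor(of: artworkImage).cgColor   // tint particles with the artwork's color

        emitter.emitterCells = [cell]
        layer.addSublayer(emitter)
    }

    /// Crude color sample: ask Core Image for the area average of the whole image.
    private func averageColor(of image: UIImage) -> UIColor {
        guard let input = CIImage(image: image),
              let filter = CIFilter(name: "CIAreaAverage",
                                    parameters: [kCIInputImageKey: input,
                                                 kCIInputExtentKey: CIVector(cgRect: input.extent)]),
              let output = filter.outputImage else { return .white }

        var pixel = [UInt8](repeating: 0, count: 4)
        CIContext().render(output,
                           toBitmap: &pixel,
                           rowBytes: 4,
                           bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                           format: .RGBA8,
                           colorSpace: CGColorSpaceCreateDeviceRGB())
        return UIColor(red: CGFloat(pixel[0]) / 255,
                       green: CGFloat(pixel[1]) / 255,
                       blue: CGFloat(pixel[2]) / 255,
                       alpha: 1)
    }
}
```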

Onboarding Flow

I tested numerous flows and visual treatments to achieve a memorable and lightweight first-time experience.

I decreased our first-time user drop-off rate by 60% after auditing our system-level permission-granting touchpoints. I created a multivariate test to evaluate how adoption rates changed when certain permissions were requested up front at first launch versus in context later in the journey. Our results showed that some requests worked better up front, especially when paired with a value-driven summary.
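
To make the “ask in context” arm of that test concrete, here is a minimal, hypothetical Swift sketch (iOS 14+ Core Location APIs) that defers the location prompt until the user actually opens a nearby-art feature; the class name and trigger point are illustrative, not the shipped implementation.

```swift
import CoreLocation

/// Hypothetical sketch of the "ask in context" variant: the location prompt
/// is deferred until the user invokes a nearby-art feature (iOS 14+ APIs).
final class NearbyPermissionGate: NSObject, CLLocationManagerDelegate {

    private let manager = CLLocationManager()
    private var onAuthorized: (() -> Void)?

    override init() {
        super.init()
        manager.delegate = self
    }

    /// Call from the "Nearby" feature's entry point, not at first launch.
    func requestIfNeeded(then onAuthorized: @escaping () -> Void) {
        switch manager.authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            onAuthorized()                              // already granted, no prompt needed
        case .notDetermined:
            self.onAuthorized = onAuthorized
            manager.requestWhenInUseAuthorization()     // in-context system prompt
        default:
            break                                       // denied/restricted: show a value-driven explainer instead
        }
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        if manager.authorizationStatus == .authorizedWhenInUse {
            onAuthorized?()
        }
    }
}
```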

Design System & Global UI Components

Working from the atomic design methodology, I started with core content and navigational affordances, modularized them into components, and finally defined an IA map, flows, and layouts for each mobile screen. I worked closely with engineering to demonstrate the app at all zoom levels and at varying fidelities, allowing them to make headway on coding while the final organism was realized.
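
As a loose illustration of that component layering, and not Aura’s actual design-system code, a small SwiftUI sketch shows an atom-level fact row composed into a reusable “quick facts” card; every name here is hypothetical.

```swift
import SwiftUI

// Hypothetical sketch of the atomic-design layering:
// an "atom" (FactRow) composed into a reusable "quick facts" card that screens share.

struct Fact: Identifiable {                   // simple model for one quick fact
    let id = UUID()
    let label: String
    let value: String
}

struct FactRow: View {                        // atom: a single label/value pair
    let fact: Fact

    var body: some View {
        HStack {
            Text(fact.label)
                .font(.caption)
                .foregroundColor(.secondary)
            Spacer()
            Text(fact.value)
                .font(.caption)
                .fontWeight(.semibold)
        }
    }
}

struct QuickFactsCard: View {                 // molecule: composed card reused across screens
    let title: String
    let facts: [Fact]

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(title).font(.headline)
            ForEach(facts) { FactRow(fact: $0) }
        }
        .padding()
        .background(RoundedRectangle(cornerRadius: 12).fill(Color.gray.opacity(0.12)))
    }
}
```

A screen can then drop QuickFactsCard wherever a knowledge card appears, keeping type, spacing, and color decisions inside the component rather than on each screen.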

Hi-Fidelity Interactive Prototype

As we pushed more toward code, I created interactive prototypes and animation demos for our business leaders and engineering team to reference and give early feedback on. We received great input, such as the addition of tracking artworks, artists, and venues from a simple integrated (+) follow button.

Discover and drill in to learn about the art, artists, and exhibitions near you.

Remember the art you see with Capture and Recall. A shelf holds multiple photos at the top, letting you spend less time documenting and more time taking in the art, uninterrupted.

The colorful shutter button follows Aura’s visual design principle: pops of color branding are incorporated in core UI elements, making the interface a chameleon that reflects your art collection.

Aura groups your captured Art automatically into a scannable grid; organizes your museum visits by location; and transcribes placards and statements into searchable text.
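
As one plausible way the placard transcription could run on-device (the actual OCR pipeline isn’t documented in this case study), a hedged Swift sketch using Apple’s Vision text recognition:

```swift
import UIKit
import Vision

/// Hypothetical sketch: transcribe a photographed placard into searchable text
/// with Vision's text recognizer; Aura's real pipeline may differ.
func transcribePlacard(_ photo: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = photo.cgImage else { return completion("") }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Join the best candidate for each detected line into one searchable string.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")
        completion(text)
    }
    request.recognitionLevel = .accurate           // placards tend to be dense, small type

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```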

A light screen above the tab bar helps ground the Capture button without competing with the core IA of the app’s tabs.

The Memories screen organizes artwork in a scannable grid, grouping by time and museum/gallery/location.

Get the full story – design process, research, motion design, pivots, and more in the case study.