Evernote Scannable is “the fastest mobile scanning app for iPhone and iPad”. Here’s how it works:
Step 1 Scan a document using the camera.
Step 2* Let’s rename it.
Step 3* Now, you can send or save your scan.
Step 4 Let’s pick email. Done!
Step 5* Wait, forgot to save a copy to Evernote. Let’s go to Recents.
Step 6* Two taps…
Step 7* …and we’re back!
The asterisks? That’s what I worked on. When I joined, Scannable had just been released, but user feedback revealed some of the assumptions behind major product decisions to be incorrect. On a small team made up of 1 PM, 3 engineers, and 1 other designer—my mentor Keith Lang, who taught me everything I know—I drove the entire design process from research to implementation on several critical Version 1.x improvements.
The experience after scanning used to look like this:
Between the tiny buttons, the ambiguous “Export” and “More…” options, and the bottom tabs, this design didn’t test very well. Most people didn’t use the tabs or even know they existed, and abandonment rates were higher than expected.
Before doing any design, I conducted preliminary interviews with Scannable users and Evernote’s customer service representatives. The latter sat right next to the Scannable team, and had an invaluable wealth of knowledge regarding user pain points.
I noticed a pattern: users distinguished between sending actions and saving actions. Email, text messages, and so on involved sending a scan to someone, whereas Evernote and Camera Roll involved saving it somewhere. My PM and I explored ways to apply this paradigm to simplify the user stories, and discussed how the underlying app logic would change.
The result of those conversations was splitting the actions into two categories. Evernote and Camera Roll fell under Save. Mail fell under Send, and so did More… (renamed Share because it pulled up the default iOS sharing tray). Message and Export were low-engagement options that already appeared in Share, so they were cut.
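The resulting split can be summarized as a simple mapping. This is only an illustrative sketch of the categorization described above, not actual Scannable code:

```typescript
// Each post-scan action falls into one of two categories.
type Category = "Send" | "Save";

// "Share" was formerly "More…"; it opens the default iOS share sheet.
// "Message" and "Export" were cut because they already live in Share.
const actionCategory: Record<string, Category> = {
  Evernote: "Save",
  "Camera Roll": "Save",
  Mail: "Send",
  Share: "Send",
};
```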
Fact: 90% of people are right-handed
So for the one-handed use case, the thumb is closest to the bottom-right corner.
I used this insight to guide my design decisions. Scannable’s purpose is to promote Evernote adoption, so to further this goal I aimed to make Evernote the most accessible option.
Notice how in the early iterations, Save is on the left and Send on the right. I changed this so that Save is on the right, with Evernote to the right of Camera Roll. Subtle, but the details matter; this intuition held up in user tests. (Apple seems to agree, as seen in one of their UI changes.)
Using a Pixate prototype, I conducted more in-depth user tests to determine whether the Send/Save paradigm actually reflected users’ mental models. Participants were asked to think out loud while performing various tasks, and were evaluated on both quantitative measures (e.g. time to complete a task) and qualitative ones (e.g. confusion, hesitation).
The majority of testers gravitated towards Send when asked to perform sending actions (“Email it to a friend.”), and Save when asked to perform saving actions (“Store it for future reference.”). No one had trouble with the touch targets—they were large enough that presses were successful on the first attempt.
My Send/Save model was developed and released. It improved on the previous experience across every metric we tracked by double digits, most notably completion rate.
This was my first design project on an industry product team, so I had to pick up a lot of skills in a short amount of time: learning new methodologies like Design Thinking, new tools like Origami, and new uses for old tools, like creating UI animations in Photoshop. It also taught me the importance of seemingly small details in constructing objective user experiments. In testing the Send/Save mental model, I sought to avoid bias by not using those words explicitly in the tasks.
Remember these screens?
What used to happen: if you pressed “Done”, Scannable would simply delete your scan. The Recents menu didn’t exist.
The original Scannable team intended for it to be single-use — scan once, never look back. But not all users saw it that way.
Clearly, people expected the app to hold onto their scans.
At first, my PM and I discussed ways to tweak the success screen so it would more accurately communicate what was going on: changing the wording of the “Done” button, or promoting Evernote more heavily so deletion wouldn’t be a problem. These were low-cost patches to a deeper problem.
The initial prototypes tested poorly. Instead of trying to get the user’s mental model to match the app’s, we decided to try the opposite approach. I looked into different ways to add an element of forgiveness to the app, and the one that got the best feedback was a safety buffer.
Even then, there were many ways to implement that core idea. This concept only holds onto your most recent scan:
…but you can tap on the clock icon to view your history. As a perk, scans saved to Evernote are linked for easy access.
In feedback from users, customer reps, and the Scannable team, I learned that storage wasn’t as big an issue as I thought. This led to the idea of a visual log where all scans are saved for potential restore.
I also explored the concept of a time limit — we’d hold onto your scans, but only for so long. This was inspired by the “Recently Deleted” feature in the iOS Photos app.
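The time-limit concept boils down to pruning anything older than a retention window. Here’s a minimal sketch of that idea; the `Scan` type, field names, and retention value are hypothetical, not Scannable’s actual implementation:

```typescript
// A scan held in the safety buffer.
interface Scan {
  id: number;
  createdAtMs: number; // creation time, Unix epoch in milliseconds
}

// Keep only scans newer than the retention window, mirroring the
// "Recently Deleted" behavior in the iOS Photos app.
function pruneExpired(scans: Scan[], retentionMs: number, nowMs: number): Scan[] {
  return scans.filter((s) => nowMs - s.createdAtMs < retentionMs);
}
```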
I conducted user tests comparing new models to the old one. The feedback was overall positive:
- “…the second application interface seemed easier to me. Mostly due to the ability to see a history of your documents.”
- “The second prototype was easier. You could see the last item scanned, view a history and possibly access your documents again.”
- “I liked the second prototype because it seemed more apparent that it saved my previous documents. The first prototype didn’t portray that.”
Both of my initiatives — the visual log and time limit — shipped in the next release. Afterwards, we saw a dramatic reduction in one-star reviews and forum complaints.
This project was difficult because it involved addressing a fundamental issue (a core assumption behind the app’s entire model was inaccurate) while maintaining the structural integrity of the app, since a full redesign was out of the question. It taught me to innovate within boundaries, which is really what design is all about.
I also learned to talk to people unabashedly. Working on important product decisions means you can’t afford to be shy, and I quickly overcame my initial reluctance. Figuring out Recents meant interacting directly with both customer support representatives and users to truly understand their behavioral tendencies, and then designing for them.
Send My Info
Scannable’s Send My Info feature allows you to share a virtual business card via email. As a side project, I was asked to design and code the email.
Adaptive Sign Up/Login
While waiting for people to email me back, I took the initiative to spec out a combined Sign Up and Login experience that works like this:
- show “Email” and “Password” fields
- don’t show a “Sign Up” or “Login” button
- after the user types in their email, check if it’s a registered Evernote account and only display the appropriate button
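The flow above could be sketched like this. It’s a simplified illustration: `isRegistered` stands in for a hypothetical lookup against Evernote’s account service, and the email validation is deliberately minimal:

```typescript
// The single button shown once the email is known.
type AuthAction = "Sign Up" | "Login";

// Minimal syntactic check before hitting the network.
function isPlausibleEmail(email: string): boolean {
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email);
}

// Decide which button to display; `null` means keep both hidden
// (the user hasn't typed a valid-looking email yet).
function buttonToShow(
  email: string,
  isRegistered: (email: string) => boolean
): AuthAction | null {
  if (!isPlausibleEmail(email)) return null;
  return isRegistered(email) ? "Login" : "Sign Up";
}
```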
I proposed this change to the Scannable PM, and he liked it. It shipped in the next release!
My Directorial Debut
I wrote and directed a minute-long product demonstration video for internal use. People in the office said it was Oscar material; you’ll have to take my word for it!
I learned animation on-the-job and created this cute scanner for a project asset!
Hudl — Design Challenge
Hudl is a platform that provides video analysis and coaching tools for sports teams. I received an offer after completing this design challenge as part of their interview process, but had to turn it down because I’d accepted a PM role at Microsoft. Regardless, it’s a well-documented showcase of my design process under time and resource constraints.
Hudl Technique has a Video Library Problem
Hudl Technique is a mobile video-capture app that enables instructors and athletes to study and improve their form.
Here are some pain points users have expressed to us:
- Once a user has a number of videos in their library, managing those videos could be better.
- Finding a specific video is difficult. For instance, if you’re looking for a golf swing from two weeks ago, you might be scrolling through lots of video to find that specific one.
- There’s no notion of separating videos into groups or teams.
We’d like you to solve these pain points.
Show us your solution for organizing, filtering, sorting and managing video within the Hudl Technique app.
- interactive prototype
- accompanying presentation that explains your design decisions and any assumptions you’ve made
I wanted to show my design process in real time instead of writing a glossy retrospective. So I presented my results as a sequenced multimedia experience made up of audio diaries, mockups, and more.
2. First Impressions
3. Photos App
4. Photos App Reference
5. Basketball Tags
6. A Realistic Solution
7. Initial Sketch
8. Initial Sketch, Explained
9. Hudl Technique Video Library Mockup
10. Prototype Limitations
11. Marvel, Not Invision
I ended up using Marvel instead of InVision. They’re pretty much the same, but Marvel doesn’t have a project limit for free accounts.
14. A Conversation With Quinn
15. Final Thoughts
I think design should be purpose-driven, so let’s go over some of the objectives mentioned in the brief:
- “Once a user has a number of videos in their library, managing those videos could be better.”
- “Finding a specific video is difficult. For instance, if you’re looking for a golf swing from two weeks ago, you might be scrolling through lots of video to find that specific one.”
- “There’s no notion of separating videos into groups or teams.”
My proposed improvements can be broken down into 3 components: Thumbnail View, Better Selection, and Groups. Here’s how they address these pain points:
- Thumbnail View — This addresses #2. Grouping the videos as thumbnails sorted by date makes it much easier to locate the video you’re looking for. Sure, the current list view shows the tag and athlete, but you can still sort by those categories in the dropdown menu anyway. In my opinion, the trade-off is worth it.
- Better Selection — This addresses #1. Using the proposed selection features, tagging/grouping multiple videos is now quicker and easier than ever.
- Groups — This addresses #1, #2, and #3. Groups would be a great way for users to organize their video library so finding a specific video is easier. Having groups or folders is a fundamental part of any file system; I’m kind of surprised this isn’t already a feature.
Thanks for the opportunity to do this challenge! I hope you find this solution useful; I tried to design something that could actually be integrated into the current product without too much friction.
These are thoughts I had on Hudl Technique that weren’t directly relevant to the challenge.
Menu Bar Dropdown
Branding, Web Design
HackTX is a student-run hackathon at UT Austin. The event had outgrown its M.C. Escher-inspired branding, which looked dated next to the competition.
I created a bold, versatile visual identity inspired by Austin’s famous street art. The logo was designed for display at large sizes, playing off of the “Everything’s bigger in Texas” motto and expressing the exciting growth of the event.
My brand guidelines were implemented across all promotional materials, including website, banners, apparel, and more.