Summer Reading

UX Design
UI Design
UX Research
Scottsdale Public Library



Every year, Scottsdale Public Library organizes a summer reading program for participants of all ages. Their goal is to engage community members, encourage literacy, and promote the enjoyment of reading. Participants record their reading progress online and can earn prizes as they reach certain milestones.


In the past two years, Scottsdale Public Library's summer reading program has seen a decrease in participation. The program coordinator would like to increase participation rates next year. The challenge is twofold: 1) increase sign-ups, and 2) encourage participants to consistently record their reading.
How might we redesign the website to increase participation and better serve users?

Spoiler Alert

I've prepared the final design and prototype if you'd like to jump ahead. It is also linked at the end of the case study.
Jump to solution →

Design Process

My process

Although listed in order below, my design process for each project varies depending on existing information and what problem we are addressing. I also like to apply an iterative cycle as I uncover insights, receive feedback, and continually make improvements.
1. Research

- Market research
- Competitive analysis
- User interviews
- Affinity mapping

2. Define

- User persona and journey map
- How might we statements
- Task flows

3. Ideate

- Wireframe sketches
- Mid-fidelity wireframes
- Information architecture

4. Test

- Prototyping
- Usability testing
- Iterating on findings

5. Design

- Style guide
- Logo design
- High-fidelity design and prototype

A little background

Growing up, I participated in my library’s summer reading program every year, and it grew to be something I always looked forward to. When I looked at Scottsdale Public Library's reading program website, I immediately noticed usability issues and outdated visual design that seemed misaligned with the library’s goals and audience.


Stakeholder Kickoff Meeting
To begin the project, I arranged a meeting with the library's summer reading program coordinator. She shared important user demographics and outlined the metrics of success that would help us set project goals.
Age Group Participation
  • Age group with highest participation is kids (5-11 years old)
  • Age groups with lowest participation are "pre-readers" (0-5 years old) and adults (18+ years old)
Metrics of Success
  • The library measures success in terms of both program participation and completion
  • Goal is to grow the program and turn reading into a habit
Establishing Goals
After combining our notes on known issues as well as stakeholder needs, we agreed on four main goals:


Next, I set out to create a research plan that would outline my research goals, questions, and methodologies. Having participated in the summer reading program before, I had my own ideas, but I knew it was important to conduct user interviews to ensure I was validating real needs.

I wanted to focus on motivations and tracking habits in these interviews because one of the main project goals was to get users to regularly log their reading until they finish the program.

User interviews with 6 potential reading program users (ages 23-58) helped me understand the nuances of their experiences. In each interview I also shared images of the current website and recorded their comments.
Interview Goals
  • To understand what frustrations exist with the current website design
  • To learn about experiences users have had with logging/keeping track of habits
  • To understand what users would find valuable in a reading program website
Key Insights
These 1-on-1 interviews gave me insight into user pain points and motivations in regard to setting personal goals and maintaining habits. By analyzing patterns from user interviews, I started to see how I could turn these key insights into executable design changes.
User Feedback on Existing Website
Below, I annotated the existing screens with interviewees' comments. Their concerns ranged from poor visual hierarchy to outdated branding and confusing text. *User comments are annotated in blue
  • Kid-focused and outdated visual design
  • Progress bar size and clarity
  • Confusing call-to-action (CTA) buttons
  • 3-column layout looks cluttered/too much on the page


User Personas
I created personas Vivian and Amy based on qualities observed in user interviews. I wanted to represent one "reader" and one "non-reader" because one of the main project goals was to increase sign-ups. This meant empathizing with people who might not typically think about signing up or enjoy reading.

Personas help me focus on specific user needs and act as a reference point throughout the design process that keeps ideas user-centered.


Brainstorm: Gamification and Dashboards
Since the reading program utilizes a logging/prize system, I researched gamification and methods to create rewarding user experiences. The goal was to make engaging screens that would still clearly communicate important information.

Redesigning the dashboard was a challenge! It took many sketches and rearranging the pieces to create a balanced and logical design. Dashboards are meant to provide an overview of a lot of information, so I prioritized visual hierarchy based on what features users would most often use.

I had fun playing around with ideas like a circular progress bar, but ultimately chose a more functional horizontal progress bar that better displayed prizes.


Mid-fidelity wireframes
I translated my final sketches into mid-fidelity wireframes for both mobile and desktop screens. Designing in Figma helped me identify potential issues like readability on smaller screen sizes and allowed me to quickly switch around elements.


Usability Testing
After bringing my wireframes to high fidelity, I conducted usability testing. I created both desktop and mobile prototypes in Figma and exported them to Maze. Maze is a product research platform that allowed me to see click error rates, time spent on each task, and heat maps of user interactions.

I recruited 7 users to test the mobile prototype and 7 users to test the desktop prototype.
Usability Test Objectives
  • Test the usability and ease of navigation of the homepage and dashboard
  • See if participants intuitively move through main task flows (adding a book and logging reading)
  • Test the experience of joining and participating in the summer reading program for a wider age range
Usability Test Insights
Usability testing revealed that a majority of participants navigated successfully through the prototypes and were able to complete tasks.  Users provided positive and constructive feedback, which helped me identify strengths and weaknesses of the new designs.
“It was rewarding to see my points adding up; it would encourage me to read more.”
“I found this to be easy to follow and use, which would help ensure that I come back to update.”
“When you search for books you get a list of the books, but I was expecting a check or plus button to add the book.”


Iterating on Findings
In addition to user comments, I closely analyzed the Maze heat maps for misclick rates and points of friction. By doing this, I saw exactly where there was opportunity to improve the design.
1. Clarify Mobile Dashboard
Task: Redeem a prize
Completion rate: 100%
Misclick rate: 33.5%

Users misclicked on the badges when trying to redeem a prize. I believe this is because the badges were yellow and had icons, and yellow is the color used on my call-to-action (CTA) buttons.

- Changed badges to blue so they would not be confused with yellow CTAs

- Moved the badge section below the bookshelf to separate it from the prize section

- Underlined “prizes” to show it is clickable
2. Eliminate Click Error

Task: Set a Reminder

Analyzing the heat maps of the “set a reminder” screen, I saw that some users tried to select the “confirm” button without first selecting a date and time. Both a date and a time are required to proceed.

- Revised the “confirm” button to be in the inactive state until a time and date are selected
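The revised behavior amounts to a simple piece of UI state logic. As a minimal sketch (the names `ReminderForm` and `isConfirmEnabled` are hypothetical, not from the actual build), the confirm button's enabled state is derived from whether both fields have been selected:

```typescript
// Hypothetical form state: null means the user hasn't selected that field yet.
interface ReminderForm {
  date: string | null; // e.g. "2024-06-15"
  time: string | null; // e.g. "18:30"
}

// The "confirm" button is active only once BOTH a date and a time are set,
// which prevents the click error observed in testing.
function isConfirmEnabled(form: ReminderForm): boolean {
  return form.date !== null && form.time !== null;
}
```

Deriving the button state from the form data, rather than toggling it manually, keeps the rule in one place and makes it easy to test.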

Hi-Fidelity Designs

After implementing priority revisions from usability test feedback, I arrived at my final hi-fidelity designs.


View the mobile prototype below, or click here to view it in Figma.


Partnering with a stakeholder, I learned how to ensure their needs were addressed while also making decisions grounded in the design thinking process.

This project also drove home the importance of consistency as I designed responsive screens for mobile and desktop devices. Designing for users on different devices taught me to account for limitations in how my designs would function. Creating components and keeping a style guide made the entire responsive design process more efficient.
If I had more time I would...
- Test my designs with children (5-11 years old). I would like to research how their needs would differ from adults and what pain points might arise from this redesign.

- Add a community updates feature. In interviews, some users said that they would like a way to see what others are reading and that adding a social aspect might help motivate them and keep them accountable.
