Project Description

UX CASE STUDY

ParkPoolr

ParkPoolr aims to connect drivers with parking near events. A usability evaluation determined how the website could be improved.

Introduction

ParkPoolr is a website that aims to connect drivers looking for parking near major events (State Fair, Vikings football games, etc.) with homeowners and businesses near those events that have parking spaces to rent.

The website is currently live and gained good customer awareness after launch; however, analytics indicate that it has high bounce and cart abandonment rates. ParkPoolr enlisted my team to conduct a usability evaluation to determine how the website could be improved.

Executive Summary

The research team determined that the ParkPoolr website is somewhat functional for both drivers and hosts. The concept is sound and could provide significant value in the event-parking market.

However, significant communication and usability issues exist that may negatively affect use of the website and inhibit customer retention.

Findings and recommendations to remedy these issues are outlined in this report.

Evaluation Goals

The client presented three areas of research interest to the research team. After completing an initial heuristic analysis, our research team identified two additional areas of interest to focus on.

01: Client Goals
  • Improve speed and learnability of the site. (Provide insights into which feature and flow modifications would be ideal to prioritize before the app launch)
  • Improve retention rate of the platform. (Users should want to use the site again)
  • Ensure a low barrier to entry. (Let users pre-book a spot with a license plate and card number)
02: Research Team Goals
  • Determine users’ attitudes towards available payment options. (Venmo, PayPal, Amazon Wallet, check, cash payments on arrival, etc.)
  • Determine users’ attitudes about safety and security. (Lessor ratings/reviews, security iconography, background checks, screening, anonymity, etc.)

Methodology

The research team began the evaluation with a heuristic analysis to gain an initial understanding of the website’s strengths and weaknesses. Research goals were then formed and validated, followed by remote and in-lab user testing. Findings were recorded, and an affinity diagram was created by the research team.

01. Heuristic Analysis

A heuristic analysis was performed by the design team to evaluate the website from an average user’s perspective. Observations and problems encountered by the tester were noted. The items were then assigned to applicable digital product heuristics.

Read the heuristic analysis.

02. User Testing

Remote Testing

A total of eight remote user tests were performed by the research team. Voice and screen recordings were captured during the tests using join.me software. A draft testing script guided each session. Pertinent findings were drawn from the tests and entered into a shared Trello raw-data board.

In-Lab Testing

Four in-lab user tests were performed. A moderator, a dedicated note taker, and the tester were situated in a conference room. In an adjoining room, an additional note taker and tech support staff observed each session through one-way glass, a microphone feed, and screen-mirrored monitors.

The moderator ran the tester through predetermined driver and host scenarios, with the information needed for the tests provided on reference note cards. Screen, face, and voice recordings were captured via the test laptop’s built-in camera and microphone using Apple QuickTime software.

03. Affinity Diagramming

An affinity diagram was created by pulling pain points and problem areas from user test notes and writing them on sticky notes. The notes were arranged by scenario or website section, then synthesized by identifying common patterns.

Five main areas of concern emerged and were prioritized by severity. Additional, less severe issues were noted for inclusion in the final report.

Findings & Remedies

Data synthesis through affinity diagramming revealed clear areas of concern, along with several additional findings of note. Findings are listed in order of severity.

01. Landing Page: Lack of Clarity

Most users indicated that, upon entering the landing page, they were not immediately able to discern the function of the website.

  1. Header banner does not effectively communicate site concept.
    Remedy: Explore image and language options that communicate website goals (see proposed solution below).
  2. App image and icon design are confusing:
    “The icons look like picnic tables.”
    “Do they have a smartphone app?”
    Remedy: Explore icon design possibilities. Consider removing the phone image until the app is launched.
  3. Only 1 of 12 users played the informational video.
    Remedy: Consider using the above-the-fold screen area to communicate site goals with static content, such as the two informational content blocks below the video.

Proposed landing page redesign.

02. Host Process Uncertainty

Host scenario users indicated that, although they were able to complete the process of listing parking spots, they did not feel confident that they had completed the tasks successfully.

  1. After submitting the “Add a spot” form, some users thought the listing process was complete. They did not click the newly created spot card to choose an event for the spot.
    Remedy: Add a field for choosing an event to the “Add a spot” form. If the user wishes to assign an event later, include a “choose later” option in the drop-down list of events.
  2. Some users thought “Add Spot” and/or “Add a New Spot” meant one single spot, rather than all the spots in a lot. Some users thought they would have to complete 25 forms for 25 spots in their parking lot.
    Remedy: Explore different language options for labels, links and buttons to clarify that the listing is for multiple spots.

03. Lack of Cancellation Function

After completing the driver scenario, users were prompted with a question about cancellation: they were asked to consider what they would do if a conflict arose and they wanted to cancel their reserved parking spot.

“I can’t cancel? That would piss me off.” – user 9

  • No user was able to cancel their parking spot using the website (as expected).
  • All users verbally indicated that they would be upset upon realizing they couldn’t cancel.
  • Some users then tried to navigate the website to find cancellation policy information or a way to contact ParkPoolr to cancel. They were unable to find a phone number to call, and some felt the contact form would take too long to get a reply.
  • This issue would strongly reduce users’ likelihood of using the website in the future.

Remedy: Allow users to easily cancel their parking spot bookings, and/or provide a customer service phone number.

Related Finding: Users indicated that they strongly desire a confirmation email and/or a printable receipt when completing a parking spot booking.

04. User Interface Issues
  1. Back button position: Almost all users reported that the back button was missing or in the wrong spot when it appeared at the top of the screen; they expected it at the bottom left.
    Remedy: Always position the back button at the bottom left of the screen.
  2. Price consistency: Almost all users were confused when the price and fee, shown separately earlier in the flow, were later displayed as a single combined amount, leading to uncertainty along the user journey.
    Remedy: Display the price consistently throughout the flow. Consider not showing the fee separately at all.
  3. Form language and validation: Most users reported minor pain points when filling out forms.
    • “Description of parking spot” label language is unclear.
    • Users were not sure whether to use a dollar sign ($), decimal point, cents, etc. in the price entry field.
    • Some list items did not turn green after selection.
    Remedy: Explore language and validation fixes.

05. Map & List View

The map and list views were often mentioned by users as a point of confusion and frustration.

  1. Find parking button: After selecting an event location and event, many users did not realize they needed to click this button to display results.
    Remedy: Automatically display results whenever any UI element is changed.
  2. Map/List toggle: Some users could not find this toggle, even when prompted to use the list view.
    Remedy: Experiment with the UI design to make the two view options more apparent.

“I might just look for a parking meter at this point.” – user 9

Since the map view was non-functional on the test site, these findings should be treated with caution. However, the test team felt some issues were still worth reporting.

06. Additional Findings

Many less severe findings of note were recorded by the evaluation team. They are omitted from this portfolio for the sake of brevity but are included in the full report:

Read the full evaluation report.

You made it to the end. Nice.