All-in-One App for New Laptop Owners

Project Scope

Introduction
In a world where technology evolves at an astonishing pace, seamlessly integrating devices into users' lives demands an approach that combines technical ingenuity with human understanding.

This project's scope encompassed a transformative challenge: crafting an "all-in-one" app that would serve as a user's trusted companion upon purchasing a new laptop.
​
Timeframe: 9 months
My role: Primary UX Researcher
My team: Representatives from engineering, design, marketing, and quality assurance (QA). To ensure continuous momentum of improvement, I facilitated bi-monthly presentations of research findings to these key stakeholders.
Intense testing over nine months required rapid analysis and reporting.
Methods: Qualitative research, quantitative research, generative research, usability testing, surveys, feature ranking, card sorts, triangulation.
Mixed-methods research enabled me to adapt quickly throughout the evaluative stages of software development.
Longitudinal metrics, such as participant Likert-scale ratings of features, were captured from both moderated and unmoderated studies.

Participant Screening: An 8-question screener covered having a stable income, a genuine interest in Dell products without strong pre-existing brand bias, and a demonstrable intention to purchase a laptop within six months. This ensured that insights came from users with genuine potential to engage with the product, minimizing biases that could skew the findings.
Tools: UserTesting.com, Qualtrics, Camtasia, InVision
At the core of this undertaking was not just the app itself, but a profound understanding of:
How users interact with technology
How they perceive value
How they navigate the intricate landscape of features and functionalities

Key Findings and User Insights

The 1:1 interviews conducted with users unearthed a critical sentiment that echoed throughout the findings. Participants expressed strong reservations about encountering advertisements when first using their newly acquired laptops. One quote encapsulated the underlying frustration: "I just spent $1000 on this laptop and now I'm being told that I need to spend more money because it doesn't come with everything needed?!"
A prevailing theme that emerged was a sense of being "ambushed" by the app's advertisements. Participants voiced their expectation that a newly purchased laptop should not immediately demand additional expenditures for security, Office, and service protection.
The response was so adverse that users considered the app as extraneous and, in some cases, went as far as to uninstall it, branding it as "bloatware."
The placement of advertisements within a section labeled "Essentials" exacerbated user discontent; this juxtaposition intensified their perception of the app as a sales platform rather than a utility that adds value to their laptop experience.
In conclusion, the research uncovered profound insights into user expectations, frustrations, and their emotional connection to the product. The impact of these findings resonated deeply in the iterative design process, leading to a heightened awareness of the delicate balance between monetization and user experience.

Marketing's Stance on Research Recommendations
Despite the compelling insights garnered from the research, which strongly indicated user frustration with advertisements, marketing maintained a strategic stance requiring the inclusion of upsells and advertisements within the app.

In-depth 1:1 task-based interviews yielded video clips and quotes that succinctly captured user sentiment regarding the ads.
However, these insights alone were not enough to change marketing's decision to retain advertisements.
To communicate the research effectively and shift marketing's stance, I combined qualitative and quantitative data, giving the presentation both depth and statistical weight.
Leveraging UserTesting.com and Qualtrics Surveys:
UserTesting.com allowed me to capture real-time user experiences, providing tangible evidence of frustration with the current ad placements: navigation patterns, expressions of dissatisfaction, and direct feedback on ad-related interactions.
Qualtrics surveys added a quantitative layer to these qualitative insights, capturing a larger sample of user opinions and sentiments. Well-structured surveys asked participants for Likert-scale ratings that quantified their frustration with ad placements and the impact of ads on their overall experience.

Incorporating Interview Findings, Card Sorting, and Feature Ranking:
Combining interview findings with card sorting and feature ranking not only identified sections whose labels users found unsatisfying, but also allowed users to participate actively in defining their content expectations.

User-Advocated Approach to Ad Categories:
Participants advocated for a user-centered approach to categorizing ads within the app. Their suggestion to align ad categories with user expectations, such as keeping ads out of a section labeled "Essentials" while allowing them in a section named "Accessories," not only offered a solution but also demonstrated users' willingness to engage with content that catered to their needs.
My presentation to the marketing team provided a comprehensive perspective underscoring user sentiment, frustration, and aspirations, bridging research-backed user needs and marketing's strategic business objectives.
Likert-scale ratings and other metrics were collected across iterations to measure the effect of changes.

Features were categorized using an open card-sorting activity with participants.

Results of Feature Ranking

Evangelizing Results - Poster
