Product: Reporting and analytics tool
Role: Lead UX Researcher
Method: User Experience Questionnaire (UEQ)
Participants: 126 users of the tool completed the UEQ
Stakeholders: Product manager and engineers
Outcome: I used the UEQ to evaluate whether perceived performance problems justified engineering investment. Results showed that most users perceived the system as fast, suggesting performance optimization was not an immediate priority.
Background
The product manager for a reporting and analytics application told me that the development team planned to pause new feature development to focus entirely on improving system performance. Backend analyses suggested that some users might be experiencing system delays. Before engineering resources were committed exclusively to performance optimization, I suggested a quick study to determine whether users actually perceived the system's performance as inadequate.
My Role
Proposed investigating perceived system performance before pausing feature development
Selected the User Experience Questionnaire (UEQ) as the measurement instrument
Designed and administered the survey using SurveyGizmo (now Alchemer)
Analyzed questionnaire results and interpreted UX scales
Communicated findings to the product team
Research Questions
Do users perceive the reporting application as slow, or is system performance generally considered acceptable?
Is perceived system speed a significant enough usability concern to justify pausing feature development?
To evaluate whether system performance was a genuine user concern, I proposed a quick quantitative survey using a standardized UX instrument. Standardized questionnaires let researchers systematically measure subjective perceptions such as usability, efficiency, and satisfaction. The goal was to determine quickly whether users perceived the reporting and analytics tool's performance as slow or acceptable.
UEQ Scale Structure
The User Experience Questionnaire (UEQ) is a 26-item standardized instrument designed to measure multiple dimensions of user experience.
It includes several scales assessing both pragmatic qualities (such as efficiency and clarity) and hedonic qualities (such as attractiveness and stimulation).
The "slow–fast" item (part of the UEQ's Efficiency scale) directly measures users' perception of system responsiveness, which made the instrument appropriate for evaluating perceived performance.
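The scoring described above can be sketched in a few lines. The following is a minimal illustration (not the study's actual analysis script) of how 1–7 semantic differential responses map onto the UEQ's −3…+3 scoring convention and average into a scale mean; the item labels follow the published Efficiency scale, but the ratings and reverse-keying flags are hypothetical:

```python
from statistics import mean

# Hypothetical responses from one participant on the UEQ Efficiency
# scale (1-7 semantic differential; values are illustrative only).
# "reversed" marks items where the positive pole appears on the left,
# which must be flipped before averaging.
responses = {
    "slow / fast":             {"rating": 6, "reversed": False},
    "inefficient / efficient": {"rating": 5, "reversed": False},
    "impractical / practical": {"rating": 6, "reversed": False},
    "cluttered / organized":   {"rating": 5, "reversed": False},
}

def to_ueq_score(rating: int, reversed_item: bool) -> int:
    """Map a 1-7 rating onto the UEQ's -3..+3 convention (4 = neutral)."""
    score = rating - 4
    return -score if reversed_item else score

scores = [to_ueq_score(r["rating"], r["reversed"]) for r in responses.values()]
print(f"Efficiency scale mean: {mean(scores):+.2f}")  # positive = favorable
```

In the questionnaire itself, pole order is mixed across items, so the reverse-keying step matters before scale means are computed.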
Participants and Procedure
The program team identified clients who appeared to have experienced performance delays, and participants were recruited via email. The questionnaire was administered using SurveyGizmo (now Alchemer).
Results
General User Experience:
Overall perception of the reporting and analytics tool was positive, with ratings significantly above the neutral midpoint of 4, though there was still room for improvement.
Perceived System Performance:
Only 12% of participants perceived the reporting and analytics tool as slow.
32% of participants were neutral.
55% found it fast (rating of 5 or higher).
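The percentage buckets above can be derived from raw item ratings with a short script. Here is a minimal sketch, assuming the slow–fast item is coded 1–7 with 4 as the neutral midpoint; the ratings below are illustrative, not the study data:

```python
from math import sqrt
from statistics import mean, stdev

# Illustrative 1-7 ratings on the slow-fast item (7 = very fast).
# These are NOT the study data, just a shape-compatible example.
ratings = [5, 6, 4, 7, 5, 3, 6, 5, 4, 6, 5, 7]

n = len(ratings)
slow = sum(r <= 3 for r in ratings) / n     # below the neutral midpoint
neutral = sum(r == 4 for r in ratings) / n  # at the midpoint
fast = sum(r >= 5 for r in ratings) / n     # rating of 5 or higher

# One-sample t statistic against the neutral midpoint of 4:
# a t value above roughly 2 (with reasonable n) suggests perception
# is reliably above neutral.
t = (mean(ratings) - 4) / (stdev(ratings) / sqrt(n))

print(f"slow {slow:.0%}, neutral {neutral:.0%}, fast {fast:.0%}, t = {t:.2f}")
```

With the study's n = 126, a dedicated statistics package could also report the exact p-value for the comparison against the neutral rating.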
Overall, the results indicated that most users perceived the system's performance as acceptable, with only a small minority reporting slowness.
These findings suggested that pausing feature development to focus exclusively on performance optimization was not necessary, allowing the team to continue prioritizing new functionality while monitoring performance over time.
Reflections and Next Steps
This study demonstrated how a standardized UX questionnaire can help validate engineering priorities before resources are committed.
Running the UEQ annually would allow us to track how UX metrics (including performance perception) trend over time.
Future studies could combine survey results with system performance metrics to correlate perceived and actual latency.
Reviewing literature on perceived performance may also help identify design strategies that mitigate the impact of system delays.