🧪 Uxcel Heuristic Analysis
Research Overview
Objective: Conduct a competitive audit and use it to define what quality means in the space of UX design courses. With that benchmark established, evaluate the Uxcel site from a heuristic perspective, using industry-standard tools and methodologies to identify critical usability issues and competitive gaps.
Duration: 8h
Competitors in this study
UX Artifacts
- Raw data from all tests (*.csv)
Cleaned data:
- 🏷️ Keywords (Google Sheets)
- 🗂️ Performance Tests Results (Google Sheets)
- 🎨 Flows (Figma)
- 📺 Presentation (FigJam)
- ✍️ Content Audit (Google Sheets, a project I've already performed)
Tools Used
- Performance: Google Lighthouse
- Code Quality: W3C HTML & CSS Validators
- Accessibility: IBM Accessibility Checker
- Visualisation of flows: Figma
- Presentation: FigJam
- Data collection & analysis: Google Sheets
Figma Plugins
- Visual Sitemap (for FigJam)
- html.to.design — by ‹div›RIOTS
Methodology
Heuristic evaluation combined with technical performance auditing and competitive benchmarking
Research Methodology & Process
Phase 1: Establishing the Research Framework
Tool Selection Rationale: I chose a multi-faceted approach, combining UX heuristics with technical performance metrics, to build a holistic view of platform usability and a clear sense of what quality means among UX design course platforms:
- Google Lighthouse: Industry standard for Core Web Vitals and performance measurement
- W3C HTML/CSS Validators: Baseline code quality assessment
- IBM Accessibility Checker: WCAG 2.1 compliance analysis
- Nielsen's 10 Heuristics: Systematic usability evaluation framework
Research Questions Defined:
- How does Uxcel's technical performance impact user experience compared to competitors?
- What accessibility barriers exist that could exclude users or create legal liability?
- How do performance issues translate to business costs and user retention?
- What systematic improvements would yield the highest UX impact?
Phase 2: Competitive Landscape Analysis
Competitor Selection Process: I identified key competitors based on:
- Target audience overlap (UX/UI design education)
- Platform complexity (interactive learning features)
- Market positioning (professional development focus)
Selected Platforms:
- Interaction Design Foundation (established academic approach)
- DesignLab (bootcamp-style intensive learning)
- Memorisely (Figma-focused design training)
- Uxcel (gamified bite-sized learning)
Standardized Testing Protocol:
Testing Environment:
- Device: Desktop Chrome browser
- Network: Standard broadband connection
- Cache: Cleared before each test
- Extensions: Disabled for consistency
- Time: Conducted during off-peak hours to minimize CDN variability
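For reproducibility, that environment can also be scripted. A minimal sketch, assuming the Node `lighthouse` and `chrome-launcher` packages (each run launches a fresh headless Chrome profile, so the cache starts empty and no extensions load):

```ts
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

// Run one Lighthouse audit against a fresh, extension-free headless Chrome instance.
async function auditOnce(url: string) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  try {
    const result = await lighthouse(url, {
      port: chrome.port,
      output: 'json',
      onlyCategories: ['performance', 'accessibility', 'best-practices', 'seo'],
    });
    return result?.lhr; // the Lighthouse report object reused in the later phases
  } finally {
    await chrome.kill();
  }
}

// Example: const lhr = await auditOnce('https://uxcel.com');
```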
Phase 3: Technical Performance Audit Methodology
Google Lighthouse Analysis Process:
I ran comprehensive Lighthouse audits collecting these metrics:
Core Web Vitals:
- First Contentful Paint (FCP) - perception of loading speed
- Largest Contentful Paint (LCP) - actual loading completion
- Total Blocking Time (TBT) - interaction responsiveness
- Cumulative Layout Shift (CLS) - visual stability
Performance Categories:
- Performance Score (weighted algorithm of all metrics)
- Accessibility Score (automated WCAG compliance check)
- Best Practices Score (modern web development standards)
- SEO Score (search engine optimization factors)
Data Collection Framework: I created a systematic spreadsheet to track:
Platform Name | URL | Fetch Time | Performance Score | Accessibility Score | Best Practices Score | SEO Score | FCP Score | FCP Value | LCP Score | LCP Value | TBT Score | TBT Value | CLS Score | CLS Value
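To populate that sheet consistently, each Lighthouse JSON report can be flattened into one row. A hedged sketch: the audit IDs are Lighthouse's own, but the loose `Lhr` type and the `toRow` helper are mine, and field names may differ slightly between Lighthouse versions.

```ts
// Loose structural view of a Lighthouse report (lhr), typed only as far as this sketch needs.
type Lhr = {
  finalDisplayedUrl?: string;
  requestedUrl?: string;
  fetchTime: string;
  categories: Record<string, {score: number | null}>;
  audits: Record<string, {
    score: number | null;
    displayValue?: string;
    numericValue?: number;
    details?: {overallSavingsMs?: number; overallSavingsBytes?: number};
  }>;
};

// Flatten one report into a CSV row matching the sheet columns above.
function toRow(platform: string, lhr: Lhr): string {
  const cat = (id: string) => lhr.categories[id]?.score ?? '';
  const cwv = ['first-contentful-paint', 'largest-contentful-paint',
               'total-blocking-time', 'cumulative-layout-shift'];
  return [
    platform, lhr.finalDisplayedUrl ?? lhr.requestedUrl ?? '', lhr.fetchTime,
    cat('performance'), cat('accessibility'), cat('best-practices'), cat('seo'),
    ...cwv.flatMap(id => [lhr.audits[id]?.score ?? '', lhr.audits[id]?.displayValue ?? '']),
  ].join(',');
}
```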
Resource Optimization Analysis: For each platform, I documented:
- Total byte weight and scoring impact
- Render-blocking resources and delay measurements
- Unused CSS/JavaScript calculations
- Image optimization opportunities
- Modern format adoption rates
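The same report (and the loose `Lhr` type from the sketch above) exposes these opportunities as well; the audit IDs below are standard Lighthouse audits, while the `resourceFindings` helper is my own:

```ts
// Pull resource-level findings out of a Lighthouse report: bytes shipped, blocking time,
// unused code, and potential image-format savings.
function resourceFindings(lhr: Lhr) {
  const a = lhr.audits;
  return {
    totalByteWeightBytes: a['total-byte-weight']?.numericValue,
    renderBlockingSavingsMs: a['render-blocking-resources']?.details?.overallSavingsMs,
    unusedCssBytes: a['unused-css-rules']?.details?.overallSavingsBytes,
    unusedJsBytes: a['unused-javascript']?.details?.overallSavingsBytes,
    modernImageFormatSavingsBytes: a['modern-image-formats']?.details?.overallSavingsBytes,
    imageOptimizationSavingsBytes: a['uses-optimized-images']?.details?.overallSavingsBytes,
  };
}
```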
Phase 4: Accessibility Research Methodology
IBM Accessibility Checker Implementation:
I used IBM's automated scanning tool to conduct WCAG 2.1 compliance analysis:
Scanning Parameters:
- Ruleset: WCAG 2.1 AA standards
- Scope: Full page scan including dynamic content
- Browser: Chrome with accessibility extensions
- Screen reader simulation: NVDA compatibility testing
Data Structure Created:
Issue Categories:
- Violation Level (A, AA, AAA)
- Rule Type (skip_main_exists, aria_content_in_landmark, text_contrast_sufficient)
- Element Location (XPath for precise identification)
- WCAG Checkpoint Reference
- Remediation Priority Scoring
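A hypothetical TypeScript shape for one logged issue, mirroring those columns (the field names are mine, not the IBM tool's output format):

```ts
// One accessibility finding as tracked in the sheet; values in comments are illustrative.
type WcagLevel = 'A' | 'AA' | 'AAA';

interface AccessibilityIssue {
  violationLevel: WcagLevel;          // conformance level of the failed rule
  ruleType: string;                   // e.g. 'text_contrast_sufficient'
  elementXPath: string;               // precise location of the offending element
  wcagCheckpoint: string;             // e.g. '1.4.3 Contrast (Minimum)'
  remediationPriority: 1 | 2 | 3 | 4; // 1 = fix first
}
```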
Critical Issues Identified: I systematically categorized 1,527 accessibility violations across:
- Navigation and landmark structure
- Color contrast compliance
- Alternative text coverage
- Keyboard accessibility
- Screen reader compatibility
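Reusing that `AccessibilityIssue` shape, the 1,527 findings can be tallied per area with a small helper; `categorize` is a hypothetical mapping from rule type to one of the five areas above:

```ts
type IssueArea = 'navigation' | 'contrast' | 'alt-text' | 'keyboard' | 'screen-reader';

// Count issues per area so severity and remediation effort can be compared at a glance.
function summarize(
  issues: AccessibilityIssue[],
  categorize: (ruleType: string) => IssueArea,
): Map<IssueArea, number> {
  const counts = new Map<IssueArea, number>();
  for (const issue of issues) {
    const area = categorize(issue.ruleType);
    counts.set(area, (counts.get(area) ?? 0) + 1);
  }
  return counts;
}
```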
Phase 5: Code Quality Assessment Process
W3C Validation Methodology:
I ran systematic validation tests using W3C's official validators:
HTML Validation Process:
- Markup Validator for structural compliance
- Error categorization (critical vs. warning level)
- Semantic HTML5 usage assessment
- Document structure analysis
CSS Validation Process:
- CSS Validator for syntax and property compliance
- Error impact assessment on rendering
- Performance impact of validation failures
- Cross-browser compatibility implications
Results Documentation:
Validation Metrics per Platform:
- HTML Errors (structural issues)
- HTML Warnings (best practice violations)
- CSS Errors (syntax problems)
- CSS Warnings (compatibility concerns)
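For the HTML side, these counts can be pulled from the W3C Nu HTML Checker's JSON output; a sketch assuming its public endpoint (CSS validator numbers would be collected analogously through that tool's own interface):

```ts
// Count HTML errors vs. warnings for one URL via the Nu HTML Checker's JSON output.
async function htmlValidation(url: string): Promise<{errors: number; warnings: number}> {
  const endpoint = `https://validator.w3.org/nu/?out=json&doc=${encodeURIComponent(url)}`;
  const res = await fetch(endpoint, {headers: {'User-Agent': 'heuristic-audit-sketch/0.1'}});
  const {messages} = (await res.json()) as {messages: {type: string; subType?: string}[]};
  return {
    errors: messages.filter(m => m.type === 'error').length,
    warnings: messages.filter(m => m.type === 'info' && m.subType === 'warning').length,
  };
}
```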
Phase 6: Heuristic Evaluation Process
Well, everyone tells us to follow Nielsen's 10 Heuristics, right? The classic application goes something like this:
Evaluation Protocol:
- Task-based scenarios: simulated real user journeys
- Severity rating scale: 1-4 impact assessment
- Evidence documentation: screenshots and interaction recordings
- Cross-platform consistency: mobile vs. web experience comparison
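If I were logging findings under that protocol, each one might be captured as a record like this (a hypothetical shape, not something prescribed by Nielsen or Uxcel):

```ts
// One finding from a Nielsen-style evaluation pass.
interface HeuristicFinding {
  heuristic: string;            // e.g. 'Visibility of system status'
  task: string;                 // the simulated user journey being walked through
  severity: 1 | 2 | 3 | 4;      // 1 = low impact, 4 = critical
  evidence: string[];           // screenshot / recording references
  platform: 'mobile' | 'web';   // for cross-platform consistency comparison
}
```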
I don't entirely disagree with that approach, but I prefer to work in my own way.
First, it's not possible to prioritize effectively and judge impact on a scale without conducting a competitive audit. The audit establishes the benchmark for quality within the context and industry you're working in.
So, like I said, I approached this analysis in my preferred way of dealing with a site and app as complex as Uxcel, especially while onboarding myself to the project. To that end, I used the html.to.design plugin mentioned earlier to import large portions of the site and app into Figma. This let me leverage Figma's native flow features to visualize the user journey, as shown below, and start testing usability; if I were working in a team, I would also use Figma's commenting feature wherever issues arose.
After that, visualizing the current sitemap in FigJam became more straightforward, and I could then create a suggested sitemap for the future, using microfrontends along these lines:
- www.uxcel.com (ONLY marketing for individuals)
- teams.uxcel.com (ONLY marketing for teams)
- blog.uxcel.com (ONLY blog)
- help.uxcel.com (Help center)
- app.uxcel.com (as it is)
- myaccount.uxcel.com (dealing with the account, user profile & billing)
- partner.uxcel.com (affiliate)
- legal.uxcel.com (Terms & Conditions, Cookies, Privacy, etc.)
Once completed, I began organizing the material in FigJam, using it as a workspace to annotate my thoughts.
Next steps for this project
- Gather more thoughts about improvements in FigJam
- Duplicate the design file into which I imported the Uxcel site & app, and create a new prototype based on the suggestions I made in FigJam.
- Perform a more traditional heuristic analysis in accordance with Nielsen's 10 Heuristics (the systematic usability evaluation framework) and gather the findings in a Google Sheets/Excel document.