Empowering Cybersecurity Learners

Redesigning the cybersecurity learning experience for 800k+ users

Role: Product Designer

Client: picoCTF

Duration: 8 months (Jan - Aug 2025)

Team: 5 MHCI students

Skills: Product Strategy, UX Research, Information Architecture, Interaction Design, Prototyping, Design Systems, Video Editing

When picoCTF analytics revealed that 35.9% of users weren't completing a single cybersecurity challenge due to a lack of structure and guidance, our MHCI capstone team designed learning paths with contextual support and personalized feedback, helping beginners stay engaged while developing real-world cybersecurity skills and improving platform retention.

Solution Demo

A video I directed, filmed, and edited demonstrating our solution in real-world context

Editing Software: DaVinci Resolve

The Challenge

picoCTF analytics revealed that 35.9% of users weren't completing a single cybersecurity challenge due to a lack of structure and guidance. Originally designed as a competition tool, picoCTF evolved into a year-round learning platform but still felt like an autograder rather than a comprehensive educational environment.

The Goal: Retain existing users and transform picoCTF from a competition platform into an engaging learning environment for 800,000+ users.

Research & Discovery

4-month research phase (Jan - May 2025)

26

User interviews

800+

Survey responses

9

Subject matter expert interviews (EdTech, learning sciences)

1

High school focus group

Key Insight: Focus on Self-Initiated Learners

Our high school focus group revealed that unmotivated users need more than a platform redesign—they need external support like teachers and structured programs. We pivoted to focus on self-initiated learners: users with intrinsic motivation who just need better platform support to achieve their goals.

Three Design Principles

Research revealed what these learners need:

  1. Confidence - Show progress and clear next steps

  2. Determination - Motivate and reward effort

  3. Social Connection - Enable collaborative learning

Scope Decision: We deprioritized social features after client alignment, as child safety protections would exceed our capacity and timeline. We focused on confidence and determination features we could implement effectively.

Design Process: Narrowing Down 120+ Ideas

We tested many ideas, good and bad

Setting clear learning goals and testing with an open mind helped us explore unmet user needs through unique methods.

Our validation approach:

120+

Ideas

30

Prototypes tested

7+

Testing methods

5

Major solutions

Ideas We Explored

We tested everything from social features to AI tutoring to understand what truly resonated with learners.

picoAmbassador - Mentorship program connecting experienced users with beginners

Study Groups - Collaborative spaces for solving challenges together

Certificates - Tangible rewards for completing learning milestones

picoMobile - On-the-go cybersecurity learning app

picoFaceOff - Low-stakes competitions for friendly rivalry

In-platform Rewards - Progress badges and achievement tracking

Learning Paths - Structured guidance through curated challenges ✓

AI Assistant - Intelligent tutoring and strategy reflection ✓

Rapid Prototyping with AI

To accelerate our testing cycles, I leveraged AI prototyping tools (v0.dev) to transform Figma wireframes into functional prototypes. This workflow sped up our design process significantly—instead of spending hours connecting arrows in Figma, we deployed interactive prototypes that users could actually click through and experience.

The benefit? Faster iterations meant more rounds of testing, leading to stronger insights. We could test 30 prototypes in a week because AI tools eliminated the bottleneck of manual prototype building.

What We Learned Through Testing

  1. Rewards need context. Badges alone weren't motivating—what mattered was clear indicators of learning outcomes. Users wanted rewards that reflected meaningful progress, not just participation.

  2. AI should solve impossible problems. Rather than forcing AI into the product, we asked: "What user needs can AI solve that would be impossible to scale manually?" This reframe led to our AI reflection feature that analyzes user strategies—something that would require massive human effort otherwise.

  3. Implementation matters more than concept. The same idea could succeed or fail depending on execution. Personalized features tied to specific user goals consistently outperformed generic one-size-fits-all solutions.

  4. Social features need safeguarding. Through participatory design workshops with our client (using "Tarot Cards of Tech"), we uncovered that social features would require extensive child safety protections beyond our timeline. We strategically deprioritized these to focus on what we could deliver effectively.

Strategic Prioritization

Using frameworks like Impact vs. Effort Matrix and MoSCoW prioritization, we collaborated with our client to narrow 120 ideas down to solutions that delivered maximum value within our constraints. We focused on features addressing confidence and determination—the two pillars we could implement effectively while maintaining platform safety.

Impact vs. Effort Matrix

MoSCoW

The Solution: A Layered Learning System

Guidance, Momentum, and Reinforcement

Layer 1: Guidance – Learning Paths

Powered by insights from user testing, we designed a new Learning Paths feature: structured, step-by-step journeys through curated challenges that help beginners build skills with confidence.

Learning Path Introduction

CTF Interface

Personalized Recommendations for Next Steps

Layer 2: Momentum – Gamified Progress

My Focus

The existing profile was a simple visualization of challenges solved—no context, no motivation, no sense of growth.

The Challenge: How could we scale this to accommodate our new features while making it a delightful and useful experience?

My Role: I designed a comprehensive profile system that transforms abstract numbers into tangible markers of progress, creating meaningful feedback loops that drive continued engagement. 

Designing the Profile Experience

The redesigned profile serves as a personal dashboard that celebrates growth and maintains momentum through five key components:

  1. Activity Visualization — Seeing Consistency

A GitHub-style contribution tracker shows at-a-glance when users are active, making consistent practice visible and rewarding. Users can see their learning patterns over time and identify when they're most productive.

  2. Challenge Completion Tracking — Understanding Progress

A color-coded breakdown (Easy/Medium/Hard) maps user progression across difficulty levels. This helps users see not just how many challenges they've completed, but what kind—showing skill development beyond raw numbers.

  3. Skills Analysis — Modeling Growth

A radar chart visualizes competency across cybersecurity categories (Web Exploitation, Cryptography, Reverse Engineering, Forensics, Binary Exploitation). This helps users identify strengths and areas for growth while providing concrete evidence of skill development.

  4. Learning Path Progress — Maintaining Direction

Active learning paths display with clear progress indicators, showing users exactly where they are and what's next. This creates a seamless connection between the profile and the learning experience.

  5. Event Participation — Community Connection

The profile also tracks competition participation, showing which events users have competed in, team affiliations, and points earned.

User achievements are private by default, ensuring privacy and safety for users, many of whom are children.

The Badge System: Delight and Motivation

Solve Count Badges (5, 10, 25, 50, 100+ solves)
For dedicated learners hitting meaningful milestones

Streak Badges (1, 2, 4, 8, 16+ week streaks)
Recognizing consistent practice and habit formation

Learning Path Badges (by topic)
Celebrating completed learning journeys with topic-specific designs
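
The tier thresholds above lend themselves to a simple lookup. A minimal sketch of how earned badges could be derived from a user's stats; the names and structure here are hypothetical illustrations, not picoCTF's actual implementation:

```python
# Hypothetical sketch of the badge thresholds described above; names and
# structure are illustrative, not picoCTF's actual code.

SOLVE_TIERS = [5, 10, 25, 50, 100]   # solve-count badge milestones
STREAK_TIERS = [1, 2, 4, 8, 16]      # weekly-streak badge milestones

def earned_badges(solves: int, streak_weeks: int) -> list[str]:
    """Return the badge labels a user has unlocked so far."""
    badges = [f"{t}+ solves" for t in SOLVE_TIERS if solves >= t]
    badges += [f"{t}-week streak" for t in STREAK_TIERS if streak_weeks >= t]
    return badges

# e.g. a user with 27 solves and a 4-week streak unlocks six badges:
# earned_badges(27, 4) -> ['5+ solves', '10+ solves', '25+ solves',
#                          '1-week streak', '2-week streak', '4-week streak']
```

Expressing the tiers as plain lists like this makes it easy to add milestones later without touching the unlock logic.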

Streak Mechanics: Building Habits Through Retention

The Insight: During testing, users described seeing their streak fill up as "a satisfying dopamine boost" that encouraged them to return regularly.

To avoid overwhelming users, I designed a weekly streak system that:

  • Shows current streak count prominently

  • Displays activity across the current week (5 of 7 days active)

  • Provides gentle encouragement to maintain momentum

  • Resets gracefully to avoid punishing users for breaks

Why weekly? Our research showed that weekly streaks hit the sweet spot—regular enough to build habits, forgiving enough to avoid burnout. This creates sustainable motivation rather than stressful pressure.
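
The weekly mechanic can be sketched in a few lines. This is a hypothetical illustration of the rules described above (a week counts if the user was active on any day of it, and an inactive in-progress week doesn't break the streak yet), not picoCTF's production code:

```python
from datetime import date, timedelta

def weekly_streak(active_days: set[date], today: date) -> int:
    """Count consecutive weeks, ending this week, with at least one active day.

    Hypothetical sketch of the weekly streak mechanic: the streak only
    resets once a full week passes with no activity, so short breaks
    within a week aren't punished.
    """
    def week_start(d: date) -> date:
        return d - timedelta(days=d.weekday())  # Monday of d's week

    active_weeks = {week_start(d) for d in active_days}
    streak, week = 0, week_start(today)
    # The current week may still be in progress, so an inactive current
    # week doesn't break the streak yet -- skip back to last week instead.
    if week not in active_weeks:
        week -= timedelta(weeks=1)
    while week in active_weeks:
        streak += 1
        week -= timedelta(weeks=1)
    return streak
```

For example, activity on any day in each of the last three calendar weeks yields a streak of 3, while a fully inactive previous week resets the count to zero.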

The profile system creates a virtuous cycle of motivation: recognition through badges, direction through learning path progress, community through event participation, and habit formation through streaks. Together, these elements transform the profile from a static stats page into an active motivator that keeps users engaged and returning to the platform.

Layer 3: Reinforcement – AI-Powered Strategy Reflection

We designed an AI-powered feature that compares user approaches to official walkthroughs, providing personalized feedback at scale through an accessible, low-friction design. 

Will picoCTF users spend 5 minutes documenting their solution approach for an AI learning summary?

We validated interest through a smoke test with real users.

After solving a CTF challenge, 20% of users received a prompt to try the AI reflection assistant on their solution.

17%

clicked "Try AI Reflection", beating our 15% target
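
A smoke test like this is typically run with a deterministic cohort split so each user consistently sees (or doesn't see) the prompt. The sketch below is an assumption about how such a split could work; the function names and bucketing scheme are illustrative, not our actual implementation:

```python
import hashlib

def in_smoke_test(user_id: str, rollout: float = 0.20) -> bool:
    """Deterministically assign ~20% of users to see the AI Reflection prompt.

    Hypothetical sketch: hashing the user id into one of 10,000 buckets
    gives a stable, roughly uniform assignment across the user base.
    """
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 10_000
    return bucket < rollout * 10_000

def click_through_rate(prompted: int, clicked: int) -> float:
    """Fraction of prompted users who clicked 'Try AI Reflection'."""
    return clicked / prompted

# A 17% click-through rate clears a 15% success threshold:
# click_through_rate(100, 17) >= 0.15 -> True
```

Hashing rather than random sampling means a returning user stays in the same cohort across sessions, which keeps the measured click-through rate clean.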

Layer 4: Design System

Redesigned Information Architecture

Just 1.4% of 800,000 users completed a playlist—features were essentially invisible.

Building on this research, we delivered a new information architecture, grounded in usability best practices and our user research findings, to improve feature discoverability and make learning resources more visible and accessible.

  1. New homepage serving as a central hub

The original picoCTF website had no home page! We designed a homepage that serves as a starting point and central hub to help users start, resume, and explore their journey. They can now discover learning resources and features that are relevant to their learning journey that they previously couldn't find.

  2. Restructured navigation based on user mental models

  3. Modern design system with clear visual hierarchy

We redesigned the look and feel of picoCTF: the previous visuals were dated and lacked clear visual hierarchy, making it even harder for users to find important features.

Measuring Success & Impact

AI Reflection smoke test: 17% engagement, beating our 15% target

SUS (System Usability Scale) score: 86

Our designs are shipping in late 2025 as Phase 1 of a 3-phase roadmap.

Client Feedback

Looking Towards the Future

What Worked?

  • Early testing saved us from expensive mistakes. Assumption artifacts revealed that well-researched concepts can still fail—understanding what users do beats what they say.

  • "Wrong" paths led to breakthroughs. Testing with uninterested users felt like a dead end at first, but it taught us that platform redesigns alone can't spark interest from scratch—we needed to focus on self-initiated learners, our real target audience.

  • Designer intuition + data > data alone. With 800,000+ users, perfect data is impossible. Conflicting feedback from different tests created decision paralysis until we used prioritization frameworks and trusted our design instincts to dig ourselves out.  

What I'd Do Differently

  • Prioritize accessibility from day one. Timeline constraints pushed accessibility and responsiveness to the side—a trade-off I'd reverse given another chance. 

  • Establish clearer success metrics upfront. While some metrics can only be validated post-launch, I'd create a clearer measurement framework earlier, with pre- and post-launch validation plans.

  • Align earlier with stakeholders on constraints. Surfacing feasibility limits sooner would have avoided deprioritizing features late in the process.

My capstone team during our final presentation (left to right: Sanjna Subramanian, Zoe Mercado, Melissa Gibney, Hedy Hui, Megan Chai)

Let's Work Together

© 2025. Designed by Zoe Mercado
