Pixel art showing two phone screens: Left phone has bad usability with developer jargon, endless spinner, cryptic error message, dangerous adjacent buttons, and cluttered text. Right phone has good usability with clear labels, progress feedback, simple layout. Tagline: Software That Works vs. Software People Want to Use.

CS 3100: Program Design and Implementation II

Lecture 24: Usability

©2026 Jonathan Bell, CC-BY-SA

Learning Objectives

After this lecture, you will be able to:

  1. Define usability and describe the five key aspects of usability
  2. Identify stakeholders and their usability concerns
  3. Create personas to make design trade-offs explicit
  4. Recognize the relationship between usability and safety
  5. Apply Nielsen's 10 Usability Heuristics to evaluate an interface

Working Software Is Table Stakes — Usable Software Wins Users

Split comparison: Left shows SceneItAll passing all tests but homeowner walking away to use physical switches. Right shows same app with homeowner comfortably controlling home from couch. Labels: 'Passes all tests' vs 'Users actually use it'.

Usability Measures How Well Software Serves Humans Achieving Their Goals

Usability is a measure of how well an artifact (software, device, interface) supports humans in achieving their goals.

It's not a single property but rather a collection of related qualities:

  • Can users figure out how to use it?
  • Can they accomplish their actual tasks?
  • How much effort does it take?
  • Will they remember how to use it later?
  • Do they enjoy the experience?

Connection to L9: We identified stakeholders and their needs. Usability asks: can they actually meet those needs using our software?

Learnability and Effectiveness: Can Users Figure It Out and Succeed?

Learnability

How easy is it for users to accomplish tasks the first time?

SceneItAll test: Can a new user turn off living room lights without a tutorial?

  • ✓ High: See "Living Room" → tap → adjust
  • ✗ Low: See "Area Hierarchy Browser" → confused

Effectiveness

Can users successfully complete their intended tasks?

SceneItAll test: Can users create a "Movie Night" scene?

  • Working ≠ Discoverable
  • Attempted ≠ Completed
  • Completed ≠ Correct

Relationship: Users must first learn the interface before they can be effective. First impressions matter — users who fail early may never try again.

Productivity and Retainability: Efficiency Over Time

Productivity

How efficiently can users accomplish tasks once learned?

SceneItAll test: Turn off all downstairs lights

  • Low: 12+ taps through nested menus (2 min)
  • High: One "Downstairs Off" button (3 sec)

Same outcome, vastly different effort.

Retainability

How well do users maintain proficiency over time?

SceneItAll test: Guest returns next holiday — can they still use it?

  • Seasonal features (holiday lights yearly)
  • Rare operations (adding new device)
  • Occasional users shouldn't re-learn

Relationships: Effectiveness leads to productivity through practice. Learnability enables retainability — interfaces matching mental models are easier to remember. Retainability feeds back into productivity.

Satisfiability Connects Everything: The Complete Picture

Satisfiability

How pleasant is the experience?

  • Satisfied users explore more features
  • Satisfied users forgive occasional issues
  • Frustrated users abandon for alternatives

The feedback loop: Satisfaction → willingness to learn more → deeper effectiveness → greater productivity → more satisfaction

Design Decision | Helps | Hurts
Detailed onboarding tutorial | Learnability | Satisfiability (impatient users)
Voice command shortcuts | Productivity | Learnability (more to discover)
All options on one screen | Productivity | Learnability (overwhelming)

You can't maximize everything — good design requires knowing which aspects matter most for your users.

CLI vs. GUI: Neither Is Universal — Context Determines the Right Choice

Command-Line Interface

> sceneitall set living-room lights 50%
Living room lights set to 50%
> sceneitall activate "Movie Night"
Activating scene: Movie Night...
Aspect | Rating
Learnability | Low (must learn commands)
Effectiveness | High (if you know commands)
Productivity | Very high (for experts)
Retainability | Low (forget syntax)
Satisfiability | Varies (some love CLIs)

Graphical Interface

Visual room layout with sliders, buttons, and scene cards

Aspect | Rating
Learnability | High (visual, explorable)
Effectiveness | High (guided interactions)
Productivity | Medium (more clicks)
Retainability | High (visual cues)
Satisfiability | Generally higher

Neither is "better" — it depends on who's using it and for what.
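The CLI column above could be backed by a very small command parser. The sketch below uses Python's argparse to mirror the two example commands; the command and argument names are hypothetical, not a real SceneItAll API — the point is that a CLI front-end is cheap to build but puts the burden of learnability on the user.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical CLI mirroring the slide's examples.
    parser = argparse.ArgumentParser(prog="sceneitall")
    sub = parser.add_subparsers(dest="command", required=True)

    # sceneitall set living-room lights 50%
    set_cmd = sub.add_parser("set", help="set a device property in a room")
    set_cmd.add_argument("room")    # e.g. "living-room"
    set_cmd.add_argument("device")  # e.g. "lights"
    set_cmd.add_argument("level")   # e.g. "50%"

    # sceneitall activate "Movie Night"
    activate = sub.add_parser("activate", help="activate a named scene")
    activate.add_argument("scene")  # e.g. "Movie Night"

    return parser

args = build_parser().parse_args(["set", "living-room", "lights", "50%"])
print(args.command, args.room, args.device, args.level)
```

Nothing here makes the commands discoverable — a user must already know that `set` and `activate` exist, which is exactly the low-learnability, high-productivity trade-off in the table.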

Different Stakeholders Prioritize Usability Aspects Differently

Remember from L9: stakeholders are anyone who affects or is affected by the system. Different stakeholders care about different usability aspects.

Stakeholder | Usage Pattern | Usability Priorities
Primary owner | Daily, 10+ times | Productivity, Satisfiability
Family members | Daily, didn't choose system | Learnability, Effectiveness
Guests | Occasional (holidays) | Learnability, Retainability
Installer | One-time configuration | Effectiveness, Productivity
Neighbors | Indirect (affected by outdoor lights) | Safety implications

Same system, same features — but you can't optimize for everyone. You must make intentional choices about priorities.

Personas Make Stakeholders Concrete: Meet Marcus and Dorothy

Persona: A fictional but realistic user that makes trade-off discussions concrete — "Would Marcus find this useful? Would Dorothy be confused?"

Two persona cards: Marcus (45, tech-savvy, wants productivity and flexibility, quote: 'Don't hide features from me') and Dorothy (72, visiting grandparent, wants learnability and simplicity, quote: 'I don't want to break anything').

Features Marcus loves (automations, shortcuts) overwhelm Dorothy. Simplicity Dorothy needs frustrates Marcus.

Personas Reveal Which Trade-offs Matter

Design Decision | Marcus | Dorothy
Add onboarding tutorial | Skip it (wastes time) | Needs it (can't figure out otherwise)
Expose automation rules | Essential feature | Hidden complexity (confusing)
Voice command shortcuts | Daily use, loves them | Confusing ("what do I say?")
Simple big buttons | "Childish, wastes space" | Accessible, easy to tap
Nested area hierarchy | Powerful organization | "Where is the guest room??"

You can't optimize for everyone. Personas force you to choose primary users and make intentional trade-offs.

Connection to L9: Remember stakeholder analysis? Personas are the usability-focused refinement of that work.

Design Flexibility Can Serve Multiple Personas

Two phones showing the same smart home app: Dorothy's guest mode has 4 simple room buttons with big ON/OFF toggles. Marcus's owner mode shows rooms, scenes, automations, schedules, and settings. Same app, different experiences based on user needs.

Good design often means designing multiple experiences within one product.

Marcus and Dorothy Have Different Mental Models

Three users viewing SceneItAll with a hierarchy (Downstairs > Living Room > Reading Nook). When tapping 'All Lights Off' on Living Room, User 1 expects only Living Room affected, User 2 expects cascade up to Downstairs, User 3 expects cascade down to Reading Nook. All reasonable but contradictory.

Whatever SceneItAll actually does, it violates someone's mental model.

Good Design Either Matches Mental Models or Makes Behavior Visible

Strategy 1: Match the most common mental model

  • Research which expectation is most prevalent
  • Design behavior to match that expectation
  • Accept some users will need to adjust

Strategy 2: Make actual behavior clearly visible

  • Show cascade indicators: "This will affect 3 other areas"
  • Preview affected devices before confirming
  • Use animations that reveal what's happening

Strategy 3: Both

  • Choose sensible defaults that match common expectations
  • AND provide clear feedback about what's happening

When you can't match expectations, at least don't surprise users silently.
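Strategy 2 can be prototyped with a small tree walk: before executing a cascading command, compute every area it would reach, so the UI can show "This will affect N other areas." A minimal sketch, assuming a hypothetical parent-to-children area hierarchy like the one in the figure:

```python
# Hypothetical area hierarchy: each area maps to its child areas.
HIERARCHY = {
    "Downstairs": ["Living Room", "Kitchen"],
    "Living Room": ["Reading Nook"],
    "Kitchen": [],
    "Reading Nook": [],
}

def affected_areas(area: str, hierarchy: dict[str, list[str]]) -> list[str]:
    """Return the area plus every descendant a cascading command would reach."""
    reached = [area]
    for child in hierarchy.get(area, []):
        reached.extend(affected_areas(child, hierarchy))
    return reached

def preview(area: str) -> str:
    # Subtract 1 so the count excludes the area the user tapped.
    others = len(affected_areas(area, HIERARCHY)) - 1
    return f"'All Lights Off' on {area} will affect {others} other area(s)"

print(preview("Living Room"))  # cascades down into Reading Nook
```

Computing the preview before confirming is what turns a silent surprise into visible behavior — the system's actual cascade rule no longer has to match every user's mental model, because it is shown before it runs.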

Usability Failures Create Real Safety Risks

Safety isn't just physical:

Category | SceneItAll Example
Physical | Stairway lights off while someone's on stairs
Security | Outdoor lights "always off" creates vulnerability
Privacy | Guest can see all family schedules
Financial | Accidental bulk purchase of smart bulbs
Operational | Disabling smoke detector integration

Many "usability" issues are actually safety requirements:

  • "Confirm before locking all doors" — enhancement or safety?
  • "Show who is home before Away mode" — convenience or preventing lockout?
  • "Re-authenticate to share access" — friction or security?

⚠ Ask: "What could go wrong if a user misunderstands?"

Forward reference: We'll return to safety-critical systems in L35 (Safety and Reliability).

Human Error Has a Taxonomy: Reason's Classification

Reason's error taxonomy: Unsafe acts split into Unintended (Slips from attention failures, Lapses from memory failures) and Intended (Mistakes from wrong rules/knowledge, Violations from deliberate deviation).

The Usability Evaluation Dilemma: Best Evidence Comes Late

Timeline showing usability evaluation trade-offs: Early phase has low-cost paper prototypes with lower fidelity. Middle phase (sweet spots marked) has moderate-cost clickable prototypes. Late phase has expensive real implementation testing with highest fidelity but costly changes.

Three Approaches to Usability Evaluation

User Studies

Watch real users accomplish tasks

✓ Gold standard — direct evidence

✗ Expensive, time-consuming

Surveys & Feedback

App reviews, in-app feedback

✓ Large samples, real context

✗ Self-reported, no observation

Heuristic Evaluation

Experts check against principles

✓ Fast, cheap, works on prototypes

✗ May miss domain-specific issues

Today's focus: Heuristic Evaluation — experts systematically check interface against established principles. Nielsen found 3-5 evaluators catch ~75% of usability problems.

Nielsen's 10 Usability Heuristics: a 30-year-old framework that remains remarkably effective.

Nielsen's Heuristics: A 30-Year-Old Checklist That Still Works

H1: Visibility of system status

H2: Match between system and real world

H3: User control and freedom

H4: Consistency and standards

H5: Error prevention

H6: Recognition rather than recall

H7: Flexibility and efficiency of use

H8: Aesthetic and minimalist design

H9: Help users recognize, diagnose, and recover from errors

H10: Help and documentation

Developed in the 1990s by Jakob Nielsen, these heuristics capture recurring patterns in usability problems. Their relevance to modern apps (including mobile, IoT, voice interfaces) is a testament to the stability of human cognitive factors.

H1: Keep Users Informed About What Is Happening

Split comparison: Left shows user tapping Movie Night with no feedback, confused expression asking 'Did it work?' Right shows same action with 'Activating...' progress bar and individual device status updates, ending with 'Movie Night Active' confirmation.

The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

H2: Speak the Users' Language, Not Developer Language

Side-by-side: Left shows confusing developer terms like 'DeviceStateManager', 'LightEntity brightness=127'. Right shows same information as 'Living Room', 'Brightness: 50%' with natural language that users understand.

H3: Provide Clear Emergency Exits

Users often choose system functions by mistake and need a clearly marked "emergency exit" to leave the unwanted state.

SceneItAll applications:

  • "All Lights On" panic button when a scene goes wrong and everything's dark
  • Undo for accidental scene deletions — don't permanently destroy user work
  • Cancel during long operations — stop mid-execution if something's wrong
  • Easy mode switching — stuck in advanced view? Clear path back to simple mode

Connection to L12: Did our domain model capture "deleted" scenes as recoverable or truly destroyed? This is a design decision with usability implications.
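Undo for scene deletion usually means a soft delete: move the scene to a trash area instead of destroying it, so the "emergency exit" stays open. A minimal sketch — the `SceneStore` class and its methods are hypothetical, not part of any real SceneItAll codebase:

```python
class SceneStore:
    """Hypothetical scene store with soft delete, sketching H3's undo."""

    def __init__(self) -> None:
        self._scenes: dict[str, dict] = {}
        self._trash: dict[str, dict] = {}

    def add(self, name: str, settings: dict) -> None:
        self._scenes[name] = settings

    def delete(self, name: str) -> None:
        # Move to trash instead of destroying, so undo stays possible.
        self._trash[name] = self._scenes.pop(name)

    def undo_delete(self, name: str) -> None:
        self._scenes[name] = self._trash.pop(name)

    def names(self) -> list[str]:
        return sorted(self._scenes)

store = SceneStore()
store.add("Movie Night", {"living-room-lights": "10%"})
store.delete("Movie Night")       # user taps delete by mistake...
store.undo_delete("Movie Night")  # ...and the emergency exit restores it
print(store.names())
```

A production version would also expire trashed scenes after some grace period, but the design decision is the same one L12's domain model had to make: "deleted" as recoverable state, not destruction.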

H4: Consistency Builds Trust and Reduces Cognitive Load

Users should not have to wonder whether different words, situations, or actions mean the same thing.

SceneItAll consistency requirements:

  • Tapping a device should always show its controls, everywhere in the app
  • Same gestures for same actions — swipe to dismiss, long-press for options
  • Consistent terminology — don't call it "Scene" in one place and "Routine" in another
  • Platform conventions matter — iOS users expect different patterns than Android users

Consistency is internal (within your app) and external (with platform norms).

H5: Design to Prevent Errors, Not Just Handle Them

Split comparison: Left shows schedule with 'Lights Off' at 3AM and 'Wake Up' at 2AM with no warning about the illogical order. Right shows same configuration with a warning asking 'Wake Up will run BEFORE Lights Off - is this intended?' with fix options.
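The warning in the figure amounts to a simple consistency check over the schedule: flag any pair of events whose firing times contradict the order the user listed them in. A sketch, assuming a hypothetical `(name, time)` event format:

```python
from datetime import time

# Hypothetical schedule: (event name, time it fires), in the user's listed order.
schedule = [("Lights Off", time(3, 0)), ("Wake Up", time(2, 0))]

def order_warnings(schedule: list[tuple[str, time]]) -> list[str]:
    """Warn when a later-listed event actually fires before an earlier one."""
    warnings = []
    for i, (name_a, t_a) in enumerate(schedule):
        for name_b, t_b in schedule[i + 1:]:
            if t_b < t_a:
                warnings.append(
                    f"'{name_b}' will run BEFORE '{name_a}' - is this intended?"
                )
    return warnings

for w in order_warnings(schedule):
    print(w)
```

The check costs a few lines at configuration time; catching the inverted schedule only after the user wakes up in the dark costs trust. That is the essence of H5 — move the error from runtime to design time.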

H6: Recognition Over Recall Reduces Memory Burden

Minimize the user's memory load by making objects, actions, and options visible. Users should not have to remember information from one part of the interface to another.

SceneItAll applications:

  • Show current device states in scene editor, not just target states
    • "Currently: 75%" when setting brightness — user knows starting point
  • Display which scenes affect which devices
    • Don't make users remember that "Movie Night" controls living room shades
  • Show recent choices
    • Last-used scenes, recent devices, common operations

Recognition is easier than recall. Seeing "Living Room" is easier than remembering it.

H7: Flexibility Serves Both Novices and Experts

Accelerators — unseen by the novice user — may speed up the interaction for the expert user, allowing the system to cater to both inexperienced and experienced users.

SceneItAll flexibility spectrum:

User Level | Interaction Style
Novice | Tap room → tap device → adjust slider
Intermediate | Use scene buttons for common operations
Expert | Voice command: "set office lights to 30%"
Power user | CLI: sceneitall set office lights 30%

Connection to earlier: This is the CLI vs. GUI trade-off — the answer is often "both."

H8: Every Extra Element Competes for Attention

Interfaces should not contain information that is irrelevant or rarely needed. Every extra unit of information competes with the relevant information and diminishes their relative visibility.

SceneItAll prioritization:

Emphasize | De-emphasize
On/Off toggle | Firmware version
Brightness slider | Network address
Color picker | Last-updated timestamp
Room name | Device ID

Minimalist ≠ minimal features. It means every visible element earns its space by serving user goals.

H9: Error Messages Should Explain Problems and Suggest Solutions

Bad Error Message

DeviceConnectionException: 
timeout at IoTBridge.sendCommand()
line 247

  • Technical jargon
  • No explanation of cause
  • No path forward
  • Makes user feel stupid

Good Error Message

Couldn't reach Kitchen Light

The light isn't responding. Try:
• Check if it's plugged in
• Move closer to reduce distance
• Wait a moment and try again

[Try Again] [Skip This Device]

  • Plain language
  • Likely cause explained
  • Actionable suggestions
  • Clear next steps
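In code, moving from the left column to the right is often a translation layer that maps internal exceptions to plain-language messages with suggested actions. A sketch — the exception class and message wording are illustrative, modeled on the examples above:

```python
class DeviceConnectionError(Exception):
    """Illustrative internal error raised when a device does not respond."""

def user_message(err: Exception, device_name: str) -> str:
    # Map internal errors to plain language with actionable next steps,
    # instead of surfacing the raw exception (H9).
    if isinstance(err, DeviceConnectionError):
        return (
            f"Couldn't reach {device_name}\n\n"
            "The light isn't responding. Try:\n"
            "- Check if it's plugged in\n"
            "- Move closer to reduce distance\n"
            "- Wait a moment and try again"
        )
    # Fallback: still plain language, never a stack trace.
    return f"Something went wrong with {device_name}. Please try again."

try:
    raise DeviceConnectionError("timeout at IoTBridge.sendCommand()")
except Exception as err:
    print(user_message(err, "Kitchen Light"))
```

Keeping this mapping in one place also makes it easy to log the technical detail for developers while showing only the user-facing text — both audiences get the message written in their language (H2).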

H10: Help Should Be Searchable, Task-Focused, and Concise

Even though it is better if the system can be used without documentation, it may be necessary to provide help. Such information should be easy to search, focused on the user's task, list concrete steps, and not be too large.

SceneItAll help patterns:

  • Contextual help: "How do I create a scene?" appears on scenes screen
  • Task-focused: Steps to accomplish specific goals, not feature descriptions
  • Searchable: "How do I..." queries return relevant results
  • Not a PDF manual: No 50-page document about IoT protocols

The best help is available where and when users need it.

Key Takeaways: Building Software People Actually Want to Use

  1. Five aspects of usability (learnability, effectiveness, productivity, retainability, satisfiability) — and they trade off against each other

  2. Personas make trade-off decisions concrete and explicit — "Would Marcus find this useful? Would Dorothy be confused?"

  3. Different stakeholders have different priorities (L9 connection) — you can't optimize for everyone

  4. Poor usability has safety implications — physical, security, privacy, financial, and operational risks

  5. Nielsen's heuristics provide a systematic evaluation framework — a 30-year-old checklist that still works

  6. Mental models bridge domain understanding to interface design (L12 connection) — when the system matches user expectations, it "just works"

Looking Ahead

Next up: Exam Review → Exam

  • Today's usability content is not on the exam
  • Focus your exam prep on material through last week

After the exam: User-Centered Design (UCD) Process

  • How to integrate usability throughout development
  • Prototyping techniques (paper, wireframe, interactive)
  • Iterative testing and refinement

Later this semester: Safety and Reliability (L35)

  • When usability failures cause real harm
  • The intersection of usability and safety requirements

Today we learned to evaluate usability. After the exam, we'll learn to design for it from the start.