Dork: Software Engineering Case Study - Building a Text-Based RPG

Updated: November 7, 2025
software-engineering python agile ci-cd testing

Project Overview

Dork: Zork-inspired text-based RPG demonstrating professional development practices in a team environment.

Quick Facts: Python 3.7+ • 4 developers, 8 weeks (4 sprints) • ~3,500 LOC, 85%+ test coverage • CI/CD: Travis CI + SonarCloud

Live: gitnick-dork.readthedocs.io • GitHub


Agile in Action

Scrum Framework

2-week sprints with daily 15-min standups, sprint planning (4 hours), and retrospectives (2 hours).

Velocity evolution:

  • Sprint 1: 13 points
  • Sprint 2: 18 points
  • Sprint 3: 21 points (team improving)
  • Sprint 4: 20 points

Key insight: Team velocity stabilizes after ~2 sprints. Use historical velocity for planning, not wishful thinking.

User Stories & Backlog

Priority-driven product backlog: Core game loop → Combat → Inventory → Save/load → Story → Polish

Each user story was broken into tasks of under 8 hours, tracked on a GitHub Projects board.


Architecture

Core structure:

Game (Singleton)
  ├── Player
  ├── World (Factory pattern)
  ├── CombatSystem
  ├── InventorySystem
  └── SaveManager
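
A minimal sketch of that composition root, assuming a classic Python singleton (class and attribute names here are illustrative, not Dork's actual API):

class Player:             # placeholder subsystems -- the real classes
    pass                  # live in their own modules

class World:
    pass

class Game:
    """Root object: one instance owns every subsystem."""

    _instance = None

    def __new__(cls):
        # Singleton: the first call builds the instance, later calls reuse it.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.player = Player()
            cls._instance.world = World()
        return cls._instance

assert Game() is Game()   # every caller sees the same game state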

Design Patterns Applied

Factory Pattern for procedural generation, Command Pattern for undo/redo support, Observer Pattern for event system (combat triggers, analytics).

Why patterns matter: Testability, extensibility, team communication via shared vocabulary.
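
As a concrete example, the event system can be wired up as a small Observer-style bus. This is a hedged sketch, not Dork's actual code; the EventBus class and event names are invented for illustration:

class EventBus:
    """Observer pattern: subsystems subscribe to named events."""

    def __init__(self):
        self._handlers = {}

    def subscribe(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def publish(self, event, **data):
        for handler in self._handlers.get(event, []):
            handler(**data)

bus = EventBus()
bus.subscribe("enemy_defeated", lambda name, xp: print(f"+{xp} XP for defeating the {name}"))
bus.subscribe("enemy_defeated", lambda name, xp: print(f"[analytics] kill recorded: {name}"))

# CombatSystem publishes events rather than calling UI or analytics code directly.
bus.publish("enemy_defeated", name="grue", xp=50)

The payoff is decoupling: combat logic never needs to know who is listening.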


Testing Strategy

Test Pyramid

        /\
       /  \
      / E2E\          (Few)  - Full game scenarios
     /------\
    /        \
   /          \
  / INTEGRATION\      (Some) - Component interaction
 /--------------\
/    UNIT TESTS  \    (Many) - 85% coverage
------------------
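
One way to keep those layers separable in pytest is to mark the slower levels so CI can run them selectively; the marker and test names below are an illustration, not necessarily how Dork tags its tests:

import pytest

def clamp_hp(hp, max_hp):
    """Tiny helper, defined here only to keep the sketch self-contained."""
    return max(0, min(hp, max_hp))

def test_clamp_hp_never_negative():
    # Unit layer: pure logic, no game objects, runs in milliseconds.
    assert clamp_hp(-5, max_hp=100) == 0

@pytest.mark.integration
def test_combat_updates_player_hp():
    # Integration layer: would exercise CombatSystem and Player together.
    ...

# Register the marker once in pytest.ini:
#   [pytest]
#   markers = integration: slower tests that wire components together
# then run only the fast layer with:  pytest -m "not integration"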

CI Pipeline (Travis CI):

  • Automated test runs on every PR
  • pylint for code quality
  • SonarCloud gates (coverage >80%, no critical bugs)
  • Merge blocked if checks fail

Impact: Bugs caught pre-merge, not in production.


Key Lessons Learned

1. Test-Driven Development Works

Challenge: Complex combat math led to bugs.

Solution: Write tests first, then implementation. Result: 85% coverage, fewer production bugs.
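
A sketch of what "tests first" looked like in practice (the damage formula and function name are hypothetical, not Dork's real combat rules):

# Step 1: pin down the desired behaviour with tests that initially fail.
def test_damage_is_attack_minus_defense():
    assert calculate_damage(attack=10, defense=4) == 6

def test_damage_never_drops_below_one():
    # Even a heavily armoured target should take chip damage.
    assert calculate_damage(attack=3, defense=50) == 1

# Step 2: write just enough implementation to turn the tests green.
def calculate_damage(attack, defense):
    return max(1, attack - defense)

Edge cases (minimum damage, huge defense values) get pinned down before any implementation exists, which is exactly where combat math tends to go wrong.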

2. CI/CD Catches Problems Early

Example: A developer forgot to update requirements.txt → CI failed on the missing dependency → caught before code review → a 5-minute fix instead of a potential production issue.

3. Sprint Retrospectives Drive Improvement

Sprint 2: Standups running 30+ minutes

  • Fix: Timeboxed to 15 minutes, off-topic → Slack

Sprint 3: Code review bottleneck (1 person reviewing all PRs)

  • Fix: Round-robin reviewer assignment
  • Result: Velocity increased ~17% (18 → 21 points)

4. Documentation as Code

Sphinx + Read the Docs:

  • Docstrings auto-generate API docs
  • Automatically deployed on merge
  • Documentation never out of sync with code
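
Concretely, a docstring in this shape is all autodoc needs (assuming Google-style docstrings with sphinx.ext.napoleon; the method and exception below are hypothetical):

class InventoryFullError(Exception):
    """Raised when the player has no free inventory slots."""

class Player:
    def take(self, item_name):
        """Move an item from the current room into the player's inventory.

        Args:
            item_name (str): Item name as shown in the room description.

        Returns:
            bool: True if the item was picked up, False if it was not found.

        Raises:
            InventoryFullError: If the inventory has no free slots.
        """
        ...  # real logic omitted -- only the docstring matters for the docs build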

5. Scope Creep is Real

Original scope: Simple dungeon crawler

Sprint 3 additions: Crafting system, quest log, companion AI

Impact: Sprint 4 became a bug-fixing sprint instead of delivering new features.

Lesson: Protect sprint commitments. Push nice-to-haves to next sprint.


Git Workflow

Git Flow: main (production) → develop (integration) → feature/* (work)

Branch rules: Direct commits to main blocked • PRs require 1+ reviews • CI must pass

Code review focus: Quality, test coverage, design patterns, performance


Takeaways for Future Projects

Do:

  • ✅ Automate everything (tests, linting, deployment)
  • ✅ Code review all changes
  • ✅ Keep sprints focused (resist scope creep)
  • ✅ Document as you go
  • ✅ Run retrospectives to drive continuous improvement

Don’t:

  • ❌ Skip tests (“we’ll add them later” = never)
  • ❌ Merge without CI passing
  • ❌ Let one person become a bottleneck
  • ❌ Commit directly to main
  • ❌ Ignore technical debt (compounds over time)

Software engineering isn’t just writing code; it’s process, collaboration, and continuous improvement. Dork taught us that good practices (testing, CI/CD, code review) aren’t overhead: they’re force multipliers that let teams move faster with fewer bugs.