STLC in Action — Testing a User Registration Feature

Theory without practice is quickly forgotten. In this lesson, you will apply the STLC to a realistic project scenario — a user registration feature for an e-commerce platform. You will define entry and exit criteria for each phase, identify the artefacts produced, and see how skipping phases creates real problems. This exercise mirrors the type of thinking your team lead will expect from you on day one.
Your team has been asked to test a new user registration feature. The developers are in the Implementation phase. Here is how the STLC unfolds for this feature, with concrete entry and exit criteria at each step.
# Practical STLC walkthrough for a user registration feature
project = "User Registration Feature — E-Commerce Platform"

stlc_walkthrough = {
    "Phase 1 — Requirements Analysis": {
        "entry": "User story US-042 is written and accepted by the product owner",
        "activities": [
            "Review acceptance criteria: valid email, password 8+ chars, unique username",
            "Clarify: what happens with duplicate emails? → PO says show error message",
            "Add to RTM: 4 functional requirements, 2 non-functional (performance, security)",
        ],
        "exit": "RTM covers all 6 requirements; PO has answered all clarification questions",
        "artefact": "RTM with 6 traced requirements",
    },
    "Phase 2 — Test Planning": {
        "entry": "RTM is approved; sprint timeline is confirmed (2 weeks)",
        "activities": [
            "Scope: functional tests (manual), security tests (SQL injection, XSS)",
            "Tools: browser DevTools, Postman for API-level registration tests",
            "Estimate: 3 person-days for test design, 2 for execution",
        ],
        "exit": "Test plan reviewed in sprint planning meeting",
        "artefact": "Lightweight test plan (1 page in the team wiki)",
    },
    "Phase 3 — Test Case Design": {
        "entry": "Test plan approved; UI mockups available",
        "activities": [
            "Write 15 test cases covering happy path, validation errors, edge cases",
            "Prepare test data: valid emails, duplicate emails, SQL injection strings",
            "Peer review: another tester reviews all 15 cases",
        ],
        "exit": "All 15 test cases reviewed and approved; test data prepared",
        "artefact": "15 test cases in test management tool + test data sheet",
    },
    "Phase 4 — Environment Setup": {
        "entry": "Test cases ready; staging server specs confirmed",
        "activities": [
            "Deploy latest build to staging environment",
            "Verify database is clean (no leftover test accounts)",
            "Run 3 smoke tests: page loads, form submits, confirmation email sends",
        ],
        "exit": "All 3 smoke tests pass on staging",
        "artefact": "Verified staging environment",
    },
    "Phase 5 — Test Execution": {
        "entry": "Environment verified; build deployed and smoke-tested",
        "activities": [
            "Execute 15 test cases; 13 pass, 2 fail",
            "File 2 defect reports: BUG-101 (duplicate email accepted), BUG-102 (no max length on name field)",
            "Developers fix BUG-101 (P1); re-test passes",
            "BUG-102 deferred to next sprint (P3)",
            "Run regression suite: 13/13 pass",
        ],
        "exit": "14/15 pass after the BUG-101 re-test (13 on first run); 1 deferred with stakeholder approval",
        "artefact": "Execution report + 2 defect tickets",
    },
    "Phase 6 — Test Closure": {
        "entry": "Execution complete; all critical defects resolved",
        "activities": [
            "Generate test summary: 93% pass rate, 1 deferred low-priority defect",
            "Lessons learned: duplicate-email validation was missing from the spec — PO will add it to the template",
            "Archive test cases and defect reports",
        ],
        "exit": "Test summary signed off; artefacts archived",
        "artefact": "Test summary report",
    },
}

for phase, details in stlc_walkthrough.items():
    print(f"\n{phase}")
    print(f"  Entry:    {details['entry']}")
    print(f"  Exit:     {details['exit']}")
    print(f"  Artefact: {details['artefact']}")
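The Phase 5 numbers are worth sanity-checking, since a re-tested fix changes the final pass rate. A minimal sketch of that arithmetic (the `summarize_execution` helper and its parameter names are illustrative, not part of any standard tool):

```python
# Sketch: derive the final pass rate from the Phase 5 counts.
# All field names here are hypothetical, mirroring the walkthrough above.

def summarize_execution(total, passed_first_run, fixed_and_retested, deferred):
    """Return a one-line summary and the final pass rate (rounded percent)."""
    final_passed = passed_first_run + fixed_and_retested
    pass_rate = round(final_passed / total * 100)
    summary = (
        f"{final_passed}/{total} passed after re-test "
        f"({passed_first_run} on first run), {deferred} deferred"
    )
    return summary, pass_rate

summary, rate = summarize_execution(
    total=15, passed_first_run=13, fixed_and_retested=1, deferred=1
)
print(summary)     # 14/15 passed after re-test (13 on first run), 1 deferred
print(f"{rate}%")  # 93%
```

This is why Phase 5 can end at "13/15 on first run" while the Phase 6 summary honestly reports 93%: the fixed-and-retested case counts toward the final result, and the deferred one is documented rather than hidden.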
Common Mistakes
Mistake 1 — Using vague exit criteria like “testing is done”
❌ Wrong: “Testing is complete — we tested everything.”
✅ Correct: “Testing is complete: 15/15 cases executed, 14 passed, 1 deferred (P3, stakeholder-approved), 0 open P1/P2 defects, regression suite 100% green.”
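Exit criteria this concrete can even be turned into an automated gate. A minimal sketch, assuming defects are tracked as simple dicts (the `priority`/`status` field names are hypothetical):

```python
# Sketch: a machine-checkable exit gate based on concrete criteria.
# The defect dicts and their "priority"/"status" fields are illustrative.

def exit_criteria_met(executed, total, defects):
    """True only when every case ran and no P1/P2 defect remains open."""
    all_executed = executed == total
    open_blockers = [
        d for d in defects
        if d["priority"] in ("P1", "P2") and d["status"] == "open"
    ]
    return all_executed and not open_blockers

defects = [
    {"id": "BUG-101", "priority": "P1", "status": "closed"},
    {"id": "BUG-102", "priority": "P3", "status": "deferred"},
]
print(exit_criteria_met(executed=15, total=15, defects=defects))  # True
```

The point is not the code itself but the discipline: "testing is done" cannot be evaluated by anyone, while "0 open P1/P2 defects and 15/15 executed" can be checked by a person or a script.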
Mistake 2 — Not archiving test artefacts during Test Closure
❌ Wrong: Finishing the sprint and immediately moving to the next feature without saving test cases, results, or defect reports.
✅ Correct: Archiving all artefacts in the team wiki or test management tool so that future regression runs, audits, and new team members during onboarding can reference the work that was done.
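Archiving does not have to be elaborate; even dumping the artefacts to a dated folder beats losing them. A minimal sketch using only the standard library (the folder layout and artefact names are one possible convention, not a prescribed one):

```python
# Sketch: archive sprint test artefacts as JSON under a dated folder.
import json
from datetime import date
from pathlib import Path

def archive_artefacts(base_dir, feature, artefacts):
    """Write each artefact to <base_dir>/<YYYY-MM-DD>-<feature>/<name>.json."""
    folder = Path(base_dir) / f"{date.today():%Y-%m-%d}-{feature}"
    folder.mkdir(parents=True, exist_ok=True)
    for name, content in artefacts.items():
        (folder / f"{name}.json").write_text(json.dumps(content, indent=2))
    return folder

folder = archive_artefacts(
    "test-archive",
    "user-registration",
    {
        "test_summary": {"pass_rate": "93%", "deferred": ["BUG-102"]},
        "defects": [{"id": "BUG-101", "status": "closed"}],
    },
)
print(f"Archived to {folder}")
```

In practice a test management tool usually handles this for you; the sketch just shows that "archived" means a known, dated, retrievable location rather than files left on one tester's laptop.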