Overview
This QA framework provides comprehensive manual testing coverage for Terra’s core functionality over a 3-day testing period. It’s designed for a QA tester with minimal Terra knowledge but general web application testing experience.

Document Structure
📋 QA_TEST_SCRIPTS.md (Main Document)
Purpose: Detailed step-by-step test scripts
Size: 40 test cases organized across 3 days
Use: Primary testing guide - follow step by step
Contains:
- Detailed test procedures with expected results
- Test data and prerequisites
- Bug report template
- Daily summary templates
- Final QA report format
🔍 QA_QUICK_REFERENCE.md (Companion Guide)
Purpose: Quick lookup and testing tips
Size: ~10 pages of checklists and references
Use: Keep open while testing for quick answers
Contains:
- 3-day plan at a glance
- Critical test checklist
- Feature coverage map
- Testing tips and best practices
- Bug severity guidelines
- Key URLs and commands
📊 QA_PROGRESS_TRACKER.md (Progress Tracking)
Purpose: Track daily progress and results
Size: Structured tracking tables
Use: Update throughout each day
Contains:
- Daily test completion tracking
- Bug summary tables
- Test coverage by feature
- Risk assessment
- Production readiness checklist
- Final recommendation template
📖 QA_README.md (This File)
Purpose: Getting started guide
Use: Read first to understand the framework

Getting Started
Step 1: Pre-Testing Setup (30 minutes)
- Review all documents:
- Skim through QA_TEST_SCRIPTS.md to understand scope
- Read QA_QUICK_REFERENCE.md fully
- Print or open QA_PROGRESS_TRACKER.md for tracking
- Get credentials:
- Admin account credentials
- Regular user account credentials
- Test email account access
- Optional: Airtable, Plaid sandbox accounts
- Prepare test data:
- Small PDF (< 5MB) for file uploads
- Large file (> 10MB) for limit testing
- Invalid files (.exe, .php) for security tests
- Test images (PNG, JPG)
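The test files above can be generated in a few lines. This is a minimal sketch: the folder name and file names are examples, the size limits come from this guide, and the zero-filled files are only good for size and extension checks (use a real PDF/image when a test needs the file to actually render).

```python
# Generate dummy test files of known sizes for upload testing.
# Names and limits are assumptions from this guide, not Terra-enforced values.
import os

def make_dummy_file(path: str, size_bytes: int) -> None:
    """Create a file of exactly size_bytes, filled with zero bytes."""
    with open(path, "wb") as f:
        f.write(b"\0" * size_bytes)

os.makedirs("terra-qa-testdata", exist_ok=True)
make_dummy_file("terra-qa-testdata/small.pdf", 1 * 1024 * 1024)   # ~1 MB, safely under 5 MB
make_dummy_file("terra-qa-testdata/large.bin", 12 * 1024 * 1024)  # 12 MB, over the 10 MB limit
make_dummy_file("terra-qa-testdata/evil.exe", 1024)               # disallowed extension (harmless contents)
make_dummy_file("terra-qa-testdata/evil.php", 1024)               # disallowed extension (harmless contents)
```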
- Set up environment:
- Access the active QA deployment URL
- Verify you can log in
- Open browser dev tools (F12)
- Have incognito/private window ready
- Set up tracking:
- Create folder for screenshots: terra-qa-screenshots/
- Open QA_PROGRESS_TRACKER.md for editing
- Fill in your name and start date
Step 2: Day 1 Testing (6-8 hours)
Focus: Core functionality - create, publish, submit workflow
- Open QA_TEST_SCRIPTS.md to “DAY 1” section
- Open QA_PROGRESS_TRACKER.md to “Day 1” table
- Execute tests TS-001 through TS-010 in order
- For each test:
- Mark status in progress tracker (🔄)
- Follow steps exactly as written
- Record actual results in test script
- Take screenshots of any issues
- Update status: ✅ Pass, ❌ Fail, ⚠️ Blocked
- Document bugs using bug template
- At end of day:
- Complete Day 1 Summary in progress tracker
- List key findings and questions
- Flag critical issues to dev team immediately
Step 3: Day 2 Testing (6-8 hours)
Focus: Advanced features, settings, integrations
- Review Day 1 findings
- Execute tests TS-011 through TS-022
- Skip TS-016 (Airtable) and TS-017 (Plaid) if no accounts available
- Pay special attention to team management and portal tests
- Complete Day 2 Summary
Step 4: Day 3 Testing (6-8 hours)
Focus: Edge cases, security, error handling
- Execute tests TS-023 through TS-040
- Critical tests: TS-024 (Access Control), TS-027 (File Security), TS-030 (XSS)
- Test on mobile devices or browser responsive mode
- Complete Day 3 Summary
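For the XSS portion of Day 3, a small payload list saves time. These are standard, widely known probe strings, not Terra-specific; a safe application should render them as inert text. The `looks_escaped` helper is a rough illustrative check, not a substitute for inspecting the page manually.

```python
# Common XSS probe strings to paste into form fields during TS-030.
# Standard examples only; severity and handling are up to the tester.
XSS_PAYLOADS = [
    '<script>alert("xss")</script>',
    '<img src=x onerror=alert(1)>',
    '"><svg onload=alert(1)>',
    "javascript:alert(1)",
]

def looks_escaped(rendered_html: str) -> bool:
    """Rough check: a raw <script> tag should not survive in page output."""
    return "<script>" not in rendered_html.lower()

for payload in XSS_PAYLOADS:
    print(payload)  # copy-paste each into a text field, then inspect the output
```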
Step 5: Final Report (2 hours)
- Complete Bug Summary in progress tracker
- Fill out Test Coverage table
- Complete Risk Assessment
- Go through Production Readiness Checklist
- Write final recommendation
- Compile all screenshots
- Submit report to dev team
Testing Philosophy
What We’re Testing
- Functional correctness: Does each feature work as intended?
- User experience: Is the interface intuitive and helpful?
- Data integrity: Is data saved, displayed, and exported correctly?
- Security: Can users access only what they should?
- Error handling: Are errors handled gracefully with clear messages?
- Performance: Does the system respond in reasonable time?
What We’re NOT Testing (Out of Scope)
- Code quality or architecture
- Extensive load/stress testing
- Cross-browser compatibility (focus on Chrome)
- Accessibility (WCAG compliance)
- API testing (except through UI)
- Database performance optimization
Test Execution Guidelines
How to Execute Tests
- Read entire test first before starting
- Follow steps exactly as written
- Record actual results even if they match expected
- Take screenshots of every bug
- Don’t skip steps unless test is blocked
- Ask questions if something is unclear
- Document everything - better to over-document than under-document
When a Test Fails
- Verify it’s actually a bug:
- Re-run the test to confirm
- Check if you followed steps correctly
- Try in different browser if possible
- Document thoroughly:
- Use Bug Report Template
- Include screenshot/video
- Record exact steps to reproduce
- Note which submission/form ID affected
- Assign severity (Critical/High/Medium/Low)
- Determine if blocking:
- If test is blocked, mark as ⚠️ Blocked
- Document blocker reason
- Move to next test
- Flag blocker to dev team
- Continue testing:
- Don’t wait for fixes during testing period
- Document and move on
- Re-test later if fix is deployed
Priority Rules
- Critical tests first (marked as Critical in scripts)
- Test in order (tests may depend on earlier tests)
- Don’t skip prerequisites (e.g., need published form before testing submissions)
- Security tests cannot be skipped (TS-024, TS-027, TS-030)
Bug Reporting Best Practices
Good Bug Report Example
- References the test script (e.g., TS-004)
- Clear, specific title and description
- Numbered steps to reproduce
- Expected vs actual result stated
- Screenshot or video attached
- Severity assigned
- Reproducible by the dev team
Poor Bug Report Example
- No test script reference
- Vague description
- No steps to reproduce
- No expected vs actual
- No screenshot
- No severity
- Can’t be reproduced by dev team
Time Management
Recommended Schedule
Day 1: Core Functionality (10 tests)
- 9:00 AM - Setup and TS-001 to TS-003 (2 hours)
- 11:00 AM - TS-004 to TS-007 (2 hours)
- 1:00 PM - Lunch break
- 2:00 PM - TS-008 to TS-010 (2 hours)
- 4:00 PM - Bug documentation and Day 1 summary (1 hour)
Day 2: Advanced Features (12 tests)
- 9:00 AM - Review Day 1, TS-011 to TS-014 (2.5 hours)
- 11:30 AM - TS-015 to TS-018 (2 hours)
- 1:30 PM - Lunch break
- 2:30 PM - TS-019 to TS-022 (2.5 hours)
- 5:00 PM - Bug documentation and Day 2 summary (1 hour)
Day 3: Edge Cases & Security (18 tests)
- 9:00 AM - Review Day 2, TS-023 to TS-028 (2.5 hours)
- 11:30 AM - TS-029 to TS-034 (2 hours)
- 1:30 PM - Lunch break
- 2:30 PM - TS-035 to TS-040 (2 hours)
- 4:30 PM - Bug summary and final report (2 hours)
If You Fall Behind
Priority order if time is limited:
- Critical path (MUST test):
- TS-001: Authentication
- TS-002: Create/Publish
- TS-003: Submit Form
- TS-004: View Submissions
- TS-024: Access Control
- TS-027: File Security
- TS-030: XSS Prevention
- High priority:
- TS-005: Form Status
- TS-006: Field Types
- TS-007: Validation
- TS-010: Notifications
- TS-013: Team Management
- TS-019: Portal
- TS-022: Mobile
- Medium priority:
- All other tests
- Can skip if needed:
- TS-016: Airtable (if no account)
- TS-017: Plaid (if not configured)
- TS-033: Multi-language
- TS-037: Vault
Common Issues & Solutions
“I can’t log in”
- Verify credentials are correct
- Check if on correct environment (staging vs production)
- Clear browser cache and cookies
- Try incognito/private window
- Contact dev team for credential reset
“Form won’t publish”
- Check for validation errors in form builder
- Verify all required settings configured
- Check browser console for errors
- Try creating a simpler form first
“Email notifications not arriving”
- Check spam/junk folder
- Verify email configuration in settings
- Test with different email address
- Allow 1-2 minutes for delivery
- Check notification history in admin
“Test is blocked by previous failure”
- Document blocker in progress tracker
- Skip to next independent test
- Return to blocked tests if fixes deployed
- Don’t let one blocker stop all testing
“I found a bug but not sure if it’s critical”
- Use severity guidelines in Quick Reference
- Ask: Does this block core workflow?
- Ask: Does this expose security risk?
- When in doubt, mark as High and flag to team
Success Metrics
Minimum Acceptable
- 30+ tests completed (75% coverage)
- All Critical tests executed
- All security tests executed
- Pass rate > 70%
- All bugs documented with repro steps
Target
- 35+ tests completed (87% coverage)
- Pass rate > 85%
- No Critical bugs
- < 3 High bugs
- All bugs have screenshots
Excellent
- 40 tests completed (100% coverage)
- Pass rate > 95%
- No Critical or High bugs
- Comprehensive final report
- Recommendations for improvements
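The tiers above can be checked directly from your tracker counts. A minimal sketch (the example numbers are placeholders, not real results):

```python
# Compute coverage and pass rate from tracker counts to compare
# against the Minimum / Target / Excellent tiers above.
TOTAL_SCRIPTS = 40  # total test scripts in this framework

def qa_metrics(executed: int, passed: int) -> tuple:
    """Return (coverage %, pass rate %) for the given counts."""
    coverage = executed / TOTAL_SCRIPTS * 100
    pass_rate = passed / executed * 100 if executed else 0.0
    return coverage, pass_rate

# Example: 35 tests executed, 31 passed -> meets the "Target" tier
coverage, pass_rate = qa_metrics(executed=35, passed=31)
print(f"Coverage: {coverage:.1f}%  Pass rate: {pass_rate:.1f}%")
```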
After Testing
Deliverables
- Completed QA_PROGRESS_TRACKER.md
- All tables filled out
- Bug summary complete
- Final recommendation provided
- Bug Reports
- All bugs documented using template
- Screenshots attached
- Severity assigned
- Screenshots Folder
- Organized by bug ID
- All critical bugs have visual evidence
- Final Report (optional - can be part of tracker)
- Executive summary
- Overall metrics
- Production readiness assessment
- Top priority fixes
- Testing gaps
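Keeping the screenshots folder organized by bug ID is easy to script. A sketch, assuming the terra-qa-screenshots/ folder from setup; the bug ID and file name are hypothetical examples:

```python
# File each screenshot into a per-bug subfolder, matching the
# deliverable "organized by bug ID" above.
import os
import shutil

def file_screenshot(src: str, bug_id: str, root: str = "terra-qa-screenshots") -> str:
    """Copy a screenshot into root/<bug_id>/ and return its new path."""
    dest_dir = os.path.join(root, bug_id)
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, os.path.basename(src))
    shutil.copy2(src, dest)
    return dest

# Demo with a throwaway file and a hypothetical bug ID
with open("login-error.png", "wb") as f:
    f.write(b"\x89PNG")
path = file_screenshot("login-error.png", "BUG-001")
```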
Handoff to Dev Team
- Share all documents
- Walkthrough critical bugs
- Demo blockers if possible
- Answer dev team questions
- Offer to retest fixes
Questions & Support
During Testing
If you have questions:
- Check QA_QUICK_REFERENCE.md first
- Check this README
- Document question and continue testing
- Flag to dev team in daily summary
If you find a critical bug:
- Document immediately
- Flag to dev team right away (don’t wait)
- Mark as Critical severity
- Continue with other tests
If the test environment goes down:
- Document in progress tracker
- Notify dev team
- Work on test planning or documentation while waiting
- Make up time when environment restored
Final Checklist
Before submitting your QA report:
- All test results recorded
- All bugs documented with template
- All screenshots saved and organized
- Bug severity assigned for all bugs
- Daily summaries completed
- Bug summary table filled out
- Test coverage calculated
- Production readiness checklist completed
- Final recommendation written
- Signature added to tracker
- All documents spell-checked
- Ready to present findings
Appendix: Glossary
Admin: User with elevated permissions to create/manage forms
Applicant: Regular user who submits forms via portal
Draft: Unpublished form or incomplete submission
Portal: Authenticated area for applicants (/portal)
Share Link: Trackable URL for form distribution
Slug: URL-friendly identifier (e.g., “my-form” in /f/my-form)
Submission: Completed form response
Vault: Applicant’s reusable profile information
Workspace/Folder: Organization unit for grouping forms

Form States:
- Draft: Not published, not accessible to public
- Published: Live and accepting submissions
- Closed: Published but not accepting new submissions
- Archived: Removed from active list but not deleted
User Roles:
- Super Admin: Full system access
- Admin: Can create and manage forms
- User/Applicant: Can submit forms and view their applications
- Team Member (Editor): Can edit specific forms
- Team Member (Viewer): Read-only access to forms
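When testing share links, a quick slug sanity check can catch malformed URLs. This is a sketch only: the pattern (lowercase letters and digits separated by single hyphens) is an assumption inferred from the /f/my-form example, not a documented Terra constraint.

```python
# Assumed slug rule: lowercase letters/digits, single hyphens between runs.
import re

SLUG_RE = re.compile(r"[a-z0-9]+(?:-[a-z0-9]+)*")

def is_valid_slug(slug: str) -> bool:
    """True if the slug matches the assumed lowercase-hyphen pattern."""
    return SLUG_RE.fullmatch(slug) is not None

print(is_valid_slug("my-form"))   # expected: True
print(is_valid_slug("My Form!"))  # expected: False
```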
Document Version
- Version: 1.0
- Created: 2024-01-27
- Framework based on: TestRail best practices
- Customized for: Terra platform
- Testing period: 3 days
- Test count: 40 test scripts
Ready to begin?
- ✅ Read this README completely
- ✅ Skim all three documents
- ✅ Complete pre-testing setup
- ✅ Start with Day 1, TS-001