Overview

This QA framework provides comprehensive manual testing coverage for Terra’s core functionality over a 3-day testing period. It’s designed to be used by a QA tester with minimal Terra knowledge but general web application testing experience.

Document Structure

📋 QA_TEST_SCRIPTS.md (Main Document)

Purpose: Detailed step-by-step test scripts
Size: 40 test cases organized across 3 days
Use: Primary testing guide - follow step-by-step
Contains:
  • Detailed test procedures with expected results
  • Test data and prerequisites
  • Bug report template
  • Daily summary templates
  • Final QA report format

🔍 QA_QUICK_REFERENCE.md (Companion Guide)

Purpose: Quick lookup and testing tips
Size: ~10 pages of checklists and references
Use: Keep open while testing for quick answers
Contains:
  • 3-day plan at a glance
  • Critical test checklist
  • Feature coverage map
  • Testing tips and best practices
  • Bug severity guidelines
  • Key URLs and commands

📊 QA_PROGRESS_TRACKER.md (Progress Tracking)

Purpose: Track daily progress and results
Size: Structured tracking tables
Use: Update throughout each day
Contains:
  • Daily test completion tracking
  • Bug summary tables
  • Test coverage by feature
  • Risk assessment
  • Production readiness checklist
  • Final recommendation template

📖 QA_README.md (This File)

Purpose: Getting started guide
Use: Read first to understand the framework

Getting Started

Step 1: Pre-Testing Setup (30 minutes)

  1. Review all documents:
    • Skim through QA_TEST_SCRIPTS.md to understand scope
    • Read QA_QUICK_REFERENCE.md fully
    • Print or open QA_PROGRESS_TRACKER.md for tracking
  2. Get credentials:
    • Admin account credentials
    • Regular user account credentials
    • Test email account access
    • Optional: Airtable, Plaid sandbox accounts
  3. Prepare test data:
    • Small PDF (< 5MB) for file uploads
    • Large file (> 10MB) for limit testing
    • Invalid files (.exe, .php) for security tests
    • Test images (PNG, JPG)
  4. Set up environment:
    • Access the active QA deployment URL
    • Verify you can log in
    • Open browser dev tools (F12)
    • Have incognito/private window ready
  5. Set up tracking:
    • Create folder for screenshots: terra-qa-screenshots/
    • Open QA_PROGRESS_TRACKER.md for editing
    • Fill in your name and start date
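If you need to generate the test files from step 3 yourself, a throwaway script like the following works. The filenames, sizes, and the terra-qa-testdata/ folder are arbitrary choices for illustration, not requirements of the test scripts; if Terra validates PDF contents rather than just the extension, substitute a real PDF.

```python
import os

os.makedirs("terra-qa-testdata", exist_ok=True)

# Small file under 5 MB with a .pdf extension (~1 MB).
with open("terra-qa-testdata/small-test.pdf", "wb") as f:
    f.write(b"%PDF-1.4\n" + os.urandom(1024 * 1024))

# Large file over 10 MB for upload-limit testing (12 MB).
with open("terra-qa-testdata/large-test.bin", "wb") as f:
    f.write(os.urandom(12 * 1024 * 1024))

# Invalid file types for the security tests; content is irrelevant,
# only the extension matters for a rejection check.
with open("terra-qa-testdata/fake.exe", "wb") as f:
    f.write(b"MZ fake binary")
with open("terra-qa-testdata/fake.php", "w") as f:
    f.write("<?php echo 1; ?>")
```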

Step 2: Day 1 Testing (6-8 hours)

Focus: Core functionality - create, publish, submit workflow
  1. Open QA_TEST_SCRIPTS.md to “DAY 1” section
  2. Open QA_PROGRESS_TRACKER.md to “Day 1” table
  3. Execute tests TS-001 through TS-010 in order
  4. For each test:
    • Mark status in progress tracker (🔄)
    • Follow steps exactly as written
    • Record actual results in test script
    • Take screenshots of any issues
    • Update status: ✅ Pass, ❌ Fail, ⚠️ Blocked
    • Document bugs using bug template
  5. At end of day:
    • Complete Day 1 Summary in progress tracker
    • List key findings and questions
    • Flag critical issues to dev team immediately

Step 3: Day 2 Testing (6-8 hours)

Focus: Advanced features, settings, integrations
  1. Review Day 1 findings
  2. Execute tests TS-011 through TS-022
  3. Skip TS-016 (Airtable) and TS-017 (Plaid) if no accounts available
  4. Pay special attention to team management and portal tests
  5. Complete Day 2 Summary

Step 4: Day 3 Testing (6-8 hours)

Focus: Edge cases, security, error handling
  1. Execute tests TS-023 through TS-040
  2. Critical tests: TS-024 (Access Control), TS-027 (File Security), TS-030 (XSS)
  3. Test on mobile devices or browser responsive mode
  4. Complete Day 3 Summary
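For TS-030, generic XSS probe strings such as the ones below are commonly pasted into text fields; they are standard payloads, not Terra-specific. A correctly escaping application stores and re-displays each one as inert literal text; an alert dialog or any script execution on render is a Critical finding.

```python
# Generic XSS probe strings for manual testing (not Terra-specific).
# Paste each into a text field, submit, then view the stored value.
XSS_PAYLOADS = [
    "<script>alert(1)</script>",
    '"><img src=x onerror=alert(1)>',
    "'><svg onload=alert(1)>",
    "javascript:alert(1)",
]

for payload in XSS_PAYLOADS:
    print(payload)
```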

Step 5: Final Report (2 hours)

  1. Complete Bug Summary in progress tracker
  2. Fill out Test Coverage table
  3. Complete Risk Assessment
  4. Go through Production Readiness Checklist
  5. Write final recommendation
  6. Compile all screenshots
  7. Submit report to dev team

Testing Philosophy

What We’re Testing

  • Functional correctness: Does each feature work as intended?
  • User experience: Is the interface intuitive and helpful?
  • Data integrity: Is data saved, displayed, and exported correctly?
  • Security: Can users access only what they should?
  • Error handling: Are errors handled gracefully with clear messages?
  • Performance: Does the system respond in reasonable time?

What We’re NOT Testing (Out of Scope)

  • Code quality or architecture
  • Extensive load/stress testing
  • Cross-browser compatibility (focus on Chrome)
  • Accessibility (WCAG compliance)
  • API testing (except through UI)
  • Database performance optimization

Test Execution Guidelines

How to Execute Tests

  1. Read entire test first before starting
  2. Follow steps exactly as written
  3. Record actual results even if they match expected
  4. Take screenshots of every bug
  5. Don’t skip steps unless test is blocked
  6. Ask questions if something is unclear
  7. Document everything - better to over-document than under-document

When a Test Fails

  1. Verify it’s actually a bug:
    • Re-run the test to confirm
    • Check if you followed steps correctly
    • Try in different browser if possible
  2. Document thoroughly:
    • Use Bug Report Template
    • Include screenshot/video
    • Record exact steps to reproduce
    • Note which submission/form ID is affected
    • Assign severity (Critical/High/Medium/Low)
  3. Determine if blocking:
    • If test is blocked, mark as ⚠️ Blocked
    • Document blocker reason
    • Move to next test
    • Flag blocker to dev team
  4. Continue testing:
    • Don’t wait for fixes during testing period
    • Document and move on
    • Re-test later if fix is deployed

Priority Rules

  1. Critical tests first (marked as Critical in scripts)
  2. Test in order (tests may depend on earlier tests)
  3. Don’t skip prerequisites (e.g., need published form before testing submissions)
  4. Security tests cannot be skipped (TS-024, TS-027, TS-030)

Bug Reporting Best Practices

Good Bug Report Example

BUG-003: Form submissions fail when file upload field is empty

Test Script: TS-006 (Multiple Field Types)
Date: 2024-01-15
Severity: High
Browser: Chrome 120.0 on macOS Sonoma

Steps to Reproduce:
1. Visit form /f/qa-field-types
2. Fill all required fields
3. Leave file upload field empty (no file selected)
4. Click Submit button

Expected Result:
- Form should submit successfully (file upload is not required)
- Success message displayed
- Submission saved to database

Actual Result:
- Error message: "File upload failed"
- Form does not submit
- User stuck on form page

Screenshot: bug-003-file-upload-error.png
Form ID: qa-field-types (slug)
Form Database ID: abc-123-def-456

Additional Notes:
- Error only occurs when field is empty
- Uploading a file works fine
- Issue reproducible 100% of the time
- Console shows: "TypeError: Cannot read property 'size' of null"

Poor Bug Report Example

BUG-005: Form broken

Something is wrong with the form. It doesn't work.

Why is this report poor?
  • No test script reference
  • Vague description
  • No steps to reproduce
  • No expected vs actual
  • No screenshot
  • No severity
  • Can’t be reproduced by dev team

Time Management

Day 1: Core Functionality (10 tests)
  • 9:00 AM - Setup and TS-001 to TS-003 (2 hours)
  • 11:00 AM - TS-004 to TS-007 (2 hours)
  • 1:00 PM - Lunch break
  • 2:00 PM - TS-008 to TS-010 (2 hours)
  • 4:00 PM - Bug documentation and Day 1 summary (1 hour)
Day 2: Advanced Features (12 tests)
  • 9:00 AM - Review Day 1, TS-011 to TS-014 (2.5 hours)
  • 11:30 AM - TS-015 to TS-018 (2 hours)
  • 1:30 PM - Lunch break
  • 2:30 PM - TS-019 to TS-022 (2.5 hours)
  • 5:00 PM - Bug documentation and Day 2 summary (1 hour)
Day 3: Edge Cases & Security (18 tests)
  • 9:00 AM - Review Day 2, TS-023 to TS-028 (2.5 hours)
  • 11:30 AM - TS-029 to TS-034 (2 hours)
  • 1:30 PM - Lunch break
  • 2:30 PM - TS-035 to TS-040 (2 hours)
  • 4:30 PM - Bug summary and final report (2 hours)

If You Fall Behind

Priority order if time is limited:
  1. Critical path (MUST test):
    • TS-001: Authentication
    • TS-002: Create/Publish
    • TS-003: Submit Form
    • TS-004: View Submissions
    • TS-024: Access Control
    • TS-027: File Security
    • TS-030: XSS Prevention
  2. High priority:
    • TS-005: Form Status
    • TS-006: Field Types
    • TS-007: Validation
    • TS-010: Notifications
    • TS-013: Team Management
    • TS-019: Portal
    • TS-022: Mobile
  3. Medium priority:
    • All other tests
  4. Can skip if needed:
    • TS-016: Airtable (if no account)
    • TS-017: Plaid (if not configured)
    • TS-033: Multi-language
    • TS-037: Vault

Common Issues & Solutions

“I can’t log in”

  • Verify credentials are correct
  • Check if on correct environment (staging vs production)
  • Clear browser cache and cookies
  • Try incognito/private window
  • Contact dev team for credential reset

“Form won’t publish”

  • Check for validation errors in form builder
  • Verify all required settings configured
  • Check browser console for errors
  • Try creating a simpler form first

“Email notifications aren’t arriving”

  • Check spam/junk folder
  • Verify email configuration in settings
  • Test with different email address
  • Allow 1-2 minutes for delivery
  • Check notification history in admin

“Test is blocked by a previous failure”

  • Document blocker in progress tracker
  • Skip to next independent test
  • Return to blocked tests if fixes deployed
  • Don’t let one blocker stop all testing

“I found a bug but I’m not sure if it’s critical”

  • Use severity guidelines in Quick Reference
  • Ask: Does this block core workflow?
  • Ask: Does this expose security risk?
  • When in doubt, mark as High and flag to team
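The questions above can be folded into a rough first-pass triage heuristic. This is only a sketch of that logic (the function and its parameters are hypothetical); the severity guidelines in QA_QUICK_REFERENCE.md remain authoritative.

```python
def suggest_severity(security_risk: bool, blocks_core_workflow: bool,
                     in_doubt: bool = False) -> str:
    """First-pass severity guess; downgrade cosmetic issues to Low manually."""
    if security_risk:
        return "Critical"  # any security exposure outranks everything else
    if blocks_core_workflow or in_doubt:
        return "High"      # blocked workflow, or "when in doubt, mark as High"
    return "Medium"
```

Usage: `suggest_severity(security_risk=False, blocks_core_workflow=True)` yields "High"; flag the bug to the team either way.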

Success Metrics

Minimum Acceptable

  • 30+ tests completed (75% coverage)
  • All Critical tests executed
  • All security tests executed
  • Pass rate > 70%
  • All bugs documented with repro steps

Target

  • 35+ tests completed (87% coverage)
  • Pass rate > 85%
  • No Critical bugs
  • < 3 High bugs
  • All bugs have screenshots

Excellent

  • 40 tests completed (100% coverage)
  • Pass rate > 95%
  • No Critical or High bugs
  • Comprehensive final report
  • Recommendations for improvements
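The coverage and pass-rate figures above are simple ratios over the 40-script suite (coverage = executed / 40, pass rate = passed / executed); a quick way to compute them:

```python
def qa_metrics(executed: int, passed: int, total: int = 40) -> dict:
    """Coverage and pass-rate percentages over the 40-script suite."""
    coverage = 100.0 * executed / total
    pass_rate = 100.0 * passed / executed if executed else 0.0
    return {"coverage_pct": round(coverage, 1), "pass_rate_pct": round(pass_rate, 1)}

# e.g. 35 of 40 scripts executed, 31 of those passed
print(qa_metrics(executed=35, passed=31))
# -> {'coverage_pct': 87.5, 'pass_rate_pct': 88.6}
```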

After Testing

Deliverables

  1. Completed QA_PROGRESS_TRACKER.md
    • All tables filled out
    • Bug summary complete
    • Final recommendation provided
  2. Bug Reports
    • All bugs documented using template
    • Screenshots attached
    • Severity assigned
  3. Screenshots Folder
    • Organized by bug ID
    • All critical bugs have visual evidence
  4. Final Report (optional - can be part of tracker)
    • Executive summary
    • Overall metrics
    • Production readiness assessment
    • Top priority fixes
    • Testing gaps

Handoff to Dev Team

  1. Share all documents
  2. Walk through critical bugs
  3. Demo blockers if possible
  4. Answer dev team questions
  5. Offer to retest fixes

Questions & Support

During Testing

If you have questions:
  1. Check QA_QUICK_REFERENCE.md first
  2. Check this README
  3. Document question and continue testing
  4. Flag to dev team in daily summary
If you find a critical bug:
  1. Document immediately
  2. Flag to dev team right away (don’t wait)
  3. Mark as Critical severity
  4. Continue with other tests
If environment is down:
  1. Document in progress tracker
  2. Notify dev team
  3. Work on test planning or documentation while waiting
  4. Make up time once the environment is restored

Final Checklist

Before submitting your QA report:
  • All test results recorded
  • All bugs documented with template
  • All screenshots saved and organized
  • Bug severity assigned for all bugs
  • Daily summaries completed
  • Bug summary table filled out
  • Test coverage calculated
  • Production readiness checklist completed
  • Final recommendation written
  • Signature added to tracker
  • All documents spell-checked
  • Ready to present findings

Appendix: Glossary

Admin: User with elevated permissions to create/manage forms
Applicant: Regular user who submits forms via portal
Draft: Unpublished form or incomplete submission
Portal: Authenticated area for applicants (/portal)
Share Link: Trackable URL for form distribution
Slug: URL-friendly identifier (e.g., “my-form” in /f/my-form)
Submission: Completed form response
Vault: Applicant’s reusable profile information
Workspace/Folder: Organization unit for grouping forms
Form States:
  • Draft: Not published, not accessible to public
  • Published: Live and accepting submissions
  • Closed: Published but not accepting new submissions
  • Archived: Removed from active list but not deleted
User Roles:
  • Super Admin: Full system access
  • Admin: Can create and manage forms
  • User/Applicant: Can submit forms and view their applications
  • Team Member (Editor): Can edit specific forms
  • Team Member (Viewer): Read-only access to forms

Document Version

  • Version: 1.0
  • Created: 2024-01-27
  • Framework based on: TestRail best practices
  • Customized for: Terra platform
  • Testing period: 3 days
  • Test count: 40 test scripts

Ready to begin?
  1. ✅ Read this README completely
  2. ✅ Skim all three documents
  3. ✅ Complete pre-testing setup
  4. ✅ Start with Day 1, TS-001
Good luck with your testing! 🚀 For questions about this framework or Terra functionality, contact the development team.