Testing Fundamentals & QA Concepts
What is software testing and why is it needed?
Software testing is the process of evaluating a product to uncover defects and assess risk, confirming that it meets requirements and is fit for use.
Example: “In our billing app, testing caught a rounding bug that would’ve mischarged 2% of users.”
Difference between QA and QC?
QA (Quality Assurance) focuses on process prevention, while QC (Quality Control) focuses on product defect detection.
Example: “QA introduced code review checklist; QC found a UI overflow on smaller screens.”
Verification vs. Validation?
Verification ensures we are building the product right, while validation ensures we are building the right product.
Example: “We verified API contract with schema checks; validated with UAT against real workflows.”
Static vs. Dynamic Testing?
Static testing uses reviews and analysis without execution, while dynamic testing involves executing code or systems.
Example: “Static: design review flagged missing error states; Dynamic: tests reproduced 500 errors.”
Priority vs. Severity?
Severity measures the technical impact of a defect, while priority defines how urgently it should be fixed.
Example: “Logo misalignment: low severity, medium priority before launch.”
Test Levels & Types
Test levels (unit, integration, system, UAT)?
Unit, integration, system, and UAT differ in scope and ownership; the earlier the level, the sooner and more cheaply defects are caught.
Example: “Unit tests caught null checks; integration found OAuth token expiry edge case.”
Test types (functional, non-functional)?
Functional tests check features and requirements, while non-functional tests measure performance, security, usability, etc.
Example: “Functional: refund rules; Non-functional: search response time <500 ms.”
Shift-left vs. shift-right testing?
Shift-left means testing earlier in SDLC, while shift-right involves production monitoring and experiments.
Example: “Shift-left with contract tests; shift-right via synthetic checks in prod.”
SDLC & STLC
Where does testing fit in SDLC models (Agile, Waterfall)?
In Agile, testing is continuous, while in Waterfall it is phase-gated.
Example: “In Scrum, we test each story within the sprint.”
Key activities in STLC?
Steps include: Planning → Analysis → Design → Execution → Defect reporting → Closure.
Example: “We created entry/exit criteria before executing regression.”
Entry and exit criteria?
Entry criteria define the preconditions for starting a test phase; exit criteria define when it can be declared complete.
Example: “Exit: 0 critical/major defects, 95% regression pass.”
Traceability matrix (RTM)?
RTM maps requirements to test cases and defects to ensure coverage.
Example: “RTM showed 2 untested acceptance criteria pre-UAT.”
Definition of Ready/Done for a story?
DoR = start readiness; DoD = completion quality.
Example: “Done included automated tests + updated docs.”
How do you handle changing requirements mid-sprint?
Re-estimate, re-prioritize, and adjust tests/scope accordingly.
Example: “We split the story and limited scope to MVP.”
Test Planning & Strategy
What goes into a test plan?
Scope, approach, resources, environment, risks, and schedule.
Example: “Black-box for UI; contract tests for services; JMeter for load.”
How do you define test scope?
Based on risk, impact, usage, and constraints.
Example: “Focused on checkout path covering 80% traffic.”
Risk-based testing approach?
Prioritize by likelihood × impact.
Example: “Payment gateway > wishlist due to revenue risk.”
Test estimation techniques?
Use WBS, 3-point estimation, or historical velocity.
Example: “Used 3-point for regression: 2–3–5 days → 3.3 days.”
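A quick sketch of the arithmetic behind that figure: the 3.3-day number is the simple three-point average, with the PERT-weighted variant shown for comparison.

    // Three-point estimation: optimistic (O), most likely (M), pessimistic (P).
    function simpleAverage(o: number, m: number, p: number): number {
      return (o + m + p) / 3;       // triangular average
    }
    function pertEstimate(o: number, m: number, p: number): number {
      return (o + 4 * m + p) / 6;   // PERT/beta-weighted average
    }
    console.log(simpleAverage(2, 3, 5).toFixed(1)); // "3.3" days
    console.log(pertEstimate(2, 3, 5).toFixed(1));  // "3.2" days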
Choosing manual vs. automation?
Automate stable, repeatable, high-value flows; keep others manual.
Example: “Automated smoke + regression; kept ad-hoc UX manual.”
Test environment strategy?
Maintain prod parity, seeded data, and isolation.
Example: “Dedicated staging with masked prod data.”
Configuration and compatibility matrix?
Lists platforms, browsers, devices to ensure coverage.
Example: “Android 12–14, iOS 16–18, Chrome/Firefox/Safari latest-1.”
Test data strategy?
Balance synthetic vs. masked prod data with edge cases.
Example: “Used Faker to generate edge names with emojis.”
Test Design Techniques
Equivalence partitioning?
Divide inputs into valid/invalid classes.
Example: “Age field: <0, 0–120, >120.”
Boundary value analysis?
Test edges and just inside/outside values.
Example: “Password length 8–64: test 7,8,64,65.”
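A minimal sketch of those boundary cases as a data-driven test; the validatePassword helper is a hypothetical stand-in for the real validator.

    import { test, expect } from '@playwright/test';

    // Hypothetical validator for illustration: accepts lengths 8–64 inclusive.
    const validatePassword = (pw: string) => pw.length >= 8 && pw.length <= 64;

    // Boundary values: just outside, on, and just inside each edge.
    const cases = [
      { length: 7, valid: false },
      { length: 8, valid: true },
      { length: 64, valid: true },
      { length: 65, valid: false },
    ];

    for (const { length, valid } of cases) {
      test(`password of length ${length} is ${valid ? 'accepted' : 'rejected'}`, () => {
        expect(validatePassword('x'.repeat(length))).toBe(valid);
      });
    }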
Decision tables?
Map conditions vs. outcomes.
Example: “Shipping fee rules across location × cart value.”
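A minimal decision-table sketch; the shipping rules (location × cart value → fee) are assumptions used purely for illustration.

    // Decision table: each row is one combination of conditions plus the expected outcome.
    type Row = { location: 'domestic' | 'international'; cartValue: number; expectedFee: number };

    const table: Row[] = [
      { location: 'domestic',      cartValue: 100,  expectedFee: 50 },
      { location: 'domestic',      cartValue: 1000, expectedFee: 0   }, // free over threshold
      { location: 'international', cartValue: 100,  expectedFee: 500 },
      { location: 'international', cartValue: 1000, expectedFee: 500 },
    ];

    // Hypothetical implementation under test.
    const shippingFee = (loc: Row['location'], value: number) =>
      loc === 'international' ? 500 : value >= 500 ? 0 : 50;

    for (const row of table) {
      const actual = shippingFee(row.location, row.cartValue);
      if (actual !== row.expectedFee) throw new Error(`Rule failed for ${JSON.stringify(row)}`);
    }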
State transition testing?
Validates state changes triggered by events.
Example: “Order: NEW→PAID→SHIPPED→DELIVERED.”
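A minimal sketch of the order lifecycle as an allowed-transition map; invalid event/state pairs deserve assertions as much as the happy path (states and rules are illustrative assumptions).

    type OrderState = 'NEW' | 'PAID' | 'SHIPPED' | 'DELIVERED';

    // Allowed transitions: anything not listed here must be rejected.
    const transitions: Record<OrderState, OrderState[]> = {
      NEW: ['PAID'],
      PAID: ['SHIPPED'],
      SHIPPED: ['DELIVERED'],
      DELIVERED: [],
    };

    function canTransition(from: OrderState, to: OrderState): boolean {
      return transitions[from].includes(to);
    }

    // Positive and negative transition checks.
    console.assert(canTransition('NEW', 'PAID') === true);
    console.assert(canTransition('NEW', 'SHIPPED') === false);    // skipping payment is invalid
    console.assert(canTransition('DELIVERED', 'PAID') === false); // no going backwards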
Use case vs. user story testing?
Use case = end-to-end flows; user story = small increments.
Example: “E2E from signup to purchase.”
Pairwise testing?
Minimizes test sets by covering parameter pairs.
Example: “OS×Browser×Locale minimized to 10 combos.”
Negative testing strategy?
Check invalid inputs, timeouts, and failures.
Example: “Drop network during payment confirmation.”
Exploratory testing charters?
Time-boxed, goal-oriented exploratory sessions.
Example: “60-min charter: stress coupon edge cases.”
Functional Testing
Smoke vs. Sanity Testing
Answer: Smoke testing checks basic build health to ensure the application is stable enough for deeper testing. Sanity testing focuses on verifying specific changes or bug fixes.
Example: Smoke: verified login works; Sanity: validated new OTP flow.
Regression Testing Strategy
Answer: A good strategy includes risk-based prioritization, automation coverage, and test tagging for efficient execution.
Example: Tagged @checkout tests run for every PR.
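A minimal Playwright sketch of tag-based selection; the test body and URL are placeholders.

    import { test, expect } from '@playwright/test';

    // Tag in the title so CI can pick a risk-based subset, e.g. `npx playwright test --grep @checkout`.
    test('applies coupon at checkout @checkout @regression', async ({ page }) => {
      await page.goto('https://staging.example.com/checkout'); // placeholder URL
      await page.getByLabel('Coupon code').fill('SAVE10');
      await page.getByRole('button', { name: 'Apply' }).click();
      await expect(page.getByText('Discount applied')).toBeVisible();
    });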
End-to-End Testing Scope
Answer: End-to-end testing validates real-world workflows across services to mimic user journeys.
Example: Cart → payment → invoice → email receipt flow was tested.
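A condensed Playwright sketch of that journey; the selectors, URL, and assertions are illustrative assumptions, and the email receipt would normally be verified against a test mailbox API rather than the UI.

    import { test, expect } from '@playwright/test';

    test('cart → payment → invoice journey', async ({ page }) => {
      await page.goto('https://staging.example.com');                   // placeholder environment
      await page.getByTestId('add-to-cart').first().click();
      await page.getByTestId('go-to-checkout').click();

      await page.getByLabel('Card number').fill('4111 1111 1111 1111'); // test card
      await page.getByRole('button', { name: 'Pay now' }).click();

      await expect(page.getByText('Payment successful')).toBeVisible();
      await expect(page.getByTestId('invoice-number')).toHaveText(/INV-\d+/);
      // Email receipt is typically checked via a test inbox API, not the browser.
    });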
Ad-Hoc vs. Structured Testing
Answer: Ad-hoc testing is exploratory and useful for discovering unexpected issues, while structured testing ensures systematic coverage.
Example: Found a copy bug during ad-hoc testing.
Acceptance Criteria to Test Cases Mapping
Answer: Each acceptance criterion (AC) should translate into multiple test cases, covering both happy and alternative paths.
Example: AC-3 resulted in 4 alternative-path test cases.
Non-Functional Testing
Performance Test Types (Load, Stress, Soak)
Answer: Load = normal expected traffic, Stress = beyond capacity, Soak = sustained long-duration performance.
Example: Soak test revealed a memory leak after 8 hours.
Key Performance Metrics
Answer: Response time, throughput, error rate, and percentiles (p90/p95).
Example: p95 checkout API latency < 300ms.
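A minimal sketch of how a latency percentile such as p95 is computed from raw samples, using the nearest-rank method.

    // Nearest-rank percentile: sort samples, take the value at ceil(p/100 * n) - 1.
    function percentile(samples: number[], p: number): number {
      const sorted = [...samples].sort((a, b) => a - b);
      const rank = Math.ceil((p / 100) * sorted.length) - 1;
      return sorted[Math.max(0, rank)];
    }

    const latenciesMs = [120, 180, 210, 250, 260, 280, 290, 310, 320, 900];
    console.log(percentile(latenciesMs, 95)); // 900 — one outlier dominates the tail
    console.log(percentile(latenciesMs, 50)); // 260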
Bottleneck Diagnosis
Answer: Use profiling, APM, and DB query optimization.
Example: Added index on orders table → p95 improved by 40%.
Basic Security Checks
Answer: Apply OWASP Top 10 principles: input validation, authentication, and authorization.
Example: Blocked XSS with proper output encoding.
SQL Injection Testing Approach
Answer: Inject crafted payloads against inputs and confirm they are rejected; parameterized queries are the primary prevention.
Example: ' OR 1=1 payload returned 403 and showed up in the WAF log.
Authentication/Authorization Testing
Answer: Validate role-based access, token expiry, and session controls.
Example: Viewer role blocked from admin endpoints (403).
CSRF and CORS Basics
Answer: CSRF mitigated via tokens and same-site cookies; CORS enforced with strict origin policies.
Example: SameSite=Lax cookies prevented cross-site POSTs.
Accessibility Testing Essentials
Answer: Follow WCAG guidelines: ARIA labels, contrast, keyboard navigation.
Example: Fixed low-contrast CTA button for AA compliance.
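A minimal sketch of automated WCAG scanning with @axe-core/playwright; manual keyboard and screen-reader checks still apply, and the URL is a placeholder.

    import { test, expect } from '@playwright/test';
    import AxeBuilder from '@axe-core/playwright';

    test('checkout page has no detectable WCAG A/AA violations', async ({ page }) => {
      await page.goto('https://staging.example.com/checkout'); // placeholder URL
      const results = await new AxeBuilder({ page })
        .withTags(['wcag2a', 'wcag2aa'])  // limit the scan to A/AA rules
        .analyze();
      expect(results.violations).toEqual([]);
    });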
Usability Testing Feedback Loops
Answer: Combine heuristics, user sessions, and analytics.
Example: Reduced checkout steps from 6 → 3 after testing.
Privacy & Compliance (GDPR/PII)
Answer: Enforce data minimization, masking, and consent flows.
Example: Masked PII in logs after compliance audit.
Defect Management
Lifecycle of a Defect
Answer: New → Assigned → In-Progress → Resolved → Verified → Closed.
Example: Bug reopened due to missing null check.
Good Bug Report Anatomy
Answer: Clear steps, expected vs. actual result, environment, evidence, severity, and priority.
Example: Attached HAR file + console trace for a 500 error.
Duplicate and “Can’t Reproduce” Handling
Answer: De-duplicate via search and enrich reports with repro data.
Example: Added dataset + video for race condition repro.
Root Cause Analysis (RCA)
Answer: Use 5 Whys, fishbone diagrams, and categorize prevention measures.
Example: Schema validation added to stop null payloads.
Defect Leakage Metric
Answer: Ratio of escaped defects to total defects.
Example: Post-release P1s dropped 60% after contract tests.
Triaging Process
Answer: Review by cross-functional teams, balance risk vs. effort.
Example: Deferred low-impact UI glitch to next sprint.
Agile, Scrum, DevOps & CI/CD
Tester’s Role in Scrum
Answer: Act as quality advocate, contribute automation, and participate in refinement.
Example: Wrote BDD tests during sprint grooming.
Definition of Done for Features
Answer: Includes passing tests, documentation, code review, and deploy readiness.
Example: CI pipeline green + 90% coverage on critical paths.
Working in CI/CD
Answer: Tests run per PR, gating merges, providing fast feedback.
Example: Jenkins blocked merge due to smoke test failure.
Feature Flags in Testing
Answer: Test both enabled and disabled states; ensure rollback.
Example: Flagged new search feature → tested rollback in seconds.
Blue/Green or Canary Releases
Answer: Rollout to small user groups, monitor KPIs, rollback if needed.
Example: Canary to 10% users; error spike → rollback.
Test Flakiness Control
Answer: Stabilize locators, manage waits, seed data properly.
Example: Replaced sleeps with network-idling waits.
Continuous Testing Strategy
Answer: Follow test pyramid: unit → API → UI with contract tests and monitoring.
Example: API suite ran in 2 minutes per PR.
Managing Technical Debt in Tests
Answer: Modularize, refactor, and track debt tickets.
Example: Split page objects into smaller components.
Quality Gates (Sonar, Coverage)
Answer: Block merges if thresholds aren’t met (security, reliability, coverage).
Example: Blocked PR with >0 vulnerabilities.
Production Monitoring & SLOs
Answer: Use synthetic tests, logs, and alerting based on SLOs.
Example: Synthetic login every 5 min; alert if p95 > 400ms.
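A minimal sketch of a synthetic check that could run on a schedule; the URL, credential source, and whole-flow latency budget are assumptions.

    import { test, expect } from '@playwright/test';

    // Scheduled synthetic probe: log in against production and assert success plus a latency budget.
    test('synthetic login stays within its latency budget', async ({ page }) => {
      const start = Date.now();
      await page.goto('https://www.example.com/login');        // placeholder URL
      await page.getByLabel('Email').fill(process.env.SYNTH_USER!);
      await page.getByLabel('Password').fill(process.env.SYNTH_PASS!);
      await page.getByRole('button', { name: 'Sign in' }).click();
      await expect(page.getByTestId('dashboard')).toBeVisible();
      expect(Date.now() - start).toBeLessThan(4000);           // alert threshold for the whole flow
    });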
Test Automation
Test Automation Pyramid
Answer: Wide unit coverage, medium service/API, thin UI.
Example: Reduced UI tests 30%, added API automation.
Selecting Automation Tools
Answer: Choose based on stack, team skills, and ROI.
Example: Playwright for UI, REST Assured for APIs.
Page Object Model (POM)
Answer: Encapsulate UI locators and actions into reusable objects.
Example: Reduced duplication across 40 UI tests.
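A minimal page-object sketch in Playwright; the locators and URL are illustrative.

    import { Page, Locator, expect } from '@playwright/test';

    // Page object: locators and actions live here, not in the tests.
    export class LoginPage {
      readonly email: Locator;
      readonly password: Locator;
      readonly submit: Locator;

      constructor(private readonly page: Page) {
        this.email = page.getByLabel('Email');
        this.password = page.getByLabel('Password');
        this.submit = page.getByRole('button', { name: 'Sign in' });
      }

      async goto() {
        await this.page.goto('/login'); // relative to baseURL in playwright.config
      }

      async login(user: string, pass: string) {
        await this.email.fill(user);
        await this.password.fill(pass);
        await this.submit.click();
        await expect(this.page.getByTestId('dashboard')).toBeVisible();
      }
    }

Tests then read at the level of intent: create a LoginPage, call goto() and login(), and leave locator churn inside the page object.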
Data-Driven vs. Keyword-Driven
Answer: Data-driven varies inputs; keyword-driven defines reusable steps.
Example: CSV-based login permutations in data-driven tests.
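A minimal data-driven sketch; in practice the rows might be loaded from a CSV, here they are inlined for clarity and the accounts/messages are assumptions.

    import { test, expect } from '@playwright/test';

    // Each row drives one generated test.
    const rows = [
      { user: 'valid@example.com',  pass: 'Correct#123', ok: true  },
      { user: 'valid@example.com',  pass: 'wrong',       ok: false },
      { user: 'locked@example.com', pass: 'Correct#123', ok: false },
    ];

    for (const row of rows) {
      test(`login as ${row.user} (${row.ok ? 'valid' : 'invalid'})`, async ({ page }) => {
        await page.goto('/login');
        await page.getByLabel('Email').fill(row.user);
        await page.getByLabel('Password').fill(row.pass);
        await page.getByRole('button', { name: 'Sign in' }).click();
        const message = row.ok ? 'Welcome back' : 'Invalid credentials';
        await expect(page.getByText(message)).toBeVisible();
      });
    }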
Flaky UI Test Mitigation
Answer: Use stable locators, explicit waits, mocks.
Example: Added data-testid attributes and mocked network routes.
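A minimal sketch of both mitigations together, with a hypothetical search endpoint stubbed so the UI test no longer depends on backend timing.

    import { test, expect } from '@playwright/test';

    test('search renders results from a stubbed API', async ({ page }) => {
      // Stub the network so results are deterministic and instant.
      await page.route('**/api/search**', route =>
        route.fulfill({ json: { results: [{ id: 1, title: 'Blue kettle' }] } })
      );

      await page.goto('/search');
      await page.getByTestId('search-input').fill('kettle'); // stable data-testid locator
      await page.getByTestId('search-submit').click();

      // Web-first assertion auto-waits — no sleeps needed.
      await expect(page.getByTestId('search-result')).toHaveText('Blue kettle');
    });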
Parallel Test Execution
Answer: Run tests in shards with isolated data.
Example: 100 specs finished in 8 mins on 8 runners.
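A minimal Playwright config sketch for parallel runs; the shard split is then chosen per CI runner, and the baseURL is a placeholder.

    // playwright.config.ts
    import { defineConfig } from '@playwright/test';

    export default defineConfig({
      fullyParallel: true, // run tests within a file in parallel too
      workers: '50%',      // cap local parallelism at half the CPU cores
      use: { baseURL: 'https://staging.example.com' }, // placeholder
    });

    // Each of 8 CI runners then executes one slice of the suite:
    //   npx playwright test --shard=1/8   (runner 1)
    //   npx playwright test --shard=8/8   (runner 8)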
CI Integration for Automation
Answer: Integrate headless runs, artifacts, dashboards.
Example: Stored screenshots + logs for failed runs.
Reporting Frameworks (Allure/Extent)
Answer: Rich reports with steps, logs, screenshots.
Example: Linked Allure report to JIRA tickets.
BDD (Cucumber/Gherkin)
Answer: Create behavior specs readable by all stakeholders.
Example: Product team validated Gherkin scenarios.
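A minimal cucumber-js sketch in TypeScript; the Gherkin text sits in the comment, and the custom World (this.page, this.loginAndAddItem) is assumed to be wired up in the project's support code.

    // Feature file (coupon.feature):
    //   Scenario: Expired coupon is rejected
    //     Given a signed-in shopper with an item in the cart
    //     When they apply the coupon "EXPIRED10"
    //     Then they see the message "This coupon has expired"

    import { Given, When, Then } from '@cucumber/cucumber';
    import { expect } from '@playwright/test';

    Given('a signed-in shopper with an item in the cart', async function () {
      await this.loginAndAddItem(); // hypothetical helper on the custom World
    });

    When('they apply the coupon {string}', async function (code: string) {
      await this.page.getByTestId('coupon-input').fill(code);
      await this.page.getByTestId('coupon-apply').click();
    });

    Then('they see the message {string}', async function (message: string) {
      await expect(this.page.getByText(message)).toBeVisible();
    });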
Contract Testing (Pact)
Answer: Validate consumer-provider contracts.
Example: Caught field rename before release.
Service Virtualization/Mocking
Answer: Simulate unavailable or costly dependencies.
Example: WireMock used for payment gateway.
Choosing What Not to Automate
Answer: Avoid volatile UIs, one-off tests, low ROI areas.
Example: Manually reviewed marketing landing pages.
API & Microservices
Testing REST APIs
Answer: Validate contracts, happy/negative flows, authentication, rate limits.
Example: Postman tests included schema validation + 401/429 checks.
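A minimal API-level sketch using Playwright's request fixture (the Postman/REST Assured suites mentioned here cover the same ground); the endpoint, token source, and response shape are placeholders.

    import { test, expect } from '@playwright/test';

    test('GET /orders: happy path and auth failure', async ({ request }) => {
      // Happy path: an authenticated call returns 200 with the expected shape.
      const ok = await request.get('/api/orders', {
        headers: { Authorization: `Bearer ${process.env.API_TOKEN}` },
      });
      expect(ok.status()).toBe(200);
      const body = await ok.json();
      expect(Array.isArray(body.items)).toBe(true); // lightweight shape check

      // Negative path: a missing token must be rejected, not silently accepted.
      const anon = await request.get('/api/orders');
      expect(anon.status()).toBe(401);
    });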
Idempotency in APIs
Answer: Repeating the same request produces the same result and server state as sending it once.
Example: Repeated PUT updates left the resource unchanged; POST retries were deduplicated with an idempotency key.
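A minimal sketch of an idempotency check: the same PUT sent twice should leave the resource identical, with no duplicate side effects (the endpoint and payload are assumptions).

    import { test, expect } from '@playwright/test';

    test('PUT /profile is idempotent', async ({ request }) => {
      const payload = { displayName: 'Asha', locale: 'en-IN' };

      const first = await request.put('/api/profile/42', { data: payload });
      const second = await request.put('/api/profile/42', { data: payload });

      expect(first.status()).toBe(200);
      expect(second.status()).toBe(200);
      // Repeating the request must not change the stored state
      // (volatile fields such as updatedAt would be excluded in a real suite).
      expect(await second.json()).toEqual(await first.json());
    });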
Testing Pagination & Filtering
Answer: Validate boundaries, ordering, and totals.
Example: Checked page size=1, max, and last page edge cases.
Circuit Breakers & Retries
Answer: Test fallback and retry logic for resilience.
Example: Verified fallback triggered on downstream 503.
Event-Driven Testing (Kafka/SQS)
Answer: Validate publish/consume, ordering, retries, DLQ handling.
Example: Handled duplicate events correctly.
API Versioning Strategy
Answer: Ensure backward compatibility and controlled deprecation.
Example: Ran v1 and v2 side by side before sunset.
Testing GraphQL
Answer: Validate schema, resolvers, N+1 queries, auth.
Example: Found performance issue in nested queries.
Data, DB & ETL Testing
Validating Data Integrity
Answer: Check PK/FK, unique constraints, and null handling.
Example: Found orphaned rows after cascade delete.
ETL Pipeline Testing
Answer: Validate mapping, transformation, sampling, and data drift.
Example: Caught rounding error in currency conversion.
Performance in DB Queries
Answer: Use indexes and query explain plans.
Example: Covering index reduced query time from 2s → 150ms.
Data Masking/Anonymization
Answer: Tokenize or mask PII in non-production.
Example: Masked emails but preserved domain stats.
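A minimal masking sketch matching that example: the local part of the email is tokenized while the domain is preserved for aggregate stats (the hashing scheme is an assumption).

    import { createHash } from 'node:crypto';

    // Mask the local part deterministically so joins still work; keep the domain for analytics.
    function maskEmail(email: string): string {
      const [local, domain] = email.split('@');
      const token = createHash('sha256').update(local).digest('hex').slice(0, 10);
      return `user_${token}@${domain}`;
    }

    console.log(maskEmail('priya.sharma@example.com'));
    // e.g. user_3f2a9c1d4e@example.com — the same input always maps to the same token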
Migration/Backup Restore Testing
Answer: Validate dry runs, checksums, and rollback.
Example: Restored snapshot in DR successfully.
Reporting/BI Validation
Answer: Cross-check KPIs with source data.
Example: Dashboard revenue matched ledger within ₹10.
Mobile, Web & Cross-Browser
Responsive Testing Strategy
Answer: Test at breakpoints with emulators and real devices.
Example: iPhone SE footer overlap fixed.
Native vs. Hybrid App Testing
Answer: Native = platform APIs, Hybrid = web views.
Example: Offline cart sync tested on Android native app.
Push Notifications Testing
Answer: Validate permissions, deep links, and timing.
Example: Notification CTA opened correct screen.
Cross-Browser Issues
Answer: Account for CSS/JS differences with polyfills.
Example: Polyfilled Intl API for Safari.
Accessibility on Mobile
Answer: Test with screen readers and gesture navigation.
Example: VoiceOver announced labels correctly.
Tools, Collaboration & Reporting
Which Tools Do You Use and Why?
Answer: Pick tools that fit the problem, the tech stack, and the team's skills.
Example: JIRA for defect tracking, Playwright for UI, REST Assured for APIs.
Creating Dashboards & Metrics
Answer: Build both exec-level and team-level dashboards.
Example: Grafana board tracked release readiness.
Working with Developers on Defects
Answer: Provide repro data, logs, and collaborate empathetically.
Example: Paired with dev to isolate race condition.
Collaborating with Product/UX
Answer: Involve early for acceptance criteria and usability checks.
Example: Validated prototypes to de-risk user flows.
Documentation Best Practices
Answer: Maintain living docs like runbooks and strategy guides.
Example: Onboarding time reduced 50% with a runbook.
Risk, Metrics, Estimation, UAT & Misc
Key QA Metrics You Track
Answer: Defect density, escape rate, lead time, coverage.
Example: Escaped defects dropped after adding contract tests.
Handling Tight Deadlines
Answer: Triage risk, thin-slice tests, automate smoke.
Example: Released with canary deployment + monitoring.
Running UAT Effectively
Answer: Use real personas, scripts, and quick feedback loops.
Example: Merchants validated statement exports.
Post-Release Validation (Shift-Right)
Answer: Run smoke tests in production and monitor error budgets.
Example: Synthetic checkout + log sampling after deploy.