Manual Testing
1. Static Testing
- Description: A testing technique where the code, requirements, and design are reviewed without executing the program. It helps in identifying errors early.
- Example: Code reviews or walkthroughs of design documents.
2. Dynamic Testing
- Description: Testing that involves executing the software to verify its functionality and behavior during runtime.
- Example: Running a web application to test user login.
3. Verification
- Description: Ensuring that the software meets the specified requirements and design. It answers, "Are we building the product correctly?"
- Example: Reviewing design specifications to ensure the product follows them.
4. Validation
- Description: Ensuring the software meets user needs and requirements. It answers, "Are we building the right product?"
- Example: Conducting user acceptance testing to ensure the product meets customer requirements.
5. SDLC (Software Development Life Cycle)
- Description: A structured approach to software development that includes phases like planning, design, development, testing, deployment, and maintenance.
- Example: A typical SDLC process involves requirements gathering, design, coding, testing, and release.
6. SDLC Models
- Description: Various approaches to the software development process. Common models include Waterfall, Agile, V-Model, and Spiral.
- Example: Agile focuses on iterative development with frequent feedback, while Waterfall follows a linear, sequential approach.
7. Agile
- Description: An iterative and incremental approach to software development that emphasizes collaboration, flexibility, and customer feedback.
- Example: Using Scrum or Kanban methodologies for managing development tasks.
8. 7 Principles of Testing
- Description: Seven guiding principles to ensure effective software testing.
- Testing shows the presence of defects, not their absence.
- Exhaustive testing is impossible; risk and priorities guide what to test.
- Early testing saves time and money.
- Defect clustering: a small number of modules usually contain most of the defects.
- Pesticide paradox: repeating the same tests eventually stops finding new defects, so test cases must be reviewed and updated.
- Testing is context-dependent.
- Absence-of-errors fallacy: finding no defects does not mean the software meets user needs.
9. STLC (Software Testing Life Cycle)
- Description: A series of phases followed to test a software product and ensure quality, including test planning, design, execution, and closure.
- Example: Phases include test planning, test design, test execution, and test closure.
10. Types of Testing (Manual vs. Automation)
- Description: Broad ways of carrying out testing: Manual Testing (MT), where a tester executes test cases by hand, and Automation Testing, where tools execute scripted tests.
- Example: Manual Testing involves the tester executing test cases without using automated tools.
11. Testing Techniques
- Description: Methods used to design and execute tests effectively, such as black-box or white-box testing.
- Example: Black-box testing focuses on functionality while white-box testing focuses on code structure.
12. Black Box Testing
- Description: A testing method that focuses on the software’s output based on various inputs, without knowledge of internal code structure.
- Example: Testing a login page by entering valid and invalid credentials to check for correct behavior.
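The login example can be sketched in code. This is a minimal illustration with a hypothetical `login` function: the black-box tester only feeds in inputs and checks outputs, never inspecting the function body.

```python
# Hypothetical login check, used only to illustrate black-box testing.
def login(username, password):
    # Internal logic is irrelevant to the black-box tester.
    return username == "alice" and password == "s3cret"

# Probe the public interface with valid and invalid inputs.
assert login("alice", "s3cret") is True   # valid credentials
assert login("alice", "wrong") is False   # invalid password
assert login("", "") is False             # empty inputs
```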
13. White Box Testing
- Description: A testing method that involves knowledge of the internal code structure. It focuses on testing individual code paths and branches.
- Example: Writing test cases that check for edge cases in the code and test all possible branches.
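As a sketch of branch coverage, the illustrative function below has three branches, and the white-box tests are chosen so that every branch executes at least once.

```python
# Function with three explicit branches.
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

# White-box tests: one case per branch, so all paths are exercised.
assert classify(-5) == "negative"
assert classify(0) == "zero"
assert classify(7) == "positive"
```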
14. Functional Testing
- Description: Testing the software’s functionality based on the requirements to ensure it behaves as expected.
- Example: Testing a search feature on an e-commerce site to make sure it returns relevant results.
15. Unit Testing
- Description: Testing individual components or units of the software in isolation to verify that they work correctly.
- Example: Testing a function that adds two numbers to ensure it returns the correct result.
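The add-two-numbers example can be written as pytest-style unit tests, where each test checks one behavior of the unit in isolation:

```python
# Unit under test: a pure function.
def add(a, b):
    return a + b

# Unit tests: each verifies one behavior of the unit.
def test_add_positive():
    assert add(2, 3) == 5

def test_add_negative():
    assert add(-1, -1) == -2

def test_add_zero():
    assert add(0, 0) == 0

# Run directly for a quick check (a test runner would discover these).
test_add_positive()
test_add_negative()
test_add_zero()
```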
16. Test Cases
- Description: A set of conditions or variables under which a tester checks if the software is working as intended.
- Example: A test case that checks if a user can successfully log into a website.
17. Integration Testing
- Description: Testing the interactions between different software modules to ensure they work together.
- Example: Testing the integration between a payment gateway and a shopping cart module.
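The cart/payment example can be sketched with two hypothetical modules wired together; the point of the test is their interaction, not either unit alone. Prices are in integer cents to avoid floating-point surprises.

```python
# Hypothetical shopping cart module.
class Cart:
    def __init__(self):
        self.items = []
    def add(self, price_cents):
        self.items.append(price_cents)
    def total(self):
        return sum(self.items)

# Hypothetical payment gateway module.
class PaymentGateway:
    def charge(self, amount_cents):
        if amount_cents <= 0:
            raise ValueError("amount must be positive")
        return {"status": "approved", "amount": amount_cents}

# Integration test: the cart's total flows into the gateway's charge.
cart = Cart()
cart.add(1999)   # $19.99
cart.add(500)    # $5.00
receipt = PaymentGateway().charge(cart.total())
assert receipt["status"] == "approved"
assert receipt["amount"] == 2499
```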
18. Big Bang Testing
- Description: Integrating all components of the system at once and testing them together.
- Example: Testing an entire application after all the modules have been integrated at the same time.
19. Incremental Testing
- Description: Testing each module individually as it is integrated into the system.
- Example: Testing a new module as it's added to an existing application to verify that it works with the other components.
20. Sandwich Testing
- Description: A hybrid integration approach that combines top-down and bottom-up integration testing, with the two efforts meeting at a middle layer of the system.
- Example: Testing the UI layer top-down with stubs while testing utility modules bottom-up with drivers, then integrating at the middle (business-logic) layer.
21. System Testing
- Description: Testing the entire system as a whole to verify that all components work together as expected.
- Example: Testing the complete functionality of a banking application after all modules (login, transactions, account management) are integrated.
22. Regression Testing
- Description: Ensuring that new code changes do not adversely affect the existing functionality of the software.
- Example: Re-running test cases after a software update to ensure no existing features are broken.
23. Smoke Testing
- Description: Basic testing to ensure that the most critical functions of the software are working correctly.
- Example: Checking if the application starts and the primary features, like login or data retrieval, are functioning.
24. Sanity Testing
- Description: A quick, narrow round of testing after a new build to confirm that a specific fix or change works as intended, without re-running the full test suite.
- Example: Verifying that a reported bug is actually fixed in the new build before broader testing begins.
25. Boundary Value Analysis
- Description: A testing technique that focuses on testing the boundaries of input values, as defects are often found at the edge cases.
- Example: Testing the input range of a field that accepts values between 1 and 100 by checking the values 0, 1, 100, and 101.
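The 1-to-100 example translates directly into code: boundary value analysis tests the values just below, on, and just above each boundary of the valid range.

```python
# Validator for a field that accepts values between 1 and 100 inclusive.
def in_range(value):
    return 1 <= value <= 100

# Boundary values around the lower edge.
assert in_range(0) is False    # just below lower boundary
assert in_range(1) is True     # lower boundary
assert in_range(2) is True     # just above lower boundary

# Boundary values around the upper edge.
assert in_range(99) is True    # just below upper boundary
assert in_range(100) is True   # upper boundary
assert in_range(101) is False  # just above upper boundary
```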
26. State Transition Testing
- Description: Testing the software based on its different states, and how it transitions between those states based on inputs.
- Example: Testing a form that transitions from "inactive" to "active" state based on user input.
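The inactive/active form example can be modeled as a small state machine; the transition table and event names below are assumptions for illustration. Each test checks one transition, including an invalid event that should leave the state unchanged.

```python
# Transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("inactive", "focus"): "active",
    ("active", "blur"): "inactive",
}

def step(state, event):
    # Invalid events leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

assert step("inactive", "focus") == "active"   # valid transition
assert step("active", "blur") == "inactive"    # valid transition
assert step("inactive", "blur") == "inactive"  # invalid event: no change
```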
27. Decision Table Testing
- Description: Using decision tables to represent different conditions and actions in a structured way for testing complex logic.
- Example: Testing a discount application system based on different conditions (age, loyalty status).
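The discount example can be sketched as a decision table with two conditions (senior, loyalty member) and four rules; the specific percentages are assumptions for illustration. One test is written per rule.

```python
# Decision table (assumed rules):
#   senior & loyal -> 20%; senior only or loyal only -> 10%; neither -> 0%.
def discount(age, loyal):
    senior = age >= 65
    if senior and loyal:
        return 20
    if senior or loyal:
        return 10
    return 0

# One test case per decision-table rule.
assert discount(70, True) == 20    # senior + loyal
assert discount(70, False) == 10   # senior only
assert discount(30, True) == 10    # loyal only
assert discount(30, False) == 0    # neither
```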
28. User Acceptance Testing (UAT)
- Description: Testing the software by the end-users to confirm it meets their needs and expectations before the product is released.
- Example: A business user testing a new feature in a software tool to verify if it aligns with business requirements.
29. Non-Functional Testing
- Description: Testing aspects of the software not related to specific functions, such as performance, scalability, and security.
- Example: Load testing to verify that a website can handle a large number of concurrent users.
30. Functional Testing
- Description: Ensuring that the software’s functionality works according to its specifications and requirements.
- Example: Verifying that an online store’s checkout process functions as described in the requirements.
31. Defect/Bug Lifecycle
- Description: The process that a defect or bug goes through, from detection to resolution, including stages like New, Assigned, Fixed, and Closed.
- Example: A bug reported by a tester goes through the lifecycle of being fixed and closed after verification.
32. Use Case Testing
- Description: Testing based on use cases, which describe the interactions between a user and the system for achieving specific goals.
- Example: Testing a user registration process as described in the use case document.
33. Test Documentation
- Description: Written documentation that defines the testing process, test cases, results, and defects.
- Example: Test case documents, test plans, and defect logs that provide a clear record of the testing process.
34. Requirements Traceability Matrix (RTM)
- Description: A document that maps and traces user requirements with test cases to ensure that all requirements are covered during testing.
- Example: An RTM that links each test case to its corresponding requirement, ensuring comprehensive coverage.
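A minimal RTM can even be checked programmatically: map each requirement ID to the test cases covering it, then flag any requirement with no coverage. The IDs below are illustrative.

```python
# RTM sketch: requirement -> test cases that cover it (IDs are hypothetical).
rtm = {
    "REQ-001 user can log in": ["TC-01", "TC-02"],
    "REQ-002 user can reset password": ["TC-03"],
    "REQ-003 user can delete account": [],
}

# Coverage check: every requirement must have at least one test case.
uncovered = [req for req, cases in rtm.items() if not cases]
assert uncovered == ["REQ-003 user can delete account"]
```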
Each of these terms is essential for understanding and performing different types of software testing and development processes effectively.