Data Testing
ETL Testing
ETL (Extract, Transform, Load) is a process that extracts data from source systems, transforms it according to a set of business rules, and then loads it into a single repository. ETL testing is the process of validating, verifying, and qualifying that data while preventing duplicate records and data loss.
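The two failure modes named above, duplicate records and data loss, can be checked mechanically by comparing key sets between source and target. A minimal sketch, using illustrative in-memory key lists rather than a real source system and repository:

```python
# Illustrative ETL reconciliation checks; in practice the key lists
# would be queried from the source system and the target repository.
def check_no_data_loss(source_keys, target_keys):
    """Every source key must appear in the target (no data loss)."""
    return set(source_keys) - set(target_keys)  # empty set means no loss

def check_no_duplicates(target_keys):
    """No key may be loaded more than once (no duplicate records)."""
    seen, dupes = set(), set()
    for key in target_keys:
        (dupes if key in seen else seen).add(key)
    return dupes  # empty set means no duplicates

source = [101, 102, 103]
loaded = [101, 102, 102]   # 103 was lost, 102 was duplicated
print(check_no_data_loss(source, loaded))   # {103}
print(check_no_duplicates(loaded))          # {102}
```

In a real pipeline these checks would run against row counts and business keys pulled from both ends of the load, but the assertions stay the same.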
Stages of the ETL testing process
Effective ETL testing detects problems with the source data early on, before it is loaded into the data repository, as well as inconsistencies or ambiguities in the business rules intended to guide data transformation and integration.
Validate data sources
Validate transformation logic
Validate the load
- Invalid data is rejected and default values are applied.
- Data moves from the staging area to the loading area as expected.
- Historical loads work as expected.
- Incremental and full refreshes work as expected.
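The first load-validation rule above can be sketched as a small accept/reject routine: rows missing a mandatory field are rejected, while missing optional fields fall back to a default value. Field names and defaults here are illustrative assumptions, not part of any specific pipeline:

```python
# Hypothetical load step: reject invalid rows, apply defaults to the rest.
DEFAULTS = {"country": "UNKNOWN"}   # assumed default for an optional field

def load_row(row):
    """Return (accepted_row, None), or (None, reason) for a rejected row."""
    if not row.get("id"):                      # mandatory field check
        return None, "missing id"
    clean = dict(row)
    for field, default in DEFAULTS.items():    # apply default values
        clean.setdefault(field, default)
    return clean, None

accepted, rejected = [], []
for row in [{"id": 1}, {"country": "DE"}, {"id": 2, "country": "FR"}]:
    clean, reason = load_row(row)
    (accepted if clean else rejected).append(clean or reason)

print(accepted)  # [{'id': 1, 'country': 'UNKNOWN'}, {'id': 2, 'country': 'FR'}]
print(rejected)  # ['missing id']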
Big Data Testing
The ultimate value of Big Data lies in its ability to yield actionable insights. Poor-quality data leads to poor analysis and, in turn, to poor decisions. Data errors at pharmaceutical companies, banks, or investment firms can lead to legal complications. Ensuring data accuracy enables correct human engagement and interaction with the data system.
We ensure data quality through rigorous, detailed end-to-end testing across the different stacks of the Big Data ecosystem. As a team of Big Data SDETs (Software Development Engineers in Test), we understand the nitty-gritty of Big Data development and the fundamentals of testing methodologies.
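One concrete form such end-to-end quality testing takes is automated profiling of each batch before it reaches analysis. The sketch below is a minimal, assumed example of a completeness check; the field names and the 10% null-rate threshold are illustrative, not from any particular pipeline:

```python
# Hypothetical batch-level data-quality check: flag required fields
# whose null rate exceeds a configurable threshold.
def profile(rows, required_fields, max_null_rate=0.1):
    """Return a list of quality violations for a batch of records."""
    violations = []
    total = len(rows)
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        if total and nulls / total > max_null_rate:
            violations.append(f"{field}: null rate {nulls}/{total} too high")
    return violations

batch = [{"account": "A1", "amount": 10.0},
         {"account": "A2", "amount": None},
         {"account": "A3", "amount": None}]
print(profile(batch, ["account", "amount"]))
# ['amount: null rate 2/3 too high']
```

At Big Data scale the same checks would typically run as distributed aggregations rather than an in-memory loop, but the pass/fail criteria are identical.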