Key Responsibilities:
Write test data scripts based on ETL mapping artifacts
Execute data scripts and perform detailed analysis of the results
Create strategies and test cases for applications that use ETL components
Perform data mining and detailed data analysis on data warehousing systems
Execute formal test plans to ensure the delivery of data-related projects
Provide input to and support big data testing initiatives
Define and track quality assurance metrics such as defects, defect counts, and test results
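As one illustration of the test data scripts described above, the sketch below runs a source-to-target reconciliation check of the kind an ETL mapping document might specify. It uses an in-memory SQLite database so it is self-contained; the table and column names (src_orders, tgt_orders, amount) are hypothetical examples, not part of this posting.

```python
import sqlite3

# Stand-in "source" and "target" tables; a real test script would point
# at the staging and warehouse databases named in the mapping document.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])

def row_count(table):
    # Row-count reconciliation: the load must not drop or duplicate rows.
    return cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def control_total(table):
    # Control-total reconciliation on a numeric measure column.
    return cur.execute(f"SELECT ROUND(SUM(amount), 2) FROM {table}").fetchone()[0]

count_ok = row_count("src_orders") == row_count("tgt_orders")
total_ok = control_total("src_orders") == control_total("tgt_orders")
print(count_ok, total_ok)  # → True True
```

In practice the same pattern scales up by comparing counts and control totals per partition or load date rather than for the whole table at once.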
Technical Experience:
5 years of relevant work experience testing backend systems, with a focus on complex data pipelines
Experience analyzing ETL mapping documents
Create SQL and shell scripts based on ETL mapping documents
Create and execute strategies and test cases for applications that use ETL components
Expertise in data warehouse and pipeline testing
Automate test execution for data pipelines
Review test execution results
Experience with Airflow is a big plus
Strong SQL query and data pipeline skills are essential for this role
Basic understanding of big data architectures and technologies such as Spark, Hive, and HBase
Create logs to document testing phases and conduct post-release/post-implementation testing
Work with cross-functional teams to ensure quality throughout the software development lifecycle.
Independent testing experience across the SDLC
Experience working in a cloud environment such as AWS (Amazon Web Services) is a plus
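To illustrate the automated pipeline test execution and QA metrics tracking listed above, here is a minimal sketch: it checks a transformation rule against expected outputs and reports totals and defect counts. The rule shown (trim then uppercase a country-code field) and all names are invented for illustration and are not from this posting.

```python
def transform_country_code(raw: str) -> str:
    """Hypothetical mapping rule under test: strip whitespace, then uppercase."""
    return raw.strip().upper()

# Each case pairs an input with the expected output per the mapping document.
cases = [("us ", "US"), (" gb", "GB"), ("De", "DE"), ("fr", "FR")]

results = []
for raw, expected in cases:
    actual = transform_country_code(raw)
    results.append({"input": raw, "expected": expected,
                    "actual": actual, "passed": actual == expected})

# Metrics of the kind the posting asks to track: executed, passed, defects.
defects = [r for r in results if not r["passed"]]
print(f"executed={len(results)} "
      f"passed={len(results) - len(defects)} "
      f"defects={len(defects)}")  # → executed=4 passed=4 defects=0
```

In a real setting the case list would be generated from the mapping artifacts and the summary line fed into whatever defect-tracking system the team uses.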