About the Role:
Identify the customer-driven use cases for different Big Data services and platforms.
Write scalable data-processing applications and scripts to simulate customer-driven use cases for different Big Data services.
You will be involved in test automation of modern Big Data systems on-premises and in the cloud (AWS, GCP & Azure), and in CI/CD for the software development lifecycle.
Independently explore different Big Data technologies as needed, learning how they work inside and out, in order to add framework support for integration testing and use-case-driven scenario testing.
Good problem-solving and analytical skills.
About You
8+ years of relevant experience.
Hands-on experience in implementing end-to-end solutions for test automation.
Experience developing test frameworks.
Operational instincts and keen attention to detail, with the ability to think beyond the obvious and devise solutions.
Good knowledge of Linux operating systems.
Good knowledge of Big Data technologies, preferably Hadoop, Spark, Hive.
Experience writing simple to complex SQL queries.
Hands-on experience implementing Jenkins / CI/CD pipelines.
Good knowledge of Big Data technologies such as BigQuery, Kafka, Impala, Presto, Oozie, Airflow, Elasticsearch, HBase, and Snowflake.
Knowledge of cloud data technologies and platforms such as EMR, Databricks, GCP, HDInsight, Kubernetes, etc.