JOB DESCRIPTION

Big Data Engineer [IT20-088]

Posting Date: 12 Oct 2020 | Close Date: 11 Dec 2020


Job Description

9+ years of experience playing a major role in building an Enterprise Data Lake on the Cloudera Hadoop platform from heterogeneous data sources (RDBMS, cloud data, application APIs, files, etc.). The primary ETL tool for data loading and transformation will be Informatica BDM/Data Engineering Integration. Should be involved in requirements analysis, Hadoop data modeling, design/development of Informatica ETL jobs, and production implementation of the developed ETL jobs and Spark scripts using CI/CD tools.

Experience & Skill Set

  1. Designed and implemented an ETL framework for big data on the Cloudera Hadoop platform.
  2. Rich experience supporting post-deployment production systems, optimizing data retrieval performance, and resolving issues proactively.
  3. Helped other teams resolve design and technical issues.
  4. Designed archival jobs to optimize storage and processing time.
  5. Strong experience with big data platforms, especially the Cloudera Hadoop ecosystem: HDFS, Hive, HBase, Impala, Sqoop, Spark, Kafka.
  6. Strong experience in Informatica modules (Informatica 10.4 experience preferred): BDM/DEI, DES, PowerCenter, PowerExchange, CDC.
  7. Strong scripting skills in Linux environments and SQL. Experience in data modeling on Hadoop.
  8. Good experience with CI/CD & DevOps tools: Nexus, Jenkins, GitHub, SVN, JIRA. Good to have: knowledge of cloud data lakes and visualization tools such as Tableau.
  9. Good to have: knowledge of Informatica EDC & Axon.
  10. Ability to prioritize and multi-task across numerous work streams. Strong interpersonal skills; ability to work on cross-functional teams.
  11. Strong verbal and written communication skills.
  12. Deep knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures and data warehousing/BI.

Roles & Responsibilities

  1. Develop and deploy big data jobs (Informatica & Spark) for building the Enterprise Data Lake.
  2. Document big data use cases, solutions, and recommendations.
  3. Help program and project managers in the design, planning, and governance of big data project implementations.
  4. Perform detailed analysis of business problems and technical environments, and use this analysis in designing solutions.
  5. Ensure adherence to architecture standards and best practices to maintain consistency across the enterprise landscape.
  6. Ensure the integrity and security of the assigned data architecture.
Specialization: Information Technology - Applications
Type of Employment: Contract
Minimum Experience: 9 years
Work Location: Central