Mandatory Skills: AWS, Databricks, Informatica IDMC
- Certification (Mandatory): Government Commercial Cloud (GCC) experience required.
- Minimum 5 years of experience in data engineering, with expertise in AWS services, Databricks, and/or Informatica IDMC.
- Proficiency in programming languages such as Python, Java, or Scala for building data pipelines.
- Design and architect data storage solutions, including databases, data lakes, and data warehouses, using AWS services such as Amazon S3, Amazon RDS, Amazon Redshift, and Amazon DynamoDB, along with Databricks Delta Lake; integrate Informatica IDMC for metadata management and data cataloging (see the storage sketch after this list).
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and enrich data for analytical use, leveraging Databricks Spark capabilities together with Informatica IDMC for data transformation and quality (see the ETL sketch after this list).
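
The storage design work described above can be illustrated with a minimal PySpark sketch: it writes a small DataFrame as a Delta Lake table on S3, partitioned by date. The bucket, path, and schema are hypothetical, and IDMC cataloging is configured in the IDMC service rather than in pipeline code, so it is not shown.

```python
# Minimal sketch: persist curated data as a Delta table on S3 (hypothetical bucket/path/schema).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("delta-storage-sketch").getOrCreate()

# Small example DataFrame standing in for records landed from a source system.
orders = spark.createDataFrame(
    [(1, "2024-01-05", 120.50), (2, "2024-01-06", 89.99)],
    ["order_id", "order_date", "amount"],
)

# Write as a Delta table in the data lake, partitioned by order date.
(orders
    .withColumn("order_date", F.to_date("order_date"))
    .write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-data-lake/curated/orders"))
```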
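Likewise, a minimal ETL sketch under assumed S3 paths and column names: it extracts raw CSV files, applies basic cleansing (deduplication, standardisation, null filtering), and loads the result as a Delta table. Data-quality rules defined in Informatica IDMC would complement these in-pipeline checks.

```python
# Minimal ETL sketch: paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-cleanse-sketch").getOrCreate()

# Extract: read raw CSV files from the raw zone of the data lake.
raw = spark.read.option("header", True).csv("s3://example-data-lake/raw/customers/")

# Transform: deduplicate, standardise values, and drop records failing basic checks.
clean = (raw
    .dropDuplicates(["customer_id"])
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .filter(F.col("customer_id").isNotNull()))

# Load: write the curated result as a Delta table for analytical consumption.
(clean.write
    .format("delta")
    .mode("overwrite")
    .save("s3://example-data-lake/curated/customers"))
```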