The Data Platform Team is responsible for designing, implementing, and managing a modern data platform that embraces the principles of data mesh, empowering teams to create and manage their own data products. Our mission is to deliver high-quality, scalable data solutions that drive business value across the organization.
Qualifications
Requirements:
Degree in IT, Computer Science, Data Analytics, or a related field.
2 to 4 years of experience in Data Engineering, DevOps, or related fields.
Proven experience working in a mature, DevOps-enabled environment with well-established cloud practices, demonstrating the ability to operate in a high-performing, agile team.
Familiarity with cloud platforms (AWS, AliCloud, GCP) and experience managing infrastructure across public cloud and on-prem environments, particularly with OpenShift Container Platform (OCP).
Knowledge of automation tools such as Ansible and Terraform, as well as CLI tooling across hybrid cloud environments.
Competence in designing and implementing data ingestion, ETL frameworks, dashboard ecosystems, and data orchestration.
Hands-on experience with Linux systems, object storage, Python, SQL, and the Spark and Presto query engines.
Working knowledge of CI/CD best practices, with experience in setting up and managing CI/CD pipelines for continuous integration, testing, and deployment.
Experience in implementing security measures, including IAM roles, policies, Security Groups, and Network ACLs.
Preferred:
Certifications in cloud technology platforms (such as cloud architecture, container platforms, systems, and/or network virtualization).
Knowledge of telecom networks, including mobile and fixed networks, will be an added advantage.
Familiarity with data fabric and data mesh concepts, including their implementation and benefits in distributed data environments, is a bonus.