About Us
SATS is a global leader in gateway services and Asia's pre-eminent provider of food solutions. Using innovative food technologies and resilient supply chains, we create tasty, quality food in sustainable ways for airlines, food service chains, retailers, and institutions. With heartfelt service and advanced technology, we connect people, businesses, and communities seamlessly through our comprehensive gateway services for customers such as airlines, cruise lines, freight forwarders, postal services, and eCommerce companies.
Fulfilling our purpose to feed and connect communities, SATS delights customers in over 215 locations and 27 countries across Asia Pacific, the UK, Europe, the Middle East, and the Americas. SATS has been listed on the Singapore Exchange since May 2000. For more information, please visit www.sats.com.sg.
Key Responsibilities
Objectives of This Role
Work with teams from concept to operations, providing technical subject-matter expertise for the successful implementation of enterprise data solutions using modern data technologies. This individual will be responsible for the planning, execution, and delivery of data initiatives, and will work with the team to expand and optimise data pipelines and architecture. This is a hands-on development role centred on Microsoft Azure data engineering and Databricks, with data integration using Python and Java.
Responsibilities
- Develop and maintain scalable data pipelines to ingest, process, and store data from various sources into the operational and analytical data platforms
- Optimise data processing and storage infrastructure for mission-critical, high-volume, near-real-time and batch data pipelines
- Implement data quality checks, monitoring, and alerting to ensure data accuracy and availability
- Troubleshoot and resolve data pipeline issues in a timely manner to minimise impact on business operations
- Work collaboratively with relevant teams to define functional and technical requirements
- Document technical specifications, processes, and workflows for data pipelines and related systems
- Manage stakeholder expectations and ensure clear communication
Key Requirements
Required Skills and Qualifications
- 3 or more years of data engineering experience, with proficiency in Python, Java, and SQL
- Familiarity with cloud computing platforms and data engineering tools and services
- Experience building data lake and data warehouse pipelines
- Familiarity with structured and unstructured data, database management, and transformation methodologies
- Familiarity with technical integrations using microservices, APIs, message queues, stream processing, etc.
- Exposure to CI/CD pipelines using Azure DevOps or GitHub
- Communication skills, with the ability to explain technical concepts to non-technical stakeholders
Preferred Skills and Qualifications
- Tertiary qualifications in computer science, information technology, engineering, or related discipline
- Certifications in cloud technology and data engineering