Job Title: Data Engineer (12-Month Contract)
Location: KL Eco City (Hybrid: 2 days in office, 3 days WFH)
Region Supported: ANZ (with occasional UK support)
Working Hours: 9:00 AM to 6:00 PM (flexibility for early/late calls)
Open Positions: 4
About the Role
We are looking for Data Engineers to join our team on a 12-month contract. This role involves designing, developing, and maintaining data pipelines and ETL workflows using AWS services, ensuring data integrity, scalability, and performance. You will also support migration efforts to Databricks and optimize workflows for efficiency.
Key Responsibilities
- Design, develop, and maintain data pipelines and ETL workflows using AWS services.
- Implement orchestration and automation for data workflows.
- Work with large datasets to ensure data integrity and performance.
- Collaborate with stakeholders to understand data requirements and deliver solutions.
- Deploy changes directly to production environments with confidence and accountability.
- Support migration efforts to Databricks and optimize workflows for performance.
Qualifications
- Experience with data lake architectures, big data technologies, and data pipeline orchestration.
- Familiarity with CI/CD practices for data engineering.
- An AWS certification (e.g., AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect) is a plus.
- Strong problem-solving skills and attention to detail.
Key Skills
- Expert-level fluency in AWS services: Glue, S3, Lambda, Step Functions.
- Strong proficiency in PySpark for distributed data processing (see the illustrative sketch after this list).
- Advanced SQL skills for querying and optimizing data operations.
- Comfortable deploying changes live without formal review, ensuring quality through self-validation.
- Databricks experience is a plus.
- Ability to work independently and manage tasks in a siloed environment.
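To give candidates a feel for the day-to-day work, here is a minimal, purely illustrative PySpark sketch of the kind of pipeline step this role involves; the bucket paths, dataset, and column names are hypothetical placeholders, not our actual environment.

```python
# Illustrative only: a minimal PySpark ETL step of the kind described above.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

# Read raw order events from a data lake location (hypothetical path).
orders = spark.read.parquet("s3://example-raw-bucket/orders/")

# Basic integrity checks and aggregation: drop malformed rows,
# then summarise revenue per customer per day.
daily_revenue = (
    orders
    .dropna(subset=["customer_id", "order_ts", "amount"])
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Write the curated result back, partitioned for efficient querying.
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-curated-bucket/daily_revenue/")
```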
Interview Process
- Two rounds of virtual interviews, including a technical discussion and a live coding assessment.
Interested? Apply now or reach out for more details!