
  • Posted 8 days ago

Job Description

We are working with a top-tier global consulting firm to support the delivery of a large-scale transformation for a Telecom client based in Malaysia.

This is a five-year programme focused on AI revenue transformation and acceleration. The firm has already spent a year building out the key assets and has mobilised a 20-30 person cross-functional team (data, tech, consulting) based in KL.

They now require practitioners with proven hands-on experience across several areas of expertise to join them, helping to scale and drive value from what was built in phase 1.

You'll be working as a member of the consulting team delivering for the end client and will need to be based on-site in KL. This will be an initial 6 month engagement with the option to extend into a multi-year contract.

Open to those already based in Malaysia.

Scope of work:

  • Design, develop, and maintain data pipelines using Airflow
  • Build modular and incremental data flows that allow efficient refresh cycles and minimize redundant runs
  • Collaborate with cross-functional teams to integrate AI/ML models into production data environments
  • Develop ETL/ELT workflows to ingest and transform data from multiple telecom and operational systems
  • Ensure data quality, version control, and governance through consistent documentation and reusable components
  • Participate in troubleshooting and optimization of data processes to ensure reliability, scalability, and performance
  • Contribute to domain logic validation in partnership with business and analytics teams
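To illustrate the "modular and incremental data flows" responsibility above, here is a minimal sketch of watermark-based incremental refresh in plain Python. All names and data structures are hypothetical; in a real pipeline each function would run as an Airflow task and the stores would be actual source and warehouse tables.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stands-ins for a source system and a warehouse table.
source_rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc), "mrr": 10.0},
    {"id": 2, "updated_at": datetime(2024, 1, 5, tzinfo=timezone.utc), "mrr": 12.5},
]
warehouse = {}
watermark = {"last_loaded": datetime(2023, 12, 31, tzinfo=timezone.utc)}

def extract_incremental():
    """Pull only rows changed since the last successful load (the watermark)."""
    return [r for r in source_rows if r["updated_at"] > watermark["last_loaded"]]

def load(rows):
    """Upsert changed rows and advance the watermark so reruns skip loaded data."""
    for r in rows:
        warehouse[r["id"]] = r
    if rows:
        watermark["last_loaded"] = max(r["updated_at"] for r in rows)

load(extract_incremental())   # first run loads both rows
load(extract_incremental())   # second run finds nothing new: no redundant work
```

In an Airflow deployment the extract and load steps would be separate tasks, with the watermark persisted in a control table or Airflow Variable rather than in memory; this is what lets refresh cycles stay efficient and avoids redundant runs.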

Skills / Experience needed:

  • 3-7 years of hands-on experience in data engineering or related roles
  • Strong proficiency in Python, SQL, and modern data frameworks (e.g., Airflow, Spark)
  • Working knowledge of Azure data ecosystem (Data Factory, AzureML, Synapse, Blob Storage, Azure SQL, etc.)
  • Familiarity with AWS services (S3, ECS, RDS, EC2)
  • Experience designing incremental / modular data pipelines and integrating APIs or external data sources
  • Strong understanding of data governance, versioning, and access management practices
  • Excellent problem-solving skills, structured thinking, and strong communication abilities


Job ID: 134882671
