
PayNet (Payments Network Malaysia)

Senior Data Solution Engineer - Data Lake

This job is no longer accepting applications

  • Posted 7 months ago

Job Description

Summary Of Responsibilities

We are seeking a Senior Data Solution Engineer to lead data solution projects through their full lifecycle with minimal guidance. The ideal candidate will have extensive experience with cloud infrastructure, big data technologies, and machine learning fundamentals, and will be capable of architecting, implementing, and optimizing data pipelines that process 10 to 100 gigabyte-scale datasets efficiently.

Key Areas Of Responsibilities

  • Lead end-to-end data solutions projects through all phases: discovery, planning, design, implementation, testing, operations, and optimization
  • Architect and implement scalable data processing pipelines using modern technologies
  • Design and enforce data governance frameworks and lineage tracking solutions
  • Collaborate with stakeholders to translate business requirements into technical solutions
  • Provide technical leadership and mentorship to junior team members
  • Optimize existing data solutions for improved performance and cost-efficiency
  • Implement and maintain data quality controls and monitoring systems

Qualifications & Experience

Minimum Qualifications

  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent practical experience)
  • Relevant certifications in AWS, data engineering, or cloud architecture
  • 5+ years of experience in data engineering or solutions architecture
  • Strong expertise with Kubernetes (k8s) for containerized applications
  • Extensive experience with AWS data services (including but not limited to S3, Glue, Athena, EMR, Lake Formation)
  • Proficiency with PySpark for large-scale data processing
  • Experience with data lake technologies (Apache Iceberg, Apache Hive)
  • Strong understanding of data lineage tracking and governance principles
  • Advanced GitOps practices using GitLab and GitLab CI/CD pipelines
  • Experience orchestrating complex data pipelines with workflow management tools
  • Proven ability to process and analyze 10 to 100 gigabyte-scale datasets efficiently
  • Working knowledge of ML concepts and frameworks
  • Experience with CI/CD for data applications
  • Experience with Terraform or other IaC tools for infrastructure provisioning

Personal Qualities

  • Self-motivated problem solver who can work with minimal guidance
  • Excellent communication skills to translate technical concepts to non-technical stakeholders
  • Strong project management capabilities
  • Detail-oriented with a focus on data quality and reliability
  • Collaborative mindset for cross-functional team environments

Job ID: 111944703