
Job Description

Role: Data & AI Ops Engineer

Location: Klang Valley, KL (project-based)

Years of Experience: 6-10

Type: Permanent

The Data & AI Ops Engineer provides end-to-end technical expertise in designing, building, and operating scalable data and AI-enabled platforms. This role ensures reliable, high-quality data pipelines, models, and integrations that support analytics, reporting, and AI-driven solutions across the organization.

The position plays a key role in data architecture, API-based integration, performance optimization, and operational support, ensuring data availability across dashboards, applications, and backend systems while meeting SLA requirements for Incident Requests (IR) and Service Requests (SR).

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and integration workflows for analytics, reporting, and operational systems
  • Ensure data quality, consistency, and reliability through validation, cleansing, and transformation processes
  • Collaborate with developers, business analysts, and system owners to define data requirements and deliver business-aligned solutions
  • Optimize data storage and retrieval performance across databases, data lakes, and cloud platforms for batch and real-time processing
  • Build, deploy, and enhance data models, APIs, and ETL/ELT frameworks in line with architecture and governance standards
  • Monitor and resolve data-related IRs and SRs in accordance with SLAs and operational expectations
  • Support data-centric IT initiatives, including planning, coordination, risk mitigation, and stakeholder engagement
  • Maintain clear documentation for data flows, schemas, and integration logic to support audit, compliance, and knowledge sharing
  • Ensure seamless integration with core business systems and external platforms
  • Continuously evaluate and adopt emerging data and AI technologies to improve efficiency and solution performance

Key Skills & Requirements:

  • Strong understanding of data architecture, data modeling, and ETL/ELT pipeline design
  • Experience with enterprise data platforms (RDBMS required; Azure Data Factory, Databricks, Power BI are a plus)
  • Familiarity with API-based integration (REST, SOAP, MQ)
  • Knowledge of cloud platforms (Azure, AWS, or GCP)
  • Understanding of data governance, security, privacy, lineage, and compliance requirements
  • Awareness of AI/ML fundamentals, including chatbot solutions, predictive analytics, and model deployment
  • Experience in BI and reporting frameworks for visualization and decision support
  • Strong documentation, communication, and stakeholder engagement skills
  • Ability to operate in dynamic environments, troubleshoot issues, and maintain operational stability

Education & Experience:

  • Bachelor's Degree in Computer Science, Information Systems, Data Analytics, or a related field
  • Minimum 6 years of hands-on experience in data engineering or data integration
  • At least 3 years designing and supporting enterprise-scale data pipelines

OR

  • Diploma in a relevant discipline
  • Minimum 8 years of practical experience in data engineering or related domains
  • At least 4 years in a senior or technical lead role supporting cross-functional data initiatives

Job ID: 137327059