Job type: 12-month contract | Hybrid
Location: Kuala Lumpur
Responsibilities:
Platform Provisioning
- Provision and configure Microsoft Fabric capacities, workspaces, semantic models, and supporting Azure services (e.g., Storage, Key Vault); a provisioning sketch follows this list.
- Deploy the Agile Insights Fabric Admin and Governance Pack and the ETL framework to support robust data platform operations.
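For illustration, a minimal sketch of scripted workspace provisioning against the public Microsoft Fabric REST API, assuming an existing capacity and suitable permissions. The `capacityId` field in the request body and the placeholder GUID are assumptions to verify against current Fabric documentation; deployment of the Agile Insights pack itself is product-specific and not shown.

```python
# Sketch: create a Fabric workspace on a given capacity via the Fabric REST API.
# CAPACITY_ID is a hypothetical placeholder supplied by your Fabric admin.
import requests
from azure.identity import DefaultAzureCredential

FABRIC_API = "https://api.fabric.microsoft.com/v1"
CAPACITY_ID = "<your-capacity-guid>"  # placeholder, not a real value

def create_workspace(display_name: str) -> dict:
    # DefaultAzureCredential resolves az login, managed identity, or env vars.
    token = DefaultAzureCredential().get_token(
        "https://api.fabric.microsoft.com/.default"
    ).token
    resp = requests.post(
        f"{FABRIC_API}/workspaces",
        headers={"Authorization": f"Bearer {token}"},
        json={"displayName": display_name, "capacityId": CAPACITY_ID},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    ws = create_workspace("analytics-dev")
    print(ws["id"], ws["displayName"])
```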
Data Integration & Manipulation
- Establish connectivity between the data platform and external data sources, ensuring secure and reliable access.
- Design and implement ingestion pipelines for structured/unstructured data.
- Develop data transformations across the bronze, silver, and gold layers to deliver scalable, high-quality, business-ready datasets (see the sketch after this list).
- Manage reference data integration and ensure alignment across domains.
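As a concrete example of the layered transformations above, here is a minimal medallion-architecture sketch in PySpark with Delta Lake on Databricks (the stack named under Requirements). The storage path, table names (bronze.raw_orders, silver.orders, gold.daily_revenue), and columns are hypothetical placeholders, not artifacts of this engagement.

```python
# Sketch: bronze -> silver -> gold transformations with PySpark + Delta Lake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw files as-is, preserving source fidelity.
(spark.read.json("abfss://landing@yourlake.dfs.core.windows.net/orders/")
      .withColumn("_ingested_at", F.current_timestamp())
      .write.mode("append").format("delta").saveAsTable("bronze.raw_orders"))

# Silver: deduplicate, enforce types, and drop records missing the key.
(spark.table("bronze.raw_orders")
      .dropDuplicates(["order_id"])
      .withColumn("order_date", F.to_date("order_date"))
      .filter(F.col("order_id").isNotNull())
      .write.mode("overwrite").format("delta").saveAsTable("silver.orders"))

# Gold: business-ready aggregate for reporting and semantic models.
(spark.table("silver.orders")
      .groupBy("order_date")
      .agg(F.sum("amount").alias("daily_revenue"))
      .write.mode("overwrite").format("delta").saveAsTable("gold.daily_revenue"))
```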
Validation & Governance
- Support user acceptance testing (UAT) cycles, identifying and addressing data quality and integration issues (an example check follows this list).
- Facilitate Change Advisory Board (CAB) endorsement, ensuring architecture and deployment readiness.
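The following is a minimal sketch of the kind of data-quality check run during UAT: a row-count guard and a null-rate threshold on a key column. The table and column names reuse the illustrative placeholders from the medallion sketch above.

```python
# Sketch: simple UAT-style data-quality checks on a silver table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def check_quality(table: str, key_col: str, max_null_rate: float = 0.01) -> None:
    df = spark.table(table)
    total = df.count()
    nulls = df.filter(F.col(key_col).isNull()).count()
    null_rate = nulls / total if total else 0.0
    # Fail fast so issues surface during UAT, not in production.
    assert total > 0, f"{table}: table is empty"
    assert null_rate <= max_null_rate, (
        f"{table}.{key_col}: null rate {null_rate:.2%} exceeds {max_null_rate:.2%}"
    )

check_quality("silver.orders", "order_id")
```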
Production Deployment & Knowledge Transfer
- Deploy solutions into production, establishing proactive monitoring to maintain operational continuity.
- Produce clear as-built documentation of data architecture, pipelines, and processes.
- Conduct knowledge transfer sessions with client teams to ensure sustainable operations.
Requirements:
- Must have: Azure, Databricks & Python
- Proven experience as a Data Engineer working with Microsoft Fabric, Azure Data Services, and modern data platforms.
- Strong knowledge of data ingestion, transformation, and orchestration frameworks.
- Experience building data pipelines across bronze, silver, and gold layers.
- Proficiency in SQL, data modelling, and handling structured/unstructured data.
- Familiarity with security, governance, and deployment best practices in enterprise data environments.
- Hands-on experience supporting UAT, CAB processes, and production deployments.