
gxbank

Senior Data Engineer

  • Posted 3 hours ago

Job Description

About the Bank:

We are a growing regional digital bank group revolutionizing banking services across Southeast Asia. Our mission is to unlock big dreams and drive financial inclusion throughout the region. As a regional digital bank, we have the right foundation of data, technology, and trust, because we are Built With Heart. We believe that real impact starts with real people. If you're ready to Own The Mission and help us shape the future of digital banking, we invite you to join us.

Responsibilities

  • Design, develop, test, deploy, and maintain robust and scalable ELT data pipelines using dbt (data build tool) for data transformation within Snowflake.
  • Orchestrate and schedule complex data workflows using Apache Airflow, ensuring timely and reliable data delivery.
  • Develop connectors and scripts (primarily in Python) to extract data from various source systems (APIs, databases, files, streaming platforms) and load it into Snowflake.
  • Implement data ingestion strategies (batch and streaming) using Snowflake capabilities (e.g., Snowpipe, external stages).
  • Optimize Snowflake warehouse usage, query performance, and overall data platform efficiency.
  • Manage and monitor Snowflake resources, ensuring cost-effectiveness and scalability.
  • Implement and enforce data governance, security (e.g., RBAC, data masking), and privacy best practices within Snowflake.
  • Assist in schema design, table optimization (clustering, partitioning), and data loading strategies.
  • Solve key business problems by applying an appropriate mix of strategic thinking and computational methods.
  • Develop and uphold best practices for change management, documentation, and data protocols.
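As an illustrative sketch of the batch-loading and reliability duties above, the snippet below shows an idempotent upsert-style load, the pattern that lets a pipeline safely replay a batch. Python's built-in sqlite3 stands in for Snowflake here, and the table and field names are hypothetical, not the bank's actual schema.

```python
import sqlite3

# Hypothetical extracted records, standing in for rows pulled from an
# upstream API, database, or file source.
source_records = [
    {"account_id": "A-1", "balance": 120.0},
    {"account_id": "A-2", "balance": 55.5},
]

# In-memory SQLite stands in for a Snowflake staging table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE stg_accounts (account_id TEXT PRIMARY KEY, balance REAL)"
)

def load_batch(conn, records):
    """Idempotent batch load: replaying the same batch never duplicates rows."""
    conn.executemany(
        "INSERT INTO stg_accounts (account_id, balance) "
        "VALUES (:account_id, :balance) "
        "ON CONFLICT(account_id) DO UPDATE SET balance = excluded.balance",
        records,
    )
    conn.commit()

load_batch(conn, source_records)
load_batch(conn, source_records)  # replay is safe: upsert keeps one row per key

row_count = conn.execute("SELECT COUNT(*) FROM stg_accounts").fetchone()[0]
```

The upsert keyed on the natural primary key is what makes the load re-runnable, which matters when an orchestrator such as Airflow retries a failed task.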

Requirements

  • Bachelor's or Master's degree in Analytics, Data Science, Mathematics, Computer Science, Information Systems, Computer Engineering, or a related technical field.
  • Demonstrated mastery of complex SQL queries, analytical functions, stored procedures, and performance tuning.
  • 5+ years of hands-on experience with SQL and data warehouse/data lake platforms, including data loading, transformations, performance optimization, and security features.
  • Proven experience in building and managing complex data transformation pipelines using dbt, including Jinja templating, macros, tests, and documentation.
  • Solid experience in designing, developing, and deploying production-grade data pipelines using Apache Airflow (DAGs, Operators, Sensors, XComs).
  • Strong Python scripting skills for data manipulation, API integrations, and Airflow DAG development.
  • Analytical, independent problem solver with meticulous attention to detail.
  • Strong communicator, able to switch between technical detail and business-friendly language.
  • Solid understanding of data warehousing concepts, dimensional modeling (star/snowflake schemas), and data lake architectures.
  • Deep understanding of Extract, Load, Transform (ELT) or ETL principles and best practices.
  • Familiarity with data quality frameworks, data lineage, and data governance principles.
  • Experience working in a digital banking or financial services environment is highly advantageous, with an understanding of financial data concepts and regulatory requirements.
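To illustrate the analytical-SQL skills listed above, here is a window-function query computing a per-account running balance, executed against an in-memory SQLite database as a stand-in for Snowflake. All table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account_id TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [("A-1", 1, 100.0), ("A-1", 2, -30.0), ("A-2", 1, 50.0)],
)

# Running balance per account: an analytical (window) function
# partitioned by account and ordered by transaction time.
rows = conn.execute(
    """
    SELECT account_id, ts,
           SUM(amount) OVER (
               PARTITION BY account_id ORDER BY ts
           ) AS running_balance
    FROM txns
    ORDER BY account_id, ts
    """
).fetchall()
```

The same PARTITION BY / ORDER BY pattern carries over to Snowflake, where window functions are a routine part of transformation logic in dbt models.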


Job ID: 146624469
