
Job Description

About the company:

RPG Commerce is a leading multi-brand, omnichannel powerhouse redefining the consumer landscape across Southeast Asia and beyond. With a portfolio of household names like Montigo, OiYO by MONTIGO, and Casebang, we have scaled from a digital-first startup to a regional force with over 70 retail locations across Malaysia and Singapore. Our footprint spans Malaysia, Singapore, Indonesia, Thailand, and the UAE, giving us a unique vantage point on global consumer trends.

About the role:

Reporting to the Strategy Manager, you will be the architect of RPG's data universe. You aren't just moving data; you are building the engine that powers our decision-making. You will centralize data from 10+ sources (Shopify, Shopee, TikTok, Xilnex, our Warehouse Management System, and more) into a high-performance cloud warehouse, develop API connections to extract data, transform that data, and create the visualizations our leadership team relies on daily. We are looking for a builder who loves zero-to-one challenges and wants to automate the mundane to make room for the extraordinary.
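
To make the role concrete, here is a minimal sketch of the extract-and-load step described above: pull records from a storefront API and land them raw in a cloud warehouse, leaving transformation to SQL downstream. It assumes a Shopify-style REST endpoint and Google BigQuery as the warehouse; the shop URL, token, and table name are illustrative placeholders, not RPG's actual configuration.

    import requests
    from google.cloud import bigquery

    # Illustrative placeholders -- not real credentials or table names.
    SHOP_URL = "https://example-store.myshopify.com/admin/api/2024-01/orders.json"
    ACCESS_TOKEN = "shpat_..."  # hypothetical Shopify Admin API token
    TABLE_ID = "my-project.raw_commerce.shopify_orders"

    def extract_orders():
        """Pull one page of orders from the storefront API."""
        resp = requests.get(
            SHOP_URL,
            headers={"X-Shopify-Access-Token": ACCESS_TOKEN},
            params={"status": "any", "limit": 250},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["orders"]

    def load_to_warehouse(rows):
        """Append raw JSON rows to a BigQuery staging table (ELT: transform later in SQL)."""
        client = bigquery.Client()
        job = client.load_table_from_json(
            rows,
            TABLE_ID,
            job_config=bigquery.LoadJobConfig(
                write_disposition="WRITE_APPEND",
                autodetect=True,
            ),
        )
        job.result()  # wait for the load job to finish

    if __name__ == "__main__":
        load_to_warehouse(extract_orders())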

Key Responsibilities:

  1. Design and deploy scalable ELT (extract/load/transform) pipelines using GCP, Airbyte, Python, and SQL to centralize multi-channel commerce data into a cloud data warehouse.
  2. Build and maintain API connections to bridge gaps between third-party tools and our internal data ecosystem.
  3. Build and maintain high-impact dashboards in various data visualization tools.
  4. Identify manual bottlenecks across the organization and deploy automation scripts or tools to streamline operations.
  5. Build scalable data models and warehouses, and develop audits for data integrity and quality, implementing alerting and anomaly detection as necessary (a minimal example follows this list).
  6. Develop tooling to enable our CI/CD workflow, and establish best practices for version control (Git), CI/CD, and documentation so the system survives as the team grows.
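
As an illustration of the audit work in item 5, here is a minimal freshness-and-volume check, again assuming BigQuery as the warehouse and a hypothetical Slack webhook for alerting (the table name, webhook URL, and the 100-row threshold are placeholders for illustration only).

    import requests
    from google.cloud import bigquery

    # Hypothetical names for illustration only.
    TABLE = "my-project.raw_commerce.shopify_orders"
    SLACK_WEBHOOK = "https://hooks.slack.com/services/..."

    def audit_orders():
        """Flag stale or abnormally low order volume in the staging table."""
        client = bigquery.Client()
        query = f"""
            SELECT
              COUNT(*) AS rows_today,
              MAX(created_at) AS latest_record
            FROM `{TABLE}`
            WHERE DATE(created_at) = CURRENT_DATE()
        """
        row = next(iter(client.query(query).result()))
        # Naive anomaly rule: alert if fewer than 100 orders landed today.
        if row.rows_today < 100:
            requests.post(SLACK_WEBHOOK, json={
                "text": f"Data audit: only {row.rows_today} orders today "
                        f"(latest record: {row.latest_record}).",
            })

    if __name__ == "__main__":
        audit_orders()

In practice a check like this would run on a schedule (e.g. via an orchestrator such as GCP Workflows) and compare against historical baselines rather than a fixed threshold.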

Requirements:

  • 3+ years of software development, data engineering, or related experience manipulating, processing and extracting value from large datasets.
  • Demonstrated strength in data modelling, ETL/ELT development, and data warehousing.
  • Experience writing complex SQL statements and developing in Python.
  • Experience with at least one major cloud platform, such as GCP, AWS, or Microsoft Azure.
  • Experience with data modelling and orchestration tools such as Airbyte and GCP Workflows.
  • Experience with software configuration management (Git) and CI/CD tools.
  • Takes pride in efficient designs and accurate results.
  • Ability to objectively analyze the pros, cons, and trade-offs of a design path, and partner with team members to arrive at the optimal solution.
