Responsibilities:
- Design, build, and maintain reliable, scalable ETL/ELT data pipelines for automated data ingestion, transformation, and validation into the data warehouse.
- Develop data integration/data flow systems to manage different sources (databases, APIs, files) and to ensure data consistency and quality.
- Develop customized data processing programs/APIs using Python or JavaScript.
- Propose and implement the most appropriate architecture for batch and real-time data ingestion and processing.
- Implement data quality checks, logging, and monitoring to ensure reliability and transparency of data flows.
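For illustration only: below is a minimal sketch, in Python, of the kind of batch ETL step with data quality checks and logging described in the responsibilities above. All names (fetch_records, validate, load_to_warehouse) and the sample data are hypothetical, not part of the actual role or its systems.

# Illustrative sketch only: a minimal batch ETL step with validation and logging.
# Function names and sample data are hypothetical placeholders.
import logging
from typing import Iterable

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def fetch_records() -> list[dict]:
    # Stand-in for an extract step (database query, API call, or file read).
    return [
        {"id": 1, "amount": 120.5, "currency": "MYR"},
        {"id": 2, "amount": None, "currency": "MYR"},  # fails the validation below
    ]

def validate(rows: Iterable[dict]) -> list[dict]:
    # Simple data quality check: keep rows with required fields present,
    # and log rejected rows so the run stays auditable.
    good, bad = [], []
    for row in rows:
        if row.get("id") is not None and row.get("amount") is not None:
            good.append(row)
        else:
            bad.append(row)
    if bad:
        log.warning("rejected %d invalid row(s): %s", len(bad), bad)
    return good

def load_to_warehouse(rows: list[dict]) -> None:
    # Stand-in for a load step (e.g. a bulk insert into a warehouse table).
    log.info("loaded %d row(s)", len(rows))

if __name__ == "__main__":
    clean = validate(fetch_records())
    load_to_warehouse(clean)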
Requirements / Qualifications:
- At least a Degree in Computer Science, Information Technology, or an equivalent field.
- Strong SQL knowledge and a minimum of 1+ year of working experience with relational databases, query authoring (SQL), and query optimization.
- Hands-on experience in coding/scripting development (e.g., ETL, PowerShell, SQL, Python, JavaScript, and SQL stored procedures).
- Proficient in SQL scripting, with experience working with relational databases.
- Experience working with big data is an added advantage.
- Experience with Google Cloud Platform and BigQuery is an added advantage.