Benefits:
Hybrid Work Arrangement (Work from Home and Office)
Regular Company Events (Sports Tournaments, Outdoor Activities)
Gym, Dental, and Optical Benefits
Responsibilities:
Lead and support client project implementations through advanced analytics, project planning, and clear communication, with a focus on data integration and transformation using ETL/ELT processes.
Design, develop, and optimize robust data pipelines and workflows for data warehouse, data mart, and (optionally) data lakehouse environments, ensuring high performance, scalability, and data quality.
Develop, optimize, and maintain complex SQL queries, stored procedures, and database objects for data transformation, aggregation, and reporting. Implement indexing, partitioning, and performance tuning for large-scale datasets.
Utilize and manage ETL tools (e.g., Azure Data Factory, Microsoft SSIS, Qlik Talend, Informatica) to extract, transform, and load data from diverse sources, ensuring data integrity and consistency (a short illustrative sketch follows this list).
Build and maintain advanced dashboards and reports using Microsoft Power BI, leveraging DAX, Power Query, and custom visuals to deliver actionable insights. Implement row-level security, data modeling, and interactive analytics features.
Leverage experience with cloud data platforms such as Snowflake or Databricks for scalable data processing, real-time analytics, and integration with machine learning workloads (preferred but not required).
Interpret and analyze data to identify trends, optimize supply chain and business processes, and provide actionable recommendations to clients.
Collaborate with project leads, system architects, and client teams to establish data best practices and success criteria and to ensure alignment with business objectives.
Communicate key results and insights to stakeholders in verbal, visual, and written formats, translating complex technical findings into business value.
Support research initiatives for solution design, continuous improvement, and adoption of new data technologies.
Demonstrate willingness to learn new applications such as SAP's BI solutions.
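For illustration only, and not part of the role's scope: a minimal sketch of the extract-transform-load pattern referenced above, written in Python with pandas. The file names, columns, and aggregation are invented for the example.

import pandas as pd

# Extract: read raw orders from a flat-file source (hypothetical path and schema).
orders = pd.read_csv("orders_raw.csv", parse_dates=["order_date"])

# Transform: drop incomplete records and aggregate revenue by day,
# the kind of cleansing/aggregation step an ETL pipeline performs.
orders = orders.dropna(subset=["customer_id"])
daily_totals = (
    orders.groupby(orders["order_date"].dt.date)["amount"]
    .sum()
    .reset_index(name="daily_revenue")
)

# Load: persist the transformed data for downstream reporting or a data mart.
daily_totals.to_csv("daily_revenue.csv", index=False)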
Requirements:
4 to 10 years of relevant experience in data analytics, data engineering, or related fields.
Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Strong analytical, problem-solving, and communication skills, with the ability to translate business requirements into technical solutions.
Proven expertise in data warehouse architecture design and implementation.
Proven expertise in ETL/ELT process development and advanced SQL, including writing, optimizing, and troubleshooting stored procedures, functions, and complex queries.
Hands-on experience with ETL tools (e.g., Azure Data Factory, Qlik Talend, Microsoft SSIS, Informatica) and data visualization tools (Microsoft Power BI, Tableau, or similar), including advanced skills in data modeling, DAX, Power Query, and dashboard/report design and deployment.
Strong attention to detail, documentation, and testing skills.
Good communication and teamwork abilities; willingness to learn from senior colleagues and adapt to new tools and technologies.
Ability to support multiple tasks, prioritize effectively, and work collaboratively within a project environment.
Hands-on experience with cloud computing platforms such as AWS, Azure, or GCP is a strong advantage.
Familiarity with data lakehouse concepts and real-time data processing, including experience with cloud data platforms such as Snowflake or Databricks, is a plus.
CI/CD pipeline development experience is highly desirable.
Understanding of SAP systems (especially SAP HANA); experience with SAP Datasphere or SAP Analytics Cloud (SAC) is a plus.
Experience with accounting terminology and concepts is an advantage.
Experience coding in Python with data-related libraries (e.g., Pandas, PyMySQL) is a plus (see the sketch below).
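As a purely illustrative sketch of the Python/Pandas/PyMySQL experience mentioned above: the connection parameters, table, and columns below are placeholders, not specifics of this role.

import pandas as pd
import pymysql

# Connect to a MySQL database (all connection parameters are placeholders).
conn = pymysql.connect(
    host="localhost",
    user="analyst",
    password="secret",
    database="warehouse",
)

try:
    # Run an aggregation query and fetch the result set.
    with conn.cursor() as cur:
        cur.execute(
            "SELECT region, SUM(sales) AS total_sales "
            "FROM fact_sales GROUP BY region"
        )
        rows = cur.fetchall()
finally:
    conn.close()

# Load the rows into a DataFrame for further analysis or reporting.
df = pd.DataFrame(rows, columns=["region", "total_sales"])
print(df.sort_values("total_sales", ascending=False))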