Join a dynamic Enterprise Data Organization that powers data operations and distribution for global business partners.
Be part of a team that values innovation, collaboration, and efficiency in delivering cutting-edge data solutions.
Key Responsibilities
- Design and implement customized file configurations for clients
- Provide expert troubleshooting and consultancy for API integrations
- Enable seamless data sharing into Snowflake, GCP BigQuery, and Databricks
- Lead client migrations from legacy systems to modern data delivery platforms
- Drive process optimization and automation initiatives
- Manage KPI reporting and generate insights from operational metrics
- Collaborate with Product & Development teams to resolve application issues
- Engage with senior stakeholders to ensure alignment and satisfaction
- Facilitate Agile prioritization and planning within the team
What We're Looking For
- Bachelor's degree in Computer Science, IT, or related field
- 1+ years of experience in API integration and programming (Python, Java, SQL) and data formats (JSON, XML)
- Strong knowledge of data warehouse solutions: Snowflake, GCP BigQuery, Databricks
- Excellent problem-solving and troubleshooting skills
- Strong communication skills for client and stakeholder engagement
Preferred Qualifications
- Experience with operational tooling and automation
- Understanding of enterprise application development
- Familiarity with Azure DevOps, Splunk, Salesforce, Microsoft Excel, Power BI
- Ability to generate insights from operational metrics to drive performance