
  • Posted 8 days ago
  • Be among the first 10 applicants

Job Description

Key Responsibilities:

  • Partner with clients to understand their data landscape, challenges, and business goals
  • Design and implement scalable data pipelines, architectures, and integration workflows
  • Build ETL/ELT processes to ingest data from diverse sources
  • Develop data models and warehouse architectures that support analytics and BI
  • Conduct technical discovery sessions and translate business needs into technical solutions
  • Communicate progress and recommendations to both technical and business stakeholders
  • Document solutions and enable client teams to own what you have built
  • Contribute insights from client work to inform our internal product development

Core Technical Requirements

Essential Skills

SQL Expertise:

  • Advanced query optimisation and performance tuning
  • Complex data transformation and aggregation
  • Experience with large-scale data processing

Python Programming:

  • Data processing and manipulation libraries
  • Script automation and pipeline development
  • ETL/ELT process implementation
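The ETL/ELT work above can be sketched end to end with the standard library alone; a hypothetical CSV string stands in for a real source and SQLite for a warehouse target (all names here are illustrative, not a specific client setup):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract: in practice this would come from an API,
# a file drop, or an upstream database.
RAW_CSV = """order_id,region,amount
1,EMEA,120.50
2,APAC,75.00
3,EMEA,200.00
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[tuple]:
    """Transform: cast types and keep only the columns the target needs."""
    return [(int(r["order_id"]), r["region"], float(r["amount"])) for r in records]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write into the target table (SQLite stands in for a warehouse)."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 395.5
```

In production the same three stages would typically use pandas or Spark for transformation and an orchestrator to schedule and retry each step.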

Cloud Platform Experience

  • Experience with at least one major cloud platform (AWS, GCP, or Azure)
  • Data warehouse or data lakehouse experience (Snowflake, BigQuery, Redshift, Databricks)


Data Pipeline & Transformation

  • Hands-on experience with orchestration tools (Airflow, Prefect, Dagster)
  • Familiarity with transformation frameworks (dbt, SQLMesh, Spark)
  • Comfortable deploying and working with open-source tools
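Orchestrators like Airflow, Prefect, and Dagster all model a pipeline as a DAG of dependent tasks. As a toy illustration of that idea (pure Python via `graphlib`, not any specific tool's API, and with made-up task names):

```python
from graphlib import TopologicalSorter

# Toy pipeline DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# An orchestrator's scheduler resolves this into a valid execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load']
```

Real orchestrators add scheduling, retries, backfills, and observability on top of this dependency resolution.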

Job ID: 145029731