Role Summary
We are seeking an experienced Solution Architect / Databricks Technical Lead with strong expertise in Azure, Databricks, and Microsoft Fabric to design, build, and lead the delivery of scalable data platforms. The role involves defining end-to-end data architecture, guiding development teams, and ensuring best practices for performance, security, and cost optimization.
Key Responsibilities
- Design and architect end-to-end data solutions using Azure Databricks, Microsoft Fabric, and Azure-native services
- Lead technical design discussions and act as the Databricks technical SME
- Define data ingestion, transformation, and analytics architectures (batch & streaming)
- Implement Lakehouse architecture using Delta Lake (e.g., medallion Bronze/Silver/Gold layers)
- Guide teams on best practices, coding standards, and performance tuning
- Collaborate with business stakeholders to translate requirements into technical solutions
- Ensure data security, governance, and compliance across platforms
- Support CI/CD pipelines and DevOps practices for data platforms
- Mentor junior engineers and conduct technical reviews
Required Skills
- Strong hands-on experience with Azure Databricks (Spark, Delta Lake, MLflow)
- Solid knowledge of Microsoft Fabric (OneLake, Lakehouse, Data Engineering & Analytics)
- Deep understanding of Azure services (Azure Data Factory, ADLS Gen2, Synapse Analytics, Key Vault)
- Expertise in Python / PySpark / SQL
- Experience with data modeling, ETL/ELT, and large-scale data processing
- Knowledge of cloud security, IAM, networking, and cost optimization
Good to Have
- Experience with real-time streaming (Kafka / Azure Event Hubs)
- Exposure to AI/ML workloads on Databricks
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect Expert)
- Experience working in enterprise-scale environments