
Job Description

Company Intro:

Encore Med is a health-tech company founded in 2016 that focuses on innovating the digital experience for healthcare operations, with a strong portfolio of transforming business operations and processes for hospitals across the region.

Encore Med's vision is to help hospitals and healthcare institutions provide patients with a simpler and smarter healthcare experience. With a team of 30, Encore Med has been building sophisticated engines to handle complex hospital processes and continues to innovate new products, ensuring that the company's products and services keep pace with technological advancements, primarily in automation, artificial intelligence, and the Internet of Things.

Job Summary:

The Data Engineer is responsible for designing, building, and maintaining the data infrastructure that supports analytics, machine learning, reporting, and operational applications. This role ensures high-quality, reliable, and scalable data pipelines so that downstream teams of Data Scientists, Analysts, Engineers, and Product Managers can access clean, structured, and governed data.

Job Requirements

Data Pipeline Development

  • Design, build, and maintain robust ETL/ELT pipelines for ingesting data from multiple sources (APIs, databases, IoT devices, application logs, HL7/FHIR, etc.).
  • Ensure pipelines are reliable, fault-tolerant, and scalable.
  • Automate data transformation, validation, and quality checks (see the sketch after this list).
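
For illustration only, a minimal Python sketch of the ingest-validate-load pattern such a pipeline automates; the source URL, field names, and the SQLite staging database are hypothetical stand-ins, not part of Encore Med's actual stack:

```python
import json
import sqlite3
from datetime import datetime, timezone

import requests  # assumed available; any HTTP client would do

SOURCE_URL = "https://example.com/api/v1/appointments"  # hypothetical source API
REQUIRED_FIELDS = {"appointment_id", "patient_id", "scheduled_at"}


def extract(url: str) -> list[dict]:
    """Pull raw records from the (hypothetical) source API."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()


def validate(records: list[dict]) -> list[dict]:
    """Keep only records that carry every required field (basic quality check)."""
    return [r for r in records if REQUIRED_FIELDS.issubset(r)]


def load(records: list[dict], db_path: str = "staging.db") -> None:
    """Load validated records into a staging table (SQLite stands in for the warehouse)."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_appointments "
        "(appointment_id TEXT PRIMARY KEY, payload TEXT, loaded_at TEXT)"
    )
    now = datetime.now(timezone.utc).isoformat()
    conn.executemany(
        "INSERT OR REPLACE INTO stg_appointments VALUES (?, ?, ?)",
        [(r["appointment_id"], json.dumps(r), now) for r in records],
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    raw = extract(SOURCE_URL)
    load(validate(raw))
```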

Data Architecture

  • Develop and maintain data models, schemas, and warehouse/lakehouse structures.
  • Optimize storage solutions using cloud-native tools (Delta Lake, BigQuery, Redshift, Snowflake, Synapse, etc.).
  • Define standards for data partitioning, indexing, and lifecycle management.

Data Quality & Governance

  • Implement data validation, deduplication, and monitoring (see the sketch after this list).
  • Ensure data compliance with regulatory requirements (HIPAA, PDPA, GDPR), which are especially important in healthcare.
  • Document data lineage, cataloguing, and metadata.
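
A minimal sketch of the validation and deduplication idea, using pandas on a hypothetical patient-visit extract (the column names are invented for illustration):

```python
import pandas as pd  # assumed; any dataframe library would work

# Hypothetical patient-visit extract with a duplicate row and a missing value.
df = pd.DataFrame(
    {
        "visit_id": ["V1", "V1", "V2", "V3"],
        "patient_id": ["P10", "P10", "P11", None],
        "ward": ["A", "A", "B", "B"],
    }
)

# Deduplicate on the business key, keeping the first occurrence.
deduped = df.drop_duplicates(subset=["visit_id"], keep="first")

# Simple validation: required identifiers must be present.
missing_patient = deduped["patient_id"].isna().sum()
if missing_patient:
    print(f"quality check failed: {missing_patient} rows missing patient_id")

# A monitoring hook could push these counts to a metrics store or alerting tool.
print({"rows_in": len(df), "rows_out": len(deduped), "null_patient_id": int(missing_patient)})
```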

Integration & Interoperability

  • Build connectors to ingest data from internal and external systems.
  • For healthcare environments: integrate HL7v2, FHIR, DICOM, and proprietary hospital systems (see the FHIR sketch after this list).
  • Work with backend engineers to ensure a unified data model across systems.
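
For illustration, a short sketch of pulling Patient resources from a FHIR R4 endpoint over its standard REST interface (GET <base>/Patient returns a Bundle with an "entry" array); the base URL is hypothetical:

```python
import requests  # assumed available

FHIR_BASE = "https://fhir.example-hospital.org/r4"  # hypothetical FHIR server


def fetch_patients(base_url: str, count: int = 50) -> list[dict]:
    """Pull a page of Patient resources from a FHIR R4 endpoint."""
    resp = requests.get(
        f"{base_url}/Patient",
        params={"_count": count},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]


if __name__ == "__main__":
    for patient in fetch_patients(FHIR_BASE):
        # Map the FHIR resource onto the unified internal model here.
        print(patient.get("id"), patient.get("birthDate"))
```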

Collaboration

  • Work closely with Data Scientists to provide feature-ready datasets.
  • Partner with Product, Engineering, and BI teams to support dashboards, real-time monitoring, and analytics.
  • Translate product/business requirements into scalable data solutions.

Performance Optimization

  • Improve pipeline throughput, reduce latency, and lower storage/computation cost.
  • Tune SQL queries, warehouse performance, and job scheduling.
  • Implement best practices for distributed processing (Spark, Flink, Beam), as in the sketch below.
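
A rough PySpark sketch of one common optimization in this vein: pruning columns early and writing date-partitioned Parquet so downstream scans touch less data. Paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F  # assumes PySpark is installed

spark = SparkSession.builder.appName("events-compaction").getOrCreate()

# Read raw event files and prune columns early to cut scan and shuffle cost.
events = (
    spark.read.parquet("s3://example-bucket/raw/events/")  # hypothetical path
    .select("event_id", "patient_id", "event_type", "event_ts")
    .withColumn("event_date", F.to_date("event_ts"))
)

# Repartition by the date key so each output partition holds one day of data,
# then write partitioned Parquet for cheaper downstream scans.
(
    events.repartition("event_date")
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/events/")
)

spark.stop()
```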

MLOps (Preferred)

  • Support model deployment pipelines, feature stores, and model monitoring.
  • Collaborate on training data pipelines, inference pipelines, and data versioning.

Job Qualifications

  • Bachelor's or Master's in Computer Science, Data Engineering, Software Engineering, Information Systems, or related fields.
  • 3–7 years of experience in data engineering or similar roles.
  • Strong foundational understanding of data structures, distributed systems, and cloud architecture.
  • Strong proficiency in SQL (analytical + optimized queries).
  • Expertise in Python or Scala.
  • Hands-on with ETL/ELT tools (Airflow, dbt, Glue, Dataflow, Synapse pipelines).
  • Experience with cloud data platforms (AWS, Azure, GCP).
  • Distributed data processing frameworks (Spark, Kafka, Beam, Flink).
  • Familiarity with data lake, warehouse, and lakehouse architectures.
  • Understanding of API data ingestion and streaming pipelines.
  • Data modelling (Kimball, Data Vault, star/snowflake schema).
  • Metadata management, cataloguing, and lineage.
  • Data quality frameworks (Great Expectations, Deequ, Soda).

Good To Have

  • Experience with healthcare data standards (HL7, FHIR, DICOM).
  • Knowledge of machine learning workflows (feature engineering, MLOps).
  • Experience with real-time data (Kafka, Kinesis, MQTT, IoT ingestion).
  • Knowledge of BI tools (Power BI, QuickSight, Tableau).

Job ID: 139671711
