Data Science Intern
Internship Responsibilities
- Coordinate data and annotation workflows across multiple teams and datasets, gaining exposure to end-to-end operational processes.
- Support data quality management activities, including sampling checks, multi-stage quality reviews, and tracking quality outcomes over time.
- Assist with operations for live data products, including monitoring system health, flagging anomalies, and performing routine checks to ensure continuity of service.
- Prepare and maintain operational dashboards to surface key metrics, workflow statuses, and quality indicators for internal stakeholders.
- Gather and clarify requirements from stakeholders, helping translate them into workflow steps, acceptance criteria, and supporting documentation.
- Support issue tracking and resolution, including first-level triage, escalation coordination, and follow-up to closure with relevant teams.
- Maintain operational documentation, including standard operating procedures, workflow guides, and process notes to support consistent team execution.
- Assist in preparing regular operational and quality reports for stakeholders, highlighting risks, bottlenecks, and opportunities for improvement.
- Collaborate with machine learning and engineering teams to ensure annotation outputs meet agreed specifications and delivery timelines.
- Contribute to continuous improvement initiatives aimed at simplifying processes, enhancing tooling, or reducing operational turnaround times.
Internship Requirements
Education
- Currently pursuing a Bachelor's degree in Information Systems, Data Analytics, Computer Science, Engineering, or a related field.
Core Skills
- Strong organizational skills with excellent attention to detail, capable of managing multiple tasks concurrently.
- Clear and professional written communication, including documentation and structured stakeholder updates.
- Analytical mindset with the ability to interpret operational metrics and identify emerging trends or issues.
Preferred Tools & Technical Knowledge
- Familiarity with Agile workflows, ticket management, and structured handovers (e.g., JIRA, Confluence).
- Exposure to AI/ML concepts, including datasets, annotation pipelines, or model workflows.
- Experience with annotation platforms such as Label Studio or similar tools.
- Basic proficiency with reporting/dashboard tools (e.g., Power BI, Tableau, Metabase).
- Awareness of quality measurement concepts such as inter-annotator agreement and sampling-based QA/QC.
- Working knowledge of SQL and/or Python for basic data queries and reporting.
- Exposure to cloud platforms such as AWS is a plus.