Bristol, UK (Hybrid, 3 days/week in office)
£50-60k dependent on experience
Our client is a TV analytics company based in Bristol, specialising in extracting unique insights from viewing data for both video-on-demand (VoD) platforms, such as Netflix and Amazon Prime Video, and traditional channels, such as BBC 1 and ITV.
In 2019, they shook up the industry with an award-winning project that redefined audience measurement on a global scale. Now, with big expansion plans on the horizon, they're on the hunt for a Data Engineer to help shape the next era of video-on-demand analytics.
We are seeking a Data Engineer to join their Engineering team and contribute to the ongoing development of their industry-leading video-on-demand analytics platform.
This role involves building and maintaining robust data pipelines that transform raw viewing data into high-quality, actionable insights. Core responsibilities include data ingestion, transformation, quality assurance, and the implementation of monitoring and validation processes to ensure consistent delivery of reliable data to their clients.
In addition to technical responsibilities, the successful candidate will be expected to take ownership of key product areas, helping to guide the evolution of their platform in alignment with strategic priorities. You will work closely with the Development Team Lead and the Insights team to ensure day-to-day technical delivery is coherent, prioritised, and responsive to business needs.
We are also looking for someone who can support or lead aspects of their agile delivery process, acting in a Scrum Master capacity where appropriate. This includes facilitating team ceremonies (such as stand-ups, sprint planning, and retrospectives), helping remove delivery blockers, and fostering an environment of continuous improvement.
This role offers the opportunity to work across their full technology stack, from data pipeline engineering to metadata enrichment, within a collaborative and high-performing team. It is ideally suited to someone who combines strong technical ability with clear communication, organisational skills, and a proactive mindset.
Technically, our ideal candidate would have:
- Demonstrated experience in a data-focused engineering role
- Deep experience with Python for data transformation
- Expert SQL abilities
- Experience working with Snowflake
- Comfort working with Git, GitHub & Jira
- A deep understanding of working with third-party APIs (REST and GraphQL)
- A detailed understanding of CI/CD practices & tooling
- A collaborative mindset & an interest in coaching & mentoring fellow engineers
Bonus points for candidates with experience in:
- Data orchestration, ingestion and transformation tools, e.g. Dagster, Prefect, Fivetran, Keboola, dbt/dbt Cloud
- Data visualisation – we use Looker/LookML
- Development in Node.js, TypeScript, or Kotlin
- AWS and developing infrastructure-as-code (Terraform)
- A research or mathematical background and an interest in or understanding of AI & ML