Data Engineer at BBC Media Action

December 2, 2025

Job Description

Job Purpose

  • The BBC Media Action office in Nigeria is looking for a Data Engineer to lead the design and initial implementation of a database that brings together monitoring data across two projects.
  • The post-holder will build a consolidated real-time data management dashboard system for the Changing the Script project.
  • This is a six-month full-time position to set up the system, with part-time work expected after the initial six months.
  • We are seeking a skilled data engineer to design and implement a unified database system. The role focuses on building a robust, well-structured database with clear labelling, standardised schemas, and accurate metadata to enable reliable reporting, analytics, and insight generation.
  • The position requires expertise in ETL pipelines, near real-time data handling, large-scale data processing, and social listening workflows.
  • The role will report to the Head of Research and Learning, Nigeria, with a dotted line to the Project Director, and will be accountable for all deliverables to the Project Director.
  • The post-holder will work closely with the Research and Data Manager leading the research and with partner organisations, and will collaborate with the cross-country technical support team (digital, research) and broader BBC specialisms to ensure the database meets operational and reporting needs.

Main Duties and Responsibilities

  • Design, build, and maintain central databases/data warehouses using PostgreSQL or Azure SQL, optimised for indexing, querying, and scalability.
  • Develop and manage ETL pipelines using Python to ingest, clean, transform, and harmonise data from surveys, outreach, digital analytics, social listening, and partner submissions.
  • Process near real-time and large-scale datasets efficiently, ensuring data integrity and performance.
  • Build and maintain well-labelled and standardised schemas with clear metadata to support reporting, analytics, and downstream dashboards.
  • Handle text-heavy social listening data, including cleaning, deduplication, sentiment scoring, and calculation of social media metrics.
  • Prepare datasets for Power BI with star-schema models, transformations, and calculated measures, supporting dashboards while maintaining a primary focus on database architecture.
  • Ensure data quality, governance, and security, including role-based access, logging, GDPR compliance, and documentation.
  • Collaborate with London teams and partner organisations to standardise data submission and ensure usability.

Required skills, knowledge, and experience

  • Strong experience designing relational databases or data warehouses, specifically PostgreSQL or Azure SQL.
  • Expertise in building and optimising ETL pipelines using Python.
  • Advanced SQL skills, including indexing, performance tuning, and working with large-scale data.
  • Experience importing, cleaning, and automating integration of SPSS survey datasets.
  • Experience handling social listening data: text-heavy data processing, deduplication, sentiment analysis, and calculation of social media metrics.
  • Familiarity with Power BI for dataset preparation, data modelling, and DAX measures.
  • Ability to design well-labelled, structured databases with standardised schemas, metadata, and clear data maps.
  • Experience working with near real-time data pipelines and ensuring scalability for large datasets.