Data Migration Engineer

Category: Big Data
Location: Hybrid - 1-2 days a week in London

Salary: £70-90k + 10% Bonus

Job Type: Permanent

Sponsorship: Not Available

Role Summary:

We are looking for a Data Migration Engineer to help us deliver a multi-system, multi-client, company-wide migration. In this role you will work closely with the owners of our legacy platforms to understand their data, then dive in and create the queries and processes needed to move everything to the new platform. As migration work wraps up, the role will transition to leveraging Airflow and other tooling to automate operational and financial processes. This is a great role for exceptionally outgoing individuals, as all of the work requires extensive collaboration with engineers and stakeholders across many teams.

Key Responsibilities:

  • Collaborate with engineers of legacy systems to understand what data is available and the form it takes
  • Design queries & scripts to extract and transform data from the legacy systems into the new platform
  • Collaborate with our Principal Data Engineer to come up with a cold-storage solution for data that will need to persist for regulatory reasons
  • Work with our solutions architect to design the data processes to be run on the day each partner is migrated
  • Build and maintain ETLs and workflow automation using Snowflake and Apache Airflow
  • Thoroughly document all learnings and work-in-progress using version control
  • Implement data quality checks and reconciliations at all steps in the process
  • Collaborate with platform/infra teams on security, access controls, and secrets management
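To give a flavour of the reconciliation work described above, here is a minimal sketch of the kind of check that might run after each migration step. All names (`reconcile`, `account_id`, `balance`) are hypothetical and not taken from the role description; a real implementation would run against Snowflake extracts inside an Airflow task.

```python
from decimal import Decimal

def reconcile(source_rows, target_rows, key="account_id", amount="balance"):
    """Compare a source extract against the migrated target extract.

    Returns a report of keys missing from the target, keys that appear
    only in the target, and keys whose amounts do not match exactly.
    Amounts are compared as Decimals to avoid float rounding surprises.
    """
    src = {r[key]: Decimal(str(r[amount])) for r in source_rows}
    tgt = {r[key]: Decimal(str(r[amount])) for r in target_rows}

    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "amount_mismatches": sorted(
            k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
        ),
    }
```

An empty report across all three buckets would be the signal to proceed; anything else would block the partner's cut-over for investigation.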

Required Qualifications:

  • 5+ years of experience in data engineering, analytics engineering, or backend engineering with significant data pipeline ownership
  • Demonstrated past success in a data role with heavy stakeholder management
  • Excellent communication skills, able to translate technical knowledge to non-technical colleagues
  • Hands-on experience building and operating production workflows in Apache Airflow
  • Hands-on experience delivering production-grade transformations in dbt, including tests and documentation
  • Strong SQL skills and experience working across multiple SQL dialects
  • Experience with data modeling and warehousing concepts (facts/dimensions, slowly changing dimensions, incremental loading)
  • Proficiency with Git-based workflows, code reviews, and CI/CD practices

Preferred Qualifications:

  • Experience with Azure cloud services
  • Experience with streaming/event-driven systems (e.g., Kafka, Kinesis)
  • Experience with infrastructure as code (e.g., Terraform) and containerization (e.g., Docker, Kubernetes)
  • Familiarity with data governance, cataloging, lineage, and privacy/security best practices
  • Experience with SQL Server in a past role
  • Familiarity with Snowflake and/or PostgreSQL
