Job Title: Principal Data Engineer
Location: London, UK - Hybrid (Liverpool Street office)
Department: Data & Engineering
Reports To: Head of Data Science
Employment Type: Permanent
Our client is a leading global payments and fintech platform that empowers clients to innovate and scale. Its mission is to create frictionless, secure, and intelligent payment solutions that unlock value across banking, financial services, and digital assets.
As the company continues to scale its data infrastructure, it is seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Apache Airflow), scalable data warehouse architecture, and high-performance pipelines.
The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical teams.
Key Responsibilities
- Design and lead the development of scalable, secure, and high-performance data warehouse solutions for BI and analytics.
- Define and drive the long-term architecture and data strategy in alignment with business goals.
- Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting.
- Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows.
- Champion best practices in data modelling, governance, and DevOps for data engineering (CI/CD, IaC).
- Serve as a key communicator between technical teams and business stakeholders, translating complex data needs into actionable plans.
- Mentor and lead data engineers, fostering continuous learning and technical excellence.
- Ensure compliance with data security, privacy, and regulatory standards (e.g., PCI-DSS, GDPR).
Requirements
Essential:
- 7+ years of data engineering experience, including 2+ years in a Principal or Lead role.
- Proven experience designing and delivering enterprise data strategies.
- Exceptional communication and stakeholder management skills.
- Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift).
- Hands-on experience with Apache Airflow (or similar orchestration tools).
- Strong proficiency in Python and SQL for pipeline development.
- Deep understanding of data architecture, dimensional modelling, and metadata management.
- Experience with cloud platforms (AWS, GCP, or Azure).
- Familiarity with version control, CI/CD, and Infrastructure-as-Code (Terraform or similar).
Desirable:
- Background in fintech or payments.
- Knowledge of streaming frameworks (Kafka, Kinesis).
- Experience with dbt and data quality frameworks (e.g., Great Expectations).
What's on Offer
- Competitive base salary + bonus
- Flexible hybrid working model (Liverpool Street office)
- Private healthcare & wellness benefits
- Learning and development support
- The opportunity to shape the future of payments data infrastructure