At a Glance
- Tasks: Design and maintain high-performance data pipelines in a modern cloud environment.
- Company: FreemarketFX provides innovative digital solutions for FX and cross-border payments.
- Benefits: Enjoy a hybrid working model with flexible office attendance and a collaborative culture.
- Why this job: Join a team that values accountability, innovation, and client-centric success.
- Qualifications: 5+ years in data engineering with expertise in Databricks, Azure Data Factory, Python, and SQL.
- Other info: Opportunity to work on cutting-edge data architecture and AI/ML initiatives.
The predicted salary is between £43,200 and £72,000 per year.
Location: Freemarket offers a hybrid working model. You should be able to attend the office in London Bridge when required, on average twice per week.
Department: Data / Engineering
Reports To: Head of Data
Employment Type: Permanent, Full-Time
About FreemarketFX
Freemarket is a provider of digital solutions for FX and cross-border payments. Anchored by deep sector expertise, rigorous compliance-led onboarding, and unmatched oversight of regulated flows, we act as a partner that values client relationships like no other. Through our proprietary digital platform, clients access an instant settlement network and seamless real-time money movement globally, within an interconnected community of like-minded companies.

At Freemarket, our success is driven by our commitment to core behaviours that shape how we work and deliver value. We take accountability, ensuring outcomes are met with urgency and transparency. Our data-driven approach blends rigorous analysis with intuition to guide sound decision-making. We encourage innovation by being curious learners, always seeking new knowledge, skills, and perspectives. We act as team players, prioritising team success over individual recognition, and our client-centric mindset ensures we consistently understand and meet our clients' needs, adding value at every step. These behaviours run through everything we do, enabling us to exceed expectations and support our clients' growth effectively.
About the Role
We are seeking a highly skilled Senior Data Engineer to design, implement, and maintain scalable, reliable, high-performance data pipelines and architectures in a modern cloud environment. You will play a key role in building and optimising our Medallion architecture (Bronze, Silver, and Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives.
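For candidates unfamiliar with the pattern, the Bronze/Silver/Gold layering can be illustrated with a minimal, Spark-free sketch. In Databricks these stages would be Spark DataFrames persisted as Delta tables; the field names (`payment_id`, `currency_pair`, `amount`) and the sample data here are purely illustrative, not Freemarket's actual schema.

```python
def to_bronze(raw_events):
    """Bronze: land raw records as-is, tagging each with ingestion metadata."""
    return [{**e, "_source": "api"} for e in raw_events]

def to_silver(bronze):
    """Silver: deduplicate on payment_id and drop records missing an amount."""
    seen, out = set(), []
    for rec in bronze:
        if rec.get("amount") is None or rec["payment_id"] in seen:
            continue
        seen.add(rec["payment_id"])
        out.append(rec)
    return out

def to_gold(silver):
    """Gold: business-level aggregate (volume per currency pair) for analytics."""
    totals = {}
    for rec in silver:
        pair = rec["currency_pair"]
        totals[pair] = totals.get(pair, 0) + rec["amount"]
    return totals

events = [
    {"payment_id": 1, "currency_pair": "GBP/USD", "amount": 100.0},
    {"payment_id": 1, "currency_pair": "GBP/USD", "amount": 100.0},  # duplicate
    {"payment_id": 2, "currency_pair": "EUR/GBP", "amount": None},   # invalid
    {"payment_id": 3, "currency_pair": "GBP/USD", "amount": 50.0},
]
print(to_gold(to_silver(to_bronze(events))))  # {'GBP/USD': 150.0}
```

The point of the layering is that each stage is independently queryable and re-runnable: raw history survives in Bronze, cleaning rules live in Silver, and business logic lives in Gold.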
Key Responsibilities
- ETL Development: Design and build robust and reusable ETL/ELT pipelines through the Medallion architecture in Databricks.
- Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity.
- Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes.
- System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and batch/stream processing.
- Error Handling: Design and implement effective logging, monitoring, and alerting strategies for robust error management and recovery.
- Scalability & Futureproofing: Contribute to architectural discussions and decisions, ensuring scalability, data quality, and future-proof data systems.
- Collaboration: Work closely with data analysts, finance teams, and engineers to ensure data availability and usability across business domains.
- Documentation: Maintain comprehensive documentation covering data models, architecture decisions, transformation logic, and operational procedures.
- Data Governance & Security: Ensure compliance with data security policies, data retention rules, and privacy regulations.
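As an illustration of the error-handling responsibility above, here is a minimal retry-with-logging wrapper of the kind a pipeline step might use. The step name, retry policy, and the flaky `load_rates` step are hypothetical; in a production pipeline the final failure would typically fire an alert (for example via an Azure Monitor action group) rather than only log before re-raising to the orchestrator.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(step, name, attempts=3, delay_s=0):
    """Run one pipeline step, logging each failure and retrying before giving up."""
    last_exc = None
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            last_exc = exc
            log.warning("step %s failed (attempt %d/%d): %s",
                        name, attempt, attempts, exc)
            time.sleep(delay_s)
    # In production, an alert to on-call would fire here before re-raising.
    log.error("step %s exhausted %d attempts; raising for the orchestrator",
              name, attempts)
    raise RuntimeError(f"{name} failed after {attempts} attempts") from last_exc

# Hypothetical flaky step: fails on the first call, succeeds on the second.
calls = {"n": 0}
def load_rates():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("rates API timed out")
    return {"GBP/USD": 1.27}

print(run_with_retries(load_rates, "load_rates"))  # {'GBP/USD': 1.27}
```

Keeping retries, logging, and the final alert in one wrapper means every step in the pipeline gets consistent, queryable failure records.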
Required Skills and Experience
- 5+ years of experience in data engineering or similar roles.
- Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake.
- Proficiency in dbt for transformation logic and version-controlled data modeling.
- Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with other Azure services.
- Experience with data integration (e.g. APIs, JSON, XML, web services) is essential.
- Expertise in Python and SQL for data manipulation and pipeline development.
- Hands-on experience implementing and maintaining Medallion Architecture (Bronze/Silver/Gold).
- Familiarity with CI/CD, Git version control, and agile development methodologies.
- Strong understanding of data warehousing principles, data modeling, and performance optimization.
- Experience with cron jobs, job orchestration, and error monitoring tools.
Good to have
- Experience with Azure Bicep or other Infrastructure-as-Code tools.
- Exposure to real-time/streaming data (Kafka, Spark Streaming, etc.).
- Understanding of data mesh, data contracts, or domain-driven data architecture.
- Hands-on experience with MLflow and Llama.
Senior Data Engineer, London (Hybrid). Employer: freemarketFX Limited
Contact Detail:
freemarketFX Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer, London (Hybrid) role
✨Tip Number 1
Familiarise yourself with the Medallion architecture, as it's a key focus for this role. Understanding how to effectively implement and optimise the Bronze, Silver, and Gold layers will set you apart from other candidates.
✨Tip Number 2
Showcase your experience with Databricks and Azure Data Factory in your discussions. Be prepared to discuss specific projects where you've successfully built ETL pipelines or managed data workflows, as practical examples can really impress.
✨Tip Number 3
Highlight your collaborative skills during interviews. Since the role involves working closely with data analysts and engineers, demonstrating your ability to work as part of a team and communicate effectively will be crucial.
✨Tip Number 4
Stay updated on the latest trends in data engineering, especially around cloud technologies and data governance. Being able to discuss recent developments or tools can show your commitment to continuous learning, which aligns with Freemarket's values.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with tools like Databricks, dbt, and Azure Data Factory. Use specific examples to demonstrate your skills in building ETL pipelines and managing data transformations.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role at Freemarket and how your values align with their commitment to accountability and innovation. Mention specific projects or experiences that showcase your ability to work collaboratively and drive results.
Showcase Relevant Skills: When answering application questions, emphasise your proficiency in Python and SQL, as well as your hands-on experience with Medallion Architecture. Be prepared to discuss your approach to error handling and system maintenance in data pipelines.
Be Honest About Experience: When asked about your years of experience in data engineering, provide an accurate account. If you have experience with additional tools like Azure Bicep or real-time data processing, mention these as they can set you apart from other candidates.
How to prepare for a job interview at freemarketFX Limited
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Databricks, dbt, and Azure Data Factory in detail. Highlight specific projects where you've implemented ETL pipelines or worked with the Medallion architecture, as this will demonstrate your hands-on expertise.
✨Understand the Company Culture
Freemarket values accountability, curiosity, and teamwork. During the interview, share examples of how you've embodied these behaviours in your previous roles. This will show that you align with their core values and can contribute positively to their team.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving skills, especially related to data pipeline issues or error handling. Think of scenarios where you've successfully resolved challenges in data engineering and be ready to explain your thought process.
✨Ask Insightful Questions
Prepare thoughtful questions about the team's current projects, challenges they face, and how they measure success. This not only shows your interest in the role but also helps you gauge if the company is the right fit for you.