At a Glance
- Tasks: Design and optimise data pipelines for trade and communications surveillance.
- Company: Join TP ICAP, a leader in financial market innovation.
- Benefits: Inclusive culture, competitive salary, and opportunities for growth.
- Other info: Diverse team environment with a focus on collaboration and continuous improvement.
- Why this job: Make a real impact in data engineering while ensuring compliance and security.
- Qualifications: Experience in ETL/ELT, AWS, Python or Java, and strong problem-solving skills.
The predicted salary is between £55,000 and £65,000 per year.
Effective Market Abuse Surveillance is highly dependent on data of multiple types and from many different sources. Trade Surveillance requires complete Trade and Pre-Trade transactional data, Instrument and Client Reference Data, and Market Data. Communications Surveillance requires ingestion of messages from multiple electronic platforms.
This role is critical to ensuring the reliability, scalability, and compliance of the data pipelines that support surveillance systems across communications and trading activities, covering the structured and unstructured data described above. The role bridges engineering and operations, enabling robust data ingestion, transformation, and monitoring to meet regulatory and internal compliance requirements. The Data Ops Engineer will collaborate with upstream teams to ensure data completeness, accuracy, and timeliness are as expected, and that any data completeness or quality issues are made visible. The role will also work on other Surveillance data initiatives, such as persisting Surveillance Alerts in the firm’s data lake for analytics purposes.
Role Responsibilities
- Design, build, maintain and optimise end-to-end data pipelines and workflows between source data points and target destinations, working with the wider Surveillance Technology team to put automation, scalability and strategy at the heart of the design.
- Implement automated data completeness and quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and timeliness of ingested data and to make any unprocessed data visible.
- Identify Critical Data Elements and implement failover and recovery strategies for the respective Data Flows.
- Build AWS infrastructure using Terraform or CDK.
- Write unit, integration, and infrastructure tests.
- Monitor, investigate and resolve data anomalies through collaboration with Business Analysts, Developers, and Testers across functions and verticals.
- Implement data management and governance frameworks to ensure data is ingested and loaded per the requirements of the consuming platform: Scila for Trade Surveillance and Global Relay for Communications Surveillance.
- Partner with the Data Strategy and Data Infrastructure teams to ensure data lineage, auditability and retention policies are enforced across all necessary pipelines.
- Ensure that data consumed and processed complies with regulatory, legal, and security protocols.
- Work closely with surveillance analysts, compliance officers, and engineering teams to translate business rules into technical specifications.
- Partner closely with stakeholders and subject matter experts, such as the Cloud Infrastructure team, to optimise performance and costs.
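The completeness and reconciliation checks described above can be sketched in a few lines. This is a minimal illustration, not TP ICAP's actual tooling: the `BatchManifest` type and the field names are hypothetical, standing in for whatever row-count metadata an upstream feed publishes alongside each batch.

```python
from dataclasses import dataclass


@dataclass
class BatchManifest:
    """Metadata an upstream source publishes alongside a delivered batch."""
    source: str
    expected_rows: int


def reconcile(manifest: BatchManifest, loaded_rows: int, tolerance: float = 0.0) -> dict:
    """Compare rows loaded into the target against the upstream manifest.

    Returns a result a monitoring job could publish, so that any shortfall
    in ingested data is visible rather than silently dropped.
    """
    missing = manifest.expected_rows - loaded_rows
    complete = missing <= manifest.expected_rows * tolerance
    return {
        "source": manifest.source,
        "expected": manifest.expected_rows,
        "loaded": loaded_rows,
        "missing": max(missing, 0),
        "status": "OK" if complete else "INCOMPLETE",
    }


# A trade feed that delivered fewer rows than its manifest promised:
result = reconcile(BatchManifest("trade_feed", 10_000), loaded_rows=9_500)
print(result["status"], result["missing"])  # INCOMPLETE 500
```

In practice a check like this would run as a scheduled task per feed, with the result pushed to a dashboard or alerting channel so that incomplete loads are surfaced to the team rather than discovered downstream.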
Experience / Competences
Essential Criteria
- Strong experience of ETL/ELT data pipeline builds, from design to implementation to maintenance, for financial market messaging platforms and trade & order systems.
- Solid understanding of CI/CD pipelines, ideally with a background in software engineering, product management or data analytics.
- Experience with some of EKS, Lambda, EventBridge, Step Functions, S3, DynamoDB, AWS Glue, Snowflake, Terraform and Transfer Family.
- Strong proficiency in Python or Java, SQL, and data pipeline frameworks (e.g., Airflow, dbt, Spark), with solid experience with the AWS ecosystem.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Bachelor’s degree in Computer Science, Data Science, Engineering, or related field.
- Previous experience in Data Ops and Data Engineering.
- Strong communication and collaboration skills to engage with technical and non-technical stakeholders.
- Strong experience with Agile software delivery.
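To give a feel for the ETL/ELT work the criteria above describe, here is a minimal extract-transform-load sketch in plain Python with SQLite standing in for the target store. The field names and quality rule are illustrative assumptions, not the platform's real schema; in production this logic would typically live inside a framework such as Airflow or dbt rather than a single script.

```python
import sqlite3


def extract(rows):
    # Simulate pulling raw trade records from a source system.
    return rows


def transform(raw):
    # Normalise instrument codes and drop records failing a basic
    # quality rule (missing price) before they reach the target.
    return [
        {**r, "instrument": r["instrument"].upper()}
        for r in raw
        if r.get("price") is not None
    ]


def load(conn, records):
    conn.executemany(
        "INSERT INTO trades (instrument, price) VALUES (:instrument, :price)",
        records,
    )
    return conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (instrument TEXT, price REAL)")
raw = [
    {"instrument": "gbpusd", "price": 1.27},
    {"instrument": "eurusd", "price": None},  # rejected by the quality rule
]
loaded = load(conn, transform(extract(raw)))
print(loaded)  # 1
```

Keeping extract, transform and load as separate functions is what makes each stage independently testable, which is the same property the unit and integration testing responsibilities above depend on.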
Non-Essential Criteria
- Experience with market data ingestion, metadata extraction, and event-driven architectures.
- Knowledge of streaming technologies (Kafka, Kinesis) and API integrations, and hands-on experience with monitoring tools (e.g. Grafana) and observability practices.
- Proficient with Terraform or CDK (infrastructure-as-code).
- Experience in Business Communications Technology e.g. Bloomberg, ICE, Symphony, Teams Chat, etc.
- Familiarity with security best practices, IAM, and VPN configuration.
- Experience with regulatory compliance and data security in financial services.
- Knowledge of financial markets and trading platforms.
- Experience with GitLab, Qliksense & Alation.
- Certifications in DataOps, cloud platforms, or related areas.
Company Statement
We know that the best innovation happens when diverse people with different perspectives and skills work together in an inclusive atmosphere. That’s why we’re building a culture where everyone plays a part in making people feel welcome, ready and willing to contribute. TP ICAP Accord - our Employee Network - is central to this. As well as representing specific groups, TP ICAP Accord helps increase awareness and collaboration, share best practice, and hold our firm to account for driving continuous cultural improvement.
Data Engineer (Trade Surveillance) in Belfast employer: TP ICAP
Contact Details:
TP ICAP Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (Trade Surveillance) role in Belfast
✨Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at events. A friendly chat can open doors that a CV just can't.
✨Tip Number 2
Show off your skills! If you’ve got a portfolio or projects that highlight your data engineering prowess, share them during interviews. It’s a great way to demonstrate what you can bring to the table.
✨Tip Number 3
Prepare for those tricky questions! Brush up on your technical knowledge and be ready to discuss how you’d tackle real-world problems related to data pipelines and compliance. Practice makes perfect!
✨Tip Number 4
Don’t forget to apply through our website! We love seeing applications directly from candidates who are excited about joining us. Plus, it shows you’re genuinely interested in being part of our team.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Data Engineer role. Highlight your ETL/ELT experience and any relevant projects you've worked on, especially those involving AWS and data pipelines.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how your background aligns with our needs. Mention specific technologies from the job description that you’re familiar with, like Python or Terraform.
Showcase Problem-Solving Skills: In your application, give examples of how you've tackled complex data challenges in the past. We love seeing candidates who can think critically and adapt in fast-paced environments, so don’t hold back!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity. Plus, it’s super easy!
How to prepare for a job interview at TP ICAP
✨Know Your Data Pipelines
Make sure you understand the ins and outs of ETL/ELT processes. Be ready to discuss your experience with building and maintaining data pipelines, especially in financial markets. Brush up on specific tools like AWS Glue or Airflow, as they might come up during the chat.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled data anomalies or quality issues in the past. Interviewers will want to see your analytical thinking in action, so think of scenarios where you identified a problem and how you resolved it.
✨Familiarise Yourself with Compliance
Since this role involves regulatory compliance, be ready to discuss your understanding of data governance frameworks and security protocols. Highlight any previous experience you have in ensuring data integrity and compliance in your past roles.
✨Communicate Effectively
This position requires collaboration with various teams, so practice articulating technical concepts in a way that non-technical stakeholders can understand. Prepare to demonstrate your communication skills through examples of past teamwork or cross-departmental projects.