At a Glance
- Tasks: Build resilient data pipelines and transform raw data into valuable insights.
- Company: Join a dynamic team in the Financial Services/Capital Markets industry.
- Benefits: Competitive salary, career growth, and opportunities to work with cutting-edge technology.
- Why this job: Make a real impact by optimising data processes and enabling trusted datasets.
- Qualifications: Experience in data engineering and strong analytical skills required.
- Other info: Collaborative environment with a focus on innovation and professional development.
The predicted salary is between £36,000 and £60,000 per year.
We are seeking highly skilled and experienced Azure Data Engineers to join a newly formed group concentrating on Data. In this role you will be a key member of the team, working on a complex and challenging project within the Financial Services/Capital Markets industry. The primary focus of the role is building resilient, reusable data pipelines to extract, load, and transform raw data into a relational data model. The successful candidate will work across complex, multi-source datasets, including loan servicing systems, property and valuation platforms, collections systems, and third-party data providers, delivering reliable and auditable data at scale.
Key Responsibilities
- Serve as the team’s ADF, Databricks, Python, PySpark & Spark SQL technical expert
- Manage the day-to-day collection and ingestion of raw data into corporate data assets
- Work with the team to formalize data flows and data standards
- Enable trusted datasets for portfolio analytics, asset strategy, finance, and risk
- Supervise all data ingestion and integration processes from source to target, including the data warehouse and data lake
- Performance-tune and optimise all data ingestion and data integration processes
- Partner with Data Stewards and Business Analysts to understand the nature of the data being handled and what an optimal Data Pipeline for it should look like
- Design solutions that are aligned to the target state Data Architecture
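The responsibilities above centre on resilient, auditable extract-validate-load steps. As an illustration only (a plain-Python sketch, not Arrow Global's actual ADF/Databricks stack; all names and the in-memory source are hypothetical), one common resilience pattern is to wrap each ingestion step with validation and retries:

```python
import csv
import io
import time

def ingest_with_retry(fetch, validate, load, max_attempts=3, backoff_s=1.0):
    """Run one extract-validate-load step, retrying transient failures."""
    for attempt in range(1, max_attempts + 1):
        try:
            raw = fetch()                    # extract raw data from a source
            rows = validate(raw)             # reject malformed records early
            load(rows)                       # land validated rows in the target
            return len(rows)                 # row count for audit logging
        except Exception:
            if attempt == max_attempts:
                raise                        # surface the failure for alerting
            time.sleep(backoff_s * attempt)  # back off before the next attempt

# --- Hypothetical usage: an in-memory CSV "source" and a list "target" ---
SOURCE = "loan_id,balance\nL001,1200.50\nL002,870.00\n"
target = []

def fetch():
    return SOURCE

def validate(raw):
    rows = list(csv.DictReader(io.StringIO(raw)))
    for r in rows:
        float(r["balance"])  # raises ValueError if balance is not numeric
    return rows

ingested = ingest_with_retry(fetch, validate, target.extend)
print(f"{ingested} rows ingested")
```

In a production pipeline the same shape applies with ADF activities or Databricks jobs in place of the plain functions; the key idea is that validation happens before the load and failures are retried a bounded number of times rather than silently swallowed.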
About You
- Degree in Computer Science, Information Systems, Data Science, or a related field preferred
- Proven experience building resilient, reusable Data Pipelines as a Data Engineer or equivalent
- Resourceful, motivated self-starter with the ability to collaborate across business and technology
- Strong analytical, verbal, and written communication skills
- A background in financial data domains (IBOR/ABOR, transactions, market data, reference data)
- Strong experience as a Data Engineer within Real Estate, Credit, Banking, or NPL Asset Management
- Microsoft certification is a plus
Data Engineer in Manchester employer: Arrow Global Group
Contact Detail:
Arrow Global Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer in Manchester
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, especially those in financial services or data engineering. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best data pipelines and projects. This is your chance to demonstrate your expertise in Azure, Databricks, and Python, making you stand out to potential employers.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've tackled complex datasets and optimised data flows. Practice makes perfect, so consider mock interviews with friends or mentors.
✨Tip Number 4
Don't forget to apply through our website! We have exciting opportunities waiting for talented Data Engineers like you. Plus, it’s a great way to ensure your application gets the attention it deserves.
We think you need these skills to ace Data Engineer in Manchester
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure, Data Pipelines, and any relevant financial data domains. We want to see how your skills align with the role, so don’t be shy about showcasing your expertise!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for our team. Mention specific projects or experiences that relate to building resilient data pipelines and working with complex datasets.
Showcase Your Technical Skills: We’re looking for a technical expert, so make sure to highlight your proficiency in ADF, Databricks, Python, and Spark SQL. Include any relevant certifications or projects that demonstrate your capabilities in these areas.
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you’re keen on joining our team at StudySmarter!
How to prepare for a job interview at Arrow Global Group
✨Know Your Tech Inside Out
Make sure you brush up on your skills with ADF, Databricks, Python, PySpark, and Spark SQL. Be ready to discuss specific projects where you've built data pipelines and how you tackled challenges. This will show that you're not just familiar with the tools but can also apply them effectively.
✨Understand the Financial Services Landscape
Since this role is within the Financial Services/Capital Markets industry, it’s crucial to have a grasp of financial data domains like IBOR/ABOR and market data. Familiarise yourself with how these datasets interact and be prepared to discuss their significance in your previous work.
✨Showcase Your Problem-Solving Skills
Be ready to share examples of how you've optimised data ingestion and integration processes in the past. Highlight any performance tuning you've done and how it improved data reliability or efficiency. This will demonstrate your analytical skills and resourcefulness.
✨Communicate Clearly and Collaboratively
Strong communication is key, especially when working with Data Stewards and Business Analysts. Practice explaining complex technical concepts in simple terms. This will help you convey your ideas effectively and show that you can collaborate well across teams.