At a Glance
- Tasks: Design and build scalable data pipelines for advanced analytics and AI applications.
- Company: Join a growing team at a leading alternative investment firm with $65B in assets.
- Benefits: Full-time role with opportunities for professional growth and impactful projects.
- Why this job: Make a real difference by transforming data into valuable insights for investment decisions.
- Qualifications: Experience in data engineering, Python, and cloud platforms like Azure.
- Other info: Collaborative culture that encourages creativity and innovation in a fast-paced environment.
The predicted salary is between £36,000 and £60,000 per year.
We are looking to expand our Data Engineering team to build modern, scalable data platforms for our internal investment desks and portfolio companies. You will contribute to the firm’s objectives by delivering rapid and reliable data solutions that unlock value for Cerberus desks, portfolio companies, and other businesses. You’ll do this by designing and implementing robust data architectures, pipelines, and workflows that enable advanced analytics and AI applications. You may also support initiatives such as due diligence and pricing analyses by ensuring high-quality, timely data availability.
What you will do:
- Design, build, and maintain scalable, cloud-based data pipelines and architectures to support advanced analytics and machine learning initiatives.
- Develop robust ELT workflows using tools like dbt, Airflow, and SQL (PostgreSQL, MySQL) to transform raw data into high-quality, analytics-ready datasets (a brief orchestration sketch follows this list).
- Collaborate with data scientists, analysts, and software engineers to ensure seamless data integration and availability for predictive modeling and business intelligence.
- Optimize data storage and processing in Azure environments for performance, reliability, and cost-efficiency.
- Implement best practices for data modeling, governance, and security across all platforms.
- Troubleshoot and enhance existing pipelines to improve scalability and resilience.
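For flavour, here is a minimal sketch of the kind of ELT orchestration described in the second bullet above. It assumes Apache Airflow 2.4+ with the dbt CLI available on the worker; the DAG name, schedule, and dbt project path are illustrative placeholders rather than details of our actual pipelines.

```python
# Illustrative only: extract raw data, then run dbt transformations.
# All names, schedules, and paths below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_raw_data():
    """Placeholder extract step: pull records from a source system and
    land them in the warehouse's raw schema."""
    print("extracting raw data...")


with DAG(
    dag_id="daily_asset_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",              # every morning at 06:00 UTC
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_raw_data",
        python_callable=extract_raw_data,
    )

    # Transform the landed tables into analytics-ready models with dbt.
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    extract >> transform
```

In practice the extract step would be split into per-source tasks and monitored with Airflow's alerting, but the shape above (extract, then transform with dbt) is the pattern this role centres on.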
Sample Projects You'll Work On:
- Financial Asset Management Pipeline: Build and manage data ingestion from third-party APIs, model data using dbt, and support machine learning workflows for asset pricing and prediction using Azure ML Studio. This includes ELT processes, data modeling, running predictions, and storing outputs for downstream analytics.
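As an illustration of the ingestion step in a project like this, the sketch below pulls records from a third-party API and lands them in a raw PostgreSQL schema using requests and SQLAlchemy. The vendor endpoint, the `raw.asset_prices` table, and the connection string are hypothetical placeholders, not the actual services or schemas used on the team.

```python
# Illustrative ingestion step: fetch raw prices from a vendor API and land
# them in the warehouse's raw schema. Endpoint, table, and credentials are
# hypothetical placeholders.
import requests
from sqlalchemy import create_engine, text

API_URL = "https://api.example-vendor.com/v1/prices"  # hypothetical endpoint
ENGINE = create_engine(
    "postgresql+psycopg2://user:password@warehouse-host:5432/analytics"  # placeholder DSN
)


def ingest_prices(as_of: str) -> int:
    """Fetch price records for a given date and insert them into raw.asset_prices."""
    response = requests.get(API_URL, params={"date": as_of}, timeout=30)
    response.raise_for_status()
    rows = response.json()  # assumed: a list of records with asset_id and price fields

    params = [
        {"asset_id": row["asset_id"], "price": row["price"], "as_of": as_of}
        for row in rows
    ]
    with ENGINE.begin() as conn:  # transactional: commit on success, roll back on error
        conn.execute(
            text(
                "INSERT INTO raw.asset_prices (asset_id, price, as_of) "
                "VALUES (:asset_id, :price, :as_of)"
            ),
            params,
        )
    return len(rows)
```

Downstream, dbt models would turn these raw rows into analytics-ready tables, and the prediction and output-storage steps would typically run as further orchestrated tasks, with Azure ML Studio as the modelling environment mentioned above.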
Your Experience:
We’re a small, high-impact team with a broad remit and diverse technical backgrounds. We don’t expect any single candidate to check every box below. If your experience overlaps strongly with what we do and you’re excited to apply your skills in a fast-moving, real-world environment, we’d love to hear from you.
- Strong technical foundation: Degree in a STEM field (or equivalent experience) with hands-on experience in production environments, emphasizing performance optimization and code quality.
- Python expertise: Advanced proficiency in Python for data engineering, data wrangling and pipeline development.
- Cloud platforms: Hands-on experience with Azure. AWS experience is also valued; however, Azure exposure is essential.
- Data warehousing: Proven expertise with Snowflake – schema design, performance tuning, data ingestion, and security.
- Workflow orchestration: Production experience with Apache Airflow (or a similar tool such as Prefect or Dagster), including authoring DAGs, scheduling workloads and monitoring pipeline execution.
- Data modeling: Strong skills in dbt, including writing modular SQL transformations, building data models, and maintaining dbt projects.
- SQL databases: Extensive experience with PostgreSQL, MySQL (or similar), including schema design, optimization, and complex query development.
- Infrastructure as code: Production experience with declarative infrastructure definition – e.g. Terraform, Pulumi or similar.
- Version control and CI/CD: Familiarity with Git-based workflows and continuous integration/deployment practices (e.g. Azure DevOps or GitHub Actions) to ensure smooth code integration and deployment.
- Communication and problem-solving skills: Ability to articulate complex technical concepts to technical and non-technical stakeholders alike. Excellent problem-solving skills with a strong analytical mindset.
About Us:
We are a new but growing team of AI specialists – data scientists, software engineers and technology strategists – working to transform how an alternative investment firm with $65B in assets under management leverages technology and data. Our remit is broad, spanning investment operations, portfolio companies and internal systems, giving the team the opportunity to shape the way the firm approaches analytics, automation and decision-making. We operate with the creativity and agility of a small team, tackling diverse, high-impact challenges across the firm. While we are embedded within a global investment platform, we maintain a collaborative, innovative culture where our AI talent can experiment, learn and have real influence on business outcomes.
Data Engineer employer: Cerberus Capital Management
Contact Detail:
Cerberus Capital Management Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to current employees at the company through LinkedIn or industry events. A friendly chat can give you insights into the team culture and maybe even lead to a referral!
✨Tip Number 2
Show off your skills in real-time! Consider working on a personal project that showcases your data engineering prowess. Share it on GitHub or during interviews to demonstrate your hands-on experience.
✨Tip Number 3
Prepare for technical interviews by brushing up on your SQL and Python skills. We recommend practicing common data engineering problems and being ready to discuss your thought process during problem-solving.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining our awesome team!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that align with the Data Engineer role. Highlight your technical expertise in Python, Azure, and data warehousing to catch our eye!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're excited about this position and how your background fits into our team. Share specific examples of your past projects that relate to building scalable data pipelines.
Showcase Your Projects: If you've worked on relevant projects, especially those involving ELT workflows or cloud-based solutions, make sure to mention them! We love seeing real-world applications of your skills.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows your enthusiasm for joining our team!
How to prepare for a job interview at Cerberus Capital Management
✨Know Your Tech Stack
Make sure you’re well-versed in the tools mentioned in the job description, like dbt, Airflow, and SQL. Brush up on your Python skills too, as you'll likely be asked to demonstrate your coding abilities or solve a problem on the spot.
✨Showcase Your Projects
Prepare to discuss specific projects you've worked on that relate to data pipelines and architectures. Be ready to explain your role, the challenges you faced, and how you overcame them, especially in cloud environments like Azure.
✨Collaboration is Key
Since the role involves working with data scientists and analysts, be prepared to talk about your experience collaborating with cross-functional teams. Highlight any instances where your communication skills helped bridge gaps between technical and non-technical stakeholders.
✨Problem-Solving Mindset
Expect to face some technical questions or case studies during the interview. Approach these with a problem-solving mindset, articulating your thought process clearly. This will show your analytical skills and ability to tackle complex issues head-on.