At a Glance
- Tasks: Join us to unlock data insights and innovate with cutting-edge technologies like AI and ML.
- Company: Talan Data x AI, a top-rated consultancy known for its positive culture and innovation.
- Benefits: Enjoy 25 days holiday, private medical insurance, and a cycle to work scheme.
- Why this job: Make a real impact in data engineering while working with industry experts.
- Qualifications: Experience with PySpark, AWS, Python, and SQL is essential.
- Other info: Great career growth opportunities and a supportive team environment.
The predicted salary is between £36,000 and £60,000 per year.
Talan Data x AI is a leading Data Management and Analytics consultancy, working closely with leading software vendors and top industry experts across a range of sectors to unlock value and insight from their data. At Talan Data x AI, innovation is at the heart of our client offerings, and we help companies improve their efficiency with modern processes and technologies such as Machine Learning (ML) and Artificial Intelligence (AI).
Our consultants are at the heart of everything we do, and we have been recertified as a 2025 Great Place to Work. This achievement highlights Talan Data x AI’s positive organisational culture and strengthens its reputation as an employer of choice within the industry. We invest heavily in the training and development of our teams and hold regular socials in each region to encourage engagement and network building.
Skills and attributes for success:
- An excellent team player who can also work independently.
- Excellent client-facing skills, with experience on client projects.
- A proactive self-starter.
- Excellent verbal and written communication and presentation skills.
- Ability to build internal and external relationships.
- Effective negotiating and influencing skills.
- Ability to think creatively and propose innovative solutions.
- Leadership skills.
To qualify for this role, you must have:
- Proven experience and knowledge of PySpark and Apache Spark, including the fundamentals of how it works (see the minimal sketch after these lists).
- Core experience with AWS, alongside substantial, mature experience with the Azure platform.
- Experience with other cloud platforms (e.g. GCP) and with data lake architectures.
- Strong programming skills in languages such as Python and SQL, including the ability to write complex SQL queries.
- Experience with GitHub and CI/CD practices.
- Experience of DevOps and infrastructure deployments (Azure and Databricks).
- Proficiency in Infrastructure as Code tools, especially Terraform for cloud resource provisioning (AWS, Azure, GCP).
- Multi-skilled experience in one or more of the following disciplines: Data Management, Data Engineering, Data Warehousing, Data Modelling, Data Quality, Data Integration, Data Analytics, Data Visualisation, Data Science and Business Intelligence.
- A proactive awareness of industry standards, regulations, and developments.
- Project experience with one or more of the following technologies is advantageous: Tableau, Power BI, Azure, AWS, GCP, Snowflake, and their integration with Databricks.
In this role, you will:
- Support development of the Azure Databricks Lakehouse platform, shaping frameworks and solutions that other engineering teams will adopt in future data projects.
- Build, optimise, and maintain data processing frameworks using Python, ensuring performance, scalability, and maintainability.
- Support dbt integration and best practices for transformation pipelines within Databricks.
- Apply software engineering principles, including source control, automated testing, CI/CD, design patterns and reusable solutions, and coding standards (a testing sketch follows below).
- Collaborate with technical solution authorities, ensuring alignment with governance, design decisions, and platform standards.
- Collaborate closely with the Cloud Architecture and Data Architecture teams to deliver approved solutions.
- Manage stakeholders: take ownership of requirements, communicate effectively across teams, and deliver high-quality solutions.
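To give a flavour of the Spark and SQL work described above, here is a minimal, hypothetical PySpark sketch: deduplicating an orders table down to the latest record per customer, first via the DataFrame API and then as the equivalent Spark SQL. The storage paths, table name, and column names are illustrative assumptions, not details from the role.
```python
# Minimal, hypothetical PySpark sketch -- all paths and names are illustrative.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

# Read raw data from a hypothetical data lake location.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# DataFrame API: keep the latest order per customer using a window function.
w = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())
latest = (
    orders.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

# The same logic expressed as Spark SQL -- the "complex SQL" side of the role.
orders.createOrReplaceTempView("orders")
latest_sql = spark.sql("""
    SELECT * FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY order_ts DESC
               ) AS rn
        FROM orders
    ) t WHERE rn = 1
""")

latest.write.mode("overwrite").parquet("s3://example-bucket/curated/latest_orders/")
```
Both forms produce the same result; knowing how Spark plans and shuffles each one is part of the "fundamentals of how it works" mentioned above.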
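And as a rough illustration of the automated-testing and reusable-solutions principles in the responsibilities above, a transformation might be packaged as a plain function and unit-tested with pytest against a local Spark session. The function, schema, and fixture below are hypothetical sketches, not Talan's actual framework.
```python
# Hypothetical sketch: unit-testing a reusable PySpark transformation with pytest.
# The function, column names, and fixture are illustrative assumptions.
import pytest
from pyspark.sql import SparkSession, DataFrame, Window, functions as F


def deduplicate_latest(df: DataFrame, key: str, ts: str) -> DataFrame:
    """Keep only the most recent row per key -- a small, reusable building block."""
    w = Window.partitionBy(key).orderBy(F.col(ts).desc())
    return df.withColumn("_rn", F.row_number().over(w)).filter("_rn = 1").drop("_rn")


@pytest.fixture(scope="session")
def spark():
    # A local Spark session lets the test suite run in CI without a cluster.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_deduplicate_latest_keeps_newest_row(spark):
    df = spark.createDataFrame(
        [("c1", 1, "old"), ("c1", 2, "new"), ("c2", 5, "only")],
        ["customer_id", "order_ts", "status"],
    )
    out = deduplicate_latest(df, key="customer_id", ts="order_ts").collect()
    assert {(r.customer_id, r.status) for r in out} == {("c1", "new"), ("c2", "only")}
```
Structuring transformations as plain, parameterised functions keeps them testable in CI, which is the point of the source control, automated testing, and CI/CD items above.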
Other requirements:
- Willing to work on client sites, potentially for extended periods.
- Willing to travel for work purposes and be happy to stay away from home for extended periods.
- Eligible to work in the UK without restriction.
What we offer:
- 25 days holiday + bank holidays.
- 5 days holiday buy/sell option.
- Private medical insurance.
- Life cover.
- Cycle to work scheme.
- Eligibility for company pension scheme (5% employer contribution, salary sacrifice option).
- Employee assistance programme.
- Bespoke online learning via Udemy for Business.
Employer: Talan
Contact: Talan Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineering Consultant role
✨Tip Number 1
Network like a pro! Get out there and connect with industry folks on LinkedIn or at local meetups. The more people you know, the better your chances of landing that Data Engineering Consultant gig.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving PySpark, AWS, and Databricks. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge and soft skills. Practice explaining complex concepts in simple terms, as client-facing roles require clear communication. We want to see how you can engage with clients effectively!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are genuinely interested in joining our awesome team at Talan Data x AI.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineering Consultant role. Highlight your experience with PySpark, AWS, and any relevant projects you've worked on. We want to see how your skills align with what we do at Talan Data x AI!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our innovative team. Don’t forget to mention any client-facing experience you have – we love a good communicator!
Showcase Your Projects: If you've worked on any cool projects using Python, SQL, or cloud platforms, make sure to showcase them in your application. We’re keen to see your problem-solving skills and how you’ve applied your technical knowledge in real-world scenarios.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you don’t miss out on any important updates. Plus, it’s super easy!
How to prepare for a job interview at Talan
✨Know Your Tech Inside Out
Make sure you brush up on your knowledge of PySpark, Apache Spark, and the cloud platforms mentioned in the job description. Be ready to discuss how you've used these technologies in past projects, as well as any challenges you faced and how you overcame them.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've approached complex data engineering problems. Think about times when you had to think creatively or propose innovative solutions, and be ready to explain your thought process during the interview.
✨Communicate Like a Pro
Since excellent communication skills are key for this role, practice articulating your thoughts clearly and concisely. You might even want to do a mock interview with a friend to get comfortable discussing your experience and technical knowledge.
✨Build Relationships
Demonstrate your ability to build relationships by being personable and engaging during the interview. Ask insightful questions about the team and company culture at Talan Data x AI, showing that you're genuinely interested in becoming part of their community.