At a Glance
- Tasks: Build and enhance data pipelines for shelf analytics to boost product visibility and sales.
- Company: Join Luxoft's team building innovative data solutions for P&G.
- Benefits: Competitive salary, flexible work options, and opportunities for professional growth.
- Why this job: Make a real impact in retail by optimising product placement with data-driven insights.
- Qualifications: Strong Python and PySpark skills, experience with Databricks and SQL.
- Other info: Collaborative environment with a focus on continuous improvement and innovation.
The predicted salary is between £36,000 and £60,000 per year.
We are looking for an experienced Data Engineer to join the Shelf Analytics project – a data-driven application designed to analyze how P&G products are positioned on store shelves. The primary objective of the solution is to improve product visibility, optimize in-store execution, and ultimately increase sales by combining shelf layout data with sales insights.
As a Data Engineer, you will play a key role in building, maintaining, and enhancing scalable data pipelines and analytics workflows that power shelf-level insights. You will work closely with analytics and business stakeholders to ensure high-quality, reliable, and performant data solutions.
Responsibilities
- Design, develop, and maintain data pipelines and workflows using Databricks and PySpark
- Read, understand, and extend existing codebases; independently develop new components for Databricks workflows
- Implement object-oriented Python solutions (classes, inheritance, reusable modules)
- Develop and maintain unit tests to ensure code quality and reliability
- Work with Spark SQL and SQL Server Management Studio to create and optimize complex queries
- Create and manage Databricks workflows, clusters, databases, and tables
- Handle data storage and access management in Azure Data Lake Storage (ADLS), including ACL permissions
- Collaborate using GitHub, following CI/CD best practices and working with GitHub Actions
- Support continuous improvement of data engineering standards, performance, and scalability
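In practice, the pipeline and object-oriented work described above often boils down to composable transform stages. Below is a minimal, framework-free Python sketch of that pattern (class and field names such as `ShelfShareTransformer` are illustrative, not part of the actual codebase; on the project itself the same structure would wrap PySpark DataFrames in a Databricks workflow):

```python
from abc import ABC, abstractmethod

class Transformer(ABC):
    """Base class for a reusable pipeline stage."""

    @abstractmethod
    def transform(self, rows: list[dict]) -> list[dict]:
        ...

class ShelfShareTransformer(Transformer):
    """Adds each product's share of total shelf facings (illustrative metric)."""

    def transform(self, rows: list[dict]) -> list[dict]:
        total = sum(r["facings"] for r in rows) or 1
        return [{**r, "shelf_share": r["facings"] / total} for r in rows]

class Pipeline:
    """Chains transform stages in order, like tasks in a Databricks workflow."""

    def __init__(self, stages: list[Transformer]):
        self.stages = stages

    def run(self, rows: list[dict]) -> list[dict]:
        for stage in self.stages:
            rows = stage.transform(rows)
        return rows

rows = [
    {"sku": "A", "facings": 3},
    {"sku": "B", "facings": 1},
]
result = Pipeline([ShelfShareTransformer()]).run(rows)
# result[0]["shelf_share"] == 0.75, result[1]["shelf_share"] == 0.25
```

The point of the base class is that new shelf metrics can be added as independent, individually testable stages without touching the pipeline runner.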
Skills – must have
- Strong programming skills in Python and PySpark
- Hands-on experience with Databricks (workflows, clusters, tables, databases)
- Solid knowledge of SQL and experience with Spark SQL and SQL Server Management Studio
- Experience with pandas, dbx, and unit testing frameworks
- Practical experience working with Azure Storage (ADLS) and access control (ACLs)
- Proficiency with GitHub, including CI/CD pipelines and GitHub Actions
- Ability to work independently, analyze existing solutions, and propose improvements
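Because unit testing is a must-have, interviewers sometimes ask candidates to show how they would test a simple data-quality rule. A stdlib-only sketch (the `check_unique_skus` helper is hypothetical; on the project itself this would typically be pytest against pandas or PySpark code):

```python
import unittest

def check_unique_skus(rows: list[dict]) -> list[str]:
    """Return SKUs that appear more than once (a simple data-quality check)."""
    seen: set[str] = set()
    dupes: list[str] = []
    for row in rows:
        sku = row["sku"]
        if sku in seen and sku not in dupes:
            dupes.append(sku)
        seen.add(sku)
    return dupes

class CheckUniqueSkusTest(unittest.TestCase):
    def test_no_duplicates(self):
        self.assertEqual(check_unique_skus([{"sku": "A"}, {"sku": "B"}]), [])

    def test_reports_each_duplicate_once(self):
        rows = [{"sku": "A"}, {"sku": "A"}, {"sku": "A"}]
        self.assertEqual(check_unique_skus(rows), ["A"])
```

Run with `python -m unittest` (or pytest, which collects `unittest.TestCase` classes as well).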
Nice to have
- Experience with retail, CPG, or shelf analytics–related solutions
- Familiarity with large-scale data processing and analytics platforms
- Strong communication skills and a proactive, problem-solving mindset
Data Engineer for Shelf Analytics MŁ employer: Luxoft
Contact Detail:
Luxoft Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer for Shelf Analytics MŁ role
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data pipelines, workflows, and any projects you've worked on. This is your chance to demonstrate your expertise in Python, PySpark, and Databricks.
✨Tip Number 3
Prepare for interviews by brushing up on your technical skills and understanding the business side of things. Be ready to discuss how your work can improve product visibility and sales – that’s what it’s all about!
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented Data Engineers like you. Plus, it’s a great way to ensure your application gets seen by the right people.
We think you need these skills to ace the Data Engineer for Shelf Analytics MŁ role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Python, PySpark, and Databricks. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about the Shelf Analytics project and how your background makes you a perfect fit. Let us know what drives you in data engineering!
Showcase Your Problem-Solving Skills: In your application, mention specific challenges you've faced in previous roles and how you tackled them. We love seeing a proactive mindset and innovative solutions, especially when it comes to data pipelines and workflows.
Apply Through Our Website: Don’t forget to submit your application through our website! It’s the best way for us to receive your details and ensures you’re considered for the role. We can’t wait to see what you bring to the table!
How to prepare for a job interview at Luxoft
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Python, PySpark, and Databricks. Brush up on your SQL skills too, as you'll likely be asked to demonstrate your ability to write complex queries.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific challenges you've faced in previous roles and how you tackled them. Companies love candidates who can think critically and propose improvements, so have a few examples ready that highlight your analytical mindset.
✨Familiarise Yourself with the Project
Research the Shelf Analytics project and understand its goals. Knowing how data pipelines can enhance product visibility and sales will help you align your answers with the company’s objectives during the interview.
✨Practice Collaboration Scenarios
Since you'll be working closely with analytics and business stakeholders, be ready to discuss how you’ve collaborated in the past. Think of examples where you used GitHub for version control and how you followed CI/CD best practices to deliver successful projects.