At a Glance
- Tasks: Build and optimise scalable data pipelines on Databricks for data-driven analytics and machine learning.
- Company: Join the world's largest independent renewable energy company.
- Benefits: Competitive salary, flexible working, and opportunities for professional growth.
- Why this job: Make a real impact in the renewable energy sector with cutting-edge technology.
- Qualifications: Experience in Databricks, Python, and strong problem-solving skills required.
- Other info: Diverse and inclusive workplace committed to equal opportunity.
The predicted salary is between £36,000 and £60,000 per year.
We are the world's largest independent renewable energy company, dedicated to providing affordable, zero‑carbon energy. As part of our Digital Solutions business, we are seeking a skilled Data Engineer with expertise in Databricks to build and optimise scalable data pipelines that enable data‑driven analytics and machine learning for our asset performance management software. This is a 24‑month fixed‑term contract.
Responsibilities
- Design, develop, and maintain robust data pipelines using Delta Live Tables (DLT) on Databricks (a minimal sketch follows this list).
- Collaborate with software engineers, data scientists and platform engineers to understand data requirements and deliver high‑quality solutions.
- Implement ETL/ELT processes to ingest, transform, and store data from various sources (structured and unstructured).
- Optimise performance and cost‑efficiency of data workflows on Databricks.
- Ensure data quality, integrity, and governance through validation, monitoring, and documentation.
- Develop reusable components and frameworks to accelerate data engineering efforts.
- Support CI/CD practices and automation for data pipeline deployment.
- Stay current with Databricks features and best practices, and advocate for their adoption.
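To give a flavour of the first responsibility, here is a minimal, illustrative sketch of a Delta Live Tables pipeline in Python. It is not RES's actual pipeline: the storage path, table names, and columns are hypothetical, and `spark` is assumed to be provided by the Databricks runtime inside a DLT pipeline.

```python
# Minimal DLT sketch: ingest raw files with Auto Loader, then apply a basic
# data-quality expectation. All paths, table names, and columns are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw turbine telemetry ingested incrementally from cloud storage.")
def raw_turbine_events():
    # Auto Loader (cloudFiles) picks up new JSON files as they arrive.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/turbine_events")  # hypothetical landing path
    )

@dlt.table(comment="Cleaned telemetry with basic quality checks applied.")
@dlt.expect_or_drop("valid_timestamp", "event_time IS NOT NULL")
def clean_turbine_events():
    # Read the upstream DLT table as a stream and derive a partition-friendly date column.
    return (
        dlt.read_stream("raw_turbine_events")
        .withColumn("event_date", F.to_date("event_time"))
    )
```

In practice, pipelines like this are deployed and scheduled through a Databricks DLT pipeline configuration rather than run as standalone scripts.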
Knowledge
- Solid understanding of data modelling, warehousing concepts, and distributed computing.
- Familiarity with Delta Lake and Unity Catalog (see the sketch after this list).
- Knowledge of data governance frameworks and compliance standards (e.g., GDPR, HIPAA).
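For context on the Unity Catalog point above, the brief sketch below shows how a governed Delta table is addressed through the three-level catalog.schema.table namespace. The catalog, schema, table, and column names are hypothetical, and `spark` is assumed to be an existing SparkSession on a Unity Catalog-enabled Databricks cluster.

```python
# Read a Unity Catalog-governed Delta table via its three-level name
# (catalog.schema.table); all identifiers here are hypothetical examples.
df = spark.read.table("analytics_catalog.asset_performance.turbine_daily")

# Simple aggregation over the governed data.
df.groupBy("site_id").avg("capacity_factor").show()
```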
Skills
- Strong programming skills in Python and SQL.
- Experience with version control (e.g., Git) and CI/CD tools.
- Excellent problem‑solving and communication skills, both written and oral.
Experience
- Proven experience as a Data Engineer with hands‑on expertise in Databricks and DLT.
- Experience with cloud data platforms, ideally Azure; experience with AWS or Google Cloud is an advantage.
- Exposure to machine learning workflows and integration with ML models.
- A track record of delivering results in a distributed, cross‑functional team.
Qualifications
- Databricks certification (e.g., Databricks Certified Data Engineer Associate/Professional).
At RES we celebrate difference and believe that diverse perspectives drive innovation. We encourage applicants with different backgrounds, ideas and points of view to apply. That is why we have a strong commitment to equal opportunity and to creating a welcoming workplace for everyone, regardless of ethnicity, culture, gender, nationality, age, disability, sexual orientation, gender identity, marital or parental status, education, or social background. We are an equal‑employment‑opportunity employer who strives for inclusion for all employees and applicants.
Employer: RES (Data Engineer in Glasgow)
Contact Detail: RES Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Glasgow
✨Network Like a Pro
Get out there and connect with folks in the industry! Attend meetups, webinars, or even local tech events. The more people you know, the better your chances of landing that Data Engineer role.
✨Show Off Your Skills
Create a portfolio showcasing your projects, especially those involving Databricks and Python. Share it on platforms like GitHub or your personal website. This gives potential employers a taste of what you can do!
✨Ace the Interview
Prepare for technical interviews by brushing up on your data engineering concepts and coding skills. Practice common interview questions and be ready to discuss your past experiences with data pipelines and ETL processes.
✨Apply Through Our Website
Don’t forget to apply directly through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who take that extra step!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Databricks and Python. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how you can contribute to our mission of providing zero-carbon energy. Keep it concise but impactful!
Showcase Your Problem-Solving Skills: In your application, give examples of how you've tackled challenges in previous roles. We love seeing candidates who can think critically and come up with innovative solutions, especially in data workflows.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy!
How to prepare for a job interview at RES
✨Know Your Tech Inside Out
Make sure you brush up on your Databricks and Python skills before the interview. Be ready to discuss specific projects where you've built data pipelines or worked with DLT. The more you can showcase your hands-on experience, the better!
✨Understand the Company’s Mission
Since the company is focused on renewable energy, it’s crucial to understand their mission and how data engineering plays a role in achieving zero-carbon energy. Show your enthusiasm for their goals and how your skills can contribute to their success.
✨Prepare for Technical Questions
Expect technical questions related to data modelling, ETL/ELT processes, and cloud platforms like Azure. Practise explaining complex concepts in simple terms, as communication is key when collaborating with cross-functional teams.
✨Showcase Your Problem-Solving Skills
Be prepared to discuss challenges you've faced in previous roles and how you overcame them. Use the STAR method (Situation, Task, Action, Result) to structure your answers, highlighting your analytical thinking and problem-solving abilities.