Python Developer with Azure Databricks

London · Full-Time · £36,000 - £60,000 / year (est.) · No home office possible

At a Glance

  • Tasks: Join a dynamic team to enhance an AI platform and manage Python-based reporting systems.
  • Company: Be part of an international team based in Krakow, Wroclaw, and London, focused on innovative data solutions.
  • Benefits: Enjoy flexible working options, a collaborative culture, and opportunities for professional growth.
  • Why this job: Work on impactful projects that shape the future of data processing in finance while developing your skills.
  • Qualifications: Proficiency in Python, PySpark, SQL, and experience with Databricks are essential.
  • Other info: Ideal for tech-savvy individuals passionate about AI and regulatory compliance in a fast-paced environment.

The predicted salary is between £36,000 and £60,000 per year.

You will join the team behind an internal AI platform for processing and interacting with unstructured data. The team is currently more than 30 people strong and is organized into agile teams, each of which is self-sufficient and takes features from the idea stage through analysis, implementation, testing, production deployment, and maintenance. The team is international, with members located in Krakow, Wroclaw, and London.

Responsibilities

  • Maintain and enhance the existing Python-based reporting system, ensuring stability and accuracy in regulatory reporting.
  • Monitor, troubleshoot, and resolve issues in the current reporting workflows.
  • Ensure that reports comply with regulatory requirements and meet audit standards.
  • Lead the migration of the existing reporting system to Azure Databricks, ensuring a smooth transition.
  • Design and implement new, scalable reporting solutions using PySpark, SQL, and Databricks visualization tools.
  • Optimize data processing pipelines to enhance performance, reliability, and efficiency of regulatory reports.
  • Optimize PySpark performance to handle large-scale financial data processing for regulatory reporting (see the illustrative sketch after this list).
  • Improve SQL query execution to ensure fast, efficient report generation.
  • Implement best practices for structured and unstructured data processing.
  • Automate reporting processes to reduce manual intervention and enhance accuracy.
  • Work with DevOps teams to ensure seamless Azure Databricks integration into the existing infrastructure.
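
To give a flavour of the kind of work described above, here is a minimal, purely illustrative PySpark sketch of a daily regulatory reporting job. The table names, columns, and output location are hypothetical and are not taken from the role description.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: aggregate trade data into a daily regulatory report.
# Source/target table names and columns are illustrative assumptions only.
spark = SparkSession.builder.appName("regulatory-reporting-sketch").getOrCreate()

trades = spark.read.table("raw.trades")                    # assumed source table
counterparties = spark.read.table("ref.counterparties")    # assumed reference data

daily_report = (
    trades
    .filter(F.col("trade_date") == F.current_date())       # report only today's trades
    .join(counterparties, on="counterparty_id", how="left")
    .groupBy("trade_date", "desk", "counterparty_name")
    .agg(
        F.sum("notional").alias("total_notional"),
        F.count("*").alias("trade_count"),
    )
)

# Writing the report as a Delta table keeps snapshots consistent for audit queries.
daily_report.write.format("delta").mode("overwrite").saveAsTable("reports.daily_exposure")
```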

Skills

  • Must have:
    • Databricks Concepts
    • Databricks performance optimizations
    • ACID properties of RDBMS
    • Python, PySpark
    • SQL Concepts
    • Databricks Queries and Visualization
    • Exposure to CI/CD DevOps Pipelines
  • Nice to have:
    • Azure Data Services
    • ETL & Data Processing
    • Financial Data Processing: experience handling risk, compliance, and regulatory data in an investment banking environment
    • Data Modeling and metadata management
    • Git, GitHub, GitLab, Jenkins
    • Regulatory compliance knowledge: Basel, MiFID, GDPR
    • Big Data
    • Cloud security and access controls (IAM, RBAC)
    • Familiarity with Docker, Kubernetes, Apache

Python Developer with Azure Databricks employer: Luxoft

Join a dynamic and innovative team in Krakow, where your contributions as a Python Developer with Azure Databricks will directly impact our cutting-edge AI platform. We pride ourselves on fostering a collaborative work culture that encourages professional growth through continuous learning and development opportunities, while also offering competitive benefits and a supportive environment for work-life balance. With a diverse international team, you'll be part of a forward-thinking company that values creativity and excellence in technology.

Contact Detail:

Luxoft Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Python Developer with Azure Databricks role

✨Tip Number 1

Familiarise yourself with Azure Databricks and its performance optimisation techniques. Understanding how to leverage Databricks for scalable reporting solutions will give you a significant edge during discussions with our team.
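
If you want something concrete to practise, the sketch below shows a few common Spark tuning techniques often discussed in the Databricks context: partition-pruned reads, a broadcast join for a small reference table, and caching a reused DataFrame. The table names are placeholders, and whether each technique actually helps depends on the data volumes involved.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("databricks-tuning-sketch").getOrCreate()

# Partition pruning: filter on the partition column as early as possible so only
# the relevant files are read (assumes the table is partitioned by report_date).
facts = spark.read.table("raw.positions").filter(F.col("report_date") == "2024-01-31")

# Broadcast join: hint that the small reference table fits in executor memory,
# avoiding an expensive shuffle of the large fact table.
ref = spark.read.table("ref.instruments")
enriched = facts.join(broadcast(ref), on="instrument_id", how="left")

# Cache a DataFrame that several downstream reports reuse, so it is computed once.
enriched.cache()
enriched.count()  # materialise the cache

# Coalesce before writing to avoid producing lots of small output files.
enriched.coalesce(8).write.format("delta").mode("overwrite").saveAsTable("reports.enriched_positions")
```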

✨Tip Number 2

Brush up on your Python and PySpark skills, especially in the context of data processing and regulatory reporting. Being able to demonstrate your coding abilities and problem-solving skills in these areas will be crucial.

✨Tip Number 3

Gain a solid understanding of regulatory compliance standards like Basel and GDPR. Showing that you are knowledgeable about these regulations will highlight your suitability for maintaining compliance in our reporting systems.

✨Tip Number 4

Engage with our current projects or similar open-source initiatives on platforms like GitHub. This not only showcases your technical skills but also demonstrates your proactive approach to learning and collaboration.

We think you need these skills to ace the Python Developer with Azure Databricks role

Python Programming
PySpark
SQL
Databricks Concepts
Databricks Performance Optimizations
Data Processing Pipelines
Regulatory Compliance Knowledge (Basel, MiFID, GDPR)
ETL Processes
Data Modelling
Git and Version Control (GitHub, GitLab)
CI/CD DevOps Pipelines
Cloud Security and Access Controls (IAM, RBAC)
Troubleshooting and Problem-Solving Skills
Automation of Reporting Processes
Agile Methodologies

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights relevant experience with Python, Azure Databricks, and any regulatory compliance knowledge. Use specific examples that demonstrate your skills in data processing and reporting.

Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention how your background aligns with the responsibilities listed, particularly your experience with PySpark and SQL, and your understanding of regulatory requirements.

Showcase Relevant Projects: If you have worked on projects involving Databricks or similar technologies, be sure to include these in your application. Describe your role, the challenges faced, and the outcomes achieved to demonstrate your capability.

Proofread Your Application: Before submitting, carefully proofread your application for any spelling or grammatical errors. A polished application reflects attention to detail, which is crucial for a role that involves regulatory reporting.

How to prepare for a job interview at Luxoft

✨Showcase Your Python Skills

Be prepared to discuss your experience with Python, especially in the context of reporting systems. Highlight specific projects where you've maintained or enhanced Python-based applications, and be ready to demonstrate your coding skills if asked.

✨Understand Azure Databricks

Familiarise yourself with Azure Databricks concepts and performance optimisations. Be ready to explain how you would lead the migration of a reporting system to this platform, and discuss any relevant experience you have with similar migrations.

✨Emphasise Regulatory Knowledge

Since the role involves regulatory reporting, make sure to highlight your understanding of compliance standards like Basel, MiFID, and GDPR. Discuss any previous experience you have in ensuring reports meet these requirements.

✨Demonstrate Problem-Solving Skills

Prepare to discuss how you've monitored, troubleshot, and resolved issues in reporting workflows. Use specific examples to illustrate your problem-solving approach, particularly in optimising data processing pipelines and SQL query execution.
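
A simple way to practise this is to compare Spark SQL query plans before and after a rewrite. The sketch below is generic (the table and columns are made up): it contrasts a wide `SELECT *` query with one that projects only the needed columns and filters on an assumed partition column, then prints both physical plans with `explain()`.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-tuning-sketch").getOrCreate()

# Hypothetical wide query: scans every column and every partition.
slow = spark.sql("SELECT * FROM reports.trade_history WHERE counterparty = 'ACME'")

# Narrower query: projects only the needed columns and filters on the (assumed)
# partition column trade_date, so Spark can prune files before scanning.
fast = spark.sql("""
    SELECT trade_date, counterparty, notional
    FROM reports.trade_history
    WHERE trade_date >= '2024-01-01' AND counterparty = 'ACME'
""")

# Compare the physical plans; the pruned query should show a much smaller scan.
slow.explain()
fast.explain()
```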

Python Developer with Azure Databricks
Luxoft

London · Full-Time · £36,000 - £60,000 / year (est.)
Application deadline: 2027-07-10
