At a Glance
- Tasks: Join a dynamic team to enhance an AI platform and manage Python-based reporting systems.
- Company: Be part of an international team focused on innovative data solutions in Krakow, Wroclaw, and London.
- Benefits: Enjoy flexible working options, a collaborative culture, and opportunities for professional growth.
- Why this job: Work on impactful projects that shape the future of data processing and compliance in finance.
- Qualifications: Proficiency in Python, PySpark, SQL, and experience with Databricks is essential.
- Other info: Ideal for tech-savvy individuals eager to tackle challenges in a fast-paced environment.
The predicted salary is between £36,000 and £60,000 per year.
You will join the team behind an internal AI platform for processing and interacting with unstructured data. The group is currently over 30 people strong and is organized into self-sufficient agile teams, each of which owns features from the idea stage through analysis, implementation, testing, production deployment, and maintenance. The team is international, located in Krakow, Wroclaw, and London.
Responsibilities
- Maintain and enhance the existing Python-based reporting system, ensuring stability and accuracy in regulatory reporting.
- Monitor, troubleshoot, and resolve issues in the current reporting workflows.
- Ensure that reports comply with regulatory requirements and meet audit standards.
- Lead the migration of the existing reporting system to Azure Databricks, ensuring a smooth transition.
- Design and implement new, scalable reporting solutions using PySpark, SQL, and Databricks visualization tools.
- Optimize data processing pipelines and PySpark jobs to handle large-scale financial data, improving the performance, reliability, and efficiency of regulatory reports (see the sketch after this list).
- Improve SQL query execution to ensure fast, efficient report generation.
- Implement best practices for structured and unstructured data processing.
- Automate reporting processes to reduce manual intervention and enhance accuracy.
- Work with DevOps teams to ensure seamless Azure Databricks integration into the existing infrastructure.
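To give a flavour of the optimization work these responsibilities describe, here is a minimal PySpark sketch. It is illustrative only, not code from the actual platform: the table names (finance.trades, finance.counterparties), columns, and reporting period are all hypothetical assumptions.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

# Hypothetical example: table and column names are illustrative only.
spark = SparkSession.builder.appName("regulatory-report").getOrCreate()

# Read only the partitions for the reporting period (partition pruning
# limits how much data Spark scans).
trades = (
    spark.read.table("finance.trades")  # assumed Delta table
    .where(F.col("trade_date").between("2024-01-01", "2024-03-31"))
)

# Small reference data: broadcast it to avoid shuffling the large trades table.
counterparties = spark.read.table("finance.counterparties")

report = (
    trades.join(broadcast(counterparties), "counterparty_id")
    .groupBy("counterparty_id", "asset_class")
    .agg(
        F.sum("notional").alias("total_notional"),
        F.count("*").alias("trade_count"),
    )
)

# Cache only if the result feeds several downstream reports.
report.cache()
report.write.mode("overwrite").saveAsTable("finance.q1_exposure_report")
```

The two choices doing the work here are the partition filter, which reduces the data scanned, and the broadcast join, which ships the small reference table to every executor instead of shuffling the large fact table.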
Skills
- Must have: Databricks concepts, Databricks performance optimizations, ACID properties of RDBMS, Python, PySpark, SQL concepts, Databricks queries and visualization, exposure to CI/CD DevOps pipelines.
- Nice to have: Azure Data Services; ETL and data processing; financial data processing, including risk, compliance, and regulatory data in an investment banking environment; data modeling and metadata management; Git, GitHub, GitLab, Jenkins; regulatory compliance knowledge (Basel, MiFID, GDPR); big data; cloud security and access controls (IAM, RBAC); familiarity with Docker, Kubernetes, Apache.
Python Developer with Azure Databricks employer: Luxoft
Contact Details:
Luxoft Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Python Developer with Azure Databricks role
✨Tip Number 1
Familiarise yourself with Azure Databricks and its performance optimisation techniques. Understanding how to effectively use Databricks will not only help you during the interview but also demonstrate your commitment to mastering the tools we use.
✨Tip Number 2
Brush up on your Python and PySpark skills, especially in the context of data processing and reporting. Being able to showcase your coding abilities and problem-solving skills in these areas will set you apart from other candidates.
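As a warm-up, you might rebuild a small reporting transformation end to end. The sketch below is purely illustrative, with made-up entities and figures; it computes a month-over-month exposure change, the sort of derived metric regulatory reports often need.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.master("local[*]").appName("practice").getOrCreate()

# Made-up sample data standing in for regulatory positions.
rows = [
    ("ACME", "2024-01-31", 120.0),
    ("ACME", "2024-02-29", 150.0),
    ("GLOBEX", "2024-01-31", 80.0),
    ("GLOBEX", "2024-02-29", 95.0),
]
positions = spark.createDataFrame(rows, ["entity", "report_date", "exposure"])

# Month-over-month change per entity: a typical reporting transformation.
w = Window.partitionBy("entity").orderBy("report_date")
report = (
    positions
    .withColumn("prev_exposure", F.lag("exposure").over(w))
    .withColumn("mom_change", F.col("exposure") - F.col("prev_exposure"))
)
report.show()
```

Being able to explain why a window function fits here better than a self-join is exactly the kind of reasoning that helps in an interview.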
✨Tip Number 3
Gain a solid understanding of regulatory compliance standards like Basel and GDPR. This knowledge is crucial for the role, and being able to discuss how you've applied these standards in past projects will impress us.
✨Tip Number 4
Network with professionals in the field, particularly those who have experience with financial data processing. Engaging with others can provide insights into the industry and may even lead to referrals, increasing your chances of landing the job with us.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience with Python, Azure Databricks, and any regulatory compliance knowledge. Use keywords from the job description to demonstrate that you meet the requirements.
Craft a Compelling Cover Letter: In your cover letter, explain why you're passionate about working with unstructured data and how your skills in PySpark and SQL can contribute to the team. Mention specific projects or experiences that align with the responsibilities outlined in the job description.
Showcase Relevant Projects: If you have worked on projects involving data processing, reporting systems, or Azure Databricks, be sure to include these in your application. Provide details on your role, the technologies used, and the outcomes achieved.
Highlight Team Collaboration Skills: Since the role involves working in agile teams, emphasise your experience in collaborative environments. Mention any tools or methodologies you've used, such as Git for version control or CI/CD pipelines, to show your ability to work effectively within a team.
How to prepare for a job interview at Luxoft
✨Showcase Your Python Skills
Be prepared to discuss your experience with Python, especially in the context of reporting systems. Highlight specific projects where you've maintained or enhanced Python-based applications, and be ready to demonstrate your problem-solving skills through coding challenges.
✨Understand Azure Databricks
Familiarise yourself with Azure Databricks concepts and performance optimisations. Be ready to explain how you would lead the migration of a reporting system to Databricks and discuss any relevant experiences you have with similar transitions.
✨Emphasise Regulatory Knowledge
Since the role involves regulatory reporting, brush up on your knowledge of compliance standards like Basel, MiFID, and GDPR. Be prepared to discuss how you've ensured compliance in past projects and how you would approach this in the new role.
✨Demonstrate Team Collaboration
The team is international and works in agile environments, so highlight your experience working in diverse teams. Discuss how you’ve collaborated with DevOps teams or other stakeholders to ensure seamless integration of new technologies into existing infrastructures.