We are seeking a highly skilled and experienced Senior Data Engineer to join our team in
the Paris area, specifically in Nanterre. The successful candidate will have expertise in
PySpark, Azure DevOps, Databricks, and Data Factory, and will play a pivotal role in our
data engineering and analytics initiatives.
- Design, develop, and maintain data pipelines using PySpark, Azure DevOps, Databricks, and Data Factory.
- Collaborate with cross-functional teams to understand data requirements and ensure data is readily available for analytics and reporting purposes.
- Implement data quality and data governance best practices.
- Troubleshoot and optimise data pipelines for performance and scalability.
- Ensure data security and compliance with industry standards and regulations.
- Proven experience as a Data Engineer, with a strong focus on PySpark, Azure DevOps, Databricks, and Data Factory.
- Solid knowledge of data engineering best practices, data integration, and ETL processes.
- Proficiency in French (mandatory) and English, as effective communication is essential for cross-team collaboration.
- Strong problem-solving and analytical skills.
- Experience working with large data sets and distributed computing.
- Ability to work in a fast-paced, dynamic environment.
- This is a 3-year contract.
- You will be working with a multinational finance company, providing opportunities for professional growth and development.