




Summary:
Expleo is seeking a Data Engineer to design, develop, and maintain data pipelines and ETL processes, ensuring data integrity and consistency for financial platforms.

Highlights:
1. Design, develop, and maintain data pipelines and ETL processes
2. Participate in data migration projects, ensuring data integrity
3. Collaborate with data engineers, analysts, and business stakeholders

Overview:
Expleo is a trusted partner for your innovation journey. As a global engineering, technology and consulting service provider, we are ideally positioned to help you achieve your ambitions and future-proof your business. With a smart blend of bold thinking and reliable execution, we’re able to fast-track innovation through each step of your value chain.

We are strategically positioned to build value, with a global footprint across 30 countries. We are as global and local as you need us to be, with strong best-in-class pan-European technological centres and unique best-shoring capabilities. We leverage a network of high value-adding affiliates in consulting and industrial excellence, and leading partners across multiple sectors, to provide you with the most comprehensive services and solutions in an ever-changing environment.

Responsibilities:
* Design, develop, and maintain data pipelines and ETL processes to support data migration initiatives.
* Participate in data migration projects, ensuring data integrity, quality, and consistency across systems.
* Work with AWS-based data environments to build scalable and reliable data solutions.
* Develop and optimize Python and SQL scripts for data processing and transformation.
* Integrate and manage data from financial platforms such as BlackRock Aladdin and other investment management systems.
* Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
* Ensure high performance, reliability, and security of data workflows.
* Use Docker and Git for version control and containerized environments.
* Troubleshoot and resolve data-related issues in Linux and Windows environments.
* Document data architecture, pipelines, and migration processes.

Essential skills:
* Minimum 8 years of experience as a Data Engineer or in a similar role.
* Strong experience with Python and SQL for data processing and transformation.
* Hands-on experience with AWS cloud services.
* Solid knowledge of ETL development and data pipeline design.
* Experience with data migration projects.
* Proficiency with PostgreSQL or other relational databases.
* Experience with Docker and Git.
* Comfortable working in Linux and Windows environments.
* Fluent English communication skills.

What do I need before I apply:
* This opportunity is remote, but the candidate must already be living in Portugal (PT).


