




**Summary:** Junior Data Engineer sought to build and consolidate a career in Data Engineering, participating in data pipeline development, preparation, and transformation within Azure Data Lake / Lakehouse architectures.

**Highlights:**

1. Work with an experienced technical team, with guidance and mentorship
2. Participate in the development of data pipelines
3. Opportunities for continuous learning and technical growth

We are looking for a **Junior Data Engineer** with **1 to 2 years of experience** who is interested in continuing to build and consolidate their career in **Data Engineering**, working on projects based on **Microsoft Azure**. The selected candidate will join an experienced technical team, participating in the development of data pipelines and in data preparation and transformation, and gaining hands-on experience with **Data Lake / Lakehouse architectures**, with guidance and mentorship.

#### **Responsibilities**

* Support the gathering and understanding of functional and technical requirements, in collaboration with the team.
* Participate in the implementation of data solutions on **Microsoft Azure**, following best practices and defined standards.
* Contribute to the development and maintenance of **ETL/ELT pipelines** using:
  + Azure Data Factory
  + Azure Databricks
  + PySpark
* Support data preparation, transformation, and validation activities in **Data Lake / Lakehouse** environments.
* Help ensure data quality, consistency, and reliability.
* Document developments, processes, and technical solutions.
* Continuously learn and evolve with the team's technologies and methodologies.

#### **Mandatory Requirements**

* Bachelor's degree (or Master's degree) in areas such as:
  + Computer Engineering
  + Data Science
  + Information Systems
  + Electrical Engineering or similar fields
* 1 to 2 years of experience (or a relevant internship) in **Data Engineering, BI**, or related areas.
#### **Basic (Practical) Knowledge of**

* Azure Databricks and/or Azure Data Factory
* PySpark
* Power BI

#### **Familiarity With**

* Data pipelines and ETL/ELT processes
* Data formats such as **Parquet** and **Delta Lake**
* Basic knowledge of **SQL and/or Python**
* Interest in **Agile methodologies** and **DevOps/DataOps practices**
* Good organizational and problem-solving skills

#### **Personal Profile**

* Strong willingness to learn and grow technically
* Proactive attitude and positive mindset
* Good communication skills and ability to work in a team
* Critical thinking and attention to detail
* Strong sense of responsibility and commitment to delivery

**What can you expect from us?**

**Mind-blowing workplace culture.** You will be integrated into a professional, dynamic, and collaborative team.

**100% remote opportunities.** We want you to have the flexibility to work where you feel most comfortable and productive.

**International career**

* You can expect professional growth and to be connected with the world.
* We are represented in Portugal, Belgium, Luxembourg, and Denmark.
* And we have projects in many other countries: the Netherlands, Luxembourg, Singapore, and the United States of America (and a lot more is coming…)

**Extra Benefits & Perks**

If you wish to work with us and you are outside the European Union (good news…), we are a Tech Visa company and we will help!

**As a plus, we provide Health and Life Insurance.**


