




What will you do?

* Use specialized knowledge of AWS services such as Lambda, Glue, and Step Functions to design, implement, and maintain scalable data solutions;
* Develop robust solution architectures considering scalability, performance, security, and cost optimization;
* Demonstrate proficiency in cloud networking, including VPCs, subnets, security groups, and routing tables;
* Design efficient data models to optimize query performance;
* Write and optimize SQL queries, identifying potential performance bottlenecks;
* Manage ETL processes and data integration in Redshift, MySQL, and PostgreSQL;
* Create documentation and train team members;
* Configure and manage logging and tracing mechanisms on AWS through services such as AWS CloudTrail and AWS X-Ray;
* Implement orchestration solutions using Apache Airflow and AWS Step Functions (illustrated in the first sketch below);
* Use Athena for interactive analysis of large volumes of data in Amazon S3 (illustrated in the second sketch below);
* Provide technical leadership and act as an expert in AWS and data engineering technologies;
* Produce comprehensive technical and solution documentation;
* Stay up to date with new technologies and market trends;
* Challenge business requirements and propose innovative solutions to improve efficiency and performance.

What are we looking for?

* AWS Specialization: Solid experience with AWS services, including Lambda, Glue, Step Functions, CloudFormation, and CloudWatch;
* Solution Architecture Knowledge: Ability to design scalable and efficient data solutions on AWS, following cloud architecture best practices;
* Proficiency in Python and Databases: Advanced programming skills in Python and experience with relational databases (MySQL, PostgreSQL, Redshift) and NoSQL;
* Workflow Orchestration and Management: Experience with orchestration tools such as Apache Airflow and AWS Step Functions for automation and management of data pipelines;
* ETL and Big Data Tools: Knowledge of ETL tools and experience handling large volumes of data (experience with Kafka is valued);
* Experience with Iceberg Tables: Familiarity with Iceberg tables for efficient management of large-scale datasets, ensuring consistency and ACID transaction support;
* Production Awareness and Troubleshooting: Proactive approach to monitoring and resolving production issues, anticipating and mitigating risks;
* Technical Leadership and Communication: Ability to grow into a tech lead role, with excellent communication and teamwork skills;
* Analytical and Problem-Solving Skills: Ability to analyze requirements, define technical approaches, and propose innovative solutions to complex problems;
* Documentation and Requirements Analysis: Experience writing technical and solution documentation, with the ability to question and refine business requirements;
* Knowledge of Azure Databricks: Familiarity with Databricks for data engineering and analytics tasks.
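To give a concrete picture of the orchestration work described above, here is a minimal sketch of an Airflow DAG that starts an AWS Step Functions execution via boto3. The DAG id, schedule, state machine ARN, and input payload are illustrative placeholders, not details of this role's actual pipelines.

```python
# Minimal Airflow DAG that kicks off a (hypothetical) Step Functions state machine.
# The ARN, DAG id, schedule, and input are placeholders for illustration only.
from datetime import datetime
import json

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def start_etl_execution() -> str:
    """Start the hypothetical ETL state machine and return its execution ARN."""
    client = boto3.client("stepfunctions")
    response = client.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:etl-pipeline",
        input=json.dumps({"source": "s3://example-bucket/raw/"}),
    )
    return response["executionArn"]


with DAG(
    dag_id="trigger_etl_state_machine",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="start_execution", python_callable=start_etl_execution)
```

The plain boto3 call keeps the sketch self-contained; in practice the Airflow Amazon provider package also ships dedicated Step Functions operators that can replace it.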

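Likewise, a minimal sketch of the Athena responsibility: submitting an interactive SQL query over data stored in Amazon S3 with boto3 and polling for the result. The database, table, query, and results bucket are hypothetical.

```python
# Minimal example of running an interactive Athena query over data in S3 with boto3.
# The database, table, and result bucket names are placeholders.
import time

import boto3

athena = boto3.client("athena")

# Submit the query; Athena writes results to the given S3 location.
execution = athena.start_query_execution(
    QueryString="""
        SELECT event_date, COUNT(*) AS events
        FROM web_events
        WHERE event_date >= DATE '2024-01-01'
        GROUP BY event_date
        ORDER BY event_date
    """,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then print the result rows (first row is the header).
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```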

