Short Description:
Mactores, a leading provider of modern data platform solutions, is hiring a Senior AWS Data Engineer based in Mumbai for a full-time remote position. The ideal candidate has over three years of experience with PySpark and SQL and expertise in developing and maintaining data pipelines using Amazon EMR or AWS Glue. Proficiency in data modeling and end-user querying with Amazon Redshift or Snowflake, Amazon Athena, and Presto, along with experience in Airflow orchestration, is essential. The role involves collaborating with cross-functional teams, troubleshooting and optimizing data pipelines, and staying current with emerging AWS data technologies. Mactores values equal opportunity and emphasizes a culture guided by ten core leadership principles.
Senior AWS Data Engineer
Location: Mumbai, MH
Department: Data Engineering and Data Science
Employment Type: Full Time (Remote)
About Us: Mactores has been a reliable leader in delivering contemporary data platform solutions to businesses since 2008. Our commitment to automating End-to-End Data Solutions has empowered businesses to enhance their value through agility and security. We collaborate closely with our clients to facilitate a seamless digital transformation, offering expertise in assessments, migration, and modernization.
Position Overview: Mactores is seeking a skilled Senior AWS Data Engineer to join our dynamic team. The successful candidate will bring extensive experience in PySpark and SQL, with a proven track record of developing data pipelines using Amazon EMR or AWS Glue. The ideal candidate will also have expertise in data modeling and end-user querying with platforms such as Amazon Redshift or Snowflake, Amazon Athena, and Presto, plus hands-on experience orchestrating pipelines with Airflow.
Responsibilities:
- Develop and manage data pipelines employing Amazon EMR or AWS Glue.
- Construct data models and facilitate end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto.
- Build and sustain orchestration of data pipelines using Airflow.
- Collaborate with cross-functional teams to comprehend their data requirements and contribute to solution design.
- Troubleshoot and optimize data pipelines and data models.
- Write PySpark and SQL scripts for data extraction, transformation, and loading.
- Document and convey technical solutions to both technical and non-technical audiences.
- Stay abreast of emerging AWS data technologies and assess their impact on existing systems.
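The PySpark and SQL work above follows the familiar extract-transform-load shape. As a rough illustration only (this is a library-free Python sketch, not Mactores code, and all field names such as `order_id`, `region`, and `amount` are hypothetical), the day-to-day pattern looks like:

```python
# Illustrative ETL sketch. In practice, extract would read from S3 via
# Spark, and load would write to Redshift/Snowflake; here both are
# stubbed with plain Python so the shape of the work is visible.

def extract(rows):
    """Extract: return raw records (stand-in for a Spark read)."""
    return list(rows)

def transform(records):
    """Transform: drop malformed rows, aggregate amount per region."""
    totals = {}
    for r in records:
        if r.get("amount") is None:  # skip rows with missing values
            continue
        totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]
    return totals

def load(totals, sink):
    """Load: write aggregates to a sink (stand-in for a warehouse)."""
    sink.update(totals)
    return sink

raw = [
    {"order_id": 1, "region": "west", "amount": 10},
    {"order_id": 2, "region": "west", "amount": 5},
    {"order_id": 3, "region": "east", "amount": None},  # malformed row
]
warehouse = {}
load(transform(extract(raw)), warehouse)
```

In a real pipeline each stage would be a Spark job or SQL step scheduled as an Airflow task, but the extract/transform/load separation is the same.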
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience with PySpark and SQL.
- 2+ years of experience building and maintaining data pipelines using Amazon EMR or AWS Glue.
- 2+ years of experience with data modeling and end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto.
- 1+ years of experience building and maintaining data pipeline orchestration using Airflow.
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Ability to work independently and within a team environment.
Preferred:
- AWS Certified Data Analytics – Specialty certification.
- Experience with Agile development methodology.
Life at Mactores: We are dedicated to cultivating a culture that positively impacts the lives of every Mactorian. Our 10 Core Leadership Principles, centered around Decision-making, Leadership, Collaboration, and Curiosity, guide our work.
The Path to Joining the Mactores Team: Our recruitment process consists of three stages:
- Pre-Employment Assessment: Evaluate technical proficiency and suitability for the role.
- Managerial Interview: Assess technical skills, hands-on experience, leadership potential, and communication abilities.
- HR Discussion: Discuss the offer and next steps with an HR team member.
Please click here to apply.