About the Role
Mandatory Skills:
Experience deploying and running AWS-based data solutions using services such as S3, Lambda, SNS, and AWS Step Functions
Strong hands-on experience with PySpark
Proficiency with Python packages such as NumPy and Pandas
Sound working knowledge of AWS services is a must
Ability to work independently as an individual contributor
Familiarity with metadata management, data lineage, and data governance principles is a plus
➕ Good to Have:
Experience processing large sets of semi-structured and structured data
Experience building Data Lakes & working with Delta Tables
Knowledge of compute and cost-optimization strategies
Ability to build holistic Data Integration frameworks
Good exposure to Amazon MWAA (Managed Workflows for Apache Airflow) for orchestration
💬 Soft Skills:
Strong communication skills for effective interaction with IT stakeholders and business teams
Ability to understand business pain points and deliver accordingly
Additional Details:
Work Mode: Work from Office (WFO)
Shift: Day Shift (Monday to Friday)
Timings: 9:00 AM – 6:00 PM
Cab Facility: Not provided
Notice Period: 30 days
Career Gap: Should not exceed 3–4 months
📞 Interview Process:
Walk-in drive for shortlisted candidates on 12th July 2025 at Chennai office