
Python + AWS Data Engineer (posted 11/09/2025)
4 days ago
Responsibilities
- Analyze and translate business requirements into scalable and resilient designs.
- Own parts of the application and continuously improve them in an agile environment.
- Create high-quality, maintainable products and applications using best engineering practices.
- Pair with other developers and share design philosophy and goals across the team.
- Work in cross-functional teams (DevOps, Data, UX, Testing, etc.).
- Build and manage fully automated build/test/deployment environments.
- Ensure high availability and provide quick turnaround on production issues.
- Contribute to the design of useful, usable, and desirable products in a team environment.
- Adapt to new programming languages, methodologies, platforms, and frameworks to support business needs.
Qualifications
- Degree in computer science or a similar field.
- Four or more years of experience architecting, designing, developing, and implementing cloud solutions on AWS and/or Azure platforms.
- Technologies: Python, Azure, AWS, MLflow, Kubernetes, Terraform, AWS SageMaker, Lambda, Step Functions.
- Development experience with configuration management tools (Terraform, Ansible, CloudFormation).
- Experience developing and maintaining continuous integration and continuous deployment pipelines (Jenkins, Groovy scripts).
- Experience developing containerized solutions and orchestration (Docker, Kubernetes, ECS, ECR).
- Experience with serverless architecture, cloud computing, cloud-native applications, and scalability.
- Understanding of core cloud concepts such as infrastructure as code, IaaS, PaaS, and SaaS.
- Relevant Azure or AWS certification preferred.
- Strong troubleshooting and analytical skills.
- Knowledge of AI/ML technologies and ML model management.
- Strong verbal and written communication; able to articulate clearly and concisely.
AWS Data Engineer
2 weeks ago
Mumbai, Maharashtra, India | Talent Worx | Full time | ₹ 15,00,000 - ₹ 20,00,000 per year
We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.
Experience: 3 to 7 years
Location: Mumbai
Requirements / Key Responsibilities: 1. Design and implement...
AWS Data Engineer
2 weeks ago
Mumbai, Maharashtra, India | Talent Worx | Full time | ₹ 15,00,000 - ₹ 20,00,000 per year
We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.
Experience: 8 to 12 years
Location: Bangalore, Coimbatore, Delhi NCR, Mumbai
Requirements / Key...
AWS Data Engineer
4 days ago
Kolkata, Noida, Pune, India | Tekskills | Full time | ₹ 8,00,000 - ₹ 12,00,000 per year
Job Title: Engineer
Skills Required: AWS Data Engineer
Experience Range Required: 4+ Years
Mandatory Skills: Data Engineering (Snowflake, Python)
AWS Data Engineer
6 days ago
Mumbai, Maharashtra, India | Infogain | Full time | ₹ 15,00,000 - ₹ 20,00,000 per year
Roles & Responsibilities
Role Summary: We are looking for a highly skilled AWS Databricks Data Engineer to design, develop, and optimize data pipelines and lakehouse solutions on AWS using Databricks. The ideal candidate will have strong experience in PySpark, SQL, Delta Lake, AWS native services, Unity Catalog, and building scalable data processing solutions for...
AWS Data Engineer
6 hours ago
Pune, Maharashtra, India | Zorba AI | Full time
Job Description: Senior AWS Data Engineer (PySpark & Python), On-site, India
Industry & Sector: Leading IT services & cloud data engineering sector focused on end-to-end data platforms, analytics, and enterprise-scale ETL/ELT solutions. We deliver production-grade data pipelines, real-time streaming, and analytics integrations for large enterprise customers...
Scalable Data Engineer – Python, SQL, AWS
1 week ago
Pune, Maharashtra, India | beBeeDataEngineer | Full time | ₹ 15,00,000 - ₹ 20,10,000
We are seeking a seasoned data engineering expert to design and build scalable data pipelines using Python, SQL, and AWS.
- Develop ETL/ELT pipelines with PySpark, AWS, and Databricks.
- Profile, validate, and transform data using Python and SQL.
- Optimize workflows for performance, scalability, and cost efficiency.
- Create automation scripts in...
Python/Django + AWS
3 days ago
Noida, Uttar Pradesh, India | Iris Software | Full time | ₹ 15,00,000 - ₹ 28,00,000 per year
Posted On: 11 Sept 2025
Location: Noida, UP, India
Company: Iris Software
Why Join Us? Are you inspired to grow your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and...
AWS Data Engineer
1 week ago
Noida, Uttar Pradesh, India | Coforge | Full time
Role: AWS Data Engineer
Skills: AWS & PySpark
Job Locations: Pune, Greater Noida & Hyderabad
Experience: 6+ years
We at Coforge are hiring AWS Data Engineers with the following skill-set:
- Design, develop, and maintain robust ETL/ELT pipelines using tools like Apache Spark, Airflow, or similar.
- Build and optimize data architectures (data lakes, data...
AWS Data Engineer
6 days ago
Noida, Uttar Pradesh, India | VyTCDC | Full time | US$ 1,20,000 - US$ 2,00,000 per year
Key Responsibilities:
- Design and implement ETL/ELT pipelines using Databricks, PySpark, and AWS Glue
- Develop and maintain scalable data architectures on AWS (S3, EMR, Lambda, Redshift, RDS)
- Perform data wrangling, cleansing, and transformation using Python and SQL
- Collaborate with data scientists to integrate Generative AI models into analytics workflows
- Build...