AWS Data Engineer for one of the leading Big 4 firms



Bengaluru, India · Acme Services · Full time

Job Title: AWS Data Engineer
Location: Bangalore, Gurgaon, and Pune
Experience: 5 to 10 years
Employment Type: Full-time

About the Role

We are seeking a highly skilled AWS Data Engineer to design, build, and optimize scalable data pipelines and platforms. The ideal candidate will have strong expertise in AWS cloud services, Snowflake, Python, Airflow, Glue, Redshift, and Kafka, and will be responsible for ensuring efficient data integration, processing, and availability for analytics and business insights.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes on AWS.
  • Work extensively with Snowflake for data warehousing, transformations, and performance optimization.
  • Develop automation scripts using Python and PyScript to streamline data workflows.
  • Orchestrate and schedule complex workflows using Apache Airflow (see the illustrative sketch after this list).
  • Integrate, clean, and process structured and unstructured data from multiple sources, including Kafka streams.
  • Manage data storage, transformation, and loading in AWS Redshift, Glue, and S3.
  • Ensure data quality, governance, and security across all stages of the pipeline.
  • Collaborate with Data Scientists, Analysts, and other stakeholders to deliver reliable and timely data solutions.
  • Monitor, troubleshoot, and optimize data processes for performance and cost efficiency.
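
To give a flavour of the orchestration work described above, here is a minimal Airflow DAG sketch that runs an AWS Glue transformation and then copies the result from S3 into Redshift. The DAG id, Glue job name, bucket, schema, table, region, and connection ids are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: assumes the Glue job and Airflow connections already exist.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="daily_sales_pipeline",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a Glue job that cleans and transforms raw files landed in S3.
    transform = GlueJobOperator(
        task_id="transform_raw_sales",
        job_name="sales_transform_job",        # assumes this Glue job is already defined
        region_name="ap-south-1",              # placeholder region
    )

    # Copy the transformed data from S3 into a Redshift staging table.
    load = S3ToRedshiftOperator(
        task_id="load_to_redshift",
        schema="staging",
        table="sales",
        s3_bucket="example-curated-bucket",    # placeholder bucket
        s3_key="sales/",
        copy_options=["FORMAT AS PARQUET"],
        redshift_conn_id="redshift_default",
        aws_conn_id="aws_default",
    )

    transform >> load
```

In practice a production DAG would also include data quality checks, alerting, and retries, which this sketch omits for brevity.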

Required Skills & Experience

  • Strong hands-on experience with AWS cloud services (Redshift, Glue, S3, Lambda, IAM, EC2, etc.).
  • Expertise in Snowflake data warehousing and query optimization.
  • Proficiency in Python and PyScript for data engineering tasks.
  • Solid experience with Airflow for workflow orchestration and scheduling.
  • Knowledge of Kafka for real-time data ingestion and streaming (see the illustrative sketch after this list).
  • Strong understanding of ETL/ELT pipelines, data modeling, and data integration best practices.
  • Experience with SQL and handling large-scale datasets.
  • Good problem-solving and analytical skills with attention to detail.
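
As a rough illustration of the Kafka ingestion skill called out above, the sketch below consumes a topic with kafka-python and lands batches of records in S3 via boto3. The topic, brokers, consumer group, and bucket are hypothetical placeholders, and a real pipeline would add error handling and date-based partitioning.

```python
# Illustrative sketch only: batch Kafka messages and write them to S3 as JSON.
import json

import boto3
from kafka import KafkaConsumer  # kafka-python

consumer = KafkaConsumer(
    "orders-events",                           # hypothetical topic
    bootstrap_servers=["broker1:9092"],        # placeholder brokers
    group_id="s3-landing-consumer",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,
)

s3 = boto3.client("s3")
batch, batch_size = [], 500

for message in consumer:
    batch.append(message.value)
    if len(batch) >= batch_size:
        # Write one object per batch; a real pipeline would partition keys by date/hour.
        key = f"raw/orders/offset_{message.offset}.json"
        s3.put_object(
            Bucket="example-raw-bucket",        # placeholder bucket
            Key=key,
            Body=json.dumps(batch).encode("utf-8"),
        )
        consumer.commit()                       # commit offsets only after a successful write
        batch = []
```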

