AWS Data Engineer - SageMaker

1 week ago


Bengaluru, Karnataka, India YASH Technologies Full time ₹ 15,00,000 - ₹ 25,00,000 per year

Primary skillset:
AWS services including Glue, PySpark, SQL, Databricks, Python

Secondary skillset:
Any ETL tool, GitHub, DevOps (CI/CD)

Mandatory Skill Set:

  • Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications
  • Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue, Redshift
  • Strong experience developing and maintaining applications using Python and PySpark for data manipulation, transformation, and analysis
  • Design and implement robust ETL pipelines using PySpark, focusing on performance, scalability, and data quality
  • Lead and manage projects, including planning, execution, testing, and documentation, and act as the key point of contact for customer interaction
  • Translate business requirements into technical solutions using AWS cloud services and Python/PySpark
  • Deep understanding of Python and its data science libraries, along with PySpark for distributed data processing
  • Proficiency in PL/SQL, T-SQL for data querying, manipulation, and database interactions
  • Excellent written and verbal communication skills to collaborate with team members and stakeholders
  • Experience leading and mentoring teams in a technical environment and providing solution proposals and a design-based approach
  • 5+ years of working experience in data integration and pipeline development
  • 5+ years of experience with AWS Cloud data integration using a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems; Databricks and Redshift experience is a major plus
  • 3+ years of experience using SQL in the development of data warehouse projects/applications (Oracle and SQL Server)
  • Strong hands-on experience in Python development, especially PySpark in an AWS Cloud environment
  • Strong experience with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch
  • Experience with workflow management tools such as Airflow

Good to Have : Snowflake, Palantir Foundry
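For context, the requirements above centre on an extract-transform-load pattern: read raw records, apply data-quality filtering, aggregate, and write results. A minimal local sketch of that shape in plain Python follows; in a real project this would be a PySpark job on AWS Glue or EMR reading from S3, and all field names here are hypothetical.

```python
# Minimal local sketch of the ETL pattern this role describes.
# Plain Python stands in for PySpark so the pipeline shape is visible;
# field names ("region", "amount") are hypothetical.

from collections import defaultdict

def extract(rows):
    """Stand-in for reading source records (e.g. CSV/Parquet from S3)."""
    return list(rows)

def transform(rows):
    """Data-quality filter plus a per-region amount aggregation."""
    clean = [r for r in rows if r.get("region") and r.get("amount") is not None]
    totals = defaultdict(float)
    for r in clean:
        totals[r["region"]] += float(r["amount"])
    return dict(totals)

def load(totals):
    """Stand-in for writing results (e.g. to Redshift or S3)."""
    return sorted(totals.items())

raw = [
    {"region": "south", "amount": 10.0},
    {"region": "south", "amount": 5.5},
    {"region": "north", "amount": 7.0},
    {"region": None, "amount": 3.0},  # dropped by the quality check
]
result = load(transform(extract(raw)))
print(result)  # [('north', 7.0), ('south', 15.5)]
```

In PySpark the same steps would map to `spark.read`, a `filter` plus `groupBy(...).sum(...)`, and `DataFrame.write`, with the quality check expressed as column predicates rather than dictionary lookups.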




  • AWS Data Engineer

    2 weeks ago


    Bengaluru, Karnataka, India Sampoorna Consultants Pvt. Ltd Full time ₹ 15,00,000 - ₹ 20,00,000 per year

    Job Requirements / Mandatory Skills: Bachelor's degree in computer science, data science, engineering, mathematics, information systems, or a related technical discipline. 7+ years of relevant experience in data engineering roles. Detailed knowledge of data warehouse technical architectures, data modelling, infrastructure components, ETL/ELT and reporting/analytic...

  • AWS Engineer

    2 weeks ago


    Bengaluru, Karnataka, India Gainwell Technologies Full time ₹ 10,00,000 - ₹ 25,00,000 per year

    Position Summary: The AWS Engineer is responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation (Terraform). Key Responsibilities: Architect, provision, and maintain AWS...


  • AWS Engineer

    2 weeks ago


    Bengaluru, Karnataka, India Gainwell Technologies Full time ₹ 20,00,000 - ₹ 25,00,000 per year

    Gainwell Technologies LLC: Gainwell Technologies is the leading provider of technology solutions vital to the administration and operations of health and human services programs. We are the key player in the Medicaid space with a presence in 51 of the 56 U.S. states and territories, with offerings including Medicaid Management Information Systems (MMIS), Fiscal...

  • AWS Data Engineer

    5 days ago


    Bengaluru, Karnataka, India Tata Consultancy Services Full time

    Experience: 5-10 yrs. Location: Bangalore, Chennai, Hyderabad, Pune, Kochi, Bhubaneswar, Kolkata. Key Skills: AWS Lambda, Python, Boto3, PySpark, Glue. Must-have Skills: Strong experience in Python to package, deploy and monitor data science apps; knowledge of Python-based automation; knowledge of Boto3 and related Python packages; working experience in AWS and AWS...

  • Data Engineer

    5 days ago


    Bengaluru, Karnataka, India Mphasis Full time

    Responsibilities: Automate data quality checks and validation processes using SQL, Python, and data testing frameworks. Perform reconciliation, integrity, and transformation testing across data platforms. Work with AWS SageMaker Studio (Unified Studio) for validating ML/data workflows and integrations. Validate data flows on cloud platforms (AWS, Azure). Integrate...