AWS Data Engineer - SageMaker

4 days ago


Bangalore, India YASH Technologies Full time

Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python
Secondary skillsets: any ETL tool, GitHub, DevOps (CI/CD)
Mandatory skill set: Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications

• Strong understanding of and hands-on experience with AWS services such as EC2, S3, EMR, Glue, and Redshift
• Strong at developing and maintaining applications using Python and PySpark for data manipulation, transformation, and analysis
• Design and implement robust ETL pipelines using PySpark, focusing on performance, scalability, and data quality (a minimal illustrative sketch follows below)
• Lead and manage projects, including planning, execution, testing, and documentation, and act as the key point of contact for customer interaction
• Translate business requirements into technical solutions using AWS cloud services and Python/PySpark
• Deep understanding of Python and its data science libraries, along with PySpark for distributed data processing
• Proficiency in PL/SQL and T-SQL for data querying, manipulation, and database interactions
• Excellent written and verbal communication skills for collaborating with team members and stakeholders
• Experience leading and mentoring teams in a technical environment and providing proposals on solutioning and a design-based approach
• 5+ years of working experience in data integration and pipeline development
• 5+ years of experience with AWS Cloud data integration using a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems; Databricks and Redshift experience is a major plus
• 3+ years of experience using SQL in the development of data warehouse projects/applications (Oracle & SQL Server)
• Strong real-world Python development experience, especially with PySpark in an AWS Cloud environment
• Strong in SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch
• Workflow management tools such as Airflow

Good to have: Snowflake, Palantir Foundry
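The posting's central requirement is building ETL pipelines with PySpark on AWS. As a rough illustration only, here is a minimal PySpark extract-transform-load sketch; the S3 paths and column names (order_id, order_ts, amount) are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark ETL sketch (assumptions: placeholder S3 paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV from S3 (path is a placeholder).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: cast types, derive a partition column, apply a basic
# data-quality filter, and de-duplicate on the key.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
       .dropDuplicates(["order_id"])
)

# Load: write partitioned Parquet back to S3 for downstream Glue/Redshift use.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```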



  • Bangalore, India Gainwell Technologies Full time

    AWS Engineer with GenAI and Python/C#/Java
    Position Summary: The AWS Engineer is responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation (Terraform). This role is...


  • Data Engineer

    3 days ago


    Bangalore, India Mphasis Full time

    Responsibilities Automate data quality checks and validation processes using SQL, Python, and data testing frameworks. Perform reconciliation, integrity, and transformation testing across data platforms. Work with AWS SageMaker Studio (Unified Studio) for validating ML/data workflows and integrations. Validate data flows on cloud platforms (AWS,...
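This role centres on automating data-quality checks with Python and SQL. A minimal sketch of such checks in PySpark follows; the input path, column names, and the specific checks are illustrative assumptions, not requirements quoted from the posting.

```python
# Minimal data-quality check sketch in PySpark (placeholder path and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/orders/")  # placeholder

total = df.count()

# Completeness: the primary key must never be NULL.
null_keys = df.filter(F.col("order_id").isNull()).count()
assert null_keys == 0, f"{null_keys} rows have a NULL order_id"

# Uniqueness: the primary key must not repeat.
distinct_keys = df.select("order_id").distinct().count()
assert distinct_keys == total, "duplicate order_id values found"

# Validity: amounts must be positive.
bad_amounts = df.filter(F.col("amount") <= 0).count()
assert bad_amounts == 0, f"{bad_amounts} rows have a non-positive amount"

print(f"All checks passed on {total} rows")
```

In practice, assertions like these would usually run inside a testing framework (e.g., pytest) or a dedicated tool, which is presumably what the posting's "data testing frameworks" refers to.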


  • Bangalore, India Digitrix Software LLP Full time

    Data Engineer
    Location: Bangalore / Pune / Kolkata / Hyderabad / Gurugram
    Experience: 4 to 6 years (Python, AWS)
    Python (core language skill) -- backend, Pandas, PySpark (DataFrame API), interacting with AWS (e.g., boto3 for S3, Glue, Lambda)
    Data Processing: Spark (PySpark), Glue, EMR
    AWS Core Services: S3, Glue, Athena, Lambda, Step Functions, EMR...
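This listing explicitly names boto3 as the way Python code interacts with S3, Glue, and Lambda. The sketch below shows one simple call against each service; the bucket, Glue job, and Lambda function names are placeholders, not real resources.

```python
# Minimal boto3 sketch for S3, Glue, and Lambda (all names are placeholders).
import json
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")
lam = boto3.client("lambda")

# S3: list objects under a prefix.
resp = s3.list_objects_v2(Bucket="example-bucket", Prefix="raw/orders/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Glue: start a job run with a runtime argument.
run = glue.start_job_run(JobName="orders-etl", Arguments={"--run_date": "2024-01-01"})
print("Glue job run id:", run["JobRunId"])

# Lambda: invoke a function asynchronously.
lam.invoke(
    FunctionName="notify-downstream",
    InvocationType="Event",
    Payload=json.dumps({"status": "started"}).encode(),
)
```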


  • AWS Data Engineer

    1 week ago


    Bangalore, India L&T Technology Services Full time

    AWS Data Engineer
    Exp: 8-10 yrs
    Location: Bangalore
    Skills:
    1) Graduate with 8-10 years of experience working as an AWS Data Engineer
    2) Excellent knowledge of PySpark, JavaScript, AWS - RDS, and Redshift

  • Data Engineer

    2 days ago


    Bangalore, India Natlov Technologies Pvt Ltd Full time

    Hiring: Data Engineer (AWS)
    Location: Remote | Shift: 6 PM – 3 AM IST
    Join Natlov Technologies Pvt. Ltd. and be part of a dynamic data engineering team!
    What You'll Do:
    • Build and optimize scalable data pipelines & models
    • Ensure data quality, security, and performance
    • Collaborate across teams & mentor juniors
    • Work with modern tools like AWS, BigQuery,...

