AWS Data Engineer - SageMaker

4 weeks ago


Bangalore, India YASH Technologies Full time

Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python
Secondary skillsets: Any ETL tool, GitHub, DevOps (CI/CD)
Mandatory skill set: Python, PySpark, SQL, AWS

Responsibilities and requirements:
- Design, develop, test, and support data pipelines and applications
- Strong understanding and hands-on experience with AWS services such as EC2, S3, EMR, Glue, and Redshift
- Develop and maintain applications using Python and PySpark for data manipulation, transformation, and analysis
- Design and implement robust ETL pipelines using PySpark, focusing on performance, scalability, and data quality
- Lead and manage projects, including planning, execution, testing, and documentation, and act as the key point of contact for customer interaction
- Translate business requirements into technical solutions using AWS cloud services and Python/PySpark
- Deep understanding of Python and its data science libraries, along with PySpark for distributed data processing
- Proficiency in PL/SQL and T-SQL for data querying, manipulation, and database interactions
- Excellent written and verbal communication skills to collaborate with team members and stakeholders
- Experience leading and mentoring teams in a technical environment and providing proposals on solution- and design-based approaches
- 5+ years of working experience in data integration and pipeline development
- 5+ years of experience with AWS Cloud data integration using a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems
- Databricks and Redshift experience is a major plus
- 3+ years of experience using SQL in the development of data warehouse projects/applications (Oracle & SQL Server)
- Strong real-world Python development experience, especially with PySpark in an AWS Cloud environment
- Strong experience with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch
- Experience with workflow management tools like Airflow

Good to have: Snowflake, Palantir Foundry



  • Bangalore, India YASH Technologies Full time

    Primary skillsets: AWS SageMaker, Power BI & Python. Secondary skillsets: Any ETL tool, GitHub, DevOps (CI/CD). Mandatory skill set: Python, PySpark, SQL, AWS. Designing, developing, testing, and supporting data pipelines and applications. Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue, Redshift. Strong in...


  • Bangalore, India DigiHelic Solutions Pvt. Ltd. Full time

    Job Role: ML Engineer
    Experience: 6-12 Years
    Location: Pune, Bangalore, Hyderabad, Trivandrum, Chennai, Kochi, Gurgaon, Noida
    Key Summary:
    ● The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
    ● Model Development: Algorithms and architectures span traditional statistical methods to deep...



  • Data Engineer

    7 days ago


    Bangalore, India Mphasis Full time

    Responsibilities: Automate data quality checks and validation processes using SQL, Python, and data testing frameworks. Perform reconciliation, integrity, and transformation testing across data platforms. Work with AWS SageMaker Studio (Unified Studio) to validate ML/data workflows and integrations. Validate data flows on cloud platforms (AWS, Azure)...


  • Bangalore, India Tata Consultancy Services Full time

    Job Description:
    Role: AWS Data Engineer
    Required Technical Skill Set: AWS
    Desired Experience Range: 4-6 yrs.
    Location of Requirement: Pune
    Desired Competencies (Technical/Behavioral Competency):
    Must-Have: Should have expertise in creating data warehouses in AWS utilizing the following tools: EC2, S3, EMR, Athena, SageMaker, Aurora and...

  • AWS Data Engineer

    6 days ago


    Bangalore, India Info Origin Inc. Full time

    NEW OPPORTUNITY || IMMEDIATE TO 30 DAYS JOINERS REQUIRED || AWS Data Engineer ||
    Job Title: AWS Data Engineer
    Location: Bangalore
    Employment Type: Full-time
    Role Overview: We are looking for an experienced AWS Engineer with around 5-7 years of hands-on experience in designing, deploying, and managing applications on the AWS cloud platform. The ideal candidate...

  • AWS Data Engineer

    2 weeks ago


    Bangalore, India L&T Technology Services Full time

    AWS Data Engineer
    Experience: 8-10 yrs
    Location: Bangalore
    Skills:
    1) Graduate with 8-10 years of experience working as an AWS Data Engineer
    2) Excellent knowledge of PySpark, JavaScript, AWS RDS, and Redshift

  • AWS Data Engineer

    5 days ago


    Bangalore, India Coforge Full time

    AWS Data Engineer
    Job Location: Bengaluru
    Experience Required: 5+ Years
    Mandatory Skills: AWS Services, ETL, ETL Integration, CodePipeline, Jenkins, Glue, EMR, Athena, ECS, EKS, Kubernetes, CloudWatch, Prometheus, Grafana, Python, Shell, or PowerShell
    Job Description: We are looking for an experienced AWS Engineer with around 5-8 years of hands-on experience in...

  • AWS Data Engineer

    6 days ago


    Bangalore, India Coforge Full time

    We are hiring AWS Data Engineers at Coforge Ltd.
    Job Location: Bangalore
    Experience Required: 5 to 7 Years
    Availability: Immediate joiners preferred
    📧 Send your CV to Gaurav.2.Kumar@coforge.com
    📱 WhatsApp: 9667427662 for any queries
    Role Overview: Coforge Ltd. is seeking a skilled AWS Engineer with 5–7 years of hands-on experience in designing, deploying,...

  • Python Data Engineer

    4 weeks ago


    Bangalore, India Digitrix Software LLP Full time

    Location: Bangalore / Pune / Kolkata / Hyderabad / Gurugram
    Role: Data Engineer
    Experience: 4 to 6 years
    Skills: Python, AWS
    Python (core language skill): Backend, Pandas, PySpark (DataFrame API), interacting with AWS (e.g., boto3 for S3, Glue, Lambda)
    Data Processing: Spark (PySpark), Glue, EMR
    AWS Core Services: S3, Glue, Athena, Lambda, Step Functions, EMR ...