
AWS Data Engineer - SageMaker
3 days ago
Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python
Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD)
Mandatory skill set: Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications
Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue, Redshift
Strong in developing and maintaining applications using Python and PySpark for data manipulation, transformation, and analysis
Design and implement robust ETL pipelines using PySpark, focusing on performance, scalability, and data quality (see the sketch after this listing)
Lead and manage projects, including planning, execution, testing, and documentation, and serve as the key point of contact for customer interaction
Translate business requirements into technical solutions using AWS cloud services and Python/PySpark
Deep understanding of Python and its data science libraries, along with PySpark for distributed data processing
Proficiency in PL/SQL and T-SQL for data querying, manipulation, and database interactions
Excellent written and verbal communication skills to collaborate with team members and stakeholders
Experience leading and mentoring teams in a technical environment and providing proposals on solutioning and design-based approaches
5+ years of working experience in data integration and pipeline development
5+ years of experience with AWS Cloud data integration using a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems; Databricks and Redshift experience is a major plus
3+ years of experience using SQL in the development of data warehouse projects/applications (Oracle & SQL Server)
Strong real-life experience in Python development, especially with PySpark in an AWS Cloud environment
Strong SQL and NoSQL databases like MySQL, Postgres, DynamoDB, Elasticsearch
Workflow management tools like Airflow
Good to have: Snowflake, Palantir Foundry
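The listing above leans heavily on PySpark-based ETL in AWS. Purely as an illustration of the kind of pipeline it describes, here is a minimal PySpark sketch that reads raw CSV from S3, applies basic cleaning, and writes partitioned Parquet back to S3; the bucket paths, column names, and the run_pipeline helper are hypothetical placeholders, not anything specified in the posting.

# Minimal PySpark ETL sketch (hypothetical paths and columns; not from the posting).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run_pipeline(source_path: str, target_path: str) -> None:
    spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

    # Extract: raw CSV landed in S3; a production job would declare an explicit
    # schema instead of relying on header inference.
    raw = spark.read.option("header", True).csv(source_path)

    # Transform: deduplicate, parse the timestamp, derive a partition column,
    # and drop rows that fail a basic quality check.
    cleaned = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("order_date", F.to_date("order_ts"))
        .filter(F.col("order_id").isNotNull() & F.col("order_ts").isNotNull())
    )

    # Load: write partitioned Parquet, a common target for Glue catalogs,
    # Athena, and Redshift Spectrum.
    cleaned.write.mode("overwrite").partitionBy("order_date").parquet(target_path)

    spark.stop()

if __name__ == "__main__":
    run_pipeline(
        "s3://example-raw-bucket/orders/",       # hypothetical source
        "s3://example-curated-bucket/orders/",   # hypothetical target
    )

Partitioned Parquet keeps downstream scans cheap, and the same transform logic would carry over largely unchanged into an AWS Glue job or a Databricks notebook.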
-
Bengaluru, Karnataka, India Amazon Web Services (AWS) Full time US$ 1,50,000 - US$ 2,00,000 per year. Description: AWS Utility Computing (UC) provides product innovations, from foundational services such as Amazon's Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2) to consistently released new product innovations that continue to set AWS's services and features apart in the industry. As a member of the UC organization, you'll support the...
-
Bengaluru, India Alcon Full time. Job Description: Looking for a Sr MLOps Engineer with a strong background in Data Engineering who aspires to be part of a Data Science & AI Operations function in R&D. This role involves applying advanced software engineering skills for the design, creation, management, and business use of large datasets across a variety of Data/ML platforms. This is an...
-
AWS Data Engineer - SageMaker
7 days ago
Bengaluru, Karnataka, India YASH Technologies Full time. AWS services including Glue, PySpark, SQL, Databricks, Python. Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD). Python, PySpark, SQL, AWS with designing, developing, testing and supporting data pipelines and applications. Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue, Redshift. Strong in developing and...
-
AWS Data Engineer - SageMaker
1 week ago
Bengaluru, Karnataka, India YASH Technologies Full time ₹ 15,00,000 - ₹ 28,00,000 per year. Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python. Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD). Mandatory skill set: Python, PySpark, SQL, AWS with designing, developing, testing and supporting data pipelines and applications. Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue,...
-
AWS Data Engineer - SageMaker
1 week ago
Bengaluru, Karnataka, India YASH Technologies Full time. Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python. Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD). Mandatory skill set: Python, PySpark, SQL, AWS with designing, developing, testing and supporting data pipelines and applications. Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue,...
-
AWS Data Engineer
2 weeks ago
Bengaluru, Karnataka, India Sampoorna Consultants Pvt. Ltd Full time ₹ 15,00,000 - ₹ 20,00,000 per year. Job Requirements / Mandatory Skills: Bachelor's degree in computer science, data science, engineering, mathematics, information systems, or a related technical discipline. 7+ years of relevant experience in data engineering roles. Detailed knowledge of data warehouse technical architectures, data modelling, infrastructure components, ETL/ELT and reporting/analytic...
-
AWS Engineer with GenAI
3 days ago
Bengaluru, India Gainwell Technologies Full time. AWS Engineer with GenAI and Python/C#/Java. Position Summary: The AWS Engineer is responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation (Terraform). This role is...
-
AWS Engineer
2 weeks ago
Bengaluru, Karnataka, India Gainwell Technologies Full time ₹ 15,00,000 - ₹ 25,00,000 per year. Position Summary: The AWS Engineer is responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation (Terraform). Key Responsibilities: Architect, provision, and maintain AWS...
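The Gainwell listings above center on provisioning AWS demo environments and keeping them reliable (ECS, Lambda, RDS, S3, with Terraform for automation). As a rough sketch only, and using boto3 rather than Terraform, the snippet below checks whether a hypothetical demo environment's ECS service and RDS instance look healthy; every resource name is a placeholder.

# Illustrative health check for a hypothetical demo environment (names are placeholders).
import boto3

def demo_environment_healthy(cluster: str, service: str, db_instance_id: str) -> bool:
    ecs = boto3.client("ecs")
    rds = boto3.client("rds")

    # ECS: healthy when the running task count matches the desired count.
    svc = ecs.describe_services(cluster=cluster, services=[service])["services"][0]
    ecs_ok = svc["desiredCount"] > 0 and svc["runningCount"] == svc["desiredCount"]

    # RDS: the instance should report an "available" status.
    db = rds.describe_db_instances(DBInstanceIdentifier=db_instance_id)["DBInstances"][0]
    rds_ok = db["DBInstanceStatus"] == "available"

    return ecs_ok and rds_ok

if __name__ == "__main__":
    print(demo_environment_healthy("demo-cluster", "demo-api-service", "demo-postgres"))

In practice a check like this would run after the infrastructure build completes, as a smoke test before handing the demo environment over.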