
AWS Data Engineer - SageMaker
2 days ago
Primary skillsets: AWS SageMaker, Power BI & Python
Secondary skillset: Any ETL tool, GitHub, DevOps (CI/CD)
Mandatory Skill Set:
- Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications
- Strong understanding and hands-on experience with AWS services such as EC2, S3, EMR, Glue, and Redshift
- Strong in developing and maintaining applications using Python and PySpark for data manipulation, transformation, and analysis
- Design and implement robust ETL pipelines using PySpark, focusing on performance, scalability, and data quality
- Lead and manage projects, including planning, execution, testing, and documentation, and act as the key point of contact for customer interaction
- Translate business requirements into technical solutions using AWS cloud services and Python/PySpark
- Deep understanding of Python and its data science libraries, along with PySpark for distributed data processing
- Proficiency in PL/SQL and T-SQL for data querying, manipulation, and database interactions
- Excellent written and verbal communication skills to collaborate with team members and stakeholders
- Experience leading and mentoring teams in a technical environment, and providing solution proposals and design-based approaches
- 3+ years of experience using SQL in the development of data warehouse projects/applications (Oracle & SQL Server)
- Strong hands-on experience in Python development, especially PySpark, in an AWS cloud environment
- Strong experience with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch
- Experience with workflow management tools such as Airflow
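The pipeline skills above all revolve around the same extract-transform-load shape. A minimal sketch of that shape, using only the Python standard library (at the scale this role describes, the same steps would run as PySpark on EMR or Glue; the table names, columns, and 18% tax rule here are invented for illustration):

```python
import sqlite3

# Hypothetical ETL sketch: extract raw rows, transform with a data-quality
# filter and a derived column, load into a clean target table.
def run_pipeline(conn):
    # Extract: pull raw order rows from a staging table.
    rows = conn.execute("SELECT order_id, amount FROM staging_orders").fetchall()
    # Transform: drop null/non-positive amounts (quality filter), derive tax.
    cleaned = [(oid, amt, round(amt * 0.18, 2))
               for oid, amt in rows
               if amt is not None and amt > 0]
    # Load: write the transformed rows into the target table.
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", cleaned)
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE orders_clean (order_id INTEGER, amount REAL, tax REAL)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                 [(1, 100.0), (2, None), (3, -5.0), (4, 50.0)])
loaded = run_pipeline(conn)
print(loaded)  # 2 rows survive the quality filter
```

The same three stages map directly onto the PySpark DataFrame API (`spark.read`, `.filter`/`.withColumn`, `.write`) when the data no longer fits one machine.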
-
Python AWS Data Engineer
6 days ago
Bangalore, India | Digitrix Software LLP | Full time
Experience: 5 to 8 years
Job description: Python AWS Data Engineer
Python, AWS: Python (core language skill) -- backend, Pandas, PySpark (DataFrame API), interacting with AWS (e.g., boto3 for S3, Glue, Lambda)
Data Processing: Spark (PySpark), Glue, EMR
AWS Core Services: S3, Glue, Athena, Lambda, Step Functions, EMR
Containerization: Docker
Orchestration: ...
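In the S3/Glue/Athena stack this posting describes, data typically lands in S3 under Hive-style partitioned prefixes that Glue crawlers and Athena can read. A small sketch of building such a key (the table name, bucket, and file naming are invented; a real upload would go through boto3):

```python
from datetime import date

# Hypothetical helper: build a Hive-style partitioned S3 object key
# (table/dt=YYYY-MM-DD/part-NNNNN.parquet), the layout Glue and Athena
# conventionally expect for date-partitioned tables.
def partition_key(table: str, dt: date, part: int) -> str:
    return f"{table}/dt={dt.isoformat()}/part-{part:05d}.parquet"

key = partition_key("orders_clean", date(2024, 1, 15), 3)
print(key)  # orders_clean/dt=2024-01-15/part-00003.parquet

# With boto3 the key would then be used roughly like:
#   boto3.client("s3").upload_file(local_path, "my-bucket", key)
```

Partitioning on a date column this way lets Athena prune whole prefixes at query time instead of scanning the full table.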
-
AWS Data Engineer - SageMaker
1 week ago
Bangalore, India | YASH Technologies | Full time
Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python
Secondary skillset: Any ETL tool, GitHub, DevOps (CI/CD)
Mandatory Skill Set: Python, PySpark, SQL, AWS with designing, developing, testing and supporting data pipelines and applications; strong understanding and hands-on experience with AWS services like EC2, S3, ...
-
Data Engineer
1 week ago
Bangalore, India | Mphasis | Full time
Responsibilities:
- Automate data quality checks and validation processes using SQL, Python, and data testing frameworks.
- Perform reconciliation, integrity, and transformation testing across data platforms.
- Work with AWS SageMaker Studio (Unified Studio) for validating ML/data workflows and integrations.
- Validate data flows on cloud platforms (AWS, ...
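The automated data-quality checks this role calls for reduce to assertions over row sets: not-null, uniqueness, and similar rules. A minimal plain-Python illustration (column names and rules are invented; a production suite would express these in SQL or a data testing framework, as the posting suggests):

```python
# Hypothetical data-quality checker: evaluate not-null and uniqueness
# rules over a batch of rows and return human-readable violations.
def check_quality(rows, not_null=(), unique=()):
    violations = []
    for col in not_null:
        missing = sum(1 for r in rows if r.get(col) is None)
        if missing:
            violations.append(f"{col}: {missing} null value(s)")
    for col in unique:
        values = [r.get(col) for r in rows]
        if len(values) != len(set(values)):
            violations.append(f"{col}: duplicate values")
    return violations

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@x.com"},
]
problems = check_quality(rows, not_null=("email",), unique=("id",))
print(problems)  # ['email: 1 null value(s)', 'id: duplicate values']
```

Reconciliation testing between platforms follows the same pattern, comparing row counts and aggregates from source and target instead of checking a single batch.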
-
AWS Engineer with GenAI
2 weeks ago
Bangalore, India | Gainwell Technologies | Full time
AWS Engineer with GenAI and Python/C#/Java
Position Summary: The AWS Engineer is responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation (Terraform). This role is ...
-
Python Data Engineer
1 week ago
Bangalore, India | Digitrix Software LLP | Full time
Location: Bangalore / Pune / Kolkata / Hyderabad / Gurugram
Data Engineer
Experience: 4 to 6 years
Python, AWS: Python (core language skill) -- backend, Pandas, PySpark (DataFrame API), interacting with AWS (e.g., boto3 for S3, Glue, Lambda)
Data Processing: Spark (PySpark), Glue, EMR
AWS Core Services: S3, Glue, Athena, Lambda, Step Functions, EMR ...