
Data Engineer (AWS, S3, Data Lake, AWS Glue, Lambda, EMR, Python, PySpark Engineer)
4 days ago
Job Description
Role: AWS Data Engineer
Required Technical Skill Set: AWS
Desired Experience Range: 4-6 yrs.
Location of Requirement: Pune

Desired Competencies (Technical/Behavioral Competency)
Must-Have:
- Expertise in creating data warehouses in AWS utilizing the following tools: EC2, S3, EMR, Athena, SageMaker, Aurora, Snowflake, Kafka, Kinesis, Glue, Lambda, DMS, AppFlow, Power BI
- Advanced development experience in cloud technologies such as AWS Aurora/RDS/S3, Lambda, JSON, and Python
- Proficiency in scripting, querying, and analytics tools: Linux, Python, SQL
- Analyze, re-architect, and re-platform on-premise/cloud data sources onto the AWS platform using AWS Glue
- Design, build, and automate AWS data pipelines from data ingestion through to the consumption layer using Java/Python
Good-to-Have:
- Basic knowledge of Red Hat Linux and Windows operating systems
- Comfortable with the AWS Console, the AWS CLI, and AWS APIs
- Knowledge of writing infrastructure as code (IaC) using CloudFormation or Terraform
- AWS APIs for integration

Role descriptions / Expectations from the Role:
1. Ability to understand and articulate the different functions within AWS and design an appropriate solution, with HLD and LLD around it.
2. Ability to identify and gather requirements to define a solution to be built and operated on AWS; perform high-level and low-level design.
3. The AWS Data Engineer will be responsible for building the services as per the design.
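By way of illustration, a minimal sketch of the kind of Glue PySpark job such an ingestion-to-consumption pipeline typically involves: reading a cataloged raw layer on S3 and writing a partitioned Parquet consumption layer. The database, table, bucket, and column names below are hypothetical placeholders, not anything specified in the posting.

import sys
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: the job name is passed in by the Glue service.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest from a (hypothetical) Glue Data Catalog table over raw S3 data.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
).toDF()

# Example transformation: de-duplicate and derive a partition column.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write the consumption layer as partitioned Parquet on S3 (placeholder bucket).
glue_context.write_dynamic_frame.from_options(
    frame=DynamicFrame.fromDF(curated, glue_context, "curated"),
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/",
                        "partitionKeys": ["order_date"]},
    format="parquet",
)
job.commit()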
-
AWS Data Engineer
3 days ago
Bangalore, India Tata Consultancy Services Full time
Dear Candidate, greetings from Tata Consultancy Services. Job openings at TCS.
Skill: AWS Data Engineer
Exp range: 6 yrs to 10 yrs
Location: Bangalore
Notice period: 30-45 days
Please find the job description below. Hands-on experience in PySpark and Glue; experience with EMR, S3, IAM, Lambda, CloudFormation, Python, AMI rehydration, ELB and other AWS...
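As a rough sketch of how the Lambda, S3, and Glue pieces listed above often fit together, the handler below reacts to an S3 object-created event and starts a Glue job via Boto3; the Glue job name and argument key are made-up placeholders, not part of the posting.

import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Triggered by an S3 ObjectCreated notification; starts one Glue ETL
    job run per newly landed object (job name is a hypothetical placeholder)."""
    runs = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        resp = glue.start_job_run(
            JobName="curate-orders",  # hypothetical Glue job
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        runs.append(resp["JobRunId"])
    return {"started_job_runs": runs}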
-
AWS Data Engineer - SageMaker
5 days ago
Bangalore, India YASH Technologies Full time
AWS services including Glue, PySpark, SQL, Databricks, Python. Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD). Python, PySpark, SQL, AWS, with designing, developing, testing and supporting data pipelines and applications ~ Strong understanding and hands-on experience with AWS services like EC2, S3, EMR, Glue, Redshift ~ Strong in developing and...
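For a flavour of the PySpark-plus-SQL style of pipeline work this posting lists (broadly applicable on Glue, EMR, or Databricks), a small sketch in which the S3 paths, view name, and columns are invented for illustration only.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales-pipeline").getOrCreate()

# Register the raw extract (hypothetical S3 path) as a temporary view so the
# transformation can be expressed in plain SQL.
spark.read.parquet("s3://example-raw-bucket/sales/").createOrReplaceTempView("sales_raw")

daily_sales = spark.sql("""
    SELECT order_date, region, SUM(amount) AS total_amount
    FROM sales_raw
    GROUP BY order_date, region
""")

# Write a curated, partitioned copy for downstream consumers.
daily_sales.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-curated-bucket/daily_sales/")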
-
AWS Data Engineer
2 weeks ago
Bangalore, India Tata Consultancy Services Full time
Dear Candidate, greetings from Tata Consultancy Services. Job openings at TCS.
Skill: AWS Data Engineer
Exp range: 6 yrs to 10 yrs
Location: Bangalore
Notice period: 30-45 days
Please find the job description below. Hands-on experience in PySpark and Glue; experience with EMR, S3, IAM, Lambda, CloudFormation, Python, AMI rehydration, ELB and other AWS components...
-
AWS Data Engineer
2 weeks ago
Bangalore, India Mastech Digital Full time
Position: AWS Data Engineer
Locations: Chennai, Hyderabad, Bangalore
Duration: Contract / FTE
Exp: 5+ years
Hybrid model: 3 days per week in office.
Primary Skills: Python, PySpark, Glue, Redshift, Lambda, DMS, RDS, CloudFormation and other AWS serverless services; strong experience in SQL.
Detailed JD: Seeking a developer who has good experience...
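To give one concrete flavour of the Redshift-plus-serverless combination named here, a sketch that runs SQL through the Redshift Data API from Python (convenient inside Lambda, since no JDBC driver is needed); the cluster, database, user, and table names are invented placeholders.

import time
import boto3

rsd = boto3.client("redshift-data")

def run_redshift_sql(sql: str) -> list:
    """Execute a SQL statement on a (hypothetical) Redshift cluster and
    return the result rows once the statement finishes."""
    stmt = rsd.execute_statement(
        ClusterIdentifier="example-cluster",  # hypothetical
        Database="analytics",                 # hypothetical
        DbUser="etl_user",                    # hypothetical
        Sql=sql,
    )
    while True:
        desc = rsd.describe_statement(Id=stmt["Id"])
        if desc["Status"] in ("FINISHED", "FAILED", "ABORTED"):
            break
        time.sleep(1)
    if desc["Status"] != "FINISHED":
        raise RuntimeError(f"Statement {stmt['Id']} ended as {desc['Status']}")
    return rsd.get_statement_result(Id=stmt["Id"])["Records"]

# Example: row count of a staging table loaded by DMS (table name is illustrative).
rows = run_redshift_sql("SELECT COUNT(*) FROM staging.orders;")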
-
AWS Data Engineer
5 days ago
Bangalore, India Tata Consultancy Services Full time
Experience: 5-10 yrs
Location: Bangalore, Chennai, Hyderabad, Pune, Kochi, Bhubaneswar, Kolkata
Key Skills: AWS Lambda, Python, Boto3, PySpark, Glue
Must-have Skills: Strong experience in Python to package, deploy and monitor data science apps; knowledge of Python-based automation; knowledge of Boto3 and related Python packages; working experience in AWS...
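A small sketch of the package-deploy-monitor loop this posting alludes to, done with Boto3; every bucket, function, namespace, and metric name below is a made-up placeholder rather than anything specified by TCS.

import boto3

s3 = boto3.client("s3")
lam = boto3.client("lambda")
cloudwatch = boto3.client("cloudwatch")

# 1. Package: upload a locally built zip of the app to S3 (hypothetical names).
s3.upload_file("dist/app.zip", "example-artifacts-bucket", "releases/app.zip")

# 2. Deploy: point an existing Lambda function at the new artifact.
lam.update_function_code(
    FunctionName="score-orders",  # hypothetical function
    S3Bucket="example-artifacts-bucket",
    S3Key="releases/app.zip",
)

# 3. Monitor: emit a custom CloudWatch metric the team can alarm on.
cloudwatch.put_metric_data(
    Namespace="DataScienceApps",
    MetricData=[{"MetricName": "DeploymentCount", "Value": 1, "Unit": "Count"}],
)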
-
AWS Data Engineer
6 days ago
Bangalore, India Tata Consultancy Services Full time
Experience: 5-10 yrs
Location: Bangalore, Chennai, Hyderabad, Pune, Kochi, Bhubaneswar, Kolkata
Key Skills: AWS Lambda, Python, Boto3, PySpark, Glue
Must-have Skills: Strong experience in Python to package, deploy and monitor data science apps; knowledge of Python-based automation; knowledge of Boto3 and related Python packages; working experience in AWS and AWS...
-
Bangalore, India Tata Consultancy Services Full time
Job Title: AWS Senior Data Engineer with PySpark, AWS, Glue
Location: Bangalore
Experience: 6 to 10 Years
Notice Period: 30-45 days
Job Description:
Must: PySpark, AWS (ETL concepts, S3, Glue, EMR, Redshift, DMS, AppFlow), Qlik Replicate, Data Testing
Nice To Have: Hadoop, Teradata background, IaC (CloudFormation / Terraform), Git
Kind Regards, Priyankha M
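As an illustration of the "Data Testing" requirement (generic reconciliation, not Qlik Replicate's own tooling), a minimal PySpark check that a replicated target still matches its source extract; both S3 paths and the key column are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("replication-check").getOrCreate()

# Hypothetical locations of the source extract and the replicated target copy.
source = spark.read.parquet("s3://example-landing-bucket/orders/")
target = spark.read.parquet("s3://example-curated-bucket/orders/")

# Basic reconciliation: row counts match and no primary keys were dropped.
assert source.count() == target.count(), "row count mismatch after replication"
missing = source.select("order_id").subtract(target.select("order_id"))
assert missing.count() == 0, "source keys missing in target"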
-
Senior Cloud Engineer AWS
1 week ago
Bangalore, India Matrix USA Full time
Job Overview: We are seeking an experienced AWS Developer proficient in Python and PySpark to design, develop, and maintain scalable, serverless data processing and workflow automation solutions on AWS. The ideal candidate will build Lambda functions, Step Functions, Glue ETL jobs, and integrate various AWS services to support complex data pipelines and...
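A rough sketch of the Lambda-to-Step-Functions hand-off this role description points at: an ingestion trigger starts a state machine that would orchestrate the downstream Glue jobs. The state machine ARN (with AWS's documentation account ID), payload fields, and names are illustrative assumptions only.

import json
import uuid
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine that chains Glue jobs and validation Lambdas.
STATE_MACHINE_ARN = "arn:aws:states:ap-south-1:111122223333:stateMachine:orders-pipeline"

def handler(event, context):
    """Entry-point Lambda: start one pipeline execution per trigger event."""
    execution = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        name=f"orders-{uuid.uuid4()}",  # unique execution name
        input=json.dumps({"source_path": event.get("source_path", "")}),
    )
    return {"executionArn": execution["executionArn"]}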
-
Bangalore, India Tata Consultancy Services Full time
Job Title: AWS Senior Data Engineer with PySpark, AWS, Glue
Location: Bangalore
Experience: 6 to 10 Years
Notice Period: 30-45 days
Job Description:
Must: PySpark, AWS [ETL concepts, S3, Glue, EMR, Redshift, DMS, AppFlow], Qlik Replicate, Data Testing
Nice To Have: Hadoop, Teradata background, IaC [CloudFormation / Terraform], Git
Kind Regards, Priyankha M
-
Python Data Engineer
2 weeks ago
Bangalore, India Digitrix Software LLP Full time
Location: Bangalore / Pune / Kolkata / Hyderabad / Gurugram
Data Engineer, Experience: 4 to 6 years
Python, AWS
Python (core language skill) -- backend, Pandas, PySpark (DataFrame API), interacting with AWS (e.g., boto3 for S3, Glue, Lambda)
Data Processing: Spark (PySpark), Glue, EMR
AWS Core Services: S3, Glue, Athena, Lambda, Step Functions, EMR ...
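Finally, a small sketch of the Pandas-plus-boto3 side of this profile: pulling a CSV extract from S3 and summarising it in a DataFrame. The bucket, key, and column names are placeholders invented for the example.

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Fetch a (hypothetical) CSV extract from S3 and load it straight into Pandas.
obj = s3.get_object(Bucket="example-raw-bucket", Key="exports/orders.csv")
orders = pd.read_csv(obj["Body"])

# Typical backend-style summarisation before handing off to a report or API.
summary = (
    orders.groupby("region", as_index=False)["amount"].sum()
          .rename(columns={"amount": "total_amount"})
)
print(summary.head())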