
PySpark, AWS
2 days ago
**Exp: 4 to 13 years**
**Skill: Pyspark, AWS**
**Location: Bangalore/Hyderabad/Kolkata/Pune/Chennai**
**Technical Skills**: Python, AWS Glue Studio, AWS Glue ETL, AWS Glue Catalog, Amazon S3, Apache Spark
**Responsibilities:**
- Develop and maintain data pipelines using AWS Glue Studio and AWS Glue ETL.
- Ensure data quality and integrity by implementing robust data validation and transformation processes.
- Optimize the performance of data pipelines to handle large volumes of data efficiently.
- Collaborate with data engineers and analysts to understand data requirements and deliver solutions.
- Use Python for scripting and automation tasks related to data processing.
- Manage and maintain the AWS Glue Catalog to ensure accurate metadata management.
- Store and retrieve data from Amazon S3, ensuring data security and accessibility.
- Leverage Apache Spark for distributed data processing and analytics.
- Monitor and troubleshoot data pipeline issues to ensure smooth operations.
- Provide technical support and guidance to team members on data-related projects.
- Document data processes and workflows for future reference and compliance.
- Stay up to date with the latest trends and best practices in data engineering and cloud technologies.
- Contribute to the company's data strategy by providing insights and recommendations based on data analysis.

**Qualifications:**
- Strong experience in Python for data processing and automation.
- Expertise in AWS Glue Studio, AWS Glue ETL, and AWS Glue Catalog.
- Hands-on experience with Amazon S3 for data storage and retrieval.
- Proficiency in Apache Spark for distributed data processing.
- Excellent problem-solving skills and attention to detail.
- Solid understanding of data quality and validation techniques.
- Ability to work collaboratively in a team environment.
- Up to date with the latest advancements in cloud and data engineering technologies.
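For a concrete sense of what these responsibilities involve, below is a minimal sketch of an AWS Glue ETL job using the standard awsglue and PySpark APIs: it reads a table from the Glue Data Catalog, applies a simple validation and transformation, and writes partitioned Parquet to Amazon S3. The database, table, column, and bucket names are hypothetical placeholders, not details taken from this posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: resolve the job name passed in by the Glue runner.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (database/table names are placeholders).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
).toDF()

# Basic validation/transformation: drop rows missing a key and derive a date column
# from a placeholder timestamp column "order_ts".
cleaned = (
    orders.dropna(subset=["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)

# Write the result back to S3 as partitioned Parquet (bucket path is a placeholder).
(
    cleaned.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")
)

job.commit()
```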
-
Data Engineer (Python, PySpark and AWS)
2 weeks ago
Hyderabad, India | Zorba AI | Full time
**Key Skills Required:** Minimum 6 years of working experience in Python, PySpark, AWS and SQL.
- Programming & Frameworks: Python, PySpark
- Databases & Querying: Strong SQL (query optimization, joins, window functions)
- Big Data & Processing: PySpark for distributed data processing, ETL pipelines
- Cloud Platform: AWS (S3, Glue, Lambda, EMR, Redshift, Athena)
- Data...
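Since this posting calls out SQL window functions and PySpark specifically, here is a small self-contained PySpark sketch showing a window specification used for a per-group rank and a running total. The sample rows and column names are invented purely for illustration.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Invented sample data: (customer, order_id, amount).
df = spark.createDataFrame(
    [("alice", 1, 120.0), ("alice", 2, 80.0), ("bob", 3, 200.0), ("bob", 4, 50.0)],
    ["customer", "order_id", "amount"],
)

# Rank each customer's orders by amount, largest first.
rank_window = Window.partitionBy("customer").orderBy(F.col("amount").desc())

# Running total of amount per customer, in order of order_id.
running_window = (
    Window.partitionBy("customer")
    .orderBy("order_id")
    .rowsBetween(Window.unboundedPreceding, Window.currentRow)
)

result = df.select(
    "customer",
    "order_id",
    "amount",
    F.row_number().over(rank_window).alias("amount_rank"),
    F.sum("amount").over(running_window).alias("running_total"),
)
result.show()
```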
-
PySpark, SQL & AWS Glue Professional
1 week ago
Hyderabad, Telangana, India | IDESLABS PRIVATE LIMITED | Full time | ₹12,00,000 - ₹36,00,000 per year
Looking to onboard a skilled professional with 6-8 years of experience in PySpark, SQL, and AWS Glue. The ideal candidate will have a strong background in these technologies and excellent problem-solving skills. This position is located across Pan India.
**Roles and Responsibilities:** Design, develop, and implement data processing pipelines using PySpark and AWS...
-
Hyderabad, India | Tata Consultancy Services | Full time
Job Title: AWS Senior Data Engineer with PySpark, AWS, Glue
Location: Hyderabad
Experience: 6 to 10 Years
Notice Period: 30-45 days
Job Description:
Must: PySpark, AWS (ETL Concepts, S3, Glue, EMR, Redshift, DMS, AppFlow), Qlik Replicate, Data Testing
Nice to Have: Hadoop, Teradata background, IaC (CloudFormation / Terraform), Git
Kind Regards,
Priyankha M
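"Data Testing" in a replication context like this one usually means reconciliation checks between a source extract and its target copy (for example after a Qlik Replicate or DMS load). Below is a hedged, minimal PySpark sketch of such checks; the S3 paths and the customer_id key column are placeholders, and the replication tooling itself is not shown.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("replication-check").getOrCreate()

# Placeholder locations: the source extract and the replicated target copy of one table.
source = spark.read.parquet("s3://example-bucket/source/customers/")
target = spark.read.parquet("s3://example-bucket/target/customers/")

# Row-count reconciliation between source and target.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# Key-level check: every source key must exist in the target (key column is a placeholder).
missing = source.select("customer_id").subtract(target.select("customer_id"))
assert missing.count() == 0, "Some source keys are missing from the target"

# Null check on the mandatory key column in the target.
null_keys = target.filter(F.col("customer_id").isNull()).count()
assert null_keys == 0, "Target contains NULL keys"
```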