PySpark
4 days ago
Skill: PySpark
Experience: 6 to 9 years
Location: Kolkata
Job description
Responsibilities
- Develop and maintain scalable data pipelines using Python and PySpark.
- Collaborate with data engineers and data scientists to understand and fulfill data processing needs.
- Optimize and troubleshoot existing PySpark applications for performance improvements.
- Write clean, efficient, and well-documented code following best practices.
- Participate in design and code reviews.
- Develop and implement ETL processes to extract, transform, and load data (see the sketch following this listing).
- Ensure data integrity and quality throughout the data lifecycle.
- Stay current with the latest industry trends and technologies in big data and cloud computing.
Qualifications
- Proven experience as a Python Developer with expertise in PySpark.
- Strong understanding of big data technologies and frameworks.
- Experience with distributed computing and parallel processing.
- Proficiency in SQL and experience with database systems.
- Solid understanding of data engineering concepts and best practices.
- Ability to work in a fast-paced environment and handle multiple projects simultaneously.
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration abilities.
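For illustration, the pipeline and ETL responsibilities above might look like the following minimal PySpark sketch. The paths, column names, and transformation rules are assumptions made for the example, not details from the posting.

```python
# Minimal PySpark ETL sketch: extract raw CSV, transform it, load Parquet.
# All paths, columns, and business rules here are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files (hypothetical input location).
raw = spark.read.option("header", True).csv("s3a://raw-bucket/orders/")

# Transform: cast types, drop invalid rows, derive a revenue column.
clean = (
    raw.withColumn("quantity", F.col("quantity").cast("int"))
       .withColumn("unit_price", F.col("unit_price").cast("double"))
       .filter(F.col("quantity") > 0)
       .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# Load: write partitioned Parquet for downstream consumers (hypothetical output location).
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3a://curated-bucket/orders/")

spark.stop()
```

The same extract-transform-load shape carries over to the optimization work mentioned above (partitioning, caching, broadcast joins) without changing the overall structure.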
-
PySpark Developer
4 days ago
Kolkata, West Bengal, India Apex One Full time ₹ 1,04,000 - ₹ 1,30,878 per year
Type: Contract-to-Hire (C2H)
Job Summary: We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and working with modern data engineering tools in cloud environments such as AWS.
Key Skills &...
-
Kolkata, West Bengal, India Tata Consultancy Services Full time
Job Title: AWS Senior Data Engineer with PySpark, AWS, Glue
Location: Kolkata
Experience: 6 to 10 Years
Notice Period: 30-45 days
Job Description:
Must: PySpark, AWS [ETL Concepts, S3, Glue, EMR, Redshift, DMS, AppFlow], Qlik Replicate, Data Testing
Nice To Have: Hadoop, Teradata background, IaC [CloudFormation / Terraform], Git
Kind Regards,
Priyankha M
-
Data Science Manager
4 days ago
Kolkata, West Bengal, India Tredence Full time ₹ 20,00,000 - ₹ 25,00,000 per year
Role Overview: As a member of the Classical ML+NLP+Gen AI - L4 team at Tredence, you will be involved in tasks such as Model Fine-tuning, Model Pre-training, Named Entity Recognition, Prompt Engineering, PySpark, Python, Regular expressions, SQL, Supervised ML, Tokenization, Transformer Models, Transformers, Unstructured Data Pre-processing, and Unsupervised...
-
AWS/ Azure Data Engineer
4 days ago
Kolkata, West Bengal, India Talentico Consultancy Services Full time ₹ 8,00,000 - ₹ 24,00,000 per year
• Experience with a unified data platform and the full data lifecycle, from data ingestion and transformation through serving and consumption.
• Must have excellent coding skills in PySpark and Spark SQL.
• Must be strong in SQL and Spark SQL. Experience in the Data Engineering domain.
-
AWS Glue
4 days ago
Kolkata, West Bengal, India Cognizant Full time ₹ 12,00,000 - ₹ 36,00,000 per year
Skill: AWS Glue
Experience: 6 to 9 years
Location: Kolkata
Job description
Technical Skills:
AWS Glue: 3+ years of hands-on experience in AWS Glue ETL development
Python/PySpark: Strong programming skills in Python and PySpark for data transformation
AWS Services: Proficiency in S3, Redshift, Athena, Lambda, and EMR
Data Formats: Experience with Parquet, Avro,...
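A minimal AWS Glue PySpark job of the kind this listing describes might look like the sketch below; the S3 paths and the JOB_NAME argument are assumptions for the example.

```python
# Minimal AWS Glue PySpark job sketch: read JSON from S3, write Parquet to S3.
# Bucket names and paths are illustrative assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw JSON data from S3 as a Glue DynamicFrame.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://raw-bucket/events/"]},
    format="json",
)

# Write it back to S3 as Parquet, a format queryable from Athena or Redshift Spectrum.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://curated-bucket/events/"},
    format="parquet",
)

job.commit()
```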
-
AWS Databricks Developer
4 days ago
Kolkata, West Bengal, India Tata Consultancy Services Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Role & responsibilities
• Develop and maintain scalable data pipelines using Apache Spark on Databricks.
• Build end-to-end ETL/ELT pipelines on AWS using services like S3, Glue, Lambda, EMR, and Step Functions.
• Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
• Design and implement data...
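As a rough illustration of the first two responsibilities, a Spark-on-Databricks pipeline step might look like the sketch below. The S3 paths and table name are assumptions, and Delta Lake is used simply because it is the default table format on Databricks.

```python
# Sketch of a pipeline step on Databricks: load raw JSON from S3, deduplicate,
# and save a partitioned Delta table. Paths and names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` already exists; getOrCreate()
# simply reuses it, and keeps the sketch runnable elsewhere.
spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("s3://raw-bucket/clickstream/")

deduped = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Persist as a partitioned Delta table for analysts and downstream jobs.
(
    deduped.write.format("delta")
           .mode("overwrite")
           .partitionBy("event_date")
           .saveAsTable("analytics.clickstream_events")
)
```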
-
Snowflake Data Engineer
2 days ago
Kolkata, West Bengal, India Vidpro Consultancy Services Full time ₹ 8,00,000 - ₹ 24,00,000 per year
Exp: Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Python, PySpark, Azure Data Factory, Snowflake, Snowpipe, SnowSQL, Snowsight, Snowpark, ETL, SQL, and Architect Designing.
SnowPro certified is a plus
Primary Roles and Responsibilities:
Developing Modern Data Warehouse solutions using Snowflake, Databricks and...
-
AWS Data Engineer
2 days ago
Kolkata, West Bengal, India Exavalu Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Role & responsibilities:
Design, develop and test data ingestion/ETL pipelines with S3 storage, Postgres, Athena, Redshift.
Strong experience in Python, PySpark, SQL, Glue, Lambda and orchestration techniques.
Develops stored procedures, database triggers and SQL queries with an understanding of data warehousing objects (type-2 dimensions, CDC, aggregations,...
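The type-2 dimension and CDC handling mentioned above could be sketched in plain PySpark as below; the table paths, the customer_id key, and the assumption that the CDC batch carries the same attribute columns as the dimension are all illustrative, not taken from the posting.

```python
# Sketch of a type-2 dimension refresh in PySpark: expire the current rows of
# changed customers and append new versions. Paths, keys, and schemas are
# illustrative assumptions (the CDC batch is assumed to share the dimension's columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

dim = spark.read.parquet("s3a://warehouse/dim_customer/")     # existing dimension
changes = spark.read.parquet("s3a://staging/customer_cdc/")   # latest CDC batch

changed_keys = changes.select("customer_id").distinct()

# Customers untouched by this CDC batch keep all their rows as-is.
untouched = dim.join(changed_keys, "customer_id", "left_anti")

# Historical (already closed) rows of changed customers are kept unchanged.
history = dim.join(changed_keys, "customer_id").filter(~F.col("is_current"))

# The current row of each changed customer is expired.
expired = (
    dim.join(changed_keys, "customer_id")
       .filter(F.col("is_current"))
       .withColumn("is_current", F.lit(False))
       .withColumn("end_date", F.current_date())
)

# Each change becomes the new current version.
new_versions = (
    changes.withColumn("is_current", F.lit(True))
           .withColumn("start_date", F.current_date())
           .withColumn("end_date", F.lit(None).cast("date"))
)

result = untouched.unionByName(history).unionByName(expired).unionByName(new_versions)
result.write.mode("overwrite").parquet("s3a://warehouse/dim_customer_out/")
```

On Databricks or Delta Lake the same logic is usually expressed as a MERGE, but the row-level reasoning stays the same.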
-
Azure Data Engineer
4 days ago
Kolkata, West Bengal, India Sys Edge Micro Informatics Full time ₹ 6,00,000 - ₹ 12,00,000 per year
Role & responsibilities
Assist in data collection, cleansing, and preparation from various sources.
Support basic data modeling and documentation efforts.
Develop and maintain simple data pipelines using tools like Azure Data Factory (ADF).
Write and optimize queries using SQL.
Use Python or PySpark scripts to perform data transformations.
Perform data validation...
-
AWS Databricks Developer
2 weeks ago
Kolkata, West Bengal, India Integrated Personnel Services Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Job role - Developer
Experience - 6 to 10 years
Location - Kolkata
AWS Databricks Developer Experience: More than 3 years in data integration, pipeline development, and data warehousing, with a strong focus on AWS Databricks.
Technical Skills: Proficiency in Databricks platform, management, and optimization. Strong experience in AWS Cloud, particularly in data...