
PySpark Developer
4 days ago
Job Summary
- Design, develop, and implement scalable data pipelines and streaming use cases using PySpark and Spark on a distributed computing platform
- Strong programming skills in Spark Streaming
- Familiarity with cloud platforms such as GCP
- Experience with big data technologies such as Hadoop, Hive, and HDFS
- Perform ETL operations from various data sources; experience with data warehousing concepts
- Optimize PySpark jobs for performance and efficiency
- Develop and maintain unit tests for data pipelines and streaming use cases
- Troubleshoot and debug Spark applications
- Collaborate with data scientists and analysts to understand data requirements
- Document data pipelines and data models clearly and concisely
- Participate in code reviews and knowledge-sharing sessions
- Stay updated with the latest advancements in PySpark and related technologies
- Provide production support for the developed use cases
- 3 years of experience as a Data Engineer
- Proven experience using PySpark for data processing and streaming use cases
- Strong understanding of data warehousing, data modeling, and ETL processes
- Familiarity with big data concepts and distributed computing frameworks such as Hadoop, Spark, and Kafka
- Experience with SQL and a relational database management system such as MySQL or PostgreSQL
- Experience with cloud platforms such as AWS, Azure, or GCP is a plus
- Excellent problem-solving and analytical skills
- Strong communication and collaboration skills
- Ability to work independently and as part of a team
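For orientation on the streaming side of this role, here is a minimal PySpark Structured Streaming sketch; the Kafka topic, broker address, payload schema, and output/checkpoint paths are illustrative assumptions, not details from the listing.

```python
# Minimal sketch: consume JSON events from a (hypothetical) Kafka topic and land
# them as Parquet. Requires the spark-sql-kafka connector on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Assumed payload schema, for illustration only.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
       .option("subscribe", "events")                       # hypothetical topic
       .load())

# Kafka delivers the payload as bytes in the `value` column; parse it as JSON.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", schema).alias("e"))
             .select("e.*"))

# Append micro-batches to Parquet; the checkpoint enables restart and recovery.
query = (events.writeStream
         .format("parquet")
         .option("path", "/tmp/events_out")                 # hypothetical path
         .option("checkpointLocation", "/tmp/events_ckpt")  # hypothetical path
         .outputMode("append")
         .start())

query.awaitTermination()
```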
-
PySpark Developers
2 weeks ago
Chennai, Tamil Nadu, India LTIMindtree Full time ₹ 15,00,000 - ₹ 20,00,000 per year
Skill: PySpark Developer
Job Locations: Chennai, Pune
Notice Period: Any
Experience: 3-8 years
Job Description: PySpark Developer
Mandatory Skills: Apache Spark, Big Data Hadoop Ecosystem, SparkSQL, Python
Good professional experience in Big Data, PySpark, Hive, Hadoop, and PL/SQL. Good knowledge of AWS and Snowflake. Good understanding of CI/CD and system design...
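For the SparkSQL/Hive skills listed here, a minimal sketch of querying a Hive-managed table from PySpark; the database and table name (sales.transactions) and its columns are assumptions for illustration.

```python
# Minimal sketch: run SparkSQL against a Hive-managed table.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-sql-sketch")
         .enableHiveSupport()   # read tables registered in the Hive metastore
         .getOrCreate())

# Simple aggregation expressed in SparkSQL rather than the DataFrame API.
daily_totals = spark.sql("""
    SELECT txn_date, SUM(amount) AS total_amount
    FROM sales.transactions
    GROUP BY txn_date
    ORDER BY txn_date
""")

daily_totals.show()
```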
-
Chennai, Tamil Nadu, India Tata Consultancy Services Full time ₹ 20,00,000 - ₹ 25,00,000 per year
Skill: PySpark Developer
Experience Range: 4 to 8 years
Notice: Immediate to 60 Days
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Job Description: 4+ years of experience with developing, fine-tuning, and implementing programs/applications using Python/PySpark/Scala on a Big Data/Hadoop platform. Roles and...
-
Databricks PySpark
2 weeks ago
Chennai, Tamil Nadu, India Virtusa Full time ₹ 9,00,000 - ₹ 12,00,000 per year
Key Responsibilities: Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks. Write efficient and production-ready PySpark or Scala code for data transformation and ETL processes. Integrate data from various structured and unstructured sources into a unified platform. Implement Delta Lake and manage data versioning, updates,...
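For the Delta Lake and data-versioning responsibilities named in this listing, a minimal upsert sketch using the delta-spark API; the table paths and the customer_id join key are hypothetical, and the example assumes a Databricks or delta-enabled Spark environment.

```python
# Minimal sketch: upsert incoming records into a Delta table, then read an
# earlier version back via time travel.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("delta-upsert-sketch").getOrCreate()

updates = spark.read.parquet("/mnt/raw/customer_updates")    # hypothetical source
target = DeltaTable.forPath(spark, "/mnt/silver/customers")  # hypothetical Delta table

# MERGE: update rows that match on the key, insert the rest.
(target.alias("t")
 .merge(updates.alias("u"), "t.customer_id = u.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())

# Delta keeps table versions, so earlier snapshots remain queryable.
previous = (spark.read.format("delta")
            .option("versionAsOf", 0)
            .load("/mnt/silver/customers"))
```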
-
Python/PySpark Development
1 week ago
Chennai, Tamil Nadu, India Standard Chartered Bank Full time ₹ 1,04,000 - ₹ 1,30,878 per year
Job ID: 39129
Location: Chennai, IN
Area of interest: Technology
Job type: Regular Employee
Work style: Office Working
Opening date: 10 Sept 2025
Job Summary
Strategy - Deliver on the functional requirements for the Regulatory exercises
Business - Group Functions - Finance
Processes - Regulatory Stress Testing
Risk Management - Group standard requirements
Key...
-
AWS/PySpark Engineer
2 weeks ago
Chennai, Tamil Nadu, India Barclays Full time ₹ 15,00,000 - ₹ 20,00,000 per year
Join us as an AWS/PySpark Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical...
-
Principal Software Developer
5 days ago
Chennai, Tamil Nadu, India beBeeSoftwareDeveloper Full time ₹ 10,00,000 - ₹ 20,00,000
Job Role: We are seeking a highly skilled software engineer to join our development team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality software solutions using PySpark.
Key Responsibilities: Design and develop scalable and efficient PySpark applications for multiple clients. Analyze and troubleshoot complex...
-
Big Data Developer
6 days ago
Chennai, Tamil Nadu, India beBeeData Full time ₹ 1,50,00,000 - ₹ 2,00,00,000
Big Data Developer
We are looking for a highly skilled Big Data Developer to join our team. In this role, you will be responsible for developing and fine-tuning programs/applications using Python/PySpark/Scala on a Big Data/Hadoop platform. The ideal candidate will have strong knowledge of the Big Data/Hadoop platform and experience with Spark, PySpark, Python,...
-
Senior Data Pipeline Developer
1 week ago
Chennai, Tamil Nadu, India beBeeDataEngineer Full time ₹ 15,00,000 - ₹ 2,51,20,000
Data Engineer Opportunity
We are seeking an experienced data engineer to join our team in a role that will involve designing, developing, testing, and supporting data pipelines and applications using Python, PySpark, and SQL on AWS.
Key Responsibilities: Designing and developing scalable and efficient data pipelines using Python, PySpark, and SQL on...
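For the kind of PySpark-on-AWS pipeline this listing describes, a minimal batch sketch that reads raw CSV from S3, cleans it, and writes partitioned Parquet back; the bucket names and columns are assumptions, and s3a access presumes the hadoop-aws connector and credentials are configured.

```python
# Minimal sketch: S3 -> clean -> partitioned Parquet on S3.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-batch-sketch").getOrCreate()

orders = (spark.read
          .option("header", "true")
          .csv("s3a://example-raw-bucket/orders/"))        # hypothetical bucket

cleaned = (orders
           .dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts")))

# Partitioning by date keeps downstream scans narrow.
(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3a://example-curated-bucket/orders/"))         # hypothetical bucket
```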
-
Scala Developer
6 days ago
Chennai, Tamil Nadu, India Tata Consultancy Services Full time
Role: Scala Developer
Experience: 4+ years
Location: Pune, Chennai, Kolkata, Hyderabad, Gurgaon
Functional Skills: Experience in Credit Risk/Regulatory Risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good to Have Skills: Exposure to Machine Learning techniques
Job Description: 4+ years of experience with...
-
Chennai, Tamil Nadu, India Suzva Software Technologies Full time ₹ 15,000 - ₹ 28,00,000 per year
Experience with Azure Data Factory, Azure, Python, PySpark, Databricks, SQL
Key Responsibilities: Design and build ETL/ELT pipelines to ingest, transform, and load data from clinical, omics, research, and operational sources. Optimize performance and scalability of data flows using tools like Apache Spark, Databricks, or AWS Glue. Collaborate with domain experts...
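In the spirit of the ETL/ELT and performance-tuning responsibilities above, a minimal PySpark sketch that enriches a large fact source with a small reference table via a broadcast join before loading the result; all paths, columns, and the Delta target are assumptions made for illustration.

```python
# Minimal sketch: broadcast-join enrichment, then append to a curated table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

results = spark.read.parquet("/mnt/raw/lab_results")    # hypothetical large source
sites = spark.read.parquet("/mnt/ref/clinical_sites")   # hypothetical small lookup

# Broadcasting the small lookup avoids shuffling the large table across the cluster.
enriched = (results
            .join(F.broadcast(sites), "site_id", "left")
            .withColumn("loaded_at", F.current_timestamp()))

(enriched.write
 .format("delta")
 .mode("append")
 .save("/mnt/curated/lab_results"))                     # hypothetical target
```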