
PySpark Developer
3 weeks ago
We are seeking a highly skilled PySpark Developer with 7+ years of experience and expertise in Reltio MDM to join our data engineering team. You will design and implement scalable data processing solutions, integrate enterprise systems with Reltio, and ensure high-quality data governance.
Key Responsibilities:
- Develop and maintain PySpark data pipelines on platforms like AWS EMR or Databricks.
- Integrate and synchronize data between enterprise applications and Reltio MDM.
- Design and implement data transformation, cleansing, and enrichment logic (a minimal PySpark sketch follows this list).
- Collaborate with architects and analysts for effective data modeling.
- Build and manage API-based integrations between Reltio and upstream/downstream systems.
- Optimize PySpark jobs for performance, scalability, and cost-efficiency.
- Ensure data quality, integrity, and governance throughout the pipeline lifecycle.
- Troubleshoot and resolve data-related and performance issues.
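The transformation, cleansing, and deduplication work described above typically reduces to a job along these lines. This is a minimal sketch only: the S3 paths, column names (email, first_name, updated_at), and the "keep the latest record" rule are illustrative assumptions, not anything specified in the posting.

```python
# Minimal sketch of a PySpark cleansing/enrichment step.
# Paths, column names, and the dedup rule are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("customer_cleansing").getOrCreate()

# Read raw records from a hypothetical landing zone.
raw = spark.read.parquet("s3://example-bucket/landing/customers/")

# Keep the most recent record per email after basic standardization.
latest_first = Window.partitionBy("email").orderBy(F.col("updated_at").desc())

cleansed = (
    raw
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .withColumn("first_name", F.initcap(F.trim(F.col("first_name"))))
    .filter(F.col("email").isNotNull())
    .withColumn("rn", F.row_number().over(latest_first))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Write the curated output for downstream MDM loading.
cleansed.write.mode("overwrite").parquet("s3://example-bucket/curated/customers/")
```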
Required Skills & Qualifications:
- 7+ years of hands-on experience in PySpark and distributed data processing.
- Strong command of Apache Spark, Spark SQL, and DataFrames.
- Deep expertise in Reltio MDM (entity modeling, survivorship rules, match & merge configuration).
- Proficiency in REST APIs, JSON, and data integration techniques (see the illustrative integration sketch after this list).
- Experience with AWS services (S3, Lambda, Step Functions).
- Solid understanding of ETL workflows, data warehousing, and data modeling.
- Familiarity with CI/CD pipelines and Git.
- Excellent problem-solving, communication, and collaboration skills.
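For the REST/JSON integration skills listed above, loading cleansed records into the MDM layer often comes down to a small HTTP client. The sketch below is a hedged illustration using Python's requests library: the URL, tenant ID, token handling, entity type, and payload shape are placeholders, so the vendor's REST API documentation remains the reference for real endpoint paths and schemas.

```python
# Hedged sketch: posting a small batch of entity records to an MDM REST endpoint as JSON.
# The URL, tenant ID, token, entity type, and payload shape are placeholders,
# not the documented Reltio schema.
import requests

API_URL = "https://<environment>.reltio.com/reltio/api/<tenantId>/entities"  # placeholder
ACCESS_TOKEN = "<oauth-access-token>"  # obtained from the vendor's auth service

def push_entities(records):
    """POST a batch of entity records and return the parsed JSON response."""
    payload = [
        {
            "type": "configuration/entityTypes/Individual",  # assumed entity type
            "attributes": {
                "FirstName": [{"value": r["first_name"]}],
                "Email": [{"value": r["email"]}],
            },
        }
        for r in records
    ]
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example usage with illustrative records:
# push_entities([{"first_name": "Asha", "email": "asha@example.com"}])
```

In practice a call like this is usually wrapped with retry/backoff and driven from the PySpark job (for example via foreachPartition) so each executor batches its own records rather than issuing one request per row.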
-
Python PySpark Developer
9 hours ago
Gurgaon, Haryana, India | Gokaimizu | Full time | ₹ 8,00,000 - ₹ 12,00,000 per year
Responsibilities:
- Design, develop, and maintain PySpark applications using Python and Spark.
- Collaborate with cross-functional teams on data engineering projects.
Benefits: Health insurance, Provident fund
-
Software Engineer, PySpark
1 week ago
Gurgaon, Haryana, India | NatWest Group | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Join us as a Software Engineer, PySpark. This is an opportunity for a driven Software Engineer to take on an exciting new career challenge. Day-to-day, you'll be engineering and maintaining innovative, customer-centric, high-performance, secure and robust solutions. It's a chance to hone your existing technical skills and advance your career while building a wide...
-
Software Engineer, PySpark, AVP
1 week ago
Gurgaon, Haryana, India | RBS | Full time | ₹ 1,50,00,000 - ₹ 2,50,00,000 per year
Join us as a Software Engineer, PySpark. This is an opportunity for a technically minded individual to join us as a Software Engineer. You'll be designing, producing, testing and implementing working software, working across the lifecycle of the system. Hone your existing software engineering skills and advance your career in this critical role. We're offering this...
-
Data Engineer (PySpark + SQL)
4 hours ago
Gurgaon, Haryana, India | Accolite | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
About the Role: We are seeking an experienced Data Engineer to design, implement, and optimize a global data handling and synchronization solution across multiple regions. You will work with cloud-based databases, data lakes, and distributed systems, ensuring compliance with data residency and privacy requirements (e.g., GDPR).
Requirements: 6+ years of...
-
ETL - PySpark Testing - Strong SQL Professional
8 hours ago
Gurgaon, Haryana, India | IDESLABS PRIVATE LIMITED | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
At least 6-8 years of experience in ETL Testing with Automation Testing. Expert in database testing using SQL. Must have worked on Databricks and be aware of Databricks-related concepts. Check the data source locations and formats, perform a data count, and verify that the columns and data types meet the requirements. Test the accuracy of the data, and its...
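The count, column, and data-type checks this listing describes can be scripted in PySpark roughly as below. The database and table names and the expected schema are assumptions made only for illustration, not part of the listing.

```python
# Illustrative ETL validation: row counts, expected columns/types, and an aggregate spot check.
# Table names and the expected schema are assumptions, not taken from the listing.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl_validation").getOrCreate()

source = spark.table("raw_db.orders")      # hypothetical source table
target = spark.table("curated_db.orders")  # hypothetical target table

# 1. Data count: the load should neither drop nor duplicate rows.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# 2. Columns and data types must match the agreed contract.
expected = {"order_id": "bigint", "amount": "double", "order_date": "date"}
actual = dict(target.dtypes)
for col, dtype in expected.items():
    assert actual.get(col) == dtype, f"{col}: expected {dtype}, got {actual.get(col)}"

# 3. Accuracy spot check: aggregates should agree between source and target.
src_sum = source.agg({"amount": "sum"}).collect()[0][0]
tgt_sum = target.agg({"amount": "sum"}).collect()[0][0]
assert src_sum == tgt_sum, "Sum of amount differs between source and target"
```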
-
ETL with PySpark Testing with Strong SQL
5 days ago
Gurgaon, Haryana, India | IDESLABS PRIVATE LIMITED | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
We are looking for a skilled ETL Testing professional with 6-12 years of experience to join our team in Bengaluru. The ideal candidate will have expertise in database testing using SQL and experience working with Databricks.
Roles and Responsibilities: Develop and execute comprehensive test plans, test cases, and test scripts for ETL processes. Verify the accuracy...
-
Python Developer
2 days ago
Gurgaon, Haryana, India | Ahom Technologies Private Limited | Full time | ₹ 70,00,000 - ₹ 7,20,00,000 per year
Expertise in PySpark, AWS Big Data Stack, Kafka, SQL. Experience in big data. Develop and optimize data warehouses. Strong experience with GCP data services, including BigQuery and Dataflow. Experience in implementing data governance on GCP. Familiar with other platforms like Snowflake, Databricks, etc. Experience with containerization solutions using Google...
-
Spark Scala Developer
2 days ago
Gurgaon, Haryana, India | Kezan Inc | Full time | ₹ 12,00,000 - ₹ 15,00,000 per year
Role: Spark Scala Developer
Experience: 4+ Years
Location: Bangalore / Gurgaon (Hybrid)
Budget: 12–13 LPA
Notice Period: Immediate Joiners
Job Description: We are looking for a highly skilled Spark Scala Developer with a strong background in big data technologies and cloud platforms. The ideal candidate should have a solid understanding of data engineering,...
-
Lead Engineer
5 days ago
Gurgaon, Haryana, India | EXL | Full time | ₹ 20,00,000 - ₹ 25,00,000 per year
Title: Lead Engineer
Key Skills: SQL, Python, PySpark, Azure, Databricks, Team Leading
Work Schedule: Full-time, on-site in Gurgaon, 5 days a week.
Job Summary: We are seeking a skilled and experienced Azure Data Engineer to join our team. The ideal candidate will have hands-on experience in Azure Data Engineering, strong expertise in Databricks, PySpark, and...
-
Automation Developer
3 days ago
Gurgaon, Haryana, India | Airtel | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Automation Developer Job Description:
- Hands-on experience in development
- Hands-on experience with Python scripts
- Hands-on experience in PySpark coding; worked in Spark cluster computing technology
- Hands-on end-to-end data pipeline experience working on AWS environments
- Hands-on experience with writing Unix Shell...