Job - Data Engineer (PySpark + SQL)
1 week ago
General Description Of Role
Professionals in this group design and implement high-performance, scalable and optimized data solutions for large enterprise-wide data mining and processing. They are responsible for design and development of data flows and Big Data Platform deployment and implementation. Incumbents usually require expert knowledge in Databricks, Spark, SQL etc.
JOB RESPONSIBILITIES
Design and propose end-to-end data pipelines (ingestion to storage to consumption) for data projects, as well as databases to support web-based applications (a minimal PySpark sketch of such a pipeline appears after this list).
Design and implement data warehouses and data marts to serve data consumers.
Execute the designs, taking them from design through implementation, operationalization, and ongoing maintenance.
Database design and modelling: organize data at both the macro and micro levels and provide logical data models for consumers.
Database performance tuning and data lifecycle management.
Assist in the support and enhancement of existing data pipelines and databases.
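The kind of pipeline described above could be sketched roughly as follows in PySpark; the file paths, column names, and aggregation are illustrative assumptions for this posting, not a prescribed implementation.

```python
# Minimal ingestion -> transformation -> storage sketch (hypothetical paths and schema).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

# Ingestion: read raw files landed by an upstream process.
raw_orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/landing/orders/")
)

# Transformation: basic cleansing plus a daily revenue aggregate for consumers.
daily_revenue = (
    raw_orders
    .filter(F.col("order_status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("order_amount").alias("total_revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Storage/consumption: persist a partitioned table for downstream marts and reports.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/warehouse/marts/daily_revenue/")
)
```

In a Databricks deployment the same flow would typically write Delta tables and run as a scheduled job, but the read, transform, and write structure stays the same.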
SKILLS/COMPETENCIES REQUIRED
6-10 years of total experience working with data integration teams.
3+ years of in-depth experience developing data pipelines within an Apache Spark environment (preferably Databricks).
2+ years of active, in-depth work with Databricks.
2+ years of in-depth experience working with a data warehouse, including knowledge of data warehouse modelling techniques.
Strong knowledge of PySpark, Python, SQL, and distributed computing principles.
Strong knowledge of data modelling, database technologies, and data warehousing.
Experience with ETL/ELT processes, covering both design and implementation, using SSIS or another ETL/ELT tool.
Knowledge of cloud platforms (AWS or Azure) and big data technologies (Hadoop, Spark, etc.).
Fluent in both complex SQL query performance tuning and overall database performance tuning (an illustrative tuning sketch follows this section).
Understand the importance of performance and be able to implement best practices for ensuring performance and maintainability in data-centric projects.
NICE TO HAVES
Some experience developing data solutions using native IaaS and PaaS services on AWS (Redshift, RDS, S3) is an advantage.
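As a rough illustration of the tuning fluency listed above, the sketch below inspects a Spark query plan and applies two common optimizations; the table names, partition column, and date filter are assumptions made for the example only.

```python
# Hedged query-tuning sketch: prune partitions early, broadcast the small dimension,
# and confirm the optimizations show up in the physical plan.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning_sketch").getOrCreate()

fact_sales = spark.read.parquet("/warehouse/fact_sales/")   # large table, partitioned by sale_date
dim_store = spark.read.parquet("/warehouse/dim_store/")     # small dimension table

tuned = (
    fact_sales
    .filter(F.col("sale_date") >= "2024-01-01")             # partition pruning before the join
    .join(broadcast(dim_store), "store_id")                 # broadcast join avoids shuffling the small side
    .groupBy("store_region")
    .agg(F.sum("sale_amount").alias("revenue"))
)

# Inspect the physical plan for BroadcastHashJoin and PartitionFilters entries (Spark 3.x).
tuned.explain(mode="formatted")
```

The same habit carries over to plain SQL: reading the plan (EXPLAIN), filtering on partition columns, and joining on well-chosen keys usually matter more than micro-optimizing individual expressions.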
-
Data Engineer (PySpark + SQL)
2 weeks ago
Gurgaon, Haryana, India | Accolite | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
About The Role: We are seeking an experienced Data Engineer to design, implement, and optimize a global data handling and synchronization solution across multiple regions. You will work with cloud-based databases, data lakes, and distributed systems, ensuring compliance with data residency and privacy requirements (e.g., GDPR).
Requirements: 6+ years of...
-
Python/PySpark Developer
2 weeks ago
Gurgaon, Haryana, India | DGLiger Consulting Pvt ltd | Full time | ₹ 20,00,000 - ₹ 25,00,000 per year
We are looking for a skilled Python PySpark Developer with 3-4 years of experience in designing, developing, and maintaining big data solutions. The ideal candidate will have hands-on expertise in Python, PySpark, and data pipeline development, along with strong problem-solving skills and the ability to work in a collaborative environment. Key...
-
Python/PySpark Developer
1 day ago
Gurgaon, Haryana, India | DGLiger Consulting Pvt ltd | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Location: Gurgaon (work from office). We are looking for a Python PySpark Developer with 3-4 years of experience, primarily focused on Python programming and automation using GitHub Actions. The ideal candidate will be responsible for developing scalable Python-based data workflows, writing PySpark scripts for large-scale data processing, and implementing...
-
ETL - PySpark Testing - Strong SQL Professional
2 weeks ago
Gurgaon, Haryana, India | IDESLABS PRIVATE LIMITED | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
At least 6-8 yrs of experience in ETL Testing with Automation Testing. Expert in database testing using SQL. Must have worked on Databricks and be aware of Databricks-related concepts. Check the data source locations and formats, perform a data count, and verify that the columns and data types meet the requirements (a brief PySpark sketch of these checks follows). Test the accuracy of the data, and its...
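The count and schema checks mentioned in this listing could look roughly like the following in PySpark; the source path, target table, and expected schema are hypothetical and only illustrate the shape of such checks.

```python
# Hedged sketch of ETL validation checks: row-count reconciliation and column/type verification.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, DateType

spark = SparkSession.builder.appName("etl_test_sketch").getOrCreate()

source_df = spark.read.parquet("/landing/transactions/")   # hypothetical source extract
target_df = spark.table("analytics.transactions")          # hypothetical loaded table

# 1. Data count check: every source row should have landed in the target.
assert source_df.count() == target_df.count(), "Row counts differ between source and target"

# 2. Column and data-type check against the documented requirements.
expected_schema = StructType([
    StructField("txn_id", StringType()),
    StructField("txn_date", DateType()),
    StructField("amount", DoubleType()),
])
actual_types = {f.name: f.dataType for f in target_df.schema.fields}
for field in expected_schema.fields:
    assert field.name in actual_types, f"Missing column: {field.name}"
    assert actual_types[field.name] == field.dataType, f"Type mismatch on column: {field.name}"
```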
-
Data Engineer
1 week ago
Gurgaon, Haryana, India | Impetus | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Open Location: Gurgaon and Bangalore.
Job Description: 4-7 years' experience working on data engineering and ETL/ELT processes, data warehousing, and data lake implementation with cloud services. Hands-on experience in designing and implementing solutions such as creating/deploying jobs, orchestrating the job/pipeline, and infrastructure configurations. Expertise in...
-
Lead Data Engineer
5 days ago
Gurgaon, Haryana, India | EXL Service | Full time | ₹ 20,00,000 - ₹ 25,00,000 per year
Immediate Hiring Alert - Lead Data Engineer (5 days, work from office). Expert in Azure, Databricks, PySpark & SQL with a passion for leading teams.
Total Experience: 8 yrs. Must Have: 3 yrs in Azure Data Engineering; 2 yrs in Databricks & PySpark; strong SQL expertise; team leadership (5 members); excellent...
-
Software Engineer, PySpark
7 days ago
Gurgaon, Haryana, India | RBS | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Join us as a Software Engineer, PySpark. This is an opportunity for a driven Software Engineer to take on an exciting new career challenge. Day-to-day, you'll be engineering and maintaining innovative, customer-centric, high-performance, secure and robust solutions. It's a chance to hone your existing technical skills and advance your career while building a wide...
-
Lead Data Engineer (MDM)
2 weeks ago
Gurgaon, Haryana, India | Accolite | Full time | ₹ 12,00,000 - ₹ 36,00,000 per year
About The Role: We are looking for a Lead Data Engineer to architect and guide the design and implementation of a global data handling and synchronization platform. In addition to being hands-on, you will provide technical leadership to a small team of data engineers and advise on master data management (MDM) best practices, ensuring compliance with global...
-
AWS Data Engineer
6 days ago
Gurgaon, Haryana, India | Imbibe Consultancy Services Pvt Ltd | Full time | ₹ 6,00,000 - ₹ 18,00,000 per year
Experience: 5 to 8 years. Location: Bengaluru, Gurgaon, Pune. Job code: 101356. Posted on: Oct 27, 2025.
About Us: AceNet Consulting is a fast-growing global business and technology consulting firm specializing in business strategy, digital transformation, technology consulting, product development, start-up advisory and fund-raising services to our global clients across...
-
Data Engineer
7 days ago
Gurgaon, Haryana, India | Apps Associates Careers | Full time | ₹ 6,00,000 - ₹ 18,00,000 per year
Role: Data Engineer
Skills:
• years of experience as a Data Engineer, with extensive working experience using PySpark, advanced SQL, and Snowflake; a complex understanding of SQL and skills in Spark, Snowflake, and Glue.
• AWS expertise (Azure or Google will work too) - Lambda, Glue, S3, etc.
• Experience in software development, CI/CD, Agile methodology
•...