PySpark/Python Data Engineer

4 months ago


Delhi, India Genpact Full time
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant – PySpark/Python Data Engineer.

We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and for collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also work on data engineering problems and build data pipelines, with ample opportunities to take on challenging and innovative projects using the latest technologies and tools.

If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Responsibilities
• Develop, test, and maintain high-quality solutions using PySpark/Python.
• Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines.
• Collaborate with cross-functional teams to identify and solve complex problems.
• Write clean and reusable code that can be easily maintained and scaled.
• Keep up to date with emerging trends and technologies in Python development.
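As a rough illustration of the day-to-day work described above, here is a minimal PySpark pipeline sketch: reading raw data, applying a simple transformation, and writing a curated output. The dataset paths and column names are hypothetical and not part of this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: clean raw order events and publish a curated table.
spark = SparkSession.builder.appName("orders-curation").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")   # assumed input path

curated = (
    raw
    .dropDuplicates(["order_id"])                          # remove replayed events
    .filter(F.col("amount") > 0)                           # drop invalid rows
    .withColumn("order_date", F.to_date("order_ts"))       # derive a partition column
    .select("order_id", "customer_id", "amount", "order_date")
)

(curated
 .write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-bucket/curated/orders/"))          # assumed output path
```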

Qualifications we seek in you

Minimum qualifications
• years of experience as a Python Developer with a strong portfolio of projects.
• Bachelor's degree in Computer Science, Software Engineering, or a related field.
• Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF.
• In-depth understanding of Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch.
• Experience with data platforms such as Databricks/Snowflake.
• Experience with front-end development using HTML or Python.
• Familiarity with database technologies such as SQL and NoSQL.
• Excellent problem-solving ability with solid communication and collaboration skills.
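Since the minimum qualifications above call for pipeline development with AWS Glue, the following is a minimal sketch of a Glue PySpark job. The Glue Data Catalog database, table, and S3 output path are purely illustrative assumptions, not names from this posting.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job bootstrap: the job name is passed in as a job argument.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from a hypothetical Glue Data Catalog table.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_orders"
)

# Apply Spark transformations, then write back to S3 as Parquet.
df = source.toDF().dropDuplicates(["order_id"])
df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```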

Preferred skills and qualifications
• Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid.
• Knowledge of GenAI concepts and LLMs.
• Contributions to open-source Python projects or active involvement in the Python community.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
  • Python + PySpark

    6 months ago


    Delhi, India Nityo Infotech Full time

    **Job Code**: JD-19623. **Job Description**: Mandatory skills: Python, PySpark, Databricks, SQL. Primary skills: hands-on Python, PySpark, Databricks, SQL, AWS (S3, Lambda, EC2, RDS), and CI/CD tools. 6 or more years of experience developing, testing, and implementing major Information Technology programs or projects that...


  • Delhi, India KC Executive Search Full time

    Technical Stack: Python, Spark, AWS, PySpark. Responsibilities: Develop and enhance data processing, orchestration, monitoring, and more by leveraging popular open-source software, AWS, and GitLab automation. Collaborate with product and technology teams to design and validate the capabilities of the data platform. Identify, design, and implement process...


  • AWS Data Engineer

    4 weeks ago


    Delhi, India ITI Data Full time

    Job Description: We are looking for an AWS Data Engineer with primary skills in PySpark development who will be able to design and build solutions for one of our Fortune 500 client programs, which aims to build an Enterprise Data Lake on the AWS Cloud platform and build data pipelines by developing several AWS Data Integration, Engineering & Analytics resources....


  • Delhi, India Data Warehouse Engineer Full time

    Experience: 2-5 years. Primary skills: strong SQL knowledge, data warehouse concepts and hands-on ETL experience, Azure services such as Data Factory and Azure Data Lake, and good analysis skills. Good-to-have skills: Power BI, Databricks, Python.


  • Data Engineer

    2 days ago


    Delhi, India Tata Consultancy Services Full time

    Job Description: Name of the position: Data Engineer. Location: Bangalore, Hyderabad, Mumbai, Pune, Chennai, NCR. Skill requirements & experience: overall 8+ years of work experience in Data Warehouse (DWH) development; experience in Azure, Python, SQL, PySpark; hands-on experience in Azure Data Factory and Databricks; exposure and experience with Cosmos; understanding of...


  • PySpark Developer

    1 day ago


    Delhi, India Tata Consultancy Services Full time

    Role: PySpark Developer. Technical skill set: PySpark, Python, HDFS, Hadoop, SQL. Location: Mumbai, Pune, Chennai, Bangalore, NCR, Hyderabad. Must-have: sound programming knowledge of PySpark and SQL for processing large amounts of semi-structured and unstructured data; ability to design data pipelines end to end; knowledge of Avro, Parquet...
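    To make the must-have skills above concrete, here is a minimal sketch of processing semi-structured data with PySpark and Spark SQL; the file paths, view name, and columns are hypothetical, and reading Avro would additionally require the external spark-avro package.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("semi-structured-demo").getOrCreate()

# Hypothetical Parquet dataset of click events; Parquet support is built into Spark.
events = spark.read.parquet("hdfs:///data/events/")
events.createOrReplaceTempView("events")

# Spark SQL over the registered view: daily event counts per device type.
daily_counts = spark.sql("""
    SELECT device_type, to_date(event_ts) AS event_date, COUNT(*) AS events
    FROM events
    GROUP BY device_type, to_date(event_ts)
""")

daily_counts.write.mode("overwrite").parquet("hdfs:///data/marts/daily_event_counts/")
```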

  • AWS Data Engineer

    2 weeks ago


    Delhi, India Recro Full time

    Job Title: Data Engineer. Experience: 5+ years. Location: Remote. Job Type: Full-time. Role Overview: We are seeking an experienced Data Engineer to develop and optimize data pipelines and manage cloud infrastructure. The ideal candidate has strong Python, PySpark, and AWS skills, with a focus on performance optimization and clean, maintainable code. Key...
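    As a hedged illustration of the performance-optimization focus mentioned above, the sketch below shows two common PySpark techniques: broadcasting a small dimension table to avoid shuffling the large side of a join, and partitioning output by a query-friendly key. All paths and column names are made up for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("perf-sketch").getOrCreate()

# Hypothetical inputs: a large fact table and a small lookup table.
orders = spark.read.parquet("s3://example-bucket/orders/")
countries = spark.read.parquet("s3://example-bucket/countries/")

# Broadcasting the small side keeps the large table from being shuffled.
enriched = orders.join(F.broadcast(countries), on="country_code", how="left")

# Partition the output by a column downstream queries filter on.
(enriched
 .write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-bucket/orders_enriched/"))
```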


  • PySpark Developer

    1 week ago


    Delhi, India Tata Consultancy Services Full time

    Greetings from TCS! We are conducting interviews for PySpark Developer. Role: PySpark Developer. Experience: 6-10 years. Location: Bangalore. Responsibilities / expectations from the role: minimum 5 years of PySpark development experience, especially in Spark SQL and complex transformations; minimum 3 years of Python development experience; minimum 2 years of...

  • Senior Data Engineer

    2 weeks ago


    Delhi, India Recro Full time

    Key skills: Python (4+ years), Azure (1 year), PySpark (4+ years), GitHub, DB (SQL, any SQL database). Location: Bengaluru & Kochi. Domain: pharmacy solutions, clinical care services. A data engineering project that needs to source data from multiple sources (primarily batch sources; streaming sources are good to have) and transform and load it into a data lake. The...

  • Data Engineer

    3 weeks ago


    Delhi, India Luxoft Full time

    Project Description: As a Data Engineer, you'll be working alongside data architects to take data through its lifecycle: acquisition, exploration, data cleaning, integration, analysis, interpretation, and visualization. You will be creating the pipeline for data processing, data visualization, and analytics products, including automated services, and...
