Current jobs related to Software Engineer II, PySpark, Databricks - Hyderabad - JPMorgan Chase & Co.


  • Hyderabad, Telangana, India JPMorganChase Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Job Description: We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. Job summary: As a Software Engineer II at JPMorgan Chase within Corporate Technology, you design and deliver trusted technology products in a secure, stable, and scalable way. You are responsible for implementing...


  • Hyderabad, India JPMorganChase Full time

    Job Description: We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. Job Summary: As a Software Engineer II at JPMorgan Chase within Corporate Technology, you design and deliver trusted technology products in a secure, stable, and scalable way. You are responsible for...


  • Hyderabad, Telangana, India Cognizant Full time ₹ 1,00,00,000 - ₹ 3,00,00,000 per year

    Skills: Databricks + PySpark. Experience: 4 to 13 years. Location: AIA-Pune. We are looking for a highly skilled Data Engineer with expertise in PySpark and Databricks to design, build, and optimize scalable data pipelines for processing massive datasets. Key Responsibilities: Build & Optimize Pipelines: Develop high-throughput ETL workflows using PySpark on...


  • Hyderabad, Telangana, India Cognizant Full time

    Skills: Databricks + PySpark. Experience: 4 to 13 years. Location: AIA-Pune. We are looking for a highly skilled Data Engineer with expertise in PySpark and Databricks to design, build, and optimize scalable data pipelines for processing massive datasets. Key Responsibilities: - Build & Optimize Pipelines: Develop high-throughput ETL...
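
    As an illustration of the ETL work these Cognizant listings describe, here is a minimal PySpark-on-Databricks sketch, assuming Delta is the table format (the Databricks default); the table and column names (raw.orders, analytics.daily_order_totals, order_ts, and so on) are hypothetical and not taken from the posting.

```python
# Illustrative sketch only: a small PySpark ETL step of the kind the posting
# describes: read a raw table, cleanse and aggregate it, write a Delta table.
# All table and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: a hypothetical raw orders table registered in the metastore.
raw = spark.read.table("raw.orders")

# Transform: keep completed orders and build a daily, per-region aggregate.
daily_totals = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("order_count"),
       )
)

# Load: persist the aggregate as a managed Delta table.
(daily_totals.write
             .format("delta")
             .mode("overwrite")
             .saveAsTable("analytics.daily_order_totals"))
```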


  • Hyderabad, Telangana, India Cognizant Technology Solutions Full time ₹ 12,00,000 - ₹ 36,00,000 per year

    Job Summary: We are seeking a highly skilled Sr. Developer with 6 to 10 years of experience to join our team. The ideal candidate will have expertise in Databricks SQL, Databricks Workflows, and PySpark. Experience in the Cards & Payments domain is a plus. This is a hybrid work model with day shifts and no travel required. Responsibilities: Develop and maintain...


  • Hyderabad, India Fusion Plus Solutions Full time

    Job Description Roles and Responsibilities: - 5+ years of experience in the IT industry in Data Engineering and Data Analyst roles. - 5 years of development experience using Databricks, PySpark, Python, and SQL. - Proficient in writing SQL queries, including window functions. - Good communication skills and analytical problem-solving abilities...

  • Databricks PySpark

    1 week ago


    Hyderabad, India Fusion Plus Solutions Full time

    Job Description: - Develop and optimize data processing jobs using PySpark to handle complex data transformations and aggregations efficiently. - Design and implement robust data pipelines on the AWS platform, ensuring scalability and efficiency. - Leverage AWS services such as EC2 and S3 for comprehensive data processing and storage solutions. - Manage SQL...
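
    The first two items above can be illustrated with a minimal sketch, assuming a Spark cluster that already has S3 credentials configured; the bucket, paths, and column names (example-bucket, event_ts, event_type) are hypothetical and not part of the posting.

```python
# Illustrative sketch only: read raw CSV events from S3, aggregate them with
# PySpark, and write the result back to S3 as partitioned Parquet.
# All bucket, path, and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3_event_counts").getOrCreate()

# Extract: raw CSV files landed in S3 (assumes s3a credentials are configured).
events = spark.read.option("header", True).csv("s3a://example-bucket/raw/events/")

# Transform: count events per type per day.
summary = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date", "event_type")
          .count()
)

# Load: write back to S3 as Parquet, partitioned by day.
(summary.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/event_counts/"))
```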

  • Databricks Engineer

    4 weeks ago


    Hyderabad, India Tata Consultancy Services Full time

    Role: Databricks Engineer. Required Technical Skill Set: Azure Databricks using Scala, PySpark. Experience: 5 to 10 years. Location: Hyderabad, Noida, Chennai & Mumbai. Job Description: Must have (please detail years of experience for each): 5 or more years of experience working on Azure Databricks using...

  • Databricks Engineer

    2 days ago


    Hyderabad, India Tata Consultancy Services Full time

    Role: Databricks Engineer. Required Technical Skill Set: Azure Databricks using Scala, PySpark. Experience: 5 to 10 years. Location: Hyderabad, Noida, Chennai & Mumbai. Job Description: Must have (please detail years of experience for each): 5 or more years of experience working on Azure Databricks using...

Software Engineer II, PySpark, Databricks

4 weeks ago


Hyderabad, India JPMorgan Chase & Co. Full time

Description

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

Job summary


As a Software Engineer II at JPMorgan Chase within Corporate Technology, you design and deliver trusted technology products in a secure, stable, and scalable way. You are responsible for implementing critical solutions across multiple technical areas to support the firm’s business objectives.

Job responsibilities

- Execute software solutions, design, development, and technical troubleshooting to solve complex problems.
- Create secure, high-quality production code and maintain algorithms that run synchronously with systems.
- Produce architecture and design artifacts for complex applications, ensuring design constraints are met.
- Gather, analyze, and develop visualizations and reporting from large, diverse data sets to improve applications and systems.
- Identify hidden problems and patterns in data to drive improvements in coding hygiene and system architecture.
- Contribute to software engineering communities of practice and events exploring new and emerging technologies.

Required qualifications, capabilities, and skills

- Formal training or certification on software engineering concepts and 2+ years of applied experience.
- At least one year of practical experience with Spark, SQL, Databricks, and the AWS cloud ecosystem.
- Expertise in Apache NiFi, Lakehouse or Delta Lake architectures, system design, application development, testing, and operational stability.
- Strong programming skills in PySpark and Spark SQL.
- Proficiency in orchestration using Airflow (a minimal sketch follows below).
- In-depth knowledge of Big Data and data warehousing concepts.
- Experience with CI/CD processes.
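
The PySpark and Airflow requirements above can be pictured with a minimal orchestration sketch, assuming Airflow 2.4 or newer and a worker that can run spark-submit; the DAG id, schedule, and script paths are hypothetical and not taken from the posting.

```python
# Illustrative sketch only: an Airflow DAG that submits a PySpark ETL job and a
# downstream data-quality check once a day. All ids and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_order_totals",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # requires Airflow 2.4+
    catchup=False,
) as dag:
    # Submit the PySpark ETL job for the logical run date.
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit /opt/jobs/orders_etl.py --run-date {{ ds }}",
    )

    # Run a data-quality check only after the ETL step succeeds.
    dq_check = BashOperator(
        task_id="data_quality_check",
        bash_command="spark-submit /opt/jobs/dq_check.py --run-date {{ ds }}",
    )

    run_etl >> dq_check
```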

Preferred qualifications, capabilities, and skills

- Familiarity with Snowflake, Terraform, and large language models.
- Exposure to cloud technologies such as AWS Glue, S3, SQS, SNS, and Lambda.
- AWS certifications such as Solutions Architect Associate, Developer Associate, or Data Analytics Specialty, or a Databricks certification.