Azure Databricks

5 hours ago


Bengaluru, Karnataka, India PradeepIT Consulting Services Full time

**About the job Azure Databricks**:
**Job Description: Data Engineer (Azure Databricks & PySpark)**:
**Position**: Data Engineer
**Experience**: 3 to 5 years
**Primary Skills**: Azure Databricks, PySpark, SQL
**Secondary Skills**: ADF (Azure Data Factory)
**Location**: Bengaluru/Hyderabad
**Mode of Work**: Hybrid
**Notice Period**: 0 to 30 Days

**Key Responsibilities**:
**ETL Development & Data Processing**:

- Design, develop, and maintain **ETL pipelines** using **Azure Databricks, PySpark, and ADF**.
- Extract, transform, and load data from multiple sources into **Azure Data Lake Storage**.
- Optimize and fine-tune **ETL workflows for performance and scalability**.
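
A minimal PySpark sketch of such a pipeline on Databricks (illustrative only, not part of the posting): it reads raw CSV files from ADLS Gen2, applies a simple cleansing step, and writes a Delta table back to ADLS. The storage account, container paths, and column names are hypothetical placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On a Databricks cluster a SparkSession named `spark` already exists;
# getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical ADLS Gen2 paths -- replace with the real storage account/containers.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/sales_clean"

# Extract: read raw CSV files.
orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Transform: drop incomplete rows, normalise types, derive a partition column.
orders_clean = (
    orders
    .dropna(subset=["order_id", "amount"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: persist as a partitioned Delta table so downstream jobs
# (e.g. ones triggered from ADF) can read it efficiently.
(
    orders_clean.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(curated_path)
)
```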

**SQL & Query Optimization**:

- Write **complex SQL queries, joins, subqueries, and functions** for data processing.
- Perform **SQL tuning and query optimization** for improved efficiency.
- Work with **large datasets** and ensure optimal data handling.
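
For illustration, a hedged example of query tuning run through Spark SQL on Databricks: the aggregation is written with a broadcast hint so the small dimension table is not shuffled, and the physical plan is inspected to confirm the join strategy. The table and column names are assumptions, not requirements from the posting.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical tables: a large fact table joined to a small dimension table.
# The BROADCAST hint and the early date filter are typical first steps when
# tuning a slow join over large datasets.
revenue_by_segment = spark.sql("""
    SELECT /*+ BROADCAST(c) */
           c.segment,
           SUM(o.amount) AS total_amount
    FROM   fact_orders  o
    JOIN   dim_customer c
      ON   o.customer_id = c.customer_id
    WHERE  o.order_date >= DATE '2024-01-01'
    GROUP  BY c.segment
""")

# Check the physical plan: a BroadcastHashJoin here avoids a full shuffle
# (sort-merge) join of the fact table.
revenue_by_segment.explain()
```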

**Azure Cloud Integration**:

- Implement **data solutions using Azure technologies** (Databricks, ADF, ADLS).
- Work with **batch and streaming data processing** for real-time analytics (preferred).
- Ensure **data security, compliance, and reliability** in cloud-based systems.
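
A hedged sketch of the streaming side, using Spark Structured Streaming with Databricks Auto Loader (`cloudFiles`) to ingest newly arrived files from ADLS into a Delta table incrementally. The paths, checkpoint location, and the `availableNow` trigger are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical landing and target paths in ADLS Gen2.
landing_path = "abfss://raw@examplestorage.dfs.core.windows.net/events/"
target_path = "abfss://curated@examplestorage.dfs.core.windows.net/events_delta"
checkpoint_path = "abfss://curated@examplestorage.dfs.core.windows.net/_checkpoints/events"

# Auto Loader discovers new JSON files incrementally; without an explicit
# schema it needs a schemaLocation to persist the inferred schema.
events = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(landing_path)
)

# Write to Delta; trigger(availableNow=True) processes the backlog and stops,
# so the same code can run as a scheduled incremental job or stay always-on
# by switching the trigger.
(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", checkpoint_path)
    .outputMode("append")
    .trigger(availableNow=True)
    .start(target_path)
)
```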

**Collaboration & Problem-Solving**:

- Work closely with cross-functional teams to support **data analytics and business intelligence initiatives**.
- Identify and resolve **data-related issues proactively**.
- Provide insights and recommendations for **improving ETL performance**.

**Required Skills & Qualifications**:
- **3 to 5 years of experience** in **Azure Databricks, PySpark, and SQL**.
- **Strong SQL skills**: complex joins, functions, procedures, query tuning.
- Hands-on experience with **ADF (Azure Data Factory) and Azure Data Lake Storage**.
- Ability to handle **large datasets** efficiently.
- Experience in **performance optimization of ETL workflows**.
- Good understanding of **batch and streaming data processing** (preferred).
- Strong analytical and problem-solving skills.
- Ability to **work independently** and take ownership of tasks.


  • Azure Databricks

    3 days ago


    Bengaluru, India Ushankk Full time

    **Azure Databricks** **Location**: Bangalore **Industry**: Information Technology **Skill**: Datalake **Experience**: 5 - 9 years. Azure Databricks / Datalake - Strong knowledge of Data Management principles - Designing and implementing data ingestion pipelines using Spark - Direct experience of building data pipelines using Azure Data Factory and...


  • Bengaluru, Karnataka, India BirlaSoft Full time

    Country/Region: IN - Requisition ID: 25277 - Location: INDIA - BENGALURU - HP - **Title**: Azure Databricks with PySpark - **Area(s) of responsibility**: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise...


  • Bengaluru, Karnataka, India BirlaSoft Full time

    Country/Region: IN - Requisition ID: 25276 - Location: INDIA - BENGALURU - HP - **Title**: Azure Databricks with PySpark - **Area(s) of responsibility**: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise...


  • Bengaluru, Karnataka, India Tata Consultancy Services Full time

    Inviting Applications for Azure Databricks along with Python. Job Title: Azure Databricks along with Python. Experience Required: 4 to 10 Years. Location: Bangalore. - Extensive expertise in designing and implementing data load processes using Azure Data Factory, Azure Databricks, Delta Lake, Azure Data Lake Storage and Python/PySpark - Proficient with...

  • Azure Databricks

    4 days ago


    Bengaluru, Karnataka, India Virtusa Full time

    Key Responsibilities: Azure Data Engineering: Design, develop, and maintain data pipelines and data workflows using Azure Databricks, ADLS Gen2, and Pyspark for data processing and transformation. Business Analysis & Requirements Gathering: Collaborate with clients to understand their business requirements and translate them into technical solutions. Work...

  • Azure Databricks

    2 days ago


    Bengaluru, Karnataka, India Response Informatics Full time

    Alteryx, Databricks and Azure Data Engineering. Secondary skill: Python

  • Azure Databricks

    1 day ago


    Bengaluru, India CIEL HR Services Full time

    Work experience: Above 6 years. Notice period: Immediate to 15 days. Work Mode: Hybrid. Must have 5+ years of hands-on experience in Azure Cloud development (ADF + Databricks). Strong in Azure SQL; good to have knowledge of Synapse / Analytics. Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies. Expertise in Azure Databricks, ADF,...


  • Bengaluru, Karnataka, India Recmatrix Consulting Full time

    Job Title: Azure Databricks Engineer. Location: Bangalore – ITPL (5 days WFO). Employment Type: Full-Time. Experience Required: 5+ Years. Notice Period: Immediate to 15 Days (Only). About the Role: We are looking for a Data Engineer with 6–8 years of experience in Databricks, Python, and SQL to join our team. This role will focus on migrating on-premise Big Data...


  • Bengaluru, Karnataka, India Tata Consultancy Services Full time

    Position: Azure Databricks + PySpark. Experience: 5 to 9 years. Location: Bangalore, Chennai, Hyderabad, Pune. Notice Period: 0 to 30 / 0 to 60 Days. Interview Mode: Virtual. Interview Date: 18-July-25. Responsibilities: 5+ years of relevant experience in PySpark and Azure Databricks. Proficiency in integrating, transforming and consolidating data from various structured and...