Databricks + PySpark



Andhra Pradesh, India · Virtusa · Full time

**Detailed Job Description for Databricks + PySpark Developer:**

- Data Pipeline Development: Design, implement, and maintain scalable and efficient data pipelines using PySpark and Databricks for ETL processing of large volumes of data.
- Cloud Integration: Develop solutions leveraging Databricks on cloud platforms (AWS/Azure/GCP) to process and analyze data in a distributed computing environment.
- Data Modeling: Build robust data models, ensuring high-quality data integration and consistency across multiple data sources.
- Optimization: Optimize PySpark jobs for performance, ensuring the efficient use of resources and cost-effective execution.
- Collaborative Development: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver actionable insights.
- Automation & Monitoring: Implement monitoring solutions for data pipeline health, performance, and failure detection.
- Documentation & Best Practices: Maintain comprehensive documentation of architecture, design, and code. Ensure adherence to best practices for data engineering, version control, and CI/CD processes.
- Mentorship: Provide guidance to junior data engineers and help with the design and implementation of new features and components.

**Required Skills & Qualifications**:

- Experience: 6+ years of experience in data engineering or software engineering roles, with a strong focus on PySpark and Databricks.

**Technical Skills**:

- Proficient in PySpark for distributed data processing and ETL pipelines.
- Experience working with Databricks for running Apache Spark workloads in a cloud environment.
- Solid knowledge of SQL, data wrangling, and data manipulation.
- Experience with cloud platforms (AWS, Azure, or GCP) and their respective data storage and warehouse services (S3, ADLS, BigQuery, etc.).
- Familiarity with data lakes, data warehouses, and NoSQL databases (e.g., MongoDB, Cassandra, HBase).
- Experience with orchestration and transformation tools such as Apache Airflow, Azure Data Factory, or dbt.
- Familiarity with containerization (Docker, Kubernetes) and DevOps practices.
- Problem Solving: Strong ability to troubleshoot and debug issues related to distributed computing, performance bottlenecks, and data quality.
- Version Control: Proficient in Git-based workflows and version control.
- Communication Skills: Excellent written and verbal communication skills, with the ability to explain complex technical concepts to both technical and non-technical stakeholders.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).

**About Virtusa**

Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, and that seeks to provide you with exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career with us.

Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.


