PySpark + Databricks
6 days ago
**Skill: PySpark + Databricks**
**Experience**: 6 to 9 years
**Location**: AIA - Gurgaon
**Job Summary**
**Responsibilities**
- Lead the design and implementation of scalable data solutions using Databricks Workflows to enhance data processing capabilities.
- Oversee the integration of Databricks Unity Catalog to ensure seamless data governance and security across platforms.
- Provide technical expertise in Databricks to optimize performance and drive innovation in data analytics.
- Collaborate with cross-functional teams to deliver high-quality solutions that meet business requirements and objectives.
- Develop and maintain comprehensive documentation for data workflows and processes to facilitate knowledge sharing.
- Mentor and guide junior team members to foster a culture of continuous learning and improvement.
- Conduct regular code reviews to ensure adherence to best practices and coding standards.
- Analyze complex data sets to identify trends and insights that support strategic decision-making.
- Implement robust testing frameworks to ensure the reliability and accuracy of data solutions.
- Coordinate with stakeholders to gather requirements and translate them into technical specifications.
- Drive the adoption of new technologies and methodologies to enhance the team's capabilities and efficiency.
- Ensure compliance with industry standards and regulations to protect data integrity and privacy.
- Monitor system performance and troubleshoot issues to maintain optimal operation of data platforms.
**Qualifications**
- Possess extensive experience in Databricks Workflows and Unity Catalog, demonstrating a deep understanding of their functionalities.
- Have a strong background in data engineering and analytics, with a proven track record of delivering successful projects.
- Exhibit excellent problem-solving skills and the ability to work effectively in a hybrid work model.
- Demonstrate proficiency in programming languages such as Python or Scala, relevant to Databricks environments.

---

databricks/pyspark
2 weeks ago
Chennai, Tamil Nadu, India · NRM Analytix · Full time · ₹4,00,000 - ₹12,00,000 per year

Responsibilities:
- Design, develop & maintain PySpark solutions using the Databricks platform
- Optimize performance through efficient data processing techniques
- Collaborate with cross-functional teams on project delivery

---

PySpark
1 week ago
Chennai, Tamil Nadu, India · Cognizant · Full time

**Job Summary**

**Responsibilities**
- Develop and maintain data solutions using Databricks SQL, Databricks Delta Lake, and Databricks Workflows.
- Optimize and enhance existing data workflows to improve performance and efficiency.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- ...

---

Etl Databricks with Aws
2 weeks ago
Chennai, Tamil Nadu, India · Virtusa · Full time

- Develop and maintain a metadata-driven generic ETL framework for automating ETL code.
- Design, build, and optimize ETL/ELT pipelines using Databricks (PySpark/SQL) on AWS.
- Ingest data from a variety of structured and unstructured sources (APIs, RDBMS, flat files, streaming).
- Develop and maintain robust data pipelines for batch and streaming data using Delta...
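The "metadata-driven generic ETL framework" this listing asks for means driving pipelines from configuration records rather than hand-writing one job per table: a single engine reads a config describing each source and the transform steps to apply. PySpark specifics aside, the pattern can be sketched in plain Python; all names and config fields below are illustrative, not from the posting:

```python
# Minimal sketch of a metadata-driven ETL loop. In a real Databricks setup
# the reader/writer would be spark.read / DataFrame.write and the config
# would live in a Delta table or JSON files; plain lists and dicts stand in
# here so the control flow is easy to follow.

# Registry of named transformation steps the engine knows how to run.
TRANSFORMS = {
    "uppercase_name": lambda row: {**row, "name": row["name"].upper()},
}

def run_pipeline(config, source_rows):
    """Apply the transform steps named in `config` to `source_rows`."""
    rows = source_rows
    for step in config["steps"]:
        if step == "drop_nulls":
            # Filter steps change row count, so they are handled separately.
            rows = [r for r in rows if all(v is not None for v in r.values())]
        else:
            rows = [TRANSFORMS[step](r) for r in rows]
    return rows

# Metadata describing one pipeline -- one record like this per source system.
customer_config = {"source": "customers", "steps": ["drop_nulls", "uppercase_name"]}

data = [{"id": 1, "name": "ada"}, {"id": 2, "name": None}]
print(run_pipeline(customer_config, data))  # -> [{'id': 1, 'name': 'ADA'}]
```

Adding a new source then means adding a config record, not new code, which is what makes the framework "generic".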

---

Databricks
1 week ago
Chennai, Tamil Nadu, India · Virtusa · Full time

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks.
- Write efficient and production-ready PySpark or Scala code for data transformation and ETL processes.
- Integrate data from various structured and unstructured sources into a unified platform.
- Implement Delta Lake and manage data versioning, updates,...

---

Databricks Data Engineer
6 days ago
Tamil Nadu, India · Tata Consultancy Services · Full time

About the Role
The Databricks Data Engineer will play a crucial role in migrating and optimizing data workflows, ensuring high performance and reliability in our data operations.

Experience: 3 to 6 years
Location: Kolkata / Chennai / Trivandrum
Required Technical Skill Set: Databricks, AWS, Python & PySpark
Good to have: Experience in AWS platform (S3,...

---

Azure Databricks
6 days ago
Chennai, Tamil Nadu, India · Cognizant · Full time

**Skill: Azure Databricks**
**Experience**: 9 to 12 years
**Location**: AIA-Gurgaon

**Job Summary**
As a Sr. Developer, you will leverage your expertise in Databricks, PySpark, and AWS to drive innovative solutions in a hybrid work model. With 6 to 10 years of experience, you will collaborate with cross-functional teams to enhance data processing...

---

Databricks Spark
3 days ago
Chennai, Tamil Nadu, India · Cognizant · Full time

**Job Summary**

**Responsibilities**
- Develop and maintain scalable data processing solutions using Kafka, Python, and PySpark to enhance data flow and analytics capabilities.
- Collaborate with cross-functional teams to design and implement Databricks Workflows that streamline data operations and improve efficiency.
- Utilize Databricks SQL to perform...

---

Databricks with Python
2 weeks ago
Chennai, India · Teamware Solutions · Full time

Job Description
Teamware Solutions is seeking a skilled Databricks with Python developer to build, optimize, and manage our big data processing and analytics solutions. This role is crucial for working with relevant technologies, ensuring smooth data operations, and contributing significantly to business objectives through expert analysis, development,...

---

Databricks Admin
2 weeks ago
Tamil Nadu, India · Tata Consultancy Services · Full time

Databricks Admin
Experience: 5 to 8 years
Location: Chennai / Kolkata

Required Skills: Databricks Administration, Terraform, AWS cloud services, MLflow, Workflows, Databricks Asset Bundles, AWS IAM, VPC, private endpoints, firewalls, S3, SQL + PySpark, dbx CLI

Responsibilities: Implement and maintain the Databricks platform, including workspace setup, user and group management,...
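Databricks Asset Bundles, listed among the required skills above, package jobs and workspace resources as declarative configuration deployed through the Databricks CLI (`databricks bundle deploy -t dev`). A minimal `databricks.yml` might look like the following sketch; the bundle name and workspace URL are placeholders, not values from the posting:

```yaml
# databricks.yml -- minimal Asset Bundle definition (illustrative values)
bundle:
  name: analytics_etl

targets:
  dev:
    mode: development          # dev mode prefixes deployed resources per user
    workspace:
      host: https://example.cloud.databricks.com   # placeholder workspace URL
```

Because the bundle is plain configuration, it can be version-controlled and applied per environment, which is what makes it pair naturally with the Terraform administration skills in the same listing.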