
Python Pyspark
2 weeks ago
7 years of overall IT experience, with a minimum of 4 years of work experience in the tech skills below
Tech Skills
Proficient in Python scripting and PySpark for data processing tasks
Strong SQL capabilities, with hands-on experience managing big data using ETL tools such as Informatica
Experience with the AWS cloud platform and its data services, including S3, Redshift, Lambda, EMR, Airflow, Postgres, SNS and EventBridge
Skilled in Bash shell scripting
Understanding of data lakehouse architecture, particularly the Iceberg table format, is a plus
Preferred: experience with Kafka and MuleSoft APIs
Understanding of healthcare data systems is a plus
Experience in Agile methodologies
Strong analytical and problem-solving skills
Effective communication and teamwork abilities
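
For context, a minimal sketch of the kind of PySpark-on-AWS processing this role describes. The S3 paths, column names (`member_id`, `claim_amount`, `claim_timestamp`, `status`) and the app name are hypothetical placeholders, not details from the posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical S3 locations; replace with real buckets and prefixes.
SOURCE_PATH = "s3://example-raw-bucket/claims/"
TARGET_PATH = "s3://example-curated-bucket/claims_daily/"

spark = SparkSession.builder.appName("claims-daily-aggregation").getOrCreate()

# Read raw records (assumed Parquet), keep only valid rows,
# and aggregate amounts per member per day.
raw = spark.read.parquet(SOURCE_PATH)

daily = (
    raw.filter(F.col("status") == "VALID")
       .withColumn("claim_date", F.to_date("claim_timestamp"))
       .groupBy("member_id", "claim_date")
       .agg(
           F.sum("claim_amount").alias("total_amount"),
           F.count("*").alias("claim_count"),
       )
)

# Write partitioned output for downstream consumers (e.g. Redshift Spectrum or Athena).
daily.write.mode("overwrite").partitionBy("claim_date").parquet(TARGET_PATH)

spark.stop()
```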
**About Virtusa**
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
-
Python & Pyspark
1 week ago
Karnataka, India | Teradata | Full time
Teradata’s Global Delivery Center (GDC) brings together data enthusiasts from all corners of the world to support our customers across the globe. We analyze their business needs and address them with the most sophisticated data solutions including Data Warehouses, Data Lakes, Operational Data Stores and much more. We deliver and manage complex hybrid...
-
Python, Pyspark
2 weeks ago
Bengaluru, India | Ziniosedge | Full time
Python, Pyspark 5 Technology Lead (Data Engineer) BE - 8+ years of industry experience as Lead Developer - Experience in implementing ETL and ELT data pipelines with PySpark - Spark Structured API, Spark SQL & Spark performance tuning are highly preferred. - Experience in building data pipelines on a data lake or Lakehouse (AWS Databricks) & handling...
-
Pyspark
2 weeks ago
Bengaluru, India | Riverforest Connections | Full time
Position Purpose - For the projects which are implemented on the Data SSC platform, we need a senior developer for Python/PySpark development to migrate the project from Spark/Scala to Python/PySpark. **Responsibilities** **Direct Responsibilities** Design high quality deliverables adhering to business requirements with defined standards and design principles,...
-
Pyspark
6 hours ago
Bengaluru, Karnataka, India | Cognizant | Full time
**Job Summary** **Responsibilities** - Lead the design and implementation of data processing pipelines using Databricks, PySpark and Python - Oversee the development and maintenance of scalable data solutions to support business needs - Provide technical guidance and mentorship to team members to ensure high-quality deliverables - Collaborate with...
-
Python Pyspark Aws SQL
1 week ago
Bengaluru, Karnataka, India | Virtusa | Full time
Role: Data Engineer. 1) Minimum of 5+ years of experience in Data Engineering. Required technical skills: Python, PySpark, SQL, AWS. 2) Required domain: Healthcare. 3) Extensive data analysis skills and good communication skills. Writing scalable code using the Python programming language. Developing back-end components. Integrating user-facing elements using...
-
Python Pyspark Data Engineer
2 weeks ago
Bengaluru, Karnataka, India | DXC Technology | Full time
Python PySpark Data Engineer. Job location: Hyderabad, Bangalore, Chennai, Kolkata, Noida, Gurgaon, Pune, Indore, Mumbai. Strong Python skills - Proficient in Python for data manipulation, automation and building reusable components. Data pipeline development - Experience designing and maintaining ETL data pipelines using tools like Airflow or...
-
Python Web Scraper
7 days ago
Bengaluru, India | Talent Acceleration Corridor | Full time
Requirements: They are looking for a Python Web Scraper who should continually strive to advance engineering excellence and technology innovation. The mission is to power the next generation of digital products and services through innovation, collaboration, and transparency. You will be a technology leader and doer who enjoys working in a dynamic,...
-
Pyspark
1 week ago
Bengaluru, Karnataka, India | Hero Moto Corp | Full time
**Date**: 12 Nov 2024 | **Location**: Bengaluru, KA, IN, 560038 | **Company**: Hero MotoCorp | **Function**: Digital & Information Technologies | **Pay Band**: E4 to M2 | **Role**: PySpark, Glue or Databricks developer proficient in cloud technologies who can become a member of the HMCL Digital Connected core tech team. The role demands amazing ETL coding skills...
-
Data Engineer
2 weeks ago
Bengaluru, Karnataka, India | Golden Opportunities | Full time | ₹ 1,04,000 - ₹ 1,30,878 per year
Job title: Data Engineer (Python + PySpark + SQL). Candidate specification: minimum 6 to 8 years of experience as a Data Engineer. Job description: Data Engineer with strong expertise in Python, PySpark, and SQL. Design, develop, and maintain robust data pipelines using PySpark and Python. Strong understanding of SQL and relational databases (e.g.,...
-
Data Engineer
5 days ago
Bengaluru, India | Golden Opportunities | Full time
Job title: Data Engineer (Python + PySpark + SQL). Candidate specification: minimum 6 to 8 years of experience as a Data Engineer. Job description: Data Engineer with strong expertise in Python, PySpark, and SQL. Design, develop, and maintain robust data pipelines using PySpark and Python. Strong understanding of SQL and relational databases (e.g.,...