Python/PySpark, BigQuery with GCP, Apache Iceberg
5 days ago
TCS Hiring Virtual Drive | 12-Nov-25 | TCS Hyderabad | 12 PM to 1 PM | Immediate Joiners

Role: Python/PySpark, BigQuery with GCP, Apache Iceberg
Experience: 5 to 7 years

Please read the job description before applying.

NOTE: If the skills/profile match and you are interested, please reply to this email attaching your latest updated CV along with the details below:
Name:
Contact Number:
Email ID:
Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc./etc.)
Current Organization Name:
Total IT Experience: 5 to 7 years
Location: Hyderabad
Current CTC:
Expected CTC:
Notice Period: Immediate
Whether worked with TCS: Y/N

Must-Have (ideally no more than 3-5):
- Strong proficiency in Python programming.
- Hands-on experience with PySpark and Apache Spark.
- Knowledge of Big Data technologies (Hadoop, Hive, Kafka, etc.).
- Experience with SQL and relational/non-relational databases.
- Familiarity with distributed computing and parallel processing.
- Understanding of data engineering best practices.
- Experience with REST APIs, JSON/XML, and data serialization.
- Exposure to cloud computing environments.

Good-to-Have:
- 5+ years of experience in Python and PySpark development.
- Experience with data warehousing and data lakes.
- Knowledge of machine learning libraries (e.g., MLlib) is a plus.
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration abilities.

Responsibilities of / Expectations from the Role:
- Develop and maintain scalable data pipelines using Python and PySpark.
- Design and implement ETL (Extract, Transform, Load) processes.
- Optimize and troubleshoot existing PySpark applications for performance.
- Collaborate with cross-functional teams to understand data requirements.
- Write clean, efficient, and well-documented code.
- Conduct code reviews and participate in design discussions.
- Ensure data integrity and quality across the data lifecycle.
- Integrate with cloud platforms like AWS, Azure, or GCP.
- Implement data storage solutions and manage large-scale datasets.
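The pipeline responsibilities listed above follow the classic extract-transform-load shape. Below is a minimal illustrative sketch of that shape in plain Python; it is not actual PySpark, and the `extract`, `transform`, `load` helpers and the sample records are hypothetical, chosen only to show how the stages compose. In PySpark, the same structure would use `spark.read` for extraction, DataFrame transformations, and `df.write` for loading.

```python
# Minimal ETL sketch (plain Python, for illustration only).
# extract -> transform -> load, the shape described in the responsibilities above.

def extract(source):
    """Extract: read raw records from a source (here, an in-memory list)."""
    return list(source)

def transform(records):
    """Transform: drop invalid rows and normalize field values."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in records
        if r.get("name") and r.get("amount") is not None
    ]

def load(records, sink):
    """Load: append transformed rows to a sink (here, an in-memory list)."""
    sink.extend(records)
    return len(records)

raw = [
    {"name": "  alice ", "amount": "10.5"},
    {"name": "bob", "amount": "3"},
    {"name": "", "amount": "7"},  # dropped by transform: empty name
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)        # 2
print(warehouse[0])  # {'name': 'Alice', 'amount': 10.5}
```

Keeping each stage a pure function, as sketched here, is what makes pipelines like these testable and easy to optimize stage by stage.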
-
Data Engineer
5 days ago
New Delhi, India | Tata Consultancy Services | Full time
Greetings from Tata Consultancy Services!!!
Role: Data Engineer with mandatory PySpark and Python knowledge
Desired Experience Range: 5 to 10 yrs
Location: Pune, Hyderabad, Chennai, Bangalore, Kochi, Bhubaneshwar
Notice Period: 0 to 30 days
Interview Mode: Virtual
Interview Date: 13th Nov 2025
Time of Interview: 10 AM to 2 PM
Required Technical Skill Set: ...
-
PySpark Data Engineer
5 days ago
New Delhi, India | Tata Consultancy Services | Full time
Role: Data Engineer with mandatory PySpark knowledge
Required Technical Skill Set: Python, PySpark, BigQuery
Desired Experience Range: 5+ yrs
Notice Period: 0 to 60 days
Location: Bangalore, Pune, Chennai, Gurugram
Desired Competencies (Technical/Behavioral): Development, production support, and delivery of Python, BigQuery, SQL, GCP, Airflow based...
-
GCP BigQuery Developer
3 weeks ago
New Delhi, India | Tata Consultancy Services | Full time
Dear Candidate,
Greetings from TATA Consultancy Services!!
Thank you for expressing your interest in exploring a career possibility with the TCS family.
Hiring For: GCP BigQuery Developer
Location: Pune / Kochi
Experience: 6 to 10 yrs
Role: GCP Developer
Required Technical Skill Set: Strong in Python, Big Data, and ability to write SQLs for specialized data...
-
GCP Data Engineer
3 weeks ago
New Delhi, India | Impetus | Full time
Job Title: GCP Data Engineer
Experience: 4 to 7 years
Location: Bangalore / Gurgaon
Employment Type: Full-Time
About the Role: We are looking for an experienced GCP Data Engineer with a strong background in Big Data, PySpark, and Python, and hands-on experience with core Google Cloud Platform (GCP) services. The ideal candidate will be responsible for designing,...
-
Data Engineer
4 weeks ago
New Delhi, India | LTIMindtree | Full time
Role: GCP Big Data Engineer
Location: Bangalore & Gurgaon
Leadership role with 8-10 yrs experience
Skill Set: GCP, SQL, PySpark, ETL knowledge
Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI-SQL, GCP...
-
Lead Data Engineer
5 days ago
New Delhi, India | People Prime Worldwide | Full time
Important Note (Please Read Before Applying)
Do NOT apply if:
- You have less than 10 years or more than 12 years of total IT experience
- You do not have hands-on experience with Python and GCP Data Engineering projects
- You lack real-world experience in building and deploying ETL/ELT pipelines using GCP services (Dataflow, BigQuery, Composer, etc.)
- You have no...
-
Bengaluru, Delhi, Hyderabad, NCR, India | Tata Consultancy Services | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Role & Responsibilities / Key Qualifications:
- Experience with an open Data Lakehouse using Apache Iceberg
- Experience with Data Lakehouse architecture built on Apache Iceberg and Trino
- Experience building/managing a Lakehouse for large-scale data platforms: storage and computing aspects, data ingestion/governance, data management, and performance
- Experience with...
-
Software Engineer
1 day ago
Delhi, India | Sourcebae | Full time
Role: ETL Specialist (Mumbai)
Experience: 5 to 6 years
Location: Mumbai (Onsite)
Job Description: We are hiring an ETL Specialist with strong skills in PySpark, Python, SQL, Hive, Kafka, and GCP BigQuery. The candidate will build and maintain scalable ETL/ELT pipelines, manage big data processing, and work with streaming and batch workloads.
Key...