Python/PySpark, BigQuery with GCP, Apache Iceberg
4 weeks ago
TCS Hiring Virtual Drive | 12-Nov-25
Role: Python/PySpark, BigQuery with GCP, Apache Iceberg
Location: TCS, Hyderabad
Time: 12 PM to 1 PM
Experience: 5 to 7 years (immediate joiners)

Please read the job description before applying.

NOTE: If your skills/profile match and you are interested, please reply to this email with your latest updated CV and the details below:
Name:
Contact Number:
Email ID:
Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc./etc.)
Current Organization Name:
Total IT Experience: 5 to 7 years
Location: Hyderabad
Current CTC:
Expected CTC:
Notice Period: Immediate
Whether worked with TCS: Y/N

Must-Have:
- Strong proficiency in Python programming.
- Hands-on experience with PySpark and Apache Spark.
- Knowledge of Big Data technologies (Hadoop, Hive, Kafka, etc.).
- Experience with SQL and relational/non-relational databases.
- Familiarity with distributed computing and parallel processing.
- Understanding of data engineering best practices.
- Experience with REST APIs, JSON/XML, and data serialization.
- Exposure to cloud computing environments.

Good-to-Have:
- 5+ years of experience in Python and PySpark development.
- Experience with data warehousing and data lakes.
- Knowledge of machine learning libraries (e.g., MLlib) is a plus.
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration abilities.

Responsibilities / Expectations from the Role:
- Develop and maintain scalable data pipelines using Python and PySpark.
- Design and implement ETL (Extract, Transform, Load) processes.
- Optimize and troubleshoot existing PySpark applications for performance.
- Collaborate with cross-functional teams to understand data requirements.
- Write clean, efficient, and well-documented code.
- Conduct code reviews and participate in design discussions.
- Ensure data integrity and quality across the data lifecycle.
- Integrate with cloud platforms like AWS, Azure, or GCP.
- Implement data storage solutions and manage large-scale datasets.
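The ETL responsibilities described above can be sketched as a minimal extract-transform-load flow. The example below uses plain Python lists of dictionaries as a lightweight stand-in for PySpark DataFrames so it runs without a Spark cluster; all record fields and names are invented for illustration:

```python
import json

def extract(raw_json: str) -> list[dict]:
    # Extract: parse raw JSON records (stand-in for reading from an API or file).
    return json.loads(raw_json)

def transform(records: list[dict]) -> list[dict]:
    # Transform: drop invalid rows and derive a new column.
    return [
        {**r, "revenue_usd": r["units"] * r["unit_price"]}
        for r in records
        if r.get("units", 0) > 0
    ]

def load(records: list[dict]) -> dict[str, float]:
    # Load: aggregate into a per-region summary (stand-in for writing to a warehouse).
    summary: dict[str, float] = {}
    for r in records:
        summary[r["region"]] = summary.get(r["region"], 0.0) + r["revenue_usd"]
    return summary

raw = (
    '[{"region": "south", "units": 3, "unit_price": 2.5},'
    ' {"region": "south", "units": 0, "unit_price": 9.0},'
    ' {"region": "north", "units": 2, "unit_price": 4.0}]'
)
print(load(transform(extract(raw))))  # {'south': 7.5, 'north': 8.0}
```

In an actual PySpark job, the same three stages would typically be DataFrame operations instead: a `spark.read.json(...)` for extract, `filter`/`withColumn` for transform, and `groupBy(...).sum(...)` followed by a write for load.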
-
Data Engineer
4 weeks ago
New Delhi, India | Tata Consultancy Services | Full time

Greetings from Tata Consultancy Services!
Role: Data Engineer with mandatory PySpark and Python knowledge
Desired Experience Range: 5 to 10 yrs
Location: Pune, Hyderabad, Chennai, Bangalore, Kochi, Bhubaneshwar
Notice Period: 0 to 30 days
Interview Mode: Virtual
Interview Date: 13th Nov 2025
Time of Interview: 10 AM to 2 PM
Required Technical Skill Set:...
-
GCP Data Engineer
3 weeks ago
New Delhi, India | Tata Consultancy Services | Full time

Greetings from Tata Consultancy Services
TCS is hiring for GCP Data Engineer
Experience: 5-10 years
Location: Pune, Hyderabad, Bangalore, Chennai, Gurgaon
Job description:
- Expertise in Google Data Engineering tools (BigQuery, Dataproc).
- Strong programming skills in Python and PySpark for data manipulation.
- Proficient in SQL and NoSQL databases.
- Solid...
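As a rough illustration of the SQL proficiency such roles ask for, the snippet below runs an aggregation through Python's built-in sqlite3 module as a lightweight stand-in for BigQuery; the table, columns, and data are invented, but the same `GROUP BY` query would be valid BigQuery standard SQL:

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "purchase", 10.0), ("u1", "purchase", 5.0),
     ("u2", "refund", -3.0), ("u2", "purchase", 8.0)],
)

# Per-user net amount across purchase/refund events, largest first.
rows = conn.execute(
    """
    SELECT user_id, SUM(amount) AS net_amount
    FROM events
    GROUP BY user_id
    ORDER BY net_amount DESC
    """
).fetchall()
print(rows)  # [('u1', 15.0), ('u2', 5.0)]
```

On GCP, the equivalent would be submitting this query string through the `google-cloud-bigquery` client rather than sqlite3.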
-
PySpark Data Engineer
4 weeks ago
New Delhi, India | Tata Consultancy Services | Full time

Role: Data Engineer with mandatory PySpark knowledge
Required Technical Skill Set: Python, PySpark, BigQuery
Desired Experience Range: 5+ yrs
Notice Period: 0-60 days
Location: Bangalore, Pune, Chennai, Gurugram
Desired Competencies (Technical/Behavioral):
- Development, production support and delivery of Python, BigQuery, SQL, GCP, Airflow based...
-
GCP Data Engineer
3 days ago
New Delhi, India | Intellistaff Services Pvt. Ltd | Full time

Job description: Data Engineer with Python and GCP
Experience Level: 5 to 9 years
Location: Chennai
Must-Have Skillset:
- SQL (4+ years)
- Python or PySpark (4+ years)
- GCP services (3+ years): BigQuery, Dataflow or Dataproc, Pub/Sub, Scheduled Query, Cloud Functions, Monitoring Tools, Dashboard
- Apache Kafka
- Terraform scripting (2+ years)
- Airflow/Astronomer/Cloud Composer (2+...
-
Freelance GCP Data Engineer
2 days ago
New Delhi, India | ThreatXIntel | Full time

Company Description: ThreatXIntel is a startup specializing in cybersecurity, offering tailored and cost-effective solutions for businesses and organizations of all sizes. Our expertise spans cloud security, web and mobile security testing, cloud security assessments, DevSecOps, and more. Driven by a proactive approach, we monitor and test our clients' digital...
-
Freelance – Apache NiFi
2 days ago
Delhi, India | ThreatXIntel | Full time

Company Description: ThreatXIntel is a dynamic cybersecurity startup dedicated to protecting businesses and organizations against cyber threats. Specializing in services such as cloud security, DevSecOps, and vulnerability testing, the company offers tailored, cost-effective solutions to meet clients' unique requirements. Focused on startups and small...
-
Lead GCP Data Engineer
2 weeks ago
New Delhi, India | Impetus | Full time

Job Title: Lead Data Engineer – GCP (BigQuery, Composer, Python, PySpark)
Location: Gurgaon
Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities
About the Role: You will lead the design, build and operation of large-scale data platforms on the Google Cloud Platform. You will manage a team of data...