Python/PySpark, BigQuery with GCP, Apache Iceberg

7 days ago


New Delhi, India Tata Consultancy Services Full time

TCS Hiring Virtual Drive: 12-Nov-25
Role: Python/PySpark, BigQuery with GCP, Apache Iceberg
Venue: TCS, Hyderabad
Time: 12 PM to 1 PM
Experience: 5 to 7 years (Immediate Joiners)

Please read the job description before applying.

NOTE: If your skills/profile match and you are interested, please reply to this email with your latest updated CV and the details below:
Name:
Contact Number:
Email ID:
Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc./etc.)
Current Organization Name:
Total IT Experience (5 to 7 years):
Location: Hyderabad
Current CTC:
Expected CTC:
Notice Period: Immediate
Whether previously worked with TCS: Y/N

Must-Have (ideally no more than 3-5):
- Strong proficiency in Python programming.
- Hands-on experience with PySpark and Apache Spark.
- Knowledge of Big Data technologies (Hadoop, Hive, Kafka, etc.).
- Experience with SQL and relational/non-relational databases.
- Familiarity with distributed computing and parallel processing.
- Understanding of data engineering best practices.
- Experience with REST APIs, JSON/XML, and data serialization.
- Exposure to cloud computing environments.

Good-to-Have:
- 5+ years of experience in Python and PySpark development.
- Experience with data warehousing and data lakes.
- Knowledge of machine learning libraries (e.g., MLlib) is a plus.
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration abilities.

Responsibilities of / Expectations from the Role:
- Develop and maintain scalable data pipelines using Python and PySpark.
- Design and implement ETL (Extract, Transform, Load) processes.
- Optimize and troubleshoot existing PySpark applications for performance.
- Collaborate with cross-functional teams to understand data requirements.
- Write clean, efficient, and well-documented code.
- Conduct code reviews and participate in design discussions.
- Ensure data integrity and quality across the data lifecycle.
- Integrate with cloud platforms like AWS, Azure, or GCP.
- Implement data storage solutions and manage large-scale datasets.
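As an illustration of the kind of ETL transform logic this role describes, here is a minimal sketch. All names (`clean_record`, `transform`, the field names) are hypothetical, not from any actual TCS codebase; the cleaning step is written in plain Python so the example is self-contained, with comments noting where PySpark and BigQuery would slot in for a real pipeline.

```python
# Hypothetical ETL transform sketch. In a real PySpark job, this logic
# would be expressed over a DataFrame read from and written back to
# BigQuery; plain Python is used here so the example runs anywhere.

def clean_record(record):
    """Normalize one raw record: coerce types, trim strings, reject bad rows."""
    try:
        return {
            "id": int(record["id"]),
            "name": str(record["name"]).strip().title(),
            "amount": round(float(record["amount"]), 2),
        }
    except (KeyError, ValueError, TypeError):
        # Records that fail validation are dropped; a production pipeline
        # would route them to a dead-letter table for data-quality review.
        return None

def transform(records):
    """Transform step: keep only records that clean successfully."""
    cleaned = (clean_record(r) for r in records)
    return [r for r in cleaned if r is not None]

raw = [
    {"id": "1", "name": "  alice ", "amount": "10.50"},
    {"id": "x", "name": "broken", "amount": "1"},   # invalid id -> dropped
    {"id": "2", "name": "bob", "amount": "3.1"},
]
print(transform(raw))
```

In an actual PySpark pipeline, the same validation would typically be done with DataFrame operations (`withColumn`, `filter`, `cast`) rather than a row-at-a-time Python function, since native column expressions let Spark parallelize and optimize the work.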


  • Data Engineer

    4 weeks ago


    New Delhi, India Tata Consultancy Services Full time

    Greetings from Tata Consultancy Services! Role: Data Engineer with mandatory PySpark and Python knowledge. Desired Experience Range: 5 to 10 yrs. Location: Pune, Hyderabad, Chennai, Bangalore, Kochi, Bhubaneshwar. Notice Period: 0 to 30 days. Interview Mode: Virtual. Interview Date: 13th Nov 2025. Time of Interview: 10 AM to 2 PM. Required Technical Skill Set:...

  • GCP Data Engineer

    3 weeks ago


    New Delhi, India Tata Consultancy Services Full time

    Greetings from Tata Consultancy Services. TCS is hiring for a GCP Data Engineer. Experience: 5-10 years. Location: Pune, Hyderabad, Bangalore, Chennai, Gurgaon. Job description: expertise in Google data engineering tools (BigQuery, Dataproc); strong programming skills in Python and PySpark for data manipulation; proficient in SQL and NoSQL databases; solid...

  • PySpark Data Engineer

    4 weeks ago


    New Delhi, India Tata Consultancy Services Full time

    Role: Data Engineer with mandatory PySpark knowledge. Required Technical Skill Set: Python, PySpark, BigQuery. Desired Experience Range: 5+ yrs. Notice Period: 0-60 days. Location: Bangalore, Pune, Chennai, Gurugram. Desired Competencies (Technical/Behavioral Competency): development, production support, and delivery of Python, BigQuery, SQL, GCP, Airflow based...

  • GCP Data Engineer

    2 days ago


    New Delhi, India Intellistaff Services Pvt. Ltd Full time

    Job description: Data Engineer with Python and GCP. Experience Level: 5 to 9 years. Location: Chennai. Must-Have Skillset: SQL (4+ years); Python or PySpark (4+ years); GCP services (3+ years): BigQuery, Dataflow or Dataproc, Pub/Sub, Scheduled Query, Cloud Functions, monitoring tools/dashboards; Apache Kafka; Terraform scripting (2+ years); Airflow/Astronomer/Cloud Composer (2+...



  • New Delhi, India ThreatXIntel Full time

    Company Description: ThreatXIntel is a startup specializing in cybersecurity, offering tailored and cost-effective solutions for businesses and organizations of all sizes. Our expertise spans cloud security, web and mobile security testing, cloud security assessments, DevSecOps, and more. Driven by a proactive approach, we monitor and test our clients' digital...


  • Delhi, India ThreatXIntel Full time

    Company Description: ThreatXIntel is a dynamic cybersecurity startup dedicated to protecting businesses and organizations against cyber threats. Specializing in services such as cloud security, DevSecOps, and vulnerability testing, the company offers tailored, cost-effective solutions to meet clients' unique requirements. Focused on startups and small...


  • New Delhi, India Impetus Full time

    Job Title: Lead Data Engineer – GCP (BigQuery • Composer • Python • PySpark). Location: Gurgaon. Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities. About the Role: You will lead the design, build, and operation of large-scale data platforms on the Google Cloud Platform. You will manage a team of data...

