Current jobs related to GCP Data Engineer - Chennai, Tamil Nadu - Royal Cyber Inc.

  • GCP Data Engineer

    2 weeks ago


    Chennai, Tamil Nadu, India Qode Full time ₹ 1,04,000 - ₹ 1,30,878 per year

    GCP Data Engineer Location: Chennai Workplace Type: Hybrid About the Role We are seeking a highly skilled and experienced Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining our data infrastructure on Google Cloud Platform (GCP). You will work closely with data scientists, analysts, and...

  • GCP Data Engineer

    2 weeks ago


    Chennai, Tamil Nadu, India Qode Full time ₹ 6,00,000 - ₹ 8,00,000 per year

    GCP Data Engineer Location: Chennai Workplace Type: Hybrid About the Role We are seeking a highly skilled and experienced Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining our data infrastructure on Google Cloud Platform (GCP). You will work closely with data scientists, analysts,...

  • GCP Data Engineer

    1 week ago


    Chennai, Tamil Nadu, India Raah Techservices Full time ₹ 47,300 - ₹ 20,00,000 per year

    Job Summary: We are seeking an experienced and results-driven GCP Data Engineer with over 5 years of hands-on experience in building and optimizing data pipelines and architectures using Google Cloud Platform (GCP). The ideal candidate will have strong expertise in data integration, transformation, and modeling, with a focus on delivering scalable, efficient,...

  • GCP Data Engineer

    3 weeks ago


    Chennai, Tamil Nadu, India CustomerLabs 1P Data OPs Full time

    Position Overview: "Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present." - Master Oogway. Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure...

  • Sr GCP Data Engineer

    2 weeks ago


    Chennai, Tamil Nadu, India Kovan Technology Solutions Full time ₹ 1,04,000 - ₹ 1,30,878 per year

    Job description: Responsibilities: Design, develop, and maintain data pipelines on GCP. Implement data storage solutions and optimize data processing workflows. Ensure data quality and integrity throughout the data lifecycle. Collaborate with data scientists and analysts to understand data requirements. Monitor and maintain the health of the data...


  • Chennai, Tamil Nadu, India beBeeData Full time ₹ 18,00,000 - ₹ 24,00,000

    Job Overview: We are seeking a seasoned professional to lead the design, development, and maintenance of scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate will have expertise in GCP and a strong understanding of data warehousing concepts and ETL processes. The selected individual will be responsible for implementing data storage...

  • GCP Data Engineer

    3 weeks ago


    Chennai, Tamil Nadu, India Tata Consultancy Services Full time

    Greetings from TCS. TCS is hiring for GCP Data Engineer. Experience: 10 to 14 yrs. Location: Chennai, Pune, Bangalore, Hyderabad and Gurugram. Job Description: BigQuery / Dataproc - hands-on 5 yrs+; Google Cloud Storage - hands-on 5 yrs+; Pub/Sub / Cloud Dataflow - hands-on 5 yrs+


  • Chennai, Tamil Nadu, India Getronics Full time

    Greetings from Getronics. We have multiple opportunities for Senior GCP Data Engineers for our automotive client at the Chennai location. Position Description: The Data Analytics team is seeking a GCP Data Engineer to create, deliver, and support custom data products, as well as enhance/expand team capabilities. They will work on analysing and manipulating large...

GCP Data Engineer

2 weeks ago


Chennai, Tamil Nadu, India Royal Cyber Inc. Full time US$125,000 - US$175,000 per year
Job Title: Lead GCP Data Engineer (Senior Level) Reports to: SVP, Head of Data, Technology & Analytics Location: Remote – Global (must be available through 2 p.m. U.S. Eastern Time) Employment Type: Full-time
• Long-term Contract (Annual Renewal)
________________________________________
Summary
We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team. This role is critical to the development and operation of cloud-native, AI-driven enterprise data products that power global media planning and analytics. As a Senior Data Engineer, you will architect, build, and maintain scalable, secure, and optimized data solutions on Google Cloud Platform (GCP). Your focus will be on developing robust ELT pipelines, streaming workloads, API-based ingestion frameworks, and orchestration using tools such as Apache Spark, Airflow (Cloud Composer), and BigQuery. You'll operate in a fast-paced environment, supporting data-driven innovation across cross-functional teams and ensuring reliability, compliance, and cost efficiency in all workflows.
________________________________________
Key Responsibilities
Data Engineering & Development
• Design, build, and optimize scalable ELT/ETL pipelines to process structured and unstructured data across batch and streaming systems.
• Architect and deploy cloud-native data workflows using GCP services including BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, Dataflow, and Cloud Composer.
• Build high-throughput Apache Spark workloads in Python and SQL, with performance tuning for scale and cost.
• Develop parameterized DAGs in Apache Airflow with retry logic, alerting, SLA/SLO enforcement, and robust monitoring.
• Build reusable frameworks for high-volume API ingestion, transforming Postman collections into production-ready Python modules.
• Translate business and product requirements into scalable, efficient data systems that are reliable and secure.
Cloud Infrastructure & Security
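The pipeline-reliability items above (retry logic, alerting hooks, reusable API ingestion frameworks) can be sketched with a minimal retry-with-backoff helper of the kind such ingestion modules typically wrap around HTTP calls. All names and parameters here are illustrative, not taken from the posting; production frameworks add jitter, alerting integrations, and per-error-class policies on top of this core loop.

```python
import time

def with_retries(fn, max_attempts=3, base_delay=0.1, sleep=time.sleep):
    """Call fn(); on failure, retry with exponential backoff.

    Illustrative sketch: max_attempts, base_delay, and the injectable
    sleep are hypothetical parameters chosen for demonstration.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted: surface the error so alerting can fire
            sleep(base_delay * 2 ** (attempt - 1))  # 0.1s, 0.2s, 0.4s, ...

# Example: a (simulated) API call that fails twice, then succeeds.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream error")
    return {"status": "ok"}

# Injecting a no-op sleep keeps the example instant to run.
result = with_retries(flaky_fetch, max_attempts=5, sleep=lambda s: None)
```

Injecting the sleep function keeps the loop testable; in a real ingestion module that slot would stay as `time.sleep` and each failed attempt would also emit a metric or alert.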
• Implement IAM and VPC-based security to manage and deploy GCP infrastructure for secure data operations.
• Ensure robustness, scalability, and cost-efficiency of all infrastructure, following FinOps best practices.
• Apply automation through CI/CD pipelines using tools like Git, Jenkins, or Bitbucket.
Data Quality, Governance & Optimization
• Design and implement data quality frameworks, monitoring, validation, and anomaly detection.
• Build observability dashboards to ensure pipeline health and proactively address issues.
• Ensure compliance with data governance policies, privacy regulations, and security standards.
Collaboration & Project Delivery
• Work closely with cross-functional stakeholders including data scientists, analysts, DevOps, product managers, and business teams.
• Effectively communicate technical solutions to non-technical stakeholders.
• Manage multiple concurrent projects, shifting priorities quickly and delivering under tight timelines.
• Collaborate within a globally distributed team with real-time engagement through 2 p.m. U.S. Eastern Time.
________________________________________
Qualifications & Certifications
Education
• Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Experience
• Minimum 7 years in data engineering with 5 years of hands-on experience on GCP.
• Proven track record with tools and services like BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
• Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations.
• Advanced Airflow skills including custom operators, dynamic DAGs, and workflow performance tuning.
Certifications
• Google Cloud Professional Data Engineer certification preferred.
________________________________________
Key Skills
Mandatory Technical Skills
• Advanced Python (PySpark, Pandas, pytest) for automation and data pipelines.
• Strong SQL with experience in window functions, CTEs, partitioning, and optimization.
• Proficiency in GCP services including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage.
• Hands-on with Apache Airflow, including dynamic DAGs, retries, and SLA enforcement.
• Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows.
• Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket.
• Experience with infrastructure security and governance using IAM and VPC.
Nice-to-Have Skills
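The SQL skills listed above (CTEs, window functions, partitioning) can be illustrated with a small runnable snippet using Python's built-in sqlite3 module; the table and column names are invented for demonstration and do not come from the posting.

```python
import sqlite3

# In-memory toy table of pipeline run durations (hypothetical data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE runs (pipeline TEXT, run_id INTEGER, seconds REAL);
INSERT INTO runs VALUES
  ('ingest', 1, 120.0), ('ingest', 2, 90.0),
  ('transform', 1, 300.0), ('transform', 2, 250.0);
""")

# A CTE plus a window function: rank each pipeline's runs by duration,
# partitioned per pipeline -- the pattern the skills list refers to.
query = """
WITH ranked AS (
  SELECT pipeline, run_id, seconds,
         ROW_NUMBER() OVER (
           PARTITION BY pipeline ORDER BY seconds
         ) AS rn
  FROM runs
)
SELECT pipeline, run_id, seconds
FROM ranked
WHERE rn = 1
ORDER BY pipeline;
"""
fastest = conn.execute(query).fetchall()
# fastest holds the quickest run per pipeline
```

The same CTE/window-function pattern carries over to BigQuery Standard SQL, where `PARTITION BY` additionally interacts with table partitioning for cost control.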
• Experience with Terraform or Kubernetes (GKE).
• Familiarity with data visualization tools such as Looker or Tableau.
• Exposure to MarTech/AdTech data sources and campaign analytics.
• Knowledge of machine learning workflows and their integration with data pipelines.
• Experience with other cloud platforms like AWS or Azure.
________________________________________
Soft Skills
• Strong problem-solving and critical-thinking abilities.
• Excellent verbal and written communication skills to engage technical and non-technical stakeholders.
• Proactive and adaptable, with a continuous learning mindset.
• Ability to work independently as well as within a collaborative, distributed team.
________________________________________
Working Hours
• Must be available for real-time collaboration with U.S. stakeholders every business day through 2 p.m. U.S. Eastern Time (minimum 4-hour overlap).