GCP Data Engineer
1 week ago
Job Summary:
We are seeking a Senior Data Engineer to join a Global Data Analytics & Insights team focused on transforming enterprise data management and democratizing access to insights. This role will lead a large-scale data migration from external agency-managed environments into an internal Google Cloud Platform (GCP) ecosystem. You will drive discovery, define and execute the migration strategy, rebuild ingestion pipelines, and ensure secure, compliant, always-on access for internal teams and approved external partners. The work directly supports global analytics initiatives by delivering scalable, high-quality, governed data products.

Roles and Responsibilities:
- Lead discovery and deep analysis of existing third-party data warehouses (e.g., Snowflake or hybrid setups), including data assets, lineage, dependencies, and integration patterns.
- Define and execute an end-to-end migration strategy to move data assets and workflows into internal GCP environments.
- Handle and migrate third-party API-based ingestion flows so inbound data lands in GCP data lakes/warehouses.
- Partner with product lines, business stakeholders, and downstream consumers (dashboards, CRM, analytics) to capture requirements and ensure smooth data consumption.
- Design, build, and maintain scalable ingestion and transformation pipelines using Python, SQL, PySpark, and DBT/Dataform.
- Develop and manage GCP-native data solutions using BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Dataproc, etc.
- Implement strong data governance, security, access controls, and compliance practices using GCP security capabilities.
- Integrate DevSecOps quality and security checks (e.g., SonarQube, FOSSA) into CI/CD pipelines, and respond to findings.
- Orchestrate workflows with Apache Airflow/Astronomer and provision infrastructure via Terraform (IaC best practices).
- Monitor and optimize performance, scalability, reliability, and cost-efficiency of pipelines and storage.
- Promote engineering best practices, reusable patterns, automation, and continuous improvement across teams.
- Produce clear documentation and communicate technical decisions effectively to technical and non-technical audiences.

Required Skills:
- Expert proficiency in Python (NumPy, Pandas), SQL, and PySpark.
- Strong production-grade GCP experience with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Cloud Run, and Data Fusion/Dataprep.
- Proven ability to build scalable ELT/ETL ingestion and curation pipelines.
- Hands-on experience with DBT and/or Dataform for transformations.
- Workflow orchestration expertise with Apache Airflow and/or Astronomer.
- Proven experience integrating and managing third-party APIs for data ingestion.
- CI/CD and DevOps exposure (e.g., Tekton) and strong GitHub-based version control.
- Infrastructure as Code with Terraform.
- Solid knowledge of data governance, encryption, masking, access management, and cloud security best practices.
- Strong understanding of modern data ecosystems: data warehouses/lakes, metadata, meshes/fabrics, and analytics/AI use cases.
- Experience with Agile delivery, user stories, and cross-functional collaboration.
- Willingness to work from office (Chennai location).

Preferred Skills:
- Direct experience with Snowflake migration, exploration, and optimization.
- Experience with Java.
- Familiarity with Master Data Management (MDM) concepts/tools.
- Exposure to additional security/performance tools (e.g., Checkmarx, Dynatrace).
- Working knowledge of GDPR and data privacy impact on architecture.

Required Experience:
- 5–7 years in Data Engineering or Software Engineering with data-intensive systems.
- 2+ years building and deploying production-scale cloud data platforms on GCP.
- Demonstrated leadership in delivering migration or large data engineering programs.
- Strong track record of optimizing compute/storage cost and performance in cloud environments.

Education:
- Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or related field (or equivalent practical experience).
-
Data Engineer
4 weeks ago
Kannur, India | HCLTech | Full time
Role: GCP Data Engineer
Location: Bangalore / Chennai / Hyderabad
Experience: 4 to 13 years
Hackathon: 22nd Nov at Chennai, Hyderabad, and Bangalore locations
Mandatory Skills: BigQuery; Cloud Composer or Airflow; Dataflow, Dataproc, or Data Fusion; Python or PySpark (any one is required); DBT; SQL; GKE
Nice to Have: Spanner, Harness, Gen AI, Looker
Interested...
-
Principal Data Engineer
1 week ago
Kannur, India | beBeeData | Full time
Job Summary: We seek a seasoned data professional to spearhead our data engineering initiatives. The ideal candidate will have extensive experience in designing and implementing scalable data pipelines using GCP services.
Key Responsibilities: Design, develop, and deploy complex data workflows using GCP services including Dataflow, BigQuery, and...
-
Data Engineering Leader
2 weeks ago
Kannur, India | beBeeDataEngineering | Full time
Data Engineering Leadership: As a seasoned Data Engineering Manager, you will lead the development and maintenance of large-scale web crawling systems. This role requires strategic vision, technical expertise, and team leadership skills.
Key Responsibilities: Team Oversight: Supervise a team of data engineers responsible for building and maintaining complex web...
-
Cloud Data Engineer Opportunity
1 week ago
Kannur, India | beBeeDataEngineer | Full time
Job Opportunity: We seek a skilled professional to fill the position of Cloud Data Engineer. This role requires expertise in Google Cloud Platform (GCP), specifically with BigQuery and Airflow, alongside strong SQL and Python skills. The ideal candidate is an independent problem solver who communicates effectively and delivers reliable data...
-
Real-Time Data Engineer
1 week ago
Kannur, India | beBeeFreelance | Full time
Job Opportunity: We are seeking a highly skilled Freelance Data Engineer to join our team and help us achieve our mission of protecting businesses and organizations against cyber threats. This is an exciting opportunity for a talented individual with expertise in Apache NiFi and Google Cloud Platform (GCP) services to design real-time data ingestion and...
-
Senior Cloud Data Engineering Specialist
2 weeks ago
Kannur, India | beBeeDataEngineer | Full time
About the Position: We are seeking an experienced Cloud Data Engineer to join our team. The successful candidate will have a strong background in designing, building, and optimizing scalable data pipelines using Snowflake and Databricks. The ideal candidate will be responsible for developing efficient data models, schemas, and data warehousing solutions using...
-
Senior Cloud Data Engineer
1 week ago
Kannur, India | beBeeDataEngineer | Full time
We are seeking a seasoned Data Engineer to join our team. This role involves designing, building, and maintaining large-scale data systems using Python, AWS, and SQL. Key responsibilities include architecting data pipelines, developing cloud-based data solutions, and ensuring data quality and integrity. The ideal candidate will have 5–7 years of experience in...
-
Senior Data Engineering Specialist
2 weeks ago
Kannur, India | beBeeDataEngineering | Full time
Senior Data Engineering Specialist: We are seeking a highly skilled data engineering professional to join our team. As a key member, you will collaborate closely with customers to ensure successful implementation of our distributed database solutions. Lead customer engagements to design and deploy scalable and efficient data systems, ensuring maximum...
-
Chief Data Engineering Specialist
1 week ago
Kannur, India | beBeeDataEngineer | Full time
Job Title: Data Architect
We are seeking an experienced professional to lead the development of our data engineering solutions and back-end services. The ideal candidate will have a strong understanding of microservices principles, data engineering best practices, and experience with building scalable solutions. They will be responsible for designing,...
-
Senior Big Data Engineer
1 week ago
Kannur, India | beBeeBigDataEngineer | Full time
Big Data Engineer: At our organization, we are seeking an experienced Big Data Engineer to develop and maintain scalable data pipelines using Apache Spark on Databricks. This role involves building end-to-end ETL/ELT pipelines on AWS/GCP/Azure using services like S3, Glue, Lambda, EMR, and Step Functions. The successful candidate will collaborate with data...