Cloud Data Pipelines Specialist

2 days ago


Pune, Maharashtra, India beBeeDataEngineer Full time ₹ 1,50,000 - ₹ 28,00,000

The Data Engineer role involves designing, developing, and maintaining data pipelines in Python and SQL on Google Cloud Platform (GCP). The ideal candidate will have experience with Agile methodologies and strong ETL, ELT, data-movement, and data-processing skills. They will use Cloud Composer to manage and process batch data jobs efficiently, and will develop and optimize complex SQL queries for data analysis, extraction, and transformation.
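Cloud Composer runs batch jobs as Airflow DAGs, i.e. tasks ordered by their dependencies. As a rough, hedged illustration of the ordering such a pipeline encodes (task names are hypothetical, not from this posting), in plain Python:

```python
from graphlib import TopologicalSorter

# Hypothetical batch pipeline: "extract" must finish before "transform",
# and "transform" before both "load" and the data-quality checks.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
    "quality_check": {"transform"},
}

# graphlib resolves a valid execution order for the tasks,
# much as an orchestrator schedules a DAG run.
order = list(TopologicalSorter(dag).static_order())
print(order)  # 'extract' first, then 'transform', then the two downstream tasks
```

In a real Composer environment the same dependencies would be declared between Airflow operators rather than dict entries.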

Additional responsibilities include provisioning and deploying GCP services with Terraform and implementing CI/CD pipelines with GitHub Actions. The candidate will also be expected to consume and host REST APIs in Python, and to monitor and troubleshoot data pipelines, resolving issues promptly.
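"Consume and host REST APIs using Python" can be sketched end to end with just the standard library. This is a minimal illustration, not the employer's stack: the `/health` endpoint and its payload are hypothetical, and a production service would more likely use a framework such as Flask or FastAPI behind Cloud Run.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hosting: a tiny REST endpoint (path and payload are illustrative).
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep request logging quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Consuming: call the same endpoint with the standard library client.
url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)

server.shutdown()
print(payload)  # {'status': 'ok'}
```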

A successful Data Engineer will be able to learn new technologies quickly and will possess strong problem-solving skills. The Google Cloud Professional Data Engineer certification is highly desirable but not mandatory.

To apply for this role, candidates should have at least 6 years of hands-on IT experience. Proficiency in Python for data engineering and in SQL is essential, as is experience with the GCP services Cloud Composer, Dataflow, BigQuery, Cloud Functions, Cloud Run, and Google Kubernetes Engine (GKE). Proficiency in HashiCorp Terraform, experience with Git and GitHub Actions, and knowledge of CI/CD are also required.

Candidates should also have hands-on experience automating ETL testing with Python and SQL. Experience with API Gateway and Bitbucket is desirable but not essential.
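Automated ETL testing with Python and SQL typically means asserting properties of a transformed table. A hedged sketch using in-memory SQLite (table and column names are illustrative; the real checks would run against BigQuery):

```python
import sqlite3

# Illustrative source data, including a duplicate row the ETL should drop.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [(1, "west"), (1, "west"), (2, "east")],
)

# The "transform" under test: deduplicate and normalise casing.
conn.execute(
    "CREATE TABLE clean_orders AS "
    "SELECT DISTINCT order_id, UPPER(region) AS region FROM raw_orders"
)

# Assertions a pytest-style ETL test might make.
(row_count,) = conn.execute("SELECT COUNT(*) FROM clean_orders").fetchone()
assert row_count == 2, "duplicates should be removed"
(bad_case,) = conn.execute(
    "SELECT COUNT(*) FROM clean_orders WHERE region != UPPER(region)"
).fetchone()
assert bad_case == 0, "region should be uppercased"
print("ETL checks passed")
```

The same pattern scales up: run the pipeline, then assert row counts, uniqueness, and value constraints with SQL from a test runner.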



  • Pune, Maharashtra, India beBeeAutomation Full time ₹ 9,00,000 - ₹ 12,00,000

    We are seeking a skilled Data Pipeline Automation Specialist to join our team. The ideal candidate will have extensive experience in designing, developing and maintaining automated testing frameworks for large-scale data pipelines. The successful candidate will be responsible for ensuring the stability and quality of our data pipeline solutions. This is an...


  • Pune, Maharashtra, India beBeeData Full time ₹ 1,80,00,000 - ₹ 2,00,00,000

    Job Title: Data Architect Specialist. Description: We are looking for a skilled Data Architect Specialist to craft and maintain robust data pipelines, utilizing Python and SQL, to ensure efficient extraction, transformation, and loading (ETL) of data. This role involves building and optimizing data pipelines to facilitate seamless data flow across systems and...


  • Pune, Maharashtra, India beBeeDataEngineer Full time ₹ 80,00,000 - ₹ 2,00,00,000

    Job Role: We seek an expert in building and maintaining scalable, high-performance data pipelines using Python and PySpark. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services including BigQuery, Dataflow, and Cloud Functions. Data Responsibilities: Design and build efficient data pipelines using Python and...


  • Pune, Maharashtra, India beBeeDataEngineer Full time ₹ 15,00,000 - ₹ 25,00,000

    Job Overview: We are seeking a seasoned Data Engineer to join our team. This role involves crafting and maintaining robust data pipelines using Python and SQL to ensure efficient extraction, transformation, and loading (ETL) of data. The ideal candidate will have expertise in building and optimizing data pipelines to facilitate seamless data flow across systems...


  • Pune, Maharashtra, India beBeeData Full time ₹ 1,80,00,000 - ₹ 2,50,00,000

    Senior Data Engineer - AWS Expert. We are seeking a seasoned professional to spearhead the design, development, and deployment of end-to-end data pipelines on AWS cloud infrastructure. Key Responsibilities: Design scalable data processing workflows using Apache Spark and SQL for analytics and reporting requirements. Build and maintain orchestration workflows to...


  • Pune, Maharashtra, India beBeeDataEngineer Full time ₹ 25,00,000 - ₹ 35,00,000

    Job Description. Position Overview: We're seeking a skilled Data Engineer to join our team. The successful candidate will be responsible for maintaining and supporting data transformation pipelines from Landing Zone to Consumption Layer, ensuring seamless operations in GCP and SAP BW environments. Key Responsibilities: Maintain and support data transformation...


  • Pune, Maharashtra, India beBeeData Full time ₹ 2,00,00,000 - ₹ 2,50,00,000

    Job Title: Senior Cloud Data Specialist. We are seeking a highly skilled Senior Cloud Data Specialist to join our team. The ideal candidate will have extensive experience working with cloud-based technologies, including BigQuery and Dataproc. The successful candidate will have at least 5 years of experience in each of these areas, as well as a solid...


  • Pune, Maharashtra, India beBeeCloudDataSpecialist Full time ₹ 2,00,00,000 - ₹ 2,50,00,000

    Cloud Data Specialist Job. We're looking for a skilled Cloud Data Specialist to drive data platform innovation and modern software development methodologies. Main Responsibilities: Design, develop, and implement scalable solutions using Azure Data Services, including Data Lakes, Blob Storage, Power BI, and Databricks. Collaborate with cross-functional teams to...


  • Pune, Maharashtra, India beBeeData Full time ₹ 2,00,00,000 - ₹ 2,50,00,000

    Cloud Migration Specialist. We are seeking a highly skilled and experienced Cloud Migration Specialist to join our team. The ideal candidate will have a minimum of 10 years of experience in data engineering and ETL, with a strong background in migrating on-premise data systems to the cloud. The successful candidate will design, develop, and implement complex...


  • Pune, Maharashtra, India beBeeDataEngineering Full time ₹ 12,00,000 - ₹ 15,00,000

    Job Title: Data Engineering Specialist. This is an exciting opportunity to join a team of talented professionals in designing and implementing data pipelines and workflows using Snowflake and Matillion. The successful candidate will be responsible for collaborating with cross-functional teams to gather requirements, define data models, and develop solutions...