Data Engineer

NS Global Corporation | Any Location, India | Full time | ₹ 6,00,000 - ₹ 18,00,000 per year | Posted 1 week ago

We are looking for an experienced GCP Data Engineer with a minimum of 5 years of professional experience in data engineering, cloud-based data solutions, and large-scale distributed systems. This role is fully remote and requires a hands-on professional who can design, build, and optimize data pipelines and solutions on Google Cloud Platform (GCP).

Key Responsibilities:


- Architect, design, and implement highly scalable data pipelines and ETL workflows leveraging GCP services.

- Develop and optimize data ingestion, transformation, and storage frameworks to support analytical and operational workloads.

- Work extensively with BigQuery, Dataflow, Pub/Sub, Dataproc, Data Fusion, Cloud Composer, and Cloud Storage to design robust data solutions (an illustrative pipeline sketch follows this list).

- Create and maintain efficient data models and schemas for analytical reporting, machine learning pipelines, and real-time processing.

- Collaborate closely with data scientists, analysts, and business stakeholders to understand requirements and convert them into technical data solutions.

- Implement best practices for data governance, security, privacy, and compliance across the entire data lifecycle.

- Monitor, debug, and optimize pipeline performance, ensuring minimal latency and high throughput.

- Design and maintain APIs and microservices for data integration across platforms.

- Perform advanced data quality checks, anomaly detection, and validation to ensure data accuracy and consistency.

- Implement CI/CD for data engineering projects using GCP-native DevOps tools.

- Stay updated with emerging GCP services and industry trends to continuously improve existing solutions.

- Create detailed documentation for data processes, workflows, and standards to enable smooth knowledge transfer.

- Support the migration of on-premises data systems to GCP, ensuring zero downtime and an efficient cutover.

- Automate repetitive workflows, deployment processes, and monitoring systems using Python, Shell scripting, or Terraform.

- Provide mentoring and technical guidance to junior data engineers in the team.
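To give candidates a concrete picture of the pipeline work described above, here is a minimal, illustrative sketch (not this team's actual code): an Apache Beam streaming job, runnable on Dataflow, that reads JSON events from Pub/Sub and appends them to a BigQuery table. The project, topic, table, and field names are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal Apache Beam streaming job of the kind
# described above. It reads JSON events from Pub/Sub and appends them to a
# BigQuery table. Project, topic, table, and field names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a row matching the BigQuery table schema."""
    event = json.loads(message.decode("utf-8"))
    return {
        "event_id": event["event_id"],
        "user_id": event["user_id"],
        "event_ts": event["event_ts"],
        "payload": json.dumps(event.get("payload", {})),
    }


def run() -> None:
    # Runner, project, region, etc. are supplied as flags at deploy time.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events"  # hypothetical topic
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

A production version would add windowing, dead-letter handling for malformed messages, and schema management; hardening pipelines in exactly this way is part of the role.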

Required Skills & Experience:


- 5+ years of experience in data engineering with a strong focus on cloud-based data solutions.

- Hands-on expertise with Google Cloud Platform (GCP) and services including BigQuery, Dataflow, Pub/Sub, Dataproc, Data Fusion, Cloud Composer, and Cloud Storage.

- Strong proficiency in SQL, including query optimization, performance tuning, and working with large datasets (an illustrative table-design sketch follows this list).

- Advanced programming skills in Python, Java, or Scala for building data pipelines.

- Experience with real-time data streaming frameworks such as Apache Kafka or Google Pub/Sub.

- Solid knowledge of ETL/ELT processes, data modeling (star/snowflake), and schema design for both batch and streaming use cases.

- Proven track record of building data lakes, warehouses, and pipelines that can scale with enterprise-level workloads.

- Experience integrating diverse data sources including APIs, relational databases, flat files, and unstructured data.

- Knowledge of Terraform, Infrastructure as Code (IaC), and automation practices in cloud environments.

- Understanding of CI/CD pipelines for data engineering workflows and integration with Git, Jenkins, or Cloud Build.

- Strong background in data governance, lineage, and cataloging tools.

- Familiarity with machine learning workflows and enabling ML pipelines using GCP services is an advantage.

- Good grasp of Linux/Unix environments and shell scripting.

- Exposure to DevOps practices and monitoring tools such as Cloud Logging and Cloud Monitoring (formerly Stackdriver).

- Excellent problem-solving, debugging, and analytical skills with the ability to handle complex technical challenges.

- Strong communication skills with the ability to work independently in a remote-first team environment.
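As an illustration of the SQL and schema-design skills listed above, the sketch below uses the google-cloud-bigquery Python client to create a date-partitioned, clustered table, a common first step in keeping analytical queries from scanning a full fact table. The project, dataset, table, and column names are hypothetical placeholders.

```python
# Illustrative sketch only: creating a date-partitioned, clustered BigQuery table
# with the google-cloud-bigquery client. Partitioning on the event timestamp and
# clustering on user_id means queries filtered on those columns scan far less data.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "JSON", mode="NULLABLE"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["user_id"]

client.create_table(table, exists_ok=True)
```

Queries that filter on event_ts and user_id then prune partitions and clustered blocks automatically, which is where much of the BigQuery performance tuning this role calls for begins.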

Nice-to-Have Skills:

- Experience with multi-cloud or hybrid environments (AWS/Azure alongside GCP).

- Familiarity with data visualization platforms such as Looker, Tableau, or Power BI.

- Exposure to containerization technologies such as Docker and Kubernetes.

- Understanding of big data processing frameworks like Spark, Hadoop, or Flink.

- Prior experience in industries with high data volume such as finance, retail, healthcare, or telecom.

Educational Background:

- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.

- Relevant GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect) are highly preferred.

Why Join Us?

- Opportunity to work on cutting-edge cloud data projects at scale.

- Fully remote working environment with flexible schedules.

- Exposure to innovative data engineering practices and advanced GCP tools.

- Collaborative team culture that values continuous learning, innovation, and career growth.

