Data Engineer
2 weeks ago
Job Description:
We are looking for an experienced GCP Data Engineer with a minimum of 5 years of professional experience in data engineering, cloud-based data solutions, and large-scale distributed systems. This role is fully remote and requires a hands-on professional who can design, build, and optimize data pipelines and solutions on Google Cloud Platform (GCP).
Key Responsibilities:
- Architect, design, and implement highly scalable data pipelines and ETL workflows leveraging GCP services.
- Develop and optimize data ingestion, transformation, and storage frameworks to support analytical and operational workloads.
- Work extensively with BigQuery, Dataflow, Pub/Sub, Dataproc, Data Fusion, Cloud Composer, and Cloud Storage to design robust data solutions.
- Create and maintain efficient data models and schemas for analytical reporting, machine learning pipelines, and real-time processing.
- Collaborate closely with data scientists, analysts, and business stakeholders to understand requirements and convert them into technical data solutions.
- Implement best practices for data governance, security, privacy, and compliance across the entire data lifecycle.
- Monitor, debug, and optimize pipeline performance, ensuring low latency and high throughput.
- Design and maintain APIs and microservices for data integration across platforms.
- Perform advanced data quality checks, anomaly detection, and validation to ensure data accuracy and consistency.
- Implement CI/CD for data engineering projects using GCP-native DevOps tools.
- Stay updated with emerging GCP services and industry trends to continuously improve existing solutions.
- Create detailed documentation for data processes, workflows, and standards to enable smooth knowledge transfer.
- Support the migration of on-premises data systems to GCP, ensuring zero downtime and an efficient cutover.
- Automate repetitive workflows, deployment processes, and monitoring systems using Python, Shell scripting, or Terraform.
- Provide mentoring and technical guidance to junior data engineers in the team.
Required Skills & Experience:
- 5+ years of experience in data engineering with a strong focus on cloud-based data solutions.
- Hands-on expertise with Google Cloud Platform (GCP) and services including BigQuery, Dataflow, Pub/Sub, Dataproc, Data Fusion, Cloud Composer, and Cloud Storage.
- Strong proficiency in SQL, including query optimization, performance tuning, and working with large datasets.
- Advanced programming skills in Python, Java, or Scala for building data pipelines.
- Experience with real-time data streaming frameworks such as Apache Kafka or Google Pub/Sub.
- Solid knowledge of ETL/ELT processes, data modeling (star/snowflake), and schema design for both batch and streaming use cases.
- Proven track record of building data lakes, warehouses, and pipelines that can scale with enterprise-level workloads.
- Experience integrating diverse data sources including APIs, relational databases, flat files, and unstructured data.
- Knowledge of Terraform, Infrastructure as Code (IaC), and automation practices in cloud environments.
- Understanding of CI/CD pipelines for data engineering workflows and integration with Git, Jenkins, or Cloud Build.
- Strong background in data governance, lineage, and cataloging tools.
- Familiarity with machine learning workflows and enabling ML pipelines using GCP services is an advantage.
- Good grasp of Linux/Unix environments and shell scripting.
- Exposure to DevOps practices and monitoring tools such as Cloud Logging and Cloud Monitoring (formerly Stackdriver).
- Excellent problem-solving, debugging, and analytical skills with the ability to handle complex technical challenges.
- Strong communication skills with the ability to work independently in a remote-first team environment.
Nice-to-Have Skills:
- Experience with multi-cloud or hybrid environments (AWS/Azure alongside GCP).
- Familiarity with data visualization platforms such as Looker, Tableau, or Power BI.
- Exposure to containerization technologies such as Docker and Kubernetes.
- Understanding of big data processing frameworks like Spark, Hadoop, or Flink.
- Prior experience in industries with high data volume such as finance, retail, healthcare, or telecom.
Educational Background:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- Relevant GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect) are highly preferred.
Why Join Us?
- Opportunity to work on cutting-edge cloud data projects at scale.
- Fully remote working environment with flexible schedules.
- Exposure to innovative data engineering practices and advanced GCP tools.
- Collaborative team culture that values continuous learning, innovation, and career growth.
-
Azure AI Data Engineer
1 week ago
Remote, India | Data PlatformExperts | Full time | ₹ 70,000 - ₹ 15,00,000 per year
Job Description: We are looking for a skilled and motivated Azure AI Data Engineer with strong expertise in Microsoft Azure data services. The ideal candidate must have hands-on experience with Databricks or Microsoft Fabric and hold a valid certification in either. You will be responsible for designing, developing, and deploying scalable data pipelines,...
-
SQL Subject Matter Expert (SME)
4 days ago
Remote, India | Data Engineer Academy LLP | Full time | ₹ 76,800 - ₹ 96,000 per year
We're Hiring: SQL Subject Matter Expert (SME) – Mock Interviews & Mentoring (Part-Time)
Remote (India) | Work From Home | Flexible Timings | Pay: $600 – $800 / 80 hours
About the Opportunity: Data Engineer Academy is looking for an experienced SQL Subject Matter Expert (SME) to conduct mock interviews and provide mentorship to aspiring data professionals...
-
Subject Matter Expert Specialist
6 days ago
Remote, India | Data Engineer Academy LLP | Full time | ₹ 7,20,000 - ₹ 12,00,000 per year
We're Hiring: Subject Matter Expert (SME) – Snowflake & Databricks on AWS Cloud
Remote | Part Time | Flexible Timings | Pay: ₹600 – ₹1000/Hour
About the Opportunity: We are seeking a highly experienced SME with extensive expertise in Snowflake, Databricks, and AWS cloud platforms.
What We're Looking For: Minimum 10 years of experience in cloud data...
-
GIS Data Intern
4 days ago
Remote, India | Engineer Philosophy Web Services Pvt Ltd | Full time | ₹ 88,800 - ₹ 14,40,000 per year
Job Title: GIS Data Intern (Freshers Welcome)
Company: Engineer Philosophy Web Services Pvt. Ltd.
Location: Indore, Madhya Pradesh (Remote)
Job Type: Internship / Full-time
Job Description: Engineer Philosophy Web Services Pvt. Ltd. is seeking a motivated and detail-oriented GIS Data Intern to join our Indore-based team. This internship provides hands-on...
-
Data Scientist
5 days ago
Remote, India | Applied Data Finance | Full time
About the Role
**Responsibilities**:
- Conduct deep-dive analyses to uncover insights on user behaviour, marketing performance, and product engagement.
- Create clear, concise dashboards and reports to communicate findings to stakeholders.
- Partner with product managers, marketers, and engineers to design experiments and evaluate their outcomes.
- ...
-
Data & AI Engineer Lead
3 weeks ago
Remote, India | NTT Data | Full time
Job Description: NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data & AI Engineer Lead to join our team in Remote, Karnataka (IN-KA), India (IN).
Job Duties: Role Overview: The Data...
-
Data Engineer
6 days ago
Remote, India | TriDevSofts | Full time | ₹ 6,00,000 - ₹ 18,00,000 per year
Data Engineer – DBT & Redshift
Exp Years:
Location: Remote
Job Summary: We are seeking a skilled Data Engineer with hands-on experience in DBT (Data Build Tool) and Amazon Redshift to design, build, and maintain scalable data pipelines and models. You'll collaborate with analytics and engineering teams to transform raw data into actionable insights using modern...
-
Data Engineer
5 days ago
Remote, India | Digital Rath | Full time | ₹ 18,00,000 - ₹ 36,00,000 per year
We're looking for a skilled Data Engineer with deep Snowflake expertise to help modernize and scale our data platform. If you thrive in a fast-moving environment, can wrangle messy pipelines, and want to build the backbone of a cloud-first data strategy, this role is for you. You'll work across legacy and modern systems to deliver reliable, high-quality data to...
-
Data Engineer
4 days ago
Remote, India | Galactix Solutions | Full time | ₹ 2,00,000 - ₹ 3,00,000 per year
Job Title: Data Engineer
Location: Remote
Job Type: Remote (Freelance)
Experience Level: 4 to 8 years
Send resumes to:
About the Role: We are seeking a highly skilled and motivated Data Engineer to design, build, and maintain scalable and reliable data pipelines. You will work closely with data analysts, data scientists, and software engineers to ensure clean,...