Data Engineer with GCP, Databricks

Mulshi, Maharashtra, India · Synechron · Full time · ₹15,00,000 - ₹28,00,000 per year

Job Summary
Synechron is seeking a skilled Data Engineer experienced in Google Cloud Platform (GCP), Databricks, PySpark, and SQL. In this role, you will design, develop, and maintain scalable data pipelines and workflows to enable advanced analytics and business intelligence solutions. You will work within a collaborative environment to integrate diverse data sources, optimize data processing workflows, and ensure data quality and availability. Your contributions will support strategic decision-making and enhance the organization's data-driven initiatives.

Software Requirements
Required Skills:

  • Hands-on experience with GCP services, specifically BigQuery, Cloud Storage, and Composer for data pipeline orchestration
  • Proficiency with the Databricks platform and PySpark for building and optimizing large-scale ETL/ELT processes (an illustrative sketch of such a pipeline follows this list)
  • Expertise in writing and tuning complex SQL queries for data transformation, aggregation, and reporting on large datasets
  • Experience integrating data from multiple sources such as APIs, cloud storage, and databases into a central data warehouse
  • Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer for scheduling, monitoring, and managing data jobs
  • Knowledge of version control systems (Git), CI/CD practices, and Agile development methodologies
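
For illustration only, the sketch below shows the kind of ETL/ELT work these requirements describe: a PySpark job on Databricks that reads raw files from Cloud Storage, applies basic transformations, and loads the result into BigQuery. It is a minimal example, not a prescribed implementation; the bucket, project, dataset, and table names are hypothetical, and it assumes the Spark BigQuery connector is available on the cluster.

```python
# Illustrative sketch: a minimal PySpark ETL job of the kind this role builds.
# All bucket/project/dataset/table names are hypothetical; assumes the
# Spark BigQuery connector is installed on the Databricks cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Extract: read raw JSON files landed in Cloud Storage (hypothetical path).
raw = spark.read.json("gs://example-landing-bucket/orders/2024-01-01/*.json")

# Transform: deduplicate, fix types, and derive a simple measure.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_total", F.col("quantity") * F.col("unit_price"))
       .filter(F.col("order_total") > 0)
)

# Load: append into a BigQuery table via the Spark BigQuery connector.
(
    orders.write.format("bigquery")
          .option("table", "example_project.analytics.orders")
          .option("temporaryGcsBucket", "example-temp-bucket")
          .mode("append")
          .save()
)
```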

Preferred Skills:

  • Experience with other cloud platforms (AWS, Azure) or additional GCP services (Dataflow, Pub/Sub)
  • Knowledge of data modeling and data governance best practices
  • Familiarity with containerization tools like Docker or Kubernetes

Overall Responsibilities

  • Design, develop, and maintain scalable data pipelines using GCP, Databricks, and associated tools
  • Write efficient, well-documented SQL queries to support data transformation, data quality, and reporting needs
  • Integrate data from diverse sources, including APIs, cloud storage, and databases, to create a reliable central data repository
  • Develop automated workflows and schedules for data processing tasks using Cloud Composer or Airflow (see the scheduling sketch after this list)
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions
  • Monitor, troubleshoot, and optimize data pipelines for performance, scalability, and reliability
  • Maintain data security, privacy standards, and documentation compliance
  • Stay informed about emerging data engineering technologies and apply them effectively to improve workflows
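
As a hedged illustration of the scheduling and monitoring responsibilities above, the following is a minimal Airflow DAG of the kind run on Cloud Composer. The DAG id, schedule, and SQL are hypothetical, and it assumes the apache-airflow-providers-google package, which Composer ships by default.

```python
# Illustrative sketch: a minimal Airflow DAG (Cloud Composer) that schedules a daily
# BigQuery refresh. DAG id, schedule, and table names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                      # retry transient failures automatically
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_orders_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",     # run daily at 03:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:

    # Rebuild a reporting table in BigQuery from the curated orders table.
    refresh_orders_summary = BigQueryInsertJobOperator(
        task_id="refresh_orders_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example_project.analytics.orders_daily` AS
                    SELECT DATE(order_ts)   AS order_date,
                           COUNT(*)         AS orders,
                           SUM(order_total) AS revenue
                    FROM `example_project.analytics.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```

Failed runs of a DAG like this surface in the Composer/Airflow UI, which is where the day-to-day monitoring and troubleshooting described above typically happens.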

Technical Skills (By Category)

  • Programming Languages:
      • Required: PySpark (Python in Databricks), SQL
      • Preferred: Python, Java, or Scala for custom data processing
  • Databases/Data Management:
      • Required: BigQuery, relational databases, large-scale data transformation and querying (see the query sketch after this section)
      • Preferred: Data cataloging and governance tools
  • Cloud Technologies:
      • Required: GCP services including BigQuery, Cloud Storage, and Composer
      • Preferred: Experience with other cloud services (AWS, Azure)
  • Frameworks and Libraries:
      • Required: Databricks with PySpark, Airflow or Cloud Composer
      • Preferred: Data processing frameworks such as Apache Beam, Dataflow
  • Development Tools and Methodologies:
      • Version control using Git
      • CI/CD pipelines for automated deployment and testing
      • Agile development practices
  • Security & Compliance:
      • Knowledge of data security best practices, access controls, and data privacy regulations
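
As a small, hedged example of the BigQuery transformation and querying work listed above, the sketch below runs a parameterized analytical query from Python. The project, dataset, table, and column names are hypothetical; it assumes the google-cloud-bigquery client library.

```python
# Illustrative sketch: a parameterized BigQuery query run from Python, the kind of
# transformation/reporting SQL this role writes and tunes. All names are hypothetical;
# assumes the google-cloud-bigquery client library.
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client(project="example_project")

sql = """
    SELECT customer_id,
           COUNT(*)         AS orders,
           SUM(order_total) AS lifetime_value
    FROM `example_project.analytics.orders`
    WHERE order_ts >= @since   -- filter on a timestamp (ideally partition) column to limit bytes scanned
    GROUP BY customer_id
    HAVING lifetime_value > 0
    ORDER BY lifetime_value DESC
    LIMIT 100
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "since", "TIMESTAMP", datetime(2024, 1, 1, tzinfo=timezone.utc)
        ),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.orders, row.lifetime_value)
```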

Experience Requirements

  • Minimum of 3 years of professional experience in data engineering or a related role
  • Proven expertise in designing and implementing large-scale data pipelines using GCP and Databricks
  • Hands-on experience with complex SQL query development and optimization
  • Working knowledge of workflow orchestration tools such as Airflow or Cloud Composer
  • Experience processing data from multiple sources, including APIs and cloud storage solutions
  • Experience in an Agile environment preferred

Alternative pathways:
Candidates with strong data pipeline experience on other cloud platforms who are willing to adapt and learn GCP services may be considered.

Day-to-Day Activities

  • Develop, test, and deploy data pipelines that facilitate analytics, reporting, and data science initiatives
  • Collaborate with cross-functional teams during sprint planning, stand-ups, and code reviews
  • Monitor scheduled jobs for successful execution, troubleshoot failures, and optimize performance
  • Document processes, workflows, and data sources in compliance with organizational standards
  • Continuously review pipeline performance, implement improvements, and ensure robustness
  • Participate in scalable architecture design discussions and recommend best practices

Qualifications

  • Bachelor's degree in Computer Science, Data Science, Information Technology, or equivalent field
  • At least 3 years of experience in data engineering, data architecture, or related roles
  • Demonstrated expertise with GCP, Databricks, SQL, and workflow orchestration tools

Certifications (preferred):

  • GCP certifications such as Professional Data Engineer or equivalent
  • Databricks Data Engineer certification

Professional Competencies

  • Critical thinking and effective problem-solving skills related to large-scale data processing
  • Strong collaboration abilities across multidisciplinary teams and stakeholders
  • Excellent communication skills with the ability to translate technical details into clear insights
  • Adaptability to evolving technologies and project requirements
  • Ability to prioritize tasks, manage time efficiently, and deliver on deadlines
  • Innovative mindset with a focus on continuous learning and process improvement

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture – promoting equality, diversity, and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to an applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.
