Current jobs related to Data Pipeline Specialist - India - beBeeDeveloper


  • India beBeeAzure Full time US$ 1,04,000 - US$ 1,30,878

    Azure Data Factory Specialist: We are seeking an experienced Azure Data Factory specialist to join our team. The ideal candidate will have strong hands-on expertise in designing, building, and maintaining scalable and secure data pipelines using Azure Data Factory (ADF). Design, build, and maintain ADF pipelines for ETL and ELT processes. Integrate data...
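
A minimal sketch of what defining such a pipeline can look like from Python, using the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and dataset names are placeholders, and the copy activity assumes two blob datasets already registered in the factory.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink,
)

# Placeholder identifiers: swap in a real subscription, resource group, and factory.
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Copy activity moving data between two datasets already defined in the factory.
copy_raw_to_curated = CopyActivity(
    name="CopyRawToCurated",
    inputs=[DatasetReference(type="DatasetReference", reference_name="RawBlobDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="CuratedBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Publish (or update) the pipeline definition in the factory.
adf.pipelines.create_or_update(
    "my-resource-group", "my-data-factory", "CopyRawDataPipeline",
    PipelineResource(activities=[copy_raw_to_curated]),
)
```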


  • India beBeeEtl Full time ₹ 15,00,000 - ₹ 18,00,000

    Senior ETL Specialist: We are seeking a highly skilled ETL specialist to design and optimize complex data pipelines. Key Skills: Talend Data Integration: design & optimize complex ETL pipelines, debugging, and deployment using TMC; BigQuery: partitioning, clustering, federated queries, advanced SQL (CTEs, arrays, window functions); Python: ETL scripting,...
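
As a rough illustration of the BigQuery skills listed above, a minimal Python sketch using the google-cloud-bigquery client to run a window-function query against a date-partitioned table; the project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # relies on application-default credentials

# Latest order per customer via a window function; the DATE filter lets BigQuery
# prune partitions on a table assumed to be partitioned by DATE(order_ts).
sql = """
WITH ranked AS (
  SELECT
    customer_id,
    order_ts,
    ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
  FROM `my-project.sales.orders`
  WHERE DATE(order_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
)
SELECT customer_id, order_ts FROM ranked WHERE rn = 1
"""

for row in client.query(sql).result():
    print(row.customer_id, row.order_ts)
```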


  • India Pipeline Velocity Full time ₹ 5,00,000 - ₹ 7,00,000 per year

    Type: Full-time. Job Location: Bengaluru, India. Years of Experience: 3 years. Salary: 5-7 Lacs per Annum. IMPORTANT NOTE: Please fill out this form for us to consider your application. The introduction video is a must as part of the application process. You can upload through Loom, Google Drive, Dropbox, or anything similar and share the final link in the form...


  • India beBeeDataEngineer Full time ₹ 9,00,000 - ₹ 12,00,000

    Data Engineer - Build Scalable Data Pipelines: We're seeking an experienced Data Engineer to design, build and maintain production-level data pipelines handling terabytes of structured and unstructured data. The ideal candidate has a strong background in data engineering and hands-on experience with AWS services. Job Responsibilities: Design and implement...
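
By way of a hedged sketch only, a small PySpark job of the kind such a role often involves: reading raw JSON from S3 and writing partitioned Parquet back out. The bucket paths and column names are made up.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw_events_to_parquet").getOrCreate()

# Read semi-structured JSON events from a raw S3 prefix (hypothetical bucket).
events = spark.read.json("s3://example-raw-bucket/events/")

# Derive a partition column and drop obviously bad records before writing.
cleaned = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .filter(F.col("event_id").isNotNull())
)

# Write columnar, partitioned output for downstream analytics.
(
    cleaned.write.mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3://example-curated-bucket/events/")
)
```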


  • India beBeeData Full time ₹ 20,00,000 - ₹ 25,00,000

    Job Description: We are seeking a seasoned data engineer to join our team. As a GCP data engineer, you will be responsible for designing, developing, and maintaining scalable data pipelines for both batch and real-time processing using GCP tools. Our ideal candidate has deep expertise in GCP services, including BigQuery, Cloud Dataflow, Cloud Run, Pub/Sub, and...
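
For the real-time side mentioned above, a minimal Apache Beam sketch (the framework Dataflow runs) that reads from Pub/Sub and appends to BigQuery; the project, topic, and table names are assumptions, and in practice the pipeline would be launched with the Dataflow runner.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# streaming=True marks this as an unbounded pipeline; pass --runner=DataflowRunner
# (plus project/region/temp_location options) to run it on Dataflow.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```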


  • India beBeeDataEngineer Full time ₹ 15,00,000 - ₹ 28,00,000

    Job Title: Google Cloud (GCP) Data Engineer. Location: Remote. Job Type: Full-Time. Experience Level: Minimum 5+ years. Mandatory Skills: GCS + Google BigQuery + Airflow/Composer + Python. Company Description: Tech T7 Innovations is a company that provides IT solutions to clients worldwide. The team consists of highly skilled and experienced professionals who are...
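
A minimal sketch of the GCS + BigQuery + Composer combination named above, as a single-task Airflow DAG; the bucket, object pattern, and destination table are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # Composer / Airflow 2.x style scheduling
    catchup=False,
) as dag:
    # Load the previous day's landed files from GCS into a BigQuery table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",
        source_objects=["events/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="analytics.events",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```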


  • India beBeeIntegration Full time ₹ 12,00,000 - ₹ 25,00,000

    About Our Opportunity: We are seeking a skilled Software Integration Engineer to join our team. As an integral part of our product team, you will be responsible for designing and implementing seamless software integrations with various Property Management Systems (PMS) and third-party platforms. The ideal candidate will have a strong background in backend...


  • India beBeeETLDeveloper Full time ₹ 15,00,000 - ₹ 25,00,000

    About this Role: As a skilled ETL Developer, you will play a pivotal role in designing, developing, and maintaining scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue, and Snowflake. The ideal candidate will have 4+ years of experience in ETL development with at least 1-2 years in IBM DataStage (preferably CP4D version)...


  • India beBeeDataIngestion Full time ₹ 9,00,000 - ₹ 12,00,000

    Job Title: Data Ingestion Specialist. Description: The role of a Data Ingestion Specialist is to design, develop, and optimize data ingestion pipelines for integrating multiple sources into Databricks. This involves working closely with the Data Engineering team to ensure seamless data flow and efficient ingestion workflows. Responsibilities: Design and implement...
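
One common way such Databricks ingestion is wired up is with Auto Loader; the sketch below assumes it runs inside a Databricks notebook (where `spark` is predefined), and the mount paths and table names are placeholders.

```python
# Incrementally ingest newly arriving JSON files with Auto Loader ("cloudFiles").
raw_orders = (
    spark.readStream.format("cloudFiles")
         .option("cloudFiles.format", "json")
         .option("cloudFiles.schemaLocation", "/mnt/landing/_schemas/orders")
         .load("/mnt/landing/orders/")
)

# Append into a bronze Delta table; the checkpoint makes the stream restartable.
(
    raw_orders.writeStream.format("delta")
              .option("checkpointLocation", "/mnt/bronze/_checkpoints/orders")
              .trigger(availableNow=True)   # process available files, then stop
              .toTable("bronze.orders")
)
```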


  • India beBeeData Full time ₹ 15,00,000 - ₹ 25,00,000

    Data Engineer Job Description: We are seeking a highly skilled and experienced Data Engineer to lead the design and implementation of scalable, high-performance data pipelines using Snowflake and dbt. This role involves defining architectural best practices, driving data transformation at scale, and working closely with clients to translate business needs...
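
dbt transformations on Snowflake are usually plain SQL models, but as a loosely sketched illustration of the dbt-plus-Snowflake workflow, here is a tiny dbt Python model (which dbt executes via Snowpark); the `stg_orders` staging model and the `STATUS` column are assumptions.

```python
# models/marts/fct_completed_orders.py  (hypothetical dbt Python model)
def model(dbt, session):
    dbt.config(materialized="table")

    # dbt.ref() gives a Snowpark DataFrame and records lineage to the staging model.
    stg = dbt.ref("stg_orders")

    # Keep only completed orders; the transformation runs inside Snowflake.
    return stg.filter(stg["STATUS"] == "COMPLETED")
```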

Data Pipeline Specialist

2 weeks ago


India beBeeDeveloper Full time US$ 1,04,000 - US$ 1,30,878
ETL Data Engineer - Scalable Pipeline Developer

We are seeking an experienced ETL data engineer to build and maintain scalable data pipelines using IBM DataStage and AWS Glue. As a key member of our data engineering team, you will work collaboratively with architects, business analysts, and data modelers to ensure timely and accurate delivery of critical data assets supporting analytics and AI/ML use cases.

Responsibilities:

  • Design and develop ETL pipelines for ingestion from varied sources such as flat files, APIs, Oracle, and DB2.
  • Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic (see the staging sketch after this list).
  • Collaborate with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
  • Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
  • Troubleshoot and resolve issues in QA/UAT/Production environments.
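
As a rough sketch of the first two responsibilities (not this team's actual code): an AWS Glue PySpark job that reads a catalogued source, applies light cleansing, and writes curated Parquet to S3 for Snowflake to load. The database, table, column, and bucket names are hypothetical.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog (e.g. landed flat files).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
).toDF()

# Light cleansing: de-duplicate on the business key and standardise a column name.
orders = orders.dropDuplicates(["order_id"]).withColumnRenamed("ord_dt", "order_date")

# Write curated, partitioned Parquet to S3; Snowflake ingests it downstream
# (for example via COPY INTO from an external stage, or Snowpipe).
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)

job.commit()
```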

Requirements:

  • 4+ years of experience in ETL development with at least 1–2 years in IBM DataStage.
  • Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
  • Experience working with Snowflake: loading strategies, Streams and Tasks, zero-copy cloning, and performance tuning (see the sketch after this list).
  • Proficiency in SQL, Unix scripting, and basic Python for data handling or automation.
  • Familiarity with S3, version control systems (Git), and job orchestration tools.
  • Experience with data profiling, cleansing, and quality validation routines.
  • Understanding of data lake/data warehouse architectures and DevOps practices.
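
To make the Streams and Tasks requirement concrete, a hedged sketch using snowflake-connector-python that creates a stream on a staging table and a scheduled task that merges its changes into a curated table; all object names and connection details are placeholders.

```python
import snowflake.connector

# Placeholder credentials; in a real pipeline these come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345", user="ETL_SVC", password="...",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()
try:
    # The stream captures row-level changes on the staging table.
    cur.execute("CREATE STREAM IF NOT EXISTS ORDERS_STREAM ON TABLE STG_ORDERS")

    # The task periodically merges those changes into the curated table,
    # but only when the stream actually has new data.
    cur.execute("""
        CREATE TASK IF NOT EXISTS MERGE_ORDERS
          WAREHOUSE = LOAD_WH
          SCHEDULE = '15 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
        AS
          MERGE INTO CURATED.ORDERS t
          USING ORDERS_STREAM s ON t.ORDER_ID = s.ORDER_ID
          WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS
          WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS) VALUES (s.ORDER_ID, s.STATUS)
    """)

    cur.execute("ALTER TASK MERGE_ORDERS RESUME")  # tasks are created suspended
finally:
    cur.close()
    conn.close()
```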