
High-Level Data Pipeline Developer
2 days ago
A seasoned Backend Engineer is required to spearhead the development of cutting-edge data pipelines that power real-time pricing engines.
- Design, develop, and maintain event decoders for prominent DeFi protocols (e.g., Uniswap, Curve, Balancer).
- Create and optimize data extractors that convert on-chain events into structured trade data.
- Implement sophisticated logic to calculate trade prices, encompassing swap routing, liquidity pool math, and price inference.
- Collaborate with data and infra teams to build efficient and scalable data ingestion pipelines.
- Refine performance and accuracy of pricing models (e.g., TWAP, oracle logic, tick-based price resolution).
- Write clean, well-tested, and efficient backend code in Go or Rust.
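The TWAP logic mentioned above can be illustrated with a minimal sketch in the Uniswap V2 accumulator style: the pool records a running sum of price × elapsed time, and the time-weighted average price between two samples is the accumulator's growth divided by the elapsed time. Type and function names here are illustrative, not from any specific codebase.

```go
package main

import "fmt"

// Observation stores a cumulative price sample, in the style of
// Uniswap V2's price accumulator: PriceCumulative grows by
// price * elapsedSeconds at every pool update.
type Observation struct {
	Timestamp       uint64 // unix seconds
	PriceCumulative uint64 // sum of price * dt since genesis
}

// TWAP returns the time-weighted average price between two
// observations: accumulator growth divided by elapsed time.
func TWAP(older, newer Observation) uint64 {
	return (newer.PriceCumulative - older.PriceCumulative) /
		(newer.Timestamp - older.Timestamp)
}

func main() {
	// Price was 100 for 10 s, then 200 for 10 s:
	// the accumulator grows by 100*10 + 200*10 = 3000.
	a := Observation{Timestamp: 0, PriceCumulative: 0}
	b := Observation{Timestamp: 20, PriceCumulative: 3000}
	fmt.Println(TWAP(a, b)) // 150
}
```

In production the accumulator would use wider integers (Uniswap V2 uses UQ112x112 fixed-point), but the averaging step is the same.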
Requirements:
- 2–5+ years of experience in backend development using Go or Rust.
- Thorough understanding of Ethereum smart contract ABI decoding and event logs.
- Experience working with DeFi protocols, particularly AMM math principles:
  - Constant product formulas: x * y = k
  - Uniswap V3 ticks
  - TWAP / oracle logic
- Familiarity with EVM-based chains and Ethereum node infrastructure (e.g., Geth, Erigon, RPC APIs).
- Experience building data extraction or indexing tools (e.g., subgraphs, custom indexers).
- Ability to work independently and oversee complex backend systems from design to deployment.
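The constant-product requirement above can be sketched with Uniswap V2's well-known integer swap formula: for a pool holding reserves x and y with a 0.3% fee, the output of a swap is out = (in · 997 · reserveOut) / (reserveIn · 1000 + in · 997). This is a simplified sketch; a real decoder would use big.Int to match on-chain uint256 arithmetic.

```go
package main

import "fmt"

// getAmountOut computes the output of a constant-product swap
// (x * y = k) with Uniswap V2's 0.3% fee:
//   out = (in*997*reserveOut) / (reserveIn*1000 + in*997)
// uint64 keeps the sketch short; production code should use
// math/big to avoid overflow on real reserve sizes.
func getAmountOut(amountIn, reserveIn, reserveOut uint64) uint64 {
	amountInWithFee := amountIn * 997
	numerator := amountInWithFee * reserveOut
	denominator := reserveIn*1000 + amountInWithFee
	return numerator / denominator
}

func main() {
	// Swap 1,000 units into a balanced 1M/1M pool: the fee and
	// price impact reduce the output slightly below 1,000.
	fmt.Println(getAmountOut(1000, 1_000_000, 1_000_000)) // 996
}
```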
Nice to Have:
- Experience interacting directly with blockchain nodes or creating custom RPC-based services.
- Familiarity with price oracles and on-chain data feeds.
- Exposure to Kubernetes, Docker, and cloud infrastructure (AWS, GCP).
- Contributions to DeFi or crypto open-source projects.
-
Data Pipeline Specialist
22 hours ago
Kannur, Kerala, India beBeeDataEngineer Full time ₹ 80,00,000 - ₹ 2,40,00,000
Job Summary: We are seeking a skilled professional to develop and maintain robust data pipelines for structured and unstructured data. The ideal candidate will have a strong background in computer science, data science, or engineering, with 3+ years of experience working with Python, SQL, PySpark, and bash scripts.
Key Responsibilities: Design and develop...
-
High Performance Pipeline Architect
22 hours ago
Kannur, Kerala, India beBeeLogicDesign Full time ₹ 2,00,00,000 - ₹ 2,50,00,000
Senior Logic Design Engineer Opportunity
We are seeking a highly skilled professional to lead the architecture, design, and development of processor core front-end pipeline units for high-performance systems.
Key Responsibilities: Architect and design the I-Cache, Instruction Fetch, Branch Prediction, and Decode units of a high-performance processor CPU. Develop the...
-
Databricks Pipeline Developer
2 weeks ago
Kannur, Kerala, India beBeePipeline Full time ₹ 15,00,000 - ₹ 25,00,000
Job Title: Databricks Pipeline Developer
We seek an experienced Databricks pipeline developer to design and implement robust data ingestion pipelines for integrating multiple sources into Databricks.
Key Responsibilities: Develop high-performance data ingestion pipelines for integrating multiple sources into Databricks. Implement and maintain continuous...
-
ETL Developer
2 days ago
Kannur, Kerala, India beBeeDataPipeline Full time ₹ 1,50,00,000 - ₹ 2,50,00,000
Job Title: ETL Developer - Data Pipeline Specialist
Greetings from our organization! We are hiring a Data Pipeline Specialist to join our team. This role involves designing, developing, and managing data pipelines using ETL tools. The ideal candidate will have hands-on experience in ETL development and be able to analyze complex business...
-
Kannur, Kerala, India beBeeDataPipeline Full time ₹ 18,00,000 - ₹ 21,00,000
Job Description: Seeking a skilled ETL Developer to design, develop, and maintain scalable data pipelines using IBM DataStage and AWS Glue. The ideal candidate will have experience working with Snowflake and be proficient in SQL, Unix scripting, and Python.
Requirements: 4+ years of experience in ETL development, with at least 1–2 years in IBM...
-
Optimizing Data Pipelines
3 days ago
Kannur, Kerala, India beBeeDataEngineer Full time ₹ 90,00,000 - ₹ 1,20,00,000
Data Engineer Role
We are seeking a highly skilled Data Engineer to design and implement optimal data pipelines.
Key Responsibilities: Create and maintain an efficient data pipeline architecture using big data technologies. Design internal process improvements: automate manual processes, optimize data delivery, re-design infrastructure for greater...
-
Chief Data Pipeline Architect
7 days ago
Kannur, Kerala, India beBeeDataEngineer Full time ₹ 1,50,00,000 - ₹ 2,00,00,000
Job Overview
As a Data Engineer, you will play a pivotal role in designing and developing scalable data pipelines on Google Cloud Platform (GCP). Design and build robust data pipelines to extract, transform, and load data from various sources. Collaborate with cross-functional teams to understand business requirements and translate them into technical...
-
Advanced Data Pipelines Tester
22 hours ago
Kannur, Kerala, India beBeeData Full time ₹ 15,00,000 - ₹ 20,00,000
We are seeking an experienced Data Pipelines Tester to lead our project forward. The ideal candidate will have a proven track record in designing and executing advanced testing strategies across complex data pipelines and enterprise-grade data systems.
Key Responsibilities: Perform thorough testing of ETL processes to ensure data integrity and accuracy. Develop...
-
Senior Data Pipeline Specialist
3 days ago
Kannur, Kerala, India beBeeDataEngineering Full time US$ 1,50,000 - US$ 2,00,000
We are seeking a highly skilled Senior Data Pipeline Specialist to join our team. The ideal candidate will have hands-on experience with data transformation pipelines using Snowflake and dbt. The successful candidate will design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices. They will also build...
-
Scalable Data Pipeline Engineer
1 week ago
Kannur, Kerala, India beBeeData Full time ₹ 17,00,000 - ₹ 23,00,000
Job Opportunity
We're seeking a skilled data engineer to design and develop scalable data pipelines using AWS services. With expertise in Python and experience in PySpark, you'll be responsible for developing ETL/ELT processes, managing storage solutions with Amazon S3, and implementing unit and integration tests.