Current jobs related to Data Engineer - India - BrightEdge
-
Senior Data Engineer
3 weeks ago
India Aptus Data Labs Full time
Job Title: Senior Data Engineer
Location: Remote
Experience: 4+ years
Employment Type: Full-Time
Notice: Looking for immediate joiners or candidates who can join within 15-20 days.
About the Role: Aptus Data Labs is looking for a talented and proactive Senior Data Engineer to help build the backbone of our enterprise data and AI initiatives. You'll work on modern data lake...
-
Data Engineer
4 weeks ago
India Astreya Full time
Astreya offers comprehensive IT support and managed services. These services include Data Center and Network Management, Digital Workplace Services (like Service Desk, Audio Visual, and IT Asset Management), as well as Next-Gen Digital Engineering services encompassing Software Engineering, Data Engineering, and cybersecurity solutions. Astreya's...
-
Data Engineer
4 weeks ago
India NP Group Full time
Data Engineer - Python, PySpark & Palantir Foundry
Fully Remote, Long Term Contract
Rate: $10-13 per hour ($1,600 - $2,000 per month)
We have an immediate requirement for an experienced Data Engineer to join the global engineering team of an international enterprise organisation. You will bring data engineering expertise to a greenfield, multi-year project...
-
Data Engineer
3 weeks ago
India NP Group Full time
Data Engineer - Python, PySpark & TypeScript
Fully Remote, Long Term Contract
Rate: $20 - $25 per hour ($3,000 - $4,000 per month)
We have an immediate requirement for an experienced Data Engineer to join the global engineering team of an international enterprise organisation. You will bring data engineering expertise to a greenfield, multi-year project...
-
Data Engineer
4 weeks ago
India NP Group Full time
Data Engineer - Python, PySpark & Palantir Foundry
Fully Remote, 6-month full-time contract
Rate: $13.00 - $15.00 per hour
We have an immediate requirement for an experienced Data Engineer to join the global engineering team of an international enterprise organisation. You will bring data engineering expertise to a greenfield, multi-year project initiative. You...
-
Data Engineer
4 weeks ago
India NP Group Full time
Data Engineer - Python, PySpark & TypeScript
Fully Remote, Long Term Contract
Rate: $14 - $17 per hour
We have an immediate requirement for an experienced Data Engineer to join the global engineering team of an international enterprise organisation. You will bring data engineering expertise to a greenfield, multi-year project initiative. You should offer at least...
-
Data Engineer
4 weeks ago
India MindWise Technologies Full time
We are seeking a highly skilled and experienced Data Engineer based in India to join our growing Data Science team. The ideal candidate will play a critical role in designing, building, and optimizing our data pipelines and infrastructure to support advanced analytics and machine learning initiatives. This individual will work closely with data scientists,...
-
Data Engineer
1 day ago
India Insight Global Full time
100% Remote Data Engineer
Required Skills & Experience:
- 5+ years of experience as a Data Engineer
- Expertise in Python, PySpark, and SQL
- Hands-on Python/PySpark coding experience
- Strong Big Data experience, GCP preferred
- Telemetry experience
- Streaming experience with analytics and data processing
- Business analytics experience and working directly with business partners...

Data Engineer
4 weeks ago
About the Company
BrightEdge is a global leader in AI-powered enterprise performance marketing and SEO solutions. We're building scalable, intelligent, cloud-native data platforms to power real-time insights and decision-making across our customer ecosystem. As part of our growth, we're hiring experienced data engineers to join our high-impact Professional Services team.
About the Role
As an SDE III – Data Engineer, you will play a key role in designing, building, and optimizing scalable data solutions on Google Cloud Platform (GCP) using BigQuery, Python, and modern orchestration tools. You will work on ingesting billions of rows of structured and semi-structured data, ensuring performance, reliability, and automation across the data stack.
Responsibilities:
- Design, build, and maintain high-performance data pipelines for batch and streaming ingestion across diverse sources (APIs, Pub/Sub, external DBs, file stores).
- Optimize BigQuery queries and architecture for cost efficiency, scalability, and reliability.
- Build modular and reusable Python-based data transformation logic for data wrangling, validation, and loading.
- Architect and manage data orchestration workflows using Airflow, Cloud Composer, or similar tools.
- Implement CI/CD, testing, and observability for data pipelines using tools like Terraform, GitHub Actions, and DataDog.
- Partner with analytics, ML, and product engineering teams to design scalable data models that power reporting and AI/ML use cases.
- Own SLAs and quality metrics for mission-critical pipelines that impact internal analytics and external customer-facing platforms.
Qualifications:
- 6+ years of hands-on data engineering experience, including at least 3 years in the GCP ecosystem.
- Strong command of BigQuery, partitioning, clustering, query tuning, and data lifecycle management.
- Advanced proficiency in Python for building production-grade ETL/ELT pipelines and data tools.
- Experience working with large semi-structured JSON datasets produced by web crawlers.
- Experience with data orchestration frameworks (Airflow / Cloud Composer / Dagster).
- Deep knowledge of performance optimization across query logic, storage formats (Parquet/Avro), and compute resources.
- Solid understanding of data architecture, including dimensional modeling, CDC, and lakehouse patterns.
- Experience with infrastructure-as-code (Terraform or equivalent) for managing GCP resources.
Preferred Skills:
- Familiarity with dbt, Looker, or data catalog tools.
- Experience with Kafka, Pub/Sub, or event-driven data pipelines.
- Exposure to data quality, lineage, and governance tools (e.g., Great Expectations, OpenLineage).
- Understanding of cost governance and resource optimization in a cloud-native data stack.