
Data Systems Architect
This is a key leadership role that involves building and operating production-grade data pipelines.
About the Role
The ideal candidate will have experience with cloud platforms, strong leadership skills, and the ability to set technical direction.
Key Responsibilities
- Technical Leadership & Delivery: Define milestones, manage dependencies and risks, unblock engineers, communicate status to stakeholders, and land outcomes on time.
- Architecture & Direction: Define target states and data contracts, make build/buy calls, author RFCs, and drive cost/performance guardrails and reliability patterns.
- Data Pipelines: Design, build, and operate reliable batch and change-driven pipelines; schedule and orchestrate jobs; handle schema evolution and failures gracefully (see the ingestion sketch after this list).
- Transformations & Modeling: Implement clean, tested merge/transformation pipelines; produce models that are easy for products and analytics to consume; tune SQL for performance.
- APIs & Integration: Build or collaborate on APIs for data access and updates; design request/response contracts; handle auth (OAuth2/OIDC/JWT), idempotency, validation, and auditing (see the idempotency sketch after this list).
- Infrastructure & DevOps: Provision and manage cloud resources with Terraform; automate build/test/deploy with CI/CD; write solid shell scripts for glue and ops tasks.
- Observability & Reliability: Instrument logs, metrics, and traces; define SLOs and on-call runbooks; lead incident reviews and drive root-cause prevention (see the instrumentation sketch after this list).
- Security & Compliance: Apply least-privilege access, secrets management, and encryption in transit and at rest; partner on compliance requirements.
- Quality & Standards: Champion unit, integration, and data-quality tests (including for SQL); enforce code health via reviews, linters, and CI checks (see the test sketch after this list).
- Collaboration & Docs: Operate async-first; produce concise docs (data contracts, mappings, runbooks, RFCs); collaborate closely with product, engineering, and analytics across time zones.
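The pipeline work above leans on schema-tolerant ingestion with bounded retries. A minimal sketch in Python of one way to approach it; EXPECTED_COLUMNS, load_batch, and the sink callable are illustrative stand-ins, not names from this role's actual stack:

    # Sketch: project incoming records onto an expected schema and retry
    # transient sink failures with exponential backoff. Illustrative only.
    import logging
    import time

    logger = logging.getLogger("ingest")

    EXPECTED_COLUMNS = {"id", "updated_at", "amount"}  # hypothetical data contract

    def normalize(record: dict) -> dict:
        """Missing keys become None; unknown keys are logged and dropped."""
        extras = set(record) - EXPECTED_COLUMNS
        if extras:
            logger.warning("dropping unexpected columns: %s", sorted(extras))
        return {col: record.get(col) for col in EXPECTED_COLUMNS}

    def load_batch(records: list[dict], sink, max_attempts: int = 3) -> None:
        """Write one normalized batch, retrying transient failures."""
        batch = [normalize(r) for r in records]
        for attempt in range(1, max_attempts + 1):
            try:
                sink(batch)
                return
            except ConnectionError:
                if attempt == max_attempts:
                    raise
                time.sleep(2 ** attempt)  # backoff before replaying the batch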
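For the API work, idempotency keys are what keep retried writes safe. A minimal sketch, assuming token validation (OAuth2/OIDC/JWT) has already happened upstream; handle_update and the in-process cache are hypothetical, and a real service would back the cache with a shared store such as a database table:

    # Sketch: replay-safe write handler keyed by an idempotency header.
    _responses: dict[str, dict] = {}  # idempotency key -> first response

    def handle_update(idempotency_key: str, payload: dict) -> dict:
        """Repeated calls with the same key return the first result unchanged."""
        if idempotency_key in _responses:
            return _responses[idempotency_key]  # duplicate request: no second write
        if "id" not in payload:
            raise ValueError("payload must include 'id'")  # request validation
        result = {"status": "applied", "id": payload["id"]}  # stand-in for the real write
        _responses[idempotency_key] = result
        return result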
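On the observability side, each pipeline step can emit one structured log line with outcome and latency, which dashboards and alerts then aggregate. A minimal sketch using only the standard library; the field names are illustrative:

    # Sketch: wrap a step so it always logs status and duration as JSON.
    import json
    import logging
    import time

    logger = logging.getLogger("pipeline")

    def timed_step(name: str, fn, *args, **kwargs):
        """Run a step and emit a structured log line whether it succeeds or fails."""
        start = time.perf_counter()
        status = "error"
        try:
            result = fn(*args, **kwargs)
            status = "ok"
            return result
        finally:
            logger.info(json.dumps({
                "step": name,
                "status": status,
                "duration_ms": round((time.perf_counter() - start) * 1000, 1),
            }))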
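And for data-quality testing, per-load checks can run in CI alongside unit tests. A minimal pytest sketch; the rows fixture stands in for a real sample extract:

    # Sketch: data-quality assertions a CI job could run against each load.
    import pytest

    @pytest.fixture
    def rows():
        return [  # stand-in test data
            {"id": 1, "amount": 10.0},
            {"id": 2, "amount": 4.5},
        ]

    def test_primary_key_unique(rows):
        ids = [r["id"] for r in rows]
        assert len(ids) == len(set(ids)), "duplicate primary keys in load"

    def test_amounts_non_negative(rows):
        assert all(r["amount"] >= 0 for r in rows), "negative amounts in load"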
Requirements
- 9+ years of experience in Data Engineering, Backend Engineering, or DevOps, including leadership of a team or end-to-end ownership of large, multi-stakeholder projects.
- Strong SQL (tuning, indexing, query plans) and solid RDBMS fundamentals (e.g., Postgres, SQL Server, MySQL); see the query-plan sketch after this list.
- Python for data work and services; comfort with shell scripting (bash preferred; PowerShell acceptable).
- Terraform and IaC fundamentals, CI/CD experience (version control workflows, automated tests, environment promotion).
- Cloud experience, preferably Azure (storage, compute, identity, monitoring); strong AWS background also acceptable.
- Data integration experience: ingestion (batch + change-driven), orchestration, schema/versioning, resilient retries/replays.
- Workflow orchestration experience: Prefect preferred (flows/deployments, retries/backoff, scheduling, observability); Airflow/Dagster also acceptable (see the flow sketch after this list).
- APIs & auth experience: REST patterns, pagination, validation, rate limiting, OAuth2/OIDC/JWT.
- Observability experience: logs/metrics/traces and actionable alerts/dashboards.
- Testing mindset: habit of writing tests for code and SQL, fixtures/test data, CI.
- Communication: clear written/verbal comms, practiced cross-timezone collaboration.
- AI tooling: ready to use Claude Code (and similar assistants) to accelerate work, while exercising judgment and writing your own code and tests.
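For the SQL tuning requirement, reading query plans is the core skill. A minimal sketch of pulling a plan from Postgres via psycopg2; the DSN, table, and column names are placeholders, and note that EXPLAIN ANALYZE actually executes the query:

    # Sketch: print the planner's strategy for a parameterized query.
    import psycopg2

    QUERY = "SELECT id, amount FROM payments WHERE customer_id = %s"  # hypothetical table

    def explain(dsn: str, customer_id: int) -> None:
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.execute("EXPLAIN ANALYZE " + QUERY, (customer_id,))
            for (line,) in cur.fetchall():
                print(line)  # a Seq Scan here may point to a missing index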
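For the orchestration requirement, here is a minimal Prefect flow sketch with per-task retries and backoff, assuming Prefect 2.x; the task bodies are placeholders:

    # Sketch: a flow whose tasks retry transient failures before the run fails.
    from prefect import flow, task

    @task(retries=3, retry_delay_seconds=30)
    def extract() -> list[dict]:
        return [{"id": 1}]  # placeholder for a real source read

    @task(retries=3, retry_delay_seconds=30)
    def load(records: list[dict]) -> None:
        print(f"loaded {len(records)} records")  # placeholder for a real sink write

    @flow(log_prints=True)
    def nightly_sync():
        load(extract())

    if __name__ == "__main__":
        nightly_sync()

Scheduling and remote execution would sit on top of this via Prefect deployments; Airflow or Dagster equivalents follow the same retry/backoff pattern.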
Nice to Have
- Experience mentoring, interviewing, onboarding, and raising code quality standards.
- Regulatory/compliance awareness (e.g., data protection or public-sector standards) and how it impacts design/operations.
- Analytics exposure (dim/fact modeling, BI consumption patterns).
- Data platform architecture exposure (storage/layout choices, catalog/lineage, governance concepts).
- Familiarity with Kafka, streaming/message queues, and columnar formats (e.g., Parquet).
- Experience with cloud monitoring stacks (Azure Monitor/Application Insights, CloudWatch, Datadog, New Relic, etc.).
Engagement Details
- Duration: 10+ months (extension/full-time possible).
- Schedule: Flexible, with overlap for key meetings across time zones.
- Compensation: Competitive, based on experience.
- Mode: Remote; available for virtual collaboration and on-call windows as agreed.