IT Engineer Data Lakehouse
Your tasks
- Design, develop, and operate scalable and maintainable data pipelines in the Azure Databricks environment
- Develop all technical artifacts as code, implemented in professional IDEs, with full version control and CI/CD automation
- Enable data-driven decision-making in Sales & Finance by ensuring high data availability, quality, and reliability
- Implement data products and analytical assets using software engineering principles in close alignment with business domains and functional IT
- Apply rigorous software engineering practices such as modular design, test-driven development, and artifact reuse in all implementations
- Work within a global delivery footprint, providing cross-functional data engineering support across the Sales & Finance domains
- Collaborate with business stakeholders, functional IT partners, product owners, architects, ML/AI engineers, and Power BI developers
- Work in an agile product-team structure embedded in an enterprise-scale Azure environment
Main tasks
- Design scalable batch and streaming pipelines in Azure Databricks using PySpark and/or Scala
- Implement ingestion from structured and semi-structured sources (e.g., SAP, APIs, flat files)
- Build bronze/silver/gold data layers following the defined lakehouse layering architecture & governance
- Implement use-case driven dimensional models (star/snowflake schema) tailored to Sales & Finance needs
- Ensure compatibility with reporting tools (e.g., Power BI) via curated data marts and semantic models
- Implement enterprise-level data warehouse models (domain-driven 3NF models) for Sales & Finance data, closely aligned with data engineers for other business domains
- Develop and apply master data management strategies (e.g., Slowly Changing Dimensions; see the illustrative sketch after this list)
- Develop automated data validation tests using established testing frameworks
- Monitor pipeline health, identify anomalies, and implement quality thresholds
- Establish data quality transparency by defining meaningful data quality rules together with source-system and business stakeholders and implementing the related reports
- Develop and structure pipelines using modular, reusable code in a professional IDE
- Apply test-driven development (TDD) principles with automated unit, integration, and validation tests
- Integrate tests into CI/CD pipelines to enable fail-fast deployment strategies
- Commit all artifacts to version control with peer review and CI/CD integration
- Work closely with Product Owners to refine user stories and define acceptance criteria
- Translate business requirements into data contracts and technical specifications
- Participate in agile events such as sprint planning, reviews, and retrospectives
- Document pipeline logic, data contracts, and technical decisions in Markdown or in documentation auto-generated from code
- Align designs with governance and metadata standards (e.g., Unity Catalog)
- Track lineage and audit trails through integrated tooling
- Profile and tune data transformation performance
- Reduce job execution times and optimize cluster resource usage
- Refactor legacy pipelines or inefficient transformations to improve scalability
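To picture the layering and historization tasks above, here is a minimal, purely illustrative sketch: one possible Slowly Changing Dimension (Type 2) step in the silver layer, written in PySpark with Delta Lake as available on Azure Databricks. It is not part of the role description or a prescribed design; all table names, columns, and the merge logic (bronze.customers, silver.dim_customer, customer_id, valid_from/valid_to/is_current) are assumptions made for this example.

```python
# Illustrative sketch only - hypothetical tables and columns, not a prescribed design.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Latest batch landed in the (hypothetical) bronze layer.
updates = spark.table("bronze.customers").select(
    "customer_id", "name", "segment", "ingest_ts"
)

dim = DeltaTable.forName(spark, "silver.dim_customer")
current = dim.toDF().filter("is_current = true")

# Keys whose tracked attributes changed compared to the current dimension row.
changed = (
    updates.alias("src")
    .join(current.alias("tgt"), F.col("src.customer_id") == F.col("tgt.customer_id"))
    .filter("src.name <> tgt.name OR src.segment <> tgt.segment")
    .select("src.*")
)

# Keys that are not in the dimension yet.
new_rows = updates.join(current, "customer_id", "left_anti")

# Step 1: close the currently valid version of every changed key.
(
    dim.alias("tgt")
    .merge(
        changed.alias("src"),
        "tgt.customer_id = src.customer_id AND tgt.is_current = true",
    )
    .whenMatchedUpdate(set={"is_current": "false", "valid_to": "src.ingest_ts"})
    .execute()
)

# Step 2: append fresh current versions for changed and brand-new keys.
inserts = (
    changed.unionByName(new_rows)
    .withColumn("valid_from", F.col("ingest_ts"))
    .withColumn("valid_to", F.lit(None).cast("timestamp"))
    .withColumn("is_current", F.lit(True))
    .drop("ingest_ts")
)
inserts.write.format("delta").mode("append").saveAsTable("silver.dim_customer")
```

In practice such logic would be wrapped in modular, unit-tested functions and deployed through the CI/CD pipelines described above; the sketch only shows the general shape of the transformation.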
Your profile
- Degree in Computer Science, Data Engineering, Information Systems, or a related discipline
- Certifications in software development and data engineering (e.g., Databricks Data Engineer Associate, Azure Data Engineer, or relevant DevOps certifications)
- 3–6 years of hands-on experience in data engineering roles in enterprise environments
- Demonstrated experience building production-grade codebases in IDEs, with test coverage and version control
- Proven experience implementing complex data pipelines and contributing to full-lifecycle data projects (development to deployment)
- Experience in at least one business domain: Sales & Finance or a comparable field
- Experience working in international teams across multiple time zones and cultures, preferably with teams in India, Germany, and the Philippines
Our offer
The well-being of our employees is important to us. That's why we offer exciting career prospects and support you in achieving a good work-life balance with additional benefits such as:
- Training opportunities
- Mobile and flexible working models
- Sabbaticals
and much more...
Sound interesting? Click here to find out more.
Diversity, Inclusion & Belonging are important to us and make our company strong and successful. We offer equal opportunities to everyone - regardless of age, gender, nationality, cultural background, disability, religion, ideology or sexual orientation.
Ready to drive with Continental? Take the first step and fill in the online application.
About us
Continental develops pioneering technologies and services for sustainable and connected mobility of people and their goods. Founded in 1871, the technology company offers safe, efficient, intelligent, and affordable solutions for vehicles, machines, traffic and transportation. In 2023, Continental generated sales of €41.4 billion and currently employs around 200,000 people in 56 countries and markets.
Guided by the vision of being the customer's first choice for material-driven solutions, the ContiTech group sector focuses on development competence and material expertise for products and systems made of rubber, plastics, metal, and fabrics. These can also be equipped with electronic components to optimize their functionality for individual services. ContiTech's industrial growth areas lie primarily in energy, agriculture, construction, and surfaces. In addition, ContiTech serves the automotive and transportation industries as well as rail transport.
The IT Digital and Data Services Competence Center of ContiTech caters to all Business Areas in ContiTech and is responsible, among other things, for Data & Analytics, Web and Mobile Software Development, and AI.
The Data Services team specializes in all platforms, business applications, and products in the domain of data and analytics, covering the entire spectrum including AI, machine learning, data science, data analysis, reporting, and dashboarding.
Key facts
Job ID: REF87643R
Location: Bengaluru
Leadership level: Leading Self
Job flexibility: Hybrid Job
Legal Entity: ContiTech India Pvt. Ltd.
Bengaluru, Karnataka, India Aptean Full time ₹ 10,00,000 - ₹ 25,00,000 per yearOverviewJob Title - Lead Data Engineer(MS Fabrics)Location - BengaluruAptean is changing. Our bespoke ERP solutions are transforming a huge range of global businesses, from food producers to manufacturers. In a world of generic enterprise software, we provide targeted solutions that bring together the very best technology and drive greater results. With over...