Data Engineer

4 days ago


Lucknow, India de facto Infotech Full time

Data Engineer – Azure / Microsoft Fabric

Role Summary
As a Data Engineer with 4–5 years of experience, you will design, implement, and maintain scalable, production-grade data pipelines and data platform solutions. Your work will deliver reliable, high-quality data for analytics, reporting, and ML/AI initiatives, enabling data-driven decision making across the business.

What You'll Do
• Build, deploy, and manage end-to-end ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks (ADB) / Microsoft Fabric.
• Ingest data from diverse sources; transform, clean, and store it in Azure Data Lake / OneLake or Delta Lake / Lakehouse, enabling downstream analytics and reporting.
• Design and maintain robust data models and warehouse/lakehouse schemas, ensuring data integrity, reliability, and performance.
• Write efficient data transformation and processing logic using PySpark, Python, and/or Scala, optimizing for large-scale data workloads (an illustrative sketch follows this posting).
• Collaborate with data scientists, analysts, product owners, and other stakeholders to understand data needs and deliver appropriate solutions.
• Implement data governance, quality, and security standards; ensure compliance and data reliability across the data lifecycle.
• Apply version control and CI/CD practices to pipelines and infrastructure (e.g. Git / Azure DevOps), supporting seamless deployment, monitoring, and maintenance.
• Troubleshoot, monitor, and optimize data pipelines to ensure performance, scalability, and operational excellence.

Required Skills & Experience
• 4–5 years of hands-on experience as a Data Engineer working with Azure Data Factory (ADF), Azure Databricks / Microsoft Fabric, and data lake / lakehouse storage (Azure Data Lake / OneLake / Delta Lake).
• Strong programming skills in PySpark, Python, and/or Scala; ability to write clean, efficient, maintainable code for large-scale data processing.
• Proficiency in data modeling, data warehouse / lakehouse architecture, and schema design.
• Solid understanding of ETL/ELT patterns, orchestration, scheduling, and the data pipeline lifecycle.
• Experience with MS SQL (or equivalent relational databases / data stores) for data storage or warehousing.
• Familiarity with version control (Git) and CI/CD pipelines for data engineering workflows.
• Problem-solving mindset, with the ability to work on large datasets and meet data reliability, performance, and quality requirements.

Preferred / Nice-to-Have
• Exposure to event-driven or real-time data ingestion and processing (e.g. using messaging or streaming services).
• Familiarity with serverless or microservice-style Azure components (e.g. Azure Functions, Logic Apps, Event Hub / Service Bus).
• Basic knowledge of reporting/BI tools (e.g. Power BI) to aid end-to-end data-to-insight workflows.
• Experience with infrastructure-as-code / cloud-infrastructure provisioning tools (e.g. Terraform / Bicep) and metadata/governance tools or processes.
• Experience or interest in building data governance, cataloging, lineage, and compliance standards.

Why You Might Be a Great Fit
• You enjoy working with large-scale data, solving complex data-architecture challenges, and building data platforms that scale.
• You value writing clean, maintainable code and building pipelines that are robust, efficient, and reliable.
• You appreciate collaboration: working with analysts, data scientists, and product teams to turn raw data into actionable insights.
• You stay updated on cloud-native data technologies, big data patterns, and best practices, and you take initiative to learn and implement new technologies as appropriate.

Bonus / Preferred Qualifications
• Certifications such as Azure Data Engineer Associate (or equivalent) or other Azure / cloud-data certifications.
• Prior experience working in agile/scrum teams or global delivery environments.
• Exposure to data governance, compliance, and security practices in enterprise data environments.

Ready to make an impact with data? Apply now.
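A minimal, illustrative PySpark sketch of the kind of ETL step described under "What You'll Do": ingest raw files, clean and type them, and write a partitioned Delta table for downstream analytics. It assumes a Spark runtime with Delta Lake support (such as Azure Databricks or a Microsoft Fabric lakehouse); the storage path, column names, and table name are hypothetical, not taken from the posting.

# Illustrative PySpark ETL sketch (hypothetical paths, columns, and table name).
# Assumes a Spark environment where Delta Lake is available, e.g. Databricks/Fabric.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Ingest: read raw CSV files landed in the data lake (hypothetical container/path).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Transform: deduplicate, drop rows without a key, normalize types, derive a partition column.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Load: write a partitioned Delta table for analytics and reporting.
(
    clean.write
         .format("delta")
         .mode("overwrite")
         .partitionBy("order_date")
         .saveAsTable("lakehouse.orders_clean")
)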



  • Lucknow, India Aptus Data Labs Full time

    Company Description: Aptus Data Labs is a leading Data and AI company specializing in Pharma, Manufacturing & Supply Chain, Banking & FinTech, and Technology domains. We offer innovative analytical solutions and consulting services to help businesses make quick, data-driven decisions essential for growth and sustainability in evolving industries. Leveraging...

  • Data Engineer

    3 weeks ago


    Lucknow, India Tata Consultancy Services Full time

    Big Data/Hadoop experience, particularly in ingesting data and implementing data ingestion pipelines; Sqoop, Hadoop, HDFS, Hive, Impala, Java, Scala, and Spark (Scala) are mandatory. Data Engineer with the responsibilities below: lead Data Engineer to build data pipelines to support implementation of data science and analytics use cases. Candidate needs to be able to...

  • Data Engineer

    7 days ago


    Lucknow, India beBeeDataEngineer Full time

    Data Engineer Role: We are seeking a skilled Data Engineer to join our team. This is an exciting opportunity for the right individual to work on cutting-edge data science and AI projects. Key Responsibilities: design and develop efficient, scalable data pipelines using Databricks to extract insights from complex datasets; build and maintain Extract Transform Load...

  • Senior Data Engineer

    4 weeks ago


    Lucknow, India RapidBrains Full time

    Job Title: Senior Data Engineer. Experience: 6+ Years. Employment Type: Contract. Location: Remote. Overview: We are looking for a Senior Data Engineer with deep expertise in Azure Data Engineering to design, build, and optimize large-scale data pipelines. The ideal candidate will have strong experience with Azure Data Factory (ADF), Azure Synapse, PySpark, and...

  • Cloud Data Engineer

    3 days ago


    Lucknow, India beBeeData Full time

    Data Engineering Expert on Cloud Platform: The ideal candidate will have 5+ years of experience in designing and developing scalable data pipelines using cloud-based services, including data storage, processing, and analytics. The role requires a deep understanding of data engineering principles and the ability to work with cross-functional teams to ensure...


  • Lucknow, India Tech Phoenix Full time

    Company Description: Tech Phoenix is a dynamic platform dedicated to all things technology, offering the latest news, trends, and insights from the tech industry. Whether you are a tech enthusiast or a professional, our website features engaging articles, exclusive interviews, and resourceful guides to keep you informed and inspired. We foster an interactive...


  • Lucknow, India Synechron Full time

    Good day, we have an opportunity for an AWS Data Engineer. Job Role: AWS Data Engineer. Job Location: Synechron (Bengaluru). Experience: 7 to 12 years. Notice: Immediate joiner. About Company: At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver...

  • Cloud Data Engineer

    5 days ago


    Lucknow, India beBeeData Full time

    Job Description: We are seeking a skilled data engineer with expertise in cloud computing, big data processing, and API integrations to design, implement, and optimize real-time data pipelines. Design and implement real-time data ingestion pipelines using Apache NiFi for structured and unstructured data. Integrate data pipelines with the Google Cloud Platform...

  • Data Engineer

    4 weeks ago


    Lucknow, India Escalent Full time

    Who We Are: Escalent is an award-winning data analytics and advisory firm that helps clients understand human and market behaviors to navigate disruption. As catalysts of progress for more than 40 years, our strategies guide the world's leading brands. We accelerate growth by creating a seamless flow between primary, secondary, syndicated, and internal...

  • Cloud Engineer

    7 days ago


    Lucknow, India beBeeDataEngineer Full time

    About Our GCP Data Engineering Role: We are seeking an experienced data engineer to design and implement scalable data solutions using Google Cloud Platform. Responsibilities: design, develop, and maintain large-scale data processing pipelines using GCP Storage Classes, Dataflow, and BigQuery; develop and deploy efficient data transformation, processing, and...