MLOps Engineer — Databricks

7 hours ago


Delhi, India Hadron Talent - Hadronfinsys Full time

We are looking for a skilled Databricks MLOps Developer to design, develop, and maintain machine learning operations workflows on the Databricks Lakehouse platform. This role involves building scalable pipelines, automating ML lifecycle processes, and ensuring robust deployment and monitoring of models in production.

Key Responsibilities

• Develop MLOps Pipelines: Implement automated workflows for ML model training, deployment, and monitoring using Databricks, MLflow, and Delta Lake.
• Model Lifecycle Management: Manage models through development, staging, and production environments using the MLflow Model Registry (see the sketch following this description).
• CI/CD Integration: Build and maintain CI/CD pipelines for ML projects using Git, Azure DevOps, or similar tools.
• Data Preparation & Feature Engineering: Collaborate with data engineers and scientists to prepare datasets and optimize feature pipelines on Databricks.
• Monitoring & Governance: Set up monitoring for model performance, data drift, and compliance using Databricks and cloud-native tools.
• Documentation & Collaboration: Document workflows and best practices; work closely with cross-functional teams to deliver ML solutions.

Required Skills & Qualifications

• Hands-on experience with Databricks, MLflow, and Delta Lake.
• Strong programming skills in Python and SQL.
• Familiarity with ML frameworks (TensorFlow, PyTorch, Scikit-learn).
• Experience with CI/CD tools (Azure DevOps, Jenkins, GitHub Actions).
• Knowledge of cloud platforms (AWS, Azure, or GCP).
• Understanding of containerization (Docker) and orchestration (Kubernetes).

Preferred Qualifications

• Databricks certification (Machine Learning Professional or Data Engineer).
• Experience with Unity Catalog for data governance.
• Exposure to feature store concepts and ML observability tools.

Soft Skills

• Strong problem-solving and analytical skills.
• Excellent communication and teamwork abilities.
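The model-lifecycle responsibility above can be illustrated with a short, hedged sketch. It assumes a scikit-learn classifier, a placeholder experiment name, a placeholder registry model name ("churn-model"), and a local SQLite tracking store; none of these come from the posting. The sketch logs a trained model with MLflow, registers it, and transitions the new version to Staging, the kind of step a CI/CD pipeline would automate.

```python
# Hedged illustration only: names ("mlops-demo", "churn-model") and the local
# SQLite tracking store are placeholder assumptions, not details from the posting.
import mlflow
import mlflow.sklearn
from mlflow.tracking import MlflowClient
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# A database-backed store is needed for the Model Registry when running locally;
# on Databricks the workspace tracking server is used instead.
mlflow.set_tracking_uri("sqlite:///mlflow.db")
mlflow.set_experiment("mlops-demo")  # on Databricks this would be a workspace path

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

with mlflow.start_run() as run:
    model = RandomForestClassifier(n_estimators=50, random_state=42).fit(X, y)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, artifact_path="model")

# Register the logged model and promote the new version to Staging; in a CI/CD
# pipeline this transition would normally be gated on validation checks.
model_uri = f"runs:/{run.info.run_id}/model"
version = mlflow.register_model(model_uri, "churn-model")

client = MlflowClient()
client.transition_model_version_stage(
    name="churn-model",
    version=version.version,
    stage="Staging",
)
```

With Unity Catalog as the registry, model names are three-level (catalog.schema.model) and aliases replace stage transitions; the stage-based flow above reflects the classic workspace registry.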



  • Delhi, India Hadron Talent - Hadronfinsys Full time

    MLOps Engineer — Databricks Client: A large global enterprise (name not disclosed) Location: India Work Model: 100% Remote Contract: 6 months (initial) with possibility of extension Start Date: ASAP Engagement: Full-time / Long-term contract Role Overview We are seeking an experienced Databricks MLOps Developer to design, build, and manage scalable machine...



  • MLOps Engineer

    1 week ago


    New Delhi, India Capgemini Full time

    Responsibilities: Experience in developing an MLOps framework cutting across the ML lifecycle: model development, training, evaluation, deployment, and monitoring, including model governance. Expert in Azure Databricks, Azure ML, Unity Catalog. Hands-on experience with Azure DevOps, MLOps CI/CD pipelines, Python, Git, Docker. Experience in developing standards and practices for...

  • MLOps Engineer

    9 hours ago


    Delhi, India techcarrot Full time

    Job Location: India - Offshore (Noida, Chennai, Hyderabad). Kindly share your resume to parul.sharma@techcarrot.ae. Looking for an immediate joiner or a maximum 2 weeks' notice period. Job Description: Practical experience implementing MLOps pipelines for end-to-end machine learning lifecycle management, including model training, versioning, deployment, and...

  • MLOps

    3 weeks ago


    New Delhi, India Recro Full time

    Exp: 4+ years. Budget: 30 LPA. Work mode: 4 days WFO + 1 day WFH. Primary skills (must have): MLOps, model deployment & monitoring; Azure Cloud (Azure ML, ADF, AKS, etc.); Databricks (Delta Lake, MLflow, Notebooks, Jobs); CI/CD, GitOps, Infrastructure as Code (Terraform, ARM); Python, PySpark, MLflow, REST APIs; stakeholder & cross-functional team management.



  • Delhi, India Syren Full time

    Experience with Databricks GenAI, including Model Serving, Vector Search, and embedding workflows. Strong command of the Databricks Lakehouse (clustering, Unity Catalog, Delta Lake). Knowledge of LLMs, embeddings (OpenAI/Azure OpenAI), and GenAI app patterns (RAG / Agents). 4–5 years of hands-on experience in Python, PySpark, and Data Engineering...