IT Architect - Data Operations Engineer

2 hours ago


Pune, Maharashtra, India Medtronic Full time ₹ 12,00,000 - ₹ 36,00,000 per year

At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You'll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.

A Day in the Life

Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes.

We're a mission-driven leader in medical technology and solutions with a legacy of integrity and innovation. Join our new Minimed India Hub as a Data Operations Engineer. We are seeking an experienced DataOps Engineer with a strong background in DevOps, DataOps, or cloud engineering practices, and extensive experience automating CI/CD pipelines and working with modern data stack technologies.

This role offers a dynamic opportunity to join Medtronic's Diabetes business. Medtronic has announced its intention to separate the Diabetes division to promote future growth and innovation within the business and to reallocate investments and resources across Medtronic, subject to applicable information and consultation requirements. While you will start your employment with Medtronic, upon establishment of SpinCo or the transition of the Diabetes business to another company, your employment may transfer to either SpinCo or the other company, at Medtronic's discretion and subject to any applicable information and consultation requirements in your jurisdiction.

Responsibilities may include the following and other duties may be assigned:

  • Develop and maintain robust, scalable data pipelines and infrastructure automation workflows using GitHub, AWS, and Databricks.
  • Implement and manage CI/CD pipelines using GitHub Actions and GitLab CI/CD for automated infrastructure deployment, testing, and validation.
  • Deploy and manage Databricks LLM Runtime or custom Hugging Face models within Databricks notebooks and model serving endpoints.
  • Manage and optimize Cloud Infrastructure costs, usage, and performance through tagging policies, right-sizing EC2 instances, storage tiering strategies, and auto-scaling.
  • Set up infrastructure observability and performance dashboards using AWS CloudWatch for real-time insights into cloud resources and data pipelines.
  • Develop and manage Terraform or CloudFormation modules to automate infrastructure provisioning across AWS accounts and environments.
  • Implement and enforce cloud security policies, IAM roles, encryption mechanisms (KMS), and compliance configurations.
  • Administer Databricks Workspaces, clusters, access controls, and integrations with Cloud Storage and identity providers.
  • Enforce DevSecOps practices for infrastructure-as-code, ensuring all changes are peer-reviewed, tested, and compliant with internal security policies.
  • Coordinate cloud software releases, patching schedules, and vulnerability remediations using Systems Manager Patch Manager.
  • Automate AWS housekeeping and operational tasks such as:
      • Cleanup of unused EBS volumes, snapshots, and old AMIs
      • Rotation of secrets and credentials using Secrets Manager
      • Log retention enforcement using S3 Lifecycle policies and CloudWatch Log groups
  • Perform incident response, disaster recovery planning, and post-mortem analysis for operational outages.
  • Collaborate with cross-functional teams, including Data Scientists, Data Engineers, and other stakeholders, to gather and implement infrastructure and data requirements.
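To give a flavor of the housekeeping automation described above, here is a minimal sketch of snapshot-cleanup logic. The 90-day retention threshold and the `snapshots_to_delete` helper are illustrative assumptions, not part of the role description; real retention policies would come from team standards.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention threshold (an assumption for illustration).
RETENTION_DAYS = 90


def snapshots_to_delete(snapshots, now=None, retention_days=RETENTION_DAYS):
    """Return IDs of snapshots older than the retention window.

    `snapshots` mirrors the shape of boto3's
    describe_snapshots()["Snapshots"] entries: each item needs a
    "SnapshotId" and a timezone-aware "StartTime".
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [s["SnapshotId"] for s in snapshots if s["StartTime"] < cutoff]


if __name__ == "__main__":
    # With boto3 installed and AWS credentials configured, the same logic
    # would be driven by the real API (sketch only, not executed here):
    #   import boto3
    #   ec2 = boto3.client("ec2")
    #   snaps = ec2.describe_snapshots(OwnerIds=["self"])["Snapshots"]
    #   for snap_id in snapshots_to_delete(snaps):
    #       ec2.delete_snapshot(SnapshotId=snap_id)
    pass
```

Keeping the age filter as a pure function makes the cleanup logic easy to unit-test without touching live AWS resources.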

Required Knowledge and Experience:

  • 4+ years of experience in DataOps / CloudOps / DevOps roles, with strong focus on infrastructure automation, data pipeline operations, observability, and cloud administration.
  • Strong proficiency in at least one scripting language (e.g., Python, Bash) and one infrastructure-as-code tool (e.g., Terraform, CloudFormation) for building automation scripts for AWS resource cleanup, tagging enforcement, monitoring, and backups.
  • Hands-on experience integrating and operationalizing LLMs in production pipelines, including prompt management, caching, token-tracking, and post-processing.
  • Deep hands-on experience with AWS services, including:
      • Core: EC2, S3, RDS, CloudWatch, IAM, Lambda, VPC
      • Data services: Athena, Glue, MSK, Redshift
      • Security: KMS, IAM, Config, CloudTrail, Secrets Manager
      • Operational: Auto Scaling, Systems Manager, CloudFormation/Terraform
      • Machine learning/AI: Bedrock, SageMaker, OpenSearch Serverless
  • Working knowledge of Databricks, including:
      • Cluster and workspace management, job orchestration
      • Integration with AWS storage and identity (IAM passthrough)
  • Experience deploying and managing CI/CD workflows using GitHub Actions, GitLab CI, or AWS CodePipeline.
  • Strong understanding of cloud networking, including VPC Peering, Transit Gateway, security groups, and AWS PrivateLink setup.
  • Familiarity with container orchestration platforms (e.g., Kubernetes, ECS) for deploying platform tools and services.
  • Strong understanding of data modeling, data warehousing concepts, and AI/ML Lifecycle management.
  • Knowledge of cost optimization strategies across compute, storage, and network layers.
  • Experience with data governance, logging, and compliance practices in cloud environments (e.g., SOC 2, HIPAA, GDPR).
  • Bonus: Exposure to LangChain, Prompt Engineering frameworks, Retrieval Augmented Generation (RAG), and vector database integration (AWS OpenSearch, Pinecone, Milvus, etc.)
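The tagging-enforcement scripts mentioned above might start from a check like the following. The required-tag set is a hypothetical example policy, and `missing_tags` is an illustrative helper rather than anything specified in this posting.

```python
# Hypothetical tagging policy (an assumption for illustration).
REQUIRED_TAGS = {"CostCenter", "Owner", "Environment"}


def missing_tags(resource_tags, required=frozenset(REQUIRED_TAGS)):
    """Return the required tag keys absent from a resource's tags.

    `resource_tags` mirrors the [{"Key": ..., "Value": ...}] list shape
    that boto3 returns for EC2 and many other AWS services.
    """
    present = {t["Key"] for t in resource_tags}
    return sorted(required - present)
```

An enforcement job would run this check over `describe_instances` (or a Resource Groups Tagging API listing) and flag or remediate any resource for which the returned list is non-empty.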

Preferred Qualifications:

  • AWS Certified Solutions Architect, DevOps Engineer, or SysOps Administrator certification.
  • Hands-on experience with multi-cloud environments, particularly Azure or GCP, in addition to AWS.
  • Experience with infrastructure cost management tools like AWS Cost Explorer, or FinOps dashboards.
  • Ability to write clean, production-grade Python code for automation scripts, operational tooling, and custom CloudOps Utilities.
  • Prior experience in supporting high-availability production environments with disaster recovery and failover architectures.
  • Understanding of Zero Trust architecture and security best practices in cloud-native environments.
  • Experience with automated cloud resources cleanup, tagging enforcement, and compliance-as-code using tools like Terraform Sentinel.
  • Familiarity with Databricks Unity Catalog, access control frameworks, and workspace governance.
  • Strong communication skills and experience working in agile cross-functional teams, ideally with Data Product or Platform Engineering teams.

Physical Job Requirements

The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position.

Benefits & Compensation

Medtronic offers a competitive salary and flexible benefits package.

A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.

About Medtronic

We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions.

Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people.

We are engineers at heart, putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves, and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.


