Advisory Consultant - Python/PySpark, DWH, Metrics

2 weeks ago


Noida, Uttar Pradesh, India UnitedHealth Group Full time

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

This position reports to the head of the Actuarial Data Warehouse (ADW) Business Intelligence. The ADW is a cloud data and analytics environment where client data are ingested, conformed, cleansed, and then enriched to support advanced actuarial and data science analytics. This BI environment includes Databricks, DBT, MS Fabric, and Power BI. The position will design, develop, implement, test, deploy, monitor, and maintain the delivery of data enrichments and reporting models to support actuarial reporting and analytics.

Primary Responsibilities:

- Work with the BI team to build and deploy healthcare data enrichments
- Design high-performance reporting models using DBT that will be deployed in Power BI
- Design and develop Azure Databricks jobs using Python and Spark
- Develop and maintain CI/CD processes using Jenkins, GitHub, and Maven
- Support monthly and quarterly production activities
- Maintain high-quality documentation of data definitions, transformations, and processes to ensure data governance and security
- Identify solutions to non-standard requests and problems
- Support and maintain the self-service BI warehouse
- Work with business owners to add new enrichments and to design and implement new reporting models
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation, implied or otherwise, on its ability to do so.

Required Qualifications:

- Undergraduate degree or equivalent experience
- 7 years of overall experience in data and analytics engineering
- 5 years of experience writing code to develop Big Data solutions using Spark and Python
- 5 years of experience working with Azure Databricks and DBT
- 4 years of experience working with medical and Rx claims, eligibility tables, and provider data
- Experience working with MS Fabric and Power BI
- Solid experience with CI/CD tools such as Jenkins, GitHub, Maven, etc.
- Experience designing and deploying reporting data models
- In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
- Proven high proficiency in Python and SQL
- Proven excellent communication skills

Preferred Qualifications:

- Undergraduate degree in a STEM field
- Snowflake experience
- Power BI development experience
- Experience working as an actuary or with actuaries
- Knowledge of health care concepts: benefits, pricing, underwriting, stop loss, reinsurance, reserves, etc.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
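
The Databricks responsibilities above are concrete enough to sketch. Below is a minimal illustration only, not code from the posting: the table names, columns, and the eligibility-flag rule are all invented, but the shape (read conformed tables, join, derive columns, publish for reporting) matches the enrichment workflow the role describes.

```python
# Hypothetical sketch of an Azure Databricks enrichment job of the kind the
# posting describes; table and column names are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adw-claims-enrichment").getOrCreate()

# Ingest conformed claims and eligibility data (hypothetical tables).
claims = spark.read.table("adw.conformed_medical_claims")
eligibility = spark.read.table("adw.conformed_eligibility")

# Enrich claims with member eligibility attributes, then derive a flag
# that downstream actuarial reporting models can filter on.
enriched = (
    claims.join(eligibility, on="member_id", how="left")
          .withColumn("service_month", F.date_trunc("month", F.col("service_date")))
          .withColumn(
              "is_eligible_at_service",
              (F.col("service_date") >= F.col("eligibility_start"))
              & (F.col("service_date") <= F.col("eligibility_end")),
          )
)

# Publish the enrichment for DBT/Power BI reporting models to build on.
enriched.write.mode("overwrite").saveAsTable("adw.enriched_medical_claims")
```

In the stack described, DBT reporting models and Power BI datasets would then sit on top of tables published this way.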


  • GCP Python

    1 week ago


    Andhra Pradesh, India Virtusa Full time

    **Skills**: Python knowledge with PySpark, Pandas, and Python objects; knowledge of Google Cloud Platform (GCP Cloud Storage, Dataproc, BigQuery); SQL - strong SQL & advanced SQL; Spark - writing skills in PySpark; DWH - data warehousing concepts & dimensional modeling; Git; any GCP certification. Roles & Responsibilities: Perform data analytics,...
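
This card (like the similar Virtusa GCP cards below) combines PySpark, BigQuery, and Cloud Storage. A minimal sketch of what that combination usually looks like, assuming a Dataproc cluster with the spark-bigquery connector available; the project, table, and bucket names are placeholders:

```python
# Illustrative only: a Dataproc-style PySpark job that reads a BigQuery table,
# rolls it up, and lands the result in Cloud Storage. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcp-analytics-sketch").getOrCreate()

# Assumes the spark-bigquery connector is on the cluster (standard on Dataproc).
orders = (
    spark.read.format("bigquery")
         .option("table", "my-project.sales_dataset.orders")  # placeholder table
         .load()
)

# Dimensional-model style rollup: one row per customer per day.
daily = (
    orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("order_count"))
)

# Land the result in GCS as Parquet for downstream consumers.
daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_orders/")
```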


  • Uttar Pradesh, India Tata Consultancy Services Full time

    TCS presents an excellent opportunity for Python, PySpark, SQL, GCP. Job Location: TCS Noida Yamuna. Experience required: 7-12 yrs. Walk-in Interview Date: 08-Nov-25 (Saturday). Must-Have: ReactJS and Python framework for Backend (preferably FastAPI); Kubernetes/Docker (preferably AKS); strong hands-on experience; handling large volumes of data on web pages (preferably...


  • Noida, Uttar Pradesh, India Zigsaw Full time ₹ 20,00,000 - ₹ 25,00,000 per year

    Roles & Responsibilities: Design, development, and implementation of performant ETL pipelines using the Python API (PySpark) of Apache Spark on AWS EMR; writing reusable, testable, and efficient code; integration of data storage solutions in Spark, especially with AWS S3 object storage; performance tuning of PySpark scripts. Need to ensure overall build delivery...
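
A minimal sketch of the EMR pipeline this card describes, assuming raw JSON events landed in S3; the paths, schema, and partition column are all invented. The repartition/partitionBy choices at the end are the kind of levers the "performance tuning of PySpark scripts" line refers to.

```python
# Hypothetical EMR-style ETL sketch; S3 paths and schema are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("emr-etl-sketch").getOrCreate()

# Extract: raw events landed in S3 by an upstream process.
raw = spark.read.json("s3://example-raw-bucket/events/")

# Transform: basic cleansing plus a date column to partition by.
clean = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet back to S3. Controlling partition and file
# counts via repartition is a typical first lever when tuning jobs like this.
(clean.repartition("event_date")
      .write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/events/"))
```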

  • GCP Python

    2 weeks ago


    Andhra Pradesh, India Virtusa Full time

    Role: Lead Application Developer (GCP). JD: **Skills**: Knowledge of Google Cloud Platform (GCP Cloud Storage, Dataproc, BigQuery); SQL - strong SQL & advanced SQL; Spark - writing skills in PySpark; DWH - data warehousing concepts & dimensional modeling (good to have); Python; Git; any GCP certification. Roles & Responsibilities: Perform data analytics,...

  • GCP Python

    2 weeks ago


    Andhra Pradesh, India Virtusa Full time

    Role: Backend/Frontend Developer - GCP. **Skills**: Knowledge of Google Cloud Platform (GCP Cloud Storage, Dataproc, BigQuery); SQL - strong SQL & advanced SQL; Spark - writing skills in PySpark; DWH - data warehousing concepts & dimensional modeling (good to have); Python; Git; any GCP certification. Roles & Responsibilities: Perform data analytics,...

  • PySpark Developer

    2 weeks ago


    Gurgaon, Noida, India Mancraft Consulting Full time ₹ 12,00,000 - ₹ 36,00,000 per year

    Job Summary: PySpark Developer (with Python migration focus) to design, develop, and optimize big data solutions, with a critical focus on migrating and modernizing existing Python data-processing codebases within a health insurance company. The PySpark Developer will be a key member of the Data Engineering team, responsible for...
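
A "Python migration focus" like this one typically means porting single-machine pandas transforms to PySpark. A small invented example of that pattern (not from the posting): the same per-member rollup, first in pandas, then as its Spark port.

```python
# Invented example of the pandas-to-PySpark migration pattern this role
# describes; the schema and column names are placeholders.
import pandas as pd
from pyspark.sql import SparkSession, functions as F

def summarize_pandas(df: pd.DataFrame) -> pd.DataFrame:
    # Legacy single-machine version: memory-bound once claim files grow.
    return df.groupby("member_id", as_index=False)["paid_amount"].sum()

def summarize_spark(df):
    # PySpark port: identical logic, distributed across a cluster.
    return df.groupBy("member_id").agg(F.sum("paid_amount").alias("paid_amount"))

if __name__ == "__main__":
    spark = SparkSession.builder.appName("migration-sketch").getOrCreate()
    pdf = pd.DataFrame({"member_id": [1, 1, 2], "paid_amount": [10.0, 5.0, 7.5]})
    print(summarize_pandas(pdf))                         # pandas result
    summarize_spark(spark.createDataFrame(pdf)).show()   # same numbers via Spark
```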


  • Noida, India Genpact Full time

    Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster,...

