FinOps Data Engineer

5 days ago


Thane, India Blue Cloud Softech Solutions Limited Full time

BU / FUNCTION DESCRIPTION

The Data Operations team is responsible for platform and data excellence. The team strives to provide excellent customer service and to leverage data platforms to improve operational efficiency and publish data on time. We strongly believe that data and analytics are strategic drivers of future success. We are building a world-class advanced analytics team that will solve some of the most complex strategic problems and deliver top-line growth and operational efficiencies across our business. The Analytics team is part of the Organization and is responsible for driving organic growth by leveraging big data and advanced analytics.

We are on an exciting journey to build and scale our advanced analytics practice and are looking for a Data Engineer. The suitable candidate should have demonstrated experience in designing and implementing ETL solutions on-premise and on cloud platforms to support enterprise data warehouse, data lake, and advanced analytics capabilities. Success in this role comes from marrying a strong data engineering background with product and business acumen to deliver data operational excellence.

You will help define KPIs, analyze requirements, and design and implement data solutions on-premise and in the cloud. You will work closely with vendor partners, business unit representatives, project sponsors, and Segment CIO teams to deliver solutions, and you are expected to communicate data operations status, issues, and defined metrics to all levels of management.

ROLES

We are seeking a FinOps Data Engineer who will bridge the gap between cloud financial operations and engineering. This role involves designing and implementing ETL pipelines, building cost dashboards, ensuring tagging compliance, detecting anomalies, and collaborating with stakeholders to optimize cloud spend.
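The tagging-compliance part of the role can be pictured with a small sketch. This is a minimal, hypothetical illustration (the required tag keys `LOB`, `Program`, and `Owner` and the in-memory inventory are assumptions, not details from this posting) of how a check might flag resources missing mandatory cost-allocation tags:

```python
# Minimal sketch of a tagging-compliance check (hypothetical tag keys).
# In practice the resource inventory would come from a cloud inventory API;
# here it is a plain in-memory list of {"arn", "tags"} records.

REQUIRED_TAGS = {"LOB", "Program", "Owner"}  # assumed mandatory tag keys


def find_noncompliant(resources):
    """Return a report of resources missing any required tag key."""
    report = []
    for res in resources:
        missing = sorted(REQUIRED_TAGS - set(res.get("tags", {})))
        if missing:
            report.append({"arn": res["arn"], "missing_tags": missing})
    return report


if __name__ == "__main__":
    inventory = [
        {"arn": "arn:aws:s3:::raw-data",
         "tags": {"LOB": "Retail", "Program": "DW", "Owner": "team-a"}},
        {"arn": "arn:aws:ec2:us-east-1:123:instance/i-1",
         "tags": {"LOB": "Retail"}},
    ]
    for row in find_noncompliant(inventory):
        print(row["arn"], "missing:", ", ".join(row["missing_tags"]))
```

A report like this could feed the compliance dashboards and adherence reports the posting describes, with the actual inventory pulled from the cloud provider's tagging APIs.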
The ideal candidate will have strong experience in DevOps, data engineering, and cloud cost management.

RESPONSIBILITIES

• Design and develop ETL solutions for cost and usage data using best practices for data warehousing and analytics.
• Analyze cloud cost and usage across AWS, Databricks, and other cloud and on-prem platforms.
• Build and maintain cloud cost dashboards and reporting solutions for visibility across LOBs and programs.
• Implement tagging standards, establish compliance checks, and generate reports to ensure adherence to those standards.
• Detect and analyze cost anomalies and usage patterns; proactively identify optimization opportunities using AWS Cost Explorer and Databricks system and backup tables as drivers.
• Collaborate with stakeholders (DevOps, application teams, Finance, architects, infrastructure) to implement cost-saving strategies following FinOps Foundation standards.
• Develop automated workflows for data ingestion, transformation, and validation.
• Document processes, data flows, and standards for FinOps operations.
• Work with vendors and internal teams to ensure KPIs for cost and tagging compliance are met.
• Enable accurate showback/chargeback models aligned to LOB/program.
• Support forecast-vs-actual reporting and provide monthly FinOps insights by enabling automated workflows, alerts, notifications, and guardrails.
• Work with DevOps teams on cluster governance, resource control, policy enforcement, and guardrails; build cost validation queries; build granular user-level cost at tags such as LOBs, data products, and sessions, and roll it up to workspace-level cost.
• Manage pipelines for ETL jobs, infrastructure automation, and monitoring tools.
• Implement cost-aware DevOps practices (auto-scaling, scheduling, workload orchestration).
• Collaborate on implementing cluster policies, SQL warehouse governance, and operational efficiency.
• Apply deep working knowledge of on-prem and cloud ESB architecture to address the client's requirements for scalability, reliability, security, and performance.
• Provide technical assistance in identifying, evaluating, and developing systems and procedures.
• Manage foundational data administration tasks such as scheduling jobs, troubleshooting job errors, identifying issues with job windows, and assisting with database backups and performance tuning.
• Design, develop, test, and adapt ETL code and jobs to accommodate changes in source data and new business requirements.
• Proactively communicate innovative ideas, solutions, and capabilities over and above the specific task request.
• Effectively communicate status and workloads, and offer to assist other areas.
• Work collaboratively with a team as well as independently; continuously strive for high-performing business solutions.

COMPETENCIES & EXPERIENCE REQUIRED/DESIRED

• 6+ years in data engineering with strong ETL design, development, and optimization experience.
• Hands-on experience with AWS services and cost management tools.
• Strong knowledge of tagging strategies and governance in multi-cloud environments.
• Proficiency in SQL, PL/SQL, and data warehousing best practices.
• Experience with DevOps practices, CI/CD pipelines, and automation tools.
• Familiarity with FinOps principles and cloud cost optimization techniques.
• Ability to analyze large datasets and detect anomalies using scripting or BI tools (e.g., Power BI, Tableau).
• Excellent communication skills to work with technical and business stakeholders.
• Strong problem-solving capabilities; results-oriented; relies on fact-based logic for decision-making.
• Ability to work on multiple projects and work streams at one time; must be able to deliver results against project deadlines.
• Willingness to flex the daily work schedule to allow for time-zone differences in global team communications.
• Strong interpersonal and communication skills.

MOTIVATIONAL/CULTURAL FIT

• Experience with other ETL tools/services.
• Data integration experience using platforms.
• Experience with visualization tools such as Tableau, Power BI, or others.
• Experience developing basic data science models using Python or a similar language.
• Leadership qualities and mentoring of others on the team.
• AWS certification is a plus.
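The anomaly-detection responsibility above, flagging unusual daily spend in exports from AWS Cost Explorer or Databricks system tables, can be sketched as a simple rolling z-score check. This is only one illustrative approach on assumed input (a plain list of daily cost totals), not the team's actual method:

```python
from statistics import mean, stdev

def cost_anomalies(daily_costs, window=7, threshold=3.0):
    """Flag days whose cost deviates more than `threshold` standard
    deviations from the trailing `window`-day baseline.

    Returns a list of (day_index, cost, z_score) tuples.
    """
    anomalies = []
    for i in range(window, len(daily_costs)):
        baseline = daily_costs[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # perfectly flat baseline: z-score is undefined
        z = (daily_costs[i] - mu) / sigma
        if abs(z) > threshold:
            anomalies.append((i, daily_costs[i], round(z, 2)))
    return anomalies
```

Run over a daily-cost export, a check like this surfaces sudden spikes for investigation; in practice a team might prefer seasonality-aware baselines or a managed service such as AWS Cost Anomaly Detection over a fixed z-score.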


