Data/Platform Engineer

4 weeks ago


Noida, India | Minutes to Seconds Pty Ltd | Full time
About the job

At Minutes to Seconds we match people having great skills with tailor-fitted jobs to achieve well-deserved success. We know how to match people to the right job roles to create that perfect fit. This changes the dynamics of business success and catalyzes the growth of individuals. Our aim is to provide both our candidates and clients with great opportunities and the ideal fit every time. We have partnered with the best people and the best businesses in Australia in order to achieve success on all fronts. We are passionate about doing an incredible job for our clients and job seekers. Our success is determined by the success of individuals at the workplace.

We would love the opportunity to work with YOU.

Minutes to Seconds is looking for a Data/Platform Engineer in a Contract position.

Requirements

Job Overview

The primary goal of developers is efficiency, consistency, scalability and reliability.

We are responsible for the Platform: all the tooling integrations, security, access control, data classification/management, orchestration, self-service lab concept, observability and reliability, as well as data availability (data ingestion).

We are NOT responsible for Data Modeling, Data Warehousing or Reporting (Power BI), although we do work with the PBI team for access control from Power BI to Snowflake.

Everything we do is achieved through code; nothing is manual (or ClickOps). Everything is automated through the effectiveness of our CI/CD framework: GitHub, GitHub Actions, Terraform and Python.

Orchestration is centrally managed using Managed Airflow.
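For illustration only (not part of the requirements): in a Managed Airflow (MWAA) setup like the one described above, pipelines are defined as Python DAGs along the lines of the minimal sketch below. The dag_id, schedule and task names are hypothetical placeholders, and a recent Airflow 2.x API is assumed.

    # Minimal, hypothetical DAG skeleton for a centrally managed Airflow/MWAA environment.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_batch(**context):
        # Placeholder task body; a real task would trigger or validate an ingestion step.
        print("processing logical date", context["ds"])

    with DAG(
        dag_id="example_platform_ingestion",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                     # Airflow 2.4+ keyword; older versions use schedule_interval
        catchup=False,
        tags=["platform", "example"],
    ) as dag:
        PythonOperator(task_id="load_batch", python_callable=load_batch)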

We manage RBAC / Access Control.

We are responsible for Tooling, Integrations and all the connectivity and authentication requirements.

Ingestion Methods/Patterns (an illustrative sketch of the file-based pattern follows this list):

  • Fivetran
  • Snowflake Snowpipe (file-based sources)
  • Snowflake Secure Data Share
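For illustration only: Snowpipe wraps a COPY INTO statement in a pipe object for automated, file-based loading; the sketch below shows the underlying COPY INTO executed through snowflake-connector-python. All account, stage and table names are hypothetical placeholders.

    # Hypothetical file-based load: COPY INTO from an external stage into a landing table.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",   # placeholder
        user="svc_loader",      # placeholder service user
        password="***",         # in practice sourced from a secrets manager
        warehouse="LOAD_WH",
        database="RAW",
        schema="LANDING",
    )
    try:
        cur = conn.cursor()
        # Load any new staged Parquet files into the landing table.
        cur.execute("""
            COPY INTO LANDING.ORDERS
            FROM @LANDING.ORDERS_STAGE
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """)
    finally:
        conn.close()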

Solid Software Development (full SDLC) experience with excellent coding skills:

  • Python (required)
  • Good knowledge of Git and GitHub (required)
  • Good code management experience/best practices (required)

Understanding of CI/CD to automate and improve the efficiency, speed and reliability of software delivery.

    • Best practices/principles
    • GitHub Actions
      • Automate workflows directly from GitHub repositories.
      • Automation of building, testing and deploying code, including code linting, security scanning and version management.
    • Experience with testing frameworks
    • Good knowledge of IaC (Infrastructure as Code) using Terraform (required)
      • EVERYTHING we do is IaC
  • Strong verbal and written skills are a must, ideally with the ability to communicate in both technical and some business language.
  • A good level of experience with cloud technologies (AWS), namely S3, Lambda, SQS, SNS, API Gateway (API development), networking (VPCs), PrivateLink and Secrets Manager (see the sketch after this list).
  • Extensive hands-on experience engineering data pipelines and a solid understanding of the full data supply chain, from discovery & analysis, data ingestion, processing & transformation, to consumption/downstream data integration.
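For illustration only: on the AWS side, a common pattern behind the cloud bullet above is keeping connection credentials in Secrets Manager and reading them at runtime, roughly as sketched here. The secret name and region are hypothetical placeholders.

    # Hypothetical example of reading a connection secret from AWS Secrets Manager with boto3.
    import json

    import boto3

    client = boto3.client("secretsmanager", region_name="ap-southeast-2")        # placeholder region
    response = client.get_secret_value(SecretId="platform/snowflake/svc_loader")  # placeholder secret name
    secret = json.loads(response["SecretString"])
    # The secret payload (e.g. account/user/password) would then feed a Snowflake or API connection.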

A passion for continuous improvement and learning, and for optimization both in terms of cost and efficiency, as well as ways of working. Obsessed with data observability (aka data reconciliation), ensuring pipeline and data integrity.

  • Experience working with large structured/semi-structured datasets
    • A good understanding of Parquet, Avro, JSON/XML
  • Experience with Apache Airflow / MWAA or similar orchestration tooling.
  • Experience with Snowflake as a Data Platform
    • Solid understanding of Snowflake architecture: compute, storage, partitioning, etc.
    • Key features such as COPY INTO, Snowpipe, object-level tagging and masking policies
    • RBAC (security model) design and administration: intermediate skill required (an illustrative sketch follows this list)
    • Query performance tuning and zero-copy clone: nice to have
    • Virtual warehouse (compute) sizing

  • T-SQL experience, with the ability to understand complex queries and think about optimisation: advantageous
  • Data Modelling experience: advantageous
  • Exposure to dbt (data build tool) for data transformations: advantageous
  • Exposure to Alation or other Enterprise Metadata Management (EMM) tooling: advantageous
  • Documentation: architectural designs, operational procedures and platform configurations to ensure smooth onboarding and troubleshooting for team members.
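For illustration only: the RBAC and masking-policy administration mentioned in the list above usually comes down to statements like the ones sketched here, which in a fully IaC platform would typically be applied through Terraform/CI rather than by hand. All role, schema, table and policy names are hypothetical placeholders.

    # Hypothetical RBAC grants and a masking policy, executed via snowflake-connector-python.
    import snowflake.connector

    statements = [
        "CREATE ROLE IF NOT EXISTS ANALYTICS_READ",
        "GRANT USAGE ON DATABASE RAW TO ROLE ANALYTICS_READ",
        "GRANT USAGE ON SCHEMA RAW.LANDING TO ROLE ANALYTICS_READ",
        "GRANT SELECT ON ALL TABLES IN SCHEMA RAW.LANDING TO ROLE ANALYTICS_READ",
        # Column-level masking for PII via a masking policy.
        """CREATE MASKING POLICY IF NOT EXISTS RAW.LANDING.EMAIL_MASK AS (val STRING)
           RETURNS STRING ->
           CASE WHEN CURRENT_ROLE() IN ('PII_READ') THEN val ELSE '***MASKED***' END""",
        "ALTER TABLE RAW.LANDING.ORDERS MODIFY COLUMN EMAIL SET MASKING POLICY RAW.LANDING.EMAIL_MASK",
    ]

    conn = snowflake.connector.connect(account="my_account", user="svc_admin", password="***")  # placeholders
    try:
        cur = conn.cursor()
        for stmt in statements:
            cur.execute(stmt)
    finally:
        conn.close()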
Please send your resume at
