Current jobs related to Sr. GCP Data Engineer - Chennai - Awign Expert

  • Sr. GCP Data Engineer

    4 weeks ago


    Chennai, India Awign Expert Full time

    Job Description: Sr. GCP Data Engineer. Experience: 5 to 12 Years. Work location: Chennai, Bangalore, Hyderabad, Pune-Hybrid. Shift Timing: 2 to 11 PM. Interview process: L1 and L2 rounds. Job description: 5+ years of experience. Should have experience in GCP BigQuery, Dataproc (PySpark). Good to have experience with Informatica...


  • Sr. GCP Data Engineer

    Chennai, Tamil Nadu, India Awign Full time ₹ 11,00,000 - ₹ 22,00,000 per year

    Job Description: Sr. GCP Data Engineer. Experience: 7 to 12 Years. Work location: Chennai, Bangalore, Hyderabad, Pune-Hybrid. Shift Timing: 2 to 11 PM. Interview process: L1 and L2 rounds. Job description: 5+ years of experience. Should have experience in GCP BigQuery, Dataproc (PySpark). Good to have experience with Informatica. Requirements: How do you upload files to GCS? Using...

  • GCP Data Engineer

    2 weeks ago


    Chennai, India Intellistaff Services Pvt. Ltd Full time

    Job description: Data Engineer with Python and GCP. Experience Level: 5 to 9 Years. Loc: Chennai. Must Have Skillset: SQL (4+ Years); Python or PySpark (4+ Years); GCP services (3+ Years): BigQuery, Dataflow or Dataproc, Pub/Sub, Scheduled Query, Cloud Functions, Monitoring Tools Dashboard, Apache Kafka; Terraform scripting (2+ Years); Airflow/Astronomer/Cloud Composer (2+...

Sr. GCP Data Engineer

4 weeks ago


Chennai, India Awign Expert Full time

Job Description: Sr. GCP Data Engineer
Experience: 5 to 12 Years
Work location: Chennai, Bangalore, Hyderabad, Pune-Hybrid
Shift Timing: 2 to 11 PM
Interview process: L1 and L2 rounds

Job description:

  • 5+ years of experience
  • Should have experience in GCP BigQuery and Dataproc (PySpark)
  • Good to have: experience with Informatica

Requirements

1. How do you upload files to GCS?

  • Using the Google Cloud Console (UI): drag and drop files into a bucket.
  • Using the gsutil CLI.

2. How do you query data in BigQuery?

  • Write SQL queries in the BigQuery Console (UI).
  • bq CLI: bq query "SELECT ..."
  • APIs or client libraries (Python, Java, etc.).

3. What is the purpose of a Dataproc job?

A Dataproc job is a task you submit to a cluster to process data, such as:

  • Spark job (Scala/Python/Java)
  • Hive query
  • Pig script
  • Hadoop MapReduce job

4. How do you handle errors in Dataproc jobs?

  • Job retry policies (set retries in workflow templates).
  • Error logs in Cloud Logging (examine stdout/stderr).
  • Graceful failure handling with a workflow DAG (e.g., skip/stop).
  • Cluster monitoring with Cloud Monitoring and alerts.

5. How do you create an Airflow DAG in Cloud Composer?

  • Write a Python script defining the DAG (dag_id, schedule_interval, default_args).
  • Define tasks using operators (e.g., PythonOperator, BashOperator, BigQueryOperator).
  • Upload the script to the DAGs folder in Composer's GCS bucket; Composer automatically deploys it.
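The gsutil route for "How do you upload files to GCS?" can be sketched as a small helper that builds the `gsutil cp` command line. The bucket and file names below are hypothetical placeholders, and the helper itself is just an illustration, not a real library function:

```python
def gsutil_upload_cmd(local_path: str, bucket: str, dest_path: str = "") -> list:
    """Build the argument list for uploading a local file to GCS with gsutil.

    Equivalent to running: gsutil cp <local_path> gs://<bucket>/<dest_path>
    """
    return ["gsutil", "cp", local_path, "gs://{}/{}".format(bucket, dest_path)]

# Hypothetical example: copy sales.csv into a bucket named "my-bucket".
cmd = gsutil_upload_cmd("sales.csv", "my-bucket", "raw/sales.csv")
print(" ".join(cmd))  # → gsutil cp sales.csv gs://my-bucket/raw/sales.csv
```

The programmatic alternative is the `google-cloud-storage` client library, where `Blob.upload_from_filename()` performs the same upload from Python code.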
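The `bq query "SELECT ..."` answer for "How do you query data in BigQuery?" can likewise be sketched as a command builder. The table name is a hypothetical placeholder; `--nouse_legacy_sql` is the bq flag that selects standard SQL:

```python
def bq_query_cmd(sql: str, use_legacy_sql: bool = False) -> list:
    """Build the bq CLI argument list for running a query.

    Equivalent to: bq query --nouse_legacy_sql "SELECT ..."
    """
    flag = "--use_legacy_sql" if use_legacy_sql else "--nouse_legacy_sql"
    return ["bq", "query", flag, sql]

# Hypothetical project/dataset/table names.
cmd = bq_query_cmd("SELECT name FROM `my_project.my_dataset.users` LIMIT 10")
```

From client libraries, the same query runs via `google.cloud.bigquery.Client().query(sql).result()`.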
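The three Composer steps under "How do you create an Airflow DAG in Cloud Composer?" map onto a single DAG file. This is a configuration-style sketch assuming Airflow 2.x import paths; the `dag_id`, schedule, owner, and task names are all illustrative:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

# Illustrative default_args applied to every task in the DAG.
default_args = {
    "owner": "data-eng",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_daily_load",      # illustrative dag_id
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
) as dag:

    extract = BashOperator(
        task_id="extract",
        bash_command="echo extracting",
    )

    def transform():
        print("transforming")

    transform_task = PythonOperator(
        task_id="transform",
        python_callable=transform,
    )

    extract >> transform_task  # run extract before transform
```

Once this file lands in the `dags/` folder of the Composer environment's GCS bucket (e.g., via `gsutil cp`), Composer picks it up and schedules it automatically.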