GCP Python
2 weeks ago
Python knowledge: PySpark, Pandas, and Python objects
Knowledge of Google Cloud Platform
Google Cloud: Cloud Storage, Dataproc, BigQuery
SQL: strong SQL and advanced SQL
Spark: PySpark writing skills
DWH: data warehousing concepts and dimensional modeling
Git
Any GCP certification
Roles & Responsibilities:
Perform data analytics, warehousing, and ETL development
Design and build enterprise data warehouses and data marts, and deploy them in the cloud (GCP)
Perform descriptive analytics and reporting
Perform peer reviews of code, design documents, and test cases
Support systems currently live and deployed for customers
Build the knowledge repository and cloud capabilities
Excellent troubleshooting, attention to detail, and communication skills in a fast-paced setting
Work as part of a global team of engineers/consultants to provide customer support
Good understanding of writing Python code
Primary Skills: GCP BigQuery/Airflow, Advanced SQL, Python & PySpark
Secondary Skills: Data warehousing
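Since the role pairs advanced SQL with dimensional modeling, a minimal sketch of a star-schema query may help illustrate the kind of work involved. This uses Python's built-in sqlite3 in place of BigQuery, and every table and column name here is hypothetical, not from the posting.

```python
import sqlite3

# Hypothetical star schema: a sales fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                         date_key INTEGER REFERENCES dim_date(date_key),
                         amount REAL);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales VALUES (1, 20240101, 100.0),
                              (2, 20240101, 50.0),
                              (3, 20240201, 75.0);
""")

# Aggregate fact rows by a dimension attribute -- the core pattern
# behind most data-mart reporting queries.
rows = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2024, 1, 150.0), (2024, 2, 75.0)]
conn.close()
```

On BigQuery the same fact/dimension join would be written in Standard SQL; only the client library and table references would change.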
**About Virtusa**
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a global team of 27,000 people who care about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
-
Python-GCP
4 days ago
Bengaluru, Mumbai, Pune, India Alike Thoughts Full time ₹ 6,00,000 - ₹ 18,00,000 per year
Location: Mumbai, Bangalore, Pune, Chennai, Hyderabad, Kolkata, Noida, Kochi, Coimbatore, Mysore, Nagpur, Bhubaneswar, Indore, Warangal
Job description
Experience Required: 8 to 12 years of professional experience in software development
Mandatory Skills: Programming frameworks; strong hands-on experience with Python; proficient in API frameworks such as FastAPI and...
-
GCP BigQuery Developer with SQL
1 week ago
Pune, India IMR Soft LLC Full time
Job Title: GCP BigQuery Developer with SQL & Python
No. of Positions: 8
Location: Pune, Bangalore, Hyderabad, Chennai
Job Type: Full-time
Experience Level: Mid to Senior (7+ Years)
Job Summary: We are looking for a talented GCP BigQuery Developer with strong SQL skills and basic proficiency in Python to join our data engineering team. The ideal candidate...
-
DevOps+GCP
5 days ago
Pune, Maharashtra, India Niyara Consulting Full time
Hello Everyone,
Job Role: DevOps+GCP
Experience: 4-8 years
Job Location: Pune (Hybrid)
Notice period: Immediate joiner
**Key Responsibilities**:
- Design, develop, and maintain infrastructure as code using **Terraform** on GCP.
- Build and manage CI/CD pipelines using **Jenkins**, **Groovy**, **Cloud Build**, or other tools to support continuous integration...
-
Data Engineer with Python, SQL, GCP
4 days ago
Pune, Maharashtra, India iitjobs, Inc. Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Greetings
Role: Data Engineer with Python, SQL, GCP
Location: Pune/Hyderabad - Hybrid (3 days work from office)
Exp Range: 6+ Yrs
Notice Period: Immediate Joiners
Duration: 6+ months Contract
Note: Interview will be face to face.
Mandatory Skills:
- Prior experience working on a conv/migration HR project.
- Common Skills: SQL, GCP BQ, ETL pipelines using...
-
GCP-T3
3 weeks ago
Pune, India Virtusa Full time
GCP-T3 - CREQ
Description
Responsibilities: Working in a development environment to implement AI/ML tools and enable application deployments to GCP (Google Cloud Platform). Should have expert hands-on knowledge of integrating, configuring, deploying, and managing centrally provided common cloud services. Ensure compliance with security requirements, fix violations, and...
-
DevOps with GCP and Some Python Experience
1 week ago
Pune, India NSR Information systems Full time
Experience level: 2 to 6 years (comp range: INR 8 - 20 LPA)
- Experience in spinning up/troubleshooting the following GCP services for data engineering teams: BigQuery, Cloud SQL, Pub/Sub, Dataproc, etc. (no experience expected around working with or using these GCP services)
- Experience of working on Terraform, Jenkins, and Ansible using Python and/or Bash...
-
GCP Data Engineer
2 days ago
Pune, India NPG Consultants Full time
We're looking for a cloud-savvy Data Engineer to architect and optimize data pipelines on Google Cloud Platform. If you thrive in Python and live for scalable solutions, this one's for you.
Must-Have Skills:
- Python for data wrangling and automation
- BigQuery for high-performance analytics
- Apache Airflow for orchestration
- Cloud SQL for relational data...
-
Python+GCP Professional
1 week ago
Hyderabad, Pune, India IDESLABS PRIVATE LIMITED Full time ₹ 20,00,000 - ₹ 25,00,000 per year
At least 8+ years of experience in any of the ETL tools: Prophecy, DataStage 11.5/11.7, Pentaho, etc.
At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines.
Strong experience in writing complex SQL queries to perform data analysis on databases: SQL Server, Oracle, Hive, etc.
Possess the following...
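Several of these listings ask for experience configuring ETL pipelines. As a self-contained sketch of the extract-transform-load pattern they refer to — using only the Python standard library in place of Dataproc/BigQuery, with made-up field names and data — the three stages might look like:

```python
import csv
import io
import sqlite3

# Hypothetical source data standing in for a cloud-storage extract.
RAW = """id,name,salary
1,alice,1000
2,bob,2000
3,carol,3000
"""

def extract(text):
    """Parse CSV text into dictionaries (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Normalize names and cast numeric fields (the 'transform' step)."""
    return [(int(r["id"]), r["name"].title(), float(r["salary"])) for r in rows]

def load(records, conn):
    """Write transformed records to a warehouse table (the 'load' step)."""
    conn.execute("CREATE TABLE IF NOT EXISTS employees (id INTEGER, name TEXT, salary REAL)")
    conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(salary) FROM employees").fetchone()[0]
print(total)  # 6000.0
```

In a real GCP pipeline the same three stages would typically be PySpark jobs on Dataproc loading into BigQuery, orchestrated by Airflow; the structure is the same, only the engines differ.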