GCP Data Modeler
GCP Data Modeler with BigQuery, Dataflow, LookML, Looker, SQL, Python
Location - Bangalore, Hyderabad, Nagpur, or Chennai
Role - Full-time
Availability to join - Immediate
Years of experience - 10-15 years
**Mandatory Skills**
GCP Data Modeler with BigQuery, Dataflow, LookML, Looker, SQL, Python

**Job Description: Senior Data Modeler with Expertise in GCP and Looker**
**Key Responsibilities**:
- Data Modeling:
  - Design, develop, and maintain conceptual, logical, and physical data models to support data warehousing and analytics needs.
  - Ensure data models are scalable, efficient, and aligned with business requirements.
- Database Design:
  - Create and optimize database schemas, tables, views, indexes, and other database objects in Google BigQuery.
  - Implement best practices for database design to ensure data integrity and performance.
- ETL Processes:
  - Design and implement ETL (Extract, Transform, Load) processes to integrate data from various source systems into BigQuery.
  - Use tools such as Google Cloud Dataflow, Apache Beam, or other ETL tools to automate data pipelines.
- Data Integration:
  - Work closely with data engineers to ensure seamless integration and consistency of data across different platforms.
- Data Governance:
  - Implement data governance practices to ensure data quality, consistency, and security.
  - Define and enforce data standards, naming conventions, and documentation.
- Performance Optimization:
  - Optimize data storage, processing, and retrieval to ensure high performance and scalability.
  - Use partitioning, clustering, and other optimization techniques in BigQuery (see the sketch after this list).
- Collaboration:
  - Collaborate with business stakeholders, data scientists, and analysts to understand data requirements and translate them into effective data models.
  - Provide technical guidance and mentorship to junior team members.
- Data Visualization:
  - Work with data visualization tools like Looker, Looker Studio, or Tableau to create interactive dashboards and reports.
  - Develop LookML models in Looker to enable efficient data querying and visualization.
- Documentation:
  - Document data models, ETL processes, and data integration workflows.
  - Maintain up-to-date documentation to facilitate knowledge sharing and onboarding of new team members.
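For illustration, a minimal sketch of the database design and performance optimization responsibilities above, assuming the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.sales.fact_orders", schema=schema)

# Partition by the date column so queries that filter on order_date
# scan only the relevant partitions.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
)

# Cluster by customer_id to co-locate rows that are frequently
# filtered or joined on that key.
table.clustering_fields = ["customer_id"]

table = client.create_table(table)  # raises if the table already exists
print(f"Created {table.full_table_id}")
```

Day partitioning plus clustering keeps scans bounded to the dates and keys a query actually filters on, which is the main lever for BigQuery cost and performance.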
**Required Expertise**:
- Looker: 2-5+ years of strong proficiency in Looker, including LookML, dashboard creation, and report development.
- BigQuery: 5+ years of extensive experience with Google BigQuery, including data warehousing, SQL querying, and performance optimization.
- SQL & Python: 10+ years of advanced SQL and Python skills for data manipulation, querying, and modeling.
- ETL: 10+ years of hands-on experience with ETL processes and tools for data integration from various source systems (a minimal pipeline sketch follows this list).
- Cloud Services: Familiarity with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, and Dataflow.
- Data Modeling Techniques: Proficiency in various data modeling techniques such as star schema, snowflake schema, normalized and denormalized models, and dimensional modeling. Knowledge of data modeling frameworks and methodologies, including Data Mesh, Data Vault, Medallion architecture, and the Kimball and Inmon approaches, is highly advantageous.
- Problem-Solving: Excellent problem-solving skills and the ability to work on complex, ambiguous projects.
- Communication: Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Project Delivery: Proven track record of delivering successful data projects and driving business value through data insights.
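A minimal Apache Beam sketch of the ETL pattern referenced above: read raw CSV files from Cloud Storage, transform each record, and append the rows to BigQuery. The bucket, table, and field names are hypothetical, and the target table is assumed to already exist:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line: str) -> dict:
    # Split one CSV line into a BigQuery-ready row dict.
    order_id, customer_id, order_date, amount = line.split(",")
    return {
        "order_id": order_id,
        "customer_id": customer_id,
        "order_date": order_date,
        "amount": float(amount),
    }


def run() -> None:
    # On GCP, pass --runner=DataflowRunner plus project/region options
    # to execute this same pipeline on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://my-bucket/orders/*.csv",  # hypothetical bucket
                skip_header_lines=1,
            )
            | "ParseRows" >> beam.Map(parse_csv)
            | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:sales.fact_orders",  # hypothetical, assumed to exist
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline code runs locally under the DirectRunner, which keeps development and production pipelines identical; only the runner options change.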
**Preferred Qualifications**:
- Education: Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or a related field.
- Certifications: Google Cloud certifications relevant to data modeling or data engineering (e.g., Professional Data Engineer).
- Visualization Tools: Experience with additional data visualization tools such as Looker Studio and Tableau.
- Programming: Familiarity with programming languages such as Python for data manipulation and analysis.
- Data Warehousing: Knowledge of data warehousing concepts and best practices (a minimal star-schema sketch appears below).
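To make the dimensional modeling and data warehousing points concrete, here is a hypothetical star schema (one fact table, two dimension tables) created in BigQuery via standard SQL DDL issued from Python; all dataset, table, and column names are illustrative:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Multi-statement DDL script: BigQuery runs all statements in one query job.
DDL = """
-- Dimension: one row per customer, with denormalized attributes.
CREATE TABLE IF NOT EXISTS sales.dim_customer (
  customer_id STRING NOT NULL,
  customer_name STRING,
  region STRING
);

-- Dimension: one row per calendar date.
CREATE TABLE IF NOT EXISTS sales.dim_date (
  date_key DATE NOT NULL,
  year INT64,
  month INT64
);

-- Fact: one row per order, keyed to the dimensions above.
CREATE TABLE IF NOT EXISTS sales.fact_orders (
  order_id STRING NOT NULL,
  customer_id STRING NOT NULL,  -- joins to dim_customer.customer_id
  order_date DATE NOT NULL,     -- joins to dim_date.date_key
  amount NUMERIC
);
"""

client.query(DDL).result()  # wait for the DDL script to finish
```

In Looker, each of these tables would typically back a LookML view, with the fact table serving as the base of the explore and the dimensions joined on their keys.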