Data Engineer GCP
Diverse Global Solutions Provider Corp

Highlights:

3 - 6 Years
10 - 20 INR (Lacs)/Yearly
Full-time
Mumbai

Requirements

Experience

3-4 years of hands-on experience as a Data Engineer with a strong focus on Google Cloud Platform, including building and maintaining production-grade data pipelines and infrastructure on GCP.

Qualifications

Bachelor's or Master's degree in Statistics / Applied Statistics

Primary Skill Requirements

Google Cloud Platform Expertise:

1. Advanced proficiency in BigQuery (SQL, DML/DDL, optimization techniques)
2. Experience with Cloud Dataflow for batch and streaming data processing
3. Hands-on experience with Cloud Composer/Apache Airflow for orchestration
4. Implementation experience with Cloud Storage, Cloud SQL, Cloud Spanner, and Bigtable
5. Cloud Pub/Sub for event-driven architectures
6. Cloud Functions and Cloud Run for serverless computing
7. Dataproc for managed Spark/Hadoop workloads
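To make item 5 above (Cloud Pub/Sub for event-driven architectures) concrete, here is a minimal sketch of the publish/subscribe pattern using an in-memory broker. This is plain Python illustrating the pattern only; the class, topic name, and message below are illustrative assumptions, not the google-cloud-pubsub client API.

```python
from collections import defaultdict
from typing import Callable

class InMemoryPubSub:
    """Toy publish/subscribe broker illustrating the event-driven
    pattern behind Cloud Pub/Sub (not the real client library)."""

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[bytes], None]) -> None:
        # Register a callback to receive every message published to `topic`.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, data: bytes) -> None:
        # Deliver the message to all subscribers of the topic.
        for callback in self._subscribers[topic]:
            callback(data)

# Usage: a pipeline stage reacting to a hypothetical new-file event.
broker = InMemoryPubSub()
received = []
broker.subscribe("new-files", lambda msg: received.append(msg.decode()))
broker.publish("new-files", b"gs://bucket/2024-01-01/data.csv")
```

In a real deployment the broker would be a Pub/Sub topic and the callback a Cloud Function or Dataflow streaming pipeline subscribed to it.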

Programming & Tools:

1. Strong programming skills in Python, Java, or Scala
2. Proficiency in SQL and NoSQL databases
3. Experience with the Apache Beam SDK for data processing
4. Infrastructure as Code using Terraform or Cloud Deployment Manager
5. Version control with Git and CI/CD pipelines

Data Engineering Concepts:

1. ETL/ELT design patterns and best practices
2. Data modeling (dimensional, normalized, denormalized)
3. Data warehousing and data lake architectures
4. Stream processing and real-time analytics
5. Data partitioning, sharding, and optimization strategies
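Item 5 above (partitioning and sharding) can be sketched in a few lines: a daily partition key of the kind used for date-partitioned tables, and a deterministic hash-based shard assignment. The shard count and key names are illustrative assumptions.

```python
import hashlib
from datetime import datetime

NUM_SHARDS = 8  # illustrative shard count

def partition_key(event_time: datetime) -> str:
    """Daily partition key in YYYYMMDD form, the convention used
    for date-partitioned warehouse tables."""
    return event_time.strftime("%Y%m%d")

def shard_id(record_key: str, num_shards: int = NUM_SHARDS) -> int:
    """Stable hash-based shard assignment. md5 keeps the mapping
    deterministic across processes, unlike Python's built-in hash()."""
    digest = hashlib.md5(record_key.encode()).hexdigest()
    return int(digest, 16) % num_shards

print(partition_key(datetime(2024, 5, 17)))  # 20240517
```

The same two ideas underlie BigQuery date partitioning and Bigtable row-key design: pick a key that spreads writes evenly while keeping related rows together for range scans.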
