GCP Data Engineer
Overview
We are looking for someone who knows several GCP services (at minimum Dataproc, Dataflow, Pub/Sub, GCS, and Terraform) and how their configuration affects service behavior. The candidate should be able to explain the behavior of all these services and the performance impact of the various configuration parameters.
No. of Vacancies
1
Specific Skills
GCP,
Dataflow,
BigQuery,
SQL, Python, Scala
Responsible For
- At least 7 years of Information Technology experience.
- 4 to 5 years of experience in data engineering on GCP with technologies such as Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, and BigQuery.
- ETL development experience with a strong SQL background, plus tools and languages such as Python/R, Scala, Java, Hive, Spark, and Kafka.
- Strong Python development skills to build reusable frameworks and enhance existing ones.
- Application build experience with core GCP services such as Dataproc, GKE, and Composer.
- Deep understanding of GCP IAM and GitHub.
- Must have hands-on experience with IAM setup.
- Knowledge of CI/CD pipelines using Terraform and Git.
- Good knowledge of Google BigQuery, using advanced SQL techniques to build BigQuery datasets in the ingestion and transformation layers.
- Experience in relational modelling, dimensional modelling, and modelling of unstructured data.
- Knowledge of Airflow DAG creation, execution, and monitoring.
- Good understanding of Agile software development frameworks.
- Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams.
- Experience and desire to work in a global delivery environment.
Job Nature
Full Time
Experience Requirements
7+ years
Job Location
Chennai
Job Level
Sr. Position
