Google Cloud, Data Warehouse, Data Engineering, ETL Process
Experience (Years): 10 and above
Essential Skills:
- 10+ years of experience with data warehouses / data platforms.
- 5+ years of experience creating ELT data pipelines from scratch, working with structured, semi-structured, and unstructured data and with SQL (see the pipeline sketch after this list).
- 2+ years of experience configuring and using data ingestion tools such as Fivetran, Qlik, or Airbyte.
- 5+ years of experience with cloud platforms, specifically GCP.
- 5+ years of experience as a data developer working on data engineering, programming, and ETL/ELT processes for data integration.
- 5+ years of experience with continuous integration and continuous deployment (CI/CD) pipelines, source control systems such as GitHub and Bitbucket, and infrastructure-as-code tools such as Terraform.
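
As a rough illustration of the ELT-on-GCP experience listed above, the sketch below loads raw files from Cloud Storage into BigQuery and then transforms them with SQL inside the warehouse. It is a minimal example, not a description of the role's actual stack: the project, bucket, dataset, and table names are hypothetical placeholders, and it assumes the google-cloud-bigquery Python client.

```python
# Minimal ELT sketch: load raw JSON from Cloud Storage into BigQuery,
# then transform it with SQL inside the warehouse (the "T" after the "L").
# Project, bucket, dataset, and table names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

RAW_TABLE = "my-gcp-project.raw.events"
CLEAN_TABLE = "my-gcp-project.analytics.events_clean"

# Extract + Load: ingest newline-delimited JSON files straight into a raw table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/*.json",  # hypothetical source path
    RAW_TABLE,
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,                   # infer schema from the data
        write_disposition="WRITE_APPEND",  # append to the raw table
    ),
)
load_job.result()  # wait for the load to finish

# Transform: deduplicate and type-cast inside BigQuery with plain SQL.
transform_sql = f"""
CREATE OR REPLACE TABLE `{CLEAN_TABLE}` AS
SELECT DISTINCT
  CAST(event_id AS STRING)  AS event_id,
  TIMESTAMP(event_ts)       AS event_ts,
  LOWER(TRIM(user_email))   AS user_email
FROM `{RAW_TABLE}`
WHERE event_id IS NOT NULL
"""
client.query(transform_sql).result()
```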
Role Description:
- Experience creating ELT data pipelines from scratch
- Experience configuring and using data ingestion tools such as Fivetran, Qlik, and Airbyte.
- Experience in data modelling, manipulating large data sets with raw SQL, and applying other data cleaning techniques (see the SQL sketch after this list).
- Experience working with structured, semi-structured, and unstructured data.
- Experience building data pipelines, and composable cloud-based data platforms in AWS, Azure, or GCP.
- Experience collaborating with DevOps and Scrum teams.
- Experience working with source control systems such as GitHub and Bitbucket.
- Prior experience as a data developer working on data engineering, programming, and ETL/ELT processes for data integration.
- Demonstrated team player with strong communication skills and a track record of successful product delivery.
- Expert at problem solving.
- Good understanding of continuous integration and continuous deployment (CI/CD) pipelines.
- Strong scripting skills.
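
For the data modelling and SQL cleaning work referenced above, the following is a small hedged sketch rather than a prescribed approach: the table, column, and JSON field names are hypothetical, and it uses BigQuery Standard SQL through the same Python client shown earlier.

```python
# Sketch of a typical cleaning/modelling query: flatten a semi-structured
# JSON payload, standardise values, and keep only the latest record per key.
# All table and field names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

cleaning_sql = """
CREATE OR REPLACE TABLE `my-gcp-project.analytics.orders_modelled` AS
SELECT
  order_id,
  JSON_VALUE(payload, '$.customer.id')                  AS customer_id,
  SAFE_CAST(JSON_VALUE(payload, '$.total') AS NUMERIC)  AS order_total,
  UPPER(TRIM(country_code))                             AS country_code
FROM `my-gcp-project.raw.orders`
WHERE order_id IS NOT NULL
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY order_id ORDER BY ingested_at DESC
) = 1   -- deduplicate: keep the most recently ingested row per order
"""
client.query(cleaning_sql).result()
```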