Senior ETL Data Engineer with GCP experience to build data ingestion pipelines using tools such as Fivetran, Qlik, or Airbyte with one of our major banking clients.
Location : Toronto - Hybrid model (1 day a week in office; no specific days)
Contract Duration : 3 months, with possible extension beyond February and potential conversion to FTE, depending on performance and funding approval.
Schedule Hours : 9 am-5 pm, Monday-Friday, 37.5 hours per week (no overtime; any exceptions to be discussed with the hiring manager if needed)
Story Behind the Need
Business Group :
- GWRT Data and Analytics Technology (OU) - builds the data pipelines and analytics for the bank.
- Project : International Banking (IB) Salesforce Effectiveness - support migrating IB Commercial Banking data to Google Cloud Platform (GCP) and to CB Commercial Banking's Salesforce Financial Services Cloud (FSC) instance, creating a global commercial Salesforce org.
A total of 10 people are working on this project, which is currently in pre-planning.
Typical Day in Role :
- Design, develop, and maintain robust data pipelines for ingestion, transformation, and distribution of large datasets.
- Utilize services and tools to automate data workflows and streamline the data engineering process.
- Collaborate with stakeholders and product managers to analyze requirements and build data mappings, models, and reports.
- Monitor application and pipeline performance.
- Conduct data quality checks.
Must Have Skills :
- 10+ years of experience as a Data Engineer with Data Warehouse / Data Platforms
- 5+ years of experience creating ELT data pipelines from scratch, working with structured, semi-structured, and unstructured data in SQL.
- 2+ years of experience configuring and using data ingestion tools such as Fivetran, Qlik, or Airbyte.
- 5+ years of experience with cloud platforms, specifically GCP.
- 5+ years of experience as a data developer or data engineer: programming, ETL/ELT, and data integration processes.
- 5+ years with continuous integration and continuous deployment (CI/CD) pipelines, source control systems such as GitHub and Bitbucket, and Terraform.
Nice-To-Have Skills :
- Experience in data modelling, manipulating large datasets, writing raw SQL, and applying data cleaning techniques.
- Python
- dbt
Education & Certificates :
Bachelor's degree in a technical field such as computer science, computer engineering, or a related field, or equivalent experience.
Best vs. Average Candidate :
The ideal candidate is someone who can quickly adapt to any change, solve problems effectively, and implement sustainable solutions.
Interview Process : 1st round with the hiring manager and tech lead via MS Teams - a get-to-know discussion of your work experience; the manager will ask questions to assess previous experience.
Examples : scenario-based and behavioural questions.