Data Engineer with DBT and Azure Cloud Resources (ADLS, ADF, DataFlow).
Location: Toronto (Greater Toronto Area) - Hybrid
Employment Type: 12+ Month Contract
Job Description: We are looking for a highly skilled data engineer to design, build, and maintain scalable data solutions while ensuring data quality and driving data innovation.
Requirements:
- Over 3 years of experience working with Azure Cloud Resources (ADLS, ADF, DataFlow).
- At least 2 years of hands-on experience with DBT (required).
- Proficiency in advanced SQL, data integration, and data modeling principles.
- Experience with Python programming and Git for version control.
- Familiarity with DevOps practices for automation and deployment.
- Strong technical and problem-solving skills.
Responsibilities:
- Data Development & Integration: Design and build data solutions using Azure Cloud Resources (ADLS, ADF, DataFlow) in accordance with organizational standards to ensure resilience and scalability.
- Data Modeling & Metadata Management: Create, update, and maintain data models based on business needs, and manage metadata repositories to ensure accurate and up-to-date information.
- ETL Processes & Data Preparation: Extract, transform, and load data from various sources using DBT, ensuring data quality, integrity, and consistency across datasets.
- Programming & Scripting: Develop scripts and programs in Python, using Git for version control and code management.
- DevOps & Automation: Implement DevOps practices to automate workflows and enable continuous integration and deployment.
- Problem Resolution & Advanced SQL: Use advanced SQL skills to diagnose and resolve issues within databases, data products, and processes.
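To illustrate the extract-transform-load responsibility above, here is a minimal, tool-agnostic Python sketch of the pattern; in this role the same steps would typically live in ADF pipelines and DBT models, and all names and the quality rule below are illustrative assumptions, not part of the posting.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV text from a source system into row dicts.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: normalize types and enforce a basic data-quality rule
    # (hypothetical rule: "amount" is required and must be numeric).
    cleaned = []
    for row in rows:
        amount = row.get("amount", "").strip()
        if not amount:
            continue  # reject rows that fail the quality check
        cleaned.append({"id": row["id"], "amount": float(amount)})
    return cleaned

def load(rows: list[dict], target: list) -> None:
    # Load: append cleaned rows to the target store (in-memory here;
    # in practice this would be a warehouse table).
    target.extend(rows)

raw = "id,amount\n1,10.5\n2,\n3,4.0\n"
store: list = []
load(transform(extract(raw)), store)
# store now holds the two rows that passed the quality check
```

In a DBT project, the `transform` step would instead be expressed as versioned SQL models with tests, which is what makes the hands-on DBT experience above central to the role.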
Posted 1 day ago