
GCP Data Engineer

Tech Mahindra
Toronto, Ontario, Canada
$94.5K-$135K a year
Full-time

About Us :

At Tech Mahindra (Tech Mahindra Connected World, Connected Experiences), we live the philosophy of a connected world and connected experiences.

We thrive on change that is powered by the intelligent symphony of technology and humans designing meaningful and sustainable experiences.

Consumer experiences are driving and disrupting industries like never before. Businesses must build seamless yet simple enterprises that collaborate, synergize, and drive the change.

Change that connects us all and empowers us to deliver experiences that span across the digital, the physical, the convergent, and everything in between.

That’s when truly connected experiences manifest.

Extraordinary is when experiences come together in a continuous convergence of digital technologies, touchpoints, and, most importantly, people.

It’s time to reimagine, reinvent, and revolutionize business models & operations as well as to transform enterprises into living, breathing, and connected businesses.

We are the Digital Changemakers who strive to change the way the world, communities, businesses, and humans interact digitally.

We are harnessing the power of change brought in by technology, which makes this the most exciting time to be alive in human history.

Our universe, as we build it, disrupt it, and redesign it, is powering the digital change.

Tech Mahindra represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates, and the Society to Rise.

It has 150,000+ professionals working for 1000+ Global Customers (including Fortune 500 companies) in 90 Countries. We’re part of the esteemed Mahindra group, headquartered in India.

Under a new CEO, Tech Mahindra is committed to a transformative journey with 'Scale @ Speed' as our guiding principle.

About the Role and Job :

Position : GCP Data Engineer

Location : Toronto

Work Model : Hybrid

What is the role

  • Design, develop and maintain robust data pipelines for ingestion, transformation, and distribution of large datasets.
  • Utilize services and tools to automate data workflows and streamline the data engineering process.
  • Collaborate with stakeholders and product managers to analyze requirements and build data mappings, models, and reports.
  • Monitor application and pipeline performance.
  • Conduct data quality checks.

Experience :

  • Experience creating ELT data pipelines from scratch
  • Experience configuring and using data ingestion tools such as Fivetran, Qlik, and Airbyte.
  • Experience in data modelling, manipulating large datasets with raw SQL, and applying data cleaning techniques.
  • Experience working with structured, semi-structured, and unstructured data.
  • Experience building data pipelines with GCP services such as Dataproc, BigQuery, Cloud Spanner, Cloud Run functions, Dataflow, and Pub/Sub.
  • Experience collaborating and working with DevOps and Scrum Teams
  • Prior experience as a data developer with data engineering, programming, and ETL/ELT processes for data integration.
  • Demonstrated team player with strong communication skills and a track record of successful delivery of product development.
  • Expert at problem solving.
  • Good understanding of continuous integration and continuous deployment (CI/CD) pipelines.
  • Strong scripting skills
  • Experience working with source control systems such as GitHub and Bitbucket.

Technical Skills

  • 10+ years of experience with Data Warehouse / Data Platforms
  • 5+ years of experience creating ELT data pipelines from scratch, working with structured, semi-structured, and unstructured data and SQL.
  • 2+ years of experience configuring and using data ingestion tools such as Fivetran, Qlik, Airbyte or others.
  • 5+ years of experience with cloud platforms, specifically GCP.
  • 5+ years of experience working as a data developer on data engineering, programming, and ETL/ELT processes for data integration.
  • 5+ years of experience with continuous integration and continuous deployment (CI/CD) pipelines, source control systems such as GitHub and Bitbucket, and Terraform.

The pay range for this role is $94,500 to $135,000 per annum, including any bonuses or variable pay. Tech Mahindra also offers benefits such as medical, vision, dental, life, and disability insurance, and paid time off (including holidays, parental leave, and sick leave, as required by law).

Ask our recruiters for more details on our Benefits package. The exact offer terms will depend on the skill level, educational qualifications, experience and location of the candidate.

Tech Mahindra is an Equal Employment Opportunity employer. We promote and support a diverse workforce at all levels of the company.

All qualified applicants will receive consideration for employment without regard to race, religion, color, sex, age, national origin, or disability.

All applicants will be evaluated solely on the basis of their ability, competence, and performance of the essential functions of their positions with or without reasonable accommodations.

Reasonable accommodations also are available in the hiring process for applicants with disabilities. Candidates can request a reasonable accommodation by contacting the company ADA Coordinator at ADA [email protected].
