Our Company

At Kynetec, we're proud to be at the forefront of the intersection between agriculture, sustainability, and animal health. We're redefining our industry with unparalleled insights and leading technology, whilst pursuing an ambitious growth plan to extend our influence from the food on our plates to the health of our livestock and the care of our beloved pets at home.

We owe our success to our industry experts. They are the driving force behind our reputation as a global leader in the industry; their innovative ideas and expertise have helped us achieve new heights. From seasoned insights specialists and client leaders to innovative tech geniuses, what connects us is a shared passion for agriculture and animal health. We don't settle for "business as usual". Each day, we take strides towards transforming our industry and improving the lives of people and animals around the world. If you're looking for a company that challenges the norm and fosters a culture of innovation, Kynetec is the place for you.

Your Role

We are seeking an experienced Data Engineer to join our team. This position sits in our Technology, Infrastructure Team. It is a full-time remote position, although we do have an office in Guelph, Ontario to allow for in-person collaboration.

Do you have experience and certifications in Databricks? Do you have a robust background in IT infrastructure? Can you master Azure environments, ensuring applications are managed, licensed, and secure?
This may be the perfect opportunity for you!

Day-to-Day Tasks

- Design and build pipelines: create scalable, robust data pipelines using Databricks and Apache Spark.
- Optimise and troubleshoot: ensure data pipeline performance and reliability, and troubleshoot issues efficiently.
- Maintain data quality: implement data quality checks and ensure data integrity across various sources.
- Collaborate effectively: work closely with data scientists, analysts, and engineers to meet data requirements.
- Utilise Delta Lake and CI/CD: manage data lakes with Delta Lake and automate workflows using CI/CD pipelines.
- Stay updated and secure: keep up with the latest Databricks features, ensure data security, comply with regulations, and apply best practices.

Requirements

- Hands-on experience with Databricks, specifically on the infrastructure and environment side.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in DevOps practices and tools.
- Strong understanding of cloud platforms (AWS, Azure).
- 5+ years' experience with SQL.
- Experience with infrastructure as code (IaC) tools such as Terraform.
- Familiarity with CI/CD workflows.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Relevant certifications in cloud technologies and Databricks are advantageous.
- Experience with Python is preferred.
- Experience with Snowflake is preferred.

Next Steps

Please submit your application by applying directly to this vacancy on LinkedIn.

Interview process: an informal screening call with a Recruiter, a two-stage interview with the hiring team, and a short technical round.
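To illustrate the "data quality checks" duty listed under Day-to-Day Tasks, here is a minimal, hypothetical sketch in Python. All names (validate_rows, REQUIRED_FIELDS, the sample fields) are illustrative assumptions, not Kynetec's actual schema or code; in practice this kind of check would typically run inside a Databricks/Spark pipeline.

```python
# Hypothetical data-quality gate: split incoming records into valid and
# rejected sets based on required fields. Field names are made up for
# illustration only.

REQUIRED_FIELDS = {"farm_id", "species", "recorded_at"}

def validate_rows(rows):
    """Return (valid, rejected); rejected pairs each row with its missing fields."""
    valid, rejected = [], []
    for row in rows:
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            rejected.append((row, sorted(missing)))
        else:
            valid.append(row)
    return valid, rejected

sample = [
    {"farm_id": 1, "species": "cattle", "recorded_at": "2024-05-01"},
    {"farm_id": 2, "species": "swine"},  # missing recorded_at -> rejected
]
good, bad = validate_rows(sample)
print(len(good), len(bad))  # 1 valid row, 1 rejected row
```

In a real pipeline the rejected rows would usually be written to a quarantine table for review rather than silently dropped, which keeps the main tables clean while preserving data integrity across sources.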