Who we are
Joyride is the world's leading SaaS platform for micromobility, enabling businesses around the globe to launch, manage and grow their own branded fleets of bikes, scooters, mopeds and everything else smaller than a car.
We are passionate about transportation and changing the way people think about community connectivity. At Joyride, you'll get to work with a motivated and driven team to find creative solutions to exciting challenges in a rapidly evolving, fast-paced industry.
Job Overview:
We are seeking a highly skilled and experienced Data Engineer to join our team. As a Data Engineer, you will be responsible for handling large volumes of data from back-end APIs, mobile applications, and IoT devices.
Your primary responsibilities will be deriving meaningful insights from this data, developing and training machine learning models, and ensuring the efficient flow of data across multiple storage systems such as MySQL, PostgreSQL, and Google BigQuery.
Key responsibilities include, but are not limited to:
- Create and maintain machine learning models that provide useful predictions and insights.
- Design and develop scalable data processing pipelines that handle large volumes of data from multiple data sources.
- Develop and implement data storage and retrieval strategies that are efficient and scalable.
- Create and maintain databases using technologies like MySQL, PostgreSQL, and Google BigQuery.
- Implement and maintain data integration workflows between various data storage systems.
- Monitor and optimize the performance of data processing pipelines, databases, and machine learning models.
- Ensure the security and privacy of data by implementing appropriate access controls and encryption measures.
- Collaborate with other teams to understand their data requirements and provide solutions that meet their needs.
Desired qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in a similar role.
- Experience handling large volumes of data, preferably from IoT devices.
- Experience working with geospatial data and analyzing it at large scale.
- Strong programming skills in languages like Python, Java, and Scala.
- Strong experience in data storage and retrieval using technologies like MySQL, PostgreSQL, and Google BigQuery.
- Experience in designing and developing scalable data processing pipelines using technologies like Apache Spark, Apache Flink, or Kafka.
- Strong understanding of machine learning concepts and experience in developing and training machine learning models.
- Experience in implementing and maintaining data integration workflows between various data storage systems.
- Experience working with different cloud providers such as AWS, GCP, and Azure.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Working conditions:
- Health & Dental Insurance
- Hybrid office schedule
- Head office conveniently located close to Union Station
- ESOP
- Micromobility perks