Master Thesis Project within Computer Vision
Irisity is now looking for an engaged Master's student in the field of Computer Vision or Machine Learning who is interested in the following Master thesis project:
Reducing annotation efforts using transformer-based models on video data
The recently published Segment Anything Model (SAM) [1] suggests that the manual work required to produce accurate segmentation masks can be significantly reduced by using only light human supervision, such as a few point or box prompts per object.
The current SAM model operates on single frames only. It seems reasonable that SAM (or a similar model) could perform better by also accounting for preceding and subsequent frames in a video. Since Irisity has access to a large-scale dataset of annotated videos, extending a SAM-like model to exploit such data is of high interest.
The goal of this master thesis is to investigate and compare several ways of extending the SAM model to video data. Depending on the student's interests, this could be done either with the goal of further reducing human annotator input relative to the baseline SAM, or of improving fully unsupervised generalization.
Irisity has access to a large-scale database of annotated video data that could be used for evaluation and training, as well as computational resources for running experiments.
[1] Alexander Kirillov, Eric Mintun, Nikhila Ravi, Hanzi Mao, Chloe Rolland, Laura Gustafson, Tete Xiao, Spencer Whitehead, Alexander C. Berg, Wan-Yen Lo, Piotr Dollár, Ross Girshick. "Segment Anything." arXiv:2304.02643, 2023.
Sound interesting? We are looking forward to your application!
Location
Irisity AB headquarters in Lindholmen, Gothenburg. This project can be done remotely.
Duration
30 ECTS (one semester of full-time work)
Preferred Starting Date
January 2024 (flexible)
Please apply with your CV and grade transcripts. Selection and interviews are ongoing, so we recommend applying as soon as possible. Should you have any questions about this thesis project, reach out to Gustav at gustav.hager@irisity.com. For general questions regarding the process, please email Anna at anna.engqvist@irisity.com.
About Irisity
Irisity is a leading provider of AI-powered video analytics solutions. We develop deep learning-based algorithms upgrading security cameras into intelligent detection devices while safeguarding personal integrity. We believe that enhanced AI performance, ethics, and privacy go hand in hand, creating a positive mark within the camera security industry.
Irisity currently serves customers in more than 90 countries and has offices in Sweden, the USA, Singapore, the UAE, and Israel, and operates through a network of resellers, partners, security companies, and camera manufacturers globally.
As a global company, we value diverse teams that can contribute to our team success through their unique perspectives, experiences, and backgrounds.
Read more at www.irisity.com
- Team: Student Opportunities
- Location: Göteborg
- Remote status: Hybrid
About Irisity
We develop video analytics software for the security industry with respect for personal integrity and privacy. At sensitive installations such as schools, our system even anonymises all individuals to protect their identity. This way, the system can detect what happens and raise alerts, reducing crime and increasing safety without compromising personal integrity. We believe that enhanced AI performance, ethics, and integrity go hand in hand, and we are working to create a positive mark within the camera security industry.
Our product IRIS+ is a cloud-based SaaS platform marketed towards global customers, with thousands of cameras connected.
For further information about Irisity, please visit our website.