SiaSearch (prev. 'Redfish')

Video Clip Filtering Tool

client-project · typescript · react · redux · hooks · video-frame-data

A tool to filter large amounts of video clip data along semantic attributes. SiaSearch lets autonomous driving engineers quickly find relevant data for specific driving scenarios.

Challenge

When software engineers develop an autonomous driving algorithm, they have to test it against a vast variety of driving scenarios. Collecting large amounts of driving data through sensor recordings and camera clips is comparatively simple. But once you have the data, how do you filter all recordings by, say, three or more pedestrians, one left turn, high speed, at night?

Solution

The team at MX Automotive – a Merantix company – uses clever machine learning techniques to analyze driving data. They derive semantic attributes for every frame, resulting in a rich data set for autonomous driving engineers. We set out to build a filtering tool on top of that dataset: SiaSearch allows formulating queries to find matching video clip data. Using SiaSearch, engineers can also play the clips, check each frame against its semantic attributes, and finally export the relevant clip data.

My role

I built the web frontend for SiaSearch in 10 days, using TypeScript, React, hooks, and more. The tricky part was matching each video frame with the semantic attributes recorded closest to that frame, in order to synchronize updates between the map, the data table, and the video player. I liked the project a lot, because working with the whole team at MX Automotive was a breeze.
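The frame-to-attribute matching described above can be sketched as a nearest-timestamp lookup. This is a minimal illustration, not the actual SiaSearch code: the `SemanticAttribute` shape and the assumption that attributes are sorted by a per-clip timestamp are hypothetical.

```typescript
// Hypothetical shape: each semantic attribute is stamped with the
// recording time (seconds into the clip) of the frame it describes.
interface SemanticAttribute {
  timestamp: number;
  label: string;
}

// Binary search for the attribute whose timestamp is closest to the
// player's current time. Assumes `attributes` is sorted by timestamp.
function closestAttribute(
  attributes: SemanticAttribute[],
  currentTime: number
): SemanticAttribute | undefined {
  if (attributes.length === 0) return undefined;
  let lo = 0;
  let hi = attributes.length - 1;
  // Narrow down to the first attribute with timestamp >= currentTime.
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (attributes[mid].timestamp < currentTime) lo = mid + 1;
    else hi = mid;
  }
  // Compare with the predecessor to pick whichever is actually closer.
  if (
    lo > 0 &&
    currentTime - attributes[lo - 1].timestamp <
      attributes[lo].timestamp - currentTime
  ) {
    return attributes[lo - 1];
  }
  return attributes[lo];
}
```

On each `timeupdate` event of the video element, a lookup like this would select the attribute row to highlight in the table and the position to show on the map, keeping all three views in sync.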


2021 © Kai-Adrian Rollmann
All rights reserved