Computer vision for the blind

Computer vision is a powerful technology that is already solving real problems in the real world today. It holds the potential to significantly improve life on Earth over the coming decades. At Rerun we have the privilege of working directly with developers who are building that future. From time to time we will introduce companies building computer vision products for the real world. The first company we want to introduce is biped.

Translating vision to audio

biped is a Swiss robotics startup that uses self-driving technology to help blind and visually impaired people walk safely. The shoulder harness they develop embeds ultra-wide depth and infrared cameras, a battery, and a computation unit. The software running on the device estimates the positions and trajectories of all surrounding obstacles, such as poles, pedestrians, and vehicles, to predict potential collisions. Users are warned with 3D audio feedback: short sounds, similar to a car's parking aid, that convey the direction, elevation, velocity, and type of each obstacle. The device also provides GPS instructions for full navigation assistance.
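As a purely illustrative sketch (not biped's actual code), mapping an obstacle's relative position and closing speed to the parameters of such a spatialized cue could look roughly like this; all names, units, and thresholds below are invented for the example:

```python
import math
from dataclasses import dataclass

@dataclass
class Obstacle:
    x: float              # meters to the user's right
    y: float              # meters above the sensors
    z: float              # meters ahead of the user
    closing_speed: float  # m/s towards the user
    kind: str             # e.g. "pedestrian", "pole", "vehicle"

def audio_cue(obs: Obstacle) -> dict:
    """Turn an obstacle into parameters for a short spatialized beep (toy example)."""
    azimuth = math.atan2(obs.x, obs.z)    # left/right direction of the sound
    elevation = math.atan2(obs.y, obs.z)  # up/down direction of the sound
    urgency = min(1.0, max(0.0, obs.closing_speed / 5.0))
    return {
        "azimuth_rad": azimuth,
        "elevation_rad": elevation,
        "beeps_per_second": 1.0 + 4.0 * urgency,  # faster approach -> faster beeping
        "timbre": obs.kind,                       # distinct sound per obstacle type
    }

print(audio_cue(Obstacle(x=1.0, y=-0.4, z=4.0, closing_speed=2.0, kind="pedestrian")))
```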

biped is one of the most advanced navigation devices for pedestrians and seeks to improve the independence of blind and visually impaired people across the world.

A demo of biped being used out in the wild, including audio feedback users receive

A complex pipeline with complex data

At a high level, biped’s software does the following in real time (a rough sketch in code follows the list):

  1. Sequentially acquire image and depth data from all cameras, fusing the different inputs into a single unified 3D representation
  2. Run perception algorithms such as obstacle segmentation or object detection
  3. Prioritize the most important elements based on collision risk
  4. Create 3D audio feedback to describe the prioritized elements of the environment
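Read as code, these four steps form a single real-time loop. The sketch below is purely illustrative: every function is a toy stand-in rather than biped's implementation, and the real perception and prioritization stages are far more involved.

```python
import numpy as np

def fuse(per_camera_points):
    """Step 1 (toy): merge per-camera point clouds into one array."""
    return np.concatenate(per_camera_points, axis=0)

def detect_obstacles(points):
    """Step 2 (toy): pretend the first few points are detected obstacles."""
    return [{"position": p, "kind": "unknown"} for p in points[:10]]

def collision_risk(obstacle):
    """Step 3 (toy): closer obstacles score higher."""
    return 1.0 / (1e-3 + float(np.linalg.norm(obstacle["position"])))

def play_audio_feedback(obstacle):
    """Step 4 (toy): stand-in for rendering a spatialized sound."""
    print(f"beep towards {obstacle['position']} ({obstacle['kind']})")

# One iteration of the loop, with random arrays standing in for two depth cameras.
frames = [np.random.rand(200, 3) * 5.0, np.random.rand(200, 3) * 5.0]
cloud = fuse(frames)
for obstacle in sorted(detect_obstacles(cloud), key=collision_risk, reverse=True)[:3]:
    play_audio_feedback(obstacle)
```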

A small change anywhere in the pipeline can significantly affect downstream tasks and thus the system's overall performance. For example, the quality of environment understanding strongly affects the prioritization algorithm. biped employs different strategies to counter this problem and to make development as easy as possible. One of them is to visualize intermediate steps of the pipeline with Rerun, which lets the development team quickly see how each change affects the whole pipeline.
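As a rough illustration of what logging intermediate steps can look like with the Rerun Python SDK (using today's API rather than the alpha-era one biped started with, and with random arrays standing in for real pipeline outputs):

```python
import numpy as np
import rerun as rr

rr.init("biped_pipeline_demo", spawn=True)  # start and connect to the Rerun viewer

for frame in range(10):
    rr.set_time_sequence("frame", frame)

    # Random arrays stand in for real intermediate outputs.
    rgb = (np.random.rand(120, 160, 3) * 255).astype(np.uint8)
    fused_points = np.random.rand(500, 3) * 5.0
    obstacle_points = fused_points[:20]

    rr.log("camera/rgb", rr.Image(rgb))               # raw camera input
    rr.log("world/fused", rr.Points3D(fused_points))  # unified 3D representation
    rr.log("world/obstacles", rr.Points3D(obstacle_points, radii=0.05))  # detections
```

Logging each stage to its own entity path this way makes it possible to scrub back through time in the viewer and compare how a change upstream propagates through the rest of the pipeline.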

A visualization of biped’s perception algorithms, built using Rerun.

biped is part of the Rerun alpha user program and recently shifted its internal visualizations to Rerun.

If you’re interested in getting early access to Rerun, join our waitlist.