An app that uses hand gesture recognition to trigger actions in the streaming program OBS.

ARKit, Auto Layout, Swift, UIKit, Vision framework, WebSockets

Focal Points.

  • Build a functioning prototype in 24 hours
  • Learn the basics of the ARKit and Vision frameworks and implement them in the app
  • Train a Core ML model to recognize different hand poses
  • Build a backend that communicates with the OBS streaming program using WebSockets
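The Vision-plus-Core ML pipeline above can be sketched roughly like this. Note this is a minimal illustration, not our exact code: `HandPoseClassifier` stands in for whatever Create ML model you train, and the OBS payload shown assumes the obs-websocket v4 request format.

```swift
import Vision
import Foundation

// Detect a hand in a camera frame and classify its pose with a Core ML model.
// `HandPoseClassifier` is a hypothetical Create ML-generated class.
func classifyHandPose(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    try? handler.perform([request])

    guard let observation = request.results?.first else { return }

    // Pack the recognized hand joints into the MLMultiArray shape the model expects.
    if let poseArray = try? observation.keypointsMultiArray() {
        // let prediction = try? HandPoseClassifier().prediction(poses: poseArray)
        // trigger an OBS action based on prediction?.label
    }
}

// Send a scene-switch request to OBS over obs-websocket
// (v4-style JSON shown; field names are an assumption).
func switchScene(to sceneName: String, over socket: URLSessionWebSocketTask) {
    let payload: [String: String] = [
        "request-type": "SetCurrentScene",
        "scene-name": sceneName,
        "message-id": "1"
    ]
    if let data = try? JSONSerialization.data(withJSONObject: payload),
       let json = String(data: data, encoding: .utf8) {
        socket.send(.string(json)) { _ in }
    }
}
```

Each recognized pose label maps to one OBS request, which is what lets a hand gesture drive the stream.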

The Process.

This Medium article, written by my teammate Alex Silver, is a great overview of how we built this project.

This project was the product of a hackathon called WhatTheHack, organized by Cyril Garcia and Ting Becker. The idea was to code a fully functioning app in 24 hours and present it to our peers. I managed to join a really cool team: @silvr, @Ezra_Black_, and @nedimcodes (Twitter handles). We got together and organized our idea before the event started so we could spend as much time coding as possible.

Things We Would Change or Improve.

  • Add support for more hand poses
  • Improve hand-pose detection accuracy
  • Refactor and improve app architecture, perhaps in SwiftUI