As we approach the halfway point of the project, our team took some time to plan out how we're going to tackle each feature to hit our MVP.
We met this week on 3/6 to discuss, and came away reinvigorated. We've finalized all of the timelines through the end of the project.
Bluetooth is successfully running on hardware, tested on Apple devices. The app now searches for services, and Isaiah is refining it to filter for the aiSight Bluetooth service UUID. We've also set up documentation for all of the characteristics and how to parse their binary data.
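To make the scanning flow concrete, here's a rough CoreBluetooth sketch of what the filtered search looks like. The service UUID and the little-endian parsing at the end are placeholders, not the real values from our characteristic docs.

```swift
import CoreBluetooth

// Placeholder UUID -- the real aiSight service UUID lives in our characteristic docs.
let aiSightServiceUUID = CBUUID(string: "0000FFE0-0000-1000-8000-00805F9B34FB")

final class AiSightScanner: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    // Only scan for peripherals advertising the aiSight service,
    // instead of listing every BLE device in range.
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [aiSightServiceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        self.peripheral = peripheral          // keep a strong reference before connecting
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([aiSightServiceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first(where: { $0.uuid == aiSightServiceUUID }) else { return }
        peripheral.discoverCharacteristics(nil, for: service)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService,
                    error: Error?) {
        for characteristic in service.characteristics ?? [] {
            peripheral.readValue(for: characteristic)   // or setNotifyValue(true, for:) for streamed data
        }
    }

    // Example of the binary parsing convention: assumes a little-endian UInt16 value,
    // purely illustrative of what the characteristic docs describe.
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard let data = characteristic.value, data.count >= 2 else { return }
        let value = UInt16(data[0]) | (UInt16(data[1]) << 8)
        print("Characteristic \(characteristic.uuid): \(value)")
    }
}
```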
Jeff is working on the settings and help pages, and Kat is working on the home screen. Progress is looking good, and everything looks almost identical to the Figma!
I've redesigned the system architecture after learning that when the ESP32 hosts a webserver on its own access point, the phone must disconnect from its current network in order to join the ESP32's network. After extensive research, the best solution seems to be accepting the lower speed of streaming over Bluetooth, while giving the user the option to send the ESP32 credentials for an existing network, which the ESP32 would then use to share data.
For the MVP, I'll focus on just the Bluetooth connection, and if there's time, I'll go back and implement the credential sharing (see the sketch below). Since this is an app designed for people with visual impairments, we'd have to find a way to make that flow accessible, or accept that the user may need some help with that one step, such as the initial setup with their home network.
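If we do get to credential sharing, the phone-side piece could be as simple as writing the SSID and password to a dedicated BLE characteristic. This is just a sketch under assumptions we haven't finalized: the characteristic UUID and the null-separated payload format are made up here, not something in our characteristic docs yet.

```swift
import CoreBluetooth

// Hypothetical characteristic UUID for Wi-Fi credential provisioning -- not finalized.
let wifiCredentialCharUUID = CBUUID(string: "0000FFE1-0000-1000-8000-00805F9B34FB")

/// Encode "ssid\0password" as UTF-8 and write it to the ESP32 over BLE.
/// Assumes `peripheral` is already connected and `characteristic` has been discovered.
func sendWifiCredentials(ssid: String, password: String,
                         to peripheral: CBPeripheral,
                         characteristic: CBCharacteristic) {
    var payload = Data(ssid.utf8)
    payload.append(0)                       // null byte separating SSID from password (assumed format)
    payload.append(contentsOf: Data(password.utf8))
    peripheral.writeValue(payload, for: characteristic, type: .withResponse)
}
```

The ESP32 would then use those credentials to join the user's home network and share data there instead of hosting its own access point.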
Emilio is making good progress on the sign language model, and it's almost ready to deploy to the phone through ONNX.
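For reference, the phone-side inference would look roughly like the sketch below, using ONNX Runtime's Objective-C/Swift bindings. The model file name, tensor names, and input shape are all assumptions until Emilio's export is finalized.

```swift
import Foundation
import onnxruntime_objc   // ONNX Runtime bindings (CocoaPods: onnxruntime-objc); import name may vary by integration

// Minimal sketch of running the sign language model on-device.
// "sign_language.onnx", the tensor names, and the 1 x 3 x 224 x 224 shape are placeholders.
func classifyFrame(_ pixels: [Float]) throws -> [Float] {
    guard let modelPath = Bundle.main.path(forResource: "sign_language", ofType: "onnx") else {
        throw NSError(domain: "aiSight", code: 1)
    }
    let env = try ORTEnv(loggingLevel: .warning)
    let session = try ORTSession(env: env, modelPath: modelPath, sessionOptions: nil)

    // Wrap the preprocessed frame in a float tensor.
    let inputData = NSMutableData(bytes: pixels, length: pixels.count * MemoryLayout<Float>.stride)
    let inputTensor = try ORTValue(tensorData: inputData,
                                   elementType: .float,
                                   shape: [1, 3, 224, 224])

    // Run inference; "input" and "output" are assumed tensor names.
    let outputs = try session.run(withInputs: ["input": inputTensor],
                                  outputNames: ["output"],
                                  runOptions: nil)
    guard let output = outputs["output"] else { return [] }

    // Copy the raw output bytes back into a Swift array of class scores.
    let data = try output.tensorData() as Data
    return data.withUnsafeBytes { Array($0.bindMemory(to: Float.self)) }
}
```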