The Myoplexer measures gross muscle anatomy and biomechanics to determine the user's intended hand shape and position. It consists of a wearable electronic device attached to a custom gel liner. As a muscle contracts, it thickens at its belly, compressing the gel; the Myoplexer's IR proximity sensors detect this compression. Our observations have shown that each sensor's output signal is linearly proportional to the muscle's length.
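The observed linear relationship means each sensor reading can be converted to a length estimate with a simple per-channel fit. The sketch below assumes hypothetical `gain` and `offset` constants found during calibration; the names and values are illustrative, not part of the actual firmware.

```python
def voltage_to_length(v_raw, gain, offset):
    """Map a proximity-sensor voltage to an estimated muscle length.

    Assumes the linear relationship observed in testing:
        length ~ gain * v_raw + offset
    where gain and offset are per-sensor constants determined
    during calibration (hypothetical values used below).
    """
    return gain * v_raw + offset

# Example: a sensor calibrated to gain = 2.5 mm/V, offset = 10.0 mm
print(voltage_to_length(1.2, gain=2.5, offset=10.0))  # 13.0 (mm)
```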
With the length information recorded over time, we can compute the speed of contraction and estimate the relative force generated. Using 60 of these proximity sensors, we can detect the changes in the muscles responsible for finger and wrist motion. Our software maps sensors to muscles through a gestural calibration, in which the user flexes to replicate a series of predetermined hand shapes. This calibration approach is standard in the industry and well established.
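Contraction speed falls out of the sampled lengths by finite differencing. The sketch below uses a toy spring-plus-damping stand-in for the force estimate; the sample rate, constants, and `relative_force` model are illustrative assumptions, not the device's actual muscle model.

```python
import numpy as np

def contraction_velocity(lengths, dt):
    """Finite-difference estimate of contraction speed (mm/s)
    from muscle lengths sampled every dt seconds."""
    return np.gradient(lengths, dt)

def relative_force(length, rest_length, velocity, k=1.0, c=0.1):
    """Toy relative-force estimate: spring-like term plus a damping
    term. A simplified stand-in for a real muscle model; the
    constants k and c are hypothetical."""
    return k * (rest_length - length) - c * velocity

# Hypothetical length samples (mm) at 100 Hz
lengths = np.array([30.0, 29.5, 29.0, 28.6, 28.3])
v = contraction_velocity(lengths, dt=0.01)
f = relative_force(lengths[-1], rest_length=30.0, velocity=v[-1])
```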
We index our sensors with a demultiplexer that selectively enables the sensors we want to read. The same demux can also drive an LED above each sensor, used both for debugging and for stereoscopic 3D computer-vision tracking of limb geometry and position in space. Each sensor's output is then amplified and filtered electronically so that its minimum and maximum voltages are scaled to fit the input range of the microprocessor's ADC. This per-channel normalization means that whether a user's muscles are stronger or weaker (e.g., atrophied), the Myoplexer should perform the same.
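The indexing and normalization steps can be sketched as follows. The GPIO write is a placeholder stub, the 6-bit select width (enough for 60 channels), the 12-bit ADC, and the function names are all illustrative assumptions.

```python
def set_gpio(pin, value):
    """Placeholder for the real microcontroller GPIO write."""
    pass

def select_channel(index):
    """Drive the demux select lines to enable sensor (or debug LED)
    number `index`. Six select bits address up to 64 channels,
    enough for all 60 sensors (widths are assumptions)."""
    for bit in range(6):
        set_gpio(bit, (index >> bit) & 1)

def normalize(v, v_min, v_max, adc_max=4095):
    """Scale a channel's observed [v_min, v_max] swing onto the full
    ADC range (12-bit assumed), so weaker and stronger users produce
    comparable digital signals."""
    return int((v - v_min) / (v_max - v_min) * adc_max)

select_channel(37)                     # enable sensor 37 for reading
code = normalize(2.0, v_min=1.0, v_max=3.0)  # mid-swing maps near mid-scale
```

In the real device this scaling happens in analog hardware before the ADC; the software version above just illustrates the mapping.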
While this microprocessor is responsible for controlling the hand via I2C communication, the neural-network computation is too demanding to run on it and must be offloaded to a more capable device. We therefore provide both Bluetooth and USB connectivity, so that a smartphone or computer running our app can calibrate the device, run diagnostics, record user data, and help troubleshoot problems that may occur. All user data is encrypted and anonymized, both for deep-learning training and to document device usage in support of coverage by insurers.
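One common way to anonymize records before storage or upload is to replace the user identifier with a salted hash; a minimal sketch of that idea follows. This is illustrative only and not the device's actual pipeline: the field names are hypothetical, and a deployed system would pair pseudonymization with vetted encryption (e.g., TLS in transit, AES at rest) rather than rely on hashing alone.

```python
import hashlib
import json

def anonymize_record(user_id, samples, salt):
    """Replace the user ID with a truncated salted SHA-256 digest
    so stored sensor data cannot be linked back to a name.
    Field names and salt handling are illustrative assumptions."""
    pseudonym = hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]
    return json.dumps({"id": pseudonym, "samples": samples})

record = anonymize_record("example_user", [0.12, 0.15, 0.11], salt="demo-salt")
```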