There are several areas the team would like to investigate in future iterations of the emBRACE. The first task would be to improve the resolution of the extension and flexion angle measurements. Using the WICED Sense's accelerometers and gyroscopes to derive the knee's angle proved quite challenging. After repeated attempts at calculating the raw angles from the sensor data proved unfruitful, the team pivoted to a machine learning solution, classifying the knee angle based on training data. In the future, we could investigate further how the two WICED Senses could be used to calculate the angle directly. Another option would be to use a flex sensor, which would return an angle directly. However, a flex sensor would require an additional microcontroller or embedded system (such as the Intel Edison) to interface with it. In that respect, the WICED Senses are convenient because they are already enclosed systems that require no additional controller boards.
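One common approach to calculating the knee angle directly from the two IMUs is to estimate each segment's pitch with a complementary filter and take the difference between the thigh and shank estimates. The sketch below illustrates that idea; the function names, the choice of filter coefficient, and the assumption that each reading has already been parsed out of the WICED Sense's packet format are all ours, not part of the original implementation.

```python
import math

def segment_pitch(prev_pitch, accel, gyro_rate, dt, alpha=0.98):
    """Fuse one IMU's accelerometer and gyroscope readings into a pitch
    estimate (degrees) with a complementary filter.

    accel     -- (ax, ay, az) in units of g
    gyro_rate -- angular rate about the pitch axis in deg/s
    dt        -- time since the previous sample, in seconds
    alpha     -- filter coefficient; trusts the gyro short-term and the
                 accelerometer long-term (assumed value, would need tuning)
    """
    ax, ay, az = accel
    # Pitch implied by the gravity vector alone (noisy but drift-free).
    accel_pitch = math.degrees(math.atan2(ax, math.sqrt(ay ** 2 + az ** 2)))
    # Pitch implied by integrating the gyro (smooth but drifts over time).
    gyro_pitch = prev_pitch + gyro_rate * dt
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

def knee_angle(thigh_pitch, shank_pitch):
    """Knee flexion as the difference between the two segment pitches."""
    return thigh_pitch - shank_pitch
```

In practice this is exactly where the difficulty lies: gyro drift, sensor misalignment on the leg, and motion accelerations corrupting the gravity estimate all degrade the result, which is consistent with the team's experience that raw angle calculation was unfruitful.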
Another future step would be to classify more activity states. In its current iteration, the emBRACE can determine whether the user is standing or sitting. The next step would be to identify whether the user is walking, jogging, running, etc. This would require further data collection sessions to obtain training data for the classifiers. However, it would give the doctor greater insight into the types of tasks the user is engaging in during his or her rehabilitation (e.g., a person who has had knee surgery should not be running a marathon one week later).
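A minimal sketch of how such a classifier could work, assuming windows of accelerometer-magnitude samples as input: summarize each window with simple features (mean and spread), average the features per activity label into a centroid, and assign new windows to the nearest centroid. This is a stand-in illustration, not the classifier the team actually trained.

```python
import statistics

def extract_features(window):
    """Summarize a window of accelerometer magnitudes as (mean, stdev).
    Standing yields low spread; walking or running yields high spread."""
    return (statistics.mean(window), statistics.pstdev(window))

def train_centroids(labeled_windows):
    """labeled_windows: {label: [window, ...]} from data collection
    sessions. Returns one feature-space centroid per activity label."""
    centroids = {}
    for label, windows in labeled_windows.items():
        feats = [extract_features(w) for w in windows]
        centroids[label] = tuple(statistics.mean(f[i] for f in feats)
                                 for i in range(2))
    return centroids

def classify(window, centroids):
    """Assign the activity label whose centroid is nearest."""
    fx, fy = extract_features(window)
    return min(centroids,
               key=lambda lbl: (centroids[lbl][0] - fx) ** 2 +
                               (centroids[lbl][1] - fy) ** 2)
```

Distinguishing walking from jogging from running would likely need richer features (e.g., dominant stride frequency), which is why further data collection sessions would be required.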
The team's stretch goal was for the visualized data to be available through both a web app and a mobile app. There was not enough time to develop a mobile app containing all the required visualization tools; however, this would be a natural task to pursue going forward. Making data accessible through both mobile devices and web apps reflects the visualization infrastructure of other medical wearables such as the Fitbit. In addition, to further the communication between doctor and patient, it might be worthwhile to allow doctors to send notifications or recommendations to their patients through the app after viewing their progress.
Finally, it would be interesting if a user could compare his or her recovery progress over time to an "average" based on the type of injury. This would allow the user to identify whether he or she is recovering as quickly as other users with a similar height, weight, injury type, etc. It would require additional back-end processing to aggregate data from multiple users and determine recovery trends. Aside from higher-resolution angle measurements, this might be the most promising future task to investigate to make the emBRACE platform as dynamic and useful as possible.
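The back-end aggregation described above could be sketched as follows, assuming each user's progress is recorded as a mapping from days-since-surgery to maximum measured flexion. The data shapes and function names here are hypothetical; a real system would also filter the cohort by height, weight, and injury type before averaging.

```python
from collections import defaultdict
from statistics import mean

def average_recovery_curve(user_curves):
    """user_curves: list of {day_since_surgery: max_flexion_deg} dicts,
    one per user in the matched cohort. Returns the per-day average."""
    by_day = defaultdict(list)
    for curve in user_curves:
        for day, angle in curve.items():
            by_day[day].append(angle)
    return {day: mean(vals) for day, vals in sorted(by_day.items())}

def ahead_of_average(user_curve, avg_curve, day):
    """True if the user's flexion on `day` meets or beats the cohort
    average; days missing from either curve count as behind."""
    return user_curve.get(day, 0) >= avg_curve.get(day, float("inf"))
```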