I sent the previous UI prototype to Dori to ask her opinion on it and to ask for examples of the feedback they currently give to patients. She sent back examples and some feedback on the prototype. She also forwarded it to their group at UCSD, who sent us examples of the graphs that were most popular among their participants. We got some good feedback on which elements of the UI they liked, and which parts they didn't think were necessary. A summary:
- The Details screen included the same numbers as the Summary, which was unnecessary. It should focus on showing the breaks in sitting throughout the day as this is "one of the most important concepts for health benefits".
- They would like activity (walking/moving/etc) separated from standing time in the analysis.
- They currently use bar charts to show daily stats on how long a patient spent lying, sitting, standing, and moving, either with the bars separated or with them stacked on each other and distinguished by color.
The UCSD group also mentioned they would like a feature where the application asks patients how they are feeling throughout the day and tracks that as well; however, this would be a stretch to implement in the time we have available.
Based on the feedback, last week I drew a revised paper prototype for the Details and Week View screens. For the Week View I decided to use separate bars for lying, sitting, etc., but we could also stack them into a single bar for each day. This might make the UI less visually noisy, but I feel it could make it a little harder to compare the size of each bar between days.

For the Details screen, I filled in the graph with different shades to represent the different categories. In the finished version, this would be colored. The examples Dori sent us use a "stoplight" color scheme: red for sitting, yellow for standing, and green for moving. The examples from UCSD use the same color scheme, plus purple for sleeping time. We would probably adopt the same color scheme for our application.
For the next week, I plan to CAD up a case that will hold our battery and PCB. I've got some experience with Solidworks and with 3D printing so it should go fairly quickly once I figure out how to use the Dimension printer we've got in the lab. I'll need to figure out the precise dimensions of the battery and the PCB with our components attached, so that I can make a case that's compact but still fits everything. I'm planning to make the exterior gently curved, so that it will fit nicely on the hip, which will make the design somewhat more complex. Also, we'll need it to be in two pieces so we can put the hardware inside it, and the way those pieces will attach is still up in the air. We may resort to using one or two small screws to hold it together.
Last week's firmware accomplishments:
- Ran benchmarks on the default SensorTag firmware to determine how much stack/heap space is left after the Bluetooth stack is up and running.
- Wrote C code to detect features from raw accelerometer data, using Steven's algorithm.
- Checked and rechecked PCB schematic. Verified our chosen pins and ports are usable from the software.
- Started porting code for Atmel Dataflash.
This week's firmware goals:
- Get ML classifier working completely on the CC2541 device.
- Finish the datalogging (and data transmitting) code.
- Finish the Atmel DataFlash driver.
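As a rough illustration of the datalogging goal, here is a minimal sketch of the kind of fixed-size record ring buffer the logging code might use to hold posture samples until they are transmitted. The record layout, capacity, and all names here are hypothetical, not the actual firmware:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical log record: one posture classification per interval. */
typedef struct {
    uint32_t timestamp;   /* seconds since device start */
    uint8_t  posture;     /* e.g. 0=lying, 1=sitting, 2=standing, 3=moving */
} log_record_t;

#define LOG_CAPACITY 64

/* Simple ring buffer; the oldest records are overwritten when full. */
typedef struct {
    log_record_t records[LOG_CAPACITY];
    uint16_t head;   /* next write slot */
    uint16_t count;  /* number of valid records */
} log_buffer_t;

static void log_init(log_buffer_t *log) {
    memset(log, 0, sizeof(*log));
}

static void log_append(log_buffer_t *log, uint32_t ts, uint8_t posture) {
    log->records[log->head].timestamp = ts;
    log->records[log->head].posture = posture;
    log->head = (uint16_t)((log->head + 1) % LOG_CAPACITY);
    if (log->count < LOG_CAPACITY)
        log->count++;
}

/* Copy up to max records, oldest first, into out; returns number copied. */
static uint16_t log_read(const log_buffer_t *log, log_record_t *out, uint16_t max) {
    uint16_t n = log->count < max ? log->count : max;
    uint16_t start = (uint16_t)((log->head + LOG_CAPACITY - log->count) % LOG_CAPACITY);
    for (uint16_t i = 0; i < n; i++)
        out[i] = log->records[(start + i) % LOG_CAPACITY];
    return n;
}
```

On the real device this buffer would be flushed to the DataFlash or over Bluetooth rather than held entirely in the CC2541's limited RAM.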
I have also suffered from hardware problems of my own (namely, my laptop ingesting a large amount of earl grey tea). This has hampered my progress somewhat, but nothing was permanently lost.
I made an iPhone app that takes data from our 2nd prototype and applied the same algorithm. We get a similar graph and similar accuracy to the first prototype. I will start working on the iPhone app that incorporates Zach's UI design this week.
Edit: add the cycling data
In the past week, we were able to order most of the needed hardware, and I started making a custom PCB in Eagle. Eagle has a lot of open-source libraries for the hardware we ordered, but I had to make my own libraries for the parts I could not find online. For example, I made a library for the TPS62730 DC/DC buck converter; its footprint is 1.5 mm x 1.05 mm, so even a small error could be critical. I also made a library for the KTIX9 accelerometer, which is a little bigger than the buck converter.
However, after talking with our TA and Professor Patel, we decided not to use the buck converter, for two reasons. First, it is not necessary for making the system power efficient, since the system will be operating at 3.3 V. Second, the buck converter comes in a QFN package, which is not hand-solderable, so we are better off without it. Next, we will need to finalize both the schematic and the PCB layout, then print the layout on paper at 1:1 scale and check that the parts we ordered fit into their spots.
I did a first draft paper prototype of what our phone application UI might look like. This will no doubt change a lot as we communicate with Dori Rosenberg from Group Health about what they would like the data visualization to look like. I didn't draw the interface for pairing the phone and the device yet. The part I'm most unsure of is the calendar display; it might be better to display the summary information on the calendar visually, rather than using numeric times.
I collected accelerometer data from Peter, Zachary, and myself, and applied some machine learning algorithms. I extracted three features from the 10 Hz 3-axis accelerometer data: Average Motion Intensity (AI), Variance of Motion Intensity (VI), and Normalized Signal Magnitude Area (SMA). The diagram below shows all the features extracted from the data set; the data labeled CNN are condensed for use with the CNN algorithm. We get 97% test-set accuracy using the CNN algorithm, and 91% using a logistic regression classifier. However, the CNN algorithm is easily affected by noise; we could remove the noise before applying CNN to get higher accuracy. Since we already have multiple machine learning algorithms ready, I will start working on the iOS app that interfaces with our device.

Edit:
I just found out I forgot to take the absolute values in the SMA calculation. However, the accuracy did not change. Below is the corrected diagram.
We acquired a TI SensorTag to use as a prototype for our posture sensor. Steven spent some time in Matlab doing data mining to figure out how to recognize sitting, standing, lying down, and walking from sample data we took with our first prototype. This week we will be modifying the SensorTag firmware to attempt to recognize posture using accelerometer data on the SensorTag itself. Soon we'll start working on developing an iOS application to receive posture logs from the chip over Bluetooth.
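To give a flavor of that on-chip recognition, here is a deliberately simplistic decision rule based on the mean gravity direction and the motion variance over a window. This is not the actual algorithm from Steven's data mining (among other things, it cannot distinguish sitting from standing from a hip-worn sensor alone), and the thresholds are made up for illustration:

```c
#include <math.h>

typedef enum { POSTURE_LYING, POSTURE_UPRIGHT, POSTURE_WALKING } posture_t;

/* Toy posture classifier over one analysis window.
 * gz_mean: mean acceleration along the device's vertical axis, in g.
 *          Near +/-1 g means the device (and torso) is upright.
 * var:     variance of the acceleration magnitude; high values suggest
 *          repetitive movement such as walking.
 * Thresholds here are illustrative only, not tuned values. */
static posture_t classify(double gz_mean, double var) {
    if (var > 0.05)
        return POSTURE_WALKING;      /* lots of motion: moving/walking */
    if (fabs(gz_mean) < 0.5)
        return POSTURE_LYING;        /* gravity mostly off-axis: lying down */
    return POSTURE_UPRIGHT;          /* still and vertical: sitting/standing */
}
```

A real classifier would use the extracted features (AI, VI, SMA) and learned decision boundaries rather than hand-picked thresholds, but the overall shape, a small pure function over windowed statistics, is what has to fit on the CC2541.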
This week we'll also be starting to design a circuit board for our product. We hope to send off the design to be fabricated early next week, so we can get the board back the week after, then solder on the components, design and print a case, and test our application on the actual board.