I sent the previous UI prototype to Dori to ask her opinion on it and for examples of the feedback they currently give to patients. She sent back examples and some feedback on the prototype. She also forwarded it to their group at UCSD, who sent us examples of the graphs that were most popular among their participants. We got some good feedback on which elements of the UI were liked, and which parts they didn't think were necessary. A summary:
- The Details screen included the same numbers as the Summary, which was unnecessary. It should focus on showing the breaks in sitting throughout the day as this is "one of the most important concepts for health benefits".
- They would like activity (walking, moving, etc.) separated from standing time in the analysis.
- They currently use bar charts to show daily stats on how long a patient spent lying, sitting, standing, and moving, either with the bars separated or with them stacked on each other and distinguished by color.
The UCSD group also mentioned they would like a feature where the application asks patients how they are feeling throughout the day and tracks that as well; however, this would be a stretch to implement in the time we have available.
Based on the feedback, last week I drew a revised paper prototype for the Details and Week View screens. For the Week View I decided to use separate bars for lying, sitting, etc., but we could also stack them into a single bar for each day. This might make the UI less visually noisy, but I feel it could make it a little harder to compare the size of each bar between days.

For the Details screen, I filled in the graph with different shades to represent the different categories. In the finished version, this would be colored. The examples Dori sent us use a "stoplight" color scheme: red for sitting, yellow for standing, and green for moving. The examples from UCSD use the same color scheme, plus purple for sleeping time. We would probably adopt the same color scheme for our application.
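In the firmware or app code, that category-to-color mapping could live in one small lookup table. A minimal sketch in C, with placeholder RGB values (the actual shades would be picked during UI design, not specified by Dori or UCSD):

```c
#include <stdint.h>

typedef enum { CAT_SITTING, CAT_STANDING, CAT_MOVING, CAT_SLEEPING } category_t;

typedef struct { uint8_t r, g, b; } rgb_t;

/* "Stoplight" palette from Dori's examples, plus UCSD's purple
   for sleep. RGB values here are placeholders, not final colors. */
static const rgb_t CATEGORY_COLOR[] = {
    [CAT_SITTING]  = { 0xE0, 0x30, 0x30 },  /* red    */
    [CAT_STANDING] = { 0xF0, 0xC0, 0x20 },  /* yellow */
    [CAT_MOVING]   = { 0x30, 0xB0, 0x40 },  /* green  */
    [CAT_SLEEPING] = { 0x80, 0x40, 0xA0 },  /* purple */
};
```

Keeping the palette in one table means the Summary, Details, and Week View screens can't drift out of sync on colors.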
For the next week, I plan to CAD up a case that will hold our battery and PCB. I've got some experience with Solidworks and with 3D printing so it should go fairly quickly once I figure out how to use the Dimension printer we've got in the lab. I'll need to figure out the precise dimensions of the battery and the PCB with our components attached, so that I can make a case that's compact but still fits everything. I'm planning to make the exterior gently curved, so that it will fit nicely on the hip, which will make the design somewhat more complex. Also, we'll need it to be in two pieces so we can put the hardware inside it, and the way those pieces will attach is still up in the air. We may resort to using one or two small screws to hold it together.
Last week's firmware accomplishments:
- Ran benchmarks on the default SensorTag firmware to determine how much stack/heap space is left after the Bluetooth stack is up and running.
- Wrote C code to detect features from raw accelerometer data, using Steven's algorithm.
- Checked and rechecked PCB schematic. Verified our chosen pins and ports are usable from the software.
- Started porting code for the Atmel DataFlash.
This week's firmware goals:
- Get the ML classifier working entirely on the CC2541 device.
- Finish the datalogging (and data transmitting) code.
- Finish the Atmel DataFlash driver.
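The DataFlash driver mostly amounts to issuing opcode-plus-address sequences over SPI. A sketch of the command constants and address packing, assuming an AT45DB-family part with 264-byte pages (the exact part on our PCB may differ), with the SPI byte exchange left as a board-specific stub:

```c
#include <stdint.h>

/* AT45DB-family opcodes (from the datasheet) */
#define DF_CMD_STATUS        0xD7  /* read status register         */
#define DF_CMD_BUF1_WRITE    0x84  /* write into SRAM buffer 1     */
#define DF_CMD_BUF1_PROGRAM  0x83  /* buffer 1 -> page, with erase */
#define DF_STATUS_READY      0x80  /* status bit 7: device ready   */

/* Hardware hook: exchange one byte over SPI with CS asserted.
   On the real board this talks to the CC2541 SPI peripheral. */
uint8_t spi_xfer(uint8_t out);

/* Pack the 24-bit address field for a 264-byte-per-page part:
   page number in bits 23..9, byte offset within page in bits 8..0. */
void df_addr_bytes(uint16_t page, uint16_t offset, uint8_t addr[3])
{
    uint32_t a = ((uint32_t)page << 9) | (offset & 0x1FF);
    addr[0] = (a >> 16) & 0xFF;
    addr[1] = (a >> 8)  & 0xFF;
    addr[2] = a & 0xFF;
}
```

The odd 9-bit offset field comes from the 264-byte (not 256-byte) page size on these parts; a driver that assumes power-of-two pages will write to the wrong addresses.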
I have also suffered from hardware problems of my own (namely, my laptop ingesting a large amount of earl grey tea). This has hampered my progress somewhat, but nothing was permanently lost.
I did a first draft paper prototype of what our phone application UI might look like. This will no doubt change a lot as we communicate with Dori Rosenberg from Group Health about what they would like the data visualization to look like. I didn't draw the interface for pairing the phone and the device yet. The part I'm most unsure of is the calendar display; it might be better to display the summary information on the calendar visually, rather than using numeric times.
We acquired a TI SensorTag to use as a prototype for our posture sensor. Steven spent some time in Matlab doing data mining to figure out how to recognize sitting, standing, lying down, and walking from sample data we took with our first prototype. This week we will be modifying the SensorTag firmware to attempt to recognize posture using accelerometer data on the SensorTag itself. Soon we'll start working on developing an iOS application to receive posture logs from the chip over Bluetooth.
This week we'll also be starting to design a circuit board for our product. We hope to send the design off to be fabricated early next week, so we can get the board back the week after, then solder on the components, design and print a case, and test our application on the actual board.