CSE 477 Spring 2014

This is the project page for CSE 477 Spring 2014.

Belated weekly update

We acquired a TI SensorTag to use as a prototype for our posture sensor. Steven spent some time in Matlab doing data mining to figure out how to recognize sitting, standing, lying down, and walking from sample data we took with our first prototype.
This week we will be modifying the SensorTag firmware to attempt to recognize posture using accelerometer data on the SensorTag itself. Soon we'll start developing an iOS application to receive posture logs from the chip over Bluetooth.
This week we'll also be starting to design a circuit board for our product. We hope to send the design off for fabrication early next week, so we can get the board back the week after, then solder on the components, design and print a case, and test our application on the actual board.

Data mining

We collected some data on Tuesday by strapping the accelerometer to the right hip rather than the right upper thigh. The following picture shows our collection setup.


We collected accelerometer data in 5-second intervals for the motions of sitting to standing, standing to sitting, walking, limping, and jumping. We then processed the time-series data into statistical and physical features: average, variance, normalized signal magnitude area, dominant frequency, energy, and averaged acceleration energy. The scatter plots in 2D feature space below show that we could use logistic regression to classify the different activities.
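To make the feature extraction concrete, here is a rough sketch of the per-window computation in Python rather than Matlab. The function name, the 100 Hz sample rate, and the exact definitions of "energy" and "SMA" are illustrative assumptions, not our actual script:

```python
import numpy as np

def extract_features(ax, ay, az, fs=100.0):
    """Compute window features for one 5-second chunk.
    ax, ay, az: 1-D arrays of accelerometer samples; fs: sample rate in Hz.
    (Feature definitions here are one plausible reading of the list above.)"""
    feats = {}
    for name, sig in (("x", ax), ("y", ay), ("z", az)):
        feats["mean_" + name] = np.mean(sig)
        feats["var_" + name] = np.var(sig)
        # Spectrum of the mean-removed signal, so DC doesn't dominate.
        spectrum = np.abs(np.fft.rfft(sig - np.mean(sig)))
        # Energy: mean squared spectral magnitude.
        feats["energy_" + name] = np.sum(spectrum ** 2) / len(sig)
        # Dominant frequency: bin with the largest magnitude.
        freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
        feats["domfreq_" + name] = freqs[np.argmax(spectrum)]
    # Signal magnitude area, normalized by window length.
    feats["sma"] = np.sum(np.abs(ax) + np.abs(ay) + np.abs(az)) / len(ax)
    return feats
```

A window of pure 2 Hz motion on one axis, for example, comes out with a dominant frequency of 2 Hz on that axis, which is the kind of separation the scatter plots rely on.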






Platform perils

Week 2 did not end smoothly, as our MegaMini R4 died under mysterious circumstances before we could collect enough data to validate our project idea.  Because of the extremely bad timing (and because the MegaMini's untimely death was likely my fault), I put together a new test board.


This is an Arduino Uno R3 with a BMA180 accelerometer.  Same idea as the MegaMini, except considerably less sexy because it's breadboarded instead of using a custom PCB.  I repurposed the parts from a prior project.


Bulky, but it is indeed wearable.


Experimentation shows that leg angle is an extremely accurate measure of standing.  Even dancing, or sitting cross-legged, can't fool it.  A simple IIR filter applied to the calculated tilt is sufficient to mitigate sensor noise (and spiky events such as footsteps).  The process is simple enough that sitting vs.  standing detection runs onboard the Arduino itself.
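The onboard detector is Arduino C, but the math is simple enough to sketch in Python. The filter coefficient, the tilt threshold, and the assumption that the sensor's x axis points along the leg are all illustrative choices, not our actual firmware values:

```python
import math

ALPHA = 0.95          # one-pole IIR smoothing factor (assumed; tune on real data)
SIT_THRESHOLD = 45.0  # degrees from vertical separating sitting from standing (assumed)

def tilt_degrees(ax, ay, az):
    """Angle of the leg axis from gravity, from one accelerometer sample.
    Assumes the sensor's x axis points along the leg; clamps for float safety."""
    c = ax / math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def classify(samples):
    """Smooth per-sample tilt with an IIR filter, then threshold each sample."""
    smoothed = tilt_degrees(*samples[0])
    states = []
    for s in samples:
        smoothed = ALPHA * smoothed + (1 - ALPHA) * tilt_degrees(*s)
        states.append("sitting" if smoothed > SIT_THRESHOLD else "standing")
    return states
```

Because the filter output moves slowly, a single spiky sample (a footstep) barely nudges the smoothed tilt, while a sustained change of posture crosses the threshold within a second or so.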


As a proof-of-concept, here's 20 minutes of me being a couch potato.  Whenever I get up for drinks, you can see the tilt of my upper leg shift to vertical, and my footsteps look like a minor earthquake.  I get up and walk around 3 times for a total of about 3 1/2 minutes.  Can you spot them all?

Data Collection!

Most of today was spent writing new tools for accelerometer data collection.  We have written Matlab scripts to scrape 5-second chunks of data from the MegaMini R4, save them, and graph the results.
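The scraping and graphing live in Matlab; here is a minimal sketch of just the chunk-parsing step in Python, assuming one comma-separated ax,ay,az sample per line (the real log format may differ):

```python
def parse_chunk(text):
    """Parse one logged 5-second chunk into per-axis sample lists.
    Expects one "ax,ay,az" line per sample (format assumed); blank or
    malformed lines (serial noise, partial reads) are skipped."""
    xs, ys, zs = [], [], []
    for line in text.splitlines():
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue
        try:
            x, y, z = (float(p) for p in parts)
        except ValueError:
            continue
        xs.append(x)
        ys.append(y)
        zs.append(z)
    return xs, ys, zs
```

Skipping malformed lines rather than failing matters in practice, since a chunk grabbed mid-line from a serial stream almost always starts or ends with a partial sample.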



As you can see, data collection has been very successful so far.  Next step: find commonalities in the data and develop techniques to recognize them programmatically.

First data collection

We collected some data from a MegaMini R4 board taped to the upper leg, to see how difficult it will be to detect sitting and standing.
Our test setup.
The Matlab output of our data. We collected the data from the Arduino's serial output using Teraterm, then formatted and graphed it in Matlab. We started sitting, transitioned to standing, and then returned to sitting. The transitions can be clearly seen, particularly in the first and third graphs (X and Z acceleration). The next step is to determine how difficult it will be to detect these transitions without triggering on other activity.