CSE 477 Spring 2014

This is the project page for CSE 477 Spring 2014.

Platform perils

Week 2 did not end smoothly: our MegaMini R4 died under mysterious circumstances before we could collect enough data to validate our project idea. Because of the extremely bad timing (and because the MegaMini's untimely death was likely my fault), I put together a new test board.


This is an Arduino Uno R3 with a BMA180 accelerometer. It's the same idea as the MegaMini, just considerably less sexy because it's breadboarded instead of built on a custom PCB. I repurposed the parts from a prior project.


Bulky, but it is indeed wearable.
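
For reference, talking to the BMA180 from the Uno only takes a few lines of Wire code. Below is a minimal Arduino sketch (not our exact firmware) that streams raw X/Y/Z counts over serial; it assumes SDO is tied low (I2C address 0x40) and uses the acceleration data registers 0x02-0x07 from the BMA180 datasheet, so double-check those against your own wiring before trusting it.

    // Minimal BMA180 reader (assumes SDO tied low, so the I2C address is 0x40).
    // Streams raw 14-bit X/Y/Z counts over serial as comma-separated values.
    #include <Wire.h>

    const uint8_t BMA180_ADDR = 0x40;    // 0x41 if SDO is tied high
    const uint8_t REG_ACC_X_LSB = 0x02;  // X/Y/Z data live in registers 0x02-0x07

    // Read one signed 14-bit axis, starting at the given LSB register.
    int16_t readAxis(uint8_t lsbReg) {
      Wire.beginTransmission(BMA180_ADDR);
      Wire.write(lsbReg);
      Wire.endTransmission();
      Wire.requestFrom(BMA180_ADDR, (uint8_t)2);
      uint8_t lsb = Wire.read();
      uint8_t msb = Wire.read();
      // The 14-bit sample sits in the top bits of the MSB:LSB pair; an
      // arithmetic shift drops the status bits and keeps the sign.
      int16_t value = (int16_t)(((uint16_t)msb << 8) | lsb);
      return value >> 2;
    }

    void setup() {
      Serial.begin(57600);
      Wire.begin();
    }

    void loop() {
      Serial.print(readAxis(REG_ACC_X_LSB)); Serial.print(',');
      Serial.print(readAxis(0x04));          Serial.print(',');
      Serial.println(readAxis(0x06));
      delay(20);  // roughly 50 Hz
    }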


Experimentation shows that upper-leg angle is an extremely reliable indicator of standing; even dancing or sitting cross-legged can't fool it. A simple IIR filter applied to the calculated tilt is enough to suppress sensor noise (and spiky events such as footsteps). The processing is lightweight enough that sitting vs. standing detection runs onboard the Arduino itself.
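
For the curious, the onboard logic amounts to something like the sketch below. It is an illustration rather than the exact code: the filter coefficient, the 45-degree threshold, and the assumption that the sensor's X axis runs down the thigh are all placeholders you would tune for your own mounting orientation. It reuses readAxis() and setup() from the BMA180 reader above, as a drop-in replacement for that loop().

    // Sketch of the onboard sit/stand logic; the tuning values here are
    // illustrative, not the real ones. Assumes readAxis() from the BMA180
    // reader above, and that the sensor's X axis runs down the thigh, so
    // gravity along +X means the leg is roughly vertical (standing).
    #include <math.h>

    float tiltFiltered = 90.0;           // IIR-filtered tilt from vertical, degrees
    const float ALPHA = 0.05;            // one-pole IIR coefficient (smaller = smoother)
    const float STAND_THRESHOLD = 45.0;  // degrees from vertical; placeholder value

    void loop() {
      float ax = readAxis(0x02);
      float ay = readAxis(0x04);
      float az = readAxis(0x06);

      // Angle between the thigh axis (X) and the gravity vector.
      float tilt = atan2(sqrt(ay * ay + az * az), ax) * 180.0 / PI;

      // One-pole IIR low-pass: soaks up sensor noise and footstep spikes.
      tiltFiltered += ALPHA * (tilt - tiltFiltered);

      bool standing = tiltFiltered < STAND_THRESHOLD;
      Serial.println(standing ? "standing" : "sitting");
      delay(20);
    }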


As a proof of concept, here's 20 minutes of me being a couch potato. Whenever I get up for drinks, you can see the tilt of my upper leg shift to vertical, and my footsteps look like a minor earthquake. I get up and walk around three times, for a total of about three and a half minutes. Can you spot them all?

Data Collection!

Most of today was spent writing new tools for accelerometer data collection. We wrote Matlab scripts that scrape 5-second chunks of data from the MegaMini R4, then save and graph the results.
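
The Matlab scripts do the PC-side work; for completeness, here is roughly what the Arduino end of that exchange could look like. The handshake and output format below are assumptions for illustration, not our actual firmware, and it again leans on readAxis() and setup() from the BMA180 reader above. On a request byte, the board streams a 5-second burst of samples bracketed by BEGIN/END markers so the script knows where each chunk starts and stops.

    // Illustrative Arduino-side chunk capture (the protocol is an assumption,
    // not our actual firmware). Any byte received over serial triggers a
    // 5-second burst of samples at ~50 Hz, bracketed by BEGIN/END markers so
    // the PC-side script can tell where a chunk starts and stops.
    const unsigned long CHUNK_MS = 5000;
    const unsigned long SAMPLE_MS = 20;  // ~50 Hz

    void sendChunk() {
      Serial.println("BEGIN");
      unsigned long start = millis();
      while (millis() - start < CHUNK_MS) {
        Serial.print(millis() - start);  Serial.print(',');
        Serial.print(readAxis(0x02));    Serial.print(',');
        Serial.print(readAxis(0x04));    Serial.print(',');
        Serial.println(readAxis(0x06));
        delay(SAMPLE_MS);
      }
      Serial.println("END");
    }

    void loop() {
      if (Serial.available() > 0) {
        Serial.read();  // any byte triggers a capture
        sendChunk();
      }
    }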



As you can see, data collection has been very successful so far. Next step: find commonalities in the data and develop techniques to recognize them programmatically.

First data collection

We collected some data from a MegaMini R4 board taped to the upper leg to see how difficult it would be to detect sitting and standing.
Our test setup.
The Matlab output of our data.

We collected the data with Tera Term from the serial output of the Arduino, then formatted and graphed it in Matlab. We started sitting, transitioned to standing, and then went back to sitting. The transitions are clearly visible, particularly in the first and third graphs (X and Z acceleration). The next step is to determine how difficult it will be to detect these transitions without flagging other activity.