Saturday, September 7, 2013

Data Logging with the Internet of Things - Xively

Sensor Graphing

The Raspberry Pi is great for collecting data points from various sensors.  To keep a history, the sensor values can be stored in some sort of database and retrieved later for analysis.

An article in The MagPi (Sep 2012) recommended using the "Internet of Things" web service COSM to store data points.  COSM also provides the added benefit of graphing features.  Since that article ran, COSM has been renamed to Xively.
Xively - The Internet of Things

The first step was to sign up for a free Developer Account.  Then, from "Development Devices", select "Add Device".  Each device can have numerous data feeds, so I created my device with the generic name "Raspberry Pi".

For a sample of the various feeds and graphing options, see the live feed from my web server.

The next step is to populate the device with data streams called "channels".  Each channel represents an individual sensor.  Channels can be added from the web interface, or dynamically using the Xively API.

Xively has a simple REST-based API that is fairly easy to use.  There are also numerous libraries for various programming languages, such as Python, C and Java.  Having a bit of an infatuation with Python, I of course chose the Python library.
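Under the hood, the REST API updates a feed with an HTTP PUT of a small JSON document to https://api.xively.com/v2/feeds/FEED_ID.json, passing the API key in the "X-ApiKey" request header.  As a sketch of what the libraries construct for you (build_feed_update is a hypothetical helper, not part of any library):

```python
import json

def build_feed_update(channel, value):
    """Build the JSON body that updates one datastream's current value."""
    return json.dumps({
        "version": "1.0.0",
        "datastreams": [
            {"id": channel, "current_value": str(value)}
        ]
    })

# Body for a PUT to https://api.xively.com/v2/feeds/<FEED_ID>.json
body = build_feed_update("sensor1", 42)
```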

Installation of the Xively Python library was fairly easy.  There is a dependency on the external "requests" Python library that has to be installed first, and it does require Python 2.7 or higher.
# Xively Library Installation 
git clone
git clone
cd xively-python
ln -s ../requests/requests

# verify import throws no errors
echo "import xively" | python

Note: If you get an error about "ImportError: No module named requests.auth", make sure the "requests" python library is installed.
Note: If you get an error about "params = {k: v for k, v in (    SyntaxError: invalid syntax", make sure you are using Python 2.7+.

I put together this simple script for dynamically updating the channels.  I then plugged this directly into the various sensor reading scripts.


# NOTE: xively requires Python 2.7+

import sys
import datetime
import xively

API_KEY = "API_KEY"  # set to your API Key
FEED_ID = "FEED_ID"  # set to your Feed ID

if len(sys.argv) != 3:
    print "Usage: {0} <channel> <value>".format(sys.argv[0])
    sys.exit(1)

channel = sys.argv[1]
value = sys.argv[2]

api = xively.XivelyAPIClient(API_KEY)
feed = api.feeds.get(FEED_ID)
now = datetime.datetime.utcnow()
feed.datastreams = [
    xively.Datastream(id=channel, current_value=value, at=now)
]
feed.update()  # push the new value to Xively

This script can then be run with a simple:
./ sensor1 42
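Besides calling the update from within the sensor scripts, another option is to schedule readings with cron (paths here are hypothetical):

```shell
# crontab entry: log a sensor reading every 5 minutes
# /home/pi/read_temp.sh is a hypothetical wrapper that reads the
# sensor and invokes the update script with <channel> <value>
*/5 * * * * /home/pi/read_temp.sh
```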

Although the live graphing options are nice, I also wanted to be able to embed the graph into a web site.  Xively provides an API to do this as well:

The following CPU graph was dynamically generated from a simple query URL, which can be embedded into a web site:
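The query URL points at the datastream's PNG endpoint, with query parameters controlling the size and time window of the graph (check the Xively docs for the full parameter list; the channel name "cpu" here is an assumption).  A sketch of building such a URL:

```python
try:
    from urllib import urlencode          # Python 2
except ImportError:
    from urllib.parse import urlencode    # Python 3

FEED_ID = "FEED_ID"  # set to your Feed ID
CHANNEL = "cpu"      # hypothetical channel name

# Graph rendering options: image size and how far back to plot
params = urlencode({"width": 500, "height": 250, "duration": "1day"})
url = "https://api.xively.com/v2/feeds/{0}/datastreams/{1}.png?{2}".format(
    FEED_ID, CHANNEL, params)
```

The resulting URL can then be dropped into an ordinary <img src="..."> tag.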

Now that I have my sensor data being stored and graphed, the next task will be to extend the Python script to be able to pull the raw data back and perform some trending calculations.
