Remote and Local RPi Sensors in Cacti

Introduction – how’s it going

This post covers logging data in Cacti from non-SNMP/WMI sources.  We’ll gather local RPi readings and plot them in Cacti.  Then we will go through the procedures to gather data from the internet and feed it into Cacti.  This is an expansion of the NOAA XML file processing done in an earlier post.

Purpose – having a reference can give you an idea of where you’re going

Cacti is a multipurpose network device tracking tool.  In addition to that, it can graph data plots from other sources.

Details – getting there isn’t always as easy as it seems

Let’s start with setting up our Cacti interface.  This was done in my earlier post, but I want to revisit it and make it clearer.  We’ll be adding 4 sensors to our Cacti interface: 3 NOAA readings and 1 reading from the RPi chipset.  Start by logging into the Cacti interface.

A – The first step is to define the Data Input Method.

  1. From the Console, click “Data Input Methods” from the Collection Methods section.
  2. Click the Add link.
  3. Enter a meaningful name for the DIM, e.g. “RPi Chipset Temp“.
  4. Select “Script/Command” from the Input Type pull-down list.
  5. Enter the script that will poll and format the results for Cacti, e.g. “/var/www/cacti/scripts/“.
  6. Click Save, and more fields will appear.
  7. Click Add in the Output Fields list.
  8. Enter the output field name; this has to match the field name your script prints, e.g. “rpitemp“.
  9. Click Save to set the Output Fields option.
  10. Click Save to set the DIM settings.
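
It helps to know the output format Cacti expects from a Script/Command DIM: a single line of field:value pairs, with field names matching the Output Fields defined above.  The helper below is just an illustrative sketch, not part of Cacti; the “rpitemp” field name is the example from these steps.

```python
def cacti_output(fields):
    """Format (name, value) pairs into the single "field:value ..." line
    that a Cacti Script/Command data input method parses."""
    return " ".join("%s:%s" % (name, value) for name, value in fields)

# One output field, matching step 8 above
print(cacti_output([("rpitemp", 47.2)]))  # rpitemp:47.2
```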

B – The second step is to define the Data Sources.

  1. From the Console, click “Data Sources” under the Management section.
  2. Click the Add link.
  3. From the Host list, select the local machine.  It should have the loopback IP listed, i.e. 127.0.0.1.
  4. Click the Create button.
  5. Now more fields should be available.
  6. Enter the Data Source Name, e.g. “RPi Chipset Temp“.
  7. From the pull-down in the Data Input Method section, choose the correct DIM, e.g. “RPi Chipset Temp“.
  8. In the Data Source Item section, enter the Internal Data Source Name.  This has to match the Output Field entered in step A8, e.g. “rpitemp“.
  9. Lastly, enter the Minimum and Maximum values.

C – Now the third step is to create the Graph Template that the readings will report on.

  1. From the Console, click “Graph Management” under the Management section.
  2. Click the Add link.
  3. From the Host list, select the local machine.  It should have the loopback IP listed, i.e. 127.0.0.1.
  4. Click the Create button.
  5. Now more fields should be available.
  6. Enter the Title under Graph Configuration, e.g. “NOAA Seattle Temp“.
  7. For the Vertical Label, enter the unit of measure, e.g. “fahrenheit“.
  8. Now we’re ready to add Graph Items; click the Add link up above.
  9. In the Graph Items section, use the Data Source pull-down to select the correct DS, e.g. “NOAASeaTemp (noaaseatemp)“.
  10. Choose your color from the pull-down, e.g. “6EA100“.
  11. Change your Graph Item Type to “AREA“.
  12. In the Text Format field, enter a description of the reading, e.g. “Temperature“.
  13. Click Save to commit; the graph item should now appear in the Graph Template Selection window, under Graph Items.
  14. Click Save again and you are done.

Go through these steps for each of the 4 sensors.  Don’t worry too much about the script in step A5; we’ll do that next.  The main thing up to this point is getting Cacti set up.  Since we’re on the subject, let’s start checking our scripts.  Since most of the legwork has been done on the NOAA scraped data, we’ll start with that.

First off, I ran into some issues when I set up the Cacti data set.  It was along the lines of the imperial-versus-metric error the folks at NASA made, which resulted in a lost mission.  I set my max value to 50 degrees, thinking in Celsius that we would never get a day hotter than 122F.  Problem was, my readings were in Fahrenheit, so almost everything landed above the 50 ceiling and was discarded.  The only way to correct this was to change the max value, delete the old rrd data set, and recreate it.
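
To make the mixup concrete, here’s the conversion I had in my head, as a quick sketch in plain Python (nothing Cacti-specific):

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

# A 50 degree Celsius ceiling is 122 degrees Fahrenheit --
# but a Fahrenheit feed passes 50 on any mild day.
print(c_to_f(50))  # 122.0
```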

Next up are the scripts that scrape the readings from NOAA.  I recycled the scripts and didn’t change the variable definitions, which caused some overlap and resulted in erroneous values.  Once I created unique variable definitions, it worked like a charm.  It’s the easy stuff that will gang up on you and throw you off the mountain.

Finally, my last botch was connecting an IDE cable to the GPIO pins while the system was hot.  Really, how brain dead is that?!  Needless to say, the RPi became bricked.  I had hoped that it was a thermal fuse that would reset, but time wasn’t a healer.  So I got to put my backup/restore procedures to the test.  Now I’m running everything on the latest RPi and I think I’m a better person for it.

So here is the script I’m using to scrape readings from NOAA, with Cacti-formatted output.


from lxml import etree
import urllib2

# NOAA current-observation XML feed; the station URL is omitted here
tmpurl = ''
tmpfp = urllib2.urlopen(tmpurl)
tmpdoc = etree.parse(tmpfp)

# Pull the Fahrenheit reading out of the XML
tmptemp = tmpdoc.xpath("//current_observation/temp_f")[0].text
# Trim the trailing ".0" so Cacti gets a plain number
tmptemp = tmptemp[:-2]

# Build the "field:value" string Cacti expects, e.g. "noaaseatemp:54"
tmpkeyval = "noaaseatemp:"
tmpvariable = tmpkeyval + str(tmptemp)

# Log the reading to a file, then print it for the Cacti poller
tmpfile = open('/var/www/cacti/scripts/noaaseatemp.txt', 'w')
tmpfile.write(tmpvariable)
tmpfile.close()

print tmpvariable
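
The script above is Python 2 (urllib2 and the print statement).  For newer systems, a rough Python 3 sketch of the same parsing, using only the standard library instead of lxml, might look like this; the field name is just the example from this post, and the sample XML stands in for the real feed:

```python
import xml.etree.ElementTree as ET

def cacti_line(xml_text, field="noaaseatemp"):
    """Parse a NOAA current_observation XML document and return the
    "field:value" line Cacti expects, e.g. "noaaseatemp:54.0"."""
    root = ET.fromstring(xml_text)       # root element is <current_observation>
    temp = root.findtext("temp_f")       # Fahrenheit reading as a string
    return "%s:%s" % (field, temp)

sample = "<current_observation><temp_f>54.0</temp_f></current_observation>"
print(cacti_line(sample))  # noaaseatemp:54.0
```

To fetch the live feed, you would pass the result of urllib.request.urlopen on your station’s URL to ET.parse instead.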

The last one we’ll look at is the RPi BCM2835 system on a chip (SoC) temperature reading.  To get this reading, open a shell and enter this command.

/opt/vc/bin/vcgencmd measure_temp

The reading should appear below the command after you press Enter.  Here comes the problem: we need to get the numeric value over to Cacti.  To do that, I used this script to produce the format Cacti can use.


# Where to log the reading, and the Cacti field name
chiptempdir="/var/www/cacti/scripts/rpitemp.txt"
chiptemphead="rpitemp:"

# vcgencmd prints e.g. "temp=47.2'C"; strip the label and the unit
chiptemp=$(/opt/vc/bin/vcgencmd measure_temp)
chiptemp=${chiptemp#temp=}
chiptemp=${chiptemp%\'C}

chiptempvalue="$chiptemphead$chiptemp"
echo "$chiptempvalue" > "$chiptempdir"
echo "$chiptempvalue"
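
If you’d rather do the string handling in Python, parsing vcgencmd’s temp=47.2'C output is only a few lines.  This is just a sketch of the same idea as the shell script, with the “rpitemp” field name from the earlier steps:

```python
def parse_vcgencmd(raw):
    """Turn vcgencmd's "temp=47.2'C" output into Cacti's "rpitemp:47.2"."""
    value = raw.strip().replace("temp=", "").replace("'C", "")
    return "rpitemp:" + value

print(parse_vcgencmd("temp=47.2'C"))  # rpitemp:47.2
```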

Cacti took to it like a bear to honey.  It plotted everything as expected, and then my router died.  That’s one of the dependencies of web scraping: internet access.  It seems obvious, but I wanted to point it out.  If you plan to use this function for critical applications, be sure to factor that in.

Relations – expanding control functions

Using web sources has its benefits and its risks.  It can save time on sensor development and deployment.  However, it increases the risk of disruption because you don’t control the web content.  Validating the content is one way to reduce the risk.  How heavily you rely on the data should be a factor in the design.
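
One way to sketch that validation: sanity-check a scraped value against known-good bounds before handing it to the poller, and report it as unknown otherwise.  The bounds here are arbitrary examples, not values from this post:

```python
def validate_reading(value, low=-40.0, high=130.0):
    """Return the reading as a float if it falls in a plausible range,
    else None so the poller can report an unknown value, not a bad one."""
    try:
        number = float(value)
    except (TypeError, ValueError):
        return None
    if low <= number <= high:
        return number
    return None

print(validate_reading("54.0"))     # 54.0
print(validate_reading("garbage"))  # None
```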

There are disclaimers on many data source outlets.  It is a risk if critical systems depend solely on these sources to make decisions.  The key to successfully using third-party data sources is to diversify and validate.

Summary – data is a commodity

We covered some concepts of data mining.  In this post we showed how to gather data from internal and external sources, and touched on reliability and contingency.  With these skills, advanced topics of system control and monitoring can be tackled.
