If you have not read Inventing with the IoT – Workshop, go and do so first; otherwise this will make less sense than usual!
As I said in my previous post, I had never actually worked with the IoT code before the workshop, so I downloaded it and combined it with my PiDuino code from earlier this week. The result is an Internet of Things NeoPixel ring.
The modified Tell.py is straightforward enough. Remove all the PiBrella code, as we’re not using the buttons, then add some lines to get a pixel ID and red, green and blue colour values to define the pixel colour. Bundle these together as a list object and send it to the receiver.
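One detail worth knowing before we get to the receiving end: judging by the parsing code in Reveal.py further down, the list sent by tell() turns up at the other side as its string representation, so the receiver has to strip the brackets and split on commas to get the numbers back. A minimal round trip in isolation (a sketch, not part of the scripts):

# The payload pixel.tell([3, 255, 0, 0]) sends appears to arrive
# as the string form of the list (an assumption based on the
# parsing in Reveal.py):
value = "[3, 255, 0, 0]"

# The same strip/split dance Reveal.py uses to recover the numbers:
data = value.strip('[').strip(']').split(',')
p, r, g, b = [int(x) for x in data]
print "pixel %d = r%d g%d b%d" % (p, r, g, b)   # pixel 3 = r255 g0 b0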
The initial version of Reveal.py reads the incoming data and uses the values to set the NeoPixel via the PiDuino (as described in my previous article). That’s all well and good, but how will the remote end sending the command know it’s worked? Good job I have a PiCamera to hand! A little Heath Robinson later (note the camera mount): once the pixel is set, an image is captured, some text is overlaid, and the result is uploaded to a gallery on my web hosting. I am using UberGallery, which is just the job as it simply displays the images in a folder.
The following morning @SouthendRPiJams (aka Andy), who was in the workshop on Saturday, and I had an IoT exchange. I sent his Pi a message (i.e. “Hello”), which then appeared on his screen and made his PiGlow flash via modified versions of the Tell.py and Reveal.py code (the tell() command just sends the string). He then set up a Tell.py to send me the details my Reveal.py needed to light a pixel, and you can currently see the results in the gallery.
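Neither of those modified scripts made it into this post, but the receiving side was along these lines. This is a sketch from memory rather than the exact code; the actuator name "MESSAGE" and the flash_piglow() helper are placeholders, and you would swap in whichever PiGlow library you have installed.

# Sketch of the message-receiving Reveal.py variant (from memory,
# not the exact code). "MESSAGE" and flash_piglow() are
# placeholders, not part of the library.
import IoticLabs.JoinIOT as IOT
from config import *

def flash_piglow():
    # hypothetical: pulse the PiGlow LEDs with your library of choice
    pass

def newData(actuator, value):
    # tell() just sends the string, so value is the message itself
    print "Message received: " + value
    flash_piglow()

IOT.joinAs(MY_COMPUTER + "_Reveal")
try:
    IOT.reveal("MESSAGE", incoming=newData)
    IOT.loop(True)
finally:
    IOT.leave()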
So what’s the practical application? Currently none; this is just an experiment. But when you look at IoT lights like the Good Night Lamp, this project could evolve into something similar, allowing you to be notified when family or friends are at home. For the more technically minded, it could represent the status of a set of services; a rough sketch of that idea follows.
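Purely as an illustration (none of this exists yet; SERVICES, check_service() and the one-minute polling loop are all invented for the example), a Tell.py variant could poll a few services and paint each one onto a pixel, green for up and red for down:

# Sketch only: map service status onto pixels using the same
# plumbing as Tell.py. SERVICES and check_service() are invented
# for illustration.
import socket
import time
import IoticLabs.JoinIOT as IOT
from config import *

SERVICES = [("web", "example.com", 80),
            ("mail", "example.com", 25)]

def check_service(host, port):
    # True if a TCP connection to host:port succeeds within 5 seconds
    try:
        s = socket.create_connection((host, port), timeout=5)
        s.close()
        return True
    except socket.error:
        return False

IOT.joinAs(MY_COMPUTER + "_Status")
pixel = IOT.attachTo("ForToffee_Pi_1_Reveal", "PIXEL")

try:
    while True:
        for i, (name, host, port) in enumerate(SERVICES):
            if check_service(host, port):
                pixel.tell([i, 0, 255, 0])   # green = up
            else:
                pixel.tell([i, 255, 0, 0])   # red = down
        time.sleep(60)
finally:
    IOT.leave()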
I will endeavour to keep my Pi up for a few days if you would like to try the Tell.py script yourself. Check the gallery around 15-30 seconds after a command has been sent to see if it worked. Send me a tweet before you do; it’s always nice to see these things in action. Let me know where you’re from and we can see who gets the award for the furthest connection. Southend to Farnborough is the distance to beat.
Please do not make repeated, rapid calls, else I’ll have to shut it down for fear of psychological damage to Babbage caused by all those lights!
The code? Glad you asked…
Tell.py
# Tell.py 12/09/2014 C.Monk
#
# Derived from
# Tell.py 12/09/2014 D.J.Whale
#
# Tells a revealed actuator to do something.
# Please exercise restraint and do not make frequent repetitive calls

import IoticLabs.JoinIOT as IOT
from config import *
import time

MY_NAME = MY_COMPUTER + "_Tell"
THEIR_COMPUTER = "ForToffee_Pi_1"
THEIR_NAME = THEIR_COMPUTER + "_Reveal"
THEIR_ACTUATOR = "PIXEL"

IOT.joinAs(MY_NAME)
pixel = IOT.attachTo(THEIR_NAME, THEIR_ACTUATOR)

def main():
    while True:
        p = int(raw_input("Pixel (0 - 15): "))
        if p < 0:
            break
        r = int(raw_input("Red (0-255): "))
        g = int(raw_input("Green (0-255): "))
        b = int(raw_input("Blue (0-255): "))
        print("Sending pixel {} = r{}, g{}, b{}".format(p, r, g, b))
        pixel.tell([p, r, g, b])

try:
    main()
finally:
    IOT.leave()
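A couple of notes if you run it: the scripts are Python 2 (hence raw_input and the print statements), MY_COMPUTER comes from the config.py supplied with the workshop code, and entering a negative pixel number drops out of the loop so the finally block can disconnect cleanly via IOT.leave().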
Reveal.py
# Reveal.py 12/09/2014 C.Monk
#
# Derived from
# Reveal.py 05/09/2014 D.J.Whale
#
# Reveals a set of neopixels so they can be remotely controlled.
#
# This script is designed only for the Raspberry Pi

import IoticLabs.JoinIOT as IOT
from config import *
import time
from datetime import datetime
import picamera
import piduino
from Copro import *
import ftplib
from subprocess import call

piduino.firmware("firmware/copro/copro.hex")
piduino.connect(baud=9600)

MY_NAME = MY_COMPUTER + "_Reveal"
MY_ACTUATOR = "PIXEL"

IOT.joinAs(MY_NAME)

def newData(actuator, value):
    #print("actuator: ", actuator.topic, actuator.originator, actuator.unique)
    #print("data:" + value)
    data = value.strip('[').strip(']').split(',')
    neopixel(int(data[0]), (int(data[1])/5, int(data[2])/5, int(data[3])/5)) #r,g,b / 5 to save retina!

    print 'Capturing image'
    i = datetime.now()
    now = i.strftime('%Y%m%d-%H%M%S')
    tweettime = i.strftime('%Y/%m/%d %H:%M:%S')
    photo_name = now + '.jpg'
    photo_full_path = '/home/pi/grabs/' + photo_name

    #http://picamera.readthedocs.org/en/release-1.8/recipes1.html
    with picamera.PiCamera() as camera:
        camera.resolution = (1024, 768)
        #camera.start_preview()
        # Camera warm-up time
        time.sleep(2)
        camera.capture(photo_full_path)

    #http://raspi.tv/2014/overlaying-text-and-graphics-on-a-photo-and-tweeting-it-pt-5-twitter-app-series
    overlay_text = "/usr/bin/convert " + photo_full_path + " -pointsize 24 -fill white -annotate +40+728 '" + actuator.topic + "' "
    overlay_text += " -pointsize 36 -fill white -annotate +40+675 'Pixel " + data[0] + " = R:" + data[1] + " G:" + data[2] + " B:" + data[3] + "' " + photo_full_path
    print "overlaying text"
    call([overlay_text], shell=True)

    #http://stackoverflow.com/questions/12613797/python-script-uploading-files-via-ftp
    print 'Uploading image'
    session = ftplib.FTP('','','')
    fup = open(photo_full_path, 'rb')
    session.storbinary('STOR ' + photo_name, fup)
    fup.close()
    session.quit()
    print 'Done'

try:
    IOT.reveal(MY_ACTUATOR, incoming=newData)
    IOT.loop(True)
finally:
    IOT.leave()
    neopixel(0xFF, (0, 0, 0))
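Two things to note: the incoming colour values are divided by 5 before they reach the NeoPixel, which takes the edge off full-brightness LEDs (and, this being Python 2, is integer division), and the ftplib.FTP('','','') call has had the FTP host, username and password blanked out, so you would need to substitute your own details before running it.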
What Next?
I need to look at the other functions currently implemented; the more curious amongst you will have noticed the library code is very much in its infancy. There’s no direct feedback when a message is sent, so the remote end doesn’t know what happened. One solution at the moment looks to be Feeds, so that may be my next step with this code.
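I haven’t explored the Feeds side of the library yet, so I won’t pretend to know the API. Conceptually, though, the Reveal end would publish an acknowledgement feed after acting on a command and the Tell end would follow it. In pseudocode (every name below is a guess, not a real JoinIOT call):

# Conceptual pseudocode only - none of these feed calls are the
# real JoinIOT API, just the shape of the idea.
#
# Reveal side, at the end of newData() once the pixel is set:
#     ack.share("pixel {0} set".format(data[0]))
#
# Tell side, somewhere after attachTo():
#     def ackReceived(feed, value):
#         print "Remote confirmed: " + value
#     IOT.follow(THEIR_NAME, "ACK", incoming=ackReceived)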
Have fun, and let me know how you get on.