A #CheerLights Virtual Christmas Tree

What is #CheerLights?

CheerLights is an IoT-based light control system, originally intended to let social media dictate the colour of festive lights around the world.


The current CheerLights status

How does it work?

Put the word "CheerLights" and a colour in a tweet and you've just told light systems around the world which colour they should show.  The CheerLights website lists the currently supported colours.

On the back-end there is a feed aggregator which exposes a JSON, TXT or XML API.
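Reading it from Python is a couple of lines.  A minimal sketch, assuming the feed is the ThingSpeak-hosted CheerLights channel (1417) and you have the requests library installed – check cheerlights.com for the current endpoints:

import requests

# CheerLights lives on ThingSpeak channel 1417 (assumption: field 1
# holds the colour name); the "last" endpoint returns only the newest entry.
LAST_URL = "https://api.thingspeak.com/channels/1417/fields/1/last.txt"

colour = requests.get(LAST_URL, timeout=10).text.strip()
print("Current CheerLights colour:", colour)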

First Steps

My plan was to use the Pimoroni Unicorn HAT to display the most recent 64 colours from the "Full" feed.  Unfortunately, I first started playing with this during my lunch hour at work, where I didn't have a Pi.  So I wrote a console app in Python to read the JSON, output all the available colours, and then poll the "Last" feed for updates.
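The actual script is linked below; a rough sketch of the same idea, using the endpoints assumed above:

import time
import requests

FEEDS_URL = "https://api.thingspeak.com/channels/1417/feeds.json?results=64"
LAST_URL = "https://api.thingspeak.com/channels/1417/fields/1/last.txt"

# Dump the recent colour history once...
for entry in requests.get(FEEDS_URL, timeout=10).json()["feeds"]:
    print(entry["created_at"], entry["field1"])

# ...then poll the "Last" feed for changes.
last = None
while True:
    colour = requests.get(LAST_URL, timeout=10).text.strip()
    if colour != last:
        print("New colour:", colour)
        last = colour
    time.sleep(15)  # be kind to the API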

console.py on GitHub

Time to put a HAT on

I wrote some code without the Unicorn HAT to hand, then tested it when I got home (there weren't too many bugs!).  Rather than using the UnicornHAT Python module I chose to use the underlying ws2812 module.  Madness, you say?  But no!  The Unicorn HAT is just a matrix of NeoPixels, which means any ws2812 device should "just work".  Up until now I could only control my 16-pixel NeoPixel ring from the Pi using a microcontroller (read: Arduino clone) as an interface.  So I wired it up: the data line goes to GPIO18 (via a low-value resistor to reduce the risk of power spikes) and +ve to 3.3V, otherwise we'll blow up the Pi!  It worked as advertised first time – bonus.


Unicorn HAT on A+ and 16 pixel ring on B+

neopixel.py on GitHub
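The linked neopixel.py drives the ws2812 module directly; if you'd rather use the more common rpi_ws281x library, an equivalent sketch (my assumption, not the linked code) looks like this:

from rpi_ws281x import PixelStrip, Color  # needs root to drive GPIO18

PIXELS = 16      # 64 for a Unicorn HAT
DATA_PIN = 18    # GPIO18, as wired above

strip = PixelStrip(PIXELS, DATA_PIN)
strip.begin()

for i in range(PIXELS):
    strip.setPixelColor(i, Color(255, 0, 0))  # everything red
strip.show()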

In the code there is a maxPixels variable to let you limit the number of pixels you wish to light.  The 64 NeoPixels on the UnicornHAT are wired in an alternating left-to-right/right-to-left pattern, so the indices run

0 1 2 3 4 5 6 7
15 14 13 12 11 10 9 8
16 17 18 19 20 21 22 23

and so on up to pixel 63.  As each new colour is added, the pattern shifts along in a "snake"-like way.
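Expressed as code, the co-ordinate-to-index mapping comes out as follows (a sketch; the linked script has its own version):

WIDTH = 8

def snake_index(x, y):
    """Map x,y grid co-ords to a pixel index on the serpentine-wired matrix."""
    if y % 2 == 0:
        return y * WIDTH + x                # even rows run left to right
    return y * WIDTH + (WIDTH - 1 - x)      # odd rows run right to left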

The colours are translated to RGB values via a dictionary of the supported CheerLights colours.
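A sketch of that dictionary, using the standard CSS values for the CheerLights colour names (the list on the website is the one to trust):

# Supported CheerLights colours -> RGB (standard CSS values)
CHEERLIGHTS_COLOURS = {
    "red":     (255, 0, 0),
    "green":   (0, 128, 0),
    "blue":    (0, 0, 255),
    "cyan":    (0, 255, 255),
    "white":   (255, 255, 255),
    "oldlace": (253, 245, 230),  # a.k.a. warm white
    "purple":  (128, 0, 128),
    "magenta": (255, 0, 255),
    "yellow":  (255, 255, 0),
    "orange":  (255, 165, 0),
    "pink":    (255, 192, 203),
}

rgb = CHEERLIGHTS_COLOURS.get("red", (0, 0, 0))  # black for unknown colours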

Let’s Get Festive

This section is UnicornHAT-specific, as the Python module allows you to treat the pixels as a grid, which makes it easier to draw patterns.  It's Christmas, and CheerLights was devised for the festive season, so a tree is the obvious choice.

cheertree.py on GitHub

I ended up with three modes:

  • 0 – All colours – the tree is a mosaic of the most recent colours with the most recent at the top
  • 1 – Lights – five key co-ordinates (defined in a list) represent fairy lights or baubles on the tree.  These are lit by the most recent colours, with the star on the top being the latest
  • 2 – Star – only the most recent colour and this is the top most pixel

The tree pattern is defined as a list of pixel co-ordinates.  The pictures above have some white paper over the pixels to diffuse the light, which gives a nice effect.
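The linked cheertree.py holds the real pattern; as an illustration of the grid approach, with made-up tree co-ordinates:

import unicornhat as UH

# Illustrative co-ordinates only -- the real tree shape lives in cheertree.py
TREE = [(3, 6), (4, 6),
        (2, 5), (3, 5), (4, 5), (5, 5),
        (1, 4), (2, 4), (3, 4), (4, 4), (5, 4), (6, 4),
        (3, 3), (4, 3)]           # trunk
STAR = (3, 7)                     # top-most pixel shows the latest colour

UH.clear()
for x, y in TREE:
    UH.set_pixel(x, y, 0, 128, 0)            # green tree body
UH.set_pixel(STAR[0], STAR[1], 255, 255, 0)  # star in the newest colour
UH.show()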

Time for Turkey Curry?

Well, that’s that.  I hope a few of you reading this have a play with the code, be it with generic NeoPixels or a Unicorn.  Let me know how you get on in the comments below.


Word Clock with a Unicorn

A few days back I saw this article about a Word Clock that had been built using an ATmega328P and an 8×8 LED matrix. "Cool," I thought, I have a bi-colour one of those in the "toy" box; I might have a go at that.

Around the same time, those pesky pirates Pimoroni were having their #yarrbooty Twitter competition. I was fortunate enough to win a few rounds and decided to spend some of my booty on a Unicorn HAT and an A+. The Unicorn HAT is an 8×8 matrix of very bright RGB LEDs.

And lo, the Unicorn Word Clock was born

The template for the words is just printed on a piece of white paper, cut to size.  It’s slightly off but it’s not far out.  The more adept amongst you may want to cut out a larger shape so it encompasses the entire case like a lid.

The code is fairly straightforward. I have defined lists of pixel co-ordinates for the minute phrases, which can then be combined (half + past, ten + to, etc.). There is a function to determine the correct minute phrase, and one for the hour. Other than that, hopefully the comments will fill in any gaps.

#!/usr/bin/env python
 
# wordclock.py by Carl Monk (@ForToffee)
# code is provided with no warranty, feel free to use.
# no unicorns were harmed in the making of this code
 
import unicornhat as UH
import time
 
#global variables
hourPattern = []
minPattern = []
 
#pre-defined patterns - groups of x,y co-ords - 0,0 is bottom right with GPIO at bottom
fivePattern = [[7,6],[6,6],[4,6],[2,6]]
tenPattern = [[1,7],[1,6],[0,6]]
fiftPattern = [[7,6],[6,6],[5,6],[3,6],[2,6],[1,6],[0,6]]
twenPattern = [[5,7],[4,7],[3,7],[2,7],[1,7],[0,7]]
halfPattern = [[7,7],[6,7],[7,5],[6,5]]
 
pastPattern = [[4,5],[3,5],[2,5],[1,5]]
toPattern = [[1,5],[0,5]]
 
#function to light the pixels we need to display the time
#pixels is a list of pixels
def showTime(pixels):
	UH.clear()
	for coords in pixels:
		UH.set_pixel(coords[0],coords[1],255,0,255)		#magenta
 
	UH.show()	#once pixels set, call .show() to enable them
 
#function to light the '*' character to show seconds and minutes elapsing
def showTick(m):
	colour = []
	minPart = m % 5		
	# % is modulo which gives us the remainder of m divided by 5
	# this tells us the value 0 - 4 or the part of 5 minutes the time is
 
	if m == -1:		# for setting the '*' off or black
		colour = [0,0,0]
 
	elif minPart == 0:	#:m0 or :m5
		colour = [255,0,0]		#red 
 
	elif minPart == 1 :	#:m1 or :m6
		colour = [0,255,0]		#green
 
	elif minPart == 2 : #:m2 or :m7
		colour = [0,0,255]		#blue
 
	elif minPart == 3 : #:m3 or :m8
		colour = [255,255,0]	#yellow
 
	elif minPart == 4 : #:m4 or :m9
		colour = [0,255,255]	#cyan
 
	UH.set_pixel(5,5,colour[0],colour[1],colour[2])	#5,5 is the position of '*'
	UH.show()
 
#takes the current hour and provides the required pattern of letters
def getHourPattern(h,m):
	global hourPattern
	hourPattern = []
 
	#convert 24hr into 12hr
	if h >= 12:
		h -= 12
 
	#if minutes > 35 then display will be 'to' the next hour
	if m >= 35:
		h = h + 1
		#special case for 11:35 - 12:00.  Hour is 0 to 11 so need to reset to 0
		if h == 12:		
			h = 0
 
	if h == 0:	#aka Twelve
		hourPattern =  [[7,2],[6,2],[5,2],[4,2],[2,2],[1,2]]
	elif h == 1:
		hourPattern =  [[7,3],[6,3],[5,3]]
	elif h == 2:
		hourPattern =  [[7,2],[6,2],[6,1]]
	elif h == 3:
		hourPattern =  [[4,3],[3,3],[2,3],[1,3],[0,3]]
	elif h == 4:
		hourPattern =  [[7,1],[6,1],[5,1],[4,1]]
	elif h == 5:
		hourPattern =  [[3,1],[2,1],[1,1],[0,1]]
	elif h == 6:
		hourPattern =  [[7,0],[6,0],[5,0]]
	elif h == 7:
		hourPattern =  [[4,0],[3,0],[2,0],[1,0],[0,0]]
	elif h == 8:
		hourPattern =  [[4,4],[3,4],[2,4],[1,4],[0,4]]
	elif h == 9:
		hourPattern =  [[7,4],[6,4],[5,4],[4,4]]
	elif h == 10:
		hourPattern =  [[0,4],[0,3],[0,2]]
	elif h == 11:
		hourPattern =  [[5,2],[4,2],[3,2],[2,2],[1,2],[0,2]]
 
#takes the current minute and provides the required pattern of letters
def getMinutePattern(m):
	global minPattern
	minPattern = []
	if 10 > m >= 5 or m >= 55:
		minPattern = fivePattern
	elif 15 > m >= 10 or 55 > m >= 50:
		minPattern = tenPattern
	elif 20 > m >= 15 or 50 > m >= 45:
		minPattern = fiftPattern
	elif 25 > m >= 20 or 45 > m >= 40:
		minPattern = twenPattern
	elif 30 > m >= 25 or 40 > m >= 35:
		minPattern = twenPattern + fivePattern
	elif 35 > m >= 30:
		minPattern = halfPattern
 
	#if time between 5 and 34 we need to show 'past' the hour
	if 35 > m >= 5:
		minPattern = minPattern + pastPattern
	elif m >= 35:	#otherwise 'to' the hour
		minPattern = minPattern + toPattern
 
#cycle through a full 12hrs minute by minute
def fullTest():		
	for n in range(12*60):
		getHourPattern(n // 60, n % 60)	#integer division, so it also behaves under Python 3
		getMinutePattern(n % 60)
		showTime(minPattern + hourPattern)
		showTick(n)
		time.sleep(.25)
 
#cycle through hours, then minutes
def quickTest():
 
	for n in range(12):
		getHourPattern(n, 0)
		showTime(hourPattern)
		time.sleep(.5)
 
	for n in range(60):
		getMinutePattern(n)
		showTime(minPattern )
		showTick(n)
		time.sleep(.25)
 
 
#main function 
quickTest()
 
while True:
	#get time parts
	h = time.localtime().tm_hour
	m = time.localtime().tm_min
	s = time.localtime().tm_sec
 
	#get patterns
	getHourPattern(h, m)
	getMinutePattern(m)
 
	#show patterns
	showTime(minPattern + hourPattern)
 
	#flash '*' to show time passing, lit every 2 seconds
	if s % 2:
		showTick(m)
	else:
		showTick(-1)
 
 
	time.sleep(1)

Inventing with the IoT – Extension Task

If you have not read Inventing with the IoT – Workshop go do so first, otherwise this will make less sense than usual!

As I said in my previous post, I hadn't actually worked with the IoT code before the workshop, so I downloaded it and combined it with my PiDuino code from earlier this week.  As a result I have an Internet of Things NeoPixel ring.


IoT NeoPixel ring being modelled by Babbage

The modified Tell.py is straightforward enough.  Remove all the PiBrella code, as we're not using the buttons, then add some lines to read a pixel ID and red, green and blue values to define the pixel colour.  Bundle it together as a list object and send it to the receiver.

The initial version of Reveal.py reads the incoming data and uses the values to set the NeoPixel via the PiDuino (as described in my previous article).  That's all well and good, but how will the remote end sending the command know it worked?  Good job I have a PiCamera to hand!  A little Heath Robinson later (note the camera mount): once the pixel is set, an image is captured, some text is overlaid, and the result is uploaded to a gallery on my web hosting.  I am using UberGallery, which is just the job as it simply displays the images in a folder.


The finished article – stuffed toys optional

The following morning @SouthendRPiJams (aka Andy), who had been in the workshop on Saturday, and I had an IoT exchange. I sent his Pi a message (i.e. "Hello") which appeared on his screen and made his PiGlow flash, via modified versions of the Tell.py and Reveal.py code (the tell() command just sends the string). He then set up a Tell.py to send me the details my Reveal.py needed to light a pixel, and you can see the results in the gallery currently.
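In sketch form, the string version of the exchange is just the following (the names here are hypothetical; the full pattern is in the Tell.py listing below):

import IoticLabs.JoinIOT as IOT
from config import *

# Names are hypothetical -- substitute your own, as in Tell.py below
IOT.joinAs(MY_COMPUTER + "_Tell")
message = IOT.attachTo("Andys_Pi_Reveal", "MESSAGE")

message.tell("Hello")   # pops up on the remote screen and flashes the PiGlow
IOT.leave()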

So what’s the practical application?  Currently none, this is just an experiment, but when you look at IoT lights like the Good Night Lamp, this project could evolve into something similar, allowing you to be notified when family or friends are at home.  For the more technically minded, it could represent the status of a set of services.

I will endeavour to keep my Pi up for a few days if you would like to try the Tell.py script yourself.  Check the gallery around 15–30 seconds after a command has been sent to see if it worked.  Send me a tweet before you do; it’s always nice to see these things in action.  Let me know where you’re from and we can see who gets the award for the furthest connection.  Southend to Farnborough is the distance to beat.

Please do not make repeated, rapid calls, else I’ll have to shut it down for fear of the psychological damage to Babbage caused by all those lights!

The code? Glad you asked…

Tell.py

# Tell.py  12/09/2014  C.Monk
#
# Derived from
# Tell.py  12/09/2014  D.J.Whale
#
# Tells a revealed actuator to do something.
# Please exercise restraint and do not make frequent repetitive calls
 
import IoticLabs.JoinIOT as IOT
from config import *
import time
 
MY_NAME        = MY_COMPUTER + "_Tell"
THEIR_COMPUTER = "ForToffee_Pi_1"
THEIR_NAME     = THEIR_COMPUTER + "_Reveal"
THEIR_ACTUATOR = "PIXEL"
 
IOT.joinAs(MY_NAME)
pixel = IOT.attachTo(THEIR_NAME, THEIR_ACTUATOR)
 
def main():
 
  while True:
    p = int(raw_input("Pixel (0 - 15): "))
    if p < 0:
      break
    r = int(raw_input("Red (0-255): "))
    g = int(raw_input("Green (0-255): "))
    b = int(raw_input("Blue (0-255): "))
 
    print ("Sending pixel {} = r{}, g{}, b{}".format(p,r,g,b))
    pixel.tell([p,r,g,b]) 
 
try:
  main()
finally:
  IOT.leave()

Reveal.py

# Reveal.py  12/09/2014  C.Monk
#
# Derived from
# Reveal.py  05/09/2014  D.J.Whale
#
# Reveals a set of neopixels so they can be remotely controlled.
#
# This script is designed only for the Raspberry Pi 
 
import IoticLabs.JoinIOT as IOT
from config import *
import time
from datetime import datetime
import picamera
import piduino
from Copro import *
import ftplib
from subprocess import call
 
piduino.firmware("firmware/copro/copro.hex")
piduino.connect(baud=9600)
 
MY_NAME        = MY_COMPUTER + "_Reveal"
MY_ACTUATOR    = "PIXEL"
 
IOT.joinAs(MY_NAME)
 
def newData(actuator, value):
  #print("actuator: ", actuator.topic, actuator.originator, actuator.unique)
  #print("data:" + value)
  data = value.strip('[').strip(']').split(',')
  neopixel(int(data[0]), (int(data[1])/5, int(data[2])/5, int(data[3])/5)) #r,g,b / 5 to save retina!
 
  print 'Capturing image'
  i = datetime.now()  
  now = i.strftime('%Y%m%d-%H%M%S')  
  tweettime = i.strftime('%Y/%m/%d %H:%M:%S')
  photo_name = now + '.jpg'
  photo_full_path = '/home/pi/grabs/' + photo_name
 
#http://picamera.readthedocs.org/en/release-1.8/recipes1.html
  with picamera.PiCamera() as camera:
      camera.resolution = (1024, 768)
      #camera.start_preview()
      # Camera warm-up time
      time.sleep(2)
      camera.capture(photo_full_path)
 
#http://raspi.tv/2014/overlaying-text-and-graphics-on-a-photo-and-tweeting-it-pt-5-twitter-app-series
  overlay_text = "/usr/bin/convert "+ photo_full_path + "  -pointsize 24 -fill white -annotate +40+728 '" + actuator.topic + "' "  
  overlay_text += " -pointsize 36 -fill white -annotate +40+675 'Pixel " + data[0] + " = R:" + data[1] + " G:" + data[2] + " B:" + data[3] + "' " + photo_full_path  
 
  print "overlaying text"  
  call ([overlay_text], shell=True)  
 
#http://stackoverflow.com/questions/12613797/python-script-uploading-files-via-ftp
  print 'Uploading image'
  session = ftplib.FTP('','','')	# server, username and password blanked for the published code
  fup = open(photo_full_path, 'rb')
  session.storbinary('STOR ' + photo_name, fup)
  fup.close()
  session.quit()
  print 'Done'
 
try:
  IOT.reveal(MY_ACTUATOR, incoming=newData)
  IOT.loop(True)
finally:
  IOT.leave()
  neopixel(0xFF, (0, 0, 0))

What Next?

I need to look at the other functions currently implemented; the more curious amongst you will have noticed the library code is very much in its infancy.  There’s no direct feedback when a message is sent, so the remote end doesn’t know what happened.  One solution at the moment looks to be Feeds, so that may be my next step with this code.

Have fun, and let me know how you get on.


Inventing with the IoT – Workshop

CamJam has a lot to answer for; mainly the evaporation of my evenings this past week. During the jam I helped run the “Inventing with the Internet of Things” workshop. I had no idea what it was all about, as it was a last-minute thing, but I know enough Python and the principles of networking to bumble along. The workshop was run by David Whale in conjunction with IoTic Labs.

@whalegeek & @geeky_tim demonstrate the "Internet of Things"

@whalegeek & @geeky_tim demonstrate the “Internet of Things”

The general gist of the workshop is that each Raspberry Pi was a “Thing” on the Internet. Pairing off, one Pi acted as a “sensor” and sent a message when a button was pressed. The other reacted to that message by lighting an LED and playing a sound. Over the course of the workshop the idea was to demonstrate an event happening (a door opened) and a reaction to that event occurring (an alarm sounded), demonstrating what the IoT can and will evolve into.

The source from the workshop is available from the IoTic Labs GitHub. Tell.py is the sensor; Reveal.py is the recipient of that sensor information.

Download, tweak and have fun. You can run both parts on one Pi, and if you don’t have a PiBrella, tweak the code to do something else (wait for input via raw_input(), or print a message on screen). Once you strip out the Pi-specific code it will run under Python on any platform! Modify config.py and set the MY_COMPUTER variable to something unique to you. In my case it now reads

MY_COMPUTER = "ForToffee_Pi_" + str(MY_NUMBER)

When you’re ready hit up the next article for the Extension Task!


PiDuino Adventures

At the weekend I took a 90-mile road trip north to Cambridge for the latest CamJam. As always there was lots to see and do, along with several purveyors of Pi-related goodies, which may have been my wallet’s downfall.  Amongst my stash of swag is a PiDuino from the SKPang stall; at £10, down from £18, it was rude not to!

The PiDuino is an add-on board to connect an Atmega328 to a Raspberry Pi via SPI and serial.  The SPI is used to program the chip with your Arduino code and the serial (which can be disconnected) is there for communications between the devices.


A PiDuino

At the CamJam in July I had been shown an early version by David (whaleygeek) Whale, who has provided not only software support for the Arduino side, but also a Python library to load the compiled firmware onto the chip at runtime (more on that later).

Soldering Iron at the Ready

The PiDuino comes in kit form, so I had to get soldering first.  The instructions are nice and clear; however, I have asked SKPang if they could adjust the sequence of events to start with the lower-profile components first.  Starting with the jumper headers made the board unstable on the worktop.

After a couple of user errors (read twice, solder once!) the board powered up and away we went!

Getting Up and Running

The first thing I did was follow the instructions to install a modified version of avrdude from Gordon Henderson, and the Arduino IDE.  In reality you only need to do this if you plan on putting together and compiling your own firmware for the ATmega on the Pi.

Enter stage left WhaleyGeek and his Python loader.  Hats off to David on this one, I won’t go into details, just read his blog post about it.

It Lives!

Yup, it had to be done; Test_Blinky.py was run, the firmware loaded and the little red LED on the PiDuino winked at me.  It’s amazing how satisfying a flashing LED can be.

After this I downloaded David’s NeoPixel colour-mixer code, wired up a 16-LED NeoPixel ring, tweaked the Python (24 NeoPixels down to 16) and ran the code.  It worked great!  The firmware that’s loaded is called CoPro and currently controls a set of NeoPixels and a servo, and posts messages back when one of three analogue inputs changes – all via serial comms.
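Loading the firmware and lighting the ring boils down to a few lines using David’s piduino loader and the Copro helpers (a sketch, assuming the copro.hex path from his examples):

import time
import piduino
from Copro import *   # provides neopixel() and friends

piduino.firmware("firmware/copro/copro.hex")   # flash CoPro onto the ATmega
piduino.connect(baud=9600)                     # open the serial link

for i in range(16):
    neopixel(i, (32, 0, 32))   # dim magenta
    time.sleep(0.01)           # short pause per pixel -- see the gotchas below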

Note: as I found out, the PiDuino (understandably) draws current from your Pi.  Make sure your power supply is up to the job; the USB hub on my Dell monitor wasn’t, and every time I inserted, removed or touched a wire it caused a Pi-resetting brownout.  A 2A power supply stabilised things nicely.

I also attached a 10k pot (variable resistor) I had kicking around in my kit.  This allowed me to simulate one of the resistive strips on the demo program.

Time to Tinker

Once I knew it was all working, it was time to tinker.  Using the demo program as a base I wrote a program to change the lit LED position on the NeoPixel ring in step with the 10k pot; as it turns, the LED moves round the ring.  Oh, and it randomly picks a colour from a pre-set list, for added blinkiness.
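The full source is in the zip mentioned below; the core of it is just a scale from the ADC range to a ring position.  read_pot() here is a hypothetical stand-in for however you pull the analogue value off the serial feed:

import random

COLOURS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]  # pre-set list
PIXELS = 16

def pot_to_pixel(reading):
    """Scale a 0-1023 ADC reading to a pixel index on the 16-LED ring."""
    return reading * PIXELS // 1024

# read_pot() is hypothetical: the CoPro firmware pushes changed analogue
# values over serial, so the real reader parses that stream instead.
pos = pot_to_pixel(read_pot())
neopixel(pos, random.choice(COLOURS))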

When I was happy with that I added a servo I had to hand, to try out the servo support in the CoPro firmware.  Again this worked very nicely, as you can see in the video below (apologies for the poor lighting; hopefully the YouTube enhancements have improved things).

There were a few things that tripped me up:

  1. The neopixel command requires a short pause after each pixel is sent
  2. Global variables are a pain in the bum
  3. The CoPro firmware pushes the current analogue value to the serial buffer when it changes.  If nothing has changed since the last read you get back no values

I fixed the first in the copro.py module provided, as the pause is already present if you send multiple pixels.  I’ve included the tweaked copro.py and my source in this zip.  I’ve not included the rest of the PiDuino files, as these are available from David’s blog post mentioned earlier.

Here are some close-ups of the wiring – A0 = 10k pot (3.3V), D9 = servo (5V), D10 = NeoPixel ring (5V)

PiDuino wiring and breadboard wiring
