ToffeeBot

OK, so not quite the Sheldon Virtual Presence Device from The Big Bang Theory, but who knows… maybe my next version?

I have had various bits of robots floating around for a while now and never actually pulled them all together. I’ve made both a Scratch-controlled and a PS3-controlled, Pibrella-powered bot, which I must blog about one of these days. I also built one from LEGO motors and an SN754410 H-bridge a year or two back, but the motors were old (20+ years) and it didn’t get far.

Hardware

4tronix MicRoCon – An all-in-one platform that provides power conversion from 6-9v down to 5v so it can power the Pi, as well as an H-bridge to drive the motors.  The other feature I will make use of is some 5v connectors for servos.  Mine is the v1.1 from early 2014; v3 is the current version, hence the change in looks.

Adafruit Pan and Tilt – I picked this up from Pimoroni.  It’s a 2-servo device with a base plate to mount items on, like a…

Raspberry Pi Camera – If you don’t know what this is, it’s a webcam that plugs directly into the Pi, not via USB.  Loads of places stock these.

Car/Bot Chassis – Mine is one of these from eBay.

WiFi – Yes, you could do this wired, but that would be very limiting!

Software

Flask – A Python-based lightweight web framework.  It’s relatively straightforward to use and allows us to control the GPIO directly.

RPIO – For this project I have moved away from the commonly used RPi.GPIO library, as RPIO’s servo control is less “jittery”.  The RPIO library uses C to better manage the precise timings needed.  Ultimately, driving the servos via an Arduino or similar would provide the best results, but that was more complicated than I wanted to get.

The Build

Attach Servos

The two servos are attached to the MicRoCon board via the 3-pin headers.

Attach Motors

The motors are wired into this 4-way screw terminal.

Attach Battery

The battery pack is wired into this 2-way screw terminal.

Then, using a liberal amount of Blu-Tack, tape and willpower, it’s all fixed to the chassis.

The Code

My code is derived/inspired by this and this, and adapted for my needs.  My source can be found on GitHub.  I’ll quickly go through the key components to give some context.

servo.py – As the name suggests, this is a class to control a servo.  When an instance of the class is created it takes the BCM pin number the servo is connected to.  Once created, the instance has two methods:

  • setAngle(degrees) – set the angle of the servo between -90 and +90
  • incAngle(increment) – specify the number of degrees to move the servo (+ or -)

Why BCM numbers?  Mainly because they are the default for RPIO.
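
I won’t paste servo.py here, but a minimal sketch of the idea – assuming RPIO’s PWM.Servo interface (set_servo(pin, pulse_width_us)) and a typical 1000-2000µs pulse range, which varies by servo – looks something like this:

# Illustrative sketch only - not the actual servo.py.
# Assumes RPIO's PWM.Servo interface and a 1000-2000us pulse range;
# real servos vary, so tweak the numbers to suit.
from RPIO import PWM

class Servo:
    def __init__(self, bcm_pin):
        self.pin = bcm_pin
        self.angle = 0
        self.pwm = PWM.Servo()          # DMA-driven servo controller

    def setAngle(self, degrees):
        # clamp to -90..+90, then map to a pulse width in microseconds
        self.angle = max(-90, min(90, degrees))
        pulse_us = 1500 + (self.angle / 90.0) * 500
        # RPIO expects pulse widths in multiples of 10us
        self.pwm.set_servo(self.pin, int(round(pulse_us / 10.0) * 10))

    def incAngle(self, increment):
        self.setAngle(self.angle + increment)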

motor.py – This is still a “work in progress” as I would like to add speed control via PWM (Pulse Width Modulation – i.e. turning the motor on and off quickly to reduce the rotation speed).  However, for my initial needs it serves its purpose.  On creation this class takes two parameters; these are the two BCM pin numbers that control the motor direction on the H-bridge chip (i.e. forward and back).  A rough sketch of the class follows the method list below.

  • start(direction, speed, duration) – activates the motor.
    • direction can be F[orward] or B[ackwards]
    • speed (not used yet) will control how fast the motor goes
    • duration denotes the number of seconds to run for. If duration is 0 (zero) then the motor stays on until stop() is called
  • stop() – stops all motor activity
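
The sketch mentioned above – again just an illustration rather than the real motor.py, assuming RPIO’s RPi.GPIO-compatible setup()/output() calls and BCM numbering:

# Illustrative sketch only - not the actual motor.py.
# Assumes RPIO's RPi.GPIO-compatible setup()/output() calls and
# BCM pin numbering (RPIO's default).
import time
import RPIO

class Motor:
    def __init__(self, fwd_pin, rev_pin):
        self.fwd_pin = fwd_pin
        self.rev_pin = rev_pin
        RPIO.setup(self.fwd_pin, RPIO.OUT)
        RPIO.setup(self.rev_pin, RPIO.OUT)
        self.stop()

    def start(self, direction, speed=100, duration=0):
        # speed is accepted but ignored for now, as noted above
        self.stop()
        pin = self.fwd_pin if direction.upper().startswith('F') else self.rev_pin
        RPIO.output(pin, True)
        if duration > 0:
            time.sleep(duration)
            self.stop()

    def stop(self):
        RPIO.output(self.fwd_pin, False)
        RPIO.output(self.rev_pin, False)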

camera_pi.py – Taken directly from here.  It provides a streaming interface from the Pi camera module.

templates/index.html – Flask uses a templates folder to manage the HTML pages it displays.  This is the only page and provides the buttons to control the motors and an area to display the camera feed.

app.py – This is the brains of the outfit and acts as our web server.  The camera, servo, and motor control are all done from here.  The key functions of interest are (a rough outline of the whole app follows this list):

  • video_feed() – This is called by the HTML to request a new image to display from the camera.  The function gen() calls the Camera class from camera_pi.py and requests a new image
  • move(direction) – this function takes the commands generated by the buttons on the web page and acts on the request to move servos or turn motors.
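
And the rough outline promised above.  This is based on the Flask video-streaming recipe that camera_pi.py comes from, rather than being a copy of my app.py, so the route paths and the Camera.get_frame() call are assumptions:

# Illustrative outline only - see GitHub for the real app.py.
from flask import Flask, render_template, Response
from camera_pi import Camera

app = Flask(__name__)

def gen(camera):
    # yield an endless multipart stream of JPEG frames from the camera
    while True:
        frame = camera.get_frame()
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n')

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/video_feed')
def video_feed():
    return Response(gen(Camera()),
                    mimetype='multipart/x-mixed-replace; boundary=frame')

@app.route('/move/<direction>')
def move(direction):
    # act on the button press here: pan/tilt the servos or drive the motors
    return 'OK'

if __name__ == '__main__':
    # threaded=True lets the video stream and button presses be handled
    # at the same time (see "Threading" under Lessons Learned)
    app.run(host='0.0.0.0', threaded=True)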

It Lives!

The video quality is not great, but you get the idea.

Lessons Learned

More of an aide-mémoire in case I stumble over these again, or warnings for the weary traveller.

We need more power! – The 4xAA batteries I had were fine running the wifi, camera and servos.  When the motors were added the wifi connection would drop or the Pi would reset due to low voltage/current.  There are two ways to solve this – more batteries (up to 9v on this board), or powering the Pi directly.  I chose the latter option as I had a small(ish) 5.2Ah phone charging battery to hand.  The upside is if the motor batteries fail the Pi doesn’t go with them, plus my portable battery has a charge indicator on it so I could see if the Pi was about to expire.  I will try a 9v at some point as it will be far more compact.

Threading – No, we’re not talking sewing or wearable tech here.  Threading allows two or more actions in a program to occur at the same time.  The video feed is constantly updating; a button press is a second action, but it gets blocked because the server is still busy updating the image.  Enabling threading in Flask allowed both actions to take place without conflicting with each other.  This does carry some risks, and there are various dire warnings on the Internet about using it in production code.  However, this is for one or two users in a local environment.  I think we’re safe!

To-Do

Like all good hacking projects there’s still stuff to do.

  • Speed in the motor control
  • Catch exceptions in the video feed to stop the server crashing
  • ‘Touch’ friendly, i.e. keep moving servo/motor while button pressed, stop when let go

A #CheerLights Virtual Christmas Tree

What is #CheerLights?

CheerLights is an IoT-based light control system, originally intended to allow social media to dictate the colour of festive lights around the world.


The current CheerLights status

How does it work?

Put the word CheerLights and a colour in a tweet and you’ve just told light systems around the world which colour they should show.  The website has a list of the currently supported colours.

On the back-end there is a feed aggregator which exposes a JSON, TXT or XML API.
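
If you want to poll it yourself, a minimal sketch looks something like this – the ThingSpeak URL and field name are from memory, so double-check them against the CheerLights API page:

# Illustrative sketch only - polls the "last" feed for colour changes.
# The channel/field in the URL is an assumption; check the CheerLights
# site for the current endpoints.
import json
import time
import urllib2

LAST_URL = 'http://api.thingspeak.com/channels/1417/field/1/last.json'

last_colour = None
while True:
    data = json.load(urllib2.urlopen(LAST_URL))
    colour = data.get('field1')
    if colour and colour != last_colour:
        print 'New CheerLights colour:', colour
        last_colour = colour
    time.sleep(15)   # be polite - don't hammer the API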

First Steps

My plan was to use the Pimoroni Unicorn HAT to display the most recent 64 colours from the “Full” feed.  Unfortunately I first started playing with this during my lunch hour at work… where I didn’t have a Pi.  So I wrote a console app in Python to read the JSON, output all the available colours, and then poll the “Last” feed for updates.

console.py on GitHub

Time to put a HAT on

I wrote some code without the Unicorn HAT to hand, then tested it when I got home (there weren’t too many bugs!).  Rather than using the UnicornHAT Python module I chose to use the underlying ws2812 module.  Madness, you say?  But no! The Unicorn HAT is just a matrix of NeoPixels, which means any ws2812 device should “just work”.  Up until now I could only control my 16-pixel NeoPixel ring from the Pi using a microcontroller (read: Arduino clone) as an interface.  So I wired it up; the data line goes to GPIO18 (via a low-value resistor to reduce the risk of power spikes) and +ve to 3.3v, otherwise we’ll blow up the Pi!  It worked as advertised first time – bonus.

Unicorn and 16 pixel ring

Unicorn HAT on A+ and 16 pixel ring on B+

neopixel.py on GitHub

In the code there is a maxPixels variable to allow you to limit the number of pixels you wish to light.  The 64 NeoPixels on the UnicornHAT are wired in a left to right/right to left pattern so it goes

0  1  2  3  4  5  6  7
15 14 13 12 11 10 9  8
16 17 18 19 20 21 22 23

and so on up to 63.  So as each new colour is added the pattern shifts down in a “snake” like way.
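
If you ever need to go the other way – from an (x, y) grid position to a pixel index on that snake layout – the mapping is just this (a quick illustration, not code from neopixel.py):

# Illustrative only - map an (x, y) grid position to a snake-wired index.
def xy_to_index(x, y, width=8):
    if y % 2 == 0:                       # even rows run left to right
        return y * width + x
    return y * width + (width - 1 - x)   # odd rows run right to left

print xy_to_index(0, 0)   # 0
print xy_to_index(0, 1)   # 15 - the row "snakes" back on itself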

The colours are translated to RGB values via a dictionary of the supported CheerLights colours.
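
That dictionary is essentially a colour-name to (R, G, B) lookup along these lines – the values are the usual CheerLights definitions, but check the site for the current list:

# Illustrative lookup table - the supported colours as RGB tuples.
colours = {
    'red':     (255, 0, 0),
    'green':   (0, 128, 0),
    'blue':    (0, 0, 255),
    'cyan':    (0, 255, 255),
    'white':   (255, 255, 255),
    'oldlace': (253, 245, 230),   # a.k.a. warm white
    'purple':  (128, 0, 128),
    'magenta': (255, 0, 255),
    'yellow':  (255, 255, 0),
    'orange':  (255, 165, 0),
    'pink':    (255, 192, 203),
}

r, g, b = colours.get('red', (0, 0, 0))   # unknown names fall back to off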

Let’s Get Festive

This section is UnicornHAT specific as the Python module allows you to treat the pixels as a grid, which makes it easier to draw patterns.  It’s Christmas, and Cheerlights was devised for the festive season, so a tree is the obvious choice.

cheertree.py on GitHub

I ended up with three modes:

  • 0 – All colours – the tree is a mosaic of the most recent colours with the most recent at the top
  • 1 – Lights – five key co-ordinates (defined in a list) represent fairy lights or baubles on the tree.  These are lit by the most recent colours, with the star on the top being the latest
  • 2 – Star – only the most recent colour, and this is the topmost pixel

The tree pattern is defined as a list of pixel co-ordinates.  The above pictures have some white paper over the pixels to diffuse the light, which gives a nice effect.
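
Drawing it is just a matter of looping over that list and calling set_pixel() for each co-ordinate.  A tiny illustration (with made-up co-ordinates rather than the ones in cheertree.py):

# Illustrative only - light a coordinate-list pattern on the Unicorn HAT.
import unicornhat as UH

tree = [[3, 1], [4, 1], [2, 2], [3, 2], [4, 2], [5, 2], [3, 3], [4, 3]]  # made-up shape
star = [3, 7]

UH.clear()
for x, y in tree:
    UH.set_pixel(x, y, 0, 128, 0)            # green for the tree body
UH.set_pixel(star[0], star[1], 255, 255, 0)  # latest colour on the star
UH.show()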

Time for Turkey Curry?

Well, that’s that.  I hope a few of you reading this have a play with the code, be it with generic NeoPixels or a Unicorn.  Let me know how you get on in the comments below.


Word Clock with a Unicorn

A few days back I saw this article about a Word Clock that had been built using an ATmega328P and an 8×8 LED matrix. “Cool,” I thought, I have a bi-colour one of those in the “toy” box; I might have a go at that.

Around the same time, those pesky pirates Pimoroni were having their #yarrbooty twitter competition. I was fortunate enough to win a few rounds and decided to spend some of my booty on a Unicorn HAT and an A+. The Unicorn HAT is an 8×8 matrix of very bright RGB LEDs.

And lo, the Unicorn Word Clock was born.

The template for the words is just printed on a piece of white paper, cut to size.  It’s slightly off but it’s not far out.  The more adept amongst you may want to cut out a larger shape so it encompasses the entire case like a lid.

The code is fairly straightforward. I have defined lists of pixel co-ordinates for the minute phrases, which can then be combined (half + past, ten + to, etc). There is a function to determine the correct minute phrase, and one for the hour. Other than that, hopefully the comments will fill in any gaps.

#!/usr/bin/env python
 
# wordclock.py by Carl Monk (@ForToffee)
# code is provided with no warranty, feel free to use.
# no unicorns were harmed in the making of this code
 
import unicornhat as UH
import time
 
#global variables
hourPattern = []
minPattern = []
 
#pre-defined patterns - groups of x,y co-ords - 0,0 is bottom right with GPIO at bottom
fivePattern = [[7,6],[6,6],[4,6],[2,6]]
tenPattern = [[1,7],[1,6],[0,6]]
fiftPattern = [[7,6],[6,6],[5,6],[3,6],[2,6],[1,6],[0,6]]
twenPattern = [[5,7],[4,7],[3,7],[2,7],[1,7],[0,7]]
halfPattern = [[7,7],[6,7],[7,5],[6,5]]
 
pastPattern = [[4,5],[3,5],[2,5],[1,5]]
toPattern = [[1,5],[0,5]]
 
#function to light the pixels we need to display the time
#pixels is a list of pixels
def showTime(pixels):
	UH.clear()
	for coords in pixels:
		UH.set_pixel(coords[0],coords[1],255,0,255)		#magenta
 
	UH.show()	#once pixels set, call .show() to enable them
 
#function to light the '*' character to show seconds and minutes elapsing
def showTick(m):
	colour = []
	minPart = m % 5		
	# % is modulo which gives us the remainder of m divided by 5
	# this tells us the value 0 - 4 or the part of 5 minutes the time is
 
	if m == -1:		# for setting the '*' off or black
		colour = [0,0,0]
 
	elif minPart == 0:	#:m0 or :m5
		colour = [255,0,0]		#red 
 
	elif minPart == 1 :	#:m1 or :m6
		colour = [0,255,0]		#green
 
	elif minPart == 2 : #:m2 or :m7
		colour = [0,0,255]		#blue
 
	elif minPart == 3 : #:m3 or :m8
		colour = [255,255,0]	#yellow
 
	elif minPart == 4 : #:m4 or :m9
		colour = [0,255,255]	#cyan
 
	UH.set_pixel(5,5,colour[0],colour[1],colour[2])	#5,5 is the position of '*'
	UH.show()
 
#takes the current hour and provides the required pattern of letters
def getHourPattern(h,m):
	global hourPattern
	hourPattern = []
 
	#convert 24hr into 12hr
	if h >= 12:
		h -= 12
 
	#if minutes > 35 then display will be 'to' the next hour
	if m >= 35:
		h = h + 1
		#special case for 11:35 - 12:00.  Hour is 0 to 11 so need to reset to 0
		if h == 12:		
			h = 0
 
	if h == 0:	#aka Twelve
		hourPattern =  [[7,2],[6,2],[5,2],[4,2],[2,2],[1,2]]
	elif h == 1:
		hourPattern =  [[7,3],[6,3],[5,3]]
	elif h == 2:
		hourPattern =  [[7,2],[6,2],[6,1]]
	elif h == 3:
		hourPattern =  [[4,3],[3,3],[2,3],[1,3],[0,3]]
	elif h == 4:
		hourPattern =  [[7,1],[6,1],[5,1],[4,1]]
	elif h == 5:
		hourPattern =  [[3,1],[2,1],[1,1],[0,1]]
	elif h == 6:
		hourPattern =  [[7,0],[6,0],[5,0]]
	elif h == 7:
		hourPattern =  [[4,0],[3,0],[2,0],[1,0],[0,0]]
	elif h == 8:
		hourPattern =  [[4,4],[3,4],[2,4],[1,4],[0,4]]
	elif h == 9:
		hourPattern =  [[7,4],[6,4],[5,4],[4,4]]
	elif h == 10:
		hourPattern =  [[0,4],[0,3],[0,2]]
	elif h == 11:
		hourPattern =  [[5,2],[4,2],[3,2],[2,2],[1,2],[0,2]]
 
#takes the current minute and provides the required pattern of letters
def getMinutePattern(m):
	global minPattern
	minPattern = []
	if 10 > m >= 5 or m >= 55:
		minPattern = fivePattern
	elif 15 > m >= 10 or 55 > m >= 50:
		minPattern = tenPattern
	elif 20 > m >= 15 or 50 > m >= 45:
		minPattern = fiftPattern
	elif 25 > m >= 20 or 45 > m >= 40:
		minPattern = twenPattern
	elif 30 > m >= 25 or 40 > m >= 35:
		minPattern = twenPattern + fivePattern
	elif 35 > m >= 30:
		minPattern = halfPattern
 
	#if time between 5 and 34 we need to show 'past' the hour
	if 35 > m >= 5:
		minPattern = minPattern + pastPattern
	elif m >= 35:	#otherwise 'to' the hour
		minPattern = minPattern + toPattern
 
#cycle through a full 12hrs minute by minute
def fullTest():		
	for n in range(12*60):
		getHourPattern(n / 60, n % 60)
		getMinutePattern(n % 60)
		showTime(minPattern + hourPattern)
		showTick(n)
		time.sleep(.25)
 
#cycle through hours, then minutes
def quickTest():
 
	for n in range(12):
		getHourPattern(n, 0)
		showTime(hourPattern)
		time.sleep(.5)
 
	for n in range(60):
		getMinutePattern(n)
		showTime(minPattern )
		showTick(n)
		time.sleep(.25)
 
 
#main function 
quickTest()
 
while True:
	#get time parts
	h = time.localtime().tm_hour
	m = time.localtime().tm_min
	s = time.localtime().tm_sec
 
	#get patterns
	getHourPattern(h, m)
	getMinutePattern(m)
 
	#show patterns
	showTime(minPattern + hourPattern)
 
	#flash '*' to show time passing, lit every 2 seconds
	if s % 2:
		showTick(m)
	else:
		showTick(-1)
 
 
	time.sleep(1)

Inventing with the IoT – Extension Task

If you have not read Inventing with the IoT – Workshop go do so first, otherwise this will make less sense than usual!

As I said in my previous post, I had never actually worked with the IoT code before the workshop, so I downloaded it and combined it with my PiDuino code from earlier this week.  As a result I have an Internet of Things NeoPixel ring.


IoT NeoPixel ring being modelled by Babbage

The modified Tell.py is straightforward enough.  Remove all the PiBrella code as we’re not using the buttons, then add some lines to get a pixel ID and red, green and blue colour values to define the pixel colour.  Bundle it together as a list object and send it to the receiver.

The initial version of the Reveal.py reads the incoming data and uses the values to set the NeoPixel via the PiDuino (as described in my previous article).  That’s all well and good, but how will the remote end sending the command know it’s worked?  Good job I have a PiCamera to hand!  A little Heath Robinson later (note the camera mount), once the pixel is set, an image is captured, some text is overlaid and then uploaded to a gallery on my web hosting. I am using UberGallery, which is just the job as it simply displays the images in a folder.


The finished article – stuffed toys optional

The following morning @SouthendRPiJams (aka Andy), who was in the workshop on Saturday, and I had an IoT exchange. I sent his Pi a message (i.e. “Hello”), which then appeared on his screen and made his PiGlow flash via modified versions of the Tell.py and Reveal.py code (the tell() command just sends the string). He then set up a Tell.py to send me the details my Reveal.py needed to light a pixel, and you can currently see the results in the gallery.

So what’s the practical application?  Currently none; this is just an experiment.  But when you look at IoT lights like the Good Night Lamp, this project could evolve into something similar, allowing you to be notified when family or friends are at home.  For the more technically minded it could represent the status of a set of services.

I will endeavour to keep my Pi up for a few days if you would like to try the Tell.py script yourself.  Check the gallery around 15-30 seconds after a command has been sent to see if it worked. Send me a tweet before you do; it’s always nice to see these things in action.  Let me know where you’re from and we can see who gets the award for the furthest connection.  Southend to Farnborough is the distance to beat.

Please do not make repeated, rapid calls, else I’ll have to shut it down for fear of psychological damage to Babbage caused by all those lights!

The code? Glad you asked…

Tell.py

# Tell.py  12/09/2014  C.Monk
#
# Derived from
# Tell.py  12/09/2014  D.J.Whale
#
# Tells a revealed actuator to do something.
# Please exercise restraint and do not make frequent repetitive calls
 
import IoticLabs.JoinIOT as IOT
from config import *
import time
 
MY_NAME        = MY_COMPUTER + "_Tell"
THEIR_COMPUTER = "ForToffee_Pi_1"
THEIR_NAME     = THEIR_COMPUTER + "_Reveal"
THEIR_ACTUATOR = "PIXEL"
 
IOT.joinAs(MY_NAME)
pixel = IOT.attachTo(THEIR_NAME, THEIR_ACTUATOR)
 
def main():
 
  while True:
    p = int(raw_input("Pixel (0 - 15): "))
    if p < 0:
      break
    r = int(raw_input("Red (0-255): "))
    g = int(raw_input("Green (0-255): "))
    b = int(raw_input("Blue (0-255): "))
 
    print ("Sending pixel {} = r{}, g{}, b{}".format(p,r,g,b))
    pixel.tell([p,r,g,b]) 
 
try:
  main()
finally:
  IOT.leave()

Reveal.py

# Reveal.py  12/09/2014  C.Monk
#
# Derived from
# Reveal.py  05/09/2014  D.J.Whale
#
# Reveals a set of neopixels so they can be remotely controlled.
#
# This script is designed only for the Raspberry Pi 
 
import IoticLabs.JoinIOT as IOT
from config import *
import time
from datetime import datetime
import picamera
import piduino
from Copro import *
import ftplib
from subprocess import call
 
piduino.firmware("firmware/copro/copro.hex")
piduino.connect(baud=9600)
 
MY_NAME        = MY_COMPUTER + "_Reveal"
MY_ACTUATOR    = "PIXEL"
 
IOT.joinAs(MY_NAME)
 
def newData(actuator, value):
  #print("actuator: ", actuator.topic, actuator.originator, actuator.unique)
  #print("data:" + value)
  data = value.strip('[').strip(']').split(',')
  neopixel(int(data[0]), (int(data[1])/5, int(data[2])/5, int(data[3])/5)) #r,g,b / 5 to save retina!
 
  print 'Capturing image'
  i = datetime.now()  
  now = i.strftime('%Y%m%d-%H%M%S')  
  tweettime = i.strftime('%Y/%m/%d %H:%M:%S')
  photo_name = now + '.jpg'
  photo_full_path = '/home/pi/grabs/' + photo_name
 
#http://picamera.readthedocs.org/en/release-1.8/recipes1.html
  with picamera.PiCamera() as camera:
      camera.resolution = (1024, 768)
      #camera.start_preview()
      # Camera warm-up time
      time.sleep(2)
      camera.capture(photo_full_path)
 
#http://raspi.tv/2014/overlaying-text-and-graphics-on-a-photo-and-tweeting-it-pt-5-twitter-app-series
  overlay_text = "/usr/bin/convert "+ photo_full_path + "  -pointsize 24 -fill white -annotate +40+728 '" + actuator.topic + "' "  
  overlay_text += " -pointsize 36 -fill white -annotate +40+675 'Pixel " + data[0] + " = R:" + data[1] + " G:" + data[2] + " B:" + data[3] + "' " + photo_full_path  
 
  print "overlaying text"  
  call ([overlay_text], shell=True)  
 
#http://stackoverflow.com/questions/12613797/python-script-uploading-files-via-ftp
  print 'Uploading image'
  session = ftplib.FTP('','','')
  fup = open(photo_full_path, 'rb')
  session.storbinary('STOR ' + photo_name, fup)
  fup.close()
  session.quit()
  print 'Done'
 
try:
  IOT.reveal(MY_ACTUATOR, incoming=newData)
  IOT.loop(True)
finally:
  IOT.leave()
  neopixel(0xFF, (0, 0, 0))

What Next?

I need to look at the other functions currently implemented; the more curious amongst you will have noticed the library code is very much in its infancy.  There’s no direct feedback when a message is sent, so the remote end doesn’t know what happened.  One solution to that at the moment looks to be Feeds, so that may be my next step with this code.

Have fun, and let me know how you get on.


Inventing with the IoT – Workshop

CamJam has a lot to answer for; mainly the evaporation of my evenings this past week. During the jam I helped run the “Inventing with the Internet of Things” workshop. I had no idea what it was all about, as it was a last-minute thing, but I know enough Python and the principles of networking to bumble along. The workshop was run by David Whale in conjunction with IoTic Labs.


@whalegeek & @geeky_tim demonstrate the “Internet of Things”

The general gist of the workshop was that each Raspberry Pi was a “Thing” on the Internet. Pairing off, one Pi acted as a “sensor” and sent a message when a button was pressed; the other reacted to that message by lighting an LED and playing a sound. Over the course of the workshop the idea was to demonstrate an event happening (a door opened) and a reaction to that event occurring (an alarm sounded), thus demonstrating what the IoT can/will evolve into.

The source from the workshop is available from the IoTic Labs GitHub. Tell.py is the sensor; Reveal.py is the recipient of that sensor information.

Download, tweak and have fun. You can run both parts on your Pi; if you don’t have a PiBrella, tweak the code to do something else (wait for input via raw_input(), or print a message on screen – there’s a sketch of that sort of tweak below). Once you strip out the Pi-specific code it will run in Python on any platform! Modify config.py and set the MY_COMPUTER variable to something unique to you. In my case it now reads

MY_COMPUTER = "ForToffee_Pi_" + str(MY_NUMBER)
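
Here’s the kind of no-PiBrella tweak I mean – a minimal Reveal.py that just prints whatever arrives.  It reuses the same joinAs/reveal/loop calls as the workshop code; the "BUTTON" actuator name is an assumption, so match it to whatever the paired Tell.py attaches to:

# Illustrative sketch only - a PiBrella-free Reveal.py that just prints
# whatever arrives.
import IoticLabs.JoinIOT as IOT
from config import *

MY_NAME = MY_COMPUTER + "_Reveal"
IOT.joinAs(MY_NAME)

def newData(actuator, value):
    # react however you like here - we just print the incoming value
    print "received:", value

try:
    IOT.reveal("BUTTON", incoming=newData)
    IOT.loop(True)   # block and wait for incoming messages
finally:
    IOT.leave()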

When you’re ready hit up the next article for the Extension Task!
