Wednesday, May 10, 2017

Raspberry Pi 3 with Google AIY VoiceHat

I was lucky enough to get a copy of Issue 57 of The MagPi with the VoiceHat from Google on the cover.

This is a great bit of kit with Raspberry Pi Hat, speaker, microphone, and a cardboard box to put it in.
For a bit of fun I fitted it into a cardboard R2D2 that I had from before Christmas.

This is not the droid you're looking for
After doing the usual of asking it to tell a joke and give me the weather and details of what's nearby I was wondering what I could use the VoiceHat for.

We have an Amazon Dot and its main purpose in the house is playing radio stations.  I tried this out and nothing played.  It seems streaming radio support isn't built in.

Then keeping up with what others were doing Mike Redrobe posted on the Raspberry Pi forum that he had it playing YouTube audio.

With a bit of assistance I got this working.  If you're careful with the name of the video you can get it to play pretty much any song.  My kids loved this.

But I still wanted to play streaming radio.  You know, one command and it's off rather than needing to keep telling it new songs to play.

In a later post Mike mentions using VLC as the player and armed with his example I began looking to see if I could get streaming radio to work.

I did. Yeah! Then I went on the trail to add playing podcasts.  Again a success.
You need to install VLC:

sudo apt update
sudo apt upgrade
sudo apt install vlc
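Once VLC is installed, a stream can be launched headless from Python using `cvlc` (VLC's command-line interface) via `subprocess`.  This is just a sketch of the idea, not the exact code from the project, and the URL shown is a placeholder:

```python
import subprocess

def vlc_command(url):
    """Build the command that plays a network stream with no interface."""
    return ["/usr/bin/cvlc", "--no-video", url]

def play_stream(url):
    """Launch VLC in the background; keep the handle so it can be stopped later."""
    return subprocess.Popen(vlc_command(url), stdin=subprocess.PIPE)

# e.g. player = play_stream("http://example.com/stream")  # placeholder URL
```

Keeping the `Popen` handle (or just using `pkill vlc`, as below) is what makes the one-command stop possible later on.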

Radio Stations supported are:
Absolute Radio
Absolute 80s
Absolute 90s
Absolute 00s
Eagle Radio
BBC Radio 1
BBC Radio 2
BBC Radio 3
BBC Radio 4
Capital FM

Podcasts Supported are:
Good Job Brain
No Such Thing As a Fish

Here is a little video of it working.  I recorded it in portrait as the cardboard R2D2 it lives in is portrait shaped, so if you're viewing on a phone it will fill the screen, while on a computer it will have side bars, which is probably better than seeing the messy table.

The file that's needed with the commands built in is available on GitHub

Other useful bits and pieces

Knowing what Google is actually hearing

While setting up the radio stations Absolute 90s was interpreted by Google as Absolute 90s, but Absolute 00s was interpreted as absolute noughties.
It took me a while to figure this out and the command that helped me was:

sudo journalctl -u voice-recognizer -n 20 -f

This shows what's happening when you issue a command.
Here's a small snippet showing what the voice command is converted to.
Absolute 00s, or is that Absolute Noughties?

Make sure to stop and start the service after each change:

sudo systemctl stop voice-recognizer && sudo systemctl start voice-recognizer

Want to add more radio stations
This website has a list of streaming radio stations and their urls.  This is the UK list.
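One way to wire up new stations is a simple phrase-to-URL dictionary that the command handler looks up before launching VLC.  This is a sketch of that idea, not the project's exact code, and the URL is a placeholder to be replaced with one from that list:

```python
STATIONS = {
    # phrase the recogniser hears -> stream URL (placeholder shown)
    "absolute radio": "http://example.com/absolute-stream",
}

def station_command(phrase, stations=STATIONS):
    """Turn a spoken phrase into a cvlc command list, or None if unknown."""
    url = stations.get(phrase.lower())
    if url is None:
        return None
    return ["/usr/bin/cvlc", "--no-video", url]
```

Lower-casing the phrase first matters because the recogniser's capitalisation varies (see the Absolute Noughties surprise above).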

Stopping the playback

Since I'm using the button to stop the playback based on Mike Redrobe's code I decided to do the GPIO setup at the top so it is shared by all the commands rather than having it in each command.

The code is a while loop that keeps going until the button is pressed, followed by a kill statement to stop the process.  To be truthful I'm not sure how the kill bit works, I just know it does, so thank you to Mike for sharing.

while gpio.input(23):
    time.sleep(0.1)  # wait here until the button on GPIO 23 is pressed

pkill = subprocess.Popen(["/usr/bin/pkill", "vlc"], stdin=subprocess.PIPE)

Doing the podcasts
Getting the most recent podcast to play is a bit more hacky than the radio stream: the radio stream URL is always the same, while each podcast episode has a different URL.  To achieve this I first had to find the relevant RSS feed and then parse it to find the start and end of the URL to the mp3.  Most podcasts aren't too bad, but the Freakonomics one, based off the Feedburner RSS, was a little more complex as the URL to the mp3 is actually a redirect URL, and those don't work.

You can see in the middle there is redirect.mp3 and then what looks like a path to a different website
This meant the parser had to first find "/redirect.mp3/" and then from there find "mp3" to get the main bit for the link.
Finally it needed to add back in "http://"
You can see how this is done in the code below.

url = ''  # the RSS feed URL goes here
response = urllib.request.urlopen(url)
data = response.read()  # a `bytes` object
text = data.decode('utf-8')
startmp3 = text.find('/redirect.mp3/') + 14
endmp3 = text.find('.mp3', startmp3 + 16) + 4
if startmp3 > 13:  # find() returned -1 if the marker wasn't in the feed
    command = "http://" + text[startmp3:endmp3]
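The extraction logic can be factored into a small helper so it's easy to test without hitting the network.  This is a sketch, and the sample enclosure string below is made up purely for illustration:

```python
def extract_redirect_mp3(text):
    """Pull the real mp3 address out of a Feedburner-style redirect link.

    The link looks like .../redirect.mp3/host.example/path/episode.mp3...
    and the part after '/redirect.mp3/' is the address we actually want.
    """
    marker = '/redirect.mp3/'
    start = text.find(marker)
    if start < 0:
        return None          # marker not present in this feed
    start += len(marker)
    end = text.find('.mp3', start) + 4
    return 'http://' + text[start:end]

# Made-up sample enclosure for illustration:
sample = '<enclosure url="http://feeds.example/~r/pod/redirect.mp3/media.example/ep01.mp3"/>'
print(extract_redirect_mp3(sample))  # → http://media.example/ep01.mp3
```

Returning `None` when the marker is missing avoids the subtle trap in the inline version, where `find()` returning -1 still produced a positive index after the offset was added.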

Have a play and see what you can get the Raspberry Pi AIY to do.
Just be sure to backup any files you modify before changing them.
I didn't backup mine when I started playing, but luckily Google has made the original files available on GitHub.

I'm really looking forward to seeing how this grows and expands.

Just saw that has been doing some great things with the AIY as well.  Worth checking out if you want to do more.

I'll definitely be adding the shutdown and reboot commands from here.

Tuesday, April 18, 2017

3rd Wimbledon Raspberry Jam final agenda

The 3rd Wimbledon Raspberry Jam on the 23rd April Agenda has been finalised.

Thursday, February 16, 2017

Green Screen photo booth using Raspberry Pi

We've seen a number of photo booths here, here, and even the All Seeing Pi being done with a Raspberry Pi.  All of these photo booths are based on taking a picture and maybe adding an overlay, such as a banner or something like a funny hat or moustache.

My thought was: could the Raspberry Pi 3 do green screen?  You know, that special effect from the movies where the background is removed and a different image put in its place.

As a kid I think the first time I remember seeing this was Superman.

How cool would that be to make your own version of the special effects used in big budget movies.

So, I started the search for tools under Linux that would permit the green screen to be done.
As a short summary, green screen/chroma key is where a single colour is removed from an image.
Most often it is green, as modern cameras are more sensitive to green, and a bright green works best as it's less likely to be a colour in a natural scene.  I remember when I was younger hearing it being done with blue screens as well.

After a bit of looking I found ImageMagick, a jack-of-all-trades image processing tool.  It has a function to remove a single colour from an image and, as importantly for my use, includes a 'fuzzy' search for the colour, which gave a bit of tolerance to the lighting.

In the code the most important line is:
os.system('/usr/bin/convert -limit thread 4 ' + folder +'imagecam.png -fuzz '+fuzzpercent+' -transparent "#'+rgbnum +' " ' + folder + 'imagecamt.png')
I know there is a lot going on in there.  The short version is it takes imagecam.png, makes the colour rgbnum transparent with a tolerance of fuzzpercent, and saves it as imagecamt.png.
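In Python the same call can be built as an argument list for `subprocess` instead of one long shell string, which avoids quoting headaches.  This is a sketch; the folder and colour values in the usage comment are examples, not the project's real settings:

```python
import subprocess

def transparent_command(folder, rgbnum, fuzzpercent):
    """ImageMagick convert: knock out colour #rgbnum with a fuzz tolerance."""
    return ["/usr/bin/convert", "-limit", "thread", "4",
            folder + "imagecam.png",
            "-fuzz", fuzzpercent,
            "-transparent", "#" + rgbnum,
            folder + "imagecamt.png"]

# e.g. subprocess.call(transparent_command("/home/pi/booth/", "00ff00", "20%"))
```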

This forum thread was really useful in understanding how to use it.

Once I had my head around this it was then a matter of doing the rest of the code.
Take the picture
Remove the green
Layer a background and the image with the green removed. 
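The layering step in Pygame can be sketched like this, using blank surfaces as stand-ins for the real images and a dummy video driver so it runs headless.  These are assumptions for illustration, not the project's exact code:

```python
import os
os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # headless for testing

import pygame

pygame.init()
size = (640, 480)
screen = pygame.display.set_mode(size)

# Stand-ins: really these would be pygame.image.load() calls for the chosen
# background and for the keyed-out imagecamt.png (loaded with convert_alpha()).
background = pygame.Surface(size)
background.fill((30, 60, 120))
foreground = pygame.Surface(size, pygame.SRCALPHA)  # transparent where green was

screen.blit(background, (0, 0))
screen.blit(foreground, (0, 0))  # transparency lets the background show through
pygame.display.flip()
```

Because the foreground has an alpha channel where the green used to be, blitting it second simply lets the background show through the removed areas.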
For testing I used some of my kids' Playmobil and a piece of A0 green card.

All looks like it's working well, even with the ability to change the background using the arrow keys.

Now for the next level.
Scale it Up to Life Size !!!!

This of course meant I needed a green screen background with a stand, which I bought from eBay.  No idea when I'll use the black or white backgrounds that came with the kit, but I have them now.

And then Tweet the pictures.
For Tweeting I used tweepy and followed the excellent guide by Alex Eames @RasPiTV 

Finally, wouldn't it be great to have a little remote control and not have to rely on a keyboard.  Since the Raspberry Pi 3 has Bluetooth I thought this might be the ideal solution.  No wires and no messing about.  Since Pygame was already being used for the displaying of the images and I knew Pygame had joystick support built in this looked like the obvious choice.

Again, on eBay I found this small little Bluetooth gamepad and thought it would be perfect.  Super small which means you can have it in your hand but not interfere with your final picture.

Iddy biddy, teeny weeny, black gamepad 

My usual style is, before bringing a new feature into a project, to test it standalone to make sure it works.  For the gamepad I created a small Python/Pygame program to test it.
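A minimal test program might look something like this (a sketch, not the exact one I used):

```python
import pygame

pygame.init()
pygame.joystick.init()

count = pygame.joystick.get_count()
print("Gamepads found:", count)

if count > 0:
    pad = pygame.joystick.Joystick(0)
    pad.init()
    print("Name:", pad.get_name())
    print("Buttons:", pad.get_numbuttons(), "Axes:", pad.get_numaxes())
```

If the pad is paired it shows up in the count, and the name and button totals confirm Pygame can actually read it before wiring it into the main program.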

It paired with the Raspberry Pi first time and worked perfectly with the gamepad test program.  So, 100% sure it will function with the chromaCam setup.

Now all the parts are in place.  PiCamera to take the picture.  Imagemagick to remove the background colour.  Pygame to merge background, foreground picture with green removed and finally an overlay with the background images changed with the keyboard or the super small gamepad.  (Oh yeah, the circular thing at the bottom is an analogue joystick)

The chromaCam setup got its outing at the last Wimbledon Raspberry Jam.  Here are some pictures.
On the day I used a camera tripod and a lot of tie wraps to hold the Pi Display and the Pi Camera onto the tripod.  I would definitely recommend a more secure mounting method.

Getting set up

Who doesn't love a bit of Harry Potter -

Those Lions are dangerous

That doesn't look like a sensible thing to do -

Final notes.
If you did click the link above that explains chroma key/green screen then you'll have read that lighting is really important.  The goal is to remove one specific colour.  If your lighting is uneven then, due to shadows or folds in the green screen, the green will vary, giving inconsistent results.
Taking out too much -

I'm sure the top of the Raspberry Pi logo isn't transparent

A few bits of rogue green -

If you want to make your own Green Screen Photo Booth then the code is available on GitHub

Final Thoughts:
Yes, this project worked and was great fun to build and see people using it.
For the project I limited the image size to 640x480 as even at that resolution it took about 2 seconds for the picture to be updated.  This meant it took a little bit of patience to get the picture you wanted before pressing the button to Tweet the image.

I took the set up to Hack Horsham as well but I didn't bring my own lighting and the lighting in the room was perfect for a Jam but didn't give enough contrast for chromaCam to work.  So, it didn't make the final cut and I stuck to Button Flash.

You can see the setup in the mannequin challenge video tweet. 

Again, this was a great learning experience for me.  Once the computing power of the Raspberry Pi allows it, or access to the GPU is supported, it will be amazing to do the green screen transparency in real time.
I expect as new devices come out I'll be revisiting this project to see if the performance has improved.