My smart home, more server stuff, and plans for IoT
2018-02-20 | IoT, Python, Arduino, HTML, Web
This post is a work in progress.
What I have now
I have a bunch of random gear lying around that I plan on connecting together into a DIY IoT setup. While I'm still looking for a flat, this project is pretty much on hold as far as installing permanent fixtures goes, but I can still experiment in my test environment and get a basic setup going.
Currently the components I have are:
- 3 Raspberry Pis
- millions of ESP8266 NodeMCU kits (these things are awesome)
- two old security cameras
- an Amazon Echo Dot
- a few temperature sensors
- motion sensors
- a Bluetooth speaker
- 5 metres of RGB LED strip lights
- Arduinos everywhere
- 433 MHz power outlets!
What am I going to do with all of this, I hear you ask?
Here's what I plan on doing right away:

All of the sensors feed into a central hub that controls everything: Home Assistant, and it's freaking awesome. It's open source and written in Python, so it's easy to configure and extend. My current home page looks like this (it's a bit basic, of course):
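Configuration is plain YAML. A minimal sketch of the kind of thing `configuration.yaml` holds, in the classic MQTT-platform style - the topic and entity names here are made up for illustration:

```yaml
# Hypothetical example: one temperature sensor and one switchable outlet,
# both talking over MQTT. Topic and entity names are invented.
sensor:
  - platform: mqtt
    name: "Living Room Temperature"
    state_topic: "home/livingroom/temperature"
    unit_of_measurement: "°C"

switch:
  - platform: mqtt
    name: "Lamp Outlet"
    command_topic: "home/livingroom/lamp/set"
    payload_on: "ON"
    payload_off: "OFF"
```

Each ESP8266 node just publishes to (or subscribes on) its topic, and the entities show up on the home page automatically.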
todo put a photo here 6/3/18
I recently got an Amazon Echo from Kordia as a leaving gift. I plan on using it to add voice control to Home Assistant - everything that can be controlled through the web interface should be voice controllable too. Alexa - the Echo's voice assistant - can be extended through skills, which can be developed for free. There are already a bunch in the Amazon store.
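A custom skill is basically a function that receives Alexa's request JSON and returns a response envelope. A minimal Lambda-style sketch - the intent name `TurnOnLightsIntent` is hypothetical, and the real handler would call out to Home Assistant:

```python
# Minimal sketch of an Alexa custom-skill handler (Lambda-style).
# The intent name "TurnOnLightsIntent" is made up for illustration.

def build_response(speech_text, end_session=True):
    """Wrap plain text in the Alexa JSON response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    """Dispatch on the incoming request type and intent name."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        return build_response("Home automation skill ready.", end_session=False)
    if request["type"] == "IntentRequest":
        if request["intent"]["name"] == "TurnOnLightsIntent":
            # Here the skill would hit the Home Assistant API.
            return build_response("Turning on the lights.")
    return build_response("Sorry, I didn't catch that.")
```

The speech text and session flag are all Alexa needs to reply and close (or hold open) the conversation.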
The Server
I recently got myself a new rackmount server - a Dell PowerEdge R170. It has 64 GB of RAM, 16 cores, five-odd hard drives, and all the redundant-power-supply goodness. Hopefully I'll be getting a managed switch as well to handle most of the networking in the flat. The server runs Proxmox, but I'm thinking about switching to VMware to be closer to the industry standard.
Some of the things the server does currently and will do in the future:
- Plex - video streaming to every computer in the house (I have about 2 TB of movies and TV shows currently)
- Pi-hole - network-wide ad blocking (even on phones)
- CouchPotato & Sonarr - automatic downloading of TV and movies
- Deluge - manual centralised downloads
- Wiki (Confluence) - centralised wiki notes for uni, the flat, and homelab documentation
- Nextcloud - self-hosted file server, like Dropbox
- VPN - access any of our computers at home from anywhere in the world
- Jenkins - CI build server
- Web server - hosting my website and anyone else's
- Game servers - Arma, TF2, etc.
- Calibre - self-hosted ebook server
- GitLab - Git repository hosting (like GitHub)
- pfSense - router
- Home Assistant - IoT home automation hub
Here's what the flat lab will look like:

We made a Radio Controlled, Gimbal Stabilised Wirecam
2016-11-18 | Robotics, Arduino
For our final-year high school robotics project, we were approached by the art students to construct a device that could “enhance events photography and videography through aerial photos and videos”. They wanted a camera that could ‘capture intense moments of action and school chanting from aerial angles that would normally be inaccessible’. After lengthy discussions (read: arguments), we decided that, due to the large number of specifications we had to follow for it to be assessable for NCEA standards, a quadcopter or plane would be too hard to construct in our limited timeframe. We also sketched out lots of different ideas, including a really long pole, a remote-controlled helium balloon, and a parachute, but found they didn’t fit enough of our specifications to be feasible. The main document for this can be found here.
We eventually decided on a single axis wirecam, inspired by the likes of Varavon’s wirecam. It had easy installation requirements (as opposed to a two axis ‘spidercam’) and was much safer than a drone or plane, which made our principal happy. I’ll go over how it works in a bit.
The whole year consisted of an iterative development cycle that worked well for me and my partner - I mainly focused on developing the software and electronics of the robot, while David worked predominantly on the physical construction. We did work together on most things, however.
(my end of year exam modelling report is here, I’m very proud of it. It details the whole process of modelling, designing, and working around all the competing factors that could go into a technical project like this.)
The Interesting Stuff
One of our final prototypes:

Construction
Thankfully, we had access to a laser cutter at our school, which David and I abused. Because the structure of the robot needed to be strong, we felt that cutting each side out of a single piece of wood, instead of cutting and gluing/screwing pieces together, would be the best solution. We cut out a large part of the frame to save weight, but kept it in a ‘truss’ style so it remained strong. Most of the construction details are outlined in my modelling report above, so I won’t go into depth here - I suggest giving it a read if you are interested.

With a cheap but high-rpm motor from Jaycar, we used a 4:1 gear reduction to take the motor from 9700 rpm down to roughly 2400 rpm, giving us 4x the torque (which we absolutely needed to get the wirecam moving). 2700 rpm on a 2.5 cm drive wheel translates to 5 m/s of linear velocity, so 2400 rpm gets us pretty close to this spec.

For the gimbal, we used an EVVGC gimbal board and a GoPro Hero 2. It only needed to be two-axis: because the robot hangs from a rope, one axis is already stabilised, and there was no need to add unnecessary weight. We used a continuous-rotation servo attached to a gear (with a ratio of about 1:10) so that spinning the servo at maximum speed moves the main gear extremely slowly and precisely. The GoPro is attached to the main gear (see the bottom of the photo above).
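The drivetrain numbers are easy to sanity-check. A quick sketch in Python - treating the 2.5 cm figure as the drive wheel's diameter is my assumption here, and the modelling report above has the real geometry:

```python
import math

def gear_down(motor_rpm, ratio):
    """Output rpm after a reduction gearbox; torque scales up by the same ratio."""
    return motor_rpm / ratio

def rim_speed(rpm, wheel_diameter_m):
    """Linear speed in m/s at the rim of a drive wheel turning at `rpm`."""
    return rpm / 60.0 * math.pi * wheel_diameter_m

output_rpm = gear_down(9700, 4)       # 2425 rpm after the 4:1 reduction
speed = rim_speed(output_rpm, 0.025)  # rim speed assuming a 2.5 cm diameter wheel
```

The same functions can be run in reverse to work out what rpm a given target speed demands for any wheel size, which is how these spec trade-offs get juggled.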
Electronics
For the electronics, the vast majority of discussion is in this document. I suggest reading that! Originally, I wanted to use a pair of low powered nRF24L01+ 2.4GHz modules for the wireless communication, but we found they caused more trouble than they were worth. We eventually decided on a pre-made radio controller built for RC cars, and a single arduino nano on the robot.
The controller had three channels: the trigger, the wheel, and a push button. The trigger controlled the speed of the robot along the rope, while the wheel controlled the direction of the gimbal. The button switched the gimbal between moving up/down and left/right. The Arduino thankfully played nicely with the radio receiver and read a value between 0 and 1023 for each channel, which we mapped to appropriate outputs for the motor controller and the servos.
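That mapping is just a linear re-scale, the same semantics as Arduino's `map()`. A minimal Python sketch of the idea - the 0-1023 input range and 0-180 servo range are assumptions for illustration:

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    """Integer linear re-mapping, mirroring Arduino's map() function."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

# e.g. a raw channel reading re-scaled to a 0-180 degree servo angle
angle = arduino_map(512, 0, 1023, 0, 180)  # mid-stick lands near mid-travel
```

The same one-liner covers all three channels; only the output ranges differ between the motor controller and the servos.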
The source code for the project is here; it ended up being quite simple in the end.

A Python/Pygame Robot Coloured Ball Sorter
2015-09-10 | Robotics, Python, Arduino, Pygame
Over terms 2-3 of 2015, I worked on a ball sorter that could quickly send multi-coloured ping-pong balls to different outputs depending on their colour. It took me and my partner roughly 40+ hours of development, and I submitted it for a few NCEA standards as part of my school technology and robotics course. The project is mostly finished - I had planned on making a few improvements to the colour recognition algorithms and remaking parts of the structure, but then exams hit, so I couldn't complete them. We entered this project in Catalyst's Arduino Academy competition, winning first place and a $50 gift card.
The original plan was to use an analog colour sensor paired with an Arduino, but it turned out to be nearly impossible to program (and also incredibly boring). After much consideration and looking into other methods, I settled on using a dedicated Raspberry Pi with a webcam alongside the Arduino, communicating over USB serial.
How it works now:
The Raspberry Pi (2, Model B) is connected to a PlayStation Eye webcam. I use the Python library Pygame to analyse the middle 50x50 pixels of the camera image and take an average of the colour (as shown in the picture to the right, the average colour is displayed in the top corner). From there, I can retrieve the hue, saturation, and value (HSV) of the colour - if the saturation is over a certain amount, we know there is a ball in front of the camera. The script then checks whether the hue is within the range of values for each of the four colours. It takes the modal average over 50 ‘readings’ and sends the most common colour to the Arduino over USB serial.
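A stripped-down sketch of that classification step in plain Python - using the stdlib's `colorsys` instead of Pygame, and with made-up hue windows and saturation threshold (the real values were tuned against the camera):

```python
import colorsys
from collections import Counter

# Hypothetical hue windows in degrees for the four ball colours.
HUE_RANGES = {"red": (330, 30), "yellow": (40, 80), "green": (90, 160), "blue": (200, 260)}
SATURATION_THRESHOLD = 0.4  # below this we assume no ball is in front of the camera

def classify(r, g, b):
    """Return the ball colour for an averaged RGB pixel, or None for no ball."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if s < SATURATION_THRESHOLD:
        return None
    hue = h * 360
    for name, (lo, hi) in HUE_RANGES.items():
        if lo <= hi:
            if lo <= hue <= hi:
                return name
        elif hue >= lo or hue <= hi:  # the red window wraps around 0 degrees
            return name
    return None

def modal_colour(readings):
    """Most common non-None classification over a batch of readings."""
    counts = Counter(c for c in readings if c is not None)
    return counts.most_common(1)[0][0] if counts else None
```

Averaging 50 readings before committing to a colour smooths over the odd misread frame as the ball rolls into view.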

The Arduino waits for serial input and then moves the funnel servo to a position depending on which colour is received. It then moves the gate servo to an open position and back, letting a single ball through. While doing so, the Arduino updates the count of balls that have passed through the system and redraws the LCD screen. The Arduino also uses the Arduino Academy-provided piezo buzzer to play a startup sound. It was supposed to play a sound after each colour was sensed too (with pitch varying based on hue), but I ran out of time due to mock exams.
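The Arduino side boils down to a lookup plus some bookkeeping. The same logic re-sketched in Python - the servo angles and colour names are invented for illustration:

```python
# Hypothetical funnel servo angles (degrees) per colour.
FUNNEL_ANGLES = {"red": 0, "yellow": 60, "green": 120, "blue": 180}

class BallSorter:
    """Mirrors the Arduino's dispatch: position the funnel, count the ball."""

    def __init__(self):
        self.counts = {colour: 0 for colour in FUNNEL_ANGLES}

    def handle(self, colour):
        """Return the funnel angle for `colour` and update the tally."""
        angle = FUNNEL_ANGLES[colour]
        # On the real hardware: move the funnel servo here, cycle the gate
        # servo open and closed, then redraw the LCD with the new counts.
        self.counts[colour] += 1
        return angle
```

Keeping this as one small dispatch routine is what makes the Arduino usable standalone for debugging.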
I tried to keep everything modular - the RPi handles the colour sensing, while the Arduino does everything else. The Arduino will work without being attached to the Raspberry Pi (for instance, connected to my laptop for debugging), although the Python code on the RPi will throw an error if no Arduino is connected. The functions in the code are also modular and should work by themselves, although I haven't tested this in depth.
I only have pictures from an earlier point in development, as I forgot to take photos later on. In the first picture, the TFT LCD screen from the Arduino Academy is attached to the Uno; it's displaying the number of balls that have gone through the system (I had not yet hooked up the servos). The other two pictures are of an earlier prototype. I have since remade most of the structure using our school's laser cutter, and swapped the original Arduino Mega for the Uno.

I have since improved the code so that the machine sorts balls faster.