
Raspberry Pi playing ZomBuster

Some time ago I took a course on image analysis covering image segmentation, moments, colour detection, object recognition etc. As part of the course everyone had to make a project that showcased the theory we had been learning. We were allowed to use OpenCV as the backbone for accessing the camera etc., but not allowed to use any of the built-in filters. Instead the goal was to implement the different algorithms ourselves.

One day one of my friends was playing the smartphone game ZomBuster. A screenshot of the gameplay can be seen below:

ZomBuster gameplay

The goal of the game is to tap the lane with a zombie in it in order to kill it. As the zombies are green and the humans are blue, I thought it would be a fun challenge to build a robot for the course that could play the game autonomously.

This also allowed me to use the 3D printer I had just bought at the time. For that reason I created a 3D model with all the needed components:

3D model

The black part is the phone, which is held by a printed flexible transparent dock. The main blue assembly is screwed onto the dock and holds two solenoids just above the phone along with a Raspberry Pi camera module. This allows a Raspberry Pi 2 to take pictures of the game using the camera module. By analysing the image it can then determine the location of the zombies. Based on the location it will then activate either the left or the right solenoid in order to kill the bottom zombie.

In order to drive the two solenoids, two IRF3205 N-channel MOSFETs were used. A pair of BC547 NPN transistors is used to level-shift the voltage from 3.3V to 12V. The simple hardware schematic can be seen below:

Hardware schematic

Some pictures of the final hardware can be seen below:

Assembled hardware (front and back)

In short, the code works by first applying an HSV threshold to the original image; this isolates any green objects, as shown below:

Original image and image after HSV thresholding
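
To illustrate the idea, here is a minimal sketch of the thresholding step using OpenCV's built-in cvtColor and inRange (the actual project implemented the filters manually, as required by the course); the green hue range and the file names are assumptions:

```cpp
#include <opencv2/opencv.hpp>

int main() {
    // Frame grabbed from the Raspberry Pi camera module (file name is a placeholder)
    cv::Mat frame = cv::imread("frame.png");
    if (frame.empty())
        return 1;

    // Convert from BGR to HSV so the green zombies can be isolated by hue
    cv::Mat hsv;
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

    // Keep only pixels whose hue falls within the green band (the exact range is an assumption)
    cv::Mat mask;
    cv::inRange(hsv, cv::Scalar(40, 80, 80), cv::Scalar(80, 255, 255), mask);

    cv::imwrite("mask.png", mask); // White where green objects (zombies) were found
    return 0;
}
```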

A fractile (median) filter is then applied to remove any salt and pepper noise, followed by a morphological filter to close any potential holes in the zombies:

Image after fractile and morphological filter
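
A sketch of the equivalent operations using OpenCV's built-in medianBlur and morphologyEx is shown below (again, the project implemented these filters by hand; the kernel sizes are assumptions):

```cpp
#include <opencv2/opencv.hpp>

// Clean up the binary mask produced by the HSV thresholding step: a 3x3 median
// (fractile) filter removes salt-and-pepper noise, and a morphological closing
// fills small holes inside the detected zombies. Kernel sizes are assumptions.
cv::Mat cleanMask(const cv::Mat &mask) {
    cv::Mat filtered;
    cv::medianBlur(mask, filtered, 3);

    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(5, 5));
    cv::Mat closed;
    cv::morphologyEx(filtered, closed, cv::MORPH_CLOSE, kernel);
    return closed;
}
```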

The first of Hu's invariant moments is then calculated for each object and compared to a predefined threshold. Furthermore the Euler number of the object is calculated. If the invariant moment is within the specified range and the Euler number is 1, a zombie has been successfully detected. The center of mass and the contour of the zombie are then superimposed onto the original image.

Found zombie locations superimposed onto the original image
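
For illustration, a rough sketch of how such a check could look with OpenCV's findContours, moments and HuMoments is shown below; the accepted range for the first Hu moment is an assumption, not the value used in the project:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// For each object in the binary mask, the first Hu invariant moment and the
// Euler number (1 minus the number of holes) are computed. An object is
// accepted as a zombie if the moment lies inside an assumed range and the
// Euler number is exactly 1, i.e. the object contains no holes.
std::vector<cv::Point> findZombieCenters(const cv::Mat &mask) {
    std::vector<std::vector<cv::Point>> contours;
    std::vector<cv::Vec4i> hierarchy;
    cv::Mat work = mask.clone(); // findContours may modify its input in older OpenCV versions
    cv::findContours(work, contours, hierarchy, cv::RETR_CCOMP, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point> centers;
    for (size_t i = 0; i < contours.size(); i++) {
        if (hierarchy[i][3] != -1)
            continue; // Skip hole contours - only consider outer object contours

        // Count the holes of this object to get its Euler number
        int holes = 0;
        for (int child = hierarchy[i][2]; child != -1; child = hierarchy[child][0])
            holes++;
        int euler = 1 - holes;

        cv::Moments m = cv::moments(contours[i]);
        double hu[7];
        cv::HuMoments(m, hu);

        // The accepted range for the first Hu moment is an assumption for illustration
        if (euler == 1 && hu[0] > 0.16 && hu[0] < 0.20 && m.m00 > 0) {
            // Center of mass from the raw moments
            centers.push_back(cv::Point(int(m.m10 / m.m00), int(m.m01 / m.m00)));
        }
    }
    return centers;
}
```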

Based on the location of the center of mass the algorithm then determines whether the zombie is in the left or the right lane. The result is written into a ring buffer. A state machine then reads a value from the buffer and activates the corresponding solenoid. After the solenoid is activated it waits for it to go all the way up again. This is repeated until the buffer is empty. The result is a series of bursts, as the algorithm kills all zombies that are detected and then waits for all of them to disappear. This limits the robot, as the current algorithm spends a relatively long time waiting for the kill animation of each zombie to disappear. Furthermore the solenoids are relatively slow, which sets a physical limit on how fast a zombie can be killed.
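
Below is a simplified sketch of how the detection queue and the consumer could look; the ring buffer size, the solenoid driver and the timing constants are hypothetical placeholders for the actual GPIO code:

```cpp
#include <chrono>
#include <cstddef>
#include <thread>

// Lane of a detected zombie as determined by the image analysis
enum class Lane { LEFT, RIGHT };

// Placeholder solenoid driver - on the real robot this would toggle the GPIO pin
// driving the corresponding MOSFET gate (the pin mapping is not shown here)
static void fireSolenoid(Lane lane) {
    (void)lane;
    std::this_thread::sleep_for(std::chrono::milliseconds(50)); // Hold the solenoid down briefly
}

// Simple fixed-size ring buffer holding the pending detections
class LaneBuffer {
public:
    bool push(Lane lane) {
        size_t next = (head + 1) % SIZE;
        if (next == tail)
            return false; // Buffer full - drop the detection
        buffer[head] = lane;
        head = next;
        return true;
    }
    bool pop(Lane &lane) {
        if (head == tail)
            return false; // Buffer empty
        lane = buffer[tail];
        tail = (tail + 1) % SIZE;
        return true;
    }
private:
    static const size_t SIZE = 16;
    Lane buffer[SIZE];
    size_t head = 0, tail = 0;
};

// Consumer loop: fire the matching solenoid for each queued zombie and wait for
// the solenoid to travel all the way back up before handling the next one
void processDetections(LaneBuffer &buf) {
    Lane lane;
    while (buf.pop(lane)) {
        fireSolenoid(lane);
        std::this_thread::sleep_for(std::chrono::milliseconds(100)); // Assumed retract time
    }
}
```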

The end result of the project can be seen below:

The loud sound is NOT from the tip of the solenoid hitting the phone, but from the end of the solenoid hitting its lock ring.

A much more detailed explanation of the algorithm and the theory behind the different filters, moments etc. can be found in the report: KristianSlothLauszus_s123808_2015.pdf.

The final code can be found on GitHub: https://github.com/Lauszus/ImageAnalysisWithMicrocomputer30330/tree/master/ZomBuster/src. For the record, the final high score I was able to obtain was 91.

Some information on how to compile the code in Eclipse CDT can be found at the GitHub wiki. A Makefile that was used to compile the code on the Raspberry Pi is available there as well. Notes and a script for compiling OpenCV on the Raspberry Pi can be found at this gist.
