Painting with drones

December 27th, 2016

The purpose of this project was to come up with an interactive demonstration for the Pygmalion Festival 2016 at UIUC. The end result was a demo where visitors were handed an Android device on which they could draw any continuous path. The x,y-coordinates were then uploaded to the cloud, and a trajectory based on Bézier curves was generated using a Python script. Finally, ROS was used to control a small drone. Camera software was then used to highlight the brightest light in the scene, in this case an LED on the drone. This resulted in the path being visualised in 3D space.

An overview of the project can be seen in the figure below. The Android application is used as a simple user interface. The path drawn is then uploaded to Dropbox and a trajectory is generated using a Python script.

Project overview


Finally, the drone flies the trajectory. A short video of the project can be seen below:

Android application
The Android application has a very simple layout with an empty canvas and a transparent button at the bottom. It works by detecting when the user moves their finger along the screen and then drawing a quadratic Bézier curve halfway between the two points, which gives the feeling that the user is actually drawing on the screen. A new point is registered whenever the finger is more than a given number of pixels away from the last point; in this case a value of four pixels was used. Once the user is done drawing, they simply press the button at the bottom of the screen and the path is up-sampled by a factor of 10. The x,y-coordinates are then normalized to the range [0,1] while keeping the same scaling on both axes. This ensures that the demo is independent of the aspect ratio and the DPI of the device. The x,y-coordinates are then uploaded as a CSV-file to a linked Dropbox account.
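The app itself is written for Android, but the up-sampling and normalization steps translate directly. The sketch below shows the idea in Python with NumPy; the function and parameter names are purely illustrative, and linear interpolation is assumed for the up-sampling.

import numpy as np

def upsample_and_normalize(x, y, factor=10):
    """Up-sample the drawn path by `factor` and normalize to [0, 1]
    while preserving the aspect ratio of the drawing."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    s = np.linspace(0.0, 1.0, len(x))
    s_up = np.linspace(0.0, 1.0, len(x) * factor)
    x_up, y_up = np.interp(s_up, s, x), np.interp(s_up, s, y)

    # Shift to the origin and divide both axes by the same scale, so the
    # result is independent of resolution and DPI but keeps its proportions.
    x_up -= x_up.min()
    y_up -= y_up.min()
    scale = max(x_up.max(), y_up.max()) or 1.0
    return x_up / scale, y_up / scale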

Python code
A Python script is used to generate the trajectory from the path produced by the Android application. First 0.5 is subtracted from all x-coordinates, so the path flown will be centred in the arena. Both coordinates are then scaled so the path stays within [0,3] m, which turned out to be a reliable size for the current Vicon setup in our arena.
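As a small sketch (assuming the scale factor is simply the 3 m arena size mentioned above), the centering and scaling step might look like this:

import numpy as np

def center_and_scale(x, y, arena_size=3.0):
    # x and y come from the app normalized to [0, 1]; shift x so the
    # path is centred around x = 0, then scale both axes to the arena.
    x = (np.asarray(x) - 0.5) * arena_size
    y = np.asarray(y) * arena_size
    return x, y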

Next step is to generate a Bézier curve from the given x,y-coordinates. This is done in order to make sure that the generated trajectory is smooth.

The Bernstein basis polynomials of degree n are given by:

b_{k,n}(t) = \binom{n}{k} \left(1-t\right)^{n-k} t^k, \quad t \in [0, 1]

Where \binom{n}{k} is the binomial coefficient.

The Bézier curve is then given by:

\mathbf{B}(n, t) = \sum_{k=0}^n b_{k,n}(t) \mathbf{P}_k

Where \mathbf{P}_k are the control points for the Bézier curve. In this case it will be the x,y-coordinates of the path.

An alternative representation of the above equation is to store the coefficients of the Bernstein basis polynomials evaluated in t along each row of a matrix. The final Bézier curve can then be found by simply performing the dot product between this matrix and the control point vector \mathbf{P}.
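A minimal sketch of this matrix form, using NumPy and SciPy's binomial coefficient (the function name and the number of evaluation points are illustrative):

import numpy as np
from scipy.special import comb

def bezier_curve(points, num=1000):
    """Evaluate the Bezier curve defined by the control points
    `points` (an array of shape (n+1, 2)) at `num` values of t."""
    points = np.asarray(points, dtype=float)
    n = len(points) - 1
    t = np.linspace(0.0, 1.0, num)
    k = np.arange(n + 1)
    # Each row of M holds the Bernstein basis coefficients b_{k,n}(t_i).
    M = comb(n, k) * (1.0 - t[:, None]) ** (n - k) * t[:, None] ** k
    return M @ points  # dot product with the control point vector P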

However, if the number of control points is large, the first equation might cause numerical issues. In practice we started to experience floating-point underflow in the Bernstein basis polynomial coefficients when the number of control points was larger than 500. This was solved by splitting the x,y-coordinates into chunks of at most 500 points and calculating the Bézier curve for each of these chunks. The final Bézier curve is then found by simply appending the curves to each other.
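Reusing the bezier_curve sketch above, the chunking could look like the following; how the original script treats the chunk boundaries is not shown, so the per-chunk curves are simply appended, as described:

def bezier_curve_chunked(points, chunk=500, num_per_chunk=1000):
    """Evaluate a Bezier curve per chunk of at most `chunk` control
    points and append the resulting curves to each other."""
    points = np.asarray(points, dtype=float)
    pieces = [bezier_curve(points[i:i + chunk], num_per_chunk)
              for i in range(0, len(points), chunk)]
    return np.vstack(pieces)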

The timestamps for the trajectory are calculated based on the simple assumption that we want the drone to fly at a constant velocity. The Euclidean distance between consecutive x,y-coordinates in the path is first computed, and the cumulative sum of these distances is then taken. The timestamps are obtained by dividing the cumulative distance by the desired velocity:

\mathbf{t} = \frac{\operatorname{cumsum}\left(\operatorname{dist}\left(\mathbf{x},\mathbf{y}\right)\right)}{v_{const}}

Where \operatorname{cumsum} is the cumulative sum operator, \operatorname{dist} is the Euclidean distance operator, and v_{const} is the desired velocity. In this case the velocity was set to v_{const} = 0.1 \, m/s. Note that treating the path as straight-line segments between consecutive points is only a good approximation when the path has many samples, which is why the path was up-sampled by a factor of 10 in the Android application.
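In code the timestamp calculation is only a few lines; a sketch with NumPy (illustrative names):

import numpy as np

def timestamps(x, y, v_const=0.1):
    # Euclidean distance between consecutive samples of the path.
    dist = np.hypot(np.diff(x), np.diff(y))
    # Cumulative distance divided by the constant velocity; the first
    # waypoint is reached at t = 0.
    return np.concatenate(([0.0], np.cumsum(dist))) / v_const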

Finally, the t,x,y-coordinates are written to a CSV-file. An example of a generated Bézier curve given a path from the Android application can be seen in the figure below:

Path drawn on the Android application

The blue dotted line is the original path and the red line represents the Bézier curve

ROS
ROS is used to control a Crazyflie 2.0 running a custom firmware that implements position and altitude hold based on Vicon data published by a ROS driver. The ROS driver uses a standard discrete Kalman filter with a single-integrator model to low-pass filter the incoming raw Vicon data. A Python script then implements a ROS node which sends the waypoints according to the timestamps of the trajectory generated previously.
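The driver's filter itself is not shown in this post; as a rough sketch, a discrete Kalman filter with a single-integrator (random-walk) model for one position axis could look like the following, where the noise variances q and r are made-up tuning values:

class ScalarKalman:
    def __init__(self, q=1e-3, r=1e-2, x0=0.0, p0=1.0):
        self.q, self.r = q, r    # process and measurement noise variances
        self.x, self.p = x0, p0  # state estimate and its variance

    def update(self, z):
        # Predict: the single-integrator model keeps the previous state
        # and only grows the uncertainty.
        self.p += self.q
        # Correct with the raw Vicon measurement z.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x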

When the node is first started it reads in the t,x,y-coordinates for the trajectory. However, the y-coordinate is in fact used as the z-coordinate, so the drone flies in the x,z-plane instead.

The drone then takes off and flies to the initial position, where it stays for a few seconds, turns on the LED, and starts publishing x,y,z-coordinates. When the trajectory has finished it turns off the LED and lands where it started.

Note that the ROS Python script operates in open loop: it does not monitor the actual position of the drone, but simply relies on the drone being able to follow the trajectory.
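A stripped-down sketch of such a node is shown below. The topic name, the use of geometry_msgs/PoseStamped, and the CSV layout (t, x, y per row with no header) are assumptions, and take-off, landing and the LED handling are omitted:

#!/usr/bin/env python
import csv
import rospy
from geometry_msgs.msg import PoseStamped

def fly_trajectory(csv_path):
    rospy.init_node('trajectory_publisher')
    pub = rospy.Publisher('/crazyflie/goal', PoseStamped, queue_size=1)

    with open(csv_path) as f:
        rows = [(float(t), float(x), float(y)) for t, x, y in csv.reader(f)]

    start = rospy.Time.now()
    for t, x, y in rows:
        # Wait until this waypoint's timestamp, then publish it open-loop.
        rospy.sleep(max(0.0, t - (rospy.Time.now() - start).to_sec()))
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.pose.position.x = x
        msg.pose.position.z = y  # the drawn y becomes z: fly in the x,z-plane
        pub.publish(msg)

if __name__ == '__main__':
    try:
        fly_trajectory('trajectory.csv')
    except rospy.ROSInterruptException:
        pass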

The trajectory flown by the drone is captured using special camera software, which captures the brightest object in the scene over time, thus visualising the trajectory flown. The figure below shows an example of the drone flying along a trajectory forming a bush, a house, and a tree. Note that this example was generated before auto-landing and control of the LED were implemented, so the take-off and landing sequences are shown as well.

Example trajectory of a bush, house, and a tree
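The exact camera software used is not specified here, but the long-exposure effect shown above can be approximated by keeping the per-pixel maximum over all frames of a recording, for example with OpenCV:

import cv2
import numpy as np

def light_painting(video_path, out_path='trajectory.png'):
    # Accumulate the per-pixel maximum over all frames, so the brightest
    # object (the LED on the drone) traces out the flown trajectory.
    cap = cv2.VideoCapture(video_path)
    accum = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        accum = frame if accum is None else np.maximum(accum, frame)
    cap.release()
    cv2.imwrite(out_path, accum)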

Future improvements
Some future improvements include allowing the drone to fly a non-continuous trajectory, since it could simply turn off the LED between segments. It would also be nice to monitor the drone's battery level in real time. Finally, the timestamps could be generated with a more sophisticated method than constant velocity; for instance, a cost function could be used that limits both the velocity and the acceleration.
