
Handheld XV-11 LIDAR with STM32F429 and MATLAB

The XV-11 LIDAR unit

To make our robots even more autonomous, we would like to explore laser range finding using LIDAR technology. Unfortunately for users who want to try out LIDAR, it has traditionally been a very expensive technology to get your hands on.

Over the years, though, robotic vacuum cleaners have evolved a lot, both in their algorithms getting better and in their use of more advanced sensors. Recently, the Neato XV-11 All Floor Robotic Vacuum System shipped with a short-range (0.2 m to 6 m) LIDAR with 1-degree angular precision and a resolution of a couple of centimeters. As this vacuum cleaner only costs around $400, it is a bargain way to get hold of a LIDAR, if you are willing to disassemble the robot and use just the LIDAR.

Luckily for us, a bounty was put up for “hacking” the XV-11 LIDAR, and a hacking community formed around it: http://xv11hacking.wikispaces.com/LIDAR+Sensor

When spinning, the XV-11 LIDAR unit spits out quite a lot of data on a UART port at 115200 baud. A full rotation, meaning 360 distance measurements, consists of 1980 bytes and is sent out at a refresh rate of 5 Hz. So quite an amount of data has to be processed in a relatively short amount of time.
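The byte stream follows the format documented on the xv11hacking wiki linked above: 90 packets of 22 bytes per rotation, each carrying the motor speed, four consecutive 1-degree readings, and a checksum. As a minimal, host-testable sketch of the decoding (field layout per the wiki; the struct and function names are our own, not taken from the actual firmware):

```c
#include <stdint.h>

/* One 22-byte XV-11 packet, per the xv11hacking wiki:
   [0]    = 0xFA start byte
   [1]    = index byte 0xA0..0xF9 (packet 0..89)
   [2..3] = motor speed, little-endian, rpm * 64
   [4..19]= 4 readings of 4 bytes each
   [20..21]= checksum over the first 20 bytes */

typedef struct {
    int angle_deg;    /* 0..359 */
    int distance_mm;  /* -1 if the "invalid" flag (bit 15) is set */
    int strength;     /* reflected signal strength */
} xv11_reading_t;

static uint16_t xv11_checksum(const uint8_t p[20])
{
    uint32_t chk32 = 0;
    for (int i = 0; i < 10; i++)  /* 10 little-endian 16-bit words */
        chk32 = (chk32 << 1) + (uint32_t)(p[2*i] | (p[2*i + 1] << 8));
    uint32_t sum = (chk32 & 0x7FFF) + (chk32 >> 15); /* wrap carries */
    return (uint16_t)(sum & 0x7FFF);
}

/* Returns the number of readings parsed (4), or -1 on a bad packet. */
static int xv11_parse(const uint8_t pkt[22], xv11_reading_t out[4], float *rpm)
{
    if (pkt[0] != 0xFA || pkt[1] < 0xA0 || pkt[1] > 0xF9)
        return -1;
    if (xv11_checksum(pkt) != (uint16_t)(pkt[20] | (pkt[21] << 8)))
        return -1;

    int base_angle = (pkt[1] - 0xA0) * 4;      /* 4 readings per packet */
    *rpm = (float)(pkt[2] | (pkt[3] << 8)) / 64.0f;

    for (int i = 0; i < 4; i++) {
        const uint8_t *d = &pkt[4 + 4*i];
        out[i].angle_deg   = base_angle + i;
        out[i].distance_mm = (d[1] & 0x80) ? -1               /* invalid */
                           : (((d[1] & 0x3F) << 8) | d[0]);   /* mm, 14 bits */
        out[i].strength    = d[2] | (d[3] << 8);
    }
    return 4;
}
```

Feeding the UART bytes through a decoder like this, 90 valid packets give one full 360-degree scan.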

The mechanical design of the XV-11 LIDAR

Compared to the more expensive LIDAR sensors on the market, you will certainly not get the same resolution and accuracy with the XV-11 LIDAR, but for hobby use and low-cost research this LIDAR will do just fine.


We purchased one of the XV-11 LIDAR units on eBay, sold as a “replacement unit” for the Neato vacuum cleaner. At only $100 it was quite a good deal.
We decided to connect the LIDAR to our STM32F429IDISCOVERY board for processing due to the heavy amount of transmitted data, and to use the board's built-in touch display to show the distance measurements.

Below you will find a video of the project in action.

The source code for the STM32, including the CoIDE project and of course the MATLAB script for display, can be found at our GitHub at: https://www.github.com/TKJElectronics/XV11Lidar_STM32F429

As many have been requesting an exact description of the connection layout, I will try to explain it here in a few simple lines, as the connection is very simple.
The XV-11 LIDAR unit has two cables: one with 2 wires, defined as the MOTOR CABLE, and one with 4 wires, defined as the DATA CABLE.

  • The RED wire in the MOTOR CABLE should be connected to 3.3V for free running (open loop). Otherwise this should be pulsed with 12V.
  • The BLACK wire in the MOTOR CABLE should be connected to ground.
  • The RED wire in the DATA CABLE should be connected to 5V or 3.3V; it is 5V in most of the units you can find on the market.
  • The BLACK wire in the DATA CABLE should be connected to ground.
  • The BROWN wire in the DATA CABLE should be connected to GPIOC.10 on the STM32F4 board, as this is the LIDAR RX being connected to the STM32F4 UART TX.
  • The ORANGE wire in the DATA CABLE should be connected to GPIOC.11 on the STM32F4 board, as this is the LIDAR TX being connected to the STM32F4 UART RX.
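With this wiring, the data side of the LIDAR only needs a plain 115200-baud 8N1 UART on PC10/PC11. Below is a rough sketch of the USART3 setup using the STM32F4 Standard Peripheral Library (as was common in CoIDE-era projects); the actual initialization in our repository may differ in details such as interrupt priorities, so treat this as an illustration rather than a copy of the project code:

```c
#include "stm32f4xx.h"

/* USART3 on PC10 (TX) / PC11 (RX), 115200 8N1, RX interrupt enabled. */
void lidar_uart_init(void)
{
    GPIO_InitTypeDef  gpio;
    USART_InitTypeDef uart;
    NVIC_InitTypeDef  nvic;

    RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_GPIOC, ENABLE);
    RCC_APB1PeriphClockCmd(RCC_APB1Periph_USART3, ENABLE);

    /* Route PC10/PC11 to the USART3 alternate function */
    GPIO_PinAFConfig(GPIOC, GPIO_PinSource10, GPIO_AF_USART3);
    GPIO_PinAFConfig(GPIOC, GPIO_PinSource11, GPIO_AF_USART3);

    gpio.GPIO_Pin   = GPIO_Pin_10 | GPIO_Pin_11;
    gpio.GPIO_Mode  = GPIO_Mode_AF;
    gpio.GPIO_OType = GPIO_OType_PP;
    gpio.GPIO_PuPd  = GPIO_PuPd_UP;
    gpio.GPIO_Speed = GPIO_Speed_50MHz;
    GPIO_Init(GPIOC, &gpio);

    uart.USART_BaudRate            = 115200;   /* XV-11 data rate */
    uart.USART_WordLength          = USART_WordLength_8b;
    uart.USART_StopBits            = USART_StopBits_1;
    uart.USART_Parity              = USART_Parity_No;
    uart.USART_HardwareFlowControl = USART_HardwareFlowControl_None;
    uart.USART_Mode                = USART_Mode_Rx | USART_Mode_Tx;
    USART_Init(USART3, &uart);

    /* Interrupt on every received byte so the 5 Hz stream is not lost */
    USART_ITConfig(USART3, USART_IT_RXNE, ENABLE);
    nvic.NVIC_IRQChannel                   = USART3_IRQn;
    nvic.NVIC_IRQChannelPreemptionPriority = 1;
    nvic.NVIC_IRQChannelSubPriority        = 0;
    nvic.NVIC_IRQChannelCmd                = ENABLE;
    NVIC_Init(&nvic);

    USART_Cmd(USART3, ENABLE);
}
```

Note that on the STM32F429IDISCOVERY board, PC10 is also internally routed to the LCD's R2 line (see the comment discussion below for details), so use of this pin is a trade-off on that particular board.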

I hope this clears up all the connection questions.

The next steps of this project are to implement and use the XV-11 sensor on either the Balanduino or an Omniwheel robot.
Further into the future, we plan to use the LIDAR together with an Optical Flow sensor on our Quadcopters to make them completely autonomous:
IMU + GPS + Optical Flow + LIDAR = Autonomous navigation in unknown environments

Categories: ARM, STM32
  1. Bob Bowie
    September 25th, 2014 at 14:26 | #1

    I have a remote controlled lawnmower that I would like to convert to a robotic mower. You can view my mower at:


    I am familiar with BASIC programming and have purchased a Basic Stamp.


    I am hoping that you might be able to guide me. The steps I envision are:

    Step #1-Learn Mode

    Capture/Record the Servo commands from the Futaba receiver to the Basic Stamp while sending these same commands to the 2 Astro-flight 208d Reversing Controllers which drive the 2 wheel chair motors.

    Step #2-Playback Mode

    Place the mower in the same start position and send commands recorded in Step #1 to the 2 Astro-flight 208d Reversing Controllers which drive the 2 wheel chair motors.

    I have been able to program the Basic Stamp to send commands to the 208d’s, but wheel slip presents a need to periodically determine location (x,y). Determining location will allow commands to be sent to the 208d’s to correct the position; this is where I need help! Is LIDAR the solution?

    Specs on the Astro-flight controllers (208d’s) can be viewed at:


    Thank you,
    Bob Bowie

  2. Pondersome
    October 3rd, 2014 at 18:47 | #2

    Bob, I’m just another visitor, so I don’t speak for TKJ. But I have this lidar sensor and have built outdoor robots. This particular lidar just doesn’t have the intensity to work well in full sunlight. Like the Kinect, it was built for indoor applications. It is also planar -- it doesn’t do 3D. For outdoor odometry, you should probably look into the kind of optical flow sensors people are using on multicopters -- though it won’t help with obstacle avoidance. Outdoor capable lidar remains too pricey for your application.

  3. October 7th, 2014 at 22:24 | #3

    @Bob Bowie
    Hi Bob and Pondersome.
    I am currently in touch with Bob to discuss which options would be available for his lawnmower. But I completely agree with you, Pondersome, that this specific inexpensive LIDAR unit won’t work well in outdoor environments, so I have also been recommending other solutions such as RTK GPS (expensive though), positional radio tracking, or wires along the edges of the lawn for wireless mapping.

    Regarding the optical flow sensor you mention, this unfortunately won’t be of much help, as we would like to determine the exact current position of the mower. But it could be used in combination with the GPS for a more exact result.

    Regards Thomas

  4. Pondersome
    October 21st, 2014 at 04:45 | #4

    The optical flow suggestion was just for dead reckoning as an improvement over wheel encoders -- it certainly wouldn’t lead to absolute positioning without a known starting position and strong correction.

    Thomas, do you have any electrical setup instructions for the XV11 code you posted? I’ve been able to compile and load the software onto the discovery board, but not get any display of lidar readings. Maybe that’s partly because of the bluetooth module you added and which I don’t have. It seems like you are using USART 3 to receive data from the lidar and forward it to the bluetooth module? I noticed in the video that you don’t have the brown receive wire from the lidar connected, but I can’t make out all the connections visually. So I’m assuming your program does not need to send data to the lidar unit. Do you think it should still work without the bluetooth module connected? If not, could you give me a hint how to modify it to bypass that? I’m new to the STM32 platform.

    I also noticed that the PC10 line which I think is what you defined for the bluetooth serial connection is internally connected to the R2 line of the LCD-RGB interface and was wondering if writing to that is somehow intentional? Any advice you can provide is helpful. Thanks.

  5. Pondersome
    October 21st, 2014 at 06:46 | #5

    Actually, no worries. I persevered, and whether it was disabling the serial writes or fixing up the voltage to the lidar sensor, the example now works! Thanks for sharing it.

  6. October 23rd, 2014 at 10:43 | #6

    So you got it working? What was the problem then?
    The UART transmission is not necessary if you just want to display the data on the touch display.

    Regards Thomas

  7. Chris Arena
    June 10th, 2015 at 21:52 | #7

    Great little project!
    How are you putting together the code for the Discovery board?
    I’ll download the files and see what’s in there …
    Chris Arena
    Portsmouth, Rhode Island, USA

  8. June 11th, 2015 at 12:56 | #8

    @Chris Arena
    What do you mean by putting together the code?
    The project is made within CoIDE where the code can be programmed and debugged with the onboard programmer on the Discovery board.
    Regards Thomas

  9. Blair Kingsland
    November 3rd, 2016 at 05:36 | #9

    Your program is great, thank you. I have the STM32F429 Board. The UART RX gives me a HardFault while in SyncUp(). Any debug suggestions? What Stack and Heap size do you use?

  10. November 6th, 2016 at 14:35 | #10

    Make sure that the UART RX data is received correctly and that interrupts are being generated every time a byte is received.
    Please keep in mind that this project was made as a quick prototype, which is why the UART receive buffer might overflow if the data is not processed quickly enough.

  11. Blair Kingsland
    February 27th, 2017 at 01:29 | #11

    What power supply are you using for the XV-11? I am trying to power it from another STM32 Discovery Board. But XV-11 keeps rebooting and not transmitting proper serial.
    To read the ASCII code correctly I have to shift it by 255 and divide by 2. Do I have a serial hardware or setup problem? Maybe insufficient current supply.

  12. March 2nd, 2017 at 09:38 | #12

    @Blair Kingsland
    We are just using a 1-cell LiPo battery, which powers both the STM32F4 DISCOVERY board and the motor in the XV-11 sensor.
    Please make sure that your power source is capable of providing the necessary current. If you power the motor directly from a regulated power supply, the voltage should be between 3 and 4 V, as the motor will otherwise spin too fast.

  13. Blair Kingsland
    March 8th, 2017 at 04:05 | #13

    USB 5V should supply sufficient current. Motor is powered at 2.96V. Maybe too slow.

  14. Charles
    March 24th, 2017 at 22:22 | #14

    Hi! I am wondering if anyone can help me out with the code and wiring to hook this up to an Arduino Uno! I’m working on a project of my own with a little robotic vehicle and would like to implement this! Thanks! You could email me at hcf5034 -- at- psu -dot- edu

  15. March 24th, 2017 at 23:21 | #15

    Getting the XV-11 to work with an Arduino Uno might be difficult due to the large amount of data transmitted at a 5 Hz rate.
    Quite a lot of RAM is needed to store a full 360-degree scan (at least 1980 bytes), after which you have to process all the data to extract the actual distance measurements. This is a time-consuming task. It is doable, but due to the amount of RAM needed I would recommend doing it on at least an Arduino Mega.

  16. gyuhyong
    May 8th, 2017 at 06:38 | #16

    Nice project. I am currently working on a similar project.
    The LIDAR sends the distance data out to the board,
    but how do you know which data goes into which position?
    Thank you

  17. May 8th, 2017 at 20:30 | #17

    You are correct that the LiDAR data only corresponds to measured distances relative to the robot.
    To map the environment you will have to investigate what is known as Simultaneous Localization and Mapping (SLAM).
    I can recommend taking a look at this series of video lectures, which goes through the theoretical understanding and practical implementation of a LiDAR-based SLAM algorithm in great detail.

    Good luck with your project.
