Teensy 3.5 Autonomous Robot

Just a quick update. A while ago I read on the forum about the release of a new library for multi-threading. I just ran a test of the library to multi-thread the sensors, since in my current implementation everything is sequential and reading the sensors takes some time. Created three threads: one for the 4 sonar sensors, a second for the IR sensors, and a third for the BNO055. Initial testing, along with some suggestions by @ftrias, gave me some confidence that it will work well for my rover. Plan on incorporating it into the code and probably expanding it to read the GPS data and the wheel encoders.
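Here is roughly what the thread setup looks like, assuming the TeensyThreads library; sonarThread(), irThread(), and imuThread() are just placeholder names for the sensor-reading routines, not the actual rover code:

Code:
// Minimal sketch of the three sensor threads, assuming TeensyThreads.
// The thread bodies are placeholders for the real sensor reads.
#include <TeensyThreads.h>

volatile int sonarCm[4];   // latest sonar readings, written by sonarThread()

void sonarThread() {
  while (1) {
    // read the 4 sonar sensors here and store the results in sonarCm[]
    threads.delay(50);     // sleep this thread so the others get time
  }
}

void irThread() {
  while (1) {
    // read the IR sensors here
    threads.delay(50);
  }
}

void imuThread() {
  while (1) {
    // read the BNO055 here
    threads.delay(20);
  }
}

void setup() {
  threads.addThread(sonarThread);
  threads.addThread(irThread);
  threads.addThread(imuThread);
}

void loop() {
  // the main loop just consumes the latest values from the threads
}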
 
UPDATE: Based on the discussion in the multithread thread (for me it's more like a tutorial), I made several modifications to the code for using the sonar, IR, and IMU in a threaded environment. I also finished incorporating the NeoGPS library for getting data from a Neo-M8N in a separate thread. I am debating now about incorporating waypoint navigation into the code base, but I have to finish the odometry function first, which allows you to steer by direction. There is one caution: I will be sharing data across threads, so that is another adventure. The one nice thing with the thread approach is that the data is more consistent and more responsive to changes.
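Since sharing data across threads is the tricky part, the plan is to guard the shared values with a mutex. A minimal sketch of the idea, assuming TeensyThreads' Threads::Mutex (the GpsFix struct and the function names are just placeholders, not the actual code):

Code:
// Sharing a GPS fix between a producer thread and loop(), assuming
// the TeensyThreads library. Names here are illustrative only.
#include <TeensyThreads.h>

struct GpsFix {
  float lat;
  float lon;
  bool  valid;
};

GpsFix sharedFix;           // written by the GPS thread, read by loop()
Threads::Mutex fixLock;     // guards sharedFix

void gpsThread() {
  while (1) {
    GpsFix newFix;
    // ... parse a fix from the Neo-M8N into newFix here ...
    fixLock.lock();
    sharedFix = newFix;     // copy under the lock so readers never see
    fixLock.unlock();       // a half-updated fix
    threads.delay(100);
  }
}

GpsFix getFix() {
  fixLock.lock();
  GpsFix copy = sharedFix;  // take a consistent snapshot
  fixLock.unlock();
  return copy;
}

void setup() {
  threads.addThread(gpsThread);
}

void loop() {
  GpsFix fix = getFix();
  // use fix.lat / fix.lon for waypoint navigation here
}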
 
Nice Rover!

As for RPI and Teensy Hat - and to not take over Jwatte's thread

Background: <probably ignore>
I keep meaning to get back to working with a Rover. I have an older Lynxmotion (now part of RobotShop) Rover from several years ago, plus one that Orion Robotics (no longer making them) did for a short time. A couple of years ago I tried out using an RPI2 on my Lynxmotion Rover...

Again I would like to get back to it. One of my primary goals for it will be to hopefully get a better understanding of ROS. I have done some stuff with ROS on a hexapod, but I believe a Rover would be more suitable for it... Currently not sure if I will build up my own setup or start from a ROS turtlebot setup, which may either be the new TurtleBot 3 by Robotis or a 2i from Trossen.

But instead, or in addition, I am also thinking of building up a Rover of my own. Not sure which one I will start from. The Orion one might be fun, but I was having issues with parts (like wheels) falling off...

<end background>

If I start building up a Rover to use with ROS, electronically my current setup would be:

An RPI-like host: might be an RPI3, but more likely an Odroid (either XU4 or C2) or an UP board (a small PC in the RPI form factor).

To control the motors, I will use one or two RoboClaws (http://www.ionmc.com/) - note that ionmc is owned by the same people who did Orion Robotics and BasicMicro... These motor controllers handle the motors plus the encoders.

Will probably have a Teensy HAT, as there are lots of things you might want to do, like control servos, read in sensors... The HAT I showed in the other thread has stuff on it like a Pololu DC/DC converter to handle converting the 3S LiPo (could be 4S) down to 5V to power both the HAT and the host. Also AX servo control, an amplifier that works off of DAC0, a couple of switches, LEDs, one NeoPixel, several IO pins set up with three-pin headers, plus a set of jumpers to connect different parts of the RPI to the Teensy. For example, the UART on the RPI connector connects up to one of the UARTs on the Teensy. I also have it set up for the SPI pins (did a little testing), and maybe the I2C pins (have not tried them yet)...

Once I set this up on either the Rover or a Hexapod, I will then figure out which processor to use to handle things like the IMU. Easiest would be for the RPI (or UP) to handle it if there is already a ROS node for it, but I may want to move it to the Teensy... Likewise for some other components.

There are a few threads up here as well as on Trossen that talk about some of these boards. Example: http://forums.trossenrobotics.com/s...-Yet-Another-Teensy-Board-D&p=75892#post75892

I should warn that I am a software guy, not a hardware guy; my last EE-like class was probably over 35 years ago, and I only do these boards for my own experimentation. So I would trust others like jwatte for a more complete design... But I usually keep versions of some of the different boards up at: https://github.com/KurtE/Teensy3.1-Breakout-Boards
For all of the boards there are DipTrace design files, and for many of them there is a zip file with all of the Gerber files necessary to have them built at places like OSH Park...
 
Thanks Kurt. I've wanted to do something with machine vision for a long time. I have looked at ROS, and it's pretty complete in what it can do; you just have to select what you want to pull together. There is also an Arduino-ROS bridge if you wanted to go that route :). I keep reusing and building off the software I put together for the other two rovers and tweaking the code. Finally got the odometry stuff working correctly, using the BNO055 for the IMU - that way the orientation calculation is offloaded. I also like having options for the different operating modes. The Mega-type breakout board I put together really helped with that. I am at the point that I haven't touched an Arduino board in months.
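In rough terms the odometry update is just dead reckoning: the encoders give the distance travelled and the BNO055 gives the heading, and the pose is integrated from both. A generic sketch of that step (made-up names and constants, not the actual rover code):

Code:
// Generic dead-reckoning sketch: encoder ticks give distance travelled,
// the IMU gives heading, and the pose is integrated from both.
// All names and constants here are illustrative only.
#include <math.h>

const float TICKS_PER_METER = 3200.0f;   // depends on wheels/encoders, example value

float poseX = 0.0f, poseY = 0.0f;        // position in meters
long  lastTicks = 0;

// headingDeg: yaw from the IMU (0 = startup heading)
// ticks: average of the left and right encoder counts
void updateOdometry(long ticks, float headingDeg) {
  long deltaTicks = ticks - lastTicks;
  lastTicks = ticks;

  float distance   = deltaTicks / TICKS_PER_METER;
  float headingRad = headingDeg * (float)M_PI / 180.0f;

  poseX += distance * cosf(headingRad);
  poseY += distance * sinf(headingRad);
}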

I've seen the RoboClaws over at RobotShop, and they are nice; I would probably use them when I get around to driving larger motors.

I've looked at the EZ-B as a controller, but with that everything comes back to the PC; you might want to check it out as well. My goal was to have everything on-board. I have a LattePanda as well, but its power requirements and the rover's size ruled it out. I stayed away from ROS for one reason: I learn more if I do it myself, at least some of it ;).

I like the concept of HATs, but I don't know enough about the Pi yet. @MichaelMeissner suggested using a Raspberry Pi Zero, and it looks like it has some possibilities for my design goals. It should be big enough to run OpenCV, so we will see.

Mike
 
Thanks Mike,

I actually have a few of the older RoboClaws sitting around, plus I do have one of their newer ones (http://www.ionmc.com/Roboclaw-2x7A-Motor-Controller_p_13.html), but I may want to pick up one of their newer 15A models... depending on which motors. Earlier with the rover I used just one, where both left wheels (motors) were connected to one set of connections and both right wheels were connected to the other... But for a larger rover I may want separate control for each motor... I believe jwatte uses 2?

With lots of my Rover/Hexapod stuff running on the Linux controllers, I ported my Arduino code base over to Linux (the first one was an RPI2). I created a set of library code where I emulated some of the Arduino APIs, like pinMode, digitalWrite, and some of the Serial classes, to make it easier for me to port the code. I never took the Rover code as far as either of you two have; I simply played around with it and used some form of joystick to control the rover - early on Playstation 2 controllers, later XBee... With some of the code I have now converted over to using PS3 or PS4 controllers over Bluetooth (Linux)...
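The shim is nothing fancy; the idea is roughly like this (a hypothetical illustration of the approach, not my actual library - a real version would usually sit on top of a proper GPIO library):

Code:
// Hypothetical sketch of mapping a few Arduino calls onto Linux so that
// Arduino-style code can be recompiled on an RPI. Not the actual library.
#include <cstdio>
#include <ctime>
#include <unistd.h>

#define OUTPUT 1
#define HIGH   1
#define LOW    0

// millis() and delay() via POSIX calls
unsigned long millis() {
  struct timespec ts;
  clock_gettime(CLOCK_MONOTONIC, &ts);
  return (unsigned long)(ts.tv_sec * 1000UL + ts.tv_nsec / 1000000UL);
}

void delay(unsigned long ms) { usleep(ms * 1000UL); }

// GPIO via the old sysfs interface (the pin must already be exported,
// e.g. echo 18 > /sys/class/gpio/export); shown only to illustrate the shape.
void pinMode(int pin, int mode) {
  char path[64];
  snprintf(path, sizeof(path), "/sys/class/gpio/gpio%d/direction", pin);
  FILE *f = fopen(path, "w");
  if (f) { fputs(mode == OUTPUT ? "out" : "in", f); fclose(f); }
}

void digitalWrite(int pin, int value) {
  char path[64];
  snprintf(path, sizeof(path), "/sys/class/gpio/gpio%d/value", pin);
  FILE *f = fopen(path, "w");
  if (f) { fputs(value ? "1" : "0", f); fclose(f); }
}

// With the shim in place, Arduino-style code compiles unchanged:
int main() {
  pinMode(18, OUTPUT);
  for (;;) {
    digitalWrite(18, HIGH);
    delay(500);
    digitalWrite(18, LOW);
    delay(500);
  }
}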

HATs: The RPI has an external connector, sort of like the Arduino's, where many of the pins can be used for GPIO; it also has one hardware UART, SPI, I2C... The HAT is sort of like an Arduino shield...

More later
 
Sounds like fun! I have one sitting on the desk next to me (along with an UP board and an Odroid XU4).

And today I received an RPI 0W, which I am in the process of setting up. Thought it might be interesting to take a look at...
One thing I notice when working on the RPIs is how much slower it is to work with SD cards versus the EMMC on the UP or Odroid boards.

Who knows, maybe I will play around with DipTrace and see what I could fit on an RPI HAT that is very small... Probably not a T3.5/3.6.

Maybe not a DC/DC converter? Probably a Dynamixel interface, but again probably not the stuff to turn the power bus off...

@jwatte - Do you have a small hat for your setup yet? What all did you put on it?
 
Not sure how much detail you need, but essentially it compares readings from three sonars: the left sensor is angled to the right, the right sensor is angled to the left, and the center sensor points forward. It goes through a series of comparisons of the three sensors to see if there is a gap that the rover can go through. If no gap is found with the three sensors, it runs a modified VFH algorithm using a rotation sensor and looks for a gap; if one is found, it turns the rover in that direction and another comparison is performed following the same methodology. The code is posted on GitHub, see post #26.
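In rough terms the comparison step boils down to something like this (an illustrative sketch, not the code on GitHub; the threshold and the names are made up):

Code:
// Illustrative sketch of the three-sonar gap check described above.
// Threshold and names are made up; not the posted rover code.
const int CLEAR_CM = 80;                  // distance considered "open"

enum Decision { GO_STRAIGHT, VEER_LEFT, VEER_RIGHT, NO_GAP };

Decision findGap(int leftCm, int centerCm, int rightCm) {
  if (leftCm > CLEAR_CM && centerCm > CLEAR_CM && rightCm > CLEAR_CM)
    return GO_STRAIGHT;                   // everything is open, keep driving
  if (centerCm > CLEAR_CM)
    return GO_STRAIGHT;                   // the forward path itself is clear
  if (leftCm > CLEAR_CM || rightCm > CLEAR_CM)
    return (leftCm >= rightCm) ? VEER_LEFT : VEER_RIGHT;  // steer toward the more open side
  return NO_GAP;                          // hand off to the modified VFH scan
}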
 
Thank you very much for the link to your code. I have downloaded it and will try to understand it. Do you think a Teensy 3.2 will be able to compile and run the code?

I have an RPLidar that I am trying to figure out how to do obstacle avoidance with. Just looking for a simple algorithm that I can play around with.
 
Take a look at what I did for the modified VFH algorithm. It's kind of based on a bubble algorithm, which isn't too bad to implement. Also do a Google search; it may turn up something you can use.
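If it helps as a starting point for the RPLidar, the bubble idea over a scan looks roughly like this: bin the scan into sectors, flag any sector whose closest return is inside a bubble radius, and steer toward the middle of the widest free run of sectors. A sketch with made-up sector count, radius, and names:

Code:
// Rough sketch of a bubble-style gap finder over the front 180 degrees of
// a lidar scan. Sector count, bubble radius and names are illustrative only.
#include <math.h>

const int   SECTORS       = 36;      // 5-degree sectors across the front 180 degrees
const float BUBBLE_METERS = 0.6f;    // anything closer than this blocks a sector

// sectorMin[i] = closest return (meters) in sector i, 0 = far left.
// Returns a steering angle in degrees (-90 = hard left, +90 = hard right),
// or NAN if no free gap exists.
float bubbleSteer(const float sectorMin[SECTORS]) {
  int bestStart = -1, bestLen = 0;
  int runStart = -1, runLen = 0;

  for (int i = 0; i < SECTORS; i++) {
    if (sectorMin[i] > BUBBLE_METERS) {       // sector is outside the bubble: free
      if (runLen == 0) runStart = i;
      runLen++;
      if (runLen > bestLen) { bestLen = runLen; bestStart = runStart; }
    } else {
      runLen = 0;                             // a blocked sector ends the run
    }
  }

  if (bestLen == 0) return NAN;               // boxed in, no gap anywhere

  float centerSector = bestStart + bestLen / 2.0f;
  return (centerSector / SECTORS) * 180.0f - 90.0f;  // map sector index to an angle
}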
 
Thanks for the info. Also, I see that you are using the BNO055. Did you have problems obtaining good yaw values? It seems mine sometimes jumps all over the place, which I hear is due to the auto-calibration feature.
 
I haven't run into that problem with my BNO055 yet. Just make sure that when you do the calibration all 3 sensors (gyro, accel, and mag) show as fully calibrated.
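If you are using the Adafruit BNO055 library, you can print the calibration levels alongside the yaw to watch them reach 3 while you move the sensor around. A minimal sketch, assuming that library:

Code:
// Checking that the BNO055 reports full calibration before trusting yaw,
// assuming the Adafruit_BNO055 library. Each level ranges from 0
// (uncalibrated) to 3 (fully calibrated).
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);   // sensor id 55, default I2C address

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected");
    while (1);
  }
}

void loop() {
  uint8_t sys, gyro, accel, mag;
  bno.getCalibration(&sys, &gyro, &accel, &mag);

  sensors_event_t event;
  bno.getEvent(&event);                      // event.orientation.x is the heading

  Serial.print("cal sys/g/a/m: ");
  Serial.print(sys); Serial.print(gyro); Serial.print(accel); Serial.print(mag);
  Serial.print("  yaw: ");
  Serial.println(event.orientation.x);

  delay(200);
}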
 