ECE497 Project Beaglebone Blue Robotics

From eLinux.org
Latest revision as of 06:28, 31 October 2017

Embedded Linux Class by Mark A. Yoder


Team members: Alvin Koontz, Samuel Lawrence

Grading Template

I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5=OK, 10=Wow!

08 Executive Summary - Good, but seems dated
07 Installation Instructions - Wiring instructions are needed.  A picture would be very helpful.
09 User Instructions - Clear instructions.  Nice screen shot
08 Highlights - Nice videos.  I would have liked to see more of the web interface in action.
09 Theory of Operation
10 Work Breakdown
10 Future Work
10 Conclusions
08 Demo - I wish more of the web interface was demo'ed.
10 Not Late
Comments: You got a lot done on this project.

Score:  89/100

Executive Summary

(Seems out of date.)

The stated goal for this project was to "do something interesting with robotics" using the BeagleBone Blue as part of a final project for ECE497. We interfaced the Blue with an existing robotics platform and added a remote web interface. We developed a proportional-control object-following program that lets the robot follow a red ball within a small area, using a Pixy camera for image detection and continuous rotation servos for drive. We also developed a web interface for reading the other sensors and driving the robot with the keyboard. A pixel grid on the web page shows the detected position of the tracked object in real-time*.


We ran into a problem using the motors built into the platform: they were designed for a 5 V controller, while the built-in motor controllers on the Blue run at 12 V. For a number of reasons we decided to use continuous rotation servos instead. Some mechanical modifications were required to adapt the platform's existing wheels to the continuous rotation servos, but this saved us from having to order an external H-bridge or new motors.

To conclude: We had a total of 3 weeks to work on the project, and the first half of that time was spent dealing with the quirks of the BeagleBone Blue. Because we were working with an alpha version of the board, there were a number of hurdles and workarounds to figure out just to get the board connected to the internet and communicating successfully with the sensors and motors. The basic project goals were met, but there is more yet to be done.

∗Not really, by "real-time" we mean about 5 times a second :)

Packaging

Installation Instructions

Setup instructions:

  • Follow the GitHub instructions to configure the Blue for robotics: https://github.com/StrawsonDesign/Robotics_Cape_Installer
  • Follow the instructions for installing the pixycam libraries on the BeagleBone: http://cmucam.org/projects/cmucam5/wiki/Hooking_up_Pixy_to_a_Beaglebone_Black
  • Follow the instructions for connecting with a Raspberry Pi over wifi (https://servicedesk.rose-hulman.edu/ics/support/TSList.asp?folderID=100&task=knowledge); the process was the same for the Blue, though your network may be different
  • It's a good idea to set passwords for the bone at this point, but it's not critical
  • Running setup.sh should install any needed libraries
  • Refer to our readme if any other problems are encountered

(How do I wire it?)

User Instructions

Once everything is installed

cd into the root directory of the project

Run

./startWifi.sh

This will start the wifi program in a tmux session called wifi and output the IP address for the Blue once it connects.

[Screenshot: Wifistart.png]

Run

./startServer.sh

This will start the server program in a tmux session called server; by default it listens on port 8090.

With any browser, go to yourIP:8090. From there the robot can be controlled with the keyboard keys, and sensor reads can be done with the buttons. The ball-following program can be started and stopped from the web interface.

[Screenshot: WebInterface_final.png]

Highlights

The Blue is able to read object position data from a Pixy camera and display the object's coordinates on a grid in a socket.io-based web interface. The same position data controls the speed and direction of the robot so it can track a brightly colored object. In short, we integrated a variety of devices and sensors into a web-based application on the bone.

Here is a YouTube video of the object following mode in action: https://youtu.be/4T3ZzZQ5Q7M
Here is a YouTube video of the web interface control in action: https://youtu.be/8AIXEpo2eGY

Theory of Operation

The theory for the project was rather simple: start a web server and have it make function calls based on input from the user of the website. Most of the work for the project was debugging libraries and hardware. The Python server program calls the socket.io library to communicate with the client computer and the Robotics Cape library to control the hardware.
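That event-to-function-call pattern can be sketched as follows. This is a minimal illustration only: the function and key names below are hypothetical placeholders, not the project's actual code, and a real version would register these handlers with a socket.io server and call into the Robotics Cape library.

```python
# Sketch of dispatching web events to hardware calls (hypothetical names).

def drive(left, right):
    """Placeholder for a Robotics Cape servo call."""
    return ("drive", left, right)

def read_sensors():
    """Placeholder for reading the onboard sensors."""
    return ("sensors",)

# Keyboard keys sent from the web page map to motor commands.
KEY_COMMANDS = {
    "w": lambda: drive(1.0, 1.0),    # forward
    "s": lambda: drive(-1.0, -1.0),  # reverse
    "a": lambda: drive(-0.5, 0.5),   # turn left
    "d": lambda: drive(0.5, -0.5),   # turn right
}

def handle_event(event, data=None):
    """Dispatch one incoming web event to a hardware function."""
    if event == "key":
        cmd = KEY_COMMANDS.get(data)
        return cmd() if cmd else None
    if event == "read":
        return read_sensors()
    return None
```

The dispatch table keeps the server loop itself trivial: each socket.io event becomes a lookup plus one hardware call.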
Routine for image tracking:
1. The object to be tracked is programmed into the Pixy camera using the built-in detection button on the camera.
2. A Python script interfaces with the libpixyusb library to communicate with the Pixy camera over USB.
3. The Python script uses the object position to proportionally control two continuous rotation servos (via the Robotics Cape library).
4. The object position is written to a JSON file every 50 frames (roughly once a second).
5. The web page has a 127x127 grid. It polls the JSON file every second through the socket.io interface and colors the appropriate cell to represent the position.
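Steps 3 and 4 can be sketched like this. The numbers are assumptions for illustration (a Pixy frame center of 160 on a 0–319 x-axis, a made-up gain, and a base forward speed), not the tuned values from the project's script:

```python
import json

FRAME_CENTER_X = 160   # assumed Pixy horizontal center (0-319 range)
GAIN = 0.005           # proportional gain (illustrative, not tuned)
BASE_SPEED = 0.5       # forward throttle in a [-1, 1] range

def servo_commands(object_x):
    """Proportional steering: the error between the object and the frame
    center sets the difference between left and right servo speeds."""
    error = object_x - FRAME_CENTER_X
    left = BASE_SPEED + GAIN * error
    right = BASE_SPEED - GAIN * error
    clamp = lambda v: max(-1.0, min(1.0, v))   # keep within throttle range
    return clamp(left), clamp(right)

def write_position(path, x, y):
    """Step 4: dump the latest object position for the web page to poll."""
    with open(path, "w") as f:
        json.dump({"x": x, "y": y}, f)
```

When the ball is centered the two servos run at the same speed; as it drifts off-center the speed difference steers the robot back toward it.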

Work Breakdown

List the major tasks in your project and who did what.

Also list here what doesn't work yet and when you think it will be finished and who is finishing it.

  • Alvin's Tasks
  1. Build web interface
  2. Update Wiki page
  3. Interfaced connectors
  4. Integrate IR sensors, bump sensors, and gpio with the bone.
  5. Integrate servos
  6. Integrate IR sensors with the web interface.
  7. Created instructions and install files for the web server
  8. Made the server webpage look prettier
  • Sam's Tasks
  1. Find and order connectors
  2. Install pixycam libraries on the Blue and use position data to control servos
  3. Send and visualize object position data on the web interface
  4. Mount the Blue to the robot
  5. Retrofit continuous rotation servos onto the wheels we were given, with help from Gary and Jack
  6. Integrate servos
  7. Created instructions and scripts for setting up the pixy cam
  8. Set up a socket.io function call from the webpage to start the pixy camera script automatically
  9. Made the server webpage look prettier (it's ugly and jumbled at the moment)
  • Incomplete tasks
  1. The motor drivers were not used because our motors operate on 5 V rather than the supplied 12 V. We decided that retrofitting continuous rotation servos was a better idea.

Future Work

  1. PID control for the object follow script (currently only proportional control is implemented) would make the robots movement less erratic.
  2. Add a screenshot button and object recognition button to the web interface so that new objects could be programmed in remotely without being forced to use the hardware button on the pixy.
  3. Implement a web programming interface like Blockly to make the Blue accessible to younger students.
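Item 1 could build directly on the existing proportional term. A generic discrete PID update looks like the sketch below; the gains are placeholders, not values tuned for this robot:

```python
class PID:
    """Minimal discrete PID controller (placeholder gains, not tuned
    for this robot). Call update() once per camera frame."""

    def __init__(self, kp=0.005, ki=0.0005, kd=0.002):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the control output for one timestep of length dt."""
        self.integral += error * dt
        derivative = 0.0
        if self.prev_error is not None and dt > 0:
            derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

The derivative term damps the overshoot that makes proportional-only steering erratic, and the integral term removes any steady-state offset from the target.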

Conclusions

The BeagleBone Blue has some very good hardware built into it; one limitation of this project was that we used only the parts readily available to us for interfacing with it. It would have been a lot more fun to pick out parts and build a platform that fully utilized all of its functions. The board still being in an experimental state led to some interesting hardware issues. The ones we encountered: sometimes the 12 V jack gets bumped and the entire board shuts down. If the Blue boots from USB power alone and the 12 V supply or battery is plugged in afterwards, a few PWM signal pins stay at 3.3 V, making PWM impossible on those pins; rebooting the Blue solves this. Finally, when the PWM voltage rail is started, nothing can be attached to the rail or it will fail to reach 6 V; the solution is to start with everything unplugged and then plug things back in. Aside from this, the Blue worked as expected.


