ECE497 Project Beaglebone Blue Robotics

Embedded Linux Class by Mark A. Yoder


Team members: Alvin Koontz, Samuel Lawrence

Grading Template

I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5 = OK, 10 = Wow!

00 Executive Summary
00 Installation Instructions 
00 User Instructions
00 Highlights
00 Theory of Operation
00 Work Breakdown
00 Future Work
00 Conclusions
00 Demo
00 Late
Comments: I'm looking forward to seeing this.

Score:  10/100

Executive Summary

We will be interfacing the newly developed BeagleBone Blue with an existing robotics platform from SparkFun. An interface to existing libraries for motor control, servo control, and sensors will be developed. Stretch goals include some amount of image processing.

We have the Blue working and mounted to the robot frame. We are able to connect to the school WiFi and host a web server to interface with. We have rewired the robot to connect to the BeagleBone Blue board. We can read all the sensors on board and interface with the Pixy camera.

What we haven't been able to get working is motor control, because the motors on the SparkFun robot were rated for much less than what the Blue provides, so we replaced them with continuous rotation servos. We have also had power issues with the board itself, related to the 12 V jack and the 6 V regulator: sometimes the jack disconnects and the whole board shuts down, and sometimes the 6 V regulator fails to work. We have yet to find a solid reason why these things happen.

In conclusion, we had a number of difficulties getting the Blue to work, and part of that is definitely due to how experimental its current state is. After getting everything set up, though, we found it to be very easy to use.

Packaging

Installation Instructions

Give step-by-step instructions on how to install your project.

User Instructions

Once everything is installed, run the Python server located in the git project, then connect to the website; the robot should be controllable from the website using the web buttons and keyboard keys.

Highlights

The Blue is able to read position data of objects from a Pixy camera and display the coordinates of the object on a grid in a socket.io-based web interface. The coordinate data also controls the speed and direction of the robot so that it tracks a brightly colored object. The project integrates a variety of devices and sensors into a web-based application on the bone. TODO: test and verify remote operation with a battery and WiFi only; make sure the connection stays open. TODO: YouTube video.

Theory of Operation

The theory for the project was rather simple: start a web server and have it make function calls based on input from the user of the website (a rough sketch of this request path is shown below). Most of the work for the project was debugging libraries and hardware.
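As a rough illustration of that path, the sketch below shows a small socket.io server that turns page events into function calls and answers the grid's position polls. This is only a sketch, not the project's actual server: Flask-SocketIO, the event names ("drive", "get_position"), the position.json path, and the drive() helper are all assumptions made for illustration.

<pre>
# Minimal sketch of the web-server side: turn socket.io events from the
# page (buttons/keyboard) into function calls, and answer position polls.
# Flask-SocketIO, the event names, the JSON path, and drive() are assumptions.
import json
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

JSON_PATH = "position.json"  # written by the tracking script (assumption)

def drive(left, right):
    """Hypothetical stand-in for the Robotics Cape servo calls that set the
    left and right continuous rotation servos (roughly -1.0 .. 1.0)."""
    print("drive", left, right)

@socketio.on("drive")
def on_drive(cmd):
    # The page sends e.g. {"dir": "forward"} when a button or key is pressed.
    speeds = {"forward": (0.3, 0.3), "reverse": (-0.3, -0.3),
              "left": (-0.3, 0.3), "right": (0.3, -0.3), "stop": (0, 0)}
    drive(*speeds.get(cmd.get("dir"), (0, 0)))

@socketio.on("get_position")
def on_get_position():
    # The page polls once a second; reply with the last tracked coordinates.
    try:
        with open(JSON_PATH) as f:
            emit("position", json.load(f))
    except (IOError, ValueError):
        pass  # file not written yet, or caught mid-write

if __name__ == "__main__":
    socketio.run(app, host="0.0.0.0", port=8080)
</pre>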
Routine for image tracking (a rough sketch of steps 2-4 follows the list):
1. The object to be tracked is programmed into the Pixy camera using the built-in detection button on the camera (TODO: add reference to instructions for this).
2. A Python script interfaces with the libpixyusb library to communicate with the Pixy camera over a USB interface.
3. The Python script uses the object position to proportionally control two continuous rotation servos (via the Robotics Cape library).
4. The object position is written to a JSON file every 50 frames (roughly one second).
5. The webpage has a 127x127 grid. The webpage polls the JSON file every second through the socket.io interface and colors the appropriate grid cell to represent the position.
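The sketch below illustrates steps 2-4. It is not the project's actual script: read_largest_block() and set_servo() are hypothetical stand-ins for the libpixyusb Python binding and the Robotics Cape servo call, and the frame size, gain, and duty values are assumptions that would need tuning on the robot.

<pre>
# Rough sketch of the tracking loop (steps 2-4).  The two helpers below are
# hypothetical stand-ins for the real libpixyusb and Robotics Cape calls.
import json
import time

FRAME_W = 320                 # approximate Pixy frame width in pixels (assumption)
KP = 0.5                      # proportional steering gain (assumption)
BASE_SPEED = 0.3              # forward duty for both servos (assumption)
JSON_PATH = "position.json"   # file the web page polls (assumption)

def read_largest_block():
    """Hypothetical stand-in for the libpixyusb Python binding: return the
    (x, y) of the largest detected block, or None if nothing is detected."""
    return (160, 100)  # dummy value so the sketch runs

def set_servo(channel, duty):
    """Hypothetical stand-in for the Robotics Cape servo call; duty is
    roughly -1.0 .. 1.0 for a continuous rotation servo."""
    print("servo", channel, "->", round(duty, 2))

frame = 0
while True:
    pos = read_largest_block()
    if pos is not None:
        x, y = pos
        # Horizontal offset from the frame center, normalized to -1 .. 1.
        error = (x - FRAME_W / 2) / (FRAME_W / 2)
        # Step 3: steer proportionally, turning toward the object.
        set_servo(1, BASE_SPEED + KP * error)
        set_servo(2, BASE_SPEED - KP * error)
        # Step 4: write the position out every 50 frames (~1 s).
        if frame % 50 == 0:
            with open(JSON_PATH, "w") as f:
                json.dump({"x": x, "y": y}, f)
    frame += 1
    time.sleep(0.02)   # roughly 50 frames per second
</pre>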

Work Breakdown

List the major tasks in your project and who did what.

Also list here what doesn't work yet and when you think it will be finished and who is finishing it.

  • Alvin's Tasks
  1. Built the web interface
  2. Updated the Wiki page
  3. Interfaced connectors
  • Sam's Tasks
  1. Found connectors
  2. Worked with the Pixy camera
  3. Mounted the Blue to the robot

Future Work

Suggest additional things that could be done with this project.

Conclusions

The BeagleBone Blue has some very good hardware built into it; one of the flaws of this project was that we simply used whatever was readily available to us for interfacing with it. It would have been a lot more fun to pick out parts and build a platform that fully utilizes all of its functions. The board still being in an experimental state resulted in some interesting hardware issues, but nothing that couldn't be worked around.



Embedded Linux Class by Mark A. Yoder