ECE497 Project Beaglebone Blue Robotics


[Image: Embedded Linux Class by Mark A. Yoder]


Team members: Alvin Koontz, Samuel Lawrence

Grading Template

Tuesday draft: Looks like a good start. Don't forget the video. Instructions are sometimes hard to follow, but might be easier with pictures.


I've added some red comments throughout. Feel free to delete them once you've made fixes.

I'm using the following template to grade. Each slot is 10 points: 0 = Missing, 5 = OK, 10 = Wow!

00 Executive Summary
00 Installation Instructions 
00 User Instructions
00 Highlights
00 Theory of Operation
00 Work Breakdown
00 Future Work
00 Conclusions
00 Demo
00 Late
Comments: I'm looking forward to seeing this.

Score:  10/100


Executive Summary


We interfaced the newly developed BeagleBone Blue with an existing robotics platform from SparkFun. We built an interface for interacting with existing libraries for motor control, servo control, and sensors. The interface includes some image processing and a web interface that lets the user control the robot, in addition to a red-ball-following program.

We have the Blue working and mounted to the robot frame. We are able to connect to the school WiFi and host a web server to interface with. We rewired the robot to connect to the BeagleBone Blue board, and we can read all of the on-board sensors and interface with the Pixy camera.

What we were not able to get working is motor control: the motors on the SparkFun robot were rated for a much lower voltage than the Blue provides, so we replaced them with continuous rotation servos. We also had some issues with the board itself, which may or may not be hardware problems; these are summarized in the Conclusions section.

In conclusion, we had a number of difficulties getting the Blue to work, due in part to how experimental its current state is. Once everything was set up, though, we found it very easy to use.

Packaging

Installation Instructions

Setup instructions

User Instructions

Once everything is installed, cd into the root directory of the project.

Run

./startWifi.sh

This starts the WiFi program in a tmux session called wifi and prints the IP address of the Blue once it connects.

[Image: Wifistart.png]

Run

./startServer.sh

This starts the server program in a tmux session called server; by default it listens on port 8090.
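
Both scripts run detached. To see a script's output later, attach to its tmux session (for example, tmux attach -t server) and detach again with Ctrl-b d.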

With any browser, go to yourIP:8090. From there the robot can be controlled with the keyboard, sensor reads can be done with the buttons, and the ball-following program can be started and stopped from the web interface.

[Image: WebInterface final.png]

Highlights

The Blue is able to read the position of objects from a Pixy camera and display the coordinates of a detected object on a grid in a socket.io-based web interface. The position data also controls the speed and direction of the robot so that it tracks a brightly colored object. In the process we integrated a variety of devices and sensors into a web-based application on the Bone.

Here is a YouTube video of the object-following mode in action: https://youtu.be/4T3ZzZQ5Q7M
Here is a YouTube video of the web interface control in action: https://youtu.be/8AIXEpo2eGY

Theory of Operation

The theory for the project was rather simple: start a web server and have it make function calls based on input from the user of the website. Most of the work for the project was debugging libraries and hardware. The Python server program calls the socket.io library to communicate with the client computer and the robotics cape library to control the hardware.
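
Below is a minimal sketch of that structure. It assumes Flask-SocketIO on the Python side, and set_servos() is a hypothetical stand-in for the robotics cape library calls, so it is an outline of the approach rather than the exact code:

  # Minimal sketch of the server: assumes Flask-SocketIO; set_servos()
  # is a hypothetical stand-in for the robotics cape library calls.
  from flask import Flask
  from flask_socketio import SocketIO

  app = Flask(__name__)
  socketio = SocketIO(app)

  def set_servos(left, right):
      # Hypothetical wrapper: normalized speeds in [-1, 1]
      # for the two continuous rotation drive servos.
      pass

  @socketio.on('drive')
  def handle_drive(data):
      # The browser emits {'left': ..., 'right': ...} on key presses.
      set_servos(data['left'], data['right'])

  if __name__ == '__main__':
      # Port matches the one given in the user instructions.
      socketio.run(app, host='0.0.0.0', port=8090)
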
Routine for image tracking:
1. The object to be tracked is programmed into the Pixy camera using the built-in detection button on the camera.
2. A Python script interfaces with the libpixyusb library to communicate with the Pixy camera over USB.
3. The Python script uses the object position to proportionally control two continuous rotation servos via the robotics cape library (see the sketch after this list).
4. The object position is written to a JSON file every 50 frames (roughly one second).
5. The webpage has a 127x127 grid; it polls the JSON file every second through the socket.io interface and colors the appropriate cell to show the position.
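
A condensed sketch of the proportional-control loop is below. get_largest_block() and set_servos() are hypothetical stand-ins for the libpixyusb binding and the robotics cape calls, and the gain and speeds are illustrative rather than tuned values:

  # Sketch of the proportional tracking loop (illustrative values).
  import json

  def get_largest_block():
      # Hypothetical stand-in for the libpixyusb binding: returns the
      # (x, y) pixel position of the largest detected object.
      return 160, 100  # dummy value for the sketch

  def set_servos(left, right):
      # Hypothetical stand-in for the robotics cape library calls.
      pass

  FRAME_CENTER = 160   # half of the Pixy's nominal 320-pixel frame width
  GAIN = 0.005         # illustrative proportional gain

  frame = 0
  while True:
      x, y = get_largest_block()
      error = x - FRAME_CENTER     # positive when the object is to the right
      # Proportional control: steer toward the object while moving forward.
      set_servos(0.3 + GAIN * error, 0.3 - GAIN * error)
      frame += 1
      if frame % 50 == 0:          # every 50 frames, roughly one second
          with open('position.json', 'w') as f:
              json.dump({'x': x, 'y': y}, f)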

Work Breakdown


  • Alvin's Tasks
  1. Built the web interface
  2. Updated the wiki page
  3. Interfaced connectors
  4. Integrated IR sensors, bump sensors, and GPIO with the Bone
  5. Integrated servos
  6. Integrated IR sensors with the web interface
  7. Created instructions and install files for the web server
  8. Made the server webpage look prettier
  • Sam's Tasks
  1. Found and ordered connectors
  2. Installed the Pixy camera libraries on the Blue and used the position data to control servos
  3. Sent and visualized object position data on the web interface
  4. Mounted the Blue to the robot
  5. Retrofitted continuous rotation servos onto the wheels we were given, with help from Gary and Jack
  6. Integrated servos
  7. Created instructions and scripts for setting up the Pixy camera
  8. Set up a socket.io function call from the webpage to start the Pixy camera script automatically
  9. Made the server webpage look prettier (it's still somewhat ugly and jumbled at the moment)
  • Incomplete tasks
  1. The motor drivers were not used because our motors operate on 5 V rather than the supplied 12 V; we decided that retrofitting continuous rotation servos was a better approach.

Future Work

  1. PID control for the object-follow script (currently only proportional control is implemented) would make the robot's movement less erratic; see the sketch after this list.
  2. Add a screenshot button and an object-recognition button to the web interface so that new objects can be programmed in remotely, without being forced to use the hardware button on the Pixy.
  3. Implement a web programming interface like Blockly to make the Blue accessible to younger students.
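
As a starting point, a minimal sketch of replacing the proportional term with a PID controller follows; the gains are placeholders to be tuned on the robot, and pid_step() would feed the same hypothetical set_servos() wrapper shown in the Theory of Operation:

  # Sketch of a PID controller for the tracking error (placeholder gains).
  KP, KI, KD = 0.005, 0.0001, 0.002

  integral = 0.0
  prev_error = 0.0

  def pid_step(error, dt):
      # One PID update: error in pixels, dt in seconds.
      global integral, prev_error
      integral += error * dt
      derivative = (error - prev_error) / dt
      prev_error = error
      return KP * error + KI * integral + KD * derivative

  # Example: one control step with a 20 ms loop period.
  correction = pid_step(12.0, 0.02)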

Conclusions

The BeagleBone Blue has some very good hardware built into it. One of the flaws of this project was that we used only what was readily available to us for interfacing with it; it would have been more fun to pick out parts and build a platform that fully utilizes all of its functions. The board still being in an experimental state resulted in some interesting hardware issues:

  1. Sometimes the 12 volt jack gets bumped and the entire board shuts down.
  2. If the Blue boots from USB power alone and the 12 volt supply or battery is then plugged in, a few PWM signal pins stay at 3.3 volts, making PWM impossible on those pins; rebooting the Blue solves this.
  3. Whenever the voltage rail for the PWM is started, nothing can be attached to the rail or it will fail to reach 6 volts; the solution is to start with everything unplugged and then plug things back in.

Aside from these issues, the Blue worked as expected.



[Image: Embedded Linux Class by Mark A. Yoder]