ECE497 Project Beaglebone Blue Robotics
Embedded Linux Class by Mark A. Yoder


Team members: Alvin Koontz, Samuel Lawrence

Grading Template

Tuesday draft: Looks like a good start. Don't forget the video. Instructions are sometimes hard to follow, but might be easier with pictures.

I've added some red comments throughout. Feel free to delete them once you've made fixes.

I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5=OK, 10=Wow!

00 Executive Summary
00 Installation Instructions 
00 User Instructions
00 Highlights
00 Theory of Operation
00 Work Breakdown
00 Future Work
00 Conclusions
00 Demo
00 Late
Comments: I'm looking forward to seeing this.

Score:  10/100


Executive Summary

We will be interfacing the newly developed BeagleBone Blue with an existing robotics platform from SparkFun. An interface to existing libraries for motor control, servo control, and sensors will be developed, and we are building a web interface that will let the user control the robot. Stretch goals include some amount of image processing.

We have the Blue working and mounted to the robot frame. We are able to connect to the school WiFi and host a web server to interface with. We have rewired the robot to connect to the BeagleBone Blue board. We can read all the on-board sensors and interface with the Pixy camera.

What we haven't been able to get working is motor control, because the motors on the SparkFun robot were rated for a much lower voltage than what the Blue provides, so we had to replace them with continuous rotation servos. We have also had power issues with the board itself, related to the 12 volt jack and the 6 volt regulator: sometimes the jack loses contact and the whole board shuts down, and sometimes the 6 volt regulator fails to come up. We have yet to find a solid reason why these things happen.

In conclusion, we had a number of difficulties getting the Blue to work, partly due to how experimental its current state is. Once everything was set up, though, we found it very easy to use.

Packaging

Installation Instructions

Step-by-step instructions on how to install the project will go here. TODO: write the install scripts.

User Instructions

Once everything is installed

cd into the root directory of the project

Run

./startWifi.sh

This will start the wifi program in a tmux session called wifi, and will output the IP address for the Blue once it connects.

[[File:wifistart.png|framed|The normal run case]]

Run

./startServer.sh

This will start the server program in a tmux session called server; by default it should be available at yourIP:8090.

From there the robot can be controlled with the keyboard keys, and sensor reads can be done with the buttons.

Highlights

The Blue is able to read position data of objects from a Pixy camera and display the coordinates of the object on a grid in a socket.io based web interface. The coordinate data also controls the speed and direction of the robot so that it tracks a brightly colored object. We integrated a variety of devices and sensors into a web based application on the bone.

TODO: test and verify remote operation with a battery and WiFi only, and make sure the connection stays open.

TODO: YouTube video

Theory of Operation

The theory for the project is rather simple: start a web server and have it make function calls based on input from the user of the website. Most of the work for the project was debugging libraries and hardware. The Python server program makes calls to the socket.io library for communicating with the client computer, and to the Robotics Cape library for controlling hardware.
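
As a rough illustration of this flow, the sketch below shows a socket.io handler turning a keypress event from the webpage into servo commands. This is a minimal sketch under assumptions, not the project's actual code: the python-socketio/eventlet packages, the keypress event name, the key-to-throttle mapping, and the set_servo() helper (standing in for the Robotics Cape servo call) are all hypothetical.

 # Minimal sketch (not the project's actual code) of the server flow described
 # above: a socket.io handler turns a keypress from the webpage into servo
 # commands. python-socketio/eventlet and set_servo() are assumptions.
 import eventlet
 import socketio
 
 sio = socketio.Server()
 app = socketio.WSGIApp(sio)
 
 def set_servo(channel, throttle):
     # Placeholder: the real program would call into the Robotics Cape library
     # to drive the continuous rotation servo on the given channel.
     print("servo %d -> %.2f" % (channel, throttle))
 
 # Map a key from the webpage to left/right wheel throttles (-1.0 .. 1.0).
 KEYMAP = {
     'w': ( 1.0,  1.0),   # forward
     's': (-1.0, -1.0),   # backward
     'a': (-0.5,  0.5),   # turn left
     'd': ( 0.5, -0.5),   # turn right
 }
 
 @sio.on('keypress')
 def keypress(sid, key):
     left, right = KEYMAP.get(key, (0.0, 0.0))
     set_servo(1, left)
     set_servo(2, right)
 
 if __name__ == '__main__':
     # Serve on port 8090 to match the yourIP:8090 address given above.
     eventlet.wsgi.server(eventlet.listen(('', 8090)), app)
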
Routine for image tracking:
1. The object to be tracked is programmed into the Pixy camera using the built-in detection button on the camera (TODO: add reference to instructions for this).
2. A Python script interfaces with the libpixyusb library to communicate with the Pixy camera over a USB interface.
3. The Python script uses the object position to proportionally control two continuous rotation servos (via the Robotics Cape library).
4. The object position is written to a JSON file every 50 frames (roughly one second). A rough sketch of steps 2 through 4 is shown after this list.
5. The webpage has a 127x127 grid. The webpage polls the JSON file every second through the socket.io interface and colors the appropriate grid cell representing the position.
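
The sketch below summarizes steps 2 through 4. It is illustrative only: get_largest_block() stands in for the libpixyusb Python binding, set_servo() for the Robotics Cape servo call, and the gain, frame-center, and speed values are assumed rather than taken from the project's script.

 # Illustrative sketch of steps 2-4 (not the project's actual script).
 # get_largest_block() stands in for the libpixyusb Python binding and
 # set_servo() for the Robotics Cape servo call; both names are hypothetical.
 import json
 import time
 
 KP = 0.01          # proportional gain (assumed value)
 CENTER_X = 160     # assumed center of a 320-pixel-wide Pixy frame
 BASE_SPEED = 0.5   # forward throttle while tracking
 
 def get_largest_block():
     # Placeholder for the libpixyusb call that returns the tracked
     # object's (x, y) position in the camera frame.
     return 160, 100
 
 def set_servo(channel, throttle):
     # Placeholder for the Robotics Cape call that drives a continuous
     # rotation servo on the given channel.
     print("servo %d -> %.2f" % (channel, throttle))
 
 frame = 0
 while True:
     x, y = get_largest_block()
     error = x - CENTER_X                      # >0: object is right of center
     # Step 3: proportional control only (PID is listed under Future Work).
     set_servo(1, BASE_SPEED + KP * error)     # left wheel
     set_servo(2, BASE_SPEED - KP * error)     # right wheel
 
     frame += 1
     if frame % 50 == 0:
         # Step 4: write the position to a JSON file every 50 frames
         # (roughly one second); the webpage polls this file via socket.io.
         with open('position.json', 'w') as f:
             json.dump({'x': x, 'y': y}, f)
     time.sleep(0.02)                          # ~50 frames per second
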

Work Breakdown

List the major tasks in your project and who did what.

Also list here what doesn't work yet and when you think it will be finished and who is finishing it.

  • Alvin's Tasks
  1. Build web interface
  2. Update Wiki page
  3. Interface connectors
  4. Integrate IR sensors, bump sensors, and GPIO with the bone.
  5. Integrate servos
  6. TODO: integrate IR sensors with the web interface.
  7. TODO: create instructions and install files for the web server
  8. TODO: make the server webpage look prettier
  • Sam's Tasks
  1. Find and order connectors
  2. Install Pixy camera libraries on the Blue and use position data to control servos.
  3. Send and visualize object position data on the web interface
  4. Mount the Blue to the robot
  5. Retrofit continuous rotation servos onto the wheels we were given, with help from Gary and Jack
  6. Integrate servos
  7. TODO: create instructions and scripts for setting up the Pixy camera
  8. TODO: set up a socket.io function call from the webpage to start the Pixy camera script automatically.
  9. TODO: make the server webpage look prettier (it's ugly and jumbled at the moment)
  • Incomplete tasks
  1. The motor drivers were not used because our motors operate on 5 V rather than the supplied 12 V. We decided that retrofitting continuous rotation servos was a better idea.

Future Work

  1. PID control for the object-follow script (currently only proportional control is implemented) would make the robot's movement less erratic; a rough sketch of the idea is shown after this list.
  2. Add a screenshot button and an object recognition button to the web interface so that new objects can be programmed in remotely, without being forced to use the hardware button on the Pixy.
  3. Implement a web programming interface like Blockly to make the Blue accessible to younger students.
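
For item 1, the proportional-only correction in the tracking loop would be replaced by a full PID loop. The sketch below shows the general shape of such a controller; the class name, gains, and update rate are assumptions, not code from this project.

 # Sketch of a PID correction for the object-follow script (item 1);
 # the gains and update rate here are assumed values.
 class PID:
     def __init__(self, kp, ki, kd):
         self.kp, self.ki, self.kd = kp, ki, kd
         self.integral = 0.0
         self.prev_error = 0.0
 
     def update(self, error, dt):
         # The integral term removes steady-state offset and the derivative
         # term damps the oscillation seen with proportional-only control.
         self.integral += error * dt
         derivative = (error - self.prev_error) / dt
         self.prev_error = error
         return self.kp * error + self.ki * self.integral + self.kd * derivative
 
 pid = PID(kp=0.01, ki=0.001, kd=0.005)
 # Example use inside the tracking loop, replacing the KP * error term:
 # correction = pid.update(x - CENTER_X, 0.02)
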

Conclusions

The BeagleBone Blue has some very good hardware built into it. One of the flaws with this project was that we just used what was readily available to us for interfacing with it; it would have been a lot more fun to pick out parts and build a platform that fully utilizes all of its functions. The board still being in an experimental state resulted in some interesting hardware issues. The ones we encountered were: sometimes the 12 volt jack is bumped and the entire board shuts down; if the Blue boots from USB power only and the 12 volt supply or battery is then plugged in, a few PWM signal pins stay at 3.3 volts, making PWM impossible on those pins (rebooting the Blue solves this); and whenever the voltage rail for the PWM is started, nothing can be attached to the rail or it will fail to reach 6 volts (the solution is to start with everything unplugged and then plug things back in). Aside from this the Blue worked as expected.



Embedded Linux Class by Mark A. Yoder