ECE597 Project Robot Control
Latest revision as of 04:12, 10 August 2015

Embedded Linux Class by Mark A. Yoder


Team members: Alexandre van der Ven de Freitas, Eric Taylor

Grading Template

I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5=OK, 10=Wow!

10 Executive Summary - Clear and to the point.  Nice diagram and pictures
10 Installation Instructions - Nice and detailed.  Nice readme.md on the repo
10 User Instructions
09 Highlights
10 Theory of Operation - Good
10 Work Breakdown
10 Future Work
09 Conclusions
10 Demo
10 Not Late
Comments: Nice project.  Too bad you didn't have time to program it some more.

Score:  98/100

Executive Summary

Our Robot

The objective of this project is to create an autonomous robot based on the BeagleBone Black. The robot's structure consists of two parallel rear-driven wheels, a guide ball on the front, three infrared rangefinders, and two digital rotary encoders.

The first type of sensor that interacts with the BBB is the Magician Robot encoder from SparkFun, which is working successfully and being read by a Python script.

The second type of sensor that can be read is the IR distance sensor. These can be a little tricky to deal with due to their non-linear nature, but mounting them 10 centimeters back from the robot's edge and using an equation to correlate voltage to distance works well.

In the end we want an autonomous robot that can find and hug walls using the IR sensors, and then use the rotary encoders to detect when the robot is in a stalled state. Alongside this, we made hardware in the form of a cape to hold all the needed components.

To the right you can see the completed robot hardware.

Packaging

Cape

Closeup of the Cape

For the packaging of the solution our team made a cape for the BBB that included all of the voltage dividers needed to properly read the sensors alongside the H-Bridge to drive the motors. The point of this cape is so that the user just has to plug all the inputs and outputs on the cape and the robot should be ready to go.
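The 22k and 10k resistor pairs in the bill of materials below appear to be the voltage dividers mentioned above. Assuming the Sharp sensors output up to roughly 3.1 V and the BBB analog inputs tolerate at most 1.8 V (these values are our reading of the design, not stated on this page), a quick sanity check of the ratio:

```python
# Sanity check of the assumed sensor voltage divider on the cape.
# Assumptions: Sharp 2Y0A21 peak output ~3.1 V, BBB ADC limit 1.8 V,
# divider built from the 22k (top) and 10k (bottom) resistors in the BOM.

R_TOP = 22_000     # ohms, between sensor output and ADC pin
R_BOTTOM = 10_000  # ohms, between ADC pin and ground

def divided(v_in, r_top=R_TOP, r_bottom=R_BOTTOM):
    """Output of a resistive divider for input voltage v_in."""
    return v_in * r_bottom / (r_top + r_bottom)

ratio = R_BOTTOM / (R_TOP + R_BOTTOM)
print(f"ratio = {ratio:.4f}, peak ADC voltage = {divided(3.1):.3f} V")
# The peak stays comfortably below the assumed 1.8 V analog input limit.
```

With these values the divider scales the sensor's full range to just under 1 V, so the reading never endangers the ADC pin.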

{| class="wikitable"
|-
| Bill of Materials || Number of Parts
|-
| 22k Ohm Resistor || 7
|-
| 10k Ohm Resistor || 7
|-
| 16 Pin IC Socket || 1
|-
| L293D H-Bridge || 1
|-
| 23-Pin Two Row DIP Header || 2
|-
| 7-Pin Single Row DIP Header || 2
|-
| 1-Pin Single Row DIP Header || 7
|-
| 4-Pin Single Row DIP Header || 1
|-
| 9V Battery Header || 1
|-
| Sharp 2Y0A21 IR Sensor || 3
|-
| Magician Chassis || 1
|-
| LifeCharge and Spliced USB to 5V Barrel Cable || 1
|}

Power

For power we have two parts: a 9 V battery to power the motors and a 5 V barrel connector to power the BBB and logic. For the 9 V battery you just have to buy one and attach it to the 9 V header. The barrel connector was slightly more complicated. We used a USB battery pack, but we needed a 5 V barrel input, so we took a mini USB cable, spliced it, and connected the power and ground to a two-wire-to-barrel connector. This allowed the robot to be portable and run on its own.

Breadboard Testing

Our initial testing of our circuit came through building it on a breadboard. Here is a schematic of our setup.

Breadboard Schematic

Installation Instructions

This project was developed in Python, which has a simple API created by Adafruit called BeagleBone Black IO (BBIO).

To install Adafruit's BBIO follow the instructions below. These installation instructions are for Debian and Ubuntu distributions:

1 - Update your Operating System and install the dependencies:

bone$ sudo apt-get update
bone$ sudo apt-get install build-essential python-dev python-setuptools python-pip python-smbus -y

2 - Next install the BBIO:

bone$ sudo pip install Adafruit_BBIO

3 - Verify if the BBIO was installed properly:

bone$ sudo python -c "import Adafruit_BBIO.GPIO as GPIO; print GPIO"
# You should see the following message:
<module 'Adafruit_BBIO.GPIO' from '/usr/local/lib/python2.7/dist-packages/Adafruit_BBIO/GPIO.so'>

If the previous installation method fails, install it manually:

1 - Update and install the dependencies:

bone$ sudo apt-get update
bone$ sudo apt-get install build-essential python-dev python-pip python-smbus -y

2 - Clone the BBIO's git repository and change into the directory it was downloaded into:

bone$ git clone git://github.com/adafruit/adafruit-beaglebone-io-python.git
bone$ cd adafruit-beaglebone-io-python

3 - Install the API and remove the installation file:

bone$ sudo python setup.py install
bone$ cd ..
bone$ sudo rm -rf adafruit-beaglebone-io-python

The specifics on how to use this API in Python can be found in the documentation provided by Adafruit.

The Python scripts used in the project can be found in this git repository: https://github.com/vaddera/Robot.git

For installing the hardware: on a breadboard, wire both the infrared distance sensors and the L293D as seen in the schematic figure. For installing the cape, simply attach it on top of the BeagleBone Black, wire each sensor properly, and seat the L293D in the integrated circuit socket.

Do not forget to power the Bone with a battery pack similar to the LifeCHARGE JUICYPACK.

User Instructions

Programming Tools and Instructions

Our group used Python as the base for our program, and you can do multiple things with the code base that is on GitHub.

motorControl.py

   - In our motor control code we set up functions that the user can call to set the PWM and make the robot move in specific directions. The following are included in the file:
   stop()
       -This command stops all PWM pins thus stopping the robot.
   forward()
       - This command starts moving the robot forward. It does not stop the robot.
   backward()
       - This command starts moving the robot backward. It does not stop the robot.
   left()
       - This command starts turning  the robot left using one motor.
   right()
       - This command starts turning the robot right using one motor.
   leftb()
       - This command starts turning the robot left using both motors.
   rightb()
       - This command starts turning the robot right using both motors.
   right_delay(delay)
       - This command turns the robot right for a specific amount of time, delay, using only one motor.
   left_delay(delay)
       - This command turns the robot left for a specific amount of time, delay, using only one motor.
   rightb_delay(delay)
       - This command turns the robot right for a specific amount of time, delay, using both motors.
   leftb_delay(delay)
       - This command turns the robot left for a specific amount of time, delay, using both motors.
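The *_delay helpers above presumably wrap the continuous commands with a timed stop. A minimal sketch of that pattern with the PWM calls stubbed out (the real motorControl.py uses Adafruit_BBIO.PWM, which needs the hardware; the stub here is ours):

```python
import time

log = []  # stand-in for PWM output so the sketch runs without hardware

def right():
    """Start turning right using one motor (stubbed)."""
    log.append("right")

def stop():
    """Stop all PWM pins, stopping the robot (stubbed)."""
    log.append("stop")

def right_delay(delay):
    """Turn right for `delay` seconds, then stop - the pattern the
    *_delay helpers in motorControl.py likely follow."""
    right()
    time.sleep(delay)
    stop()

right_delay(0.01)
print(log)  # -> ['right', 'stop']
```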

run

   - This small program is used to test the motor control functions and to confirm the motor pins are connected correctly.

IRread.py

   - This program reads the IR sensor values off the analog pins and returns them, and also calculates distance from those values. This file contains two useful functions:
   Out1, Out2, Out3, Out4, Out5, Out6 = IRread()
   distance = distanceCalc(out)

read

   - This small program prints out the values read from the IR sensors. This is useful for finding the correct voltage multiplier and judging distances with the sensor; essentially, seeing what the robot sees.

AI_seq.py

This file contains our basic AI. This program finds a wall and then hugs it. If the robot gets too far from the wall, it unhugs it and goes to find another wall.

       The following options change the settings for following a wall:
       frontLow - This is the lower bound on the distance you want your front sensor to read. The higher the number, the larger the buffer you give the robot.
       rightLow - This is the lower bound on the distance you want your right sensor to read. The higher the number, the larger the buffer you give the robot.
       leftLow  - This is the lower bound on the distance you want your left sensor to read. The higher the number, the larger the buffer you give the robot.
       rightHigh - When following a wall, this is the max distance you want the robot to get from the wall when the wall is on the right side of the robot.
       leftHigh - When following a wall, this is the max distance you want the robot to get from the wall when the wall is on the left side of the robot.
       farfromright - This is the distance at which you want your robot to give up and start finding another wall on the right side.
       farfromleft  - This is the distance at which you want your robot to give up and start finding another wall on the left side.
       The following two settings deal with the motor control of the robot. The farther apart they are, the faster it turns away from or toward a wall. The higher the numbers, the faster the overall operation.
       low_PWM  - When turning away from the wall, the robot slows a motor down to this low_PWM to make a slight turn.
       high_PWM - This is the max level the motors run at.
       The following variable controls the turn delay used to make a 90 degree turn:
       ninty - Sets the delay when turning to create a full 90 degree turn.

How to Run on the Robot

We used a portable USB pack and constructed a cable to attach it to the barrel connector. This lets you program the robot, unplug the USB, and then watch it run on the floor/test area.

   Steps
   1. Attach both the USB and barrel connector from the portable battery pack
   2. ssh into your Bone
   3. Add a delay to the start of your program using the time library (time.sleep(delay))
   4. Run the python program using the 'python' command
   5. Unplug the USB and set your robot on the floor
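Step 3's startup delay might look like the following sketch (the delay value and function name are ours, chosen for illustration):

```python
import time

STARTUP_DELAY = 10  # seconds: example time to unplug the USB and place the robot

def wait_before_start(delay=STARTUP_DELAY):
    """Pause before the AI takes over so the robot can be disconnected
    and set down on the floor."""
    time.sleep(delay)

# At the top of the main script you would call:
#   wait_before_start()
# ...then enter the wall-following loop from AI_seq.py.
```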


Highlights

Our robot was able to successfully avoid crashing into walls, then find and follow them. Demo of avoiding walls: https://www.youtube.com/watch?v=bTh0IXlKXHM

Theory of Operation

The software is divided into three distinct states. The first state is entered when the robot has not yet detected any walls or when it moves too far from a wall. In this state, the robot moves forward until a wall is detected by the front sensor. When a wall is detected, the robot decides, based on which side sensor is reading the larger distance to the next obstacle, to turn 90 degrees in that direction. Once a direction is taken, it changes to one of the two wall-hugging states: cling to the wall on the right or on the left. These two follow essentially the same logic, just mirrored. When the robot's distance to the wall is between the lower and higher limits, it moves forward. If the robot gets too close to the wall (a distance smaller than the lower limit), it turns slightly in the opposite direction. If the robot gets too far from the wall (a distance bigger than the higher limit), it turns slightly toward the wall. If the robot moves much farther from the wall than the higher limit, the logic changes back to the initial state.
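The three states described above can be sketched as a small state machine. The threshold names mirror the AI_seq.py options listed earlier, but the numeric values and the `step` function are illustrative, not the project's actual code:

```python
# Illustrative state machine for the three states described above.
# Threshold names follow AI_seq.py; the values are made up for this sketch.
frontLow, rightLow, rightHigh, farfromright = 15, 10, 25, 45  # cm

FIND_WALL, HUG_RIGHT, HUG_LEFT = "find_wall", "hug_right", "hug_left"

def step(state, front, right, left):
    """Return (next_state, action) from one set of sensor distances in cm."""
    if state == FIND_WALL:
        if front > frontLow:
            return FIND_WALL, "forward"
        # Wall ahead: turn 90 degrees toward the side with more room.
        return (HUG_LEFT, "turn_left") if left > right else (HUG_RIGHT, "turn_right")
    if state == HUG_RIGHT:
        if right > farfromright:   # lost the wall entirely: back to searching
            return FIND_WALL, "forward"
        if right < rightLow:       # too close: ease away from the wall
            return HUG_RIGHT, "veer_left"
        if right > rightHigh:      # too far: ease back toward the wall
            return HUG_RIGHT, "veer_right"
        return HUG_RIGHT, "forward"
    # HUG_LEFT mirrors HUG_RIGHT with the sides swapped (omitted for brevity).
    return state, "forward"

print(step(FIND_WALL, front=10, right=20, left=40))  # turns toward the open left side
```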

For the infrared distance sensors, due to their non-linear nature, we use the equation distance = 41.543(volt + 0.30221)^-1.5281, which works for distances between 10 cm and 80 cm. Because of this, it is advisable to mount the sensors 10 cm back from the edge of the robot, but nothing stops the user from defining the scales accordingly so the robot never reaches distances below 10 cm.
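The linearization above is presumably what IRread.py's distanceCalc implements. As a standalone function (the function name here is ours; whether `volt` is the raw or divided reading follows the authors' convention):

```python
def distance_cm(volt):
    """Convert an IR sensor reading in volts to distance in cm using the
    fit from the text: distance = 41.543 * (volt + 0.30221) ** -1.5281.
    Valid roughly between 10 cm and 80 cm."""
    return 41.543 * (volt + 0.30221) ** -1.5281

# Higher voltage means a closer object, so the curve is strictly decreasing:
for v in (0.5, 1.0, 2.0):
    print(f"{v:.1f} V -> {distance_cm(v):.1f} cm")
```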

Work Breakdown

The major tasks in the project, and who did what:

{| class="wikitable"
|-
| 1. Organize Breadboard circuit and test it: Alex. || Completed || Completed
|-
| 2. Create codes to read the sensors and test the wheels: Alex || Completed || Completed
|-
| 3. Create main code to run the robot: Alex and Eric. || Completed || Completed
|-
| 4. Design the PCB for the cape: Eric. || Completed || Completed
|-
| 5. Solder the cape and test it: Eric. || Completed || Completed
|-
| 6. Final robot assembly: Alex and Eric. || Completed || Completed
|}

Future Work

There are a lot of things you could do with this project in terms of your own programs using the motorControl and IRread functions.

In terms of this project, you could add more calibration to the motors, because one is noticeably slower than the other. We didn't take this difference into account when implementing our algorithm.

The IR sensor values are noisy; one could find a way to get a smoother stream of inputs.

The rotary encoders could be used more to make more accurate turning and control.
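One way the encoders could flag the stalled state mentioned in the summary: if the tick count barely changes over a window of samples while the motors are commanded on, declare a stall. A software-only sketch (function and parameter names are ours):

```python
def detect_stall(tick_counts, motors_on=True, min_ticks=2):
    """Return True if the encoder barely advanced across a window of
    samples while the motors were commanded on.

    tick_counts: cumulative encoder counts sampled at a fixed interval.
    min_ticks:   minimum advance over the window to count as moving.
    """
    if not motors_on or len(tick_counts) < 2:
        return False
    return (tick_counts[-1] - tick_counts[0]) < min_ticks

print(detect_stall([100, 100, 101]))  # barely moving -> True
print(detect_stall([100, 110, 120]))  # rolling freely -> False
```

On detecting a stall, the AI could back up and re-enter the wall-finding state instead of grinding against an obstacle.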

Conclusions

This project, although it seems simple, required a lot of debugging skill on both the software and hardware sides. A good number of hours were spent tracking down an internal short in the H-bridge that caused two BeagleBone Blacks to be burned out. Putting these problems aside, this project is a great opportunity to explore Adafruit's BBIO Python API, especially the PWM features, and to learn more about how the IR sensors used in the project work. The project could be upgraded to use the encoders to detect each wheel's speed, balancing the PWM cycles to make its motion more stable. The same encoders could be used to detect when the robot gets stuck on something, triggering a timeout routine to attempt to free it. A line sensor would also be beneficial if the robot is put in a labyrinth, letting it find its way out using the line as a center reference.
