ECE597 Fall2014 LED Helmet


Embedded Linux Class by Mark A. Yoder


Team members: Asa Bromenschenkel, Caio Silva

Grading Template

I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5=OK, 10=Wow!

00 Executive Summary
00 Installation Instructions 
00 User Instructions
00 Highlights
00 Theory of Operation
00 Work Breakdown
00 Future Work
00 Conclusions
00 Demo
00 Late
Comments: I'm looking forward to seeing this.

Score:  10/100

(Inline Comment)

Executive Summary

This project uses LED arrays on a wearable headpiece that mimics the helmets worn by the group Daft Punk. Along with displaying various patterns, the arrays can also be modified based on an accompanying accelerometer and on interactions with Twitter.

The project uses three different APIs to implement its features. The LEDscape API starts a server that receives pixel data from connected clients, then translates that data and sends it to the LED arrays. openpixelcontrol is used to create the patterns that are sent to LEDscape. Finally, TwitterAPI is used to read the user's Tweets and display them on the arrays.

We currently have a working prototype which can be seen in the conclusion section.

Packaging

Two 10x10 LED arrays have been mounted onto the front of a baseball hat just above the brim. These arrays were constructed on the Rose-Hulman campus and include small holes in the corners which were used to sew the arrays to the hat.

A communications cable was created to allow the helmet to be worn at a distance from the BeagleBone Black. One end has male header pins to connect to the BeagleBone, and the other end has female header pins so the arrays and accelerometer can be connected and disconnected without being permanently attached.

Because we do not have a functioning WiFi adapter to give the BeagleBone internet access when it is not attached to a host computer, we have not packaged the project in a portable manner. It remains attached to the prototyping breadboard used for class.

Installation Instructions

  1. Plug the BeagleBone into the host computer via USB.
  2. Wire the data wire of the first NeoPixel LED array to P_22 on the Beagle. Wire the data wire of the second LED array to P_21 on the Beagle.
  3. Install the TwitterAPI package by running:
     pip install TwitterAPI
  4. Wire a Sparkfun ADXL335 3-axis accelerometer to the Beagle following the wiring diagram (Bbb accelerometer image.png).
  5. Follow the User Instructions to run the code.

User Instructions

The use of this project requires a USB connection to a host computer and two terminals. From the first, navigate to the LEDscape directory (cloned from https://github.com/Yona-Appletree/LEDscape.git) and run the following commands to set up the server:

# enable the PRU, which LEDscape uses to drive the LED arrays
echo BB-BONE-PRU-01 >/sys/devices/bone_capemgr.9/slots
# build LEDscape
make
# install the LEDscape systemd service (adjust the path to your LEDscape checkout)
sudo systemctl enable /path/to/LEDscape/ledscape.service
# start the LEDscape server
sudo ./run-ledscape

Running the Twitter Connection:

To run the interaction between the Beagle and Twitter, navigate in the second terminal to the openpixelcontrol directory (cloned from https://github.com/zestyping/openpixelcontrol.git) and run the following line:

python_clients/<pattern_file>.py

where <pattern_file> is the name of the file the pattern is stored in.
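
As a rough illustration of what such a pattern file contains, here is a minimal sketch. It assumes the opc.py client module shipped in openpixelcontrol's python_clients directory and a LEDscape/OPC server listening on localhost:7890; the 200-pixel count (two 10x10 grids) and the solid-color cycling are only examples, not one of the project's actual pattern files.

# Minimal openpixelcontrol client sketch (illustrative only).
import time
import opc

NUM_PIXELS = 200                       # two 10x10 grids (assumed layout)
client = opc.Client('localhost:7890')  # assumed OPC server address/port

# Cycle the whole display through a few solid colors.
colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
while True:
    for color in colors:
        client.put_pixels([color] * NUM_PIXELS)
        time.sleep(1)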

Running the accelerometer:

If you wish to use the accelerometer, navigate to the accelerometer directory (cloned from https://github.com/caiocvsilva/ECE597-2.git) and run the following line:

node ac.js
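
The project's accelerometer handling is done in ac.js, but the idea can also be sketched in Python. The fragment below is only a hedged sketch, not the project's code: it assumes the Adafruit_BBIO library, analog pins P9_36/P9_38/P9_40 for the ADXL335's X/Y/Z outputs (match these to your own wiring), and the same opc.py client as above.

# Rough Python sketch of the accelerometer-to-color idea (the project's
# real implementation is the JavaScript ac.js above).
import time
import Adafruit_BBIO.ADC as ADC
import opc

ADC.setup()                            # enable the BeagleBone's analog inputs
client = opc.Client('localhost:7890')  # assumed OPC server address/port
NUM_PIXELS = 200                       # two 10x10 grids (assumed layout)

while True:
    # ADC.read() returns a value between 0.0 and 1.0 for each axis
    # (the pin names are assumptions -- match them to your wiring).
    x = ADC.read("P9_36")
    y = ADC.read("P9_38")
    z = ADC.read("P9_40")
    # Map each axis onto one color channel and fill both grids with it.
    color = (int(x * 255), int(y * 255), int(z * 255))
    client.put_pixels([color] * NUM_PIXELS)
    time.sleep(0.1)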

Highlights

Besides the concepts behind the use of the LEDscape library, the openpixelcontrol library, the Twitter API, and the accelerometer data on the BeagleBone, one of the most important characteristics of this project is the interface between all of these components. Together they allow us to create an LED helmet that interacts with the user, through Tweets or movements of the wearer's own head, changing the color or the content shown on the helmet.

Theory of Operation

This project has four components that interact to form the full system. The central component is the LEDscape server. LEDscape takes data from sources that connect to it and translates that data into signals for the NeoPixel LED arrays. Next are the openpixelcontrol Python clients that connect to the LEDscape server. These create the patterns displayed on the arrays by assigning RGB values to each pixel. The Python clients also interact with Twitter using TwitterAPI to receive the latest Tweet from the user and display it as scrolling text. The final component is the JavaScript code, which accesses the accelerometer and changes the color of the LEDs based on its rotation.
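
To make the Twitter piece concrete, the following is a hedged sketch of how the latest Tweet can be fetched with the TwitterAPI package; the credentials and screen name are placeholders, and the scrolling-text rendering done by the project's pattern files is only stubbed out with a print.

# Hedged sketch of the Twitter side of the pipeline (not the project's code).
from TwitterAPI import TwitterAPI

# Placeholder credentials -- use your own Twitter application keys.
CONSUMER_KEY = 'your-consumer-key'
CONSUMER_SECRET = 'your-consumer-secret'
ACCESS_TOKEN_KEY = 'your-access-token'
ACCESS_TOKEN_SECRET = 'your-access-token-secret'

api = TwitterAPI(CONSUMER_KEY, CONSUMER_SECRET,
                 ACCESS_TOKEN_KEY, ACCESS_TOKEN_SECRET)

# Ask for the single most recent Tweet on the user's timeline.
r = api.request('statuses/user_timeline',
                {'screen_name': 'some_user', 'count': 1})
for tweet in r:
    text = tweet['text']
    # In the project this text is rendered as scrolling pixels by the
    # openpixelcontrol pattern files; here it is only printed.
    print(text)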

Work Breakdown

This project was separated into small blocks that could be easily executed and later combined to finish the project.
The first block was to understand how LEDscape and openpixelcontrol work with an LED grid, displaying different patterns and colors on it.
After that, two further blocks were studied: interfacing the Twitter API with the LEDscape library, and using an accelerometer to change the patterns shown on the grid.
With these two blocks working, it was time to interface them. For the final part of the project a helmet was chosen, and the programs and libraries were adapted to work with two grids.
The helmet was prepared and connected to the BeagleBone, and the LED Helmet was finalized.

Future Work

An idea for future work is to add a sensor to the system that allows the helmet to follow someone who gets close to it, and to draw a pattern that responds to this data.
For example, two big eyes could be drawn that follow the person from side to side, and even cross if he/she gets too close.
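
As a purely illustrative sketch of this idea (no such sensor exists in the current project), the fragment below maps a hypothetical direction reading between -1.0 (far left) and +1.0 (far right) onto the column of a 10x10 grid where a simple "eye" would be drawn.

# Illustrative sketch for the future-work idea: place a 2x2 "pupil" on a
# 10x10 grid based on an assumed person-direction reading.
GRID_W, GRID_H = 10, 10

def eye_pixels(direction, color=(255, 255, 255)):
    # direction: -1.0 = far left, +1.0 = far right (hypothetical sensor value)
    col = int((direction + 1.0) / 2.0 * (GRID_W - 2))  # leave room for the 2-wide pupil
    pixels = [(0, 0, 0)] * (GRID_W * GRID_H)
    for dy in range(2):
        for dx in range(2):
            pixels[(4 + dy) * GRID_W + (col + dx)] = color
    return pixels

# Example: person detected slightly to the right of center.
frame = eye_pixels(0.3)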

Conclusions

After all the setup and a lot of hard work, most of the project was successfully finished, allowing the team to create the interface to the LEDscape library using openpixelcontrol, the accelerometer, and the Twitter API.
Each part was developed individually and combined in the final stage of the project, producing a single device with all the components.
The final product has two LED grids and an accelerometer attached to a hat, with a connection to the BeagleBone that is long enough for the user to put the BeagleBone in a pocket and, with an extra battery, carry it around.
The grids are controlled by the LEDscape library and can show different patterns chosen through the openpixelcontrol library, or they can scroll the text of Tweets across the grid. The patterns can also respond to accelerometer movement, since a change in the values read from it can trigger different patterns on the grid.

These are some photos of the result of the project:

Helmet eyes.jpg
Helmet rose.jpg


And some videos demonstrating the result:

https://www.youtube.com/watch?v=xqu2p-e1CKA

https://www.youtube.com/watch?v=QGg0AZK6Wac