ECE597 Fall2014 LED Helmet


Embedded Linux Class by Mark A. Yoder


Team members: Asa Bromenschenkel, Caio Silva

Grading Template

I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5=OK, 10=Wow!

09 Executive Summary
10 Installation Instructions - Nice diagram
09 User Instructions
08 Highlights
08 Theory of Operation
10 Work Breakdown
10 Future Work - Good ideas
09 Conclusions
10 Demo - Nice demo
10 Not Late
Comments: Thanks for fixing it up.

Score:  93/100

Executive Summary

This project uses LED arrays on a wearable headpiece mimicking those used by the group Daft Punk. Along with the ability to display various patterns, the arrays can also be modified based on an accompanying accelerometer and interactions with Twitter.

The project uses three different APIs to implement all of its features. We use the LEDscape API to start a server that accepts incoming data, translates it, and sends it to the LED arrays. openpixelcontrol is used to create the patterns that are sent to LEDscape. Finally, the TwitterAPI is used to interact with the user's Tweets and display them on the arrays.

We currently have a working prototype which can be seen in the conclusion section.

Packaging

Two 10x10 LED arrays, as shown below, have been mounted onto the front of a baseball hat just above the brim. These arrays were constructed on the Rose-Hulman campus and include small holes in the corners, which were used to sew the arrays to the hat.

GridLED.jpg
GridLEDback.jpg

A communications cable was created to allow the helmet to be worn away from the BeagleBone Black. One end has male header pins to connect to the BeagleBone, and the other end has female header pins so the arrays and accelerometer can be connected and disconnected without being permanently attached.

LEDcables.jpg

Because we do not have a functioning WiFi adapter to give the BeagleBone internet access when it is not attached to a host computer, we have not packaged it in a portable manner. It remains attached to the prototyping breadboard used for class.

In addition, a SparkFun ADXL335 accelerometer was integrated into the project, allowing the grids to be controlled with head movements. The accelerometer can be stored anywhere on the hat. The following picture shows the accelerometer and its connections, which also use the external cable from the grids.

LEDaccelerometer.jpg

Installation Instructions

  • Plug the BeagleBone into the host computer via USB.
  • Wire the first neopixel LED array data wire to P_22 on the Beagle. Wire the second LED array data wire to P_21 on the Beagle.
  • Run the following line to install the TwitterAPI:
 pip install TwitterAPI
  • Also wire a SparkFun ADXL335 3-axis accelerometer to the Beagle with the X, Y, and Z pins connected to P_36, P_38, and P_40 respectively (a quick wiring check is sketched after this list).
  • The final wire diagram is shown below.
Nice diagram
LEDHelmet2014.png
  • Follow User Instructions to run the code.
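
To sanity-check the accelerometer wiring, the analog pins can be read directly. The snippet below is only a sketch: it uses the Adafruit_BBIO Python library rather than the project's JavaScript client, and assumes P_36, P_38, and P_40 refer to the P9 header analog inputs.

 # Hedged wiring check for the ADXL335 (assumption: P9 header analog pins, Adafruit_BBIO installed).
 import Adafruit_BBIO.ADC as ADC
 ADC.setup()
 for pin in ("P9_36", "P9_38", "P9_40"):        # X, Y, Z as wired above
     print('%s: %f' % (pin, ADC.read(pin)))     # read() returns a value between 0.0 and 1.0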

User Instructions

The use of this project requires the USB connection to a host computer and two terminals. From the first, navigate to the LEDscape directory (cloned from https://github.com/Yona-Appletree/LEDscape.git) and run the following commands to set up the server:

 echo BB-BONE-PRU-01 >/sys/devices/bone_capemgr.*/slots
 make
 sudo systemctl enable /path/to/LEDscape/ledscape.service
 sudo ./run-ledscape

This code has been packaged into a shell script that can be found in the Project subdirectory of [1]. The file is named project; once copied to the root directory of the Beagle, it can be run by typing:

 sudo ./project

Running the Twitter Connection:

To run the interaction between the Beagle and Twitter, in the other terminal navigate to the openpixelcontrol directory (cloned from https://github.com/zestyping/openpixelcontrol.git) and run the following line:

 python_clients/<pattern_file>.py

where <pattern_file> is the name of the file the pattern is stored in.

The project's Python files can be found in the Project/python_clients subdirectory of [2]. This directory can replace the python_clients directory in openpixelcontrol. The main code can then be run:

 python_clients/test.py -p <pattern> -t <optional_text>

where <pattern> is the pattern to be displayed and <optional_text> lets you provide your own text to be displayed with the textScroll pattern. All options can be seen by running:

 python_clients/test.py -h
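
As a rough illustration of how these options fit together, the sketch below shows one plausible way test.py could parse them with argparse. This is an assumption about the structure, not the actual source in the repository, and the long option names are placeholders.

 # Sketch only: a plausible argparse setup matching the options shown above.
 import argparse
 parser = argparse.ArgumentParser(description='Send a pattern to the LEDscape/OPC server')
 parser.add_argument('-p', '--pattern', required=True,
                     help='name of the pattern to display (e.g. textScroll)')
 parser.add_argument('-t', '--text', default=None,
                     help='optional text to scroll with the textScroll pattern')
 args = parser.parse_args()
 print('pattern: %s, text: %s' % (args.pattern, args.text))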

Running the accelerometer:

If you wish to use the accelerometer, navigate to the accelerometer directory (cloned from https://github.com/caiocvsilva/ECE597-2.git) and run the following line:

 node ac.js

Highlights

English is a bit rough in this paragraph.

The highlight of this project is the intercommunication between all of its separate components. The LEDscape server, the openpixelcontrol library, the TwitterAPI, and the accelerometer all interact to create the LED Helmet. This gives the end user the ability to display various patterns, scroll the text of Tweets from Twitter, and change the display colors by tilting the accelerometer.

Theory of Operation

Could you add more details here? For example:
How do sources connect to LEDscape?
How are the patterns created that are displayed?
How does the JavaScript talk to the display?

This project has four components that interact to form the complete system. The central component is the LEDscape server. LEDscape takes data from sources that connect to it and translates the data into the information needed by the neopixel LED arrays. Any source can connect to the server through its IP address and port. In this project, since we are communicating on the same machine rather than over a network, the clients use the local IP address 127.0.0.1 and the default port 7890.
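
To make that connection concrete, the sketch below shows what an Open Pixel Control client sends over that socket: a small header (channel, command, data length) followed by the raw RGB bytes. This is a minimal hand-rolled example for illustration, assuming the standard OPC message layout; the real clients use the helper library described next.

 # Hedged sketch: one raw OPC "set pixel colors" message sent to the LEDscape server.
 import socket, struct
 pixels = [(255, 0, 0)] * 100                      # 100 red pixels (one 10x10 grid)
 data = bytes(bytearray(v for p in pixels for v in p))
 header = struct.pack('>BBH', 0, 0, len(data))     # channel 0, command 0 = set pixels, data length
 sock = socket.create_connection(('127.0.0.1', 7890))
 sock.sendall(header + data)
 sock.close()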

Next are the openpixelcontrol Python clients that connect to the LEDscape server. These create the patterns which will be displayed on the arrays by assigning RGB values to each pixel. This produces an array of tuples, each holding a red, green, and blue value in that order, with values from 0 to 255. This array is then translated by the openpixelcontrol library into the format required by LEDscape and sent to the server's address.
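
For example, a minimal pattern client built on the opc.py helper that ships in openpixelcontrol's python_clients directory might look like the sketch below; the gradient is just a placeholder pattern and the pixel count is an assumption for two 10x10 grids.

 # Hedged sketch of a pattern client using openpixelcontrol's opc.py helper.
 import opc
 NUM_PIXELS = 200                        # assumption: two 10x10 grids
 client = opc.Client('127.0.0.1:7890')   # same address and port as the LEDscape server
 pixels = []
 for i in range(NUM_PIXELS):             # simple red-to-blue gradient as a placeholder
     r = int(255 * i / float(NUM_PIXELS - 1))
     pixels.append((r, 0, 255 - r))      # (red, green, blue), each 0-255
 client.put_pixels(pixels)               # translate to OPC format and send to the server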

The Python clients also interact with Twitter using the TwitterAPI to receive the latest Tweet from the user and display it as scrolling text. The final component is the JavaScript code, which reads the accelerometer and changes the color of the LEDs based on its rotation.
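
As an illustration of the Twitter half, fetching the user's most recent Tweet with the TwitterAPI package could look roughly like this; the credentials and screen name below are placeholders, and the real client in the repository may structure the call differently.

 # Hedged sketch: fetching the latest Tweet with the TwitterAPI package.
 from TwitterAPI import TwitterAPI
 # Placeholder credentials: substitute your own Twitter application keys.
 api = TwitterAPI('CONSUMER_KEY', 'CONSUMER_SECRET',
                  'ACCESS_TOKEN_KEY', 'ACCESS_TOKEN_SECRET')
 r = api.request('statuses/user_timeline', {'screen_name': 'someuser', 'count': 1})
 for item in r:
     print(item['text'])                 # text handed to the textScroll pattern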

Work Breakdown

Please tell me who did what.

The first priority of the project was researching and studying the various possibilities for interfacing LEDs with the BeagleBone. Both students worked hard to figure out the best approach and to come up with the LED Helmet idea.

The project was separated into small blocks that could be executed individually by both students and later combined to finalize the project.

The first block was studied by both students. It consisted of understanding how LEDscape and openpixelcontrol worked with the LED grid, displaying different patterns and colors on the grid.

After that was understood, two different blocks were studied: the interface between the TwitterAPI and the LEDscape server, mostly developed by Asa Bromenschenkel, and the use of an accelerometer to change the patterns shown by the grid, developed by Caio Silva.

With much of the work on these two blocks complete, it was time to integrate them. For the final part of the project a hat was chosen and the programs and libraries were adapted to work with two grids. The helmet was prepared and connected to the BeagleBone, and the LED Helmet was finalized. This last phase of the project was coordinated and executed by both students.

Future Work

One idea for future work is to add a sensor that detects someone approaching the helmet and to draw a pattern that responds to this data. As an example, two big eyes could be drawn that follow the person from side to side, and even cross if he or she gets too close. The eye pattern has already been created and works with a sequence of commands, making the eyes look to the sides and blink. Another improvement would be to make the helmet more portable, using a WiFi adapter and an extra battery.

Conclusions

After all the setup and some hard work, most of the project was successfully finished, allowing the team to create an interface to the LEDscape library using openpixelcontrol, the accelerometer, and the TwitterAPI. Each of the parts was developed individually and combined in the final stage of the project, resulting in a single device with all the components. The final product has two grids of LEDs and an accelerometer attached to a hat, connected to the BeagleBone by a cable long enough to let the user put the BeagleBone in a pocket, making the system portable when an extra battery is used. The grids are controlled using the LEDscape library and can show different patterns chosen through the openpixelcontrol library, or can scroll the text of Tweets across the grid. The patterns can also respond to the accelerometer's movement, since a change in the values read from it can trigger different patterns on the grid.

These are some photos of the result of the project:

Nice Videos
Helmet eyes.jpg
Helmet rose.jpg


And some videos demonstrating the project:

https://www.youtube.com/watch?v=xqu2p-e1CKA

https://www.youtube.com/watch?v=QGg0AZK6Wac