Archive for the ‘Progress’ Category

Welcome to the Jungle, Baby!

We finally got all five “fingers” working and can now play songs on Hard and Expert difficulty. We thought it was appropriate for Slashbot’s first full performance to be “Welcome to the Jungle”. We are happy with the results: 80% accuracy. Once we have the “strum” actuator mounted we expect our scores to improve. More videos to come!


Update with Pics & Video of Testing

Only two weeks remain before Demo Day and we’ve been very busy getting Slashbot functioning properly. The actual robotic portion of the project is complete and we are in the final testing stages. The software (all LabVIEW!) has gone through many revisions over the past month and, like the hardware, it is being tested for final release.

There has been one major software modification within the last week. Reluctantly, we removed the functionality that allowed the system to detect the tempo of the song and adjust delays accordingly. The original motivation for this feature stemmed from the delay between note detection and the moment the note should actually be played. As implemented, the tempo detection was unreliable, so in the interest of a robust design we replaced it with a manual delay adjustment control. It is slightly less elegant, but it makes Slashbot much more reliable.
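The idea behind the manual delay control can be sketched in a few lines. Our actual implementation is in LabVIEW, so the Python below is only an illustration, and all the names in it (PressScheduler, delay_s, on_detect, due) are made up for this sketch: detections go into a queue, and each button press is released a fixed, operator-tuned delay after its note was detected.

```python
import heapq

class PressScheduler:
    """Illustrative sketch only: queue detected notes and release each
    button press a fixed, manually tuned delay after detection."""

    def __init__(self, delay_s):
        self.delay_s = delay_s  # the manual delay control, tuned by hand
        self._queue = []        # min-heap of (press_time, button)

    def on_detect(self, button, t_detect):
        # A note was detected at t_detect; press its button delay_s later.
        heapq.heappush(self._queue, (t_detect + self.delay_s, button))

    def due(self, t_now):
        # Return every button whose scheduled press time has arrived.
        ready = []
        while self._queue and self._queue[0][0] <= t_now:
            ready.append(heapq.heappop(self._queue)[1])
        return ready
```

For example, with a 0.5 s delay, a green note detected at t = 10.0 s would not fire the actuator until t = 10.5 s. Dialing that single number in by hand turned out to be far more dependable than estimating it from the song's tempo.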

Below are some pictures and a video of our current setup.

The controller apparatus – we definitely plan to clean up the presentation

A closeup of the solenoid actuators used to press the controller buttons.

A very short demo of the first test of our “working” system. We still need to make plenty of tweaks. This is a video of Slashbot playing on Medium difficulty. Our circuit was not completely working at the time this video was shot so we could not play with more than four buttons (which would allow Hard and Expert play modes).

We plan to have more complete and hopefully more impressive videos up soon!

It’s Alive!

Up to this point, Slashbot has been divided into two major components: the “brain” and the “body”. The brain consists of our PXI system and LabVIEW code which analyzes the video signal. The body is a discrete circuit that amplifies a digital signal to fire a solenoid actuator used to press buttons on the Guitar Hero game controller.

For the first time, we connected the body to the brain. Currently the robotic circuit only utilizes a single actuator. Now that we have verified the circuit performs correctly, it will be duplicated five times (once for each of the remaining controller buttons and once for the strum bar). We were very excited to see both parts of our system working in tandem. The robotic circuit was connected to the detection line of a single colored note (green). Every time a green note passed through our detection area on the screen the actuator would press the green button on the game controller.

The next step is to duplicate the circuit for all buttons and the strum bar. We also need to tweak the timing of the system so the buttons are pressed at the correct time in the game and not when they are immediately detected.

NI Week Demo Submission

We have submitted our project to be considered in the NI Week Demo contest. NI Week will be held August 5-7, 2008 in Austin, Texas. The convention brings together a community of scientists, engineers, and educators to discuss developments in their technical fields and to exhibit how NI products can further such developments.

Users who have created innovative applications and solutions with NI products are asked to submit demos. If chosen, we will have the opportunity to exhibit Slashbot! Here’s the video we submitted, which shows our current progress and where we are headed.

Progress – Detection VI Complete

We have just completed our note detection virtual instrument (VI) in LabVIEW. The VI divides the input video signal into small detection waveforms (one for each colored note) and sets level triggers. If the voltage rises above the trigger point for a note, that note’s indicator turns on.

The screenshot below illustrates the VI in action. Both the red and blue notes are detected so the indicators are turned on (note: even though the blue appears a bit dark, it is in fact on). The spikes seen on the edges of the waveform are outside of our detection area so they are not being monitored. The pixel information contained in that portion corresponds to the white outlines of the fret board.
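For readers who don’t use LabVIEW, the trigger logic above can be sketched in ordinary code. This is only an illustration of the idea, not our VI: the window boundaries, threshold value, and names below are invented for the sketch. Each colored note gets a window of samples along the scanline, samples outside those windows (such as the fretboard-outline spikes at the edges) are simply never examined, and a note is flagged when its window exceeds the trigger level.

```python
# Sample windows (start, end) along the scanline for each colored note.
# Values are illustrative; they are chosen so the bright white fretboard
# outlines at the edges of the line fall outside every window.
WINDOWS = {
    "green":  (10, 20),
    "red":    (20, 30),
    "yellow": (30, 40),
    "blue":   (40, 50),
    "orange": (50, 60),
}
THRESHOLD = 0.7  # level trigger: voltage above this means "note present"

def detect_notes(scanline):
    """Return the set of note colors whose window exceeds the trigger level."""
    hits = set()
    for color, (start, end) in WINDOWS.items():
        if max(scanline[start:end]) > THRESHOLD:
            hits.add(color)
    return hits
```

Because only the per-note windows are scanned, large spikes at the edges of the waveform never produce a detection, which matches the behavior shown in the screenshot.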

Detection VI Front Panel