Slashbot made a nice splash on the internet last year.
Slashbot is now officially owned by the Texas A&M University Electrical Engineering Department. One original intent of the project was to inspire young students and promote engineering. The robot will continue to do that as Texas A&M will demonstrate it at recruiting events in the future.
Please continue to share your comments about Slashbot. We still love hearing what people think: the compliments and the criticisms.
Ever since Slashbot hit the blogosphere and YouTube, a lot of people have been asking questions about the project. We have especially loved reading the comments on YouTube; they are very entertaining. Below are responses to some general questions we have received. If you have more questions, we’d love to hear them. Just leave a comment!
How does Slashbot work?
There is a post in the blog that describes our approach, but the concept is as follows. The “notes” to be played are the brightest images on the screen, which means we can look at a black and white (actually grayscale) version of the video and spot “notes” very easily. The brightness of a pixel is represented by a voltage between 0 V (black) and 1 V (white). By using NI LabVIEW to observe the waveform of the video signal, we monitor specific pixels on an individual line of video. When a “note” crosses that line, we see a spike in the signal’s voltage. Based on where the spike occurs in the waveform, we can determine which note the robot needs to play.
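For the curious, the detection idea can be sketched in a few lines of Python. This is only a rough sketch: the real system is LabVIEW code running on NI PXI hardware, and the pixel windows and trigger level below are made-up numbers, not our actual configuration.

```python
import numpy as np

# One scanned line is an array of brightness voltages, 0.0 V (black)
# to 1.0 V (white). Each note lane occupies a known pixel window on
# that line. (Windows and trigger level here are illustrative.)
LANE_WINDOWS = {
    "green":  (80, 120),
    "red":    (180, 220),
    "yellow": (280, 320),
    "blue":   (380, 420),
    "orange": (480, 520),
}
TRIGGER = 0.7  # assumed voltage trigger level

def detect_notes(line):
    """Return the lanes whose pixel window contains a bright 'note' spike."""
    hits = []
    for lane, (lo, hi) in LANE_WINDOWS.items():
        if np.max(line[lo:hi]) > TRIGGER:
            hits.append(lane)
    return hits

# A dark line with one bright spike in the red window:
line = np.zeros(640)
line[200] = 0.9
print(detect_notes(line))  # -> ['red']
```

The same loop, run once per video frame, is all the “brain” fundamentally needs to decide which buttons to press.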
How long did it take to make Slashbot?
This was a semester long project that began in January 2008. Our team would work on Slashbot two days a week for about 4-5 hours each day. We had a working design after a couple months. We then spent a month tweaking and improving the overall design and performance. So in all, it took about three months from design proposal to fully functioning.
Why use the robot to press the buttons? Wouldn’t it be better to wire a signal directly to the controller’s electronics? You could probably achieve 100% accuracy that way.
This is a valid question. From a purely performance standpoint, wiring control signals directly to the game would likely have improved the system’s accuracy. However, there are already existing solutions that implement this approach. It was very important for us to create a system without hacking the game controller. We wanted our robot to have a human element to it. By physically seeing the buttons being pressed, someone watching could more easily relate to Slashbot. We wanted to blur the lines between man and machine. Plus, it is REALLY cool to see the actuators firing at break-neck (or “break-guitar”) speed!
Can you make it play “Through the Fire and Flames” on Expert?
This is by far the most common request. For those unfamiliar with Guitar Hero 3, “Through the Fire and Flames” by DragonForce is notoriously the game’s most challenging song. It is extremely fast and includes intricate solos lasting several minutes.
To answer the question, YES! We can make Slashbot play TTFAF and have done so. At our project demonstration day at Texas A&M, a worthy challenger played against Slashbot. The robot finished with 68%.
So what’s the problem? Many have speculated that the actuators would not be fast enough to react to the incredible speed of the song, but this is not the case. The problem lies in the “note” detection software. Our algorithm currently has difficulty differentiating between long strings of the same-colored note and the long bar that appears for held notes. For the long strings of notes, Slashbot will strum only the first note because the video signal is very similar to that of a held note. We are working to improve this.
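One way to picture the distinction (a hypothetical approach, not our actual LabVIEW code): sample a lane’s trigger once per video frame and strum on each rising edge. A held note stays high continuously and produces one edge, while a string of discrete notes shows brief dark gaps and produces one edge per note. The catch, and the reason TTFAF is hard, is that at very fast tempos those gaps nearly vanish from the video signal.

```python
# Sketch: count low->high transitions in a per-frame trigger history.
# A held note yields one edge; a run of discrete notes yields one edge
# per note -- provided the gaps between notes are visible at frame rate.

def rising_edges(samples):
    """Count low->high transitions in a sequence of boolean trigger samples."""
    edges = 0
    prev = False
    for s in samples:
        if s and not prev:
            edges += 1
        prev = s
    return edges

held     = [1, 1, 1, 1, 1, 1, 1, 1]   # sustain bar: strum once
repeated = [1, 0, 1, 0, 1, 0, 1, 0]   # four fast notes: strum four times
print(rising_edges(held), rising_edges(repeated))  # -> 1 4
```

When the song is fast enough that the repeated pattern starts to look like the held one, edge counting alone fails, which matches the behavior we see.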
Can Slashbot use Star Power or the whammy bar?
Slashbot does use Star Power. The game allows two ways to activate Star Power: lifting the neck of the guitar or pressing the “Select” button. We have mounted an actuator above the “Select” button, and it fires roughly every 10 seconds. Pressing “Select” has no effect unless Star Power is available. As for the whammy bar, we have the software implemented to handle it but have yet to attach a motor to move the bar. We may eventually add this feature.
What does the future hold for Slashbot?
The main purpose of Slashbot was to complete our requirements for our Electrical and Computer Engineering degrees. That has been accomplished. We have been asked to demonstrate Slashbot at some public appearances, details TBD. We may make some improvements and will post any news here on the blog.
Slashbot can shred with the best of them! Here’s a video of our robot playing “Cliffs of Dover” on Expert difficulty. Final score: 96%. Not too bad for a machine.
Cool feature alert! Today we tested Slashbot in EXPERT multi-player mode. It was relatively simple to implement: we just changed the range of pixels that we monitor to match the shifted positions of the notes in two-player mode. It is difficult to say whether man or machine prevailed. In our tests, Slashbot always achieved a higher accuracy percentage, but Vinny could keep multiplier streaks going longer, resulting in higher scores.
My Name is Jonas – EXPERT
Slashbot received a makeover this past weekend. We trimmed down the base and gave it a sweet coat of black paint. We mounted the strum actuator as well as added one above the Select button that will be used to activate Star Power. Pictures below…
We finally got all five “fingers” working and can now play songs on Hard and Expert difficulty. We thought it was appropriate for Slashbot’s first full performance to be “Welcome to the Jungle”. We are happy with the results: 80% accuracy. Once we have the “strum” actuator mounted we expect our scores to improve. More videos to come!
Only two weeks remain before Demo Day and we’ve been very busy getting Slashbot functioning properly. The actual robotic portion of the project is complete and we are in the final testing stages. The software (all LabVIEW!) has gone through many revisions over the past month and, like the hardware, it is being tested for final release.
There has been one major modification to the software within the last week. Reluctantly, we removed the functionality that allowed the system to detect the tempo of the song and adjust delays accordingly. The original motivation for this function stems from the delay that must occur between when a note is detected and when it should be played. As implemented, however, the tempo detection made the system unreliable. In the interest of creating a robust design, we replaced it with a manual delay adjustment control. It is slightly less elegant, but it makes Slashbot much more reliable.
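The manual-delay idea boils down to a simple timed queue. This is a sketch only; the function names and the delay value are illustrative, and the real implementation is a LabVIEW VI.

```python
import heapq

# Each detection is queued with a fire time, and the corresponding button
# is pressed after a user-adjustable fixed delay -- rather than a delay
# inferred from the song's tempo. (Delay value below is made up.)
DELAY_S = 0.9        # assumed knob setting, tuned by hand per song

queue = []           # min-heap of (fire_time, lane)

def on_detect(lane, now):
    """Called when a note crosses the detection line on screen."""
    heapq.heappush(queue, (now + DELAY_S, lane))

def service(now, fire):
    """Fire every queued button press whose delay has elapsed."""
    while queue and queue[0][0] <= now:
        _, lane = heapq.heappop(queue)
        fire(lane)
```

The operator simply turns the delay up or down until the presses line up with the strike line, which is less clever than tempo detection but far more predictable.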
Below are some pictures and a video of our current setup.
A very short demo of the first test of our “working” system. We still need to make plenty of tweaks. This is a video of Slashbot playing on Medium difficulty. Our circuit was not completely working when this video was shot, so we could not play with more than four buttons (all five are needed for the Hard and Expert play modes).
We plan to have more complete and hopefully more impressive videos up soon!
Up to this point, Slashbot has been divided into two major components: the “brain” and the “body”. The brain consists of our PXI system and LabVIEW code which analyzes the video signal. The body is a discrete circuit that amplifies a digital signal to fire a solenoid actuator used to press buttons on the Guitar Hero game controller.
For the first time, we connected the body to the brain. Currently the robotic circuit only utilizes a single actuator. Now that we have verified the circuit performs correctly, it will be duplicated five times (once for each of the remaining controller buttons and once for the strum bar). We were very excited to see both parts of our system working in tandem. The robotic circuit was connected to the detection line of a single colored note (green). Every time a green note passed through our detection area on the screen the actuator would press the green button on the game controller.
The next step is to duplicate the circuit for all buttons and the strum bar. We also need to tweak the timing of the system so the buttons are pressed at the correct time in the game and not when they are immediately detected.
We have submitted our project to be considered in the NI Week Demo contest. NI Week will be held August 5-7, 2008 in Austin, Texas. The convention brings together a community of scientists, engineers, and educators to discuss developments in their technical fields and to exhibit how NI products can be used to further such developments.
Users who have created innovative applications and solutions with NI products are asked to submit demos. If chosen, we will have the opportunity to exhibit SlashBot! Here’s the video we submitted which shows our current progress and where we are headed.
One question that a few people have brought up regards the origin of our robot’s name, SlashBot. Basically, it is a reference to Slash, the former lead guitarist of Guns N’ Roses. In Guitar Hero III, he is featured as a playable character and appears on the game’s cover. SlashBot also pays tribute to the popular technology-related blog, Slashdot.org.
We have just completed our note detection virtual instrument (VI) in LabVIEW. The VI divides the input video signal into small detection waveforms (one for each colored note) and sets level triggers. If the voltage rises above the trigger point for a note, that note’s indicator turns on.
The screenshot below illustrates the VI in action. Both the red and blue notes are detected so the indicators are turned on (note: even though the blue appears a bit dark, it is in fact on). The spikes seen on the edges of the waveform are outside of our detection area so they are not being monitored. The pixel information contained in that portion corresponds to the white outlines of the fret board.
The basic approach we are taking is as follows:
1) Input video signal into NI PXI system
2) Analyze the active video portion of the signal
3) Detect which “notes” appear
4) When “notes” are detected, send appropriate control signal to robot.
We will be analyzing a standard NTSC 480i composite video signal from the PlayStation2. Each line of video contains 640 pixels. The brightness of a pixel is determined by the signal’s voltage amplitude. By parsing the signal waveform of a single line into subsets the width of our detection area, we can monitor when a “note” passes through those pixels. Since the “notes” are the brightest images on the screen, we can detect them by setting trigger points. If the signal voltage of our waveform subset rises above a trigger value, we know a “note” was detected. Once a detection is made, we enqueue a control signal and then send it to the robot after a specific delay (determined by the tempo of the song).
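As a small worked example of navigating the digitized line, here is how a pixel column maps to a sample offset in the waveform. The numbers are our own illustration: the digitizer sample rate is a configuration choice, and the ~52.6 µs active-video duration is the nominal NTSC value.

```python
# An NTSC line's active video portion lasts roughly 52.6 us and carries
# the 640 pixels, so each pixel spans about 82 ns of the waveform.
# (Sample rate below is an assumed setting, not our actual one.)
SAMPLE_RATE_HZ = 100e6   # assumed digitizer rate: 100 MS/s
ACTIVE_US      = 52.6    # nominal NTSC active-video duration per line
PIXELS         = 640

def pixel_to_sample(col):
    """Sample offset (from the start of active video) for a pixel column."""
    t_us = (col / PIXELS) * ACTIVE_US
    return round(t_us * 1e-6 * SAMPLE_RATE_HZ)

# The center of the screen (pixel 320) lands about 26.3 us in:
print(pixel_to_sample(320))  # -> 2630
```

With this mapping, each colored note’s detection window on screen becomes a simple index range into the digitized waveform of the line.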
The image below describes the anatomy of a single line of video. We focus on the active video region. This waveform is divided into subsets that correspond to locations on the screen.
The SlashBot system will consist of three different modules: a real-world interface, a sensing & computing unit, and an actuation component.
Real-World Interface
This component is responsible for producing the video signal and provides an interface to interact with the game (i.e. accepting input from the controller). It also has a monitor to display the video signal on a screen.
PlayStation2 gaming system
Guitar Hero game disc
Guitar Hero game controller
Sensing & Computing
Here we designed a system that will digitize the input video signal, use triggers to analyze the signal, and send appropriate control commands to the analog robotic circuit. We are employing a National Instruments PXI system for this processing component. After speaking with National Instruments (NI) representatives, we configured such a system to meet our needs.
NI PXI-1031 System Chassis (1)
NI PXI-8106 Controller (2)
NI PXI-7833R Reconfigurable I/O FPGA (3)
NI PXI-5114 High Speed Digitizer (4)
NI SCB-68 I/O Connector Block (not pictured)
Actuation
This is the actual “robotic” portion of the design. Six solenoid actuators will be mounted above the Guitar Hero controller: one for each of the five colored fret buttons and one for the “strum” bar.
Analog amplifier circuit
Guitar Hero is a series of music video games that was first published in 2005. It has since been released on many platforms: PlayStation2, PlayStation3, XBox 360, Wii, PC, Mac, and even mobile phones. Now in its fifth installment, the franchise has become a cultural phenomenon and has sold over 14 million copies worldwide.
Guitar Hero is known for its unique style of gameplay. Players are able to control the game using a guitar-shaped peripheral, shown to the right. The goal of the game is to simulate playing songs by pressing color-coded buttons in sync with icons that scroll across the screen.
Our main motivation behind SlashBot is to complete the requirements for our Electrical & Computer Engineering B.S. degrees at Texas A&M University in College Station, Texas. The capstone course is ECEN 405: Electrical Engineering Design Laboratory.
The objectives of the class are:
Conceive and bring to fruition a nontrivial engineering project
Learn about and apply a disciplined engineering methodology covering: project specification, design, verification, documentation, and integration
Develop one’s professionalism
All four of us (Dave Buckner, Mitchell Jefferis, Vinny LaPenna, and Michael Voth) love music and video games. The Guitar Hero series encapsulates both of these passions. We felt that proposing an idea we would enjoy executing would help us produce a higher quality project.
To put it simply, we are designing a robot that is capable of autonomously playing a video game, the wildly popular Guitar Hero series. In the game a player attempts to simulate playing songs as color-coded buttons corresponding to notes scroll on the screen. A sensing and computation system will analyze the NTSC video signal as it is output from a PlayStation2 gaming system. The buttons a player is asked to press will be detected and an appropriate control signal will be sent to the robot. The robot will consist of six solenoid actuators, one for each colored button and one for the “strum” bar.
This blog is dedicated to the ECEN 405: Senior Electrical Design project of Texas A&M students: Dave Buckner, Mitchell Jefferis, Vinny LaPenna, and Michael Voth. We hope this blog will serve as documentation for the overall design process and project execution.
We plan to create a robotic system capable of autonomously playing the video game Guitar Hero. We hope to realize our solution using video signal analysis, the LabVIEW programming environment, and analog circuits.