Archive for February, 2008
We have just completed our note detection virtual instrument in LabVIEW. The VI divides the input video signal into small detection waveforms (one for each colored note) and sets level triggers. If the voltage rises above the trigger point for a note, that note’s indicator turns on.
The screenshot below illustrates the VI in action. Both the red and blue notes are detected, so their indicators are on (note: even though the blue indicator appears a bit dark, it is in fact on). The spikes on the edges of the waveform are outside our detection area, so they are not monitored; the pixel information in that portion corresponds to the white outlines of the fretboard.
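The VI’s level-trigger logic can be sketched in ordinary code. This is only an illustrative Python sketch of the idea, not the LabVIEW implementation; the detection windows and trigger level below are hypothetical values, since the real ones depend on how the digitizer samples the line.

```python
# Illustrative sketch of the VI's level-trigger logic (not the actual
# LabVIEW code). Each note has a detection window (a slice of the
# active-video waveform) and a voltage trigger level; a note's
# indicator turns on when any sample in its window exceeds the level.

# Hypothetical detection windows (sample index ranges) and trigger
# level in volts -- the real values depend on the digitizer setup.
NOTE_WINDOWS = {
    "green":  (40, 80),
    "red":    (100, 140),
    "yellow": (160, 200),
    "blue":   (220, 260),
    "orange": (280, 320),
}
TRIGGER_LEVEL = 0.7  # volts; the "notes" are the brightest pixels on screen

def detect_notes(line_samples, windows=NOTE_WINDOWS, level=TRIGGER_LEVEL):
    """Return {note: True/False} for one line of active video."""
    return {
        note: any(v > level for v in line_samples[start:stop])
        for note, (start, stop) in windows.items()
    }
```

A spike inside a note’s window drives that note’s flag high while the others stay low, which is exactly the indicator behavior shown in the screenshot.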
The basic approach we are taking is as follows:
1) Input video signal into NI PXI system
2) Analyze the active video portion of the signal
3) Detect which “notes” appear
4) When “notes” are detected, send the appropriate control signal to the robot.
We will be analyzing a standard NTSC 480i composite video signal from the PlayStation2. Each line of video contains 640 pixels, and the brightness of a pixel is determined by the signal’s voltage amplitude. By parsing the waveform of a single line into subsets the width of our detection area, we can monitor when a “note” passes through those pixels. Since the “notes” are the brightest images on the screen, we can detect them by setting trigger points: if the signal voltage in one of our waveform subsets rises above its trigger value, we know a “note” was detected. Once a detection is made, we enqueue a control signal and then send it to the robot after a specific delay (determined by the tempo of the song).
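The detect-then-delay step above can be sketched as a small scheduling queue. This is a Python illustration of the idea, not our actual implementation; the class name and the fixed delay value are our own assumptions, since the real delay must be tuned to the song’s tempo and to where on screen the detection window sits.

```python
from collections import deque

# Sketch of the enqueue/delay step: when a note is detected near the
# top of the screen, the control signal is held back until the note
# reaches the strum line. The delay value here is hypothetical.

class ControlQueue:
    def __init__(self, delay_s):
        self.delay_s = delay_s   # travel-time delay, tuned per song tempo
        self.pending = deque()   # (send_time, note) pairs, in FIFO order

    def enqueue(self, note, now):
        """Schedule a control signal for a note detected at time `now`."""
        self.pending.append((now + self.delay_s, note))

    def pop_due(self, now):
        """Return all notes whose delay has elapsed by time `now`."""
        due = []
        while self.pending and self.pending[0][0] <= now:
            due.append(self.pending.popleft()[1])
        return due
```

Because detections arrive in screen order, a FIFO queue preserves the note sequence while the fixed delay converts “detected at the top of the screen” into “pressed at the strum line.”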
The image below describes the anatomy of a single line of video. We focus on the active video region. This waveform is divided into subsets that correspond to locations on the screen.
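To make the “subsets correspond to locations on the screen” idea concrete, here is a rough Python sketch of mapping a screen column to a sample index within one digitized line. The sample rate is an assumption, and the blanking/active-video durations are typical NTSC figures, not measurements from our setup.

```python
# Sketch: map a screen column (0-639) to a sample index within one
# digitized line of composite video. All constants are assumptions.
SAMPLE_RATE = 25e6       # digitizer samples/second (assumed)
ACTIVE_START_US = 10.9   # horizontal blanking duration (typical NTSC)
ACTIVE_WIDTH_US = 52.6   # active-video duration of one line (typical NTSC)
PIXELS_PER_LINE = 640

def pixel_to_sample(col):
    """Sample index where screen column `col` appears in the line waveform."""
    t_us = ACTIVE_START_US + (col / PIXELS_PER_LINE) * ACTIVE_WIDTH_US
    return int(t_us * 1e-6 * SAMPLE_RATE)
```

With a mapping like this, each note’s detection window is just the sample range between its left and right screen columns, skipping the blanking interval and the edge regions we do not monitor.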
The SlashBot system will consist of three different modules: a real-world interface, a sensing & computing unit, and an actuation component.
Real-World Interface
This component produces the video signal and provides an interface for interacting with the game (i.e., accepting input from the controller). It also includes a monitor to display the video signal.
PlayStation2 gaming system
Guitar Hero game disc
Guitar Hero game controller
Sensing & Computing
Here we designed a system that will digitize the input video signal, analyze it using level triggers, and send the appropriate control commands to the analog robotic circuit. We are employing a National Instruments PXI system for this processing component. After speaking with National Instruments (NI) representatives, we configured a system to meet our needs.
NI PXI-1031 System Chassis (1)
NI PXI-8106 Controller (2)
NI PXI-7833R Reconfigurable I/O FPGA (3)
NI PXI-5114 High Speed Digitizer (4)
NI SCB-68 I/O Connector Block (not pictured)
Actuation
This is the actual “robotic” portion of the design. Six solenoid actuators will be mounted above the Guitar Hero controller: one for each of the five colored input buttons and one for the “strum” bar.
Analog amplifier circuit
Guitar Hero is a series of music video games that was first published in 2005. It has since been released on many platforms: PlayStation2, PlayStation3, XBox 360, Wii, PC, Mac, and even mobile phones. Now in its fifth installment, the franchise has become a cultural phenomenon and has sold over 14 million copies worldwide.
Guitar Hero is known for its unique style of gameplay. Players are able to control the game using a guitar-shaped peripheral, shown to the right. The goal of the game is to simulate playing songs by pressing color-coded buttons in sync with icons that scroll across the screen.
Our main motivation behind SlashBot is to complete the requirements for our Electrical & Computer Engineering B.S. degrees at Texas A&M University in College Station, Texas. The capstone course is ECEN 405: Electrical Engineering Design Laboratory.
The objectives of the class are:
Conceive and bring to fruition a nontrivial engineering project
Learn about and apply a disciplined engineering methodology covering: project specification, design, verification, documentation, and integration
Develop one’s professionalism
All four of us (Dave Buckner, Mitchell Jefferis, Vinny LaPenna, and Michael Voth) love music and video games. The Guitar Hero series encapsulates both of these passions. We felt that proposing an idea we would enjoy executing would help us produce a higher quality project.
To put it simply, we are designing a robot that is capable of autonomously playing a video game: the wildly popular Guitar Hero series. In the game, a player attempts to simulate playing songs by pressing color-coded buttons as matching icons scroll across the screen. A sensing and computation system will analyze the NTSC video signal output by a PlayStation2 gaming system, detect the buttons a player is asked to press, and send an appropriate control signal to the robot. The robot will consist of six solenoid actuators: one for each colored button and one for the “strum” bar.
This blog is dedicated to the ECEN 405: Senior Electrical Design project of Texas A&M students: Dave Buckner, Mitchell Jefferis, Vinny LaPenna, and Michael Voth. We hope this blog will serve as documentation for the overall design process and project execution.
We plan to create a robotic system capable of autonomously playing the video game Guitar Hero. We hope to realize our solution using video signal analysis, the LabVIEW programming environment, and analog circuits.