PEW - A Brain-Controlled Robot-Shooting Game


VR Game, 03/11/24

“Have you ever wanted to stare at something and make it explode? Try our game PEW!”

Team project made for Columbia’s 2024 Neureality Hackathon. I worked on the VR development. We won second place.

For hardware, we used the g.tec Unicorn Hybrid Black (an 8-channel wearable EEG headset) to record SSVEP signals, the OpenBCI Cyton to record EMG signals from arm movement, and the Meta Quest 2 VR headset to display the game.

The objective for our hacking track at the hackathon was to incorporate EEG and EMG signals into a virtual reality game.

Our game, PEW, is a first-person shooter in virtual reality where the objective is to shoot as many robots as possible before the countdown timer runs out. The player stands stationary at a crossroads while robots approach from three directions, some at ground level and others floating in the air. Players have three basic handheld weapons to choose from: a pistol, a machine gun, and a katana.

Beyond the basic weapons, there are advanced attack types that let players use their brain waves to make robots spontaneously combust. This works because, in a nutshell, when a person concentrates on a blinking object, their brain activity peaks at the same frequency the object is blinking at. Using the EEG headset to capture the player's SSVEP (steady-state visually evoked potential, a brain signal that appears in an electroencephalogram when a person looks at a flickering light), we run signal processing code in real time to detect whether the player is focused on the flashing robot in their field of view. Only one robot flashes on screen at a time, which makes detection and denoising easier. When focus is detected, the signal processing script (written in Python, running as a UDP client) sends a message to the VR headset (running as a UDP server) to terminate the robot and make it combust in-game.
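For a rough idea of what that detection looks like, here is a minimal sketch (not our exact pipeline): it compares spectral power at the robot's flicker frequency against the surrounding band, assuming a 250 Hz sampling rate, a hypothetical 12 Hz flicker, and a hand-tuned threshold.

```python
import numpy as np
from scipy.signal import welch

FS = 250            # assumed EEG sampling rate (Hz)
FLICKER_HZ = 12.0   # hypothetical flicker frequency of the active robot
WINDOW_S = 2        # length of the EEG window to analyze (seconds)

def ssvep_detected(eeg_window: np.ndarray, threshold: float = 3.0) -> bool:
    """Return True if power at the flicker frequency stands out.

    eeg_window: (n_channels, n_samples) array of occipital EEG.
    threshold: illustrative ratio of target-bin power to baseline power.
    """
    # Average the channels, then estimate the power spectral density.
    signal = eeg_window.mean(axis=0)
    freqs, psd = welch(signal, fs=FS, nperseg=FS * WINDOW_S)

    # Power in a narrow bin around the flicker frequency...
    target = psd[np.abs(freqs - FLICKER_HZ) < 0.5].mean()
    # ...compared against the surrounding 8-20 Hz band as a baseline.
    baseline_mask = (freqs > 8) & (freqs < 20) & (np.abs(freqs - FLICKER_HZ) >= 0.5)
    baseline = psd[baseline_mask].mean()

    return target / baseline > threshold
```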

Another type of brain wave we wanted to include was alpha waves, which occur when someone is relaxed but alert. We mapped these to the AOE (area of effect) attack, which hits multiple targets within a specific area; visually, the effect rains down lightning bolts that zap robots across the map. Because it is such a powerful attack, we wanted an equivalent trade-off: to produce alpha waves, the player has to close their eyes and reach a state of focused relaxation similar to meditation, which is not easy in such a fast-paced game. As with the SSVEP attack, when the EEG headset detected alpha waves, a signal was sent over the UDP connection to trigger the AOE in-game.
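Detection here boils down to a band-power ratio: how much of the signal's energy falls in the 8-12 Hz alpha band relative to the broadband background. A minimal sketch, with an illustrative threshold that would need per-player calibration:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # assumed EEG sampling rate (Hz)

def alpha_engaged(eeg_window: np.ndarray, threshold: float = 0.4) -> bool:
    """Return True when alpha (8-12 Hz) dominates the 4-30 Hz band.

    eeg_window: (n_channels, n_samples) array of recent EEG.
    threshold: fraction of broadband power that counts as 'relaxed';
    an assumption here, not a calibrated value.
    """
    freqs, psd = welch(eeg_window.mean(axis=0), fs=FS, nperseg=FS)
    alpha = psd[(freqs >= 8) & (freqs <= 12)].sum()
    broadband = psd[(freqs >= 4) & (freqs <= 30)].sum()
    return alpha / broadband > threshold
```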

Finally, we mapped the EMG (electromyography) signal (a biomedical signal that measures the electrical activity of a muscle as it contracts) of a closed-fist wrist flick to the weapon-switch function. We strapped electrodes connected to the OpenBCI Cyton to the player's forearm to detect when their wrist and muscle movement matched the trigger motion. Each detected flick cycled to the next of the three handheld weapons.
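Conceptually, a wrist flick shows up as a short amplitude burst in the rectified EMG, so even a simple threshold can approximate the trigger. The sketch below is illustrative only, with a made-up threshold and a hypothetical weapon-cycling helper:

```python
import numpy as np

FS_EMG = 250  # assumed Cyton sampling rate (Hz)
WEAPONS = ["pistol", "machine_gun", "katana"]
weapon_index = 0

def flick_detected(emg_window: np.ndarray, threshold: float = 150.0) -> bool:
    """Crude burst detector: a closed-fist wrist flick appears as a
    short spike in the rectified EMG well above resting activity.
    threshold is in raw amplitude units and would need calibration.
    """
    envelope = np.abs(emg_window - emg_window.mean())  # rectify around baseline
    return envelope.max() > threshold

def next_weapon() -> str:
    """Cycle to the next of the three handheld weapons."""
    global weapon_index
    weapon_index = (weapon_index + 1) % len(WEAPONS)
    return WEAPONS[weapon_index]
```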

PEW connects to the Meta Quest 2 to provide an immersive game experience. The main game runs on the VR headset, while gestures and input from the BCI devices are processed on an external computer running Python. The Python code acts as a UDP client that sends signals to the game, which runs as the UDP server. For future iterations, inspired by this networked setup, we considered adding local multiplayer where two people play in the same room over the same network.
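The messaging itself is lightweight: whenever a detector fires, the client sends a short datagram to the headset. Here is a minimal sketch of that client side, with a hypothetical headset address and message strings:

```python
import socket

# Hypothetical address of the Quest 2 on the local network; the game
# listens as a UDP server on this port.
HEADSET_ADDR = ("192.168.1.42", 5005)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_event(event: str) -> None:
    """Fire-and-forget event message, e.g. 'SSVEP', 'AOE', or 'SWITCH'."""
    sock.sendto(event.encode("utf-8"), HEADSET_ADDR)

# Example: the SSVEP detector confirms focus on the flashing robot.
send_event("SSVEP")
```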

Check out our GitHub repo here.
See our presentation slides here.


