Open House: Shattuck Lab, Multiuser Virtual Reality Environment for Visualizing Neuroimaging Data – NRB225

Dr. David Shattuck will be demonstrating a virtual reality environment developed for interactive visualization of neuroimaging data. His system uses the HTC Vive to present a room-sized virtual experience for data exploration and can currently accommodate four simultaneous users. Though still at an early stage of development, it offers a new way of exploring structural and diffusion MRI of the brain. In particular, his framework provides interactive visualization of volumetric MRI, surface models, tensor glyphs, and streamline tractography. The system can also be used to explore imaging data of other anatomical structures. More information on this system can be found in his recent paper, “Multiuser Virtual Reality Environment for Visualizing Neuroimaging Data” (http://dx.doi.org/10.1049/htl.2018.5077).

The lab is on the 2nd floor of the NRB, at the top of the stairs from the lobby, behind a set of double doors. If you are facing the doors, there is an intercom on the right; press the red button to be let into the lab. The VR demo will be in the theater, just to the right of the entrance.

Open House: Suthana Laboratory, Semel Building room 48-136

The Suthana laboratory provides a novel platform that allows for wireless and programmable recording and stimulation of deep brain activity in freely moving human participants. The lab has integrated the platform with external biometrics (e.g., heart rate, skin conductance, respiration, eye tracking, and scalp EEG) and state-of-the-art virtual and augmented reality (VR/AR) technology. This first-of-its-kind research platform combining programmable deep brain recording and stimulation, external biometrics, and VR/AR allows for a naturalistic and ecologically valid environment for elucidating the neural mechanisms underlying freely moving human behaviors and for developing and testing viable therapies for patients with neurologic and psychiatric disorders.

Take the H elevators to the 4th floor. Turn right, then take the first hallway on the right. The lab is on the right side, room 48-136. Please call (310) 343-9628 if you need further directions.

Further details

The Suthana laboratory has a state-of-the-art motion capture space for mobile brain imaging, housed in a 400-square-foot room in suite 48-136 of the Jane and Terry Semel Institute for Neuroscience and Human Behavior. The space is equipped with 24 wall-mounted high-resolution cameras (OptiTrack, NaturalPoint, Inc.) for sub-millimeter motion capture and body positioning. These include 18 Prime 13W cameras, each with a 3.5 mm stock lens (82° horizontal and 70° vertical FOV, 850 nm band-pass filter, 800 nm (infrared) or 700 nm (visible) filter switcher), 1280 × 1024 resolution, a frame rate of 30–240 FPS, 4.2 ms latency, and a shutter speed of up to 3.9 ms. For facial emotion capture there are an additional 6 Prime 13 cameras, each with a 5.5 mm stock lens (56° horizontal and 46° vertical FOV) and otherwise identical filter, resolution, frame-rate, latency, and shutter specifications. Motion capture suits and reflective markers are used in conjunction with specialized software, Motive and the Camera SDK (NaturalPoint, Inc.), which allows for kinematic labeling by fitting each subject’s skeletal structure to sub-millimeter marker labels, real-time output, full real-time control of the cameras, and synchronization with behavioral paradigms and neural data.

The VR/AR environments are programmed in the Unity game engine using C#, implementing customized immersive experiences with controlled stimuli and functionality. Using the motion capture cameras, participants’ locations are mirrored in real time in the VR/AR application. The VR/AR headsets are all equipped with eye tracking and include the SMI Samsung Gear VR, HTC Vive, HoloLens, and Magic Leap.
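As a rough illustration of the mirroring step, the C# sketch below applies an externally streamed pose to a Unity transform each frame. The IMocapSource interface and MocapFollower component are hypothetical stand-ins for the lab’s actual streaming layer (e.g., poses arriving from the OptiTrack system), not code from the platform itself.

    using UnityEngine;

    // Hypothetical interface standing in for whatever streaming layer
    // supplies tracked poses from the motion capture system.
    public interface IMocapSource
    {
        bool TryGetPose(out Vector3 position, out Quaternion rotation);
    }

    // Attached to the participant's avatar; follows the latest tracked pose.
    public class MocapFollower : MonoBehaviour
    {
        public float smoothing = 20f;   // higher values track more tightly
        private IMocapSource source;    // assigned at startup by the app

        public void SetSource(IMocapSource s) => source = s;

        void Update()
        {
            if (source == null || !source.TryGetPose(out var pos, out var rot))
                return;

            // Exponential smoothing toward the newest pose keeps the avatar
            // responsive while damping marker jitter between mocap frames.
            float t = 1f - Mathf.Exp(-smoothing * Time.deltaTime);
            transform.position = Vector3.Lerp(transform.position, pos, t);
            transform.rotation = Quaternion.Slerp(transform.rotation, rot, t);
        }
    }

The frame-rate-aware smoothing factor makes the damping behave consistently whether the headset renders at 60 or 90 Hz.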

Heart rate, skin conductance, and respiration are recorded using the wireless BioNomadix Smart Center (Biopac). Data are streamed in real time to the computer running the MoCap system, allowing synchronization of the multiple data streams.
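As a minimal sketch of what such synchronization can look like, assuming all devices timestamp samples against a shared host clock, the C# snippet below aligns each biometric sample to the nearest motion-capture frame; the Sample and StreamAligner names are illustrative, not part of the Biopac or MoCap software.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // A single biometric reading with its host-clock timestamp (seconds).
    public record Sample(double TimestampSec, double Value);

    public static class StreamAligner
    {
        // mocapTimes must be sorted ascending; returns the index of the
        // mocap frame whose timestamp is closest to t.
        public static int NearestFrame(double[] mocapTimes, double t)
        {
            int i = Array.BinarySearch(mocapTimes, t);
            if (i >= 0) return i;                      // exact match
            i = ~i;                                    // insertion point
            if (i == 0) return 0;                      // before first frame
            if (i == mocapTimes.Length) return i - 1;  // after last frame
            return (t - mocapTimes[i - 1] <= mocapTimes[i] - t) ? i - 1 : i;
        }

        // Pairs every biometric sample with its nearest mocap frame index.
        public static IEnumerable<(int Frame, Sample Reading)> Align(
            double[] mocapTimes, IEnumerable<Sample> biometric) =>
            biometric.Select(s => (NearestFrame(mocapTimes, s.TimestampSec), s));
    }

Nearest-timestamp matching is one simple policy; interpolating between neighboring frames is a common alternative when streams run at very different rates.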

Mobile scalp EEG is recorded with the EEGOSPORTS WaveGuard 64-channel EEG and eego mylab system (ANT Neuro) for real-time recording and compatibility with DBS. The amplifier (weight ≈ 500 g) and data-acquisition tablet (VAIO™ Ultrabook®, Sony) are battery powered and carried by the participant in a small backpack, allowing scalp EEG to be accessed in real time via Wi-Fi. The eego mylab system has a sampling rate of up to 16 kHz and can handle large DBS artifacts with a recovery time of less than 10 ms. Electrode positions can be digitized and registered to individual participants’ MRIs using Xensor software (ANT Neuro).

VR Showcase: UCLA Game Lab, Broad Art Center Room 3252

Is VR revolutionary, or is it hype? As with any emerging technology, high expectations tend to detract from the experience itself, and the VR experiences that are mainstream enough often disappoint users. Let’s put aside the “the future is now” argument and focus on the content that actually shows the bold spirit of innovation among independent artists who haven’t had a chance to share their work. This mini showcase is an exclusive selection of unique, under-the-radar VR experiences. Dive into a day of narrative worlds, social media cultures, immersive designs, game theory, and transmedia activism!