
The Laboratory of Neuromodulation & Neuroimaging: Your Brain’s Never Looked so Good

At some point I’m going to sound like a broken record when I keep saying how amazing each of these visits is. There are only so many superlatives that can be thrown around before they start to lose their meaning. Before that happens, however, we should reserve some for Dr Nanthia Suthana’s lab, because it was a remarkable tour with an extraordinarily generous host and obliging postdocs, each ready to engage with visitors about their work and their equipment. (To say that they were incredibly knowledgeable about everything would be to state the obvious.)

One of Dr Suthana’s research labs.

Visitors are immediately struck by the large, open space surrounded by 24 infrared base stations placed around the room. These sensors are integrated with all of the VR equipment in the space, and are used not only to set virtual boundaries but also to track reflective markers. They are capable of sub-millimeter motion tracking, so don’t move. Or do move, depending on what they need.

The light grey balls on top of the headset are the kind of markers the infrared base stations track. The markers bounce the infrared signals back to the base stations, which use them to follow the subject’s movement. Full motion-tracking suits are also on hand for capturing the entire body.
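For the technically curious, here’s a toy sketch of the basic idea behind marker tracking; this is not the lab’s actual OptiTrack pipeline, and every number in it is made up. Each base station that sees a marker contributes a ray, and the marker’s 3D position is estimated as the point closest to all of those rays.

```python
# Toy illustration of marker triangulation (NOT the lab's actual OptiTrack pipeline).
# Each base station that sees a reflective marker yields a ray (origin + direction);
# the marker's 3D position is estimated as the point closest to all of those rays.
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of rays: origins (N,3), unit directions (N,3)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane perpendicular to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two hypothetical base stations looking at a marker near (1, 2, 3)
origins = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
directions = np.array([[1.0, 2.0, 3.0], [-4.0, 2.0, 3.0]])
directions /= np.linalg.norm(directions, axis=1, keepdims=True)

print(triangulate(origins, directions))  # ≈ [1. 2. 3.]
```

With 24 base stations instead of two, the same least-squares idea simply has more rays to average over, which is part of why the tracking can get down to sub-millimeter precision.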

The entire space boasts an impressive amount of hardware. Some of it includes:

  • 24 OptiTrack base stations
  • A Magic Leap AR headset
  • Microsoft HoloLens
  • HTC Vive, complete with a Tobii eye-tracking attachment
  • Samsung VR headset with SMI eye tracking
  • Motion capture suits of various sizes
  • Artec Eva scanner and automatic rotating platform
  • BIOPAC – state-of-the-art biometric measurements
  • ANT Neuro eego sport 64-channel mobile EEG system

They also have this eye-tracking headset from Pupil Labs, made to look even goofier because the goof in the picture doesn’t know why WordPress is flipping his photo and can’t seem to figure out how to rotate the damn thing. (For the record, I’ve since figured it out, but I’m leaving this as is because otherwise I wouldn’t be able to call myself a goof.)

So what do they do with all this stuff? Well, a whole hell of a lot, as it turns out. For starters, the Vive headset station is a great example of cross-campus collaboration. The environments created for the Vive are for educational purposes. Yeah, you guessed it: they’re teaching folks about the brain.

This particular program was designed by Dr Rebecca Allen, a professor of 3D media art whose work centers on creative expression in virtual and augmented reality. In this screenshot you can see that I’ve already taken apart different pieces of the brain, each part named as though scientists picked random words from a Latin dictionary.

Here I have detached yet another chunk from the brain. The model was quite large, and I was able to virtually place my head in the middle of it, using the controllers to highlight different sections. They claimed that by this point the person would have died from all the missing parts of their brain, but I’m not so sure.

The Magic Leap was probably my favorite experience. I’d never really done AR before, and this was probably one of the best environments in which to do it. I mean, I don’t know about you, but I don’t have 24 OptiTrack base stations at home. The 18 that I have just don’t seem to cut it.

Here I am, wearing the Magic Leap, deftly and bravely defending everyone from attacking robots while they just chit-chat in blissful ignorance of the imminent danger all around them. Let me just say that if you’re an alien robot climbing out of the walls you better watch your ass – I’m a crack shot.


As much fun as the various AR environments I got to experience with the Magic Leap were, the most impressive thing about the lab is the research being conducted there.

Patients with epilepsy are sometimes given an implant that resides within the skull, on the brain, with wires that go into the brain itself. Now, I don’t know if you know this, but I’m not a neuroscientist – I don’t even know if I spelled that correctly. But from what I could gather, these wires are meant to stimulate the brain in such a way as to alleviate epileptic episodes.

That’s all well and good, but these implants were traditionally just one-way streets. Nowadays, many implants are of a new, recently FDA-approved variety. They perform the same functions as the aforementioned ones (that is, if I got that right in the first place), but are a two-way street: they can be connected to from the outside, and they can send out data about what’s going on inside the brain. Again, not a neuroscientist, but basically these wires deliver electrical stimulation while also reading and reporting what’s happening deep inside the brain.

The idea is that by reading these epilepsy implants while subjects are engaging in various VR environments, Dr Suthana and her researchers can monitor brain activity when a person is up, about, and engaged, instead of lying in a coffin getting an MRI scan. Here’s Cory Inman, a postdoc in the lab, wearing the AR headset to shoot robots or pet penguins.

Some of the environments developed for this implant research revolve around memory. I, of course, passed the test with flying colors but I don’t remember what my score was.

Subjects would wear special caps that could read the information being sent by the implant’s wires deep in the brain. The wires sit in the hippocampus (or hippopotamus, or something) so that memory can be studied as subjects engage in an AR environment designed to test it.
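For a rough sense of what that analysis might look like, here’s a toy sketch with entirely made-up data structures and numbers (this is not the lab’s actual code): samples streamed from the implant are lined up, by timestamp, with events logged by the AR memory task, so researchers can look at brain activity around each moment of the experience.

```python
# Toy sketch of the general idea (made-up data, not the lab's actual code):
# align neural samples recorded from the implant with events logged by the AR task,
# so brain activity around each task moment can be examined.
import numpy as np

# Hypothetical timestamps (seconds): a 500 Hz stream from the implant,
# and the moments when the AR task showed the subject an object to remember.
neural_times = np.arange(0.0, 60.0, 0.002)           # 500 Hz sample times
neural_signal = np.random.randn(neural_times.size)   # stand-in for real recordings
task_events = [5.2, 12.8, 31.4, 47.9]                # "object shown" events from the headset log

def epoch(signal, times, event_time, pre=0.5, post=1.0):
    """Return the slice of the recording from `pre` s before to `post` s after an event."""
    mask = (times >= event_time - pre) & (times <= event_time + post)
    return signal[mask]

epochs = [epoch(neural_signal, neural_times, t) for t in task_events]
print(f"{len(epochs)} epochs, ~{epochs[0].size} samples each")
```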

Special thanks to Cory Inman, postdoc researcher; Diane Villaroman, the resident programmer analyst; and Sonja Hiller, research assistant and lab manager.

Definitely some interesting research going on here, and if you can think of some kind of partnership or research project, don’t email me, email Dr Suthana.


Game Lab Showcase: A Visceral VR Venue

The Game Lab is located in the Broad Art Center, Room 3252. Inside are several computer stations with VR headsets: three Oculus headsets and two Vive headsets. Each station offered a collection of two to three VR experiences, designed to inspire students by showing them the breadth of what is possible in VR. And that breadth was stunning.

Image courtesy: https://www.zerodaysvr.com/

Part of the trouble with a showcase like this is that the experiences themselves are so immersive, so gripping, that you spend half your time enthralled by just one. Which is exactly what happened to me. Zero Days VR was an artistic, immersive documentary experience. The environment was designed and built around an existing feature-length documentary, unsurprisingly named Zero Days. The creators layered the audio on top of an engaging, futuristic technolandscape where the viewer gently glides through walls of circuitry that pulse and react to the words being spoken. Occasionally large panels appear with clips from the documentary itself, but mostly the viewer spends their time absorbing the landscape, watching it react to the narration and audio of the documentary. Truly engaging, and an exciting development for the future of documentary filmmaking.

Image courtesy: http://notesonblindness.arte.tv/en/vr

Notes on Blindness is a British documentary from 2016 based on the audio tapes of John Hull, a writer and theologian who gradually went blind and wrote a book about blindness.

Image courtesy: https://www.youtube.com/watch?v=W2eTgbyiY_0

This was the second experience I had the privilege of being immersed in. John Hull narrates the expanding aural atmosphere, beginning on a park bench and detailing, one by one, the various sounds that bring his world to life. Children laughing, people walking by, birds taking off in flight, joggers passing – all of these appear as ephemeral blue outlines as Hull’s voice, with its warm and crackling analog texture, illuminates the experience. It was simultaneously calming and exhilarating, and definitely unlike anything I’d experienced before. And certainly something that could only be appreciated in a VR environment.

Unfortunately (or, rather, fortunately), these two experiences captivated me throughout my time at the Showcase. Although I regret not being able to try any of the others, the two I did engage with make me confident in saying that the rest of the list below must be just as exciting to experience.

Zeynep Abes, curator of this Showcase, has an incredible understanding of the multitudinous possibilities when it comes to VR experiences. Below is a full list of experiences that were on display (in no particular order):

Spheres
Dear Angelica
Gymnasia
Land Grab
A Short History of the Gaze
Museum of Symmetry
Melody of Dust
Chocolate
Davina


Steve Anderson’s Media Arts Lab

Steve Anderson’s brand new Media Arts Lab is located in Melnitz Hall, room 1470. The room is spacious and comes equipped with seven powerful workstations, each paired with a VR headset – both Oculus and Vive equipment can be found.

Oculus donated several headsets to the lab; Vive equipment is also included on some workstations. Here Zizi Li, a PhD student in Cinema and Media Studies, readies the workstations for the Media Lab’s recurring demo series, held Mondays, Wednesdays, and Fridays this quarter.

The lab is just two weeks old, and is still awaiting the installation of a state-of-the-art 7.1 surround sound system. The most prominent feature of the space is a large green screen wall, complete with powerful green lights that help create a shadowless, monochrome backdrop, making it much easier for software to key out the background.

Here is the green screen wall without the green lights on. The shadows here could be problematic when trying to key out the background.
With the aid of these powerful green lights…

…the green screen wall becomes much more evenly lit. White lights would then be used on the subject being filmed or captured. Motion capture made easy!
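For the curious, here’s a rough sketch of the kind of thing keying software does under the hood, using OpenCV; the frame, colors, and thresholds are all made up for illustration, and the real tools used with a wall like this are far more sophisticated.

```python
# Rough sketch of chroma keying (illustrative only; thresholds and frame are made up).
import cv2
import numpy as np

# Hypothetical frame: an evenly lit green backdrop with a grey "subject" in front of it.
frame = np.full((480, 640, 3), (40, 200, 40), dtype=np.uint8)      # BGR green wall
cv2.rectangle(frame, (250, 120), (390, 420), (120, 120, 120), -1)  # stand-in subject

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Pixels whose hue/saturation/value fall in the "green" band are treated as background.
lower_green = np.array([40, 60, 60])
upper_green = np.array([80, 255, 255])
background = cv2.inRange(hsv, lower_green, upper_green)

# An evenly lit green wall (thanks to those green lights) keys cleanly; shadowed
# patches would fall outside this band and leave ragged holes in the mask.
foreground_mask = cv2.bitwise_not(background)
keyed = cv2.bitwise_and(frame, frame, mask=foreground_mask)
cv2.imwrite("keyed_frame.png", keyed)
```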

This green screen area will be used in part for motion capture. Additionally, the Vive base stations are set up on either side of the green screen wall.

The room provides ample space for a VR environment.

In addition to the VR and green screen equipment, Steve also has several 360 cameras on hand for students to use, including this monstrosity: the Google Odyssey, an array of 16 GoPros that captures beautiful, hi-def stereoscopic video.

As you can imagine, the footage can get quite large. So large, in fact, that Steve is currently working with Google to find a way to download a 20-minute shot that a student took in Alaska using this rig. The file size is over 1 TB, which has so far made it impossible for Steve to download the footage after it’s been processed on Google’s servers.
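To get a feel for how a single 20-minute shot can balloon past a terabyte, here’s a back-of-envelope estimate; every bitrate and frame-size figure in it is an assumption for illustration, not an official spec.

```python
# Back-of-envelope estimate of why a 20-minute Odyssey shot can top 1 TB once processed.
# Every number below is an assumption for illustration, not an official spec.
minutes = 20
fps = 30
frames = minutes * 60 * fps                 # 36,000 stitched stereoscopic frames

mb_per_stitched_frame = 30                  # assumed size of one high-res stereo frame (MB)
stitched_tb = frames * mb_per_stitched_frame / 1_000_000

cameras = 16
raw_mbps_per_camera = 60                    # assumed per-GoPro recording bitrate (Mbit/s)
raw_gb = cameras * raw_mbps_per_camera * minutes * 60 / 8 / 1000

print(f"raw footage: ~{raw_gb:.0f} GB, stitched frames: ~{stitched_tb:.1f} TB")
```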
The battery for the camera is, as you can imagine, quite large. It’s about the size of two car batteries put together, and weighs just as much.
The TSA may ask you to open up the case for them to inspect…

Steve and Zizi explain a bit about how the Odyssey works.

By and large the space is shaping up to be an incredibly flexible and useful lab for motion capture, projection mapping, and VR work. Steve is eager to explore as many use-cases as possible, and is happy to speak to other faculty about bringing their students/classes in to work on some VR projects.

Here students from Maja Manojlovic’s class take turns experiencing some VR environments. Professor Manojlovic teaches a course called EngComp133: Writing in Multimedia Environments: Videogame Rhetoric and Design.

The Media Arts Lab is open to visitors on Mondays, Wednesdays, and Fridays for the rest of the quarter, from 11am to 1pm. Various VR films and environments will be on display for visitors to experience.
