
The Bionics Lab: It’s as Awesome as it Sounds

The view upon entering. I did not touch the sign, I promise.

Did you know that there exists on campus at least one mad scientist’s lair? That is to say, what appears to be a mad scientist’s lair (don’t want to blow anyone’s cover). There are probably many more besides, but Dr Jacob Rosen’s Bionics Lab looks like a set for an upcoming Ridley Scott film called The Scientist’s Lair or something. And I mean that in the best way possible.

Strewn about the lab is a smorgasbord of stuff: bits, pieces, bits of pieces, pieces of bits, and, of course, full and partial sets of exoskeletons, all scattered in a manner befitting the minds that combine medicine, physiology, neuroscience, mechanical engineering, electrical engineering, bioengineering, and whatever else I’m missing (which, admittedly, is a lot). (Incidentally, those seem to be the recurring dual themes of these lab visits: brilliant minds with broad specializations and me missing stuff.)

Check it out! I didn’t get a chance to ask what this thing was, but it sure does look neat.

When you first enter the space, it’s not graduate students wearing exoskeletons that jump out at you, but rather a big, giant plaster-like vertical dome thingy. That was the technical term they used, I believe. Further back in the lab you can see metal scaffolding in much the same shape; these both serve a similar purpose, but you could say that one is more…stucco in place than the other. More on that later.

There’s the dome next to the double doors that lead in and out of the space. Careful readers take note: directly between the camera and the dome is a set of about nine blue handles. These will be relevant shortly.

I was greeted next to the dome by Dr Ji Ma, who was busy making minor adjustments to the four projectors that work together to produce an image of up to 4K resolution. However, Dr Ma was more keen to show me his nifty invention close by: a set of wearable inertial measurement unit (IMU) sensors. As with the other equipment I was introduced to, these devices were designed to help in the physical rehabilitation of stroke patients.

Here’s Yang Shen, soon-to-be-Ph.D., wearing three of the sensors and demonstrating the quick, real-time virtual response elicited by his movements. The response was very responsive, in other words.

The wearables were wireless and still at the prototype stage; each contains accelerometers and gyroscopes to record the speed and range of the subject’s movement.

The guts of one of the wearables. Dr Ma told me that he plans to shrink each of these down to smaller than a coin. I was surprised at how much articulation was transmitted to the avatar: shoulder, elbow, and wrist movement all reacted to Yang’s movements in basically real-time.
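For the curious, here’s roughly how that kind of sensor fusion tends to work (a minimal sketch in Python, not the lab’s actual code): the gyroscope reacts instantly but drifts over time, the accelerometer is drift-free but noisy, and a simple complementary filter blends the two into a stable joint angle.

```python
import math

def complementary_filter(prev_angle, gyro_rate, accel_x, accel_y, dt, alpha=0.98):
    """Fuse one gyroscope axis with a gravity-based tilt estimate.

    prev_angle and the result are in degrees; gyro_rate is degrees/second.
    The blend factor alpha (an illustrative value) trusts the gyro in the
    short term and the accelerometer in the long term.
    """
    gyro_angle = prev_angle + gyro_rate * dt                   # fast but drifts
    accel_angle = math.degrees(math.atan2(accel_y, accel_x))   # noisy but drift-free
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

Run something like this once per sample for each axis of each sensor, and you get shoulder, elbow, and wrist angles driving the avatar in basically real time.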

Like the exoskeleton I’m about to talk about, the idea behind these devices is to aid in the rehabilitation of upper-body movement in patients who have suffered brain damage from a stroke. But given how small these sensors will end up being, and how quick and accurate the response time was, these devices could easily find their way into other applications in the XR world.

Just to whet your appetite a bit… a bionic hand!!

Astute readers have noticed that in the image of Yang demonstrating the IMU sensors there sits in the background what appears to be a brilliant blue exoskeleton. And so it is!

Look on this thing, ye Mighty, and despair! It is a Mirror Image Bilateral Training Exoskeleton. I haven’t heard back from Dr Rosen yet on my idea to rename it the Blue Bonecrusher. What about the Blue Beast, or just the Beast for short? I’m sure when this thing gains sentience it won’t mind being so informal.

Before I get into what is going to be the grossest of oversimplifications, I want to give special thanks to Yang Shen, a Ph.D. candidate who’s been with the lab for several years now, for being so patient with me and walking me through this incredible device. So, as I said, the device is used for Mirror Image Bilateral Training. A stroke often damages one hemisphere of the brain, and when it does, movement on one side of the body can become impaired. Based on the theory of neuroplasticity (meaning the brain can form new neurons and new neural pathways), this device is meant to strengthen the damaged hemisphere by allowing the healthy side to provide the proper and/or full range of movement.

With both of the patient’s arms strapped in, the operator selects the appropriate mode, allowing the patient’s healthy side to take the wheel, so to speak, while the impaired side follows along with precisely mirrored movements.
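In control terms (and this is a hedged sketch, not the lab’s code), the loop reads the healthy arm’s joint angles each cycle and commands the impaired arm to their mirror image; mirroring across the body’s midline means some joints flip sign. The joint ordering and sign pattern below are assumptions for illustration only.

```python
# Hypothetical 7-joint arm, matching the exoskeleton's seven degrees of freedom.
# Which joints flip sign when mirrored (e.g., shoulder abduction or rotation)
# is an assumption here; the real mapping depends on the joint conventions.
MIRROR_SIGNS = [-1, 1, -1, 1, -1, 1, -1]

def mirror_targets(healthy_angles):
    """Map the healthy arm's joint angles to mirrored targets for the impaired arm."""
    return [sign * angle for sign, angle in zip(MIRROR_SIGNS, healthy_angles)]
```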

Impaired movement as a result of a stroke stems solely from brain damage, not from damage to the muscles (e.g. muscular dystrophy) or nerves (e.g. a spinal injury). Because of this, it is believed that pairing the command thought to move your arm with the arm actually moving the way it’s meant to helps rehabilitate the damaged hemisphere.

The device also allows for pre-programmed movements (as set up by a physical trainer, for example), is height- and width-adjustable, and uses three harmonic drives and four Maxon DC motors to deliver a full seven degrees of freedom, but it only comes in one color. Priorities, Dr Rosen, priorities. Where’s my candy-red model?

Here’s Yang, demonstrating how a patient’s arm would strap into the machine. He told me to tell you that if you don’t think this is super-duper cool then he WILL crush you into bonedust. Just kidding. He can’t do that because the device isn’t exactly mobile. Yet…

Remember those nine blue handles I mentioned earlier? No? Well, you’re not a very close reader, are you? (Hint: caption, third image)

Pfft, you believe that guy? Anyway, those nine blue handles sit directly across from The Blue Bonecrusher. They’re part of the studies that the lab has been conducting with the exoskeleton, and they’ll also factor into how this all relates to VR, because as awesome as all of this is, it ain’t exactly the point of this, now is it? As you might have already guessed, the idea here is for patients to reach out, grab the handles, and turn. A healthy person will use up to six degrees of freedom to perform this task, even though a human arm comes with seven. This seventh degree is known as redundancy, not unlike my job in a few years’ time.

Stroke patients often do not lose all seven degrees of freedom. As such, although they may be capable of reaching out and grabbing the handle, or otherwise capable of limited movement, the danger lies in reinforcing bad habits. Enter, once again, The Blue Bonecrusher. Instead of allowing patients to unduly rely on (and thus reinforce), say, five degrees of freedom, the exoskeleton guides and assists patients through a healthy range of motion.
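If you want to see the six-versus-seven arithmetic in action, here’s a small numpy sketch (my illustration, not the lab’s math): a task that fixes the hand’s position and orientation constrains six values, so the arm’s 6×7 Jacobian leaves a one-dimensional null space, i.e. a joint motion (the elbow “swivel”) that doesn’t move the hand at all.

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((6, 7))   # stand-in for a 7-joint arm's 6x7 Jacobian

# The last right singular vector spans the null space of a full-rank 6x7 matrix.
_, _, Vt = np.linalg.svd(J)
swivel = Vt[-1]                   # joint velocities that leave the hand still

print(np.allclose(J @ swivel, 0))  # True: pure self-motion, the "redundant" degree
```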

Yang’s also published a study using The Blue Bonecrusher.

Yang’s study goes something like this (and brings us tantalizingly closer to the whole connection to XR). Let’s assume that a healthy range of movement involves an arm moving from point A to point D, passing points B and C along the way. Now let’s assume that a stroke patient has difficulty moving their arm from point A to B and from C to D, but has no difficulty moving from point B to C. Yang’s programmed the Beast to assist only during those portions of the range from A to D that the patient struggles with. Yang called it Asymmetric Bilateral Movement, or “Assist as Needed” movement.
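As a sketch of the idea (with made-up numbers, since I don’t have Yang’s actual parameters), you can picture the controller tracking how far along the A-to-D path the arm is and switching assistance on only inside the troublesome segments:

```python
# Progress runs from 0.0 (point A) to 1.0 (point D). The 0.3 and 0.7
# boundaries standing in for points B and C are illustrative assumptions.
ASSIST_SEGMENTS = [(0.0, 0.3), (0.7, 1.0)]   # assist A->B and C->D only

def assist_gain(progress, max_gain=1.0):
    """Assistance strength at the current point along the path."""
    for start, end in ASSIST_SEGMENTS:
        if start <= progress <= end:
            return max_gain
    return 0.0   # B->C: the patient moves unassisted
```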

Now, that’s impressive enough as it is. But the folks in the Bionics Lair (I’m just renaming everything now) like to be precise, and like so many scientists, they like to measure stuff. A lot.

Which brings us, finally, to XR!! Near the ceiling in this image you’ll see part of a set of 10 infrared motion capture cameras. Also note the dome scaffolding; that’s coming up in a bit, too.

In order to more accurately program the machine, the Bionics Lair uses these motion capture cameras to record instances of a healthy human reaching for the handles and turning them (or instances of a stroke patient using their healthy hemisphere to make the move). Then, frame by frame (which translates to something like millisecond by millisecond), the researchers can measure each joint’s position relative to the other joints, ultimately plugging all of that information into the Beast.
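The per-frame geometry is simple enough to sketch (again, my toy version, not the lab’s pipeline): with markers on the shoulder, elbow, and wrist, the elbow angle falls out of the angle between the upper-arm and forearm vectors.

```python
import numpy as np

def elbow_angle(shoulder, elbow, wrist):
    """Elbow flexion angle in degrees from three 3-D marker positions."""
    upper = np.asarray(shoulder, float) - np.asarray(elbow, float)   # elbow -> shoulder
    fore = np.asarray(wrist, float) - np.asarray(elbow, float)       # elbow -> wrist
    cos_a = upper @ fore / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# One call per captured frame:
print(elbow_angle([0, 0.3, 0], [0, 0, 0], [0.25, 0, 0]))   # 90.0
```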

But that’s not all! As part of Yang’s Assist as Needed study, he incorporated visual stimulation in a virtual reality environment. After all, there’s more to life than just reaching out, grabbing a random handle, and turning it. Yang provided patients with a variety of virtual objects to reach towards and interact with, tying a virtual ‘string’ between the patient’s palm and the object, with the string pulling on the patient’s arm to assist with the movement, you guessed it, as needed. Yang’s not bad at naming stuff, I’ll admit, but I think I’m better.
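The ‘string’ behaves a lot like a spring that can only pull, and you could imagine scaling its pull by the assist gain from the sketch above. Another hedged sketch, with illustrative constants:

```python
import numpy as np

def string_force(palm_pos, target_pos, gain, stiffness=20.0, slack=0.02):
    """Spring-like pull from the palm toward a virtual object.

    Like a real string it only pulls, never pushes, and it goes slack within
    a small radius of the target. Stiffness and slack values are made up.
    """
    offset = np.asarray(target_pos, float) - np.asarray(palm_pos, float)
    dist = np.linalg.norm(offset)
    if dist < slack:
        return np.zeros(3)                        # close enough: no pull
    return gain * stiffness * (dist - slack) * (offset / dist)
```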

Enter, once again, The Dome. The story has come full circle, we are back to the beginning! Or is it the end?

If you’re anything like me, you’ve sometimes gotten a bit queasy when trying on a VR headset. Now imagine you’re not just a recovering stroke patient, but more than likely a geriatric one who is about as familiar and comfortable around cutting-edge technology as I am with my mother-in-law. Having said that, I bet none of you will be very surprised when I say that many of these stroke patients were getting nauseous while using VR headsets.

In an effort to keep himself, his lab, his equipment, and ostensibly his students vomit-free, Dr Rosen designed this dome to serve as a replacement for VR headsets. The dome is placed directly in front of patients, giving them near-total immersion while still letting them see the ground beneath them, which helps them stay oriented and keeps their lunches from making a reappearance. The white dome is a fixed prototype; the scaffolding I mentioned and pictured earlier is meant to be more portable.

I call dibs on whatever Dr Rosen’s Bionics Lair cooks up next because it’s sure to be some kind of exoskeleton. And who doesn’t want an exoskeleton?

Internal skeletons are so passé. By the way, check out Yang’s website for more info about the work he’s doing: https://yangshen.blog/

I tried walking out with it but couldn’t do it. This will have to suffice.

For further reading that is far above my ability to comprehend, Yang has shared with me some of the published work that I mentioned above:

https://www.sciencedirect.com/science/article/pii/B978012811810800004X

Y. Shen, P. W. Ferguson, J. Ma and J. Rosen, “Chapter 4 – Upper Limb Wearable Exoskeleton Systems for Rehabilitation: State of the Art Review and a Case Study of the EXO-UL8—Dual-Arm Exoskeleton System,” Wearable Technology in Medicine and Health Care (R. K.-Y. Tong, ed.), Academic Press, 2018, pp. 71-90.

https://ieeexplore.ieee.org/abstract/document/8512665

Y. Shen, J. Ma, B. Dobkin and J. Rosen, “Asymmetric Dual Arm Approach For Post Stroke Recovery Of Motor Functions Utilizing The EXO-UL8 Exoskeleton System: A Pilot Study,” 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, 2018, pp. 1701-1707.
doi: 10.1109/EMBC.2018.8512665

https://ieeexplore.ieee.org/abstract/document/8246894

Y. Shen, B. P. Hsiao, J. Ma and J. Rosen, “Upper limb redundancy resolution under gravitational loading conditions: Arm postural stability index based on dynamic manipulability analysis,” 2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids), Birmingham, 2017, pp. 332-338.
doi: 10.1109/HUMANOIDS.2017.8246894

https://link.springer.com/chapter/10.1007/978-3-030-01887-0_53

Ferguson P.W., Dimapasoc B., Shen Y., Rosen J. (2019) Design of a Hand Exoskeleton for Use with Upper Limb Exoskeletons. In: Carrozza M., Micera S., Pons J. (eds) Wearable Robotics: Challenges and Trends. WeRob 2018. Biosystems & Biorobotics, vol 22. Springer, Cham.
doi: 10.1007/978-3-030-01887-0_53


The Laboratory of Neuromodulation & Neuroimaging: Your Brain’s Never Looked so Good

At some point I’m going to start sounding like a broken record with how often I say how amazing each of these visits is. There are only so many superlatives that can be thrown around before they start to lose their meaning. Before that happens, however, we should reserve some for Dr Nanthia Suthana’s lab, because it was an amazing tour with a remarkably generous host and obliging postdocs, each ready to engage with visitors to discuss their work and their equipment. (To say that they were incredibly knowledgeable about everything would be to state the obvious.)

One of Dr Suthana’s research labs.

Visitors are immediately struck by the large, open space, surrounded by 24 infrared base stations placed throughout the room. These sensors are integrated with all of the VR equipment in the space and are used not only to set virtual boundaries but also to track markers. They are capable of sub-millimeter motion tracking, so don’t move. Or do move, depending on what they need.

The light grey balls on top of the headset are the kind of markers that the infrared base stations track. The markers reflect the infrared signals back to the base stations, which use them to follow the subject’s movement. There are also entire motion tracking suits in the space to allow for even more complete motion capture.
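For the curious, recovering a marker’s 3-D position from many base stations boils down to intersecting rays: each station sees the reflected blob along a line of sight, and the marker sits at the point closest to all of those lines. A minimal least-squares sketch (my illustration of the general technique, not OptiTrack’s software):

```python
import numpy as np

def triangulate(origins, directions):
    """Find the point minimizing squared distance to every camera ray.

    Each ray starts at a base station's origin and points along a unit
    direction toward the marker it sees.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, float) / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ np.asarray(o, float)
    return np.linalg.solve(A, b)
```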

The entire space boasts an impressive amount of hardware. Some of it includes:

  • 24 OptiTrack base stations
  • A Magic Leap AR headset
  • Microsoft HoloLens
  • HTC Vive, complete with a Tobii eye-tracking attachment
  • Samsung VR headset with SMI eye tracking
  • Motion capture suits of various sizes
  • Artec Eva scanner and automatic rotating platform
  • BioPac state-of-the-art biometric measurement system
  • ANT Neuro eego sport 64-channel mobile EEG system

They also have this eye tracking headset from Pupil Labs, made to look even goofier because the goof in the picture doesn’t know why WordPress is flipping his photo and can’t seem to figure out how to rotate the damn thing. (For the record, I’ve since figured it out, but I’m leaving this as is because otherwise I wouldn’t be able to call myself a goof.)

So what do they do with all this stuff? Well, a whole hell of a lot, as it turns out. For starters, the Vive headset station is a great example of cross-campus collaboration. The environments created for the Vive are for educational purposes. Yeah, you guessed it: they’re teaching folks about the brain.

This particular program was designed by Dr Rebecca Allen, a professor of 3D media art whose work centers on creative expression in virtual and augmented reality. In this screenshot you can see that I’ve already taken apart different pieces of the brain, each part named as though scientists picked random words from a Latin dictionary.

Here I have detached yet another chunk from the brain. The model was quite large, and I was able to virtually place my head in the middle of it, using the controllers to highlight different sections. They claimed that by this time the person would have died from all the missing parts of their brain, but I’m not too sure.

The Magic Leap was probably my favorite experience. I’d never really done AR before, and this was probably one of the best environments in which to do it. I mean, I don’t know about you, but I don’t have 24 OptiTrack base stations at home. The 18 that I have just don’t seem to cut it.

Here I am, wearing the Magic Leap, deftly and bravely defending everyone from attacking robots while they just chit-chat in blissful ignorance of the imminent danger all around them. Let me just say that if you’re an alien robot climbing out of the walls you better watch your ass – I’m a crack shot.

Beyond the various AR environments that I got to experience with the Magic Leap, the most impressive thing about the lab is the research they are conducting.

Patients with epilepsy are sometimes given an implant that resides within the skull, on the brain, with wires that go into the brain itself. Now, I don’t know if you know this, but I’m not a neuroscientist – I don’t even know if I spelled that correctly. But from what I could gather, these wires are meant to stimulate the brain in such a way as to alleviate epileptic episodes.

That’s all well and good, but those implants were traditionally just one-way streets. Nowadays, many implants are of a new, recently FDA-approved variety. These implants perform the same functions as the aforementioned ones (that is, if I got that right in the first place), but are a two-way street: they can be connected to from the outside, and they can send data about what’s going on inside the brain. Again, not a neuroscientist, but basically these wires send electric signals into the brain and can also read and send out information about what’s going on deep inside it.

The idea is that by reading these epilepsy implants while subjects engage in various VR environments, Dr Suthana and her researchers can monitor brain activity while a person is up and about and engaged, instead of lying in a coffin getting an MRI scan. Here’s Cory Inman, a postdoc in the lab, wearing the AR headset to shoot robots or pet penguins.

Some of the environments developed for this implant research revolve around memory. I, of course, passed the test with flying colors but I don’t remember what my score was.

Subjects would wear special caps that could read the information being sent by the implant’s wires deep in the brain. The wires sat in the hippocampus or hippopotamus or something, to study memory as subjects engaged in an AR environment designed to test it.

Special thanks to Cory Inman, postdoc researcher; Diane Villaroman, the resident programmer analyst; and Sonja Hiller, research assistant and lab manager.

Definitely some interesting research going on here, and if you can think of some kind of partnership or research project, don’t email me, email Dr Suthana.


Game Lab Showcase: A Visceral VR Venue

The Game Lab is located in the Broad Art Center, Room 3252. Inside are several computer stations with VR headsets: three Oculus headsets and two Vive headsets. Each station offered two to three VR experiences, chosen to inspire students by showing them the breadth of experience that is possible with VR. And the breadth was stunning.

Image courtesy: https://www.zerodaysvr.com/

Part of the trouble with a showcase like this is that the experiences themselves are so immersive, so gripping, that you spend half your time enthralled by just one. Which is exactly what happened to me. Zero Days VR was an artistic, immersive documentary experience. The environment was designed and built around an existing feature-length documentary, unsurprisingly named Zero Days. The creators layered the audio on top of an engaging, futuristic technolandscape where the viewer gently glided through walls of circuitry that pulsed and reacted to the words being spoken. Occasionally large panels would appear with clips from the documentary itself, but mostly the viewer spent their time absorbing the landscape, watching it react to the narration and audio of the documentary. Truly engaging, and an exciting development for the future of documentary filmmaking.

Image courtesy: http://notesonblindness.arte.tv/en/vr

Notes on Blindness is a British documentary from 2016 based on the audio tapes of John Hull, a writer and theologian who gradually went blind and wrote a book about blindness.

Image courtesy: https://www.youtube.com/watch?v=W2eTgbyiY_0

This was the second experience that I had the privilege of being immersed in. John Hull narrates the expanding aural atmosphere, beginning on a park bench and detailing, one by one, the various sounds that bring his world to life. Children laughing, people walking by, birds taking off in flight, joggers passing: all of these appear as ephemeral blue outlines as Hull’s voice, with its warm and crackling analog texture, illuminates the experience. It was simultaneously calming and exhilarating, and definitely unlike anything I’d experienced before. And certainly something that could only be appreciated in a VR environment.

Unfortunately (or, rather, fortunately), these two experiences captivated me throughout my time at the Showcase. Although I regret not being able to try any of the others, the two that I was able to engage with make me confident in saying that the rest of the list below must be equally exciting to experience.

Zeynep Abes, curator of this Showcase, has an incredible understanding of the multitudinous possibilities when it comes to VR experiences. Below is a full list of experiences that were on display (in no particular order):

Spheres
Dear Angelica
Gymnasia
Land Grab
A Short History of the Gaze
Museum of Symmetry
Melody of Dust
Chocolate
Davina
