New Interfaces in Radiology

I’ve said before that I consider myself a generalist. I don’t limit my design ideas to games. So, today I would like to talk about Radiology.

During my last semester at Carnegie Mellon I was fortunate enough to get my hands on a zSpace. I ended up attending zCon 2013 and seeing all the cool things people are doing with the zSpace technology. One company, EchoPixel, is using it to create an interface for viewing 3D representations of radiological scans such as CTs. This may surprise you, but this is an area I happen to know a great deal about.

Now don’t get me wrong, I like zSpace, and if you are reading this David (or anyone else from zSpace) please don’t take offense, but I’m not convinced it is the way to go with this technology. I’ve read the articles, like “Virtual Holography, The Next Step in Radiology Imaging?” and “A tipping point for visualization-driven knowledge.” I’m just not convinced this interface is going to be the right fit, or provide the usability that doctors need, for the industry. Neither are the doctors I’ve talked to; it will take a lot of peer-reviewed papers before this tech gets adopted. The papers aside, there are a few issues I see right off the bat:

  1. As far as I know, zSpace is not an FDA-approved monitor. FDA-approved monitors cost a lot of money (around $10,000 a pop), and I don’t see a clear path for converting the zSpace display into one.
  2. Using that wand all the time is going to be a problem, since doctors need that hand reasonably free to dictate cases while they are reading them.
  3. This is the big one: you can only wear those glasses for so long. I found that after about two hours of use I was done for the day. Your eyes get tired, and it can cause headaches.

I propose another, perhaps cheaper and simpler, way to use software like what EchoPixel has created (which I do think is awesome), but with less invasive and less labor-intensive hardware: couple an FDA-approved monitor with a SoftKinetic camera.

The SoftKinetic is a really awesome depth camera, similar to the Kinect, and its gesture recognition is as good as, if not better than, the Leap Motion’s. One could track the position of the head to create the 3D perception, and then use gestural control to manipulate the position of the 3D image being observed.
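The head-tracking half of that idea is usually implemented as an off-axis (asymmetric) frustum: the projection is skewed based on where the viewer’s eye sits relative to the screen, so the image behaves like a window into a 3D scene. Here is a minimal sketch of that math in Python; the function name and units are my own illustration, not anyone’s actual API.

```python
# Sketch of head-coupled perspective ("fishtank VR"), the effect behind
# glasses-free 3D demos like Johnny Lee's Wiimote head tracking.
# All names and units here are illustrative assumptions.

def off_axis_frustum(head, screen_w, screen_h, near):
    """Compute asymmetric frustum bounds for a viewer at `head`.

    head     -- (x, y, z) of the eye relative to the screen centre, in the
                same units as screen_w/screen_h; z > 0 is distance to screen.
    returns  -- (left, right, bottom, top) at the near plane, suitable for
                an OpenGL-style glFrustum call.
    """
    hx, hy, hz = head
    scale = near / hz                       # project screen edges onto the near plane
    left   = (-screen_w / 2 - hx) * scale
    right  = ( screen_w / 2 - hx) * scale
    bottom = (-screen_h / 2 - hy) * scale
    top    = ( screen_h / 2 - hy) * scale
    return left, right, bottom, top
```

Each frame, the depth camera supplies a fresh head position, the frustum is recomputed, and the scene (say, a volume-rendered CT) is redrawn. As the viewer leans, the frustum shifts in the opposite direction, which is exactly what makes the image read as 3D without glasses.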

Don’t believe that good perceptual 3D can be achieved without stereo glasses or a 3D screen? Well, I know it can, but don’t take my word for it: you can start by looking at the work of Johnny Lee. I also worked on a project with Brad Buchanan at Carnegie Mellon where we made this very technology work with a Kinect. That was before the Kinect had near mode, and the SoftKinetic has better technology on board than the Kinect.

This is obviously just a concept, not a perfect solution yet; there are problems to be solved. A big one: how to make sure the head tracking stays with the primary physician if someone else drops by. But it seems like a more user-friendly solution to me. I envision a future where doctors lean back in front of their screens casually waving their hands at 3D representations of breast MRs, no glasses required.
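For the who-are-we-tracking problem, one plausible approach is a simple lock: once the primary reader’s head is selected, keep following whichever detected head is closest to its last known position, and ignore anything that appears too far away. This is only a sketch of that idea, with made-up names and a threshold I picked for illustration:

```python
# Hypothetical sketch: keep head tracking locked to the primary physician.
# Among all heads the depth camera reports this frame, follow the one nearest
# the previously tracked position; a bystander entering the frame appears far
# from that position and is ignored.

import math

def select_tracked_head(detections, last_position, max_jump=0.3):
    """detections    -- list of (x, y, z) head positions, in metres
    last_position -- previously tracked head position
    max_jump      -- largest plausible frame-to-frame head movement (metres)
    Returns the nearest detection, or None if every head is too far away."""
    best, best_dist = None, max_jump
    for head in detections:
        d = math.dist(head, last_position)
        if d < best_dist:
            best, best_dist = head, d
    return best
```

If the function returns None for a while (the reader stepped away), the system could freeze the view and wait for a head to reappear near the last position rather than jumping to a visitor.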