It seems product research is much like any other research: the deeper you go, the more questions you come up with instead of a clear-cut answer. So far, we have found tabletop systems, modular systems that scale from tabletop use to working with fMRI machines (with their gigantic magnets that might humble other camera setups), and head-mounted systems that can accommodate viewers looking close up as well as far away, but no single system that does all three. The fMRI-compatible systems seem to be the most expensive, and that functionality is definitely the one we need least.

Eye-tracking on the sphere, or during natural exhibit use, seems to be the biggest stumbling block for the technology so far. The tabletop systems are designed to allow left-to-right head movement, and maybe a bit of front-to-back, but not variable depth, such as when a visitor might be looking at the far wall of the 6-foot-wide wave tank vs. the wall she is leaning on. Plus, they don't follow the user as she moves around the sphere. The glasses-mounted systems can go with the user to a point, but not all can do so without external tracking points to pre-define an area of interest. One head-mounted system that looks promising for our purposes does video the scene the user is looking at as he moves. I haven't yet figured out whether it would work well on a close-up screen, where we could track a user's kiosk navigation, for example. Another big open question is just how long visitors will agree to wear these contraptions.

The other questions I am really pressing on are warranty and tech support. I have been stuck before with fancy-schmancy shiny new technology that had a) relatively little thought behind the actual user experience, at least for our applications; b) tech support so new that they could hardly figure out what you were trying to do with the product; or (worst) c) both problems. The good news with the eye-tracking systems is that most seem to come from companies that have been around for a while and may even have gone through several rounds of increasingly better products. The bad news is that, since we may be going out on a limb with some of our intended uses, I might end up where I was before, with people who don't understand what I'm trying to do. This is the curse of trying to do something different and new with the equipment, rather than just applying it to the same use in another subject, I suppose. However, I have seen updated versions of these products down the road, so I guess my struggles have not been for naught, at least for the users who come after me. Maybe I'm just destined to be an alpha-tester.

One big help has come, as usual, from the museum community, fellow innovators that you all are. A post to the ASTC list yielded a couple of leads on how these products are being used, which products were chosen and why, and, most importantly, customer satisfaction with the products and the tech support. I myself have been called upon to advise on the use of the particular technologies I described above, and I definitely gave it to people straight.

One of the first things to do with a new grant is to buy equipment, yay! That means a bit of research on the available products, even when you're looking for something as seemingly specialized as eye trackers. So this is the story of that process as we try to decide what to buy.

I got a head start when I was provided with a whole list of files compiled in Evernote. That meant I had to get up to speed on how to use Evernote, but that's part of this process: learning to use new collaboration tools as we go. Speaking of which, before we got too far into the process I set up a Dropbox folder for online file storage and sharing, and a Google Docs spreadsheet to track the information I got from each manufacturer.

The spreadsheet is pretty bare to start (just company, cost, and an "other features" category), but here again I got some direction to build on. We made a connection with a professor at the University of Oregon who has been studying these systems and even designing some cool uses for them, such as creating drawings and computerized music with the eyes alone. I digress, but Dr. Hornof has done some background work compiling documentation on a couple of the commercial systems. He gave us a couple of clues about specs for commercial systems: they're often limited by the size of the virtual "head box," and the software that comes with them may be limited in capability, so that gives us two more categories for the spreadsheet! Dr. Hornof has also invited us down to his lab at the U of O, so we'll head down in a couple of weeks and check that out.
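For anyone keeping a similar comparison as a file rather than a Google Doc, here is a minimal sketch of that tracking spreadsheet as a CSV. The column names follow the categories mentioned above; the vendor row is purely an illustrative placeholder, not real product data.

```python
import csv
import io

# Columns drawn from the categories discussed above; names are my own shorthand.
COLUMNS = ["company", "cost", "other_features", "head_box_size", "software_capability"]

def make_comparison_csv(rows):
    """Render a list of per-vendor dicts as CSV text, one line per vendor."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

# Hypothetical entry just to show the shape of a row.
example = make_comparison_csv([
    {"company": "Vendor A", "cost": "TBD",
     "other_features": "head-mounted, records scene video",
     "head_box_size": "n/a (worn)", "software_capability": "unknown"},
])
```

New categories can be added later by extending `COLUMNS`, which is the same kind of incremental growth the real spreadsheet went through.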