It is well known that cats' eyes, with their vertical-slit pupils, are optimized for detecting movement. Many species, not only cats, have eyes tuned to motion, and tend to neglect the stationary objects in their view.
This should not come as a surprise: it's roughly equivalent to people living near a slaughterhouse, who within days "learn" to ignore the awful smell; or people who live on a streetcar line, who "learn" to ignore the awful screech of steel scraping on steel; or, to go further, people who inhabit a neighborhood populated by hookers, pimps, addicts and so on. In each case we learn to ignore the noise in favor of the signal, which here means "something unusual." I won't venture what might be deemed unusual in these various avenues.
Back to cats' eyes: those vertical-slit pupils make them able to detect the slightest movement along the horizontal plane, e.g. a mouse slinking down a baseboard. Cats are optimized for this behavior. Dogs, on the other hand, are optimized for different behaviors, as are whales, minnows, wolves, bears, etc.
I'm not sure what to call the idea, but it is this: transplanting human intelligence, if only for an hour or so, into the mind of another species. It is a profound and intriguing question. The mind of a dog or cat, for example, is organized first and foremost around the sense of smell, and only secondarily around the eyes.
I have been owned by about a dozen cats and a few dogs in my life, and have stories to tell about their behavior. Dogs are easier to explain; cats are far more difficult and complex, and dare I say it, profound. They have behaviors that are not at all easy to explain.
What I do know, strictly from my own experiments, supplemented by various Wikipedia entries, is that cats' eyes work very differently from humans' eyes. The former are tuned to movement, and work in close conjunction with the sense of smell. By and large, humans do not work this way: we trust first and foremost our eyes, except when our olfactory senses override what is visible (e.g. I'm walking past a grow-op and I know that smell even though the door is closed). But 99% of the time it's the visual input that gives it away. Second is scent. Third is sound. And finally, touch. That's the hierarchical order of the senses. I can't prove it with 10K subjects, but I know it's true.
If we're ever going to implement tele-AI, all of this needs to be considered. To become a virtual cat means minimizing the optical intake and maximizing the olfactory intake, so that our perception of the world around us is dictated by scent, not by vision.
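One toy way to make this re-weighting concrete is to model a perceiver as a normalized vector of sensory weights and swap profiles. This is purely a sketch; the numbers below are invented for illustration, not measured data.

```python
# Toy sensory-weighting sketch. All weights are hypothetical,
# chosen only to reflect the ordering discussed above.

def normalize(weights):
    """Scale a dict of sense -> weight so the values sum to 1.0."""
    total = sum(weights.values())
    return {sense: w / total for sense, w in weights.items()}

# Hypothetical profiles: humans lead with vision; a "virtual cat"
# would lead with smell, with vision demoted to a supporting role.
HUMAN = normalize({"vision": 10, "smell": 4, "hearing": 2, "touch": 1})
VIRTUAL_CAT = normalize({"smell": 10, "vision": 5, "hearing": 4, "touch": 1})

def dominant_sense(profile):
    """Return the sense carrying the largest share of perception."""
    return max(profile, key=profile.get)
```

Becoming the virtual cat then amounts to swapping which profile governs perception: `dominant_sense(HUMAN)` yields `"vision"`, while `dominant_sense(VIRTUAL_CAT)` yields `"smell"`.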
This is the problem! How to impersonate a shark, a whale, a skunk, a beaver, a bear.... How to model each species' optics, and the weight that visual input carries within the overall mix of sensory inputs.
That's going to be difficult, but I seriously want to explore these avenues.