We investigate a measure of ``dominant perceived orientation'' recently developed to match the output of a human study involving 40 subjects. The results of this measure are compared with human judgments on seven ``teaser'' images to test its effectiveness at finding perceptually dominant orientations. Low-level orientation is then applied to a ``quick search'' problem important in image database applications. Since both pigeons and humans can coarsely classify certain kinds of scenes, e.g., city versus country, without taking the time or brain-power to solve the full image understanding problem, we conjecture that the collective behavior of low-level textural features such as orientation may be doing most of the work. We demonstrate a simple test of global multiscale orientation for quickly searching a database of vacation photos for likely ``city/suburb'' shots. The orientation features agree with human classification on 91 of the 98 scenes.
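To make the idea concrete, the global multiscale orientation cue could be sketched roughly as follows. This is a hedged illustration only, not the paper's implementation: the function names, bin count, subsampling pyramid, and the dominance statistic are all assumptions made for the sketch. The intuition is that city/suburb scenes concentrate gradient energy near horizontal and vertical orientations, while natural scenes spread it more evenly.

```python
# Hedged sketch (NOT the paper's exact method): a global multiscale
# orientation histogram as a cheap "city vs. country" cue.
import numpy as np

def orientation_histogram(image, scales=(1, 2, 4), bins=18):
    """Gradient-magnitude-weighted orientation histogram pooled over scales."""
    hist = np.zeros(bins)
    img = image.astype(float)
    for s in scales:
        # Crude multiscale: subsample by factor s (a Gaussian pyramid
        # would be the more careful choice; this is an assumption).
        sub = img[::s, ::s]
        gy, gx = np.gradient(sub)
        mag = np.hypot(gx, gy)
        # Fold orientation into [0, pi): edge direction sign is irrelevant.
        ang = np.mod(np.arctan2(gy, gx), np.pi)
        idx = np.minimum((ang / np.pi * bins).astype(int), bins - 1)
        np.add.at(hist, idx, mag)   # accumulate magnitude-weighted votes
    return hist / (hist.sum() + 1e-12)

def dominance(hist):
    """Fraction of edge energy in the strongest orientation bin plus its
    perpendicular bin -- a toy stand-in for dominant perceived orientation."""
    bins = len(hist)
    k = int(np.argmax(hist))
    return hist[k] + hist[(k + bins // 2) % bins]

# Synthetic demo: a grid-like "city" image vs. random "nature" noise.
rng = np.random.default_rng(0)
grid = np.zeros((128, 128))
grid[::8, :] = 1.0   # horizontal lines
grid[:, ::8] = 1.0   # vertical lines
noise = rng.random((128, 128))
print(dominance(orientation_histogram(grid)) >
      dominance(orientation_histogram(noise)))
```

A thresholded version of this dominance score is the kind of low-level, global statistic that could support a fast ``city/suburb'' pre-filter before any full image understanding is attempted.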