An algorithm for detecting orientation in textures is developed and compared with human performance on the same textures. The algorithm is based on the steerable filters of Freeman and Adelson (1991), orientation-selective filters derived from derivatives of Gaussians. The filters are applied over multiple scales, and their outputs are nonlinearly contrast-normalized. The human data were collected from forty subjects, each asked to identify "the minimum number of dominant orientations" they perceived and the "strength" with which they perceived each orientation. The test data consisted of 111 gray-level images of natural textures taken from the Brodatz album, a standard collection used in computer vision and image processing. Comparison of the computer and human results indicates that they agreed on at least one dominant orientation for 95 of the natural textures. Of these textures, 74 showed complete agreement on the locations of all dominant orientations chosen by both human and computer. Individual cases of disagreement are analyzed, and possible causes are discussed. Some apparent limitations of the current filter shapes and sizes are illustrated, as well as some effects possibly due to semantic influences and gestalt grouping.
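The core idea of steered derivative-of-Gaussian filtering can be illustrated with a minimal sketch. This is not the authors' full pipeline (it omits the multiple scales and the nonlinear contrast normalization described above); it only shows how a first-derivative-of-Gaussian basis is steered to find a single dominant orientation in a synthetic grating. The filter size (`sigma`) and angular sampling are illustrative choices, not parameters from the paper.

```python
import numpy as np
from scipy import ndimage

# Synthetic texture: a sinusoidal grating whose gradient points at 30 degrees.
theta_true = np.deg2rad(30.0)
y, x = np.mgrid[0:128, 0:128]
img = np.sin(2 * np.pi * 0.1 * (x * np.cos(theta_true) + y * np.sin(theta_true)))

# First-derivative-of-Gaussian basis responses (illustrative sigma).
Ix = ndimage.gaussian_filter(img, sigma=2.0, order=(0, 1))  # d/dx
Iy = ndimage.gaussian_filter(img, sigma=2.0, order=(1, 0))  # d/dy

# Steering property: the response of the filter oriented at angle t is
# cos(t) * Ix + sin(t) * Iy. Orientation energy sums the squared steered
# response over the image; the dominant orientation is its peak.
angles = np.deg2rad(np.arange(0.0, 180.0))
energy = np.array([((np.cos(t) * Ix + np.sin(t) * Iy) ** 2).sum() for t in angles])

dominant = np.rad2deg(angles[np.argmax(energy)])
print(dominant)
```

Because the grating varies along the 30-degree direction, the orientation-energy curve peaks near 30 degrees; a full multi-orientation analysis would instead look for all significant local maxima of this curve.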