Following legislation making provision for vision screening for regular DSE (display screen equipment) users, this research compared six vision screening systems with each other and with a full optometric eye examination. The first two were computer-based screeners; the remainder were stand-alone units. The aim was to compare the measures of visual function obtained from each screener with each other and with the optometric data, with particular reference to visual abilities relevant to work with display screen equipment. Regular users of such equipment were tested on the screeners, and a sample of subjects was retested on the systems later to measure reliability. For the software systems, data were also obtained at the users' workstations. The research found: (1) no statistically significant difference between the overall pass rates of the six vision screeners; (2) reliability analysis showed that two of the screeners produced statistically similar results on test/retest; (3) for the two software systems, the workplace/laboratory comparison showed a significant association between the overall results obtained in the laboratory and those obtained at a user's normal workplace; and (4) the overall pass results obtained by the optometrist were significantly correlated with the overall pass rates of three of the six systems. The sensitivity and specificity scores of the different screeners varied widely. Perhaps the most important result, however, was the high percentage of users who failed the optometric examination but passed according to some of the vision screeners; the implications of this high 'false positive' rate produced by some vision screeners are discussed.
Number of pages: 7
Publication status: Published - 1 Apr 1997
- DSE users
- Vision screening