How to Read and Interpret a Visual Field Test

A visual field test measures the entire area seen by the eye, encompassing both central and peripheral vision, and maps how sensitive that vision is at each location across the field. The primary purpose of the test is to detect and monitor areas of vision loss that the patient may not notice in the early stages. Such defects are often associated with progressive conditions like glaucoma or with underlying neurological issues that affect the visual pathway.

Overview of the Visual Field Test Report

The visual field printout, often produced by a device like the Humphrey Field Analyzer, is a standardized, multi-part document. The top section includes patient data and test parameters, such as the testing strategy used and the size of the light stimulus. This is followed by reliability metrics, which indicate how consistently the patient responded during the examination.

The central part of the report features graphical displays that visually represent the measured sensitivity across the field. The bottom portion contains a statistical summary, presenting numerical indices that quantify the severity and pattern of vision loss. Sensitivity is measured in decibels (dB); a higher decibel value signifies greater visual sensitivity.

Evaluating the Reliability of the Test

Before interpreting any visual field data, check the reliability metrics: an unreliable test cannot be trusted for diagnosis or monitoring. These metrics gauge the patient’s focus and consistency throughout the examination. The report typically lists three main reliability indices, each expressed as a percentage of errors.

Fixation Losses track how steadily the eye maintained its gaze on the central target. The machine periodically presents a light stimulus directly into the physiological blind spot; if the patient responds, the eye must have drifted away from the target. Fixation losses exceeding 20% may compromise the test results.

False Positives occur when the patient presses the response button even when no light stimulus was presented. A high false positive rate suggests a “trigger-happy” patient who is anticipating the light, which falsely elevates the measured visual sensitivity. A percentage above 33% is considered a sign of poor reliability, indicating the patient is likely over-reporting their ability to see.

False Negatives are recorded when the patient fails to respond to a light that is bright enough to have been seen based on earlier testing. While a high false negative rate can indicate a lack of attention or fatigue, it can also increase in eyes with advanced vision loss due to the inherent variability of a severely damaged visual field. For mild disease, a false negative rate exceeding 33% is generally unacceptable, suggesting the patient was not fully engaged during the test.
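
For readers who want the logic spelled out, the sketch below applies the rule-of-thumb cutoffs quoted above (20% fixation losses, 33% false positives, 33% false negatives). The function name and inputs are illustrative rather than part of any device’s software, and individual clinics or test strategies may apply stricter limits.

```python
def check_reliability(fixation_losses_pct, false_pos_pct, false_neg_pct):
    """Flag reliability concerns using the rule-of-thumb cutoffs above.

    Percentages range from 0 to 100. The thresholds are illustrative;
    clinics and test strategies may use stricter or looser limits.
    """
    concerns = []
    if fixation_losses_pct > 20:
        concerns.append("fixation losses > 20%: gaze may have wandered")
    if false_pos_pct > 33:
        concerns.append("false positives > 33%: 'trigger-happy' responses")
    if false_neg_pct > 33:
        concerns.append("false negatives > 33%: inattention or fatigue")
    return concerns  # an empty list means no reliability flags were raised


# Example: a test with 25% fixation losses would be flagged.
print(check_reliability(25, 5, 10))
```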

Interpreting the Graphical Maps

The visual field report provides several graphical maps that translate the raw decibel values into visual representations of your vision. Understanding the role of each map is fundamental to distinguishing between a generalized vision problem and a localized disease process.

The Grayscale Map offers the most intuitive overview of the results, using different shades of gray to represent sensitivity. Darker areas correspond to locations where the measured sensitivity is lowest, suggesting greater vision loss. Lighter or white areas signify normal visual sensitivity, making this map useful for patient education. However, this map is not used for detailed clinical interpretation because the shading can be misleading regarding the statistical significance of the defect.

The Total Deviation Plot compares the patient’s measured sensitivity at every tested point against the expected sensitivity of a healthy individual of the same age. Each point is represented by a number indicating the difference in decibels from the age-matched norm, with negative numbers signifying vision loss. Because it does not correct for overall dimming, this plot reveals generalized vision loss, such as the diffuse dimming caused by a media opacity like a cataract.
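
As a rough illustration of what the Total Deviation numbers represent, the snippet below subtracts hypothetical age-matched normal values from hypothetical measured sensitivities at a handful of points; a real report does this for every location on a fixed test grid.

```python
# Hypothetical sensitivities (dB) at a few test points, alongside
# age-matched normal values; a real report covers 50+ locations.
measured_db = {"point_1": 28, "point_2": 14, "point_3": 30}
age_norm_db = {"point_1": 30, "point_2": 29, "point_3": 31}

# Total deviation: measured minus expected; negative values mean the
# point is less sensitive than an average healthy eye of the same age.
total_deviation = {p: measured_db[p] - age_norm_db[p] for p in measured_db}
print(total_deviation)  # {'point_1': -2, 'point_2': -15, 'point_3': -1}
```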

The Pattern Deviation Plot is used for identifying specific disease patterns. This plot is generated by statistically adjusting the Total Deviation data to remove any generalized depression across the entire visual field. By filtering out non-specific factors like cataracts, the Pattern Deviation Plot isolates and highlights localized, focal defects. If a defect appears on the Total Deviation map but disappears on the Pattern Deviation map, the loss is likely generalized; if it persists, it suggests a localized disease process, such as glaucoma.
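
The sketch below shows the idea behind the Pattern Deviation adjustment: estimate the “general height” of the field from one of its better points and subtract it, so only focal loss remains. The percentile used here is an approximation of published descriptions of the Humphrey approach (roughly the 85th percentile of the total-deviation values); the instrument’s actual procedure is more involved.

```python
# Simplified pattern deviation: remove the generalized depression so that
# localized defects stand out. Values are hypothetical total deviations (dB).
total_deviation = {"p1": -4, "p2": -17, "p3": -3, "p4": -4, "p5": -16, "p6": -3}

ranked = sorted(total_deviation.values(), reverse=True)  # best points first
general_height = ranked[int(0.15 * (len(ranked) - 1))]   # ~85th percentile point

pattern_deviation = {pt: d - general_height for pt, d in total_deviation.items()}
print(pattern_deviation)  # diffuse -3 to -4 dB loss vanishes; focal points remain
```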

Decoding the Statistical Indices

Beyond the visual maps, the lower section of the report provides numerical summaries, known as global indices, which condense the entire test into a few quantifiable metrics. These indices are used to stage the disease severity and track changes over time.

The Mean Deviation (MD) is the most common index, representing the average deviation of the patient’s field sensitivity from the age-matched normal population. It is expressed in decibels, and a negative MD value indicates an overall loss of visual sensitivity. A large negative number suggests significant, generalized vision loss, which may be caused by diffuse damage or media opacity.
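
As a back-of-the-envelope illustration, MD can be thought of as the average of the total-deviation values. The printed MD additionally weights each location by its normal variance, so the unweighted mean below is only an approximation using made-up numbers.

```python
# Hypothetical total-deviation values (dB) across the tested points.
total_deviation = [-2, -15, -1, -2, -14, -1, -3, -2]

# Unweighted approximation of Mean Deviation: the average departure from
# the age-matched norm. The instrument applies per-location weighting.
md = sum(total_deviation) / len(total_deviation)
print(f"MD is roughly {md:.1f} dB")  # negative: overall sensitivity is reduced
```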

The Pattern Standard Deviation (PSD) measures the degree of irregularity or localized variation within the visual field. This index helps identify focal damage. A high PSD value suggests that some areas of the visual field are significantly worse than others, indicating a patchy or localized defect, a common hallmark of glaucoma. Conversely, a low PSD suggests a relatively uniform field, whether uniformly normal or uniformly depressed by a generalized condition like a cataract.
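
To see why PSD separates patchy loss from uniform loss, compare the spread of deviation values in the two hypothetical fields below. Treating PSD as a plain standard deviation is a simplification; the printed index is variance-weighted per location.

```python
from statistics import pstdev

# Two hypothetical sets of deviation values (dB): one uniformly depressed
# (cataract-like), one with deep focal defects amid near-normal points.
uniform_loss = [-6, -6, -5, -6, -7, -6]
focal_loss = [-1, -15, 0, -14, -1, 0]

# Approximate PSD as the spread of the deviations: low for uniform fields,
# high when some points are much worse than others.
print(f"uniform field spread: {pstdev(uniform_loss):.1f} dB")  # low
print(f"focal defect spread:  {pstdev(focal_loss):.1f} dB")    # high
```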

The report also includes Probability Values, often displayed as shaded boxes or symbols (like X or XX) next to the deviation plots and indices. These symbols denote the statistical significance of the deviation compared to the normal population. For example, a dark symbol indicating a probability of less than 0.5% suggests that the result is so rare that it would be seen in fewer than one in 200 normal eyes. These statistical markers help confirm that the measured defect is a genuine abnormality, rather than a mere fluctuation in test performance.
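
The probability categories themselves follow fixed statistical cutoffs. The small helper below maps a point’s p-value to the standard categories shown on the probability plots (5%, 2%, 1%, 0.5%); the function and its wording are illustrative, and the actual symbols vary with the printout style.

```python
def probability_label(p_value):
    """Translate a point's p-value into the printout's probability category."""
    if p_value < 0.005:
        return "p < 0.5% (seen in fewer than 1 in 200 normal eyes)"
    if p_value < 0.01:
        return "p < 1%"
    if p_value < 0.02:
        return "p < 2%"
    if p_value < 0.05:
        return "p < 5%"
    return "within normal limits"


print(probability_label(0.004))  # -> "p < 0.5% (seen in fewer than 1 in 200 normal eyes)"
```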

Common Patterns of Visual Field Loss

The final step in interpreting the visual field report is linking the graphical and statistical data to recognized patterns of vision loss, which helps narrow down the potential cause. Different diseases affect the visual pathway in distinct ways, producing characteristic shapes of visual field defects. These patterns serve as a roadmap to the underlying pathology.

A pattern associated with Glaucoma is the arcuate scotoma, a bow-shaped defect that originates from the blind spot and arches into the nasal visual field. Glaucoma-related defects also frequently manifest as a nasal step, a sharp, horizontal difference in sensitivity that respects the horizontal midline in the nasal field. These patterns are specific because they correspond to the arrangement of the nerve fibers damaged by the disease.

In contrast, Neurological Issues, such as a stroke or tumor affecting the visual pathways in the brain, typically cause a loss that respects the vertical midline. This results in hemianopsia, where the entire left or right half of the visual field is affected in both eyes. The appearance of a defect sharply divided down the center suggests a problem posterior to the optic chiasm, requiring neurological investigation rather than a focus on the eye itself.