This paper describes the MAGI (microscope-assisted guided interventions) augmented-reality system, which allows surgeons to view virtual features, segmented from preoperative radiological images, accurately overlaid in stereo in the optical path of a surgical microscope. The aim of the system is to enable the surgeon to see structures that lie beneath the visible surface in their correct 3-D positions. The technical challenges involved are calibration, segmentation, registration, tracking, and visualization, and this paper details our solutions to each of these problems. Because reliable quantitative assessment of the accuracy of augmented-reality systems is difficult, results are presented from a numerical simulation, which shows that the system has a theoretical overlay accuracy of better than 1 mm at the focal plane of the microscope. Implementations of the system have been tested on volunteers, phantoms, and seven patients in the operating room, and the observed accuracy is consistent with this prediction.