A Systematic Review of Simulation-Based Training in Vascular Surgery

Introduction: Recent advancements in surgical technology, reduced working hours, and diminished training opportunities, exacerbated by the COVID-19 pandemic, have led to an increase in simulation-based training. Furthermore, a rise in endovascular procedures has created a requirement for high-fidelity simulators that offer comprehensive feedback. This review aims to identify vascular surgery simulation models and assess the validity and level of effectiveness (LoE) of each model in order to successfully implement them into current training curricula.
Methods: PubMed and EMBASE were searched on January 1, 2021, for full-text English studies on vascular surgery simulators. Eligible articles were given validity ratings based on Messick's modern concept of validity alongside an LoE score according to McGaghie's translational outcomes.
Results: Overall, 76 eligible articles validated 34 vascular surgery simulators and training courses for open and endovascular procedures. High validity ratings were achieved across studies for content (35), response processes (12), internal structure (5), relations to other variables (57), and consequences (2). Only seven studies achieved an LoE of 3/5 or greater. Overall, ANGIO Mentor was the most highly validated and effective simulator and was the only simulator to achieve an LoE of 5/5.
Conclusions: Simulation-based training in vascular surgery is a continuously developing field with exciting future prospects, demonstrated by the vast number of models and training courses. To effectively integrate simulation models into current vascular surgery curricula and assessments, studies need to examine trainee skill retention over longer periods. A more detailed discussion of cost-effectiveness is also needed.


Introduction
Influenced by the aviation and military sectors, simulation has become increasingly prominent in medical education and is now incorporated into the training curricula of various specialties. 1 Vascular surgical training has historically followed the Halstedian model of 'learning by doing'. This model is often criticized because current surgical trainees are limited to the opportunities presented by 'what comes through the door'. 2 Working time restrictions have reduced surgical training in the United Kingdom by a third. Factors including the shortening of residency time and reduced training opportunities, worsened by the COVID-19 pandemic, are having a large impact on vascular surgical training. [2][3][4] Additionally, vascular surgery has evolved into a more independent and highly complex specialty. This is due to the growing number of endovascular procedures being conducted, with over half of aortic aneurysms in Europe now being treated by endovascular repair. 5 These relatively new treatment methods require additional skills to be acquired. Furthermore, some studies have shown that vascular trainees experience fewer teaching opportunities for endovascular procedures than for open surgery, leading to a requirement for alternative teaching methods. 6,7 It remains of vital importance to ensure that vascular trainees are proficient in both open and minimally invasive procedures. 8,9 In recent years, there has been a greater emphasis on simulation as a surgical training tool to achieve the same level of competency and maintain patient safety. 10 Due to the rise in endovascular procedures, which are known to offer improved patient outcomes and reduced hospital stay compared to open vascular procedures, 5,11 there will be a subsequent rise in high-fidelity simulators utilized for training.
Despite this, there is possibly an even greater demand for open simulators due to the requirement for junior trainees to safely practice common open procedures such as arterial anastomosis and additionally for senior trainees to undertake crisis management in a simulated operating theater. 12 These may actually be more expensive and face greater resistance than their endovascular counterparts, due to the challenge of replicating a three-dimensional field as opposed to the two-dimensional computer system featured in many endovascular simulators. 12 In order to ensure that vascular trainees can perform to a high standard, regulatory bodies and a Europe-wide consensus have recognized the importance of simulation as part of the curriculum and as a method of assessment, which has been shown in other specialties to be an effective way of ensuring proficiency in surgical skills. 2,10,13,14 Based on what is currently known, the rationale for this review is to identify the current simulation modalities, assess their validity, and determine how effective they are for training vascular surgeons.

Methods
This review followed the PRISMA guidelines (Fig. 1). 15

Information sources and search
A preliminary search included the terms 'vascular surgery AND simulation' in order to identify MeSH terms. A systematic search of PubMed and Embase was carried out on January 1, 2021, with no limits using the search terms: 'Simulat* AND vascular surg* AND (EVAR OR aneurysm repair OR endovascular OR bypass OR endarterectomy OR angioplasty OR embolectomy) AND (education OR performance OR curriculum OR competence)'. Reference lists from eligible articles were manually searched for potential studies.

Study eligibility criteria
Original articles in full English text were included, with duplicates and conference abstracts excluded. Randomized controlled trials, cohort, case-control, and cross-sectional studies were included. Population inclusion criteria featured vascular surgeons or trainees at any stage, medical students, or doctors on a vascular placement. Models identified were separated into categories outlining the modality: virtual reality, bench, cadaver, animal, e-learning, computer, or application software.

Data extraction
Articles were screened and organized using the COVIDENCE software. Initially, this was done by title and abstract screening. Once duplicates and noneligible articles were excluded, the remaining articles were read in full. An Excel spreadsheet featured the name of the simulator, manufacturer, fidelity, availability, type of model, reference, number of participants, training level, procedure, validity, and effectiveness criteria.

Data analysis
Validity studies were assessed using Messick's modern concept of validity framework, which has been widely utilized in the literature. [16][17][18][19][20][21][22][23][24][25][26] This evaluated the strength of each source of validity evidence using five parameters (Table 1), quantified by Beckman et al.'s rating scale, which indicated the strength of validity of each parameter on a scale from 0 to 2 (Table 2). 27 Potential author bias and selective outcome reporting were assessed at the study level.
An adaptation of McGaghie's translational outcomes was applied, and a level of effectiveness (LoE) score from 1 to 5 was given for each simulation model, 28 representing the translational outcome gained from using a simulator (Table 3).
Data analysis was discussed and practiced in detail by authors AH and AA prior to the formal analysis of validity studies. Data analysis was carried out by one author, AH, who applied ratings and scores to individual papers alongside a description of why individual scores and ratings were given. An additional author, AA, was brought in when there was a discrepancy regarding validity ratings or translational outcome scoring.

Open vascular surgery
Abdominal aortic aneurysm (AAA) repair

Two simulators for AAA repair were assessed for validity or LoE across two articles. The Inanimate OAR bench simulator achieved excellent ratings across the board, rating 2/2 for content, response processes, internal structure, and consequences. 29 An LoE of 2/5 was awarded due to a decrease in simulated procedure time and reduced supervisor intervention.
The Open Abdominal Aneurysm Simulation Model earned a rating of 2/2 for content, due to evaluation and refinement by experts, and for relations to other variables. Similarly, an LoE of 2/5 was awarded due to an improvement in post-simulator performance. 30

Fig. 1 - Study selection process according to the PRISMA statement. 15

Figure 1 shows that 525 articles were found after the initial search and exclusion of duplicates. After abstract and title screening, 178 articles were assessed for eligibility via full-text screening, and 102 articles were excluded for various reasons. Overall, 76 eligible articles were found for 34 different simulators (see Supplementary Tables).

Carotid endarterectomy (CEA)
Three validation studies were identified for three CEA simulators (see Supplementary Table II). The Carotid Bench model showed content validity with a rating of 2/2. 31 An LoE of 2/5 was achieved as there was a significant improvement in surgical skills.
The low-cost Pulsatile Carotid Endarterectomy model by Fletcher et al. 32 was tested on 17 participants, achieving a content rating of 2/2. An increase in confidence and skills scored an LoE = 2/5.

Vascular anastomosis (VA)
VA was the most common type of simulation for open vascular surgery. Eight validation studies for eight simulator models were identified (Supplementary Table II), and all demonstrated high validity ratings alongside an LoE of 2/5 and above.
The Virtual Surgery simulator 33 achieved a rating of 2/2 for content and relations to other variables by comparing anastomosis performance between vascular surgeons and medical students. An improvement in anastomosis skills led to an LoE of 2/5.
A Frozen Porcine Aorta Anastomosis model discriminated between residents of varying levels of experience within the group, thereby achieving a rating = 2/2 for relations to other variables. 34 Improvement in post-test performance led to an LoE of 2/5.

The Vascular Anastomosis model scored highly for content and relations to other variables (rating = 2/2). 35 Performance on the model was predictive of operative competency on real patients and was associated with less leakage and shorter operating time (LoE = 3/5).

Angioplasty and stenting
These were the most commonly validated vascular procedures, with 34 validation studies found for five simulators.
ANGIO Mentor, a high-fidelity VR simulator, was validated by 12 studies, as seen in Supplementary Table III. [36][37][38][39][40][41][42][43][44][45][46][47] Willaert et al. scored 2/2 for content by implementing an expert-validated assessment scale on 20 endovascular experts and outlined a clear method of test design and standardization, scoring highly for response processes (rating = 2/2). 40 Maertens et al. demonstrated in two studies that ANGIO Mentor could differentiate between endovascular performance in 32 surgical trainees and, additionally, 20 vascular surgeons and 29 medical students. 36,37 All studies received favorable feedback and an improvement in technical metrics, such as decreased procedural and fluoroscopy time (LoE = 1-2/5). Wooster et al. scored an LoE = 3/5, as performance on patients after ANGIO Mentor use was associated with lower doses of radiation, less contrast, and shorter procedure time. 46 Maertens et al. was awarded an LoE of 5/5 as proficiency levels were retained up to 3 mo, alongside an improvement in patient outcomes and better operative performance. 37

The Vascular Intervention System Trainer (VIST) was validated by 15 studies [48][49][50][51][52][53][54][55][56][57][58][59][60][61][62] (see Supplementary Table III). Bech et al. used a previously validated assessment method devised by experts, and Boyle created expert-reviewed scoring sheets (content rating = 2/2). 52,54,58 Response process ratings were mixed (0-2/2), with some studies featuring an explicit thought process of test design and error-reduction methods such as randomization and rater blinding. Rolls et al. 64 demonstrated clear differences between trainee and expert performance, scoring relations to other variables = 2/2. Another group used VIST to assess EVAR on 23 endovascular experts and established a pass/fail standard using the Angoff method (consequences = 2/2). 65

ANGIO Mentor has been validated by five studies for EVAR simulation.
[66][67][68][69][70] Saratzis et al. used expert-reviewed assessment methods, scoring content = 2/2. 70 Vento et al. used two expert vascular surgeons to perform simulated cases on ANGIO Mentor, obtaining mean values and comparing these with metrics obtained from residents to differentiate between varying levels of competence (relations to other variables = 2/2). 68 There was minimal discussion surrounding response processes; however, Desender et al. scored 2/2 for this criterion due to a clear thought process regarding assessment design. 69 The same group achieved an LoE of 4/5 after EVAR rehearsal, as there was a reduction in major and minor errors post simulator use, having a direct effect on patient outcomes by reducing in-hospital mortality by 2%. The rest of the studies achieved an LoE of 2/5, due to a reduction in simulated fluoroscopy and procedural time, alongside fewer endoleaks post simulator use.

General endovascular procedures
Seven studies validated six simulators for general endovascular procedures (see Supplementary Table III). A Pulsatile Fresh Frozen Cadaver model scored an LoE = 1/5 as participants rated the simulation useful. 71 Relations to other variables scored 2/2 as a difference in scores was shown between endovascular experts and trainees. The model was reassessed in another study, where an improvement in performance was awarded an LoE of 2/5. 72 Content was scored 2/2 as there was reference to a prior assessment method that demonstrated excellent face validity.
The 3D-Printed Endovascular Simulation model tested basic endovascular skills, using a low-fidelity bench simulator on 96 endovascular physicians. 73 Favorable post-simulation feedback scored an LoE = 1/5.

Nontechnical skills training (NTS)
For NTS, four studies were found for three simulators (see Supplementary Table IV).
The Resusci Anne Simulator was used to simulate a ruptured AAA (rAAA). 74 A content score of 1/2 was awarded as simulation supervisors continuously evaluated the methods of assessment, although this process was poorly defined. There was a direct change to patient outcomes after the simulation scenario was introduced, including reduced door-to-occlusion time, reduced door-to-needle time, and reduced 30-d mortality (LoE = 4/5).
ORCAMP was used in two studies assessing trainee sympathetic tone and teamwork performance. 75,76 Content ratings were limited (0-1/2) due to reference to a prior validated assessment criterion but limited data. Relations to other variables scored highly (rating = 2/2) for both. Ramjeeawon et al. 75 scored 2/2 for response processes due to a clear thought process regarding quality control methods and test standardization. An improvement in knowledge and teamwork skills scored an LoE = 2/5.

A further course 77 was able to differentiate between residents and vascular fellows (relations to other variables = 2/2). Post-test procedural knowledge and self-rated procedural competence increased (LoE = 2/5).

Additional simulation-based training courses were also identified (see Supplementary Tables).
Strøm et al. held a 1-d intensive course for EVAR sizing and stent-graft selection, using a CT angiography computer program. 78 The test score comprised subscores for anatomic measurements and modular stent-graft selection, devised by experts (content = 2/2). An improvement in knowledge and skill was recorded, scoring an LoE = 2/5.
The EduCas course was a 2-d simulation teaching course for carotid stenting using ANGIO Mentor. 79 A previously validated expert rating scale scored content = 2/2. An improvement in performance was shown by decreased procedural time, fluoroscopy time, and delivery-retrieval time of the embolic protection device (LoE = 2/5).

Discussion
This systematic review has identified a broad range of simulators and courses that have been evaluated for their validity and translational outcomes. VR simulators, in particular ANGIO Mentor, were widely validated and translated to real-world performance for a variety of endovascular procedures. There is a need for novel, high-fidelity simulators that offer greater sensory and haptic feedback for basic endovascular skills to address this paradigm shift from open to endovascular repair. 80 Despite this, it remains vital to ensure trainees are proficient in open vascular skills, as trainees are still likely to perform these procedures; there is therefore a continuing need for highly validated and effective open vascular simulators. Furthermore, the continuing development of surgical technology offers technical endovascular skill feedback such as fluoroscopy time, contrast volume, and even video-motion analysis to track hand-eye coordination. 51 This has been demonstrated in this review to improve performance, translate skills to the operating theater, and improve patient outcomes, seen across multiple specialties where a transition to endovascular procedures has become more common. This overlap between specialties was highlighted in this review; for example, a considerable number of angioplasty and catheterization simulators in the literature test the skills of interventional radiology and cardiology trainees, often overlapping with vascular surgeons. 81,82 Cadaver models are known to replicate procedures based on tissue feel and anatomy, permitting high-fidelity training; hence they were ranked most highly in a trainee review of simulator popularity, improving the confidence and competence of trainees with limited operative experience. 83 However, only four cadaver models were identified in this review, which does not offer a well-balanced perspective on their validity and educational impact compared to other models.
There is a need for greater implementation of cadaver models alongside or in combination with other simulation modalities by training institutions and designers.
VA models were the most popular simulators, mainly due to their low fidelity and simple design, alongside vascular anastomosis being a fundamental vascular procedure, as highlighted in a vascular surgery curriculum needs assessment. 8 This assessment identified other common open procedures, such as peripheral bypass, open embolectomy, and amputation, but these lacked available simulators in the literature. Open vascular simulators have met more resistance than their endovascular counterparts, which assess technical skills via VR or computer software; this is often difficult to replicate for open procedures. 12 This explains the greater proportion of commercially available, high-fidelity endovascular simulators in the literature compared to low-fidelity, open vascular models.
The rise in endovascular procedures has seen a subsequent rise in complexity and error rate, in particular when open and endovascular surgery are combined, often involving a diverse range of health professionals. 84 NTS, or human factors such as teamwork, leadership, situational awareness, and communication, play an important role in patient safety, accounting for up to 85% of major errors in some studies. 85 However, only three simulators tested NTS; hence this remains a neglected area. 86 Rudarakanchana et al. used a fully immersive Angiosuite combined with a high-fidelity VR simulator to simulate a rAAA scenario. 64 This was rated as highly realistic and tested the technical and nontechnical skills of a multidisciplinary team under realistic conditions. There needs to be greater implementation of NTS testing within vascular surgery, given the variety of simulation modalities and courses available; this combination of disciplines is highly recommended and rated by experts, despite the high costs of implementation and maintenance. 87

The use of modern definitions that identify validity as a unitary construct is vital in the in-depth assessment of simulators and their suitability for examining and training vascular surgeons. 88 This report featured Messick's validity criteria, which have been widely utilized as an acceptable and appropriate method for evaluating simulation tools in the literature, not only within vascular surgery but also within many other surgical specialties. Additionally, this framework is recommended as a standard by the American Educational Research Association. [18][19][20][21][22][23][24][25][26] Nevertheless, reporting was clearly limited, as only one study (1.3%) followed Messick's modern concept of validity and featured all five validity criteria. 65 This corresponds to a surgical review that found only 6.6% of studies used the modern concept of validity. 89

None of the studies in this review showed complete validity for all domains, although Lawaetz et al. 29 achieved highly by scoring 2/2 for four of the validity criteria. A significant proportion of studies scored highly (rating = 2/2) for content (46%) and relations to other variables (76%). This was often due to an expert review of assessment methods, reference to a previously validated study, and comparing performance between novice and expert participants. Internal structure reporting was weak: many studies discussed measures of reliability, but only a small proportion (6.6%) scored 2/2 by using multiple measures to calculate interitem and test-retest reliability. Only two studies (2.6%) 29,65 scored high validity ratings for consequences, the most neglected validity criterion, by setting a pass/fail score. This criterion is vital for setting the competence level trainees must reach before practicing on real patients. 25 There was also a large population of novices among study participants who rated simulators on their realism and effectiveness, meaning that the ratings of some simulators may not be entirely reflective of the views of vascular experts.
Overall, 25% of studies used conventional definitions such as face, content, and construct validity. 90 These definitions are outdated, possibly because many studies were performed by clinicians without guidance from a medical education specialist or were published before the wide implementation of Messick's framework. 25,90 Furthermore, commonly used definitions such as face and content validity are more subjective measures of validity, and there is no clear consensus on their exact definitions, offering little educational relevance. 90,91 This corresponds to similar reviews focused on urology, open vascular surgery, and endovascular surgery. 25,91,92 Another issue was that a significant proportion of studies (74%) lacked a formal validation process and were purely descriptive. This was evident in other specialties, where reviews of ophthalmology and orthopedics showed a 45% and 38% prevalence of descriptive studies, respectively. 93,94 This demonstrates that the validity evidence for simulation in vascular surgery is poor and that there needs to be greater use of the modern validity framework (Table 1). 89

Only seven simulators demonstrated improvement in the operating theater, patient outcomes, and skill retention (LoE 3+/5), with the majority of studies scoring an LoE of 1-2/5. The current literature demonstrates that simulation-based training has the potential to largely improve patient outcomes, but current reporting is limited in this regard, and further research is needed to investigate the real-world transfer of skills. 25,92 Surgical simulation has developed significantly over recent years and offers exciting prospects for the future, such as augmented reality and 3D-printed patient-specific models. These have been seen in this review to improve patient outcomes and are set to become more integrated into the vascular surgery curriculum and improve procedural training. 95

Simulation-based training has been shown to reduce the initial phase of the learning curve, but many trials in this review had small sample sizes and did not include adequate follow-up time. Future trials must validate simulators in large, well-designed randomized controlled trials with adequate follow-up. This has been seen in the Simulation in Urological Training and Education (SIMULATE) trial, 96,97 which assessed whether simulation reduced the number of procedures needed to reach technical skill proficiency and positively impacted patient outcomes. This offers high relevance to the future of vascular surgery training. 85

Limitations

Despite a specific and methodical search strategy, the initial search included irrelevant articles that focused solely on other endovascular specialties. Instead, the search terms could feature negative Boolean operators to exclude such articles. The search also focused on the more technical aspects of simulation-based training and could perhaps be restructured to allow a balance of nontechnical skills, which are known to complement technical performance. Many articles featured outdated validation definitions, were purely descriptive, lacked translational outcomes to real-world performance, and lacked specialized expert opinions, which had a significant impact on reaching a conclusion. Finally, the heterogeneity of study methods and outcomes did not allow for a quantitative meta-analysis.

Conclusions
Simulation-based training in vascular surgery is a continuously developing field and of growing importance due to advances in technology and reduced training opportunities, exacerbated by the COVID-19 pandemic. This review applied the modern concept of validity criteria and found a vast number of studies validating simulation models and training courses. ANGIO Mentor was the most validated simulator, collectively demonstrating an improvement in real-world performance, knowledge, patient outcomes, and skill retention. In summary, the future of vascular surgery simulation has great potential. However, too few studies utilized the modern concept of validity framework, and prospective studies require greater use of this, alongside a longer follow-up time of trainee skill retention and discussion of cost-effectiveness. This is needed in order to effectively integrate simulation models into the current vascular surgery curriculum and methods of assessment.

Supplementary Materials
The following references [98-122] are cited in Supplementary Tables.

Author Contributions
AH and AA were involved in the initial planning and conceptualization of the review. AH conducted the literature search, alongside data collection and writing of the manuscript. AA and BK both provided guidance and revisions to the manuscript. KA and PD oversaw the project and provided guidance and access to necessary resources. All authors read and approved the final manuscript.

Disclosure
None declared.

Funding
None.