Microsurgery Instruction in an Orthopaedic Residency
Investigator: Julie Balch Samora, MD (AΩA, West Virginia University, 2009)
Mentor: Ryan D. Klinefelter, MD
Background: It is difficult to impartially judge surgical aptitude and proficiency, such that clinical observation tends to be subjective with poor interrater reliability. In fact, objective assessment of technical skill has been considered the weakest measurement in surgical training (Darzi and Mackay 2001). Currently, case volume is the only widely available standard to determine surgical competency, as utilized by the Accreditation Council for Graduate Medical Education (ACGME), the American Board of Surgery, and U.S. hospital credentialing committees (Korndorffer, Clayton et al. 2005). Logbooks are similarly the current method by which the Royal Colleges in the United Kingdom and other European countries assess completion of surgical training (Chan, Matteucci et al. 2007). Case numbers do not necessarily correlate with proficiency. We must have validated assessment tools to determine surgical ability, and these should be utilized in residency. Traditionally, there has not been a significant emphasis in orthopaedic residencies to provide training in microsurgical techniques. However, microsurgical skill has become more important in the era of advancing technology (Chan, Matteucci et al. 2007), and hand and spine specialists regularly work under a microscope. Hand-eye coordination, manual dexterity, spatial orientation, and good clinical judgment are absolute requirements for a good microsurgeon, and these skills would readily translate to other orthopaedic specialties.
Although there are many teaching methods and tools available for microsurgical instruction, there is no consensus as to the best method of teaching microsurgical skills. Microsurgery courses are very common and generally combine teaching techniques including videos, simulators, and hands-on surgical practice on live animal vessels in a laboratory setting over the course of several days. Live animals, nonliving models (including stored vessels, fresh leaf, practice cardboards), prosthetic systems (colored beads, silicone hollow tubes, practice rat), and virtual reality methods are all options (Peled, Kaplan et al. 1983; Kaufman, Hurwitz et al. 1984; Yenidunya, Tsukagoshi et al. 1998; Southern and Ramakrishnan 1998; Chaudhry, Sutton et al. 1999; Smith, Torkington et al. 1999; Guler and Rao 1990; Fanua, Kim et al. 2001; Remie 2001). Regardless of teaching method, it is imperative to have a reliable and validated assessment tool for objective evaluation of technical ability. Reznick has defined various categories of validity pertaining to an assessment tool: face validity (learned tasks reflect those performed in real-life surgical situations), content validity (the assessment actually measures the skill), construct validity (the assessment discriminates between expert and trainee), concurrent validity (the assessment provides results similar to those of other validated tools), and predictive validity (skill learned in the laboratory setting transfers to the real-life setting) (Reznick 1993).
Many orthopaedic residents graduate without any exposure to microsurgical education. Our objective was to create and implement a proficiency-based training curriculum in microvascular surgery for orthopaedic residents that is cost-effective, maintainable, reliable, and valid. The ultimate goals of the curriculum were to enhance the surgical education of orthopaedic residents and to improve their overall performance in the operating room.
Methods and Materials: Participants were OSU orthopaedic surgery residents rotating on the hand service. At any given time throughout the academic year, there is one junior (PGY3) and one senior (PGY5) resident on the rotation. The microsurgical training had four phases. At the start of the two-month rotation, each resident completed a questionnaire regarding age, handedness, gender, and operative experience. Residents were given a brief orientation to the microsurgery laboratory and asked to perform an initial skills exam consisting of passing three simple interrupted square knots using 9-0 nylon. They were scored on completion time in seconds, measured with a digital stopwatch; on an objective assessment of their knots (square, spacing, etc.) using a predefined task-specific checklist; and on a six-domain global rating scale (GRS). The task-specific checklist was rated on a three-point scale: a score of 0, 1, or 2 was given when a task was not performed, performed in mediocre fashion, or performed well, respectively. The GRS has six domains, each rated on a behaviorally anchored 5-point Likert scale (Grober, Hamstra et al. 2003). Residents were provided the Acland Practice Manual for Microvascular Surgery (copies located in the laboratory) and had access to the associated videos, which they could watch on their own time. They were given one training session covering basic microsurgical technique, including instrument handling and microscope operation, as well as anastomotic technique (Hino 2003). They had unlimited access to the laboratory and equipment and were free to practice on their own time, logging their hours in the laboratory log book.
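As an illustration, the two scoring instruments described above can be sketched in code. This is a minimal sketch with hypothetical example scores, not the study's actual checklist items or GRS domains (those are in the appendices).

```python
# Sketch of the two scoring instruments: a task-specific checklist rated
# 0/1/2 per item, and a six-domain GRS rated on 5-point Likert scales.
# The example scores below are illustrative only.

CHECKLIST_SCALE = {0: "not performed", 1: "mediocre", 2: "performed well"}

def score_checklist(item_scores):
    """Average of the 0/1/2 ratings across the checklist items."""
    if any(s not in CHECKLIST_SCALE for s in item_scores):
        raise ValueError("checklist items are rated 0, 1, or 2")
    return sum(item_scores) / len(item_scores)

def score_grs(domain_scores, n_domains=6):
    """Total of six behaviorally anchored 5-point ratings (range 6-30)."""
    if len(domain_scores) != n_domains or any(not 1 <= s <= 5 for s in domain_scores):
        raise ValueError("the GRS has six domains, each rated 1-5")
    return sum(domain_scores)

baseline_checklist = score_checklist([1, 1, 0, 2, 1])  # mean item score
baseline_grs = score_grs([2, 3, 2, 3, 2, 3])           # GRS total
```

A per-task mean keeps the checklist score on the same 0-2 scale reported in the Results, so baseline and end-of-rotation scores are directly comparable.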
The residents performed specified tasks at the completion of their two-month rotation and the microsurgery instruction. The final observation/testing sessions were video-recorded and timed. Three assessment tools were used to evaluate the performance of each subject: time on task, the task-specific checklist, and the six-domain GRS. The first task was a repeat of the baseline test of placing three interrupted square knots; residents then used a practice model to perform specific predetermined tasks. Checklists, as used by Harden et al. in the Objective Structured Clinical Examination (OSCE) (Harden, Stevenson et al. 1975), eliminate subjectivity (Regehr, MacRae et al. 1998). The GRS with a Likert component, as used by Grober et al. to assess microsurgical skill, has the advantage of higher interrater reliability (Grober, Hamstra et al. 2003). The GRS and checklists have previously been validated to objectively assess competence in basic microsurgical skills (Kalu, Atkins et al. 2005; Grober, Hamstra et al. 2003). The examiners were trained in the use of the checklist and GRS for grading microsurgical procedures.
Videotaping was completed in a blinded manner without audio recording to ensure proper masking of the identity and level of training of the subject. Blinding was achieved by videotaping only the subjects’ hands throughout the testing. At the end of the data collection period, one fellowship-trained hand surgeon evaluated each performance using a task-specific checklist (see Appendix B & C).
Statistical analysis was performed with SPSS 13.0 (SPSS Inc, Chicago, IL, USA). Significant differences were analyzed using Tukey's post hoc test. Significance was defined as p < 0.05.
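For readers without SPSS, an equivalent analysis can be sketched in Python. This is a hedged illustration using SciPy's implementation of Tukey's HSD test on synthetic completion times for three hypothetical assessment points; the numbers are not the study's data.

```python
from scipy.stats import tukey_hsd

# Synthetic completion times (seconds) for three hypothetical assessment
# points; illustrative only, not the study's measurements.
baseline_times = [410, 455, 390, 470, 430]
midpoint_times = [350, 380, 330, 400, 360]
final_times    = [280, 300, 260, 320, 290]

# Tukey's HSD makes all pairwise comparisons between the groups while
# controlling the family-wise error rate, as in a post hoc analysis.
result = tukey_hsd(baseline_times, midpoint_times, final_times)
print(result)  # pairwise mean differences, confidence intervals, p-values
```

`result.pvalue` is a symmetric matrix; each off-diagonal entry is compared against the study's p < 0.05 threshold.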
Results: We have had a total of eight residents participate in the training to date, and not all of the data are yet available for analysis. In general, residents appeared to view the intervention as valuable to their education. Before the intervention, residents reported either no comfort performing microsurgery or only some comfort with basic skills; most changed their response to comfort with basic concepts or comfort with advanced concepts on the post-intervention questionnaire. A majority of residents (85.7%) believed that the course was beneficial, and all of the residents (100%) would recommend this training to their colleagues. Furthermore, every resident (100%) believed that the training made them not only a better microsurgeon but a better overall surgeon.
At the start of the rotation, the average score on the task-specific checklist was 0.97 for each task. This improved to 1.58 by the end of the rotation, with notable improvements in how instruments were held, in the ability to hold the needle appropriately, and the ability to produce square knots. Not enough data have been collected to determine if there is construct validity.
Discussion: There has been a heightened public concern for patient safety (Leape 2009), such that medical providers are under increased scrutiny to demonstrate adequate knowledge, diagnostic capabilities, and expertise in decision-making ability. In surgical fields, it is critically important to demonstrate competence, aptitude, and dexterity, as operative proficiency has been shown to influence outcomes (Darzi, Smith et al. 1999). Therefore, objective assessment of technical skill utilizing valid and reliable tools is of paramount importance.
As noted above, orthopaedic residencies have traditionally placed little emphasis on training in microsurgical techniques, even though microsurgical skill has become more important in the era of advancing technology and hand and spine specialists regularly work under a microscope. The hand-eye coordination, manual dexterity, spatial orientation, and good clinical judgment required of a good microsurgeon would readily translate to other orthopaedic specialties.
A reliable and validated assessment tool is imperative for objective evaluation of technical ability. We are in the preliminary phases of this educational intervention, and there are several limitations. Our sample size is very small, as there are only two residents rotating on the hand service every two months, and not enough data have been collected to determine whether there is construct validity. We also had technology limitations, in that the quality of the video recordings was not ideal for assessing residents' skills. Finally, although the assessment methods have been utilized and validated previously, they have not yet been validated with this intervention.
Several tasks remain. We plan to have all the hand faculty assess resident skill using the checklist and GRS. We hope to establish the concurrent validity of our test criteria against the previously validated task-specific checklist and GRS, and we will ascertain whether there is a relationship among the three outcome measures (time, task-specific checklist, and GRS). Intraclass correlation coefficients will be calculated to assess the inter-rater reliability of the examiners' assessments. Finally, we hope to include the rotating plastic surgery residents in the curriculum.
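The planned inter-rater reliability analysis can be sketched as follows. This is a minimal illustration assuming a two-way random-effects, absolute-agreement, single-rater intraclass correlation coefficient (Shrout and Fleiss ICC(2,1)); the ratings shown are synthetic, not study data, and the study may ultimately use a different ICC form.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, n_raters) array of scores.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Mean squares from the two-way ANOVA decomposition.
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    ss_total = ((x - grand) ** 2).sum()
    ms_err = (ss_total - ms_rows * (n - 1) - ms_cols * (k - 1)) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Synthetic example: two raters scoring five residents' GRS totals.
scores = [[14, 15], [20, 19], [24, 25], [11, 12], [27, 27]]
print(round(icc_2_1(scores), 3))
```

Values near 1 indicate that the examiners' checklist and GRS ratings agree closely; perfect agreement across raters yields an ICC of exactly 1.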
Conclusion: Our aim was to create a proficiency-based training curriculum in microvascular surgery for orthopaedic residents that is cost-effective, maintainable, reliable, and valid, and that will ultimately enhance overall surgical education and improve performance in the operating room. We are in the preliminary stages of this teaching program, but it does appear that a microsurgical laboratory course is an effective method of educating residents and improving their overall surgical skill, and it is perceived by the house staff as a valuable aspect of training.
Chan, W. Y., P. Matteucci, et al. (2007). "Validation of microsurgical models in microsurgery training and competence: a review." Microsurgery 27(5): 494-9.
Darzi, A., S. Smith, et al. (1999). "Assessing operative skill. Needs to become more objective." BMJ 318(7188): 887-8.
Kalu, P. U., J. Atkins, et al. (2005). "How do we assess microsurgical skill?" Microsurgery 25(1): 25-9.
Martin, J. A., G. Regehr, et al. (1997). "Objective structured assessment of technical skill (OSATS) for surgical residents." Br J Surg 84(2): 273-8.
Reznick, R. K. (1993). "Teaching and testing technical skills." Am J Surg 165(3): 358-61.
Last updated: 10/2/2014