Purpose: The validated Objective Structured Assessment of Technical Skills (OSATS) score is used to evaluate laparoscopic surgical performance. It consists of two subscores, a Global Rating Scale (GRS) and a Specific Technical Skills (STS) scale. The OSATS has accepted construct validity when experts rate performance by direct observation, discriminating between trainees' levels of experience. Because expert time is scarce, assessment from endoscopic video recordings would facilitate use of the OSATS. We therefore aimed to compare video OSATS with direct OSATS.

Methods: We included 79 participants with different levels of experience [58 medical students, 15 junior residents (novices), and 6 experts]. Performance of a cadaveric porcine laparoscopic cholecystectomy (LC) was rated with the OSATS by blinded expert raters, first by direct observation and then from an endoscopic video recording. Operative time was recorded.

Results: Direct and video OSATS ratings correlated significantly (ρ = 0.33, p = 0.005). Direct OSATS showed significant construct validity in distinguishing students or novices from experts. Students and novices did not differ in either direct or video OSATS. Mean operative time was 73.4 ± 9.0 min for students, 65.2 ± 22.3 min for novices, and 46.8 ± 19.9 min for experts. Internal consistency between the GRS and STS subscores was high for both direct and video OSATS, with Cronbach's α of 0.76 and 0.86, respectively. The combination of video OSATS and operative time predicted direct OSATS better than either parameter alone.

Conclusion: Direct OSATS rating differentiated students or novices from experts better than endoscopic video rating for LC and should remain the standard approach for discriminating experience levels. However, when no expert is available for direct rating, video OSATS supplemented with operative time should be used, rather than either single parameter, to predict direct OSATS scores.
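A minimal analysis sketch follows, assuming the study data are available as a table with one row per participant. It illustrates how the reported statistics (Spearman correlation between direct and video OSATS, Cronbach's α across the GRS and STS subscores, and a linear model combining video OSATS with operative time) could be computed in Python with pandas, SciPy, and statsmodels; the column names (direct_osats, video_osats, grs, sts, op_time_min) are illustrative assumptions, not taken from the study's dataset.

# Hedged sketch of the abstract's statistics; column names are assumed for
# illustration and do not come from the study's data.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha over a set of item/subscore columns.
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def analyze(df: pd.DataFrame) -> None:
    # Spearman correlation between direct and video OSATS ratings.
    rho, p = stats.spearmanr(df["direct_osats"], df["video_osats"])
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

    # Internal consistency of the GRS and STS subscores.
    print(f"Cronbach's alpha = {cronbach_alpha(df[['grs', 'sts']]):.2f}")

    # Does video OSATS plus operative time predict direct OSATS better than
    # either predictor alone? Compare adjusted R^2 or an information criterion.
    models = {
        "video + time": smf.ols("direct_osats ~ video_osats + op_time_min", data=df).fit(),
        "video only": smf.ols("direct_osats ~ video_osats", data=df).fit(),
        "time only": smf.ols("direct_osats ~ op_time_min", data=df).fit(),
    }
    for name, fit in models.items():
        print(f"{name}: adj R^2 = {fit.rsquared_adj:.2f}, AIC = {fit.aic:.1f}")

With only 79 participants, a small-sample criterion such as AICc would be preferable to plain AIC for the model comparison; the sketch reports AIC merely because statsmodels exposes it directly.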
