Search Results
Showing 1 – 7 of 7 items for Author or Editor: Graham P. Shaw
Faculty in the present-day academic medicine environment are expected to perform multiple functions, notably the provision of high-quality teaching to the medical professionals of tomorrow. However, evaluating the effectiveness of this teaching is particularly difficult. Student evaluations of teaching, despite their many flaws, are widely used as a convenient tool to measure teaching effectiveness. Administrators continue to routinely use student evaluation of teaching surveys in faculty retention/promotion and merit pay decisions. This practice should be reevaluated because it may have unintended consequences, such as grade inflation and content debasement, and may contribute to faculty leaving the institution and even the profession. A more valid, reliable, and formative protocol for evaluating genuine teaching effectiveness needs to be developed as a matter of some urgency. In this review, alternatives to the student evaluation of teaching are explored to better measure true teaching effectiveness. (J Am Podiatr Med Assoc 103(1): 94–96, 2013)
Attrition from medical school remains a serious cause of concern for the medical education community. Thus, there is a need to improve our ability to select, from among many highly qualified and motivated applicants, only those candidates who will succeed at medical school. This can be achieved, in part, by reducing the reliance on cognitive factors and increasing the use of noncognitive character traits in high-stakes admissions decisions. Herein we describe an analytic rubric that combines research-derived predictors of medical school success to generate a composite score for use in admissions decisions. This rubric represents a significant step toward evidence-based admissions: it facilitates a more consistent and transparent qualitative evaluation of medical school applicants beyond their grades and Medical College Admission Test scores and contributes to a redesigned and improved admissions process.
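As a rough illustration of how an analytic rubric can combine several noncognitive criteria into a single composite score, the sketch below rates each criterion on a 1–5 scale and applies fixed weights. The criterion names, weights, and scale are hypothetical assumptions for illustration only; they are not the published rubric.

    # Hypothetical analytic-rubric sketch: each noncognitive criterion is rated
    # on a 1-5 scale and combined into a weighted composite score. The criteria
    # and weights below are illustrative only, not the published rubric.
    RUBRIC_WEIGHTS = {
        "interview": 0.30,
        "personal_statement": 0.20,
        "letters_of_recommendation": 0.20,
        "service_and_leadership": 0.15,
        "work_and_research_experience": 0.15,
    }

    def composite_score(ratings: dict) -> float:
        """Combine per-criterion ratings (1-5) into a weighted composite on the same scale."""
        return sum(weight * ratings[criterion] for criterion, weight in RUBRIC_WEIGHTS.items())

    applicant = {
        "interview": 4.5,
        "personal_statement": 4.0,
        "letters_of_recommendation": 3.5,
        "service_and_leadership": 5.0,
        "work_and_research_experience": 4.0,
    }
    print(f"Composite rubric score: {composite_score(applicant):.2f}")  # 4.20

A composite of this kind is meant to be weighed alongside grades and Medical College Admission Test scores in the final decision, not to replace them.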
Background:
This study examined the predictive ability of educational background and demographic variables, available at the admission stage, to identify applicants who will graduate in 4 years from podiatric medical school.
Methods:
A logistic regression model was used to identify two predictors of 4-year graduation: age at matriculation and total Medical College Admission Test score. The model was cross-validated using a second, independent sample from the same population; cross-validation gives greater confidence that the results will generalize beyond the original sample.
Results:
Total Medical College Admission Test score was the strongest predictor of 4-year graduation, with age at matriculation being a statistically significant but weaker predictor.
Conclusions:
Despite the model’s capacity to predict 4-year graduation better than random assignment, substantial prediction error remained, suggesting that important predictors are missing from the model. Furthermore, the high rate of false positives makes it inappropriate to use age and Medical College Admission Test score as admission screens in an attempt to eliminate attrition by not accepting at-risk students. (J Am Podiatr Med Assoc 102(6): 463–470, 2012)
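For readers who want to run this kind of analysis on their own data, the sketch below fits a logistic regression predicting 4-year graduation from age at matriculation and total MCAT score, then checks it against an independent validation sample. The file name, column names, and cohort split are hypothetical assumptions; the study's data, coefficients, and exact procedure are not reproduced here.

    # Minimal sketch of a cross-validated logistic regression predicting 4-year
    # graduation from age at matriculation and total MCAT score. The file,
    # column names, and derivation/validation split are illustrative assumptions.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import confusion_matrix

    df = pd.read_csv("graduation_cohorts.csv")  # hypothetical data set
    predictors = ["age_at_matriculation", "total_mcat"]
    outcome = "graduated_in_4_years"  # 1 = graduated in 4 years, 0 = did not

    # Fit on one sample and validate on a second, independent sample
    # drawn from the same population.
    derivation = df["cohort"] == "derivation"
    model = LogisticRegression().fit(df.loc[derivation, predictors], df.loc[derivation, outcome])

    predicted = model.predict(df.loc[~derivation, predictors])
    # Rows of the matrix are actual classes, columns are predicted classes;
    # the off-diagonal counts show how often the screen misclassifies students.
    print(confusion_matrix(df.loc[~derivation, outcome], predicted))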
Background:
Most medical school admission committees use cognitive and noncognitive measures to inform their final admission decisions. We evaluated the ability of admission data to predict academic success for podiatric medical students, using first-semester grade point average (GPA) and cumulative GPA at graduation as outcome measures.
Methods:
In this study, we used linear multiple regression to examine the predictive power of an admission screen. A cross-validation technique was used to assess how the results of the regression model would generalize to an independent data set.
Results:
Undergraduate GPA and Medical College Admission Test score accounted for only 22% of the variance in cumulative GPA at graduation. Undergraduate GPA, Medical College Admission Test score, and a time trend variable accounted for only 24% of the variance in first-semester GPA.
Conclusions:
Seventy-five percent of the individual variation in cumulative GPA at graduation and first-semester GPA remains unaccounted for by admission screens that rely on only cognitive measures, such as undergraduate GPA and Medical College Admission Test score. A reevaluation of admission screens is warranted, and medical educators should consider broadening the criteria used to select the podiatric physicians of the future. (J Am Podiatr Med Assoc 102(6): 499–504, 2012)
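The sketch below illustrates the kind of regression-based admission screen evaluated here: it regresses cumulative GPA at graduation on undergraduate GPA and total MCAT score, then reports how much variance (R^2) those cognitive measures explain on a held-out sample. The file name, column names, and train/test split are hypothetical assumptions, not the study's data.

    # Minimal sketch of a multiple regression admission screen: predict cumulative
    # GPA at graduation from undergraduate GPA and total MCAT score, then report
    # the variance explained (R^2) on held-out data. Names and split are assumed.
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("gpa_outcomes.csv")  # hypothetical data set
    X = df[["undergraduate_gpa", "total_mcat"]]
    y = df["cumulative_gpa_at_graduation"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)
    model = LinearRegression().fit(X_train, y_train)

    # An R^2 of about 0.22 would mean roughly three-quarters of the variation in
    # graduation GPA is left unexplained by these cognitive measures alone.
    print(f"Held-out R^2: {r2_score(y_test, model.predict(X_test)):.2f}")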
In recent years, there has been a rapid increase in World Wide Web–based teaching and learning materials; however, present-day systems for recording student-patient interactions have trailed behind other academic areas in the appropriate use of technology. This article reviews the implementation of an innovative Web-based computerized student-patient log. This system represents considerable improvement in terms of efficiency and accuracy over traditional paper-based reporting systems. It facilitates faculty tracking of students’ clinical experiences at geographically disparate locations and allows gaps in student knowledge to be more easily identified. Moreover, the Web-based system has the added advantage of making students responsible for their own learning, providing them with a sense of ownership of the data collected. (J Am Podiatr Med Assoc 93(2): 150-156, 2003)
This article reviews the extent of health-care students’ computer literacy and presents the results of a survey of podiatric medical students’ computer literacy. The results of this survey indicate that podiatric medical students are more likely than other health-care students to rate their computer literacy as good or very good. There was no gender difference in this self-reported computer knowledge. The implications for designing and using Web-based instructional materials and technology for podiatric medical students are discussed. (J Am Podiatr Med Assoc 94(4): 375–381, 2004)
The computerized student-patient encounter log system represents a considerable improvement in terms of efficiency and accuracy over traditional paper-based student-patient encounter reporting systems. The computerized log not only facilitates faculty monitoring of students’ assessment and management of health problems at geographically disparate locations but also provides a rich resource of data for enhancing clinical teaching and learning experiences. However, little is known about podiatric medical students’ experiences with Web-based computerized student-patient encounter log systems. The findings reported in this article suggest that the computerized student-patient encounter log was considered to be useful and effective by most of the podiatric medical students surveyed and represents an improvement over traditional paper-based recording systems. (J Am Podiatr Med Assoc 95(6): 556–563, 2005)