Volume 17, Issue 55 (2024) | JMED 2024, 17(55): 1-9


Sil A, Das S, Das P, Jayswal D, Das N K. Designing, introducing, and implementing objective structured practical examinations as a formative assessment tool in undergraduate medical pharmacology. JMED 2024; 17(55): 1-9
URL: http://edujournal.zums.ac.ir/article-1-2047-en.html
1- Department of Pharmacology, Rampurhat Government Medical College, Hospital More, Birbhum (731224), West Bengal, India. Email: drsilamrita@gmail.com
2- Department of Pharmacology, Rampurhat Government Medical College, Hospital More, Birbhum (731224), West Bengal, India.
3- Department of Pharmacology, Jagannath Gupta Institute of Medical Sciences, Kolkata 700137, West Bengal, India.
4- Department of Dermatology, College of Medicine and Sagore Dutta Medical College, Kamarhati (700058), West Bengal, India.
Abstract
Background & Objective:
The Objective Structured Practical Examination (OSPE) has gained popularity as an objective assessment tool. Traditional assessment methods such as viva voce and semester practical examinations are better suited to assessing the cognitive domain in pharmacology. The competency-based medical education curriculum has shifted toward the psychomotor, attitude, and communication domains (hands-on demonstration on manikins, critical evaluation of prescriptions and medical literature, computer-assisted learning), and assessing these domains calls for more objective methods of assessment, such as the OSPE. This study aimed to design and implement the OSPE as an assessment tool for practical pharmacology for Phase II MBBS students. We also evaluated the perception, acceptability, and usefulness of the OSPE for students and faculty.
Material & Methods: The faculty was first sensitized to the OSPE. Group discussions with the head of the department and the faculty were held regarding the content of the OSPE stations and the design, planning, implementation, and feasibility checks. The OSPE was scheduled for the upcoming formative examination with a set of 8 OSPE stations and 2 rest stations. The OSPE stations were set up in the department and initially piloted by faculty. The OSPE was carried out in the formative examination of Phase II students. Feedback questionnaires for both students and faculty members were prepared and validated prior to administration.
Results: Of the 98 students in the batch, 96 participated. The average OSPE score obtained by the students was 22.23 ± 5.74 (out of a total of 35). Ninety-six percent of the students enjoyed the OSPE, and 99% were satisfied (Likert scale 3-5). All the faculty agreed that the OSPE was unbiased and structured, although it required more effort and manpower, and its preparation was time-consuming.
Conclusion: The key to a successful OSPE is careful planning. A well-designed OSPE can drive learning and have a positive impact on education.

Introduction
Objective Structured Practical Examinations (OSPEs) can assess practical competence and communication skills (1). The OSPE has gained popularity as an objective assessment tool for medical students, residents, and trainees. The OSPE was described in 1975, and in greater detail in 1979, by Harden and his group from Dundee (2, 3). In accordance with the National Medical Commission (NMC) Competency-Based Medical Education (CBME) guidelines for the evaluation of medical students in India, setting up and applying OSPE stations has become necessary (4). The disadvantages of conventional practical examinations, especially in terms of their outcomes, are numerous (5, 6): scores are significantly affected by experiment variability and examiner variability rather than by student variability alone. Traditional assessment methods are better suited to assessing the cognitive domain in pharmacology (5, 6). The conventional examination centers on the reporting of findings, ignoring the 'doing' part. However, the new CBME curriculum has shifted toward the psychomotor, attitude, and communication domains (hands-on demonstration on manikins, critical evaluation of prescriptions and medical literature, computer-assisted learning, etc.) in pharmacology, and assessing these domains calls for more objective methods of assessment, such as the OSPE. The OSPE assesses the psychomotor and communication domains, along with objective assessment of the knowledge domain. In CBME, assessment drives learning, making objective assessment necessary (7).
The OSPE is an assessment tool that evaluates student competence at various stations, for example, (a) identifying the equipment and accessories of an experiment, the procedure of the experiment, and the handling of instruments; (b) making observations and interpreting results and conclusions; (c) performing simple procedures; (d) interpreting laboratory results; and (e) addressing patient management problems, communication, and attitudes. For this purpose, an agreed-upon checklist and response questions covering the aspects mentioned above are used to evaluate students' competencies in both general and clinical experiments. At some of the stations, the teacher or observer silently evaluates the student against the checklist provided (8). The OSPE thus provides an environment for "observing" students during assessment.
With this background, this study aimed to design and implement the OSPE as an assessment tool for practical pharmacology for Phase II Bachelor of Medicine and Bachelor of Surgery (MBBS) students. We also evaluated the perception, acceptability, and usefulness of OSPE for the students and the faculty.

Material & Methods
Design and setting(s)
The study was carried out in the Department of Pharmacology at a tertiary care medical college from May 2021 to April 2022. Approval from the Institutional Ethics Committee and the Scientific and Technical Advisory Committee was obtained prior to the commencement of the study.
The flow of the project interventions is shown in Figure 1.

Figure 1.Flow chart of the study
Initially, the sensitization and training of faculty and Senior Residents (SRs) in the department were conducted through a one-day sensitization program, in which the concept of the OSPE was introduced and the need to advocate the OSPE for formative examinations was deliberated upon.
Next, a group discussion with the head and faculty of the department was held regarding the content of the OSPE stations to be included in the formative examination. The roles of the various faculty and staff were discussed in that forum, and details of the design, planning, implementation, and feasibility checks were provided. The OSPE was scheduled for the upcoming formative examination, and a set of 8 OSPE stations and 2 rest stations was considered. A core team of three members was formed to arrange and implement the OSPE with the department faculty.
A blueprint of the OSPE stations was created with the core team under the following headings:
i. Total number of OSPE stations
ii. Place for conducting the OSPE in the department
iii. Content of the OSPE stations
iv. Sequence of the OSPE stations and rest stations, with student flow (Figure 2)
v. Logistics required for each station
vi. Stations requiring faculty presence (manned stations)
vii. Timekeeping: time allotted for each station and total time required
viii. OSPE checklist for skill stations
ix. Marks allotted for each station
The OSPE questions and checklist were prepared by the faculty of the department. The OSPE stations were set up in the department and were initially piloted with faculty; questions were checked for ambiguity, and checklists were scrutinized for completeness. Feedback questionnaires for both students and faculty members were prepared and validated prior to administration.

 
Figure 2. The order of the OSPE stations. The OSPE stations were numbered from 1 to 10. The arrows show the movement of students from OSPE station 1 to OSPE station 10. Of the 10 OSPE stations, 6 were unmanned response stations, 2 were skill stations manned by examiners, and 2 were rest stations. The topics assessed at each OSPE station are shown in the schematic figure.
Of the 10 stations, 6 were response stations (spotter with a question on a dosage form; spotter with a question on a drug delivery device; calculations in pharmacology; prescription writing; reporting an adverse drug reaction on the Central Drugs Standard Control Organization (CDSCO) Adverse Drug Reaction (ADR) reporting form; and pharmacokinetic/pharmacodynamic chart evaluation); 2 were skill stations (demonstration of various routes of drug administration [intramuscular/intravenous/subcutaneous/intradermal] on manikins, and demonstration of the effects of drugs on blood pressure using computer-assisted learning software); and 2 were rest stations. The skill stations were manned.
Participants and sampling
The study included Phase II MBBS students of the 2021–22 batch who attended pharmacology classes and consented to participate. Since the OSPE was conducted as a formative examination, all the students participated, and a purposive sampling technique was used.
Tools/Instruments
The OSPE itself served as the assessment tool and was implemented in the following steps:
Implementation of OSPE
All 10 OSPE stations were set up in the computer laboratory of the Department of Pharmacology. Each OSPE station was numbered, and questions were attached alongside the station. The direction of student movement was marked by arrows, and each station was separated by opaque curtains. A bell was rung at the end of each 6-minute station period, and 1 minute was allowed for movement to the next station.
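As a quick sanity check, the figures above imply a circuit of roughly 70 minutes per student. A minimal back-of-the-envelope sketch (illustrative only, not the authors' scheduling tool; all numbers are taken from the text):

```python
# Back-of-the-envelope timing for the OSPE circuit described above
# (illustrative sketch, not the authors' tool; numbers are from the text).
stations = 10        # 6 response + 2 skill + 2 rest stations
station_min = 6      # bell rung every 6 minutes
transition_min = 1   # 1 minute allowed for movement between stations

# No movement time is needed after the last station.
circuit_min = stations * station_min + (stations - 1) * transition_min
print(f"Time for one student to complete the circuit: {circuit_min} min")
# -> 69 minutes from entering station 1 to leaving station 10.
```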
The OSPE stations were implemented in the formative examination held in December 2021. All 98 students consented to participate in the OSPE examination. The students were divided into 5 batches of approximately 20 each, and the assessment was held over 5 days, with a different set of questions for each batch. Care was taken to keep the questions at the same difficulty level. The students were sensitized in a separate demonstration room, where the concept of the OSPE, the various stations, their contents and positions, the time allowed, and the direction of movement were discussed. Students' queries and points of confusion were addressed.
Data collection methods
The data were collected on standardized feedback questionnaire forms. At the end of the OSPE, feedback was obtained from both faculty and students on these forms regarding their satisfaction, the conduct of the OSPE, time management, competency coverage, relevance of the questions asked, any other stations that might have been included, acceptance, and scope for improvement. Attitudes toward the OSPE among the students and facilitators were documented on a 5-point Likert scale ranging from 1 to 5 (where 1 stands for strongly disagree, 2 for disagree, 3 for neutral, 4 for agree, and 5 for strongly agree).
Development of the Feedback Questionnaire
The experts identified the key areas, item formats, and item domains from which the preliminary questionnaire was developed. The questions were then tested for validity before the final questionnaire was developed. Pilot testing was conducted prior to the final data collection, and minor revisions to the questionnaire were made.
The validity of the parameters of face validity and content validity was tested as follows:
Face validity: Eight experts were asked to comment on the item domain and item pool of the questionnaire. They were asked:
1. Do the questions reflect the attitudes, acceptance, and satisfaction of students regarding OSPE?
2. Are the questions simple and unambiguous?
3. Are the questions easily understandable?
Content validity: Eight experts were asked to rate each item as "essential" or "nonessential."
The content validity ratio (CVR) for each item was calculated by the following formula (9):
CVR = (ne − N/2) / (N/2),
where ne is the number of panelists indicating essential, and N is the total number of panelists.
Items whose CVR was at least 0.75 (Lawshe's critical value for 8 experts) were included in the final questionnaire (9).
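To illustrate how this cutoff operates (a worked sketch based on the formula above, not code from the study): with N = 8 panelists, an item needs at least 7 "essential" ratings, since CVR = (7 − 4)/4 = 0.75.

```python
# Lawshe's content validity ratio for an N-expert panel (illustrative
# sketch based on the formula above; not the authors' code).
def cvr(n_essential: int, n_panelists: int) -> float:
    """CVR = (n_e - N/2) / (N/2); ranges from -1 (none essential) to +1 (all)."""
    half = n_panelists / 2
    return (n_essential - half) / half

N = 8  # panel size used in this study
for n_e in range(4, N + 1):
    retained = cvr(n_e, N) >= 0.75  # Lawshe's critical value for 8 experts
    print(f"{n_e}/{N} rated essential -> CVR = {cvr(n_e, N):.2f}, retained: {retained}")
# Only items rated essential by 7 or 8 of the 8 experts (CVR >= 0.75)
# survive into the final questionnaire.
```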
Data analysis
Data collection and analysis were performed simultaneously. Quantitative data are expressed as the mean ± standard deviation, and categorical data as percentages. For comparisons of quantitative data, Student's t test was used; for qualitative data, the chi-square test or Fisher's exact test was used. A p value of < 0.05 was considered to indicate statistical significance.
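As an illustration of the kind of before/after comparison reported in the Results (a hedged sketch, not the authors' analysis script; the rating lists are hypothetical stand-ins for the raw per-student Likert scores):

```python
# Illustrative sketch of a before/after confidence comparison
# (hypothetical data; not the authors' analysis script).
from scipy import stats

confidence_before = [3, 2, 4, 1, 3, 2, 3]  # hypothetical 1-5 Likert ratings
confidence_after  = [4, 4, 5, 3, 4, 3, 5]  # same students, after the OSPE

# Paired comparison, since each student is rated both before and after.
t_stat, p_value = stats.ttest_rel(confidence_before, confidence_after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> significant change
```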

Results
Of the 98 students in the batch, 96 participated in the OSPE examination. The demographic profiles of the study participants are shown in Table 1.
Table 1. Demographic profile of participants
Parameter | Value (n = 96)
Age (years): range | 19–24
Age (years): mean ± SD | 20.80 ± 1.29
Age (years): median (IQR) | 21 (20–22)
Gender (male : female) | 66 : 30
Residence (day scholar : boarder) | 87 : 9
Abbreviations: SD, standard deviation; IQR, interquartile range
Two students were absent on the day of the examination. The total OSPE score was 35. The average score obtained by the students was 22.23 ± 5.74. The highest score was 31, and the lowest was 3. The interquartile range was 19–26.5. As shown in Table 2, 14.58% of the students scored less than 50%, while 27.08% scored above 75%.
Table 2. Performance of students on the OSPE examination
Range of marks (total = 35) | Number of students (n = 96)
1 to < 17.5 (< 50%) | 14 (14.58%)
17.5 to < 26.25 (50% to < 75%) | 56 (58.33%)
≥ 26.25 (≥ 75%) | 26 (27.08%)
Feedback from students
Of the 96 students who participated in the OSPE, 95 completed the feedback questionnaire (1 student did not return the feedback form). Overall, 66.32% of the students were moderately to severely worried about the OSPE examination, and 91.57% found the OSPE orientation session helpful and adequate. Ninety-two (96.84%) students confirmed that the content of the OSPE covered the curriculum topics taught. The teaching and learning methods (TLMs) used in class adequately prepared 92.63% of the students for the OSPE (Table 3).
Table 3. OSPE feedback from students (n = 95)
Feedback question (anchors for scores 1 and 5) | 1 | 2 | 3 | 4 | 5
Worried about OSPE (1 = not worried at all, 5 = severely worried) | 14 (14.73%) | 18 (18.95%) | 29 (30.53%) | 24 (25.26%) | 10 (10.53%)
Orientation prior to OSPE (1 = inadequate, 5 = adequate) | 2 (2.11%) | 6 (6.32%) | 12 (12.63%) | 17 (17.89%) | 58 (61.05%)
Content of OSPE covered topics taught in class (1 = least coverage, 5 = most coverage) | 0 | 3 (3.16%) | 7 (7.37%) | 25 (26.32%) | 60 (63.15%)
TLM used in class prepared for OSPE (1 = not at all, 5 = absolutely yes) | 3 (3.16%) | 4 (4.21%) | 20 (21.05%) | 24 (25.26%) | 44 (46.32%)
OSPE organization and conduct (1 = disorganized, 5 = absolutely organized) | 0 | 1 (1.05%) | 4 (4.21%) | 13 (13.69%) | 77 (81.05%)
Sufficient time allotted for each OSPE station (1 = very insufficient, 5 = absolutely sufficient) | 3 (3.16%) | 4 (4.21%) | 9 (9.47%) | 25 (26.32%) | 54 (56.84%)
Understanding of questions given at OSPE stations (1 = not at all understood, 5 = completely understood) | 1 (1.05%) | 2 (2.11%) | 11 (11.58%) | 23 (24.21%) | 58 (61.05%)
Environment at OSPE comfortable (1 = not at all comfortable, 5 = absolutely yes) | 1 (1.05%) | 1 (1.05%) | 4 (4.21%) | 20 (21.05%) | 69 (72.64%)
Enjoyed OSPE (1 = not at all, 5 = enjoyed absolutely) | 4 (4.21%) | 2 (2.11%) | 16 (16.84%) | 33 (34.74%) | 40 (42.10%)
OSPE fair and unbiased (1 = not at all fair, 5 = absolutely fair) | 2 (2.11%) | 1 (1.05%) | 7 (7.37%) | 19 (20.00%) | 66 (69.47%)
OSPE is better scoring than traditional assessment methods (1 = not at all, 5 = absolutely better) | 0 | 2 (2.11%) | 13 (13.68%) | 26 (27.37%) | 54 (56.84%)
Satisfaction with OSPE (1 = not at all satisfied, 5 = fully satisfied) | 0 | 1 (1.05%) | 12 (12.63%) | 43 (45.26%) | 39 (41.06%)
Want next practical examination by OSPE (1 = not at all, 5 = absolutely yes) | 1 (1.05%) | 0 | 11 (11.58%) | 20 (21.05%) | 63 (66.32%)
Confidence before OSPE examination (1 = not at all confident, 5 = absolutely confident) | 9 (9.47%) | 22 (23.16%) | 35 (36.84%) | 21 (22.11%) | 8 (8.42%)
Confidence after OSPE examination (1 = not at all confident, 5 = absolutely confident) | 2 (2.10%) | 3 (3.16%) | 23 (24.21%) | 40 (42.11%) | 27 (28.42%)
Recommendation of OSPE method of examination to future students (1 = not at all, 5 = absolutely yes) | 3 (3.16%) | 2 (2.10%) | 9 (9.47%) | 25 (26.32%) | 56 (58.95%)
Identification of areas of weakness by OSPE stations (1 = not at all, 5 = absolutely yes) | 1 (1.05%) | 2 (2.10%) | 3 (3.16%) | 33 (34.74%) | 56 (58.95%)

Notes: Each question was scored from 1 to 5, where 1 represents the lowest and 5 the highest value. Entries show the number (percentage) of students who chose that score.
Abbreviations: OSPE, objective structured practical examination; TLM, teaching-learning method
Nearly all (98.95%) students found the organization and conduct of the OSPE examination adequate, and 97.89% found the environment comfortable. Few students (7.37%) reported that the time allotted to each station was insufficient to complete the tasks, and 3.16% had difficulty understanding the questions. Most students (93.68%) enjoyed the OSPE, and 98.95% were satisfied (Likert scale 3-5). A total of 96.84% of the participants found the OSPE to be a fair and unbiased method of assessment. The proportion of students reporting confidence (Likert scale 4-5) increased from 30.53% before the OSPE examination to 70.53% after it (p < 0.001, Student's t test) (Figure 3). A total of 85.26% of the students would recommend the OSPE method of examination to future students, and 96.84% could identify their areas of weakness after the examination (Likert scale 3-5) (Table 3).

Figure 3. Column graph showing the increase in confidence before and after the OSPE examination. The numbers on top of each bar represent the number of students who chose that response. The confidence scale ranged from 1 = not at all confident to 5 = absolutely confident.
Feedback from Faculty
The faculty of the department comprised one professor, one associate professor (lead author), one assistant professor, one demonstrator, and one senior resident. All of them agreed that the OSPE sensitization program held in the department was effective and useful. All agreed that the learning objectives of the teaching-learning sessions, the TLMs used, and the curriculum topics taught were satisfactory and aligned with the conduct of the OSPE examinations. All the faculty agreed that the OSPE was unbiased and structured, although it required more effort, more manpower, and more preparation than traditional assessments. All of the faculty members found the OSPE better than traditional assessment. Seventy-five percent believed that further practical examinations in the department may be conducted by OSPE and that the OSPE was feasible for both formative and summative examinations; however, one faculty member disagreed with the feasibility of conducting the OSPE for summative examinations (Table 4).
Table 4. OSPE feedback from faculty
Feedback questions for faculty | Strongly agree | Agree | Neutral | Disagree | Strongly disagree
The one-day sensitization program on OSPE held in the department on 07.10.2021 was useful in designing and implementing the OSPE. | 1 | 3 | 0 | 0 | 0
The learning objectives of the teaching-learning sessions were in tune with the OSPE examination. | 1 | 3 | 0 | 0 | 0
The teaching-learning methods and tools were satisfactory for the conduct of the OSPE examination. | 0 | 4 | 0 | 0 | 0
The content of the OSPE stations was satisfactory with respect to the curriculum topics taught. | 1 | 3 | 0 | 0 | 0
The OSPE conducted was unbiased and structured. | 2 | 2 | 0 | 0 | 0
The preparation of OSPE stations required more effort on your part than traditional practical examinations. | 3 | 1 | 0 | 0 | 0
The conduct of the OSPE needs more manpower than traditional practical examinations. | 1 | 2 | 0 | 1 | 0
Preparing OSPE stations and making a good-quality, comprehensive question bank and checklists for the OSPE stations is time-consuming. | 2 | 2 | 0 | 0 | 0
The OSPE method of examination was better than the traditional practical examinations previously conducted in pharmacology. | 4 | 0 | 0 | 0 | 0
Further practical examinations in the department should be conducted by the OSPE method. | 2 | 1 | 1 | 0 | 0
The OSPE is feasible for both formative and summative assessment of students in pharmacology. | 1 | 2 | 0 | 1 | 0
The OSPE is repetitive/boring for the observers. | 0 | 0 | 2 | 1 | 1
Discrimination between average and brilliant students may be difficult in the OSPE. | 0 | 1 | 0 | 2 | 1
The OSPE can test the depth of knowledge of a student. | 1 | 1 | 2 | 0 | 0
The OSPE method of examination could be recommended to other departments in the institute. | 2 | 2 | 0 | 0 | 0

Fifty percent of the faculty disagreed that the OSPE is boring and repetitive, while 50% were neutral. Seventy-five percent disagreed that the OSPE cannot discriminate between average and brilliant students. Fifty percent agreed that the OSPE can test the depth of a student's knowledge. All the faculty members would recommend the OSPE form of assessment to other departments (Table 4).

Discussion
The acquisition of practical skills is one of the most important attributes of medical students' training. Objective assessment of practical (psychomotor, communication, and attitude) skills poses a formidable challenge to examiners. The assessment of practical skills in pharmacology needs to move from subjective to objective methods, and the OSPE is one such method. The CBME curriculum incorporates all three domains (cognitive, psychomotor, and affective) in the practical syllabus for pharmacology. Skill assessment by traditional methods is subject to variability on the part of the examiner, patient, or student, which significantly affects the score. Traditional methods tend to evaluate the global performance of the student rather than individual competencies; most of the time, the final outcome is tested, but the process of arriving at it is not. According to CBME, individual competencies and their development require special focus. Additionally, feedback can be provided to the students during the assessment process, allowing them to improve their skills. The OSPE provides an objective assessment of competencies, thereby decreasing variability. It tests not only skills and knowledge but also attitudes. The OSPE tests students' ability to integrate knowledge, clinical skills, and communication with patients and can be administered to a large number of students at the same time. Keeping in mind this wider applicability, we designed and implemented the OSPE in our Department of Pharmacology for formative assessment.
In the OSPE examination, only 14.58% of the students scored < 50%, while 27.08% scored > 75%; thus, overall OSPE scores were relatively high. Most students were worried prior to the OSPE, but the orientation session mitigated their apprehension. The students had not attempted any previous OSPE in pharmacology, which underscores the importance of the orientation session; owing to it, the students could understand and follow the instructions properly without any confusion. In 2016, Vishwakarma et al. also conducted a study on the OSPE in which they sensitized and oriented students beforehand regarding the pattern of the OSPE examination and discussed sample questions (10).
More students in our study (97%) than in Vishwakarma et al.'s study (74%) agreed that the topics tested were linked to the topics taught in class (10). Ninety-three percent of our students opined that the TLMs used in class prepared them for the OSPE examination. In CBME, the assessment is aligned with the learning objectives and the TLM to achieve a specified educational outcome, and if any one corner of the "golden triangle" is changed, the others should follow suit (11). At our institution, all three vertices of the golden triangle were aligned.
During the OSPE, only 3% of the students had difficulty understanding the questions, comparable to the 6% reported by Vishwakarma et al. (10). The time allotted at our stations was 6 minutes each, compared with the 5 minutes reported in other studies (10). Seven percent of our students found the time inadequate, compared with 5% in the Vishwakarma et al. study (10). The environment at the time of the examination was comfortable, as reported by 98% of the students, and they appreciated the organization and conduct of the exam. Nearly all reported a satisfaction rating of ≥ 3 on the Likert scale, substantiating our effort to implement the OSPE for the first time in our department. Nearly 97% of the students could identify their areas of weakness and scope for improvement, making the OSPE a healthy assessment method. Additionally, the students were ready to recommend the OSPE to future students and to be assessed by it themselves, showing that they are ready to accept the OSPE as the assessment method for future examinations. This finding agrees with the study by Chandelkar et al., in which all the students accepted the OSPE because it helped them improve not only their practical skills but also their application in pharmacology (12). The OSPE was perceived as fair, unbiased, and better for scoring, akin to the study by Malhotra SD et al., in which 66.4% of students found the OSPE format fair and more objective than conventional examinations (6). The acceptability of the OSPE is thus increased, as each student has to perform the same tasks.
The faculty of the department appreciated the faculty sensitization program, as it helped them decide beforehand what tasks to assess, set deadlines, and organize the OSPE in the department. The faculty also found the OSPE to be an unbiased way of assessing students, as pre-prepared checklists and answers with mark distributions were available. Despite the effort and time invested in preparing OSPE questions, checklists, and answers, all of the faculty members found the OSPE better than traditional practical exams. This feedback was similar to that of Saurabh MK et al., where all the faculty members favored the OSPE method of assessment (5). One faculty member did not consider the OSPE feasible for summative examinations because of a lack of manpower in the department. Nevertheless, the facilitators could assess the depth of a student's knowledge via the OSPE even though the evaluation was silent; thus, the OSPE inculcated a culture of observing students during assessment rather than questioning them. The faculty felt that the OSPE should also be recommended to other departments of the institute.
Our study was limited by the fact that communication skills could not be incorporated into the OSPE because the topic had not yet been covered in class.

Conclusion
The key to a successful OSPE is careful planning. A well-designed OSPE can drive learning and have a positive educational impact. The OSPE carried out in the formative examination in pharmacology was well accepted by both students and teachers. Careful specification of the content of the OSPE increases its validity, and a structured marking schedule increases its objectivity. This paves the way for incorporating the OSPE into further formative and summative examinations, keeping in mind the modifications suggested by the students and facilitators and thereby improving the stations. Being structured and objective, the OSPE limits examiner bias and may be considered for medical education assessments. Universities may formulate model OSPE stations, questionnaires, and checklists for the medical schools under them to ensure uniformity of examinations. Further research is needed to make the OSPE more viable in resource-poor (manpower and logistics) settings.

Ethical considerations
The study was conducted ethically in accordance with the Indian Council of Medical Research's National Ethical Guidelines for Biomedical and Health Research Involving Human Participants (2017) and current Good Clinical Practice (GCP) guidelines.
Artificial intelligence utilization for article writing
The authors claim that no artificial intelligence was utilized for article writing.
Acknowledgments
The authors acknowledge the contribution and enthusiasm of the entire Department of Pharmacology, the faculty, and staff for their wholehearted support of the project. We are indebted to Prof. Dr. Karabi Baral, Principal, Rampurhat Government Medical College, for being persistently encouraging, and to our MEU Coordinator, Prof. Dr. Simit Kumar, for always motivating the team. We extend our heartfelt gratitude to the Phase II MBBS students of the institute for being open to newer methodologies of teaching and assessment and for assisting us in every way possible. We appreciate the mentorship of Prof. Dr. Dinesh Badyal, Christian Medical College, Ludhiana, who was a cornerstone of this ACME project.
Conflict of interest statement
The authors certify that the study reported has not received any financial support from any pharmaceutical company or other commercial source. None of the authors or any first-degree relative of the authors has any financial interest in the subject matter discussed. The study was purely academic in nature.
Author contributions
All the authors contributed to the study conception, study design, data collection, analysis, and report preparation.
Supporting resources
The source of support was primarily institutional. Rampurhat Government Medical College and Hospital, Birbhum, West Bengal, India, contributed to the infrastructure, participants, and logistics for conducting the study. The project was carried out as an academic exercise, and no grants were received.
Data availability statement
The raw data are available from the corresponding author.


 
Article Type: Original Research | Subject: Medical Education
Received: 2023/09/12 | Accepted: 2024/02/27 | Published: 2024/09/10

References
1. Ananthakrishnan N. Objective structured clinical/practical examination (OSCE/OSPE). Journal of Postgraduate Medicine. 1993;39(2):82-4. [PubMed]
2. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. British Medical Journal. 1975;1(5955):447-51. [DOI]
3. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education. 1979;13(1):41-54. [PubMed]
4. Medical Council of India. Competency Based Undergraduate Curriculum for the Indian Medical Graduate. 2018. [Online PDF document] Available from: [Article]
5. Saurabh MK, Patel T, Khatun S, Chaudhri J, Patel P. Implementation of objective structured practical examination in formative assessment for undergraduate practical pharmacology. Maedica (Bucur). 2021;16(1):64-74. [DOI]
6. Malhotra SD, Shah KN, Patel VJ. Objective structured practical examination as a tool for the formative assessment of practical skills of undergraduate students in pharmacology. Journal of Education and Health Promotion. 2013;2(1):53. [DOI]
7. Medical Council of India. Competency Based Assessment Module for Undergraduate Medical Education Training program, 2019: pp 1-30. [Accessed on 27.3.24]. Available from: [Article] [Google Scholar]
8. Azeem MA. A brief overview regarding various aspects of Objective Structured Practical Examination (OSPE): modifications as per local needs. Pakistan Journal of Physiology. 2007;3(2):1-3.
9. Lawshe CH. A quantitative approach to content validity. Personnel Psychology. 1975;28(4):563-75. [DOI]
10. Vishwakarma K, Sharma M, Matreja PS, Giri VP. Introducing objective structured practical examination as a method of learning and evaluation for undergraduate pharmacology. Indian Journal of Pharmacology. 2016;48(Suppl 1):S47-S51. [DOI]
11. Chatterjee D, Corral J. How to write well-defined learning objectives. Journal of Education in Perioperative Medicine. 2017;19(4):E610. [PubMed]
12. Chandelkar UK, Rataboli PV, Samuel LJ, Kamat AS, Bandodkar LV. Objective structured practical examination: our experience in pharmacology at Goa Medical College, Bambolim Goa, India. International Journal of Scientific Reports. 2015;1(2):113-7. [DOI]

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.