One of the most important aspects of dental training is the ‘hands-on’ application of learned subject matter. Lectures, and problem solving that applies lecture content, are useful ways to build knowledge. Problem solving gives adult learners direct application of knowledge and helps maintain motivation to learn. However, dentistry also demands a high degree of psychomotor application of knowledge: students must be able to prepare a tooth that has been damaged for any of a variety of reasons. This is where simulation training is used.
Most lectures are followed by simulation training using mannequins placed in a dental chair and surrounded by the same dental instruments an operator will find in their own practice. Students are provided simulated cavities and asked to produce an acceptable cavity preparation for a specific restorative material. This has been the most common mode of facilitating learning in adult dental students. It is effective because it allows learners to apply lecture teachings to a direct problem – something they will be doing for the rest of their career. Through practice and critical evaluation, students learn to master their skills.
Learning from this course has helped me understand why our simulations have been so effective: we seem to be following many of the principles and recommendations discussed in the reference source. I would like to share our steps and use the reference source to further improve our current protocol. We start with a lecture that introduces the learning problem, with clear outlines provided in writing. At times, we run an I-clicker quiz to ensure that all learners possess the required knowledge base. Quiz results are confidential and are recorded only on the learner’s learning site. Classroom responses are displayed after each quiz and discussed with the class. This has proven to be a good problem-solving and discussion strategy. I am always amazed at how learners justify their points of view; they help me realize that their ‘answer’ could also be considered correct.
From the lecture setting, we move into the simulation lab. Each learner receives an exercise sheet at least two days before the simulation clinic. It provides the learning outcomes, the steps for the simulation, video support, and the evaluation criteria. Learners set up their plastic teeth and infection-control barriers and then carry out the ‘exercise of the day’. Upon completing the exercise, learners are asked to self-evaluate their work. I strongly encourage this and find tremendous learning value in self-reflection. However, learners tend to skip self-evaluation and instead seek instructor feedback, either informally or formally. To address this, I would like to add a graded self-reflection component to our exercises. I believe this will increase participation in the activity and, hopefully, help form a habit of self-critiquing one’s work.
After self-reflection, students are asked to make any adjustments and corrections and then re-evaluate their own work. If the preparation is not acceptable and/or cannot be corrected, a new preparation is encouraged. When students feel that their preparation meets clinical criteria, they are asked to have a peer evaluate their work; they can choose any peer. This has helped students learn from one another. However, I find this area is not currently set up in a fair and kind manner. Before any peer evaluation, facilitators should set certain ‘ground rules’ for respectful, kind, and professional feedback. This tone needs to be established at the beginning of the course, and currently it is not. Peer evaluation and feedback can be extremely helpful when offered in a constructive and respectful manner. Peers can share ideas and assist one another, further developing each other’s skills and forming collegial professional bonds.
Students are then asked to modify their preparation in light of the feedback they have received from their peer. Once this is completed, the student should seek facilitator feedback. Of course, the learner can seek assistance at any time during the simulation clinic, but facilitators are asked specifically to guide students rather than simply provide ‘end process’ answers. Learning through critical thinking is our goal, and it cannot occur when answers are easily provided. Currently, we are struggling with facilitator ‘buy-in’ with respect to our teaching approach. I find the information in the resource article so supportive that I plan to share it with my group, in the hope that they will read it and understand the effectiveness of experiential learning. I should point out that it is much easier for facilitators to simply provide answers than to guide learners; this method of teaching requires a great deal of time and energy from facilitators in order to be effective.
Finally, learners are encouraged to repeat the exercise from start to finish on a new plastic tooth. I usually find that they run out of time or are ready to start another, more difficult exercise. This could be changed by adding a grading component for multiple attempts at the same exercise. Alternatively, we could post the highest mark from all attempts on the learner’s grading report. Another approach is to have students choose their best preparation for grading. I like this method, as I believe it will encourage deep self-evaluation of each preparation in order to determine the ‘best one’. I do find that those who repeat exercises have a better understanding of the preparations and tend to perform well in most subsequent exercises. With limited funding, however, clinic time is not readily available; as a result, students who intend to practice may not have adequate time to do so. This limitation needs to be further assessed within our program.