Dear All
A couple of people have asked about the validity of using scenarios in validated tests that are not relevant for their learners. Does anyone have any ideas for Irina?
Irina Ibraghimova says:
Dear Janet and colleagues,
I hope this discussion will produce useful results, as this is one of the "burning" questions in our practice.
In our organization (the American International Health Alliance) we provide training on EBP, in Russian, to health care professionals in the former Soviet Union countries, so we try to translate important EBP-related materials into Russian. Two years ago we translated the Fresno test and presented it at one of our workshops as an example of an evaluation tool, but it was rejected by our audience because the scenarios used there seemed absolutely unrealistic to them (a breast-feeding mother coming for contraception advice). So all we are using for now is a before-and-after workshop questionnaire (asking about participants' confidence in different aspects of EBP).
Thanks,
Irina Ibraghimova.
Coordinator, Medical Information Resources
AIHA
ibra@zadar.net
www.eurasiahealth.org
lrc.aiha.com
I'm adding a related post from Julie Tilson here (originally e-mailed to the EBHC mailbase):
I teach evidence-based practice to Doctor of Physical Therapy graduate students at the University of Southern California in the U.S. The program has used an internally developed “EBP Self-Assessment” to measure change in students’ self-efficacy in EBP. It asks students to rate their confidence in 15 EBP skills on a 10-point scale from “not confident at all” to “very confident”.
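For readers wondering how data from this kind of instrument are typically summarized, here is a minimal sketch of computing per-skill mean change in confidence. The skill names and ratings below are invented for illustration; the actual USC "EBP Self-Assessment" items are not reproduced here.

```python
# Sketch: summarizing pre/post self-efficacy ratings on a 1-10 scale.
# Skill names and ratings are hypothetical, not taken from the USC instrument.

def mean_change(pre, post):
    """Mean per-skill change in confidence (post minus pre)."""
    assert pre.keys() == post.keys()
    return {skill: sum(post[skill]) / len(post[skill])
                   - sum(pre[skill]) / len(pre[skill])
            for skill in pre}

# Three students rate two (of a possible 15) EBP skills before and after a course.
pre  = {"ask a focused question": [3, 4, 2],
        "appraise an RCT":        [2, 3, 3]}
post = {"ask a focused question": [7, 8, 6],
        "appraise an RCT":        [6, 7, 8]}

for skill, delta in mean_change(pre, post).items():
    print(f"{skill}: {delta:+.1f}")
```

A fuller analysis would of course use a paired statistical test rather than raw means, but the scoring idea is the same.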
I used Shaneyfelt et al. to guide my search for a more objective assessment tool to use in evaluating our curriculum’s outcomes. The Fresno test reported by Ramos et al. came closest to meeting our needs but was not a perfect fit for several reasons:
1. The clinical scenarios (and corresponding grading rubrics) are specific to medical practice and are not appropriate for testing the skills of physical therapist students and clinicians.
2. We wanted to include specific content to measure skills of integrating appraised research with patients’ unique biology, values, and circumstances.
3. We also wanted to include specific content to measure skills of integrating appraised research with clinical expertise.
Our faculty are currently pilot testing an EBP Skills Aptitude Test that can be easily adapted for use across all major healthcare fields. We have developed a physical therapy version with discipline-specific clinical scenarios, appraisal questions, and grading rubrics. We have also included open-ended questions to test skills for integrating patient values/circumstances and clinical expertise.
I would be happy to share the measure in Sicily and I would love to hear from anyone who would be interested in joining our collaboration on this project. We are still in the phase of establishing content validity and I would be most appreciative of tapping into the vast expertise from this group!
Sincerely,
Julie Tilson, DPT NCS
Julie, I'd be really interested in learning more about how you are assessing the integration of research with patient circumstances and clinical expertise. Also, we have several groups here who are looking at assessment for physiotherapists, occupational therapists, and nurses. What sort of collaboration are you looking for on the pilot?
Best wishes
Janet
Hi Joe
It's nice to hear from you; it's never too late to add to the discussion! I have a couple of questions
(--inserted below)
Hello Janet,
I'm sorry this is delayed and I hope you can still use it if you find it interesting.
The assessment tools I have used with general dental residents include:
1.) A quantitative tool assessing knowledge and skills, which I've named the UNM EBD test. This test is based on the Fresno test and the Berlin questionnaire. It uses elements of the objective structured clinical examination, short-answer questions, matching, and multiple-choice questions. The test requires simple statistical calculations and includes questions on the structure of evidence, searching, and other EBHC competencies.
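As a concrete illustration of the kind of "simple statistical calculations" that Fresno- and Berlin-style tests ask for, here is a sketch of the standard risk arithmetic (relative risk, absolute risk reduction, number needed to treat). The event counts are invented and are not taken from either instrument.

```python
# Sketch: risk arithmetic commonly examined in EBP assessments.
# Event counts below are hypothetical, for illustration only.

def risk_measures(events_treat, n_treat, events_control, n_control):
    """Compute RR, ARR, and NNT from raw event counts in two trial arms."""
    cer = events_control / n_control   # control event rate
    eer = events_treat / n_treat       # experimental event rate
    arr = cer - eer                    # absolute risk reduction
    return {
        "relative_risk": eer / cer,
        "absolute_risk_reduction": arr,
        "number_needed_to_treat": 1 / arr,
    }

# 10/100 events with treatment vs 20/100 with control:
m = risk_measures(10, 100, 20, 100)
print(m)
```

With these numbers the relative risk is 0.5, the absolute risk reduction is 0.10, and the number needed to treat is 10.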
--Joe, how did you modify the tools to make them relevant for dentistry?
2.) A quantitative tool assessing residents' attitudes and residents' self-assessment of their EBHC competency. This is a Likert-scaled questionnaire.
--So is this a pre-post tool assessing change in attitudes and perceived self-confidence?
3.) A qualitative arm investigating residents' attitudes using semi-structured interviews and field notes of residents' activities in problem solving.
--What sort of domains are you noting for problem solving?
Unfortunately, I have found no reliable way to measure impact on patient outcomes or durability of EBHC in practice after residents leave the program.
--Let's see if anyone else is tackling this issue on the blog.
Best wishes,
Joe Matthews