Substance Abuse and Mental Health Services Administration
Center for Substance Abuse Prevention
Technical Assistance Bulletin, August 1998

Evaluating the Results of Communication Programs

Prevention program planners sometimes find evaluation difficult to do, and outcome evaluation may seem to be the most difficult. This bulletin describes how to plan and conduct outcome evaluation and explains the benefits of conducting outcome evaluation to document the results of communication efforts.

The Health Communications Process: Where Does Evaluation Fit In?

Evaluation is a continuous process that begins with identifying the prevention problem or issue to be addressed and who is affected by the problem. Then, as shown in the Health Communications Process, evaluation continues throughout the development and implementation of the communication program.

Types of evaluation include formative, process, outcome, impact, and efficiency. Each of these serves a different purpose and requires varying amounts of technical expertise and resources. In this bulletin, these types of evaluation are defined as follows:

Formative evaluation (or formative research) is conducted as a part of program development. Collecting evaluation data at this early point can help make decisions about target audience selection and the types of communication messages, channels, and activities to be used. Pretesting messages and materials to assure their effectiveness prior to final production is one kind of formative evaluation.

Process evaluation addresses how the prevention communication program is implemented: Is the program involving target audience members? How many? Are the planned activities being conducted? By whom? By when? Process evaluation data identify whether the program is progressing as expected so that any program components that are not working as planned can be adjusted. If the program is not being implemented as planned, then the chances are that the program outcomes will not be met. This information is important both to improve the operation of a program and to replicate it.
Outcome evaluation measures the effects of a program on the target population in the short term and provides evidence of what has changed as a result of the program. At its best, outcome evaluation is a comparison of "before" and "after." For example: How much did the youth know about tobacco as an addictive drug before the program ("baseline knowledge")? What were their intentions to use tobacco? What was their knowledge after the program? Was there a change? Did their intentions to use tobacco change?

Impact evaluation measures the effects in the long term, such as the prevention program's effect on a community problem 6, 12, and 18 months after cessation of the program. Because impact evaluation addresses a more complex situation and requires sustained attention and multiple program strategies over time, it is usually the most complex and costly type of evaluation.

Efficiency evaluation addresses whether the program is making the best use of resources within the context of the stated program objectives and goals.

"The challenge most threatening to the goal of finding solutions to the problem of drug abuse is the current lack of evaluation evidence to demonstrate the success of individual programs. This failure to document results represents a great loss to this developing field, where reliable evidence of success could guide so many efforts."
- U.S. General Accounting Office, 1992

What Outcome Evaluation Can Do

Evaluation offers program planners the tools for seeing where they are going and knowing when they get there. It can help them to:

Guide administrative decisions. Evaluation involves making decisions about value, worth, and merit. For example, will teaching styles affect the impact of the program on students? The answer to this question can help determine the types of teachers to hire or whether it will be worthwhile to invest in teacher training.

Refine the program. Outcome evaluation findings can clarify what has happened and what can realistically be accomplished. The program can then be reframed and/or its objectives revised based on documented progress.

See concrete results. Evaluation findings help provide reasons for involvement to community groups and others participating in the program, help with recruitment, and bolster staff morale.

Prove to funders the value of their investment. This is important for program continuation!

Gain credibility. Without more formal evaluation, program planners have to rely on anecdotal evidence to describe what did or did not happen. Evaluation provides objective data or documented facts. Anecdotes are important, too, but credibility is earned by being able to explain how each activity resulted in change.

Support replication. Outcome evaluation shows that the strategy or program works. When coupled with process evaluation (which shows how the program was implemented), the steps needed to replicate the program and to reproduce the results can be described.

Advance the field. It takes concerted efforts among many prevention specialists to be successful in reaching prevention goals. So it is important to identify and share "what works and what doesn't" to increase the use of communication strategies that prove successful, and to modify or replace those that do not.

Outcome Evaluation Step by Step

Planning outcome evaluation of a communication program requires six steps:

1. Begin with clear and action-oriented communication program goals and objectives
2. Turn the communication program outcome objectives into outcome evaluation questions
3. Design the outcome evaluation
4. Gather data
5. Analyze data and write the evaluation report
6. Use the findings

1. Begin With Clear and Action-Oriented Communication Program Goals and Objectives

Appropriate goals describe the overall change that planners expect from the program. These goals, which address a broad substance abuse problem and often describe an improvement in the substance abuse situation (such as making a school drug-free or postponing the age of first use among youth), are prevention goals.

Reaching prevention goals often means using a combination of strategies, such as strengthening drug-free policies, generating community support for activities, and communicating about the efforts. Communication strategies should be interwoven with other prevention strategies to reach the prevention goals.

Achievable objectives for a communication program specifically relate to what communications can reasonably be expected to contribute to resolving the problem. That is, objectives for communication efforts should contribute to:

- Raising awareness
- Increasing knowledge
- Influencing attitudes and norms
- Showing the benefit of behavior change
- Reinforcing positive knowledge, attitudes, and behavior
- Demonstrating skills
- Suggesting/prompting an action
- Increasing demand for services
- Refuting myths and misconceptions.
First, identify what communications will contribute to the broader prevention program. Then, write communication objectives that are:

- Specific - written to point the way to action
- Measurable - to assess progress
- Attainable - to increase the probability of success
- Prioritized - to direct the allocation of resources
- Quantified - to specify how much change is desired
- Targeted - to identify who should change
- Time specific - to tell when the objective is to be reached
- Useful - to contribute to accomplishing the program goals.

Establishing realistic, measurable objectives is important. Information on recent trends can provide some clues to what is possible. For example, if negative attitudes toward cocaine use have increased by 3 percent among youth in the past year, program planners may decide to set a goal in the same range for the next year. Just make sure that the change aimed for is reasonable, and remember that expecting 100 percent of any group of people to change is not reasonable.

The goals and objectives for the Nation set forth in the Healthy People 2000 report can provide some guidance, as can those developed by many States and communities. Some national substance abuse prevention objectives relate to communications, others to broader prevention goals to which a communication program can contribute.

Communication program objectives may describe what is expected to happen or change (outcome objectives), while others may relate to how it will happen (process objectives). The first step in planning outcome evaluation is writing outcome objectives, such as the following:

- By 1997, increase the knowledge of 50 percent of health clinic counseling staff on five key effects of using crack.
- By the end of the program, increase by 75 percent the number of 13- to 18-year-old girls who know the five immediate effects of tobacco smoking on their bodies.
- At the end of the presentation, 90 percent of the runaway youth will be able to list an agency where a friend with an alcohol or other drug problem can get help.

2. Turn the Communication Program Outcome Objectives Into Outcome Evaluation Questions

The questions an outcome evaluation is designed to answer should be based on the program's outcome objectives. For example:

Outcome objective: By the end of the program, increase by 75 percent the number of 13- to 18-year-old girls who can articulate the five immediate effects of tobacco smoking on their bodies.

Evaluation questions: What does the target audience currently know about the immediate effects of smoking tobacco? (This is the "baseline.") What do those who participate know after the program? During this same time, what do nonparticipants within the target audience know?

If more than one target audience (such as youth leaders and other youth) is involved, measure changes with each group. And if others such as teachers are involved, add evaluation questions to address their role (such as identifying changes in teaching methods). Before moving ahead to evaluation design, make sure that the evaluation questions focus on the most important issues for the program.
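To make Step 2 concrete, here is a minimal Python sketch of how answers to these evaluation questions might be tallied from survey records. It is not from the bulletin: the record layout, the numbers, and the reading of the objective as a 75 percent relative increase over baseline are all illustrative assumptions.

```python
# Hypothetical sketch: tallying baseline and post-program survey answers
# against the tobacco objective above. All data here are invented.

def pct_knowing_all_five(records):
    """Percent of respondents who named all five immediate effects of smoking."""
    return 100.0 * sum(r["effects_named"] >= 5 for r in records) / len(records)

# Invented survey records for participants and a nonparticipant comparison group.
pre_participants = [{"effects_named": n} for n in (2, 1, 5, 3)]
post_participants = [{"effects_named": n} for n in (5, 4, 5, 5)]
post_nonparticipants = [{"effects_named": n} for n in (2, 1, 3, 5)]

baseline = pct_knowing_all_five(pre_participants)        # "baseline knowledge"
outcome = pct_knowing_all_five(post_participants)        # after the program
comparison = pct_knowing_all_five(post_nonparticipants)  # nonparticipants

# Assumption: the objective's "increase by 75 percent" is read as a relative
# increase over the baseline percentage.
met = outcome >= baseline * 1.75
print(f"baseline {baseline:.0f}%, outcome {outcome:.0f}%, "
      f"nonparticipants {comparison:.0f}%, objective met: {met}")
```

Keeping the tally tied to the exact wording of the objective makes it easy to say later whether the objective was met.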
Case Study 1

Across Ages: An Intergenerational Mentoring Approach to Drug Prevention (Philadelphia, PA) matches elder mentors to help sixth grade students in high-risk environments. Across Ages also includes community service for students, workshops for parents, and a classroom curriculum. Program evaluation includes both process and outcome evaluation components.

The outcome evaluation design includes two randomized intervention groups and a control group using pretest (before the intervention) and posttest (after the intervention) measures. Within each school, three sixth grade classes (from among those with teachers who agreed to participate) were assigned to the control group (no program intervention), the moderate intervention group (all program components except mentoring), or the full intervention group (all program components including mentoring). The groups were demographically comparable.

Previously tested evaluation instruments were used to measure attitudes toward alcohol and cigarette use, reactions to situations involving drug use, reactions to stress or anxiety, self-perception, frequency of substance abuse, reactions to persuasion to use substances, knowledge about older persons, and personal sense of well-being. Newly designed evaluation instruments looked at attitudes toward school, elders, and the future; knowledge about substance abuse; and problem-solving skills. Also measured was the extent to which students actively participated in the activities (to determine how much each student was exposed to the program).

Of the 13 measures, evaluators found very good results in the pretest for four in both the experimental and control groups. For example, students already had the desired attitudes toward drug use, so less change could be expected in these areas. For the other nine measures, however, there was significantly more improvement among students in the experimental groups than among students in the control group, with the most improvement among students who received the full intervention. The use of a control group permitted comparisons and demonstrated the positive effects of both classroom interventions and mentoring.

3. Design the Outcome Evaluation

There are various research designs that can be used for outcome evaluation. Basic textbooks on evaluation provide detailed explanations of these. However, it may be useful to work with an experienced evaluator who can suggest which research design is best applied under which conditions.

The best outcome evaluation uses a randomized "experimental" design. This means that individuals within the target audience are assigned into two equivalent groups: one group will be a part of the program (the experimental or intervention group), and the other will not (the control group). Ask the same set of evaluation questions to both groups before the program begins. Answers to these questions (e.g., on the knowledge or attitudes the program wants to affect) serve as the baseline, or starting point. Then, ask both groups the same questions after the program is finished. Presumably, any significant differences in the responses from the intervention group, as compared with the control group, can be attributed to the program.
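As a minimal sketch of the random assignment described above (the roster, group sizes, and seed are invented for illustration):

```python
import random

# Hypothetical roster of target-audience members.
roster = [f"student_{i:03d}" for i in range(1, 101)]

rng = random.Random(42)  # fixed seed so the assignment can be documented
rng.shuffle(roster)

midpoint = len(roster) // 2
intervention_group = roster[:midpoint]  # will take part in the program
control_group = roster[midpoint:]       # will not take part

# Both groups answer the same questions before the program (the baseline)
# and again after it ends; posttest differences between the groups can then
# be attributed to the program.
print(len(intervention_group), len(control_group))
```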
Randomized experimental designs provide the best evidence of outcomes. This is the kind of outcome evaluation that is most acceptable to people looking for "hard evidence" of success. Using an experimental design will also:

- isolate outcomes to determine if certain changes, when controlled for, affect the result
- minimize or prevent conditions that obscure clear interpretation of results
- address potential problems and biases that could become barriers to understanding whether the program is responsible for the outcomes.

However, for some community communication programs, this type of evaluation becomes practical only if evaluation expertise can be accessed. Other types of designs include:

- Collecting information pre- and post-intervention (the activity or program) only from participants to see what happened to them as a result of the program. (This works best for short programs, such as a training session, where program planners can be reasonably sure that the program caused or contributed to the change.)
- Looking for sources of existing baseline information and conducting postintervention data collection only (for example, using existing community or State surveys of knowledge and/or attitudes toward substance use/abuse).
- Sorting other information already gathered from participants and nonparticipants (such as classroom attendance and school grades).
- Collecting information only after the intervention to see if participants are where program planners want them to be (for example, ready to be peer counselors to others). This does not tell planners whether participants obtained that status as a result of the program.

Most outcome evaluation methods are based on collecting data about participants through a questionnaire, interview, or observation.

Outcome Evaluation Options To Match Resources

Available resources, experience, and expertise all help guide program planners to the most practical evaluation design for their program. The rule of thumb for well-planned intervention programs is to allocate at least 15 percent of the program budget to evaluation. Some type of outcome evaluation, however, is possible for almost any budget:

- Minimal resources. Activity assessments (numbers of people who participate and their responses to questions about what they learned, thought, and/or did as a result of their involvement).
- Modest resources. Pre- and postassessments of desired changes in program participants (self-reported or observed, questionnaires filled out by participants, telephone or in-person interviews, third-party observation/assessment); for media programs, a media content analysis by monitoring and analyzing coverage (content and quantity) appearing in the media before, during, and after the program.
- Substantial resources. Use of an experimental design with intervention and control groups, with pre- and postassessment to measure change after program exposure.
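Under the "substantial resources" option, the core of the analysis is comparing each group's pre-to-post change. The sketch below uses invented scores; the bulletin prescribes no particular statistic, and a real evaluation would add a significance test (for example, a two-sample t-test on the change scores) rather than comparing means by eye.

```python
from statistics import mean

# Invented knowledge-test scores (0-10 scale) before and after the program.
intervention_pre = [3, 4, 2, 5, 3]
intervention_post = [7, 8, 6, 9, 7]
control_pre = [3, 5, 2, 4, 4]
control_post = [4, 5, 3, 4, 5]

# Average pre-to-post change within each group.
int_change = mean(post - pre for pre, post in zip(intervention_pre, intervention_post))
ctl_change = mean(post - pre for pre, post in zip(control_pre, control_post))

# The gap between the two average changes estimates the program effect:
# improvement beyond what happened to the control group anyway.
print(f"intervention +{int_change:.1f}, control +{ctl_change:.1f}, "
      f"estimated effect {int_change - ctl_change:+.1f}")
```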
Case Study 2

Right Turns Only! (Prince Georges County, MD) is a video-based drug education series produced by the Prince Georges County School System with CSAP funding. The effect of this video series (including collateral print material) on student knowledge, attitudes, and behavioral intentions was tested at 12 schools among approximately 1,000 seventh grade students assigned to four groups. One group received only the video-based education; a second group received the program in addition to a traditional drug education curriculum; a third group received only the traditional curriculum; and a fourth group, the control group, received no drug abuse prevention education during the time of the study. All interventions were completed within a 3-week period.

Outcomes measured included knowledge of substance abuse terminology, ability to assess advertisements critically, perception of family, conflict resolution, self-efficacy in peer relationships, and behavioral intentions related to substance use/abuse prevention. Changes were measured using a questionnaire completed by students before and after the interventions, and the questionnaires were analyzed to identify any differences based on gender, race, grades (self-reported), and teacher.

As hypothesized, groups receiving drug education scored higher on all measures except self-esteem in the posttest than did the control group. On two of the seven measures, the group receiving the video series and the traditional curriculum scored significantly higher than the other groups. Thus evaluators were able to demonstrate that instructional television (particularly when used in conjunction with print materials and teacher guidance) can be an effective tool for delivering drug education in the classroom.

The type of data collection method selected is based on how best to answer the evaluation questions, access to the target audience, and resources. (See the case studies included in this bulletin for ideas.) Some of the decisions made in designing an outcome evaluation include what data will be collected, from whom, how, and when. These data help answer the program's evaluation questions.

Remember that the more complex the evaluation design, the more expert assistance may be needed to design and conduct the evaluation. If there is no evaluator on staff, seek help to decide what type of outcome evaluation will best serve your program. Sources of evaluation consultation and hands-on assistance include university faculty, graduate students (for data collection and analysis), local marketers and other businesses (staff, computer time), State and local health and social service agencies, and organizations with experience in evaluation.

What Data

The data collected and the measures used to determine whether a change has occurred should directly relate to the evaluation questions. Specific questionnaires or other data collection instruments should be compared with the evaluation questions (Step 2) and outcome objectives (Step 1). For example, if the program is intended to change knowledge, all of the measures should be knowledge related.

If collecting baseline data is beyond existing capability, check for available data. Has someone already collected this information? If it was not collected for the program's geographic area, is it reasonable to use it as an approximation of the baseline measures?

"Positive results from an outcome evaluation, showing more favorable results from participants than similar nonparticipants, offer hard, objective evidence that a social program truly makes a difference and thus is a positive investment of human capital."
- U.S. General Accounting Office, 1992

Ensure the validity of the data collection instruments used. Valid instruments measure what they are supposed to measure. Sources for help in developing a questionnaire include examples used by other programs, evaluation reports from NCADI (see Resources), evaluation design manuals, and experienced staff from other prevention programs or university graduate programs. Such sources can provide guidance on how to develop data collection measures and instruments that are reliable and valid.
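One common quantitative check of an instrument's internal consistency (a facet of the reliability just mentioned) is Cronbach's alpha. The bulletin does not name a specific statistic, so treat this sketch, with invented pilot-test answers, as just one possible approach.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one inner list of scores per questionnaire item."""
    k = len(item_scores)
    item_var = sum(pvariance(item) for item in item_scores)
    totals = [sum(person) for person in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Five knowledge items answered by six pilot-test respondents (0 = wrong, 1 = right).
pilot = [
    [1, 0, 1, 1, 0, 1],
    [1, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0, 1],
    [1, 1, 1, 0, 0, 1],
    [1, 0, 1, 1, 0, 0],
]
print(f"alpha = {cronbach_alpha(pilot):.2f}")  # values around 0.7+ suggest consistency
```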
From Whom

Think about from whom information should be gathered. Different data collection instruments and methods may be needed for each group. For example, if one or more intervention groups and a control group are being followed, each may need to be asked the same questions in a different way in order to get at a common construct or concept. Or, if more than one culture is involved, different methods may be needed to capture adequately the experiences of each culture.

How

Data can be collected in many ways: for example, through qualitative methods such as focus groups, open-ended questionnaires, personal interviews, and observation, or through quantitative methods such as structured surveys and questionnaires.

Deciding how data will be collected is another resource question. Does the program have access to skilled interviewers? Consider how comfortable the participants will be with the planned means of collecting data. Will they be willing and able to fill out the forms? Will they be willing to provide personal information to interviewers? The answers may vary according to age, culture, levels of literacy, and setting, so pilot testing data collection instruments is vital.

When: The Outcome Evaluation Plan

The decisions made about what data to collect, from whom, and how are all connected. Completing the Worksheet: Outcome Evaluation Plan will help program planners think through their evaluation plan. Developing a timeline and task schedule is crucial: the collection of preprogram data must be completed before the prevention communication program starts. The more complex the evaluation design, the more important the timeline and task schedule become. When program planners complete the worksheet, it provides a written guide to the outcome evaluation plans. The plan also can help identify where there is a need for evaluation help.

The design of process evaluation is not the focus of this bulletin, but process measures are important for any program conducting outcome evaluation. Information from process measures will help identify parts of the program that worked, parts that did not work, and the reasons why the outcome was or was not positive.

4. Gather Data

Think through the logistical issues involved to make sure that all of the information (data) needed is actually gathered from the participants. It is important to ensure that data gatherers are well trained so that they do their job thoroughly. Uniform data collection strategies should be employed throughout the process.

5. Analyze Data and Write the Evaluation Report

Once information is gathered, it should be analyzed according to a carefully thought out approach (an analysis plan). "Analysis" means looking methodically at the information and determining what answers it provides to the evaluation questions. It is also important to write down any reasons why the program did or did not work as expected, and what should be changed or evaluated differently in the future.
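A minimal sketch of what "looking methodically" can mean in practice: match each piece of collected data back to the evaluation question it answers, and summarize it question by question. The question wording and scores below are hypothetical.

```python
from statistics import mean

# Invented questionnaire scores, keyed by the evaluation question each
# data set answers (echoing the questions written in Step 2).
collected = {
    "What did participants know at baseline?": [2, 1, 5, 3],
    "What did participants know after the program?": [5, 4, 5, 5],
    "What did nonparticipants know during this time?": [2, 1, 3, 5],
}

# Summarize each question in turn, then carry the answers into the report.
for question, scores in collected.items():
    print(f"{question} mean score {mean(scores):.1f} (n={len(scores)})")

# Alongside the numbers, the report should record why the program did or did
# not work as expected and what to change or evaluate differently next time.
```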
If this process is new to program planners, they may need the help of an evaluation specialist to guide them through data collection and analysis.

Once analysis is complete, think about how the evaluation report will be used before writing it. And as with all communication materials, it is important to consider who the readers (target audiences) will be. A report prepared for journal publication may look very different from one shared with the community. Consider:

- Preparing different versions for different readers and purposes
- Using visuals to highlight important findings
- Incorporating information from other studies, programs, or sources to add weight to findings (such as comparable national data or other studies that came to similar conclusions).

6. Use the Findings

An evaluation report that resides in the office bookcase is not supporting the program. The distribution plan might include:

- Project staff (for discussion about implications for the program)
- Managers and other influential people in the agency or organization
- Funding sources (outcome data may be crucial to further funding)
- Other citizens groups, agencies, and businesses that were involved in the program
- Others that program planners might want to work with in the future
- Local or State legislators
- Prevention specialists and similar programs in other locations
- NCADI (see Resources).

Cultural Competence in Evaluation

Planning for prevention communication programs requires forming a set of assumptions about what should happen to contribute most effectively to substance use prevention, who should be involved or affected, and what results may be expected. These assumptions must take into consideration the norms, values, and expectations of the community and its cultural/ethnic groups. Among these different groups, for example, norms concerning who is in the best position to influence youth and to lead the community may vary.

Planning for evaluation must also take cultural considerations into account, particularly in selecting methods for gathering information and in interpreting the results. Consider, for example, that:

- In some cultures, speaking out in a group (such as a focus group) or revealing negative feelings about an activity is not appropriate. Thus observing nonverbal cues may be more revealing than oral communications.
- Some groups distrust or fear providing information to a person from a different culture. Thus the ethnicity of the data gatherer may inadvertently influence the information provided.
- The cultural outlook of the evaluator may inadvertently affect the objectivity of evaluation reports on program activities. Thus finding a member of the community to conduct the evaluation, or making sure to involve a culturally competent evaluator, may be important.
- Programs that cut across cultures and adapt their evaluation methods to fit different groups may find it difficult to compare results across groups. Thus these types of evaluations are more complicated and usually require more evaluation expertise.
- Some groups may lack familiarity with how to fill out a printed questionnaire, or they may face difficulties due to limited English skills. The Asian Drug and Alcohol Prevention Team (Utah) found it difficult to use pre- and posttest instruments with an Asian student population living in a high-risk environment: "In most cases, the problem was limited English; in other cases, participants simply were not familiar with the testing procedures or the concepts... we were attempting to gauge." The team decided instead to measure outcomes in ways that included school attendance and progress, change in family/individual behavior, and nonverbal, self-reported measures.
- If evaluation materials are translated into another language, have them translated back to the original prior to use. This will ensure that the original meaning has not been lost.

Case Study 3

La Esperanza del Valle (Yakima Valley, WA) is a multimedia prevention soap opera campaign targeted to Latino adolescents and families at risk that was developed by Novela Health Education and the University of Washington Health Education and Training Center. The video component was shown to students in the classroom. To evaluate the efficacy of the video in changing attitudes and behaviors related to alcohol use, students completed a pretest 1 to 2 weeks before the viewing and a posttest form within 1 week after seeing the video. For students who understood enough spoken English to understand the video but did not have sufficient skills to read the questionnaire in English, the questions were read in English and repeated in Spanish.

The evaluation identified some positive changes in youth attitudes regarding alcohol from pre- to post-viewing, including significant changes in intentions to drink. The majority (74 percent) of responding students indicated that if they used alcohol they would probably or definitely change their drinking behavior based on what was shown in the video.