Presentation of the American Evaluation Association Internal Scan Findings

Author: Caitlin Oliver

1 Presentation of the American Evaluation Association Internal Scan Findings
The American Evaluation Association undertook an internal scan to learn more about its membership during the period September 2007 to January 2008. Methods included an online survey of the full membership, follow-up interviews, and online Question and Answer (Q&A) groups.

Currently available reports from the scan include:
· American Evaluation Association Internal Scan Report to the Membership
· Index of Quantitative Analysis of the 2007 AEA Member Survey
· Index of Qualitative Analysis of the 2007 AEA Member Survey
· Index of Qualitative Analysis of Interviews from the AEA Internal Scan
· Index of Qualitative Analysis of Online Q&A Groups from the AEA Internal Scan
· Presentation of the American Evaluation Association Internal Scan Findings

Currently available instruments from the scan include:
· AEA Member Survey 2007
· AEA Member Interviewer Guidelines and Interview Protocol
· AEA Online Question and Answer (Q&A) Groups Protocol

All reports and instruments listed above may be found online on the AEA website. This document contains the PowerPoint Presentation of the American Evaluation Association Internal Scan Findings.

2 Presentation of the American Evaluation Association Internal Scan Findings
Presented to the AEA Board by Goodman Research Group, Inc., February 29/March

3 Three member inputs: online survey, interviews, online discussion groups
Online survey – over 2,600 responses, 49% response rate; conducted a nonrespondent bias survey
Interviews – 56 in-person and phone interviews with evaluators from different professional worlds
Online discussion groups – three bulletin-board-style groups with members with varying experience in the field

4 Response by Professional Identity
            Respondents   Nonrespondents
Evaluator   49%           44%
Faculty     15%           19%
Researcher  14%           10%
Student     7%            6%

Conducted an online nonrespondent bias survey with 200 nonrespondents; 53 responded, for a response rate of 27%. Following slides show how respondents compare to nonrespondents.
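As a quick arithmetic check of the follow-up response rate quoted above (a minimal Python sketch; the variable names are illustrative, not from the survey materials):

```python
# Nonrespondent bias survey: 200 nonrespondents invited, 53 responded.
invited = 200
responded = 53

rate_pct = 100 * responded / invited  # exact value: 26.5
print(f"Follow-up response rate: {rate_pct:.1f}%")  # 26.5%, reported on the slide as 27% after rounding

# By the same logic, the main survey's 49% rate and 2,600+ responses
# imply roughly 2600 / 0.49, i.e. about 5,300 members surveyed.
```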

5 Response by Highest Degree
            Respondents   Nonrespondents
Doctorate   52%
Master’s    42%           39%
Bachelor’s  7%            8%

6 Response by Experience (years)
             Respondents   Nonrespondents
Less than 5  33%           38%
6-10         23%           20%
11-15        16%           12%
16+          27%           30%

7 Nonrespondent Satisfaction with AEA Products, Services, Benefits
10% extremely satisfied
58% very satisfied
28% somewhat satisfied
4% only a little satisfied

8 Response by Affiliation
                                    Respondents   Nonrespondents
Affiliate most strongly with AEA    45%           35%
Affiliate most strongly with other  30%           48%
No strong affiliation               25%           17%

9 Voices from the Field
Each Board member read a member response to the first survey question: “Imagine you are out to dinner with other AEA members, and each member is taking a minute or two to describe his or her evaluation-related work and/or study. It’s your turn; what would you say?”

Currently, in my semi-retirement, I am serving as an adviser for educational administration graduate students who are pursuing their doctoral degrees in a distance-education program. I also teach courses in evaluation and statistics in this program. My central interest and concern in teaching and in conducting evaluations is the need to achieve public and policy understanding of experimental and quasi-experimental design as the defensible basis for establishing cause and effect relationships and, therefore, the proper basis for determining the effectiveness of educational and social action programs.

I am currently a doctoral student in Social Work. However, I was forced into program evaluation when I had a position as a program director prior to my return to school. We needed to evaluate and defend our program to our government funders, and it was a daunting task. I am working on sharpening those skills so that I can be a better program advocate when I return to the non-profit or government research world.

I have been doing evaluation studies since I was never trained in evaluation as a student, but once I got the opportunity I was hooked. As an entering professor, I quickly had doubts about the relevance of my work to the real-world political and social problems I was concerned about, and evaluation was the answer. I have been an active evaluator since, including more than 100 state or local program evaluations and nearly 20 national multi-site evaluation research projects. I love the challenge of adapting rigorous research methods and tools to real-world problems and solutions.

I have been an internal evaluator for a small federal agency for over 15 years. Since we've had no budget and only a few staff dedicated to evaluation, I perform all sorts of evaluations and have served as a consultant for partners ranging from others in our agency to state health departments and non-profit organizations. Since evaluation is an ad hoc activity, I am typically invited in to work with a program or project. Most of my time is devoted to evaluating organizational change efforts, from simple interventions such as targeted training to larger cultural transformations. I place great emphasis on mixed-methods approaches, as I feel that the integration of numbers and stories creates the most compelling arguments and forms of evidence that speak to the great diversity of stakeholders and, equally importantly, to those affected by such changes. I love this stuff; always have.

My early evaluation work was with Ed.gov Technology Challenge Innovation Grants. After receiving NSF-funded training at the Evaluation Center at Western Michigan University and through The Evaluator’s Institute, I became involved in some private contracts evaluating distance education and early use of classroom management systems in K-12 classrooms. My benefits are provided from a half-time position as a web content manager at a university. If private evaluation keeps growing, I could go completely independent as a contractor.

In 2005 I started my own research and evaluation firm. I primarily assist government agencies and statewide non-profit agencies in building their capacity to evaluate their services and prevention efforts. I also do original research. In an effort to keep my rates "affordable" I have chosen a home-based business model. I have no employees, low overhead, and use subcontractors for additional support.

I work on four different grants, and it is my job to produce information to be provided to the grantors. I do a lot of pre- and post-surveys, use Excel and SPSS, and collect data from many different state websites. It is frustrating, as I've been doing this for less than a year and feel inept a lot of the time. I get frustrated when grant participants don't like the numbers. I spend a lot of time explaining the process. In general there is a huge learning curve, but on the upside, I am never bored!

I work in a country where evaluation is still not a common practice. At the moment, my work is related to teaching evaluation in order to contribute to making evaluation a part of organizational practice, mainly in health organizations. I am also coordinating a part of a national program evaluation project.

I have been doing program evaluation since 1992 without knowing that was what I was doing. By the time I attended the International Program for Development Evaluation Training in 2003, my practice had improved significantly. Since then, I have provided consulting services in results-based program evaluation, including participatory evaluation. My vision now is to promote program evaluation in Africa, so that more impacts from development efforts are achieved to the benefit of all the citizens in the region and to the satisfaction of our partners.

Most of the programs I evaluate these days are those within higher education institutions. Several are funded infrastructures or centers that support health research, community outreach, clinics, and small seed grants. I provide expertise on how program administrators can effectively but inexpensively gather data that will show whether or not their center programs are working. I also teach a graduate course in health program evaluation.

I am based at a university and work in two fairly distinct areas: disability services and watershed management. In addition to some fairly nuts-and-bolts types of evaluations, I also work with groups trying to bridge different agency or discipline knowledge bases to determine how to make more systems-level changes. Underlying all of this is my interest in how evaluation could better facilitate the necessary two-way interaction between academic research and the applied world.

I'm new to the formal field of evaluation, but I have been a research scientist and biology professor, so I am familiar with the process of objectively analyzing data. Now I'm working as Program Evaluator for an NIH-funded project, which supports laboratories involved in training minority students in biomedical fields. In addition to program evaluations, I have just completed an extensive report compiling quantitative data obtained from a survey administered to researchers on our campus.

Gosh, I keep on trying to focus my work, but I keep on inheriting these wonderful projects that I can't resist. I'm currently working on several evaluations that have professional development components and a couple that work directly with students. Across the board, I am looking for opportunities to fuse what seem to be polar opposites: I'm using both more experimental studies and more participatory approaches.

I am the sole person responsible for evaluation in the not-for-profit organization I work for. My responsibilities are mainly in client services, but as part of the president's office team I offer evaluation support to other departments. My role in client services is to develop templates and tools for the programs we offer, which colleagues can tailor to their needs, to develop education and information resources on evaluation, and to provide assistance to colleagues involved in evaluation projects.

10 Who are our members?
67% Female, 33% Male
86% U.S., 14% International
73% White
52% Doctorate, 42% Master’s, 7% Bachelor’s
21% New, 44% Short-term, 36% Long-term members

11 Six Stories from the Scan

12 It’s Not Easy Bein’ Green: New Members in AEA

13 Profile of New Members
Higher percentages of females, members of color, international members
Biggest percentage young and relatively inexperienced, but not all new members are new evaluators
More Master’s level
See report for details

14 Academic Background by Age and Length of Membership
Shows newer members less likely to have Doctorates than longer-term members of the same age

15 New Member Use of Current AEA Resources
Fewer have used resources; however:
Majority use AJE and NDE, the website, the Guiding Principles
Two in ten participate in TIGs
New and longer-term members find resources equally useful
New members find the website more useful

Not surprisingly, new members haven’t accessed AEA resources at the same rates as longer-term members, but when they have, they find them just as useful, and they find the website even more useful. The rates of TIG participation indicate it takes more than a year for this kind of involvement to kick in.

Current AEA resources asked about on the survey include:
· Guiding Principles for Evaluators
· Website resources
· AEA annual meeting
· PD workshops at the annual meeting
· AEA/CDC Summer Institute
· AEA listserv, EVALTALK
· Topical Interest Groups (TIGs)
· AEA electronic newsletter
· Local or regional affiliate
· American Journal of Evaluation
· New Directions for Evaluation
· Evaluation Review
· Evaluation and the Health Professions

16 New Member Desires
New members want the same resources (but are more interested in them)
Highly desirable resources:
Online archive of evaluation materials (69% highly desirable)
New training opportunities live in region (62%)
New members also desire:
Journal targeted to practitioners (48%)
Updates on public policy issues (39%)
DVD/CD-ROM of training materials (39%)
Web-based training (pre-recorded) (38%)
Professional mentoring (35%)
Hardcopy self-study texts (35%)

Potential new AEA resources asked about on the survey include:
· Online archive of evaluation materials
· New training opportunities offered live in your region
· Journal targeted to practitioners
· Updates on relevant public policy issues that affect the evaluation field
· DVD/CD-ROM of training materials
· Training via web-based delivery that is pre-recorded
· Professional mentoring
· Expanded training opportunities offered live at the Annual Meeting
· Hardcopy self-study texts
· Training via web-based delivery offered in real time
· Expanded training opportunities offered live at the Summer Institute
· Videotaped lectures/speeches
· Evaluation blog
· Training via teleconferences

17 Voices of New Members
Forming professional identity in evaluation
Guidance upon entering evaluation
Mentoring along the way
Opportunities to get involved in AEA

Forming professional identity: New members’ professional identities in evaluation are still in formation. Quotes from members: “I am a relatively new member of AEA, and have only recently thought of myself as a member of the ‘evaluation’ profession.” “I wonder when I’ll look in the mirror and call myself an evaluator?”

Guidance upon entering: We heard from new members that they would like AEA to provide some sort of guidance for new entrants to the field, for, as one member put it, “how to ‘break into’ evaluation.” Another member desired “guided customized pathways toward competency.” In addition, those who worked with new members wanted, as one member summarized, “guidance in the design of the best possible program of study for new evaluators.”

Mentoring: New members would like to be able to learn from someone more experienced. Some work in small firms with little or no mentoring opportunities.

Getting involved: New members also want and look forward to getting involved or more involved in AEA. One member put it this way: “One thing I would like to see is more opportunity for new/lurking members to get more involved. I would benefit from an email to me directly indicating what groups exist in my area or what opportunities there are available.”

18 Members of Color in AEA

19 Members of Color
Race/ethnicity data available from 89% (n=2,361) of respondents:
71% (n=1,874) White
18% (n=487) Members of color (Black, Asian, Latino, American Indian, Pacific Islander, Biracial, Multiracial)
11% (n=296) missing race/ethnicity data:
  5% (n=135) identified as International
  3% (n=84) actively chose not to respond
  2% (n=40) identified as other
  1% (n=37) skipped question
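The counts on this slide can be sanity-checked in a few lines (an illustrative Python sketch; the figures are those reported above, and the dictionary keys are just shorthand labels):

```python
# Breakdown of respondents with missing race/ethnicity data
missing = {
    "International": 135,
    "chose not to respond": 84,
    "other": 40,
    "skipped question": 37,
}
assert sum(missing.values()) == 296  # subcategories add up to the n=296 reported

# Respondents with race/ethnicity data: White + members of color
assert 1874 + 487 == 2361            # matches the 89% (n=2,361) reported
print("counts are internally consistent")
```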

20 Members of Color in U.S.
Among U.S. residents for whom race/ethnicity data are available:
80% (n=1,707) White
20% (n=428) Members of color

21 Demographic Profile of U.S. Members of Color
Younger
Less experienced
Newer members
No differences in gender, highest degree

22 Age by Race
         Members of color   White   Total
20s/30s  44%                32%     34%
40s/50s  45%                53%     51%
60s+     11%                15%     14%

23 Experience in Evaluation by Race
           Members of color   White   Total
< 5 yrs    45%                31%     34%
6-10 yrs   26%                24%     24%
11-15 yrs  11%                16%     15%
16+ yrs    18%                29%     27%

24 Length of Membership by Race
         Members of color   White   Total
< 1 yr   23%                18%     19%
1-2 yrs  29%                22%     24%
3-4 yrs  22%                21%     21%
5+ yrs   26%                39%     36%

25 Evaluation Profile of U.S. Members of Color
More likely than White members to be a student in evaluation
Less likely than White members to be conducting evaluations, teaching evaluation, training others in evaluation
One content area in which members of color are more likely to be working than White members: Indigenous peoples

26 AEA Profile of U.S. Members of Color
Members of color find AEA services/products more useful than do White members.
Members of color with more than 2 yrs experience in evaluation find publications, newsletter, website, TIGs more useful than do White members at the same experience level.
A higher percentage of members of color (with more than 2 yrs exp) have:
attended the AEA/CDC Summer Institute (24%, compared to 17% of White members)
accessed the electronic newsletter (51% vs. 41%)

27 AEA Profile of U.S. Members of Color (cont.)
Members of color express greater need for evaluation resources than do White members.
Members of color with more than 2 yrs experience in evaluation find all potential new products and services more desirable than do White members at the same experience level.
Professional mentoring ranks higher on the list of desirable resources for members of color (with more than 2 yrs exp).
No difference in strength of affiliation or number of other professional associations.

28 Bridging Academic and Practitioner Concerns: Faculty and Evaluators in AEA

29 Four Key Member Identities
49% evaluator
15% faculty
14% researcher
7% student
Each represents a unique consciousness and affiliation with regard to evaluation

30 Faculty Evaluation Profile
Most faculty conduct evaluations: 92% conduct evaluations; 90% had conducted evaluations in the last year
But they devote less time to evaluation-related work
Also, their evaluation-related work differs:
74% teach evaluation
39% work on evaluation methods
23% work on evaluation theory

31 Demographics by Professional Identity
                    Faculty   Evaluators   Researchers
Male                46%       29%          33%
Doctorate           90%       49%          60%
16+ yrs experience  41%       21%

Faculty: more gender balanced, more doctorates, older, more experienced, longer-term members

32 Differences in Professional Development Needs
                   Faculty   Evaluators   Researchers   Students
Archive            60%       66%          64%           74%
Regional training  39%       55%          44%           65%
Journal            45%       49%          38%           52%
Mentoring          23%       27%          24%

Students most interested in online archive
Evaluators, students most interested in regional training; faculty, researchers less interested
Students and evaluators more interested than researchers in practitioner journal
Students want professional mentoring

33 Differences in Strength of Affiliation
Evaluators most affiliated, researchers least affiliated
54% of evaluators affiliate most strongly with AEA
44% of students
37% of faculty
29% of researchers

34 Differences in Use of Products, Services
More of the faculty use journals, Guiding Principles (GP), newsletter, TIGs
86% of brand new faculty had read AJE, compared to 64% of new evaluators
60% of new faculty had read NDE, compared to 48% of new evaluators
Note: Only 48 new faculty among survey respondents

35 Differences in Usefulness of Products, Services
Differences in AJE, NDE, GP, annual meeting, TIGs, EVALTALK
Faculty give GP, AJE, NDE, TIGs the highest usefulness ratings
Faculty and students rate the annual meeting more useful
Students find EVALTALK most useful, faculty least useful

36 Employment by Professional Identity in Evaluation
30% of members employed in universities. Of these:
57% primary professional identity in evaluation is faculty
31% primary professional identity is evaluator

37 Employment and Professional Identity by Length of Membership
Smaller percentage of new members employed in universities
Among more experienced members:
41% of new members identify as faculty
43% of shorter-term members
52% of longer-term members

38 Faculty and Evaluator Responses to Existing and New AEA Resources
Faculty and evaluators find many current resources equally useful
Faculty find journals more useful than do evaluators
Faculty and evaluators have equal interest in a journal for practitioners

39 Some Practitioner Perceptions
Theory · Nuances · Treatises · Ivory tower

Some of the emotion expressed by practitioners regarding current AEA resources/experiences:

“Case studies are more valuable to me than theory, which seems too bogged down in jargon.”

“The annual conference is far too academically oriented to be useful for me to justify my organization paying my way and too expensive for me to justify on my own dollar. Nit-picking evaluation nuances gets old very fast when some of us are looking at some huge constraints on resources and the practicality of conducting real-time evaluations.”

“Provide basic resources for non-academic practitioners. We are busy and don't have time (however interesting) to digest differing treatises arguing for theories and practices to evaluate the number of angels dancing on pinheads.”

“[We need] journals that offer articles from people other than a small group of evaluation insiders that have a myopic perspective on evaluation beyond the ivory tower.”

40 What Some Practitioners Want to See and Discuss
Solutions · Constraints · Dilemmas · Issues · Common · Current · Practical · Real

Some of the words members use to describe what they would like from/in AEA:

“AEA needs to focus less on the academic community and theoretical models and genres and be more responsive to those who really conduct evaluations.”

“From my perspective, there seems to be a shift toward development of an academic establishment specializing in methodological issues in evaluation. Some of what comes from this is interesting and valuable but some seems to me publishing for the sake of publishing. I would be excited if the shift went in a different direction--toward strategies for making evaluation research/work still more useful as a tool for effective management and policy development.”

“The AEA journals are great, but they are so academic. I would like to see an AEA magazine to keep evaluation practitioners abreast of current issues and provide discussion of common dilemmas in an informative but non-academic way.”

41 One Hat, Two Hat … Three, Four, Five Hats!

42 Members’ Many Hats
84% do some non-evaluation work
92% do more than conduct evaluations
84% work in more than one content area
23% of U.S. members do some international work
31% of international members do U.S. work

43 Opening the Evaluation Policy Window

44 A Simple Framework: Members, AEA, Public Policy

45 AEA → Members
42% would find updates on public policy issues that affect evaluation highly desirable
53% of members of color vs. 38% of White members
Evaluators in government, public policy find updates more desirable than evaluators in other settings

46 AEA → Public Policy
Give input · Structure better conversations · Work with legislators · Take a stand · Advocate

Some of the ways members envisioned AEA’s role in public policy:

“More needs to be said about the root issue - why evaluate? What do we want evaluation to do?”

“AEA could be touting their wares more.”

“Have better conversations about what we’re doing.”

“Go further than being responsive to members - advance discipline, work with legislators.”

“… more of a willingness to take a professional stand against ridiculous federal initiatives such as No Child Left Behind (e.g., note how it clearly violates many of our profession's evaluation standards).”

“Advocacy role around what is high-quality evaluation.”

47 Bringing AEA into the Everyday: Gaining Greater Access to the Benefits of AEA

48 Highly Desired Resources
Online archive of evaluation material (65% highly desirable)
Live regional trainings (52%)

The only potential new resource (among those we asked about) that a clear majority of members find highly desirable is an online archive of evaluation materials. We don’t have a lot of information from the scan as to why members value this so highly or exactly what members have in mind when they think of an online archive. The internal scan served to establish members’ interest in these potential new resources, but development of any specific products would require additional input.

49 Challenges Related to Time
Lack of time (and money) to travel to conference, institute
Return on investment has to be worth it

When we asked members what AEA could do for them, two of the most frequently used words in response were TIME and ACCESS, which help explain the energy behind the desire for the online archive and the regional trainings.

“Don't have the time to commit to a summer institute.”

“As funding within my org becomes more limited and as my time for training becomes more limited, any opportunities to learn via a distance on my own time would be a dream come true for me!”

“With a full-time job, how do you insert these various training activities? We can plan for such things but it has to be good and deemed relevant by management with time allotted for it.”

“I have thought about undertaking the certificate course from the evaluators institute and have in fact taken some courses over the years, but I find that it is discouraging the number of courses required for the certificate and the overall costs of such. I think I could follow a degree course by the time I made such an investment in time and money for the certificate. I think the requirements are too heavy and perhaps you could offer some kind of intermediate level certificate that would be more attainable by a person who is already working full time on evaluation and can't invest all of the time requirements for the current certificate.”

50 Challenges Related to Access
Lack a comprehensive set of references
Cannot access publications, literature
Want conference materials online
Overseas members need an alternative to in-person participation

51 Desire for Local Networking
Local or regional cooperatives
Local/regional events by TIG
Partner with universities
Working groups
Opportunities to meet others in area

52 Using and Disseminating the Internal Scan Findings
Quotes from members represent their interest in learning from the internal scan:

“I’m certainly glad the association took on this survey. Let’s see what ‘use’ it gets.”

“Hopefully, the results will indicate new areas to explore for the benefit of all members.”

“It would be good to know if/when AEA members might have access to aggregate data from this survey.”

“I look forward to seeing the results and your response/reflection on the results.”

53 Thank you! Time for Q & A