Engagement in Political Process

This tool represents “respondent’s reports of the frequency with which he/she participates in activities intrinsic to the political process, including talking with others about politics” (p. 12, http://www.nationalservice.gov/pdf/national_evaluation_youthcorps_technicalappendices.pdf).

The tool measures reported behaviors in the political process, not community action. 

This tool has been adapted from the National Evaluation of Youth Corps: Findings at Follow-up, a national study of Youth Corps programs receiving federal funding from the Corporation for National and Community Service. The study was conducted as a randomized controlled trial. The items in this tool were part of the survey used in that study, which collected information from applicants to the program (2006-07) before random assignment and again 18 months after random assignment. The Youth Corps members on whom this tool was tested were both male and female; were primarily 18-25 years old; typically had a high school diploma or less; had generally not served in the military; were mostly single; typically had worked for pay in the last 12 months; and tended to have lived in their community for five or more years. The study tested for differences among subpopulations but found none. The National Evaluation did not find any statistically significant impacts on education, employment, or civic engagement, but the survey constructs created to measure changes are considered valid. See http://www.nationalservice.gov/pdf/nat_eval_youthcorps_impactreport.pdf


Number of Questions
Creator(s) of Tool
Price, C., Williams, J., Simpson, L., Jastrzab, J., and Markovitz, C. (2011). National Evaluation of Youth Corps: Findings at Follow-up. Prepared for the Corporation for National and Community Service. Cambridge, MA: Abt Associates Inc.
Scoring / Benchmarking
These questions would be administered as a pre-test (before the person participates in the program) and as a post-test (follow-up after participating in the program).

Higher scores indicate higher levels of engagement in the political process. Score each item as indicated below:

Question 1a and 1b:
Strongly Agree = 5
Agree = 4
Neither Agree nor Disagree = 3
Disagree = 2
Strongly Disagree = 1

Question 2:
A great deal = 3
Somewhat = 2
Not at all = 1

Question 3:
Does not receive a score; you would expect the number of days to increase from pre-test to post-test.

Question 4a and 4b:
Always = 5
Often = 4
Sometimes = 3
Rarely = 2
Never = 1

Question 5:
Very Likely = 5
Somewhat Likely = 4
Not Sure How Likely = 3
Not Too Likely = 2
Not Likely at All = 1

Program administrators would score each item (except Question 3) and calculate a total score. The scores can be used in two ways. First, administrators could track how the overall score improves from the first time the individual filled out the assessment to the most recent assessment. Second, they could examine improvements in the scores of particular items over the same period.
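As a concrete illustration of the scoring procedure above, the sketch below maps responses to scores and computes pre-test and post-test totals. This is a hypothetical implementation, not part of the original tool; the question labels, response dictionaries, and sample responses are assumptions based on the rubric.

```python
# Response-to-score maps following the rubric above (hypothetical sketch).
LIKERT_5 = {"Strongly Agree": 5, "Agree": 4, "Neither Agree nor Disagree": 3,
            "Disagree": 2, "Strongly Disagree": 1}
DEGREE_3 = {"A great deal": 3, "Somewhat": 2, "Not at all": 1}
FREQUENCY_5 = {"Always": 5, "Often": 4, "Sometimes": 3, "Rarely": 2, "Never": 1}
LIKELIHOOD_5 = {"Very Likely": 5, "Somewhat Likely": 4, "Not Sure How Likely": 3,
                "Not Too Likely": 2, "Not Likely at All": 1}

# Question 3 (number of days) is excluded from the total per the rubric.
SCALES = {"1a": LIKERT_5, "1b": LIKERT_5, "2": DEGREE_3,
          "4a": FREQUENCY_5, "4b": FREQUENCY_5, "5": LIKELIHOOD_5}

def total_score(responses):
    """Sum the scored items, skipping any item (e.g. Question 3) without a scale."""
    return sum(SCALES[q][answer] for q, answer in responses.items() if q in SCALES)

# Example pre-test and post-test responses for one participant (invented data).
pre = {"1a": "Disagree", "1b": "Neither Agree nor Disagree", "2": "Not at all",
       "4a": "Rarely", "4b": "Sometimes", "5": "Not Too Likely"}
post = {"1a": "Agree", "1b": "Agree", "2": "Somewhat",
        "4a": "Sometimes", "4b": "Often", "5": "Somewhat Likely"}

change = total_score(post) - total_score(pre)  # positive = improvement
```

Item-level change (the second use of the scores described above) could be computed the same way, by differencing each item's score rather than the totals.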

Improving Service Delivery Tip!
The second approach provides three avenues for improving service delivery. First, it shows where the program seems to be making the most difference for individuals. Second, it allows program administrators to see whether participants ever indicate becoming “less likely” to engage in a particular activity after participating in the program. Third, examining pre-test scores can indicate whether the targeted children/youth are enrolling in the program: children/youth must score at a pre-test level that leaves room for improvement. If children/youth score high at pre-test, they might not benefit from the program activities. This could trigger a review of recruitment and enrollment strategies, or a review of what the program hopes to accomplish.
Background / Quality
Cronbach’s alpha: 0.77
Is there a cost associated with this tool?