Teaching American History Heartland Consortium

The teaching of American History, especially as a separate academic subject, has faced challenges and debate for well over a century. The American Historical Association, beginning with the inception of the Committee of Seven in 1899, has worked to increase understanding of the subject for more than 100 years. A recent Washington Post article noted that, beyond the difficulties of establishing American History as a separate core subject, additional barriers exist: “In many schools across the country, teachers say social studies has taken a back seat under the federal No Child Left Behind law, which stresses math and reading. Squeezing history into the curriculum can be difficult, educators say.”

The logic model provided as Appendix I to this evaluation aligns professional development activities with expected outcomes. This visual framework depicts how the program will use grant resources to raise the recognized importance of traditional American History and how a well-managed program influences student achievement.

Purpose

The purpose of this evaluation is to develop a deeper understanding and appreciation of American history as a separate core subject, and in doing so, to improve instruction in the subject and raise student achievement.

Team Composition and Participation

A consortium was formed to administer the U.S. Department of Education’s Teaching American History Grant Program (CFDA: 84.215X). The consortium consists of local educational agencies (LEAs), historic sites and museums, organizations with expertise in history education, and institutions of higher education.

In accordance with 34 CFR 75.105(b)(2)(iv), this evaluation recognizes the absolute priority of partnerships with other agencies or institutions. The evaluation meets the statutory requirement by conducting research in conjunction with the Iowa Department of Education, the University of Iowa – Department of History and College of Education, Heartland AEA 11, and the Herbert Hoover Presidential Library and Museum. Verbal commitments have been obtained from the above research partners. Detailed letters of commitment will be submitted in an appendix to the RFP application narrative.

The purpose of developing a strong evaluation plan is to determine the program’s audience and their interests, and to use that information to drive the approach and interventions used in the evaluation. The team is responsible for defining the scope of the project, collecting and analyzing data, and reporting on the findings. We intend to use measurements and analysis to reduce, to the greatest extent possible, various threats to validity. However, we consider seeing the long-term program objectives accomplished even more important than our interventions and methodology. To ensure successful long-term implementation of the program, key stakeholders will be actively involved in all important aspects of the evaluation.

Representing both internal and external evaluators, the evaluation team comprises stakeholders of the professional development program (teachers, administrators, and school board members), master teachers from the University of Iowa, and the retained services of The Gilder Lehrman Institute (also serving as Principal Investigator). LEA superintendents will provide team participants; in addition to the school principal, each school will be represented by another key administrator (e.g., a curriculum developer) and two teachers, so that each participating school has four direct representatives on the evaluation team. The lead LEA (Des Moines Public Schools) will serve as the evaluation headquarters, and its superintendent will facilitate overall management of all evaluation activities, including assignment of detailed responsibilities and supervision and direction of retained services. The superintendent of the lead LEA, serving as the project director, will specify the overall evaluation task, inform the team of available resources, and establish a deadline for completion and a reporting format.

Evaluation responsibilities are assigned in accordance with team members’ qualifications and skills. The team will develop a variety of quantitative (measurement-driven) and qualitative (narrative-based) measures to evaluate the professional development program. Prior to any instructional changes, the first data collection will establish a baseline against which program results will be measured.

The team will meet in the early stages to assess the needs of the teachers. Program interventions will be coordinated based on projected needs and on established guidelines for evaluating professional development programs. Qualitative data will be collected and analyzed through interviews with teachers, administrators, and school board members to examine current trends and attitudes regarding professional development. Research will be conducted on existing data outlining previous professional development programs in an academic environment, and, to the extent it is available, data from existing Teaching American History research will be analyzed thoroughly. The intended outcomes of the professional development program will be stated in measurable terms and are discussed in detail below.

The team will consider strategies for collecting formative and summative data during and after the professional development program. Formative evaluations will be conducted at various intervals of the professional development program. Based on feedback and comments, these evaluations will be used to modify and improve the program mid-course, allowing continuous fine-tuning to ensure quality improvement. Summative evaluations will be used to determine the overall effectiveness of the professional development program. A summative evaluation will be conducted at the conclusion of the program to measure shifts in pedagogy, changes in school and community attitudes, and the effect of the professional development program on student achievement.

In an effort to ensure evaluation results are put to good use, the team will make recommendations for the implementation of future professional development programs.

Evaluation

The evaluation will measure the success of goals through quantitative and qualitative analysis of the various interventions and their impact on teachers and students.

Goal One: To improve teachers’ knowledge of traditional American History. This goal consists of two primary objectives: (1) improve Heartland AEA 11 teachers’ knowledge in specific American History areas such as Industrialism, the Gilded Age, the Progressive Era, the Great Depression, and the roles of women in the emergence of a modern nation; and (2) improve teachers’ appreciation of and involvement in traditional American History.

The interventions to accomplish this goal are grounded in two programs. The first is a three-day staff development workshop offered by the Gilder Lehrman Institute. The workshop offers teachers access to award-winning historians providing in-depth coverage of topics selected by the project director and team. The second is a week-long Summer Seminar sponsored by the Gilder Lehrman Institute. The Summer Seminars will be attended by teachers at Midwestern universities. Teachers will receive a stipend and graduate credits for participation.

Goal Two: To improve the quality of instruction in traditional American History in LEAs, focusing on the primary objective of improving teachers’ use of more effective content pedagogy in the instruction of American History.

The interventions will include the Gilder Lehrman Institute providing teachers with educational resources in three genres: visual, print, and digital. These comprehensive resources are designed not only to increase teacher knowledge of core American history content, but also to offer exciting new ways of bringing American history into the classroom. Each teacher receives a personal copy of the materials, which are also supported by online supplemental material on the Gilder Lehrman website. Materials include traveling exhibitions, calendars, posters, books, and multimedia resources. Nine lectures will be offered during the school year by history scholars from the Gilder Lehrman Institute. The history scholars will assist the LEAs in transforming specific content into lesson plans tailored to their classrooms. This partnership will help engage both teachers’ and students’ interest and increase the overall quality of instruction.

Goal Three: To improve student achievement in traditional American History in LEAs, focusing on the primary objective of improving students’ ability to use critical thinking.

The interventions to accomplish this objective are those outlined under Goals One and Two: improving teachers’ knowledge of traditional American History and improving the quality of instruction in traditional American History in LEAs.

Collection and Analysis

To measure the increase in teacher knowledge and understanding of traditional American History, surveys, interviews, and pre- and post-tests built on nationally validated American history assessments will be used to establish a direct link to participation in the Teaching American History Grant Program. Pre-test and post-test results will be used in the summative evaluation of the professional development program. Surveys and interviews will be used as formative measures. Survey and interview questions will be designed by the project director, the master teachers, and the Gilder Lehrman Institute. Surveys (using a Likert scale) will be conducted prior to the summer institute, at the midpoint of instruction, and at the conclusion of instruction. Survey results will be translated into a matrix to provide quantitative data. Interviews will be conducted at the end of instruction, and the results will provide qualitative data.
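
To illustrate how the pre- and post-test results might feed the summative analysis, the sketch below pairs each teacher’s scores and tests whether the mean gain differs from zero. The file name, column names, and use of the SciPy library are assumptions of this sketch, not requirements of the plan.

    # Minimal sketch of the summative pre/post comparison of teacher knowledge.
    # Assumes a CSV with one row per teacher and hypothetical columns
    # "pre_score" and "post_score" from the nationally validated assessment.
    import pandas as pd
    from scipy import stats

    scores = pd.read_csv("teacher_assessments.csv")  # hypothetical file name

    gain = scores["post_score"] - scores["pre_score"]
    t_stat, p_value = stats.ttest_rel(scores["post_score"], scores["pre_score"])

    print(f"Mean gain: {gain.mean():.2f} points (n={len(scores)})")
    print(f"Paired t-test: t={t_stat:.2f}, p={p_value:.4f}")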

The Customer Satisfaction Survey provided by the Iowa Department of Education (IDE) will be used to measure teacher satisfaction with the program. Information will be collected on a forced Likert scale with 1 = strongly disagree, 2 = disagree, 3 = agree, and 4 = strongly agree. The teachers will be asked to respond to the statement “The Gilder Lehrman Institute provided beneficial and resourceful professional development training,” along with additional questions designed to provide responsive and informative feedback.
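
As one way of turning the forced four-point responses into quantitative form, the short sketch below tallies responses to a single item and reports the mean agreement; the response values shown are illustrative only, not program data.

    # Sketch: summarizing forced Likert responses (1=strongly disagree ... 4=strongly agree).
    from collections import Counter
    from statistics import mean

    # Hypothetical responses to the Gilder Lehrman training item.
    responses = [4, 3, 3, 4, 2, 4, 3, 3, 4, 1]

    counts = Counter(responses)
    print("Response counts:", {k: counts.get(k, 0) for k in (1, 2, 3, 4)})
    print(f"Mean agreement: {mean(responses):.2f} on a 1-4 scale")
    agree = sum(r >= 3 for r in responses) / len(responses)
    print(f"Percent agree or strongly agree: {100 * agree:.0f}%")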

To measure a shift in the quality of instruction of traditional American History, the evaluation will use surveys, observations, and focus groups. Teachers and school board members will be asked to complete an evaluation of the program and describe, in narrative form, their thoughts on the quality of instruction. Teachers and school board members will be asked to complete a follow-up evaluation six months after their participation; this six-month evaluation will be collected in survey form, similar to the format used to evaluate increased knowledge and understanding of American History. Teacher logs will be compiled from observations conducted by master teachers and school evaluation team members. Topics of discussion, deviations, and unusual events will be recorded and qualitatively analyzed by the assigned team members (internal and external). Themes will be compared to research on similar projects and used as formative evaluation to ensure continual improvement of the ongoing program. Teachers and administrators will be selected by the Principal Investigator and staff to form a focus group, providing qualitative data.

To measure the effectiveness of the above interventions on student achievement, the evaluation requires pre- and post-program student surveys and pre- and post-program standardized statewide U.S. History assessment scores; additional methodology may be used as requested by key stakeholders and as deemed beneficial by the Gilder Lehrman Institute.

The Customer Satisfaction Survey provided by the Iowa Department of Education (IDE) will also be used to measure how well student needs were met. Information will be collected on a forced Likert scale with 1 = strongly disagree, 2 = disagree, 3 = agree, and 4 = strongly agree. The students will be asked to respond to the statement “The teaching of American History was exciting and informative,” along with additional questions designed to provide responsive and informative feedback.

A Regression-Discontinuity (RD) design (a quasi-experimental design that closely approximates an experiment) is expected to be the primary design for measuring student success. The RD design allows assignment of the treatment group to those who most need or deserve it. The RD design is ethical: it does not deny the treatment to students who need it, as may be the case in a randomized experiment. One pitfall of the RD design, described by Langbein and Felbinger (2006), is the “considerable random measurement error” associated with students close to the cut-score.

In October 2008 a survey will be sent from Heartland AEA 11 to all school districts advising them of the grant opportunity and soliciting participation in the program. Based on a ratio of one rural school district, two urban school districts, and three Des Moines City schools, the AEA staff will develop a list of participants using qualitative measures and taking into account any schools identified as needing improvement. The schools above the cut-score, as determined by the Principal Investigator and AEA 11 using the measures above, will be selected as treatment groups. Schools not selected as treatment groups in year one, but scheduled at the time of assignment to receive treatment the following year, will serve as control groups until program funds are exhausted.

To support the measurement of student achievement, a cut-off score will be established and a pre-test drafted by the project director and the evaluation team. The assessment will be administered by classroom teachers to all students at all participating grade levels in all schools designated to participate in the program, whether in a treatment group or a control group.

In May of 2008 and 2009 the pre-test will be administered to all participating students. The results of these tests will be the basis for cut-off scores during the following year’s interventions. Students above the cut-off score will be assigned to the treatment groups, and students below will participate in the control group. Students in the upper 50th percentile will be selected as the treatment group and registered in a new course titled American History, while students in the lower 50th percentile will be registered in the current social studies course.
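
A minimal sketch of this percentile-based assignment is shown below, assuming hypothetical student records with pre-test scores; the cut-off is taken as the median pre-test score within each grade, consistent with the 50th-percentile split described above.

    # Sketch: assigning students to treatment or control by pre-test cut-off.
    # File and column names are hypothetical; the plan specifies a 50th-percentile split.
    import pandas as pd

    students = pd.read_csv("pretest_scores.csv")  # columns: student_id, grade, pre_score

    def assign_groups(df: pd.DataFrame) -> pd.DataFrame:
        """Split each grade at its median pre-test score (the cut-off)."""
        df = df.copy()
        df["cutoff"] = df.groupby("grade")["pre_score"].transform("median")
        df["group"] = (df["pre_score"] >= df["cutoff"]).map(
            {True: "treatment (new American History course)",
             False: "control (current social studies course)"}
        )
        return df

    assignments = assign_groups(students)
    print(assignments.groupby(["grade", "group"]).size())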

During the 2009 school year there will be two groups of subjects, based on the initial year of professional development training. The treatment groups will be the schools whose teachers are actively participating in the professional development program, and the control group will be those schools attending professional development training the following summer. The 2010 school year will provide one large treatment group. Each year the evaluation team will continue to rely on RD with a cut-off within each grade of the participating schools. Stratification at the school level will decrease to some degree as measurement at the student level increases.

Under the cut-score design, the evaluators consider the control group equivalent to the treatment group at the start of the program. The gain from pre-test to post-test should account for variables outside of the program, so the gain in the treatment group will provide evidence of the program’s effect. This process supports strong internal validity by demonstrating a causal relationship between the program and the outcome while ruling out other factors.

A discontinuity in the regression lines indicates a program effect in the RD design. The evaluation team will describe the treatment and control groups and explain how to interpret the direction of scale values on the outcome measures. William Trochim describes the program effect and the use of the RD lines as follows: “a program effect is suggested when we observe a ‘jump’ or discontinuity in the regression lines at the cutoff point” (Trochim, 2006).
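
As an illustration of how that discontinuity might be estimated, the sketch below fits a simple linear RD model in which the coefficient on the treatment indicator captures the jump at the cut-off. The file and column names, and the linear functional form, are assumptions of this sketch rather than specifications of the plan.

    # Sketch: estimating the program effect as a jump at the cut-off (simple linear RD).
    # Assumes hypothetical columns pre_score, post_score, and the grade-level cutoff.
    import pandas as pd
    import statsmodels.formula.api as smf

    data = pd.read_csv("student_scores.csv")  # hypothetical file

    data["centered"] = data["pre_score"] - data["cutoff"]   # assignment variable centered at the cut-off
    data["treated"] = (data["centered"] >= 0).astype(int)   # 1 = above cut-off (treatment group)

    # The coefficient on "treated" estimates the discontinuity (program effect) at the cut-off.
    model = smf.ols("post_score ~ centered + treated + centered:treated", data=data).fit()
    print(model.summary().tables[1])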

The evaluation team may consider between-group modeling (ANOVA) when drawing comparisons between treatment groups of various demographics (rural, urban, and suburban schools). Additional independent variables related to these demographic conditions and the data collected may also be analyzed using ANOVA/MANOVA designs.
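
If such comparisons are made, a one-way ANOVA on student gain scores grouped by school setting could look like the sketch below; the gain values shown are illustrative, not program data.

    # Sketch: one-way ANOVA comparing student gain scores across school settings.
    from scipy import stats

    rural_gains = [5.1, 6.3, 4.8, 7.0, 5.9]
    urban_gains = [4.2, 5.5, 6.1, 3.9, 5.0]
    suburban_gains = [6.8, 7.2, 5.6, 6.4, 7.5]

    f_stat, p_value = stats.f_oneway(rural_gains, urban_gains, suburban_gains)
    print(f"One-way ANOVA: F={f_stat:.2f}, p={p_value:.4f}")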

A third summer session of professional development is likely in 2011, and this evaluation is structured to meet contingencies for a continuation award of 24 months. Should funds be unavailable for program extensions, the program is structured to continue implementation by recognizing teachers who complete the program as qualified instructors. These qualified teachers in each LEA will be fully capable and certified to provide professional development instruction to their colleagues. In addition, funds from the initial grant are being reserved to continue the Gilder Lehrman partnership through 2013 (on a per-hour basis).

An overall summative evaluation, to be developed and published in the summer of 2011, will include program successes and pitfalls, lessons learned, and recommendations for continuous improvement and subsequent professional development activities. Reports will be issued at regular intervals, and a dissemination network (a program website) will be available for accessing up-to-date program information.

Budget

The approach of this program is not entirely predetermined; adjustments will be made based on formative evaluations and other factors. The budget provided as Appendix II to this plan is an estimate based on several broad research stages and the estimated time for each. Reallocation may be required in certain categories as developing themes are explored and priorities are adjusted. The outline for this budget was drafted using a sample budget format provided by the Alameda County Office of Education.