CAEP Annual Indicators
CAEP has four revised annual reporting measures, which provide the public with data on program impact, program outcomes, and consumer information.
Measure One: Completer Impact and Effectiveness (Initial Programs)
Measure One evaluates, through observations and through surveys of P-12 students and administrators, how effectively program completers apply the professional knowledge, skills, and dispositions acquired in their preparation programs in their P-12 classrooms. Generally, the EPP sends all completers an annual survey, grounded in the InTASC standards, that prompts a self-assessment of their effectiveness in the P-12 classroom. The EPP also surveys principals regarding completer performance and their P-12 learners' growth, and some EPP programs observe completers in the P-12 classroom. Unfortunately, the EPP was not permitted in-person access to teachers' classrooms, and guests were not allowed in virtual classrooms, limiting data collection for the 2020-21 academic year. Therefore, based upon feedback from district partners regarding teacher capacity constraints due to the pandemic, the EPP implemented a revised strategy in Spring 2021.
Teachers instead completed a self-reflection table on progress toward their goals for effectiveness. While some teachers reported not being able to meet their goals for effective teaching, others responded that they exceeded their goals in 2020-21. The EPP found this new tool useful and will continue to implement it, alongside an expansion of existing observations and surveys, while exploring its reliability and validity as a measure to be used in concert with additional indicators of completer effectiveness.
Also under Measure One, this item evaluates how program completers contribute to student-learning growth. The Commonwealth of Pennsylvania uses a Student Learning Objectives (SLO) process to assess and document educator effectiveness based on student achievement of content standards. The SLO is part of Pennsylvania's multiple-measure, comprehensive system of Educator Effectiveness authorized by Act 82 (HB 1901), and it serves as one source of evidence of completers' impact on student learning and development. The College of Education contacted completers who are classroom teachers to ask what evidence they have collected that speaks to student learning in their P-12 classrooms. This PDF includes samples of SLOs provided by program completers who are classroom teachers at various P-12 levels.
However, because neither the Pennsylvania Department of Education nor the Commonwealth of Pennsylvania currently shares the SLO data as explained above, the EPP created a Penn State-approved project, “Do College of Education Teacher Preparation Program Completers Contribute to P-12 Student Learning and Development through Effective Teaching Practices,” as a continuation of the data collection efforts surrounding Standard 4.1. Emails were sent to 243 completers identified as having fewer than five years of classroom experience, in order to control for district effect overtaking EPP effect. To date, only four data sets have been submitted as SLO evidence. The EPP has reviewed the collected data and finds that the SLOs align with other previously collected measures, indicating that program completers do contribute to an expected level of student-learning growth.
However, the small number of participants, even after incentives were offered, was disappointing and significantly limits reliability, validity, and actionability. Planning has begun on the logistics of setting up direct observations of classroom teachers who have been identified by the state, in order to supplement and validate the data previously collected.
A data collection tool that continues to provide robust evidence that our program completers who are classroom teachers systematically measure student-learning growth is our annual master completer survey. This tool surveys all College of Education completers from all programs and asks questions relevant to the program completed and employment status. For the Summer 2018 survey, 186 classroom teachers from Pennsylvania, 21 other states, and 5 non-US locations responded to "What indicators do you use that provide evidence to guide your practice that your students are making academic progress?" Over 95% reported using progress monitoring (e.g., classwork, homework, student artifacts) as one indicator, 60% indicated state standards, and 30% indicated the standards of their discipline; individuals also noted daily observations, Vermont Teaching Strategies Gold, constant assessment (formal and informal), PA SAS, project-based learning projects, agricultural task sheets, pass rates on required state exams, and online ESGI software.
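Because respondents may select multiple indicators, the percentages above sum to more than 100%; each option is tallied as the share of respondents who selected it. A minimal sketch of this tallying, using hypothetical response data rather than the EPP's actual survey records:

```python
from collections import Counter

# Hypothetical multi-select responses; each set holds the indicators
# one respondent selected. These are illustrative, not EPP data.
responses = [
    {"Progress Monitoring", "state standards"},
    {"Progress Monitoring"},
    {"Progress Monitoring", "standards of my discipline"},
    {"state standards"},
]

# Count how many respondents selected each option.
counts = Counter(option for r in responses for option in r)
n = len(responses)

# Report each option as a share of respondents (shares can exceed 100% in total).
for option, c in counts.most_common():
    print(f"{option}: {c / n:.0%} of {n} respondents")
```

The same tally logic applies regardless of how many options the survey offers, since each respondent contributes at most once per option.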
However, with falling survey responses in AY 2019-20, and because of direct feedback from superintendents that teachers were at capacity delivering instruction during the pandemic, the EPP repeated in Spring 2021 a personal email technique used successfully in Spring 2020. The email prompted teachers to complete a table giving their personal perception of what percentage of students exceeded, met, and did not meet district growth goals for 2019, 2020, and 2021, plus any additional thoughts, stories, frustrations, or triumphs they wanted to share. Beyond the percentage data, the teachers provided rich accounts of their experiences with their P-12 students during the pandemic, offering a glimpse into their classrooms, actual and virtual, and sharing a wide variety of frustrations as well as a few successes. Fourteen teachers responded, with most reporting lower student growth in 2020-21 than in the previous two years. This response was well in line with what the EPP had been hearing from the field.
A recently crowd-sourced database of completers will serve as a communication tool for contacting completers. As districts once again open, the EPP plans to again survey district administration and to pilot case studies in districts with large numbers of completers. Due to the COVID-19 pandemic, the EPP recognizes an increased need to provide induction programming. Such programming will strengthen connections between the EPP and its completers and provide additional insight into candidates' practices and the resulting P-12 student growth. EPP faculty plan to conduct additional research within this context.
Finally, the College requested and received the latest copy of the PVAAS report from the Pennsylvania Department of Education. The report shows limited value-added data for Pennsylvania educator preparation programs. The most recent report available is the PVAAS report for 2018-2019.
Measure Two: Employers and Stakeholders (Initial and Advanced Programs)
Regarding employer perception, until recently data on where completers took positions were not available. The EPP employed numerous strategies to reach completers, such as leveraging data from the alumni association and searching social media. PDE recently began providing current PA placement data for completers. These data will be of some use for following individuals into the field; however, many of our completers do not remain in PA, and the data provided by PDE do not include contact information. Nonetheless, they are a place to start.
In preparation for the ability to track completers into positions in Pennsylvania, the EPP identified concentrations of recent completers (since 2012) who took positions in varied districts. We intentionally selected schools for this pilot that varied in district size, setting (suburban, rural, and urban), and grade level taught. Principal emails were found through district websites. Principals (n=17), all with more than one EPP completer in their school, were selected. Each was sent an email with a 10-item survey grounded in InTASC standards and parallel to EPP-developed student surveys (to provide mechanisms supporting validity), with the rating scale: not at all, limited, adequate, and proficient. Two principals responded, in reference to completers in the following areas: P-4; 2 Reading Specialists; 2 Spanish (P-12); 1 Biology (7-12); 1 Cooperative Education (7-12); 2 Social Studies (7-12); 1 Health and Physical Education (P-12); 2 English (7-12); 1 Special Education (P-8); and 1 Special Education (P-12). These two principals rated completers as 'proficient' on every standard except one: the elementary principal rated the PK-4 completers as 'adequate' in Penn State teachers' use of multiple methods of assessment for progress monitoring and decision making. As a small pilot with a very low response rate, these data have significant limitations and provide limited evidence. The EPP, however, plans to conduct a stronger principal survey in the latter half of 2019, focusing on principals who completed certification training at the EPP.
Faculty colleagues at the EPP have recently received data from PDE on the certification and placement of completers working in Pennsylvania. Thus, in the future, we will be able to track the placement and retention of all Penn State educators by certificate area (e.g., elementary, middle school mathematics, high school English Language Arts) and school level (elementary, middle, and high school) within the Commonwealth, in accord with component 4.3. Further, PDE will be releasing all principal and superintendent emails, which will greatly facilitate effective sampling. Unfortunately, there were too few data from the pilot to address the psychometric properties of this scale. Therefore, we will contact principals in advance of the survey to make them aware of its intent, which should increase the response rate and support adequate sampling. The alignment with InTASC standards and the ability to correlate with completer surveys hold promise that the data from the survey will meet standards for reliability and validity.
Advanced Programs: Although the EPP faces some of the same challenges in tracking completers of advanced preparation programs into the field, there are some data for students completing Principal Certification programs. This program has begun surveying program completers to provide data for its SPA reporting, and by examining the question, "Did you change jobs after receiving your principal certification from Penn State?", the EPP can make judgments about employer satisfaction based on the percentage of completers who were able to change positions. Of the 75 respondents who completed the program in 2016-2019, 17 of the 18 who changed jobs did so into an administrative or supervisory role. Of the 58 completers who did not change positions, 15 were already in administrative or supervisory positions.
Because of the limited nature of the collected data, we have developed a plan to collect additional data to demonstrate employer satisfaction. We modeled our Employer Satisfaction Survey on the Completer Survey administered and validated by institutions in the State of Mississippi. The instrument demonstrates sound psychometric properties as administered to completers of their advanced programs. Aligned to CAEP key competencies and our Advanced Completer Survey, data from administration of the Employer Satisfaction Survey will inform advanced programs' continued improvement. As part of the 4.1 plan, we demonstrate how we will assure that the instrument meets standards for CAEP EPP-Created Assessments, provides data about our key competencies for advanced completers from all programs, and allows for disaggregated data and assessment of program-specific objectives. Additional Employer Satisfaction pilot data will be collected into Spring 2022 for completers from each advanced program.
The 4.1 plan includes a human subjects-approved research study to further refine and test a measure that captures employer satisfaction. Annual administrations will occur every fall beginning in the Fall 2022 semester, and rounds of disaggregated data will be tracked and reported. A pilot of the survey was sent to two professional stakeholders, one for principal completers and one for superintendent completers. These initial data show 100% Strongly Satisfied or Satisfied responses on a rating scale of Strongly Satisfied, Satisfied, Dissatisfied, and Strongly Dissatisfied. Feedback on areas for improvement for principals included developing, implementing, and measuring long-term strategic goals and measures, and leading system-wide change processes, including curriculum development. Feedback for superintendent development emphasized strong knowledge of personnel management, finance, contract negotiations, community relations, and personnel evaluations. The EPP will review course curricula for alignment with these feedback areas and discuss ways of strengthening instruction and assessment of these items.
EPP stakeholders play critical roles in program evaluation and continuous improvement, and the EPP has implemented several mechanisms to engage them. The QAS conceptual framework demonstrates, in part, how data are generated to share with stakeholders and how stakeholder feedback is solicited to support and inform program revision. EPP stakeholders examine data within and across programs, and the AAC often facilitates how data are accessed, shared, and examined among internal and external stakeholders. Often, data from various stakeholders coalesce to identify themes or areas ripe for data-based evaluation. External stakeholders, such as personnel in our partner districts, provide valuable data for program evaluation; one mechanism for these stakeholders to share data is through employer satisfaction surveys and mentor surveys. Data from these surveys can be used to co-construct program requirements and field experiences.
While many institutions target a single day for data and reflection to meet CAEP Standards, the EPP employs a Data Roundtable series. Each session in the series targets a specific topic or issue. The EPP seeks feedback from district faculty and staff; EPP faculty, staff, and students; alumni completers; and alumni who serve in district leadership roles. These valuable sessions provide opportunities to share data with stakeholders, interrogate the data together, and listen to stakeholder feedback and suggestions for programs based upon these data. These rich discussions, along with feedback received from other internal stakeholders, including PCCC, led the EPP to further examine clinical partnerships.
PCCC serves as an advisory board that helps guide educator preparation programs across campuses and makes recommendations to College and University leadership regarding preparation programs. PCCC also provides space for representatives from all programs to examine and discuss data together. These stakeholder discussions often create opportunities to construct additional synergies among programs; recent discussion, for example, centered on the nature of partnerships within districts and how these evolve. Over the last year, given the COVID pandemic, PCCC provided valued opportunities for programs to solve challenges that affected our candidates and our district partners. Questions arose about how to navigate and modify placements while providing rich experiences grounded in standards, how to coordinate testing within state, university, and district requirements, how to request and honor exceptions to in-person internships, and how to effectively communicate to stakeholders how placements would be executed and how program requirements were being met, given the changing landscape of state and district expectations. One benefit of the pandemic has been increased recognition of the similarities among programs and of the collegiality of EPP faculty and staff as they faced unique, unexpected challenges.
Another example of how data collected from stakeholders result in continuous program improvement draws on feedback provided by superintendents and alumni through the Data Roundtable series, data from recruitment and retention, Penn State admissions data, candidates' communications with advising, alumni feedback, QAS benchmarking data, and institutional data. For example, in response to NC SARA, it was known that many EPP candidates are out-of-state students and that many intend to return to their home states or move to other states outside Pennsylvania. To address this challenge, all candidates admitted to Penn State advanced programs receive a letter informing them of their home state's requirements for certification and whether the EPP's candidate preparation will be accepted in their home state for licensure. Additionally, LionPath shares this information with the candidate, and lead faculty know how to access and share this information from the Penn State website of interactive maps.
An additional example of intensive data examination for program review and potential revision was the requirement that all programs undergo state review in 2021-2022. The response to PDE, a primary EPP stakeholder, was initiated with program faculty and PCCC and was informed by SPA and non-SPA program data, data derived through the QAS systematic use of data, and data from other stakeholders, including mentors, principals, and superintendents. Discussion of the PDE major review data was held at PCCC, and two retreats were held at which program stakeholders reviewed and discussed essential data for their program reviews. The Penn State Office of Planning and Institutional Research (OPAIR) also served as an important additional stakeholder in major program review.
Through numerous data sources, including data identified through the QAS systematic use of data and the AAC data requests, the EPP programs collect, examine, interrogate, and discuss data. The EPP relies on a variety of stakeholders and partners to use these data to inform program review and continuous improvement.
Measure Three: Candidate Competency at Program Completion (Initial and Advanced Programs)
All students completing a teacher preparation program at Penn State must meet program course requirements by passing each course with a grade of C or higher, maintain a 3.0 overall GPA in the program, and pass a PDE-required content exam. The following table presents five years of completers with Praxis or Pearson testing pass rates and average pass scores. For programs with fewer than 5 students per year, the completer data have been aggregated into one or more blocks to preserve student privacy.
Program Completers with GPA and Certification Exam Pass Rates
AY 2016-17 through 2020-21
[Table garbled in extraction. Recoverable fragments: Soc St 4-8 data for 2016-17 through 2020-21; Earth & Space 7-12 data aggregated into 2016-2019 and 2019-2021 blocks; two pairs of PECT Module 1 and Module 2 average score rows (values ranging from 231 to 253); and a row for Praxis 811 or NOCTI 5961.]
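The small-cell aggregation rule described above, combining reporting years when a program has fewer than 5 completers per year, can be sketched as a simple suppression routine. The threshold of 5 matches the text; the function name and record layout are illustrative assumptions, not the EPP's actual procedure.

```python
# Illustrative sketch of small-cell aggregation for completer reporting.
# Consecutive years are merged until each block reaches the privacy
# threshold; the threshold of 5 follows the rule stated in the text.

def aggregate_small_cells(counts_by_year, threshold=5):
    """Merge consecutive (year, completer_count) entries into blocks of
    at least `threshold` completers, so no small cohort is reported alone."""
    blocks = []
    current_years, current_n = [], 0
    for year, n in counts_by_year:
        current_years.append(year)
        current_n += n
        if current_n >= threshold:
            blocks.append((f"{current_years[0]}-{current_years[-1]}", current_n))
            current_years, current_n = [], 0
    if current_years:
        # Fold a trailing under-threshold block into the previous one.
        if blocks:
            prev_label, prev_n = blocks.pop()
            label = f"{prev_label.split('-')[0]}-{current_years[-1]}"
            blocks.append((label, prev_n + current_n))
        else:
            blocks.append((f"{current_years[0]}-{current_years[-1]}", current_n))
    return blocks

# Example: a program with 3, 2, 4, and 6 completers across four years
# yields two reportable blocks instead of four small cells.
print(aggregate_small_cells([("2016", 3), ("2017", 2), ("2018", 4), ("2019", 6)]))
```

This mirrors how a program such as Earth & Space 7-12 ends up reported in multi-year blocks (e.g., 2016-2019 and 2019-2021) rather than year by year.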
Measure Four: Ability of Completers to be Hired (Initial and Advanced Programs)
One of the program outcome measures that the College of Education is required to report annually to CAEP is the ability of our program completers to be hired into positions for which they have been prepared. The Pennsylvania Department of Education (PDE) provides a list of Penn State students who have been hired in Pennsylvania and the school districts that hired them. However, because a significant number of completers seek teaching positions in other states, the College has been working on ways to maintain connections with those students through emails and surveys, so as to include their data in our self-assessments.
The following tables report PDE data on the number of Penn State completers hired into Pennsylvania school districts over five calendar years, at the initial and the advanced program levels. Please note that 2022 initial completers were new to the job search process, and any recent hires would not yet be loaded into the PDE system.
|Certification Year||Total Certified Initial||Teaching in PA|
|Certification Year||Total Certified Advanced||Working in PA|
An example of less formal data collection occurred in 2017, a year after graduation of the first cohort of the new dual-degree program combining the CEAED PK-Grade 4 bachelor's degree with a one-year Special Education PK-Grade 8 master's degree. Seven completers were emailed and asked for feedback on the following questions: Are you working in an elementary school? Are you in a general or special education classroom? Do you believe that your master's degree helped you obtain your position or made you better qualified?
Six students (85.7%) responded, and all six reported that they were working in an elementary school. Three were working in general education and three in special education. All six reported that their second degree helped them obtain their position over other candidates. Additional comments described satisfaction with their program, such as "I feel much more comfortable as a teacher," "I think staying and getting my Master's was the best decision I could have made," and "I'm thankful to have had such a great education to prepare me!"
In 2021, additional completers were contacted for feedback, especially about obtaining a position and teaching during the first year of COVID. Eight responses (30.8%) indicated that having the special education master's "made me WAY more prepared and WAY more marketable," and "In that short year, I learned so many strategies and techniques that I use in my classroom today." Finally, "The pandemic did add extra stress to my job search but I feel as though having my master's was what put me above the rest of the candidates."