CAEP Annual Indicators
CAEP has eight annual reporting measures that are used to provide the public with data on program impact, program outcomes, and consumer information.
Measures of Program Impact (CAEP Standard 4)
This measure evaluates how program completers contribute to student-learning growth. The Commonwealth of Pennsylvania uses a Student Learning Objectives (SLO) process to assess and document educator effectiveness based on student achievement of content standards. The SLO process is part of Pennsylvania's comprehensive, multiple-measure system of Educator Effectiveness authorized by Act 82 (HB 1901), and it serves as one source of evidence of completers' impact on student learning and development. The College of Education contacted completers who are classroom teachers to ask what evidence they have collected that speaks to student learning in their P-12 classrooms. This PDF includes sample SLOs provided by program completers who are classroom teachers at various P-12 levels.
However, because neither the Pennsylvania Department of Education nor the Commonwealth of Pennsylvania currently shares the SLO data described above, the EPP created a Penn State-approved project, “Do College of Education Teacher Preparation Program Completers Contribute to P-12 Student Learning and Development through Effective Teaching Practices,” as a continuation of the data collection efforts surrounding Standard 4.1. Emails were sent to 243 completers identified as having fewer than five years of classroom experience, in order to control for district effect overtaking EPP effect. To date, only four data sets have been submitted as SLO evidence. The EPP has reviewed the collected data and finds that the SLOs align with other previously collected measures, indicating that program completers do contribute to an expected level of student-learning growth.
However, the small number of participants, even after incentives were offered, was disappointing and significantly limited the reliability, validity, and actionability of the data. Planning has begun on the logistics of setting up direct observations of classroom teachers who have been identified by the state in order to supplement and validate the data previously collected.
One data collection tool that continues to provide robust evidence that our program completers who are classroom teachers systematically measure student-learning growth is our annual master completer survey. This tool surveys all College of Education completers from all programs and asks questions relevant to the program completed and employment status. For the Summer 2018 survey, 186 classroom teachers from Pennsylvania, 21 other states, and 5 non-US locations responded to the question, "What indicators do you use that provide evidence to guide your practice that your students are making academic progress?" Over 95% reported using progress monitoring (e.g., classwork, homework, student artifacts) as one indicator, 60% indicated state standards, and 30% indicated the standards of their discipline. Individuals also noted daily observations, Vermont Teaching Strategies Gold, constant formal and informal assessment, PA SAS, project-based learning projects, agricultural task sheets, pass rates on required state exams, and online ESGI software.
However, with falling survey responses in AY 2019-20, and because of direct feedback from superintendents that teachers were at capacity delivering instruction during the pandemic, the EPP repeated in Spring 2021 a personal email technique used successfully in Spring 2020. The email prompted teachers to complete a table reporting their personal perception of what percentage of students exceeded, met, and did not meet district growth goals for 2019, 2020, and 2021, plus any additional thoughts, stories, frustrations, or triumphs they wanted to share. Beyond the percentage data, the teachers provided rich accounts of their experiences with their P-12 students during the pandemic, offering a glimpse into their classrooms, actual and virtual, and sharing a wide variety of frustrations as well as a few successes. Fourteen teachers responded, with most reporting lower student growth in 2020-21 than in the previous two years. This response was well in line with what the EPP had been hearing from the field.
A recently crowd-sourced database of completers will serve as a communication tool for contacting completers. As districts reopen, the EPP again plans to survey district administrators and to pilot case studies with districts employing large numbers of completers. Due to the COVID-19 pandemic, the EPP recognizes an increased need to provide induction programming. Such programming will strengthen connections between the EPP and its completers and provide additional insight into candidates' practices and the resulting P-12 student growth. EPP faculty plan to conduct additional research within this context.
This measure uses observations and surveys of P-12 students and administrators to evaluate how effectively program completers apply the professional knowledge, skills, and dispositions acquired in their preparation programs in their P-12 classrooms. Generally, the EPP sends all completers an annual survey, grounded in the InTASC standards, that prompts a self-assessment of their effectiveness in the P-12 classroom. The EPP also surveys principals regarding completer performance and their P-12 learners' growth, and some EPP programs conduct observations of completers in the P-12 classroom. Unfortunately, the EPP was not permitted in-person access to teachers' classrooms, and guests were not allowed in virtual classrooms, limiting data collection for the 2020-21 academic year. Therefore, based upon feedback from district partners regarding teacher capacity constraints due to the pandemic, the EPP implemented a revised strategy in Spring 2021.
As noted in Section 4 Annual Indicators, teachers instead completed a self-reflection table on progress toward their goals for effectiveness. While some teachers reported being unable to meet their goals for effective teaching, others reported exceeding their goals in 2020-21. The EPP found this new tool useful; moving forward, it will continue to be implemented alongside an expansion of existing observations and surveys, and the EPP will explore the tool's reliability and validity as a measure to be used in concert with additional indicators of completer effectiveness.
Regarding employer perception, data on where completers took positions were until recently unavailable. The EPP employed numerous strategies to reach completers, such as leveraging data from the alumni association and searching social media. PDE recently began providing current Pennsylvania placement data for completers. These data will be of some use for following individuals into the field; however, many of our completers do not remain in Pennsylvania, and the PDE data do not include contact information. Nonetheless, they are a place to start.
In preparation for the ability to track completers into positions in Pennsylvania, the EPP identified concentrations of recent completers (since 2012) who took positions in varied districts. We intentionally selected schools for this pilot that varied in district size, setting (suburban, rural, and urban), and grade level taught. Principal emails were found through district websites. Principals (n=17), all with more than one EPP completer in their school, were selected. Each was sent an email with a 10-item survey grounded in InTASC standards and parallel to EPP-developed student surveys, to provide mechanisms supporting validity, with the rating scale: not at all, limited, adequate, and proficient. Two principals responded, covering completers in PK-4 (2), Reading Specialist (2), Spanish P-12 (2), Biology 7-12 (1), Cooperative Education 7-12 (1), Social Studies 7-12 (2), Health and Physical Education P-12 (1), English 7-12 (2), Special Education P-8 (1), and Special Education P-12 (1). These two principals rated completers as 'proficient' on each standard, with one exception: the elementary principal rated the PK-4 completers as 'adequate' on Penn State teachers' use of multiple methods of assessment for progress monitoring and decision making. As a small pilot with a very low response rate, these data have significant limitations and provide limited evidence. The EPP, however, plans to conduct a stronger principal survey in the latter half of 2019, focusing on principals who have completed certification training at the EPP.
Faculty colleagues at the EPP have recently received data from PDE on the certification and placement of completer educators working in Pennsylvania. Thus, in the future, we will be able to track the placement and retention of all Penn State educators by certificate area (e.g., elementary, middle school mathematics, high school English Language Arts) and school level (elementary, middle, and high school) within the Commonwealth, in accord with component 4.3. Further, PDE will be releasing all principal and superintendent emails, which will greatly facilitate effective sampling. Unfortunately, there were too few data from the pilot to address the psychometric properties of this scale. Therefore, we will initiate prior contact with the principals we survey to make them aware of the survey's intent, which should increase the response rate and yield adequate sampling. The alignment with InTASC standards and the ability to correlate with completer surveys hold promise that the survey data will meet standards for reliability and validity.
Advanced Programs: Although the EPP faces some of the same challenges in tracking completers of advanced preparation programs into the field, some data exist for students completing Principal Certification programs. The program has begun surveying completers to provide data for its SPA reporting. Using the question, "Did you change jobs after receiving your principal certification from Penn State?", the EPP can make judgments about employer satisfaction based on the percentage of completers who were able to change positions. Of the 75 respondents who completed the program in 2016-2019, 17 of the 18 who changed jobs did so into an administrative or supervisory role. Of the 58 completers who did not change positions, 15 were already in administrative or supervisory positions.
Because of the limited nature of the collected data, we have developed a plan to collect additional data to demonstrate employer satisfaction. We modeled our Employer Satisfaction Survey on the Completer Survey administered and validated by institutions in the State of Mississippi, an instrument that demonstrates sound psychometric properties as administered to completers of their advanced programs. Aligned to CAEP key competencies and our Advanced Completer Survey, data from administration of the Employer Satisfaction Survey will inform advanced programs' continued improvement. As part of the 4.1 plan, we demonstrate how we will ensure the instrument meets standards for CAEP EPP-Created Assessments, provides data about our key competencies for advanced completers from all programs, and allows for disaggregated data and assessment of program-specific objectives. Additional Employer Satisfaction pilot data will be collected into Spring 2022 for completers from each advanced program.
The 4.1 Plan includes a human subjects-approved research study to further refine and test a measure that captures employer satisfaction. Annual administrations will occur every fall beginning in the Fall 2022 semester, and rounds of disaggregated data will be tracked and reported. A pilot of the survey was sent to two professional stakeholders, one for principal completers and one for superintendent completers. These initial data show 100% Strongly Satisfied or Satisfied responses on a rating scale of Strongly Satisfied, Satisfied, Dissatisfied, and Strongly Dissatisfied. Feedback on areas of improvement for principals included the importance of developing, implementing, and measuring long-term strategic goals and measures, and leading system-wide change processes, including curriculum development. Feedback for superintendent development emphasized knowledge of personnel management, finance, contract negotiations, community relations, and personnel evaluations. The EPP will review course curricula for alignment with these feedback areas and discuss ways of strengthening instruction and assessment of these items.
The College of Education surveys students after program completion. These surveys and their results can be found on the Completer Survey Results page. Beginning in 2017, PDE released Program Satisfaction data collected from initial-level completers within their certification applications, with additional years released in 2019-20. The following table reflects these two years of data:
|Survey Question|2016-17 Agree or Strongly Agree %|2019-20 Agree or Strongly Agree %|
|---|---|---|
|My program provider clearly communicated program and certification requirements before program enrollment|87.57%|91.98%|
|I feel adequately prepared to design and implement instruction and assessments aligned with state standards|94.67%|97.71%|
|I have sufficient knowledge to teach and work with all students effectively|94.67%|99.14%|
|My field experiences provided opportunities to work with diverse learners|88.76%|95.42%|
|I received effective feedback, coaching and mentoring that built my confidence, skills, and knowledge|89.94%|97.42%|
Responses were also collected from candidates completing an advanced-level program. Their responses are as follows:
|Survey Question|2016-17 Agree or Strongly Agree %|2019-20 Agree or Strongly Agree %|2020-21 Agree or Strongly Agree %|
|---|---|---|---|
|My program provider clearly communicated program and certification requirements before program enrollment|73.53%|98.36%|97.83%|
|I feel adequately prepared to design and implement instruction and assessments aligned with state standards|91.18%|96.72%|95.65%|
|I have sufficient knowledge to teach and work with all students effectively|91.18%|96.72%|100.00%|
|My field experiences provided opportunities to work with diverse learners|88.24%|98.36%|97.83%|
|I received effective feedback, coaching and mentoring that built my confidence, skills, and knowledge|94.12%|96.72%|100.00%|
The advanced-level PDE data provided some insight into advanced candidates' satisfaction with their programs; however, they were not sufficient to guide program improvement. In fall 2021, the EPP constructed a new survey modeled on the Completer Survey administered and validated by institutions in the State of Mississippi, an instrument that demonstrates sound psychometric properties as administered to completers of their advanced programs. Aligned to six CAEP key competencies and our Employer Satisfaction Survey, the survey was recently administered, and we are able to report pilot data findings.
Because the survey was administered in December 2021, pilot responses were limited across programs; however, completers reported that their preparation was effective. For example, 88% agreed or strongly agreed with the overall effectiveness of the program in accord with CAEP Standard 4, and 88% also reported that their coursework was relevant to the responsibilities they confronted on the job.
Measures of Program Outcomes
The College of Education provides eight years of EPP data, such as completer final GPAs and the percentage of program completers eligible for state certification upon graduation. All years show eligibility rates well over 95% at the initial level, and in almost every year, 100% of advanced program completers were eligible for certification.
Graduating Certification Student Statistics
Cumulative GPA-Initial Programs (by academic year)
- 2012-13 3.63 GPA
- 2013-14 3.67 GPA
- 2014-15 3.69 GPA
- 2015-16 3.66 GPA
- 2016-17 3.65 GPA
- 2017-18 3.65 GPA
- 2018-19 3.64 GPA
- 2019-20 3.68 GPA
- 2020-21 3.65 GPA
Cumulative GPA-Advanced Programs (by academic year)
- 2012-13 3.88 GPA
- 2013-14 3.87 GPA
- 2014-15 3.87 GPA
- 2015-16 3.90 GPA
- 2016-17 3.82 GPA
- 2017-18 3.85 GPA
- 2018-19 3.94 GPA
- 2019-20 3.94 GPA
- 2020-21 3.93 GPA
Initial Programs Teacher Education Students Eligible for Certification Upon Graduation
- 2012-13 97.46%
- 2013-14 98.60%
- 2014-15 98.25%
- 2015-16 96.37%
- 2016-17 98.33%
- 2017-18 98.48%
- 2018-19 97.55%
Advanced Programs Teacher Education Students Eligible for Certification Upon Graduation
- 2012-13 100.00%
- 2013-14 100.00%
- 2014-15 100.00%
- 2015-16 100.00%
- 2016-17 98.00%
- 2017-18 100.00%
- 2018-19 100.00%
- 2020-21 97.83%
All students completing a teacher preparation program at Penn State must meet program course requirements by earning a C grade or higher in required courses, maintain a 3.0 overall GPA in the program, and pass a PDE-required content exam. The following table presents completer data for several years with Praxis or Pearson testing pass rates and the average passing score. For programs with fewer than 5 students per year, completer data have been aggregated into one or more blocks to preserve student privacy.
Program Completers with GPA and Certification Exam Pass Rates
AY 2016-17, 2017-18, 2018-19, 2019-20
[Table: average certification exam scores by program and academic year. Recoverable entries include Soc St 4-8 (AY 2016-17 through 2020-21), Earth & Space 7-12 (AY 2017-18 through 2019-21), and PECT Module 1 and Module 2 average scores ranging from 231 to 253.]
One of the program outcome measures that the College of Education is required to report annually to CAEP is the ability of our program completers to be hired into positions for which they have been prepared. The Pennsylvania Department of Education (PDE) provides a list of Penn State students who have been hired in Pennsylvania and the school districts that have hired them. However, because a significant number of completers seek teaching positions in other states, the College has been working on ways to maintain connections with those students through emails and surveys so as to include their data in our self-assessments.
The following tables report PDE data on the number of Penn State completers hired into Pennsylvania school districts over five calendar years at the initial and advanced program levels. Please note that 2020 initial completers were new to the job search process, and any very recent hires would not yet be loaded into the PDE system.
|Certification Year|Total Certified Initial|Teaching in PA|
|---|---|---|

|Certification Year|Total Certified Advanced|Working in PA|
|---|---|---|
Less formal data were collected a year after graduation of the first cohort of the new dual-degree program pairing the CEAED PK-Grade 4 bachelor's degree with a one-year Special Education PK-Grade 8 master's degree. Seven completers were emailed and asked for feedback on the following questions: Are you working in an elementary school? Are you in a general or special education classroom? Do you believe that your master's degree helped you obtain your position or made you better qualified?
Six students (85.7%) responded, and all six reported that they were working in an elementary school. Three were working in general education and three in special education. All six reported that their second degree helped them obtain their position over other candidates. Additional comments described satisfaction with the program, such as "I feel much more comfortable as a teacher," "I think staying and getting my Master's was the best decision I could have made," and "I'm thankful to have had such a great education to prepare me!"
Student Loan Default Rates and Other Consumer Information
The data below are from the U.S. Department of Education. At this time, College-level student loan default rates as provided by the federal government cannot be separated from the university's aggregate data.
Default Rate as of 2018
|School Type|Default Rate|
|---|---|
|Penn State University|4.0%|
|All Pennsylvania Schools|pending|
|Public Four-Year Schools|5.4%|
|National Average, All Schools|7.3%|