Annual Indicators

Eight annual reporting measures required by CAEP.

CAEP requires eight annual reporting measures, which provide data to the public on program impact, program outcomes, and consumer information.

Measures of Program Impact (CAEP Standard 4)

Impact on P-12 Learning and Development (Component 4.1)

This measure evaluates how program completers contribute to student-learning growth. The Commonwealth of Pennsylvania uses a Student Learning Objectives (SLO) process to assess and document educator effectiveness based on student achievement of content standards. The SLO process is part of Pennsylvania's comprehensive, multiple-measure Educator Effectiveness system authorized by Act 82 (HB 1901), and it serves as one source of evidence of completers' impact on student learning and development. The College of Education contacted completers who are classroom teachers to ask what evidence they have collected that speaks to student learning in their P-12 classrooms. This PDF includes sample SLOs provided by program completers who are classroom teachers at various P-12 levels.

However, because neither the Pennsylvania Department of Education nor the Commonwealth of Pennsylvania currently shares the SLO data described above, the EPP created a Penn State-approved project, "Do College of Education Teacher Preparation Program Completers Contribute to P-12 Student Learning and Development through Effective Teaching Practices," as a continuation of the data collection efforts surrounding Standard 4.1. Emails were sent to 243 completers identified as having fewer than five years of classroom experience, in order to control for district effects overtaking EPP effects. To date, only four data sets have been submitted as SLO evidence. The EPP has reviewed the collected data and finds that, in alignment with other measures previously collected, program completers do contribute to an expected level of student-learning growth.

However, the small number of participants, even after incentives were offered, continues to be disappointing and significantly limits reliability, validity, and actionability. Planning has begun on the logistics of conducting direct observations of classroom teachers who have been identified by the state, in order to supplement and validate the data previously collected.

A data collection tool that continues to provide robust evidence that our program completers who are classroom teachers systematically measure student-learning growth is our annual master completer survey. This tool surveys all College of Education completers from all programs and asks questions relevant to the program taken and employment status. For the Summer 2018 survey, 186 classroom teachers from Pennsylvania, 21 other states, and 5 non-U.S. locations responded to the question, "What indicators do you use that provide evidence to guide your practice that your students are making academic progress?" Over 95% reported using progress monitoring (e.g., classwork, homework, student artifacts) as one indicator, 60% indicated state standards, and 30% indicated the standards of their discipline. Individual respondents also noted daily observations, Vermont Teaching Strategies Gold, constant formal and informal assessment, PA SAS, project-based learning projects, agricultural task sheets, pass rates on required state exams, and online ESGI software.

Indicators of Teaching Effectiveness (Component 4.2)

This measure evaluates, through observation or P-12 student surveys, how effectively program completers apply in their P-12 classrooms the professional knowledge, skills, and dispositions acquired in their preparation programs. As part of the Penn State project described above, the College of Education contacted completers who are classroom teachers to ask for evidence of their teaching effectiveness collected through teacher evaluation tools administered by school districts, with the same disappointingly small response. The PDF referenced above also includes these teacher observation forms.

Satisfaction of Employers and Employment Milestones (Component 4.3 / A.4.1)

Regarding employer perception, data on where completers took positions were until recently unavailable. The EPP employed numerous strategies to reach out to completers, such as leveraging data from the alumni association and searching social media. PDE recently began providing current Pennsylvania placement data for completers. These data will be of some use for following individuals into the field; however, many of our completers do not remain in Pennsylvania, and the data provided by PDE do not include contact information. Nonetheless, they are a place to start.

In preparation for the ability to track completers into positions in Pennsylvania, the EPP identified concentrations of recent completers (since 2012) who took positions in varied districts. We intentionally picked schools for this pilot that varied in district size, setting (suburban, rural, and urban), and grade level taught. Principal emails were found through district websites. Principals (n=17), all with more than one EPP completer in their school, were selected. Each was sent an email with a 10-item survey grounded in InTASC standards and parallel with EPP-developed student surveys (to provide mechanisms to support validity), using the rating scale: not at all, limited, adequate, and proficient. Two principals responded, in reference to completers in the following areas: PK-4 (2); Reading Specialist (2); Spanish, P-12 (2); Biology, 7-12 (1); Cooperative Education, 7-12 (1); Social Studies, 7-12 (2); Health and Physical Education, P-12 (1); English, 7-12 (2); Special Education, P-8 (1); and Special Education, P-12 (1). These two principals rated completers as 'proficient' on each standard, with one exception: the elementary principal rated the PK-4 completers as 'adequate' on Penn State teachers' use of multiple methods of assessment for progress monitoring and decision making. As a small pilot with a very low response rate, these data have significant limitations and provide limited evidence. The EPP, however, plans to conduct a stronger principal survey in the latter half of 2019, focusing on principals who have completed certification training at the EPP.

Faculty colleagues at the EPP have recently received data from PDE on the certification and placement of all educators. Thus, in the future, we will be able to track the placement and retention of all Penn State educators by certificate area (e.g., elementary, middle school mathematics, high school English Language Arts) and school level (e.g., elementary, middle, and high school) within the Commonwealth, in accord with component 4.3. Further, PDE will be releasing all principal and superintendent emails, which will greatly facilitate effective sampling. Unfortunately, there were too little data from the pilot to address the psychometric properties of this scale. Therefore, the principals we survey will be contacted in advance to make them aware of the survey's intent, which should increase the response rate and yield adequate sampling. The alignment with InTASC standards and the ability to correlate with completer surveys hold promise that the survey data will meet standards for reliability and validity.

Advanced Programs: Although the EPP faces some of the same issues tracking completers into the field for its advanced preparation programs, some data are available for students completing Principal Certification programs. This program has begun surveying program completers to provide data for its SPA reporting; by examining the question, "Did you change jobs after receiving your principal certification from Penn State?" the EPP can make judgments about employer satisfaction from the percentage of completers who were able to change positions. Of the 75 respondents who completed the program in 2016-2019, 17 of the 18 who changed jobs did so into an administrative or supervisory role. Of the 58 completers who did not change positions, 15 were already in administrative or supervisory positions.

Satisfaction of Completers (Component 4.4 / A.4.2)

The College of Education surveys students after program completion. These surveys and their results can be found on the Completer Survey Results page. In addition, beginning in 2017, PDE released program satisfaction data collected from initial-level completers within their certification application, with additional years to be released as they become available. The following table reflects the first year of data:

PDE Program Satisfaction Data (Spring and Summer 2017 Completers)

Survey Question | Percentage of Agree or Strongly Agree Responses
My program provider clearly communicated program and certification requirements before program enrollment. | 85.27%
I feel adequately prepared to design and implement instruction and assessments aligned with state standards. | 94.64%
I have sufficient knowledge to teach and work with all students effectively. | 94.20%
My field experiences provided opportunities to work with diverse learners. | 88.84%
I received effective feedback, coaching and mentoring that built my confidence, skills, and knowledge. | 90.63%

All Standard Four evidence tagged at the component level and provided to CAEP in the College's Self-Study Report can be found on the Evidence Table for Standard Four webpage.

Advanced Programs: The principal preparation program survey mentioned above includes scaled questions rating various aspects of the program, as well as a final open-ended question: "Are you satisfied with this program's ability to prepare you for your responsibilities working with PK-12 students?" Of the completers, 80.0% responded in the affirmative, 6.7% responded negatively, and 13.3% did not respond to this question. Finally, 64% of respondents provided ideas in response to "What changes could be made in the program to improve its quality?"

Measures of Program Outcomes

Graduation Data Summaries (Initial and Advanced Programs)

The College of Education provides five years of EPP data, such as completers' final GPAs and the percentage of program completers eligible for state certification upon graduation. All years show eligibility rates well over 95% at the initial level, and in almost every year 100% of advanced-program completers were eligible for certification.

Ability of Completers to Meet Certification Requirements

The College of Education provides three years of EPP data showing results of state licensure exams, including pass rates and median scores by program year and program area.

Ability of Completers to be Hired in Education Positions

The College of Education compiles data received annually from PDE on program completers who have accepted a position in a Pennsylvania school. For example, in calendar year 2016, 191 Penn State completers were employed by a Pennsylvania school. Please see Completers Hired for a breakdown by year and completer level.

Student Loan Default Rates and Other Consumer Information

(Please note that any links requiring a Penn State WebAccess login are for registered Penn State students only.)

The data below are from the U.S. Department of Education. At this time, student loan default rates for the College as provided by the federal government cannot be separated from the university's aggregate data.

Default Rate as of 2015

School Type | Default Rate
Penn State University | 5.1%
All Pennsylvania Schools | 9.6%
Public Four-Year Schools | 7.1%
National Average, All Schools | 10.8%