Program Review Content
The key components of the Comprehensive Program Reviews (CPRs) and the Annual Program Reviews (APRs) are listed below, along with specific instructions on how they may be produced. In addition, please review the Program Review Policies.
A key component of the program review is a clear statement of your program's vision. To support this section, identify at least two best practices in use in other programs in the discipline and analyze the current trends in your discipline. This portion of the review includes the current vision and mission for the program and notes any changes that have been made to the mission and vision.
This section also includes
- descriptions of two or more best practices in your discipline that show potential for use at GGC along with a summary of theoretical and/or applied data supporting the selection of the specified practices, and
- descriptions of current and upcoming trends in the discipline.
This section specifies the rationale for selecting each best practice, with a focus on both the theoretical foundations for the specific practice and examples of its implementation. Examples of implementation should include data on the characteristics of the program implementing the practice and data on the student learning outcomes of the program. Where possible, results of comparative studies demonstrating that the identified practice contributes to improved student learning outcomes should be included.
This section of the review addresses the global and national character of the discipline and identifies growth areas, major shifts, and general movement in any of several areas, including but not limited to: pedagogy, theoretical frameworks, and career and/or graduate school opportunities. It is possible that the information in this section will remain fairly stable from year to year and require only updating on an annual basis.
Current Data Snapshot
Current State of Your Program
To address this thoroughly, please complete the 5-part data snapshot:
- Student Demographics (generated by IR);
- Program Productivity (generated by IR);
- Student Learning Outcome Results;
- Faculty Info;
- Analysis of the Resources of the Program, including the leadership and the organizational structure.
This compiles existing and available data about the program so that the faculty and leadership can consider the status and performance of the program in the context of each of the five perspectives listed above. Data and information are gathered from several sources over the course of spring semester. The table below provides a list of recommended data to be included. Programs may wish to include other indicators and, in some years or programs, some data listed below may be unavailable.
The data snapshot concludes with a narrative description, prepared by the school dean or director, of the resources and the leadership and organizational structure of the program and the school within which it resides. This description is focused on articulating how the School and/or program allocates and uses its resources – money, space, time, personnel (and their specific skills) – to support the mission and the goals of the program.
Data Collection Process
February – March
- Student Learning data is collected by program faculty and the Office of Institutional Research.
This data includes program outcomes (summary of results); IEE outcomes (summary and analysis of results from upper-level courses); IEE (or GE) outcomes (summary and analysis of results from Area F courses); and grade distributions for lower-level courses (1000- to 2000-level) and upper-level courses (3000- to 4000-level).
- Student Demographics (gender, age, ethnicity) are collected via the Institutional Research standard template (to be constructed).
- Program Productivity data is collected by Institutional Research (standard template to be constructed).
This data includes the number of majors, graduates, and faculty; the average semester load (students); the average student credit hours per faculty member by semester; and the ratios of faculty to students and of program mentors to mentees (all dependent on identifying faculty by program).
April and Early May
- Additional Student Learning data focused on the job and grad school placement of alumni is collected by the Career Services staff and program faculty.
- Faculty Data is compiled by the program faculty and directors as well as the school deans.
This data includes summary data on course evaluations, faculty scholarship and service (grants, publications, etc.); faculty awards; the number of faculty or staff who applied for and received promotions; and recruitment data, including the number of applicants per position and the average or median ratings of the applicants.
Program Strategic Plan Update
What is your program's role in supporting the College's Strategic Plan?
To answer this question, please review the results of your unit assessment report, and provide evidence that your program supports the College’s Strategic Plan.
This section of the review provides a narrative analysis of the program's role in supporting the College strategic plan, showing how the program-level strategic goals serve to advance the College's goals and plan. The program-level strategic plan is reviewed with specific attention paid to identified or potential areas of growth and enhancement. The status of individual action steps within the strategic plan should be updated; completed action steps should be flagged, and action steps that are behind schedule or that are due to be completed in the upcoming year should be highlighted.
The review and update of the program-level strategic plan should be conducted by the faculty within the discipline.
Comprehensive Review of Data
What are the strengths, weaknesses, opportunities and threats facing your program?
To answer this question, please review the information compiled to answer questions 1, 2, and 3. The comprehensive review of the data that has been collected will help in completing this analysis.
This section of the annual review provides the overall analysis of all data and information from the previous sections. In essence, this is a SWOT analysis of the current status of the program. What are its strengths and weaknesses in relation to the four perspectives identified earlier? What opportunities for growth or enhancement can be identified? What conditions, on or off campus, present challenges to continued effectiveness and efficiency, to growth, or to enhancement of the program?
While this section will clearly include a narrative summary of the program's student learning outcomes assessments for the previous calendar year, it must go beyond student learning and consider operational needs and factors. A detailed analysis referencing best practices and disciplinary trends is particularly useful.
What gaps have you identified between the current status of your program and your vision? What is needed to implement best practices at GGC? What is your action plan for addressing those gaps and needs?
The final section of the review articulates the specific responses appropriate to the comprehensive data review. This includes
- the previously articulated action plans emerging from student learning outcomes assessment, as well as actions responding to the strategic plan analysis and ideas arising from the review of best practices and disciplinary trends, and
- an analysis of the resources required (people; funds; space; time; etc.) to mitigate the gap between the vision and the current state.
This section is reserved for discussion and supporting data for (a) any additional factors or information you consider important to the current state of the program or (b) the specified focus area for the year.