STANDARD 2. ASSESSMENT SYSTEM AND UNIT EVALUATION
The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.
2a. Assessment System
2a.1-2 Evaluation and Refinement
At the time of our last NCATE visit in October 2001, the Board of Examiners (BOE) report cited the following weakness in the unit assessment system:
The assessment system does not include a plan or data system from which the unit will be able to engage in program evaluation and improvement.
Rationale: Although the Unit has given some thought to developing an evaluation system, there is no conceptual design for relating the various categories of data currently being collected. The Unit should begin to develop an information system architecture that will be flexible enough to meet internal as well as external reporting needs (p. 18).
In response to this evaluation, the Unit began a major effort in fall 2002 to revise the assessment system. Responsibility for oversight of the unit assessment system now rests with the Dean’s office, specifically with the Associate Dean for Teacher Education, who is also the NCATE Coordinator. While each department is responsible for the collection, analysis and evaluation, and use of data for the improvement of its individual programs, the Office of the Associate Dean tracks progress and, where needed, coordinates efforts to ensure that each element of Standard 2 is met.
Initial Programs. Initial efforts focused on the undergraduate initial licensure programs and involved examining ways to “link the component parts (of the unit assessment system) in a meaningful way.” A faculty committee was formed to align professional and state standards by creating a “crosswalk” between the standards and the portfolio-based system of assessing teacher candidates. In fall 2003, changes to the portfolio system were discussed, e.g., having students create electronic portfolios and assessing student work using rubrics. As the planning evolved, it became evident that faculty were committed to the portfolio system but felt the changes under discussion might undermine the portfolio’s philosophical purposes (i.e., to have students articulate their professional development over time).
Faculty agreed to move forward and create a new plan to assess candidates, recognizing that it was unclear how the previous portfolio review system would be incorporated. In fall 2003, a team attended the AACTE/NCATE conference and, upon returning, proposed to undergraduate program faculty in the Unit that a new assessment system be developed based on a common-rubric approach. The rubrics would assess candidate performance on critical tasks; would be aligned with the Unit’s conceptual framework and with professional, state, and program standards; and would draw upon a web-based electronic management system, LiveText, for data collection and analysis. The extant portfolio system would remain in use, and a task force was convened by the Chair of the Department of Teaching and Learning and charged with studying ways the system might be revised and integrated into the new assessment plan.
The common-rubric system with critical tasks became partially operational in fall 2004, underwent revision in fall 2005, became functional in spring 2006, and became fully functional in spring 2007 (related departmental meeting minutes, E-exhibit 2a.1-2.1; T&L Assessment Committee minutes, E-exhibit 2a.1-2.2; complete plan, E-exhibit 2a.1-2.3). By fall 2007, the Portfolio Task Force had completed its work (related minutes, E-exhibit 2a.1-2.4). The revised portfolio system was presented to and accepted by the faculty, and implementation of the new system was set into motion (E-exhibit 2a.1-2.5). It was decided that the portfolio and its presentation would become the final critical task in the candidates’ program; the first cohort of candidates to present portfolios under the new system would be those student teaching in spring 2008.
Specialized Professional Association (SPA) Review. The Special Education Program Area in the Department of Teaching and Learning prepares candidates for licensure in the field of special education at the graduate level. The SPA decision on the program report was received on October 18, 2007. Two standards were met with conditions; the remaining standards were not met. The primary concern was the need for the program to revise its assessments in order to provide evidence that standards were met for each disability area (rather than for the program as a whole). The program faculty are preparing a follow-up report. For detailed assessment information on that program, please refer to the SPA report (E-exhibit 2a.1-2.6).
Advanced Programs for Teachers. Assessment plans for the advanced programs for teachers have also undergone complete revision since the previous NCATE visit. In fall 2004, graduate directors were convened to collaborate on outlining a new assessment plan. Candidates would be assessed on advanced critical tasks and other performances using rubrics aligned with the Unit’s Conceptual Framework and the National Board for Professional Teaching Standards (NBPTS). While initial implementation was directed toward admission and the final project, additional assessments were piloted in spring and summer 2007. In fall 2007, the assessment plan was presented to and accepted by the graduate directors; it will become fully operational in spring 2008 (E-exhibits 2a.1-2.7 and 2a.1-2.8). Because the plan is new, data for Standard 1 of this report are drawn from the previous assessment plan. Candidates are assessed at entrance, at mid-point, and at the end of their program.
Programs for Other School Professionals. The Department of Educational Leadership and the Department of Counseling Psychology and Community Services have assessment plans (E-exhibits 2a.1-2.9 and 2a.1-2.10). In addition, information related to each program’s assessment system is outlined in each program re-approval report (E-exhibits 2a.1-2.11 and 2a.1-2.12). The Instructional Design and Development Program within the Department of Teaching and Learning has developed an assessment system in response to the Association for Educational Communications and Technology (AECT) Standards (E-exhibit 2a.1-2.13). The graduate program in the Department of Communication Sciences and Disorders is accredited by the American Speech-Language-Hearing Association (ASHA); information related to its assessment plan may be viewed in its accreditation materials (Hard Copy Exhibit 2a.1-2.1).
A great deal of work has been done since the previous visit to evaluate and refine the overall assessment system, so that the collection and analysis of data on applicant qualifications, candidate and graduate performance, and unit operations lead to improvement of the unit and its programs. For a graphic representation of the unit assessment system, please see E-exhibit 2a.1-2.14.
2a.3-4 Key Assessments and Transition Points
The unit’s assessment system includes evaluation measures to monitor candidate performance across the programs, as indicated in Table 2a.3-4.1.
2a.5 Fairness, Accuracy, Consistency, and Freedom from Bias in the Assessment System
The following list describes the procedures used to establish fairness, accuracy, and consistency and to eliminate bias in programs and the assessment system.
2b. Data Collection, Analysis, and Evaluation
2b.1-2 Unit’s Process and Timeline to Collect, Summarize, and Analyze Data
The Unit Assessment System includes the collection and analysis of data, and the Unit uses this information for candidate, program, and unit evaluation for initial and advanced programs. Internal (candidate and university) and external sources of data are used at various transition points to assess candidate knowledge, skills, and dispositions, which are aligned with the Unit Conceptual Framework, professional and state standards, and program goals. The data are also used to improve programs and the unit.
Multiple forms of assessment data are collected routinely from candidates, faculty, University supervisors, cooperating teachers, University staff, institutional offices (e.g., the Registrar), and accrediting agencies, as detailed in Table 2b.1-2.1. Table 2b.1-2.2 describes the databases, where they are housed, and how the data are used. Data are analyzed routinely to monitor student progress and to evaluate programs and unit operations.
Initial Programs: Candidate. A profile of each candidate is compiled in a cumulative folder and kept in the offices of the Associate Dean. At Transition Point 4, cumulative folders include, but are not limited to, the following performance information for completers of an initial licensure program:
The cumulative files for both initial licensure and advanced program candidates are reviewed by the candidate’s academic advisor, who uses the information to monitor candidate preparedness. Additionally, performance data are monitored by staff in the Office of the Associate Dean (or by Graduate School staff in the case of advanced programs for teachers). Files and data that reveal insufficient performance by an initial licensure candidate usually prompt a review of the performance with the advisor or, if more appropriate, by the Student Review Committee (E-exhibit 2b.1-2.1). Insufficient performance can result in retaking a course or field experience, developing an improvement plan, seeking a new major, a referral to various student services, dismissal from the program, or other appropriate interventions.
Initial Programs: Program and Unit Operations. Excel/Access spreadsheets, statistical software packages, the University PeopleSoft System, and LiveText reports are used to compile and analyze data. Key components of program review for initial programs include:
The Undergraduate Assessment Committee in the Department of Teaching and Learning conducts an initial analysis of all data in preparation for the annual faculty assessment retreat. The committee summarizes the data related to critical tasks and prepares charts and tables for faculty review, as well as the agenda for the sessions. The first retreat was held in May 2006 (E-exhibit 2b.1-2.2) and the second in April 2007 (E-exhibit 2b.1-2.3). After the second session, the Committee decided to move the retreat to January, allowing faculty the full spring semester to complete the work needed for program changes so that new plans could be implemented the following semester.
Completer and administrator survey data reveal descriptive statistical information (e.g., percentages) related to graduates’ and administrators’ perspectives on the unit’s teacher preparation programs. The Office of the Associate Dean collects the data annually in collaboration with the Bureau of Educational Services and Applied Research (BESAR). Once the reports are submitted to the Associate Dean, they are routed to program area coordinators, who review the data with faculty and prepare a summary report (E-exhibit 2b.1-2.4). Complete data related to the 2006 and 2007 survey reports are available in the hard copy exhibit room under 2b.1-2.2.
Advisement Satisfaction Survey (E-exhibit 2b.1-2.5) data are compiled using Excel (as of spring 2008) and provide descriptive information related to students’ perceptions of advising in the unit. Results are shared with advisors in the Office of Advising and Admissions and with the appropriate program area faculty, who interpret the data and plan data-driven program improvements. Full survey reports for spring 2007 are available in the hard copy exhibit room under 2b.1-2.3. For a summary report, see E-exhibit 2b.1-2.6.
Advanced Programs: Candidate. A profile of each candidate in advanced programs is compiled in a cumulative folder and kept in the office of the Department of Teaching and Learning. At Transition Point 4, the cumulative folder for each completer of an advanced program includes the following performance information:
The cumulative files for advanced program candidates are reviewed by the candidate’s academic advisor, who uses the information to monitor candidate preparedness. Additionally, performance data are monitored by the Graduate School staff.
Advanced Programs: Program and Unit Operations. Excel/Access spreadsheets and the University PeopleSoft System are used to compile and analyze data. Advanced program candidates’ performance on Advanced Critical Tasks, knowledge of research, and internships/practica are used for program evaluation. Data from common rubrics and assessment tools, which are aligned with the Unit Conceptual Framework and national and state standards, are tabulated and summarized to reveal program strengths and areas that need improvement.
The Graduate Program Directors in the Department of Teaching and Learning are responsible for the collection and analysis of all critical task data. They also prepare the agenda for the annual faculty assessment retreat related to advanced programs. The first retreat of the advanced programs faculty is scheduled for January 2008.
Faculty advisors and advisors in the Office of the Associate Dean are committed to helping candidates resolve issues early on. The Unit monitors and addresses complaints in two ways. When general complaints come directly to the Office of the Associate Dean for Teacher Education, written documentation is retained in a file along with notes indicating what has been done to resolve the difficulty (Hard Copy Exhibit 2b.1-2.4). Generally, the Associate Dean reviews the letter or e-mail, contacts the appropriate person(s) who may assist the candidate, and follows up to ensure that the difficulty has been resolved. In the case of academic complaints related to grades or discrimination, candidates are referred to the college grievance policies outlined on the college’s web page (E-exhibit 2b.3.2).
2c. Use of Data for Program Improvement
2c.1 What Assessment Data Indicate about Candidate Performance
Data are collected in a variety of ways across and within programs. Overall findings indicate that candidates in our initial programs know their content and how to teach it, and have the dispositions to help all students learn. For the most part, candidates are satisfied with their programs, as are those who employ them. As faculty reviewed the data, concerns arose related to the assessment of students and to working with diverse learners; action plans have been developed to address these concerns.
Candidates in advanced programs are able to study and explore content and pedagogy through a variety of courses. They have successfully demonstrated their skills, knowledge, and dispositions in course-related assignments, action research projects, and the culminating research project.
Graduate Directors have begun discussions centered on the assessment of dispositions and on establishing a data collection process that will offer more direct measures of candidates’ knowledge and abilities while still providing maximum choice in the program of study, a hallmark of the graduate degree program.
2c.2-3 Data Use to Improve Performance
The Unit has an assessment system that promotes ongoing improvement in candidate, faculty, program, and unit performance. Candidates have access to nearly all of their performance information (with the exception of the raw data from their admissions application). Candidates have opportunities to improve their performance by revising assignments, completing additional or extended (highly supervised) field experiences, and retaking courses. In addition, the Director of Field Placement, cooperating teachers, and University supervisors work closely with student teachers to help them understand student teaching performance criteria and to apply continuous assessment feedback on knowledge, skills, and dispositions throughout the semester.
Faculty analyze course evaluation (University Student Assessment of Teaching [USAT]) data and make changes to course assignments, pedagogy, course materials, or other aspects of course design. The annual faculty evaluation process is designed to support improved teaching. Based on teaching data (USATs, formative assessments, or other information), faculty are encouraged to set teaching goals and to develop course changes that align courses with adopted national standards, state expectations, and program goals.
The faculty peer evaluation process is designed to support faculty in achieving departmental goals for tenure and promotion. If the evaluation process reveals ineffective teaching, the department chair takes the peer committee’s suggestions for improvement under consideration and works with the faculty member to set expectations for improvement. If progress toward those expectations is insufficient, the chair works with the College dean to determine next steps.
When assessment data drive program or course changes (to content, methods, or field courses), program area faculty discuss and initiate the changes, completing the appropriate curriculum forms. The changes are then brought to elected committees at the department, college, and university levels (and, if the change process requires, to the State Board of Higher Education).
2c.4 Data-Driven Changes
Extensive effort has gone into developing and organizing the unit assessment system. Changes within the assessment system over the last three years include, but are not limited to, the following:
Initial Programs
Advanced Programs
Other Program and Unit Improvements
Data-driven program decisions are now being made on a routine basis. A sampling of those changes is offered below:
Initial Programs
A technology-supported connection was developed between the Education Building on campus and an elementary classroom in the Grand Forks public schools. Its purpose is to allow teacher candidates to study how skilled elementary teachers manage their classrooms through a live, interactive video feed: from a connected classroom on campus, candidates can see and interact with the teacher and students as a range of management decisions unfold, including classroom routines, student groupings, transitions, and engaged learning for all students. Faculty will report on this initiative at an elementary education program area meeting in October 2007.
The field experience that is co-requisite with the methods block is active and sustained. Candidates are in the field for three consecutive weeks and are assessed on teaching four lessons (at least one enhanced with technology). One lesson, chosen by the candidate, is assessed as a Critical Task.
2c.5 Sharing Assessment Data
Assessment information is shared with stakeholders regularly throughout the year. Candidates are informed about assessment and receive ongoing feedback on their performance through grades and personal LiveText messages reporting performance on Critical Tasks. In addition, candidates participate in the assessment of dispositions and in student teaching evaluations.
Faculty have timely access to course evaluation (USAT) data. Also, faculty peer evaluation committees review the work of the faculty regularly and department chairs are required to review faculty performance annually.
Assessment committees, in collaboration with the Office of the Associate Dean for Teacher Education, share candidate performance data with faculty at annual retreats. The Teacher Education Committee regularly reviews unit-level data to support and recommend changes to improve the unit. The extended faculty (faculty in the College of Arts and Sciences and the College of Business and Public Administration) receive Praxis II scores from the Office of the Associate Dean and are invited to the annual assessment retreat. In addition, they review the Lesson Plan, a critical task embedded in methods courses, and can use the tools in LiveText to analyze and interpret candidates’ scores. The Director of Field Placement and Student Teaching hosts collaborative meetings with supervisors and cooperating teachers and regularly shares data related to the assessment of candidates.
Finally, for the last two years, the Dean of the College of Education and Human Development has hosted an annual assessment program during the final all-college meeting of the spring semester, at which each department presents assessment-related work accomplished during the year. For an example of the Department of Educational Leadership’s reports, see E-exhibit 2c.5.1.
Electronic Exhibits in Support of Standard 2
2a.1-2.1: Department of Teaching & Learning Minutes Related to the Assessment System
2a.1-2.2: T&L Assessment Committee Minutes
2a.1-2.3: T&L Undergraduate Assessment Handbook
2a.1-2.4: Portfolio Task Force Meeting Minutes
2a.1-2.5: Revised Portfolio System Documents
2a.1-2.6: Link to Special Education Program SPA Report
2a.1-2.7: MS Programs Assessment Plan: Early Childhood, Elementary Education, and General Studies
2a.1-2.8: MS Programs Assessment Plan: Reading Masters
2a.1-2.9: Link to Educational Leadership Assessment Plan:
https://www.und.edu/dept/datacol/assessment/unsecure/0405/EDL_masters.pdf
https://www.und.edu/dept/datacol/assessment/unsecure/0405/EDL_doc.pdf
2a.1-2.10: Link to MS in Counseling Assessment Plan:
https://www.und.edu/dept/datacol/assessment/unsecure/0405/Coun_MA.pdf
2a.1-2.11: Program Report for the Preparation of Educational Leader (Advanced)
2a.1-2.12: Program Report for the Preparation of Counselors for Schools (Advanced)
2a.1-2.13: Overview Instructional Design & Technology Assessment Plan
2a.1-2.14: Graphic of Unit Assessment System
2a.5.1: Reliability Document Related to Admissions Letter Review
2b.1-2.1: Student Review Committee Policy and Procedures
2b.1-2.2: 2005-2006 Undergraduate Assessment Committee Report and Related Documents
2b.1-2.3: 2006-2007 Undergraduate Assessment Committee Report and Related Documents
2b.1-2.4: 2006 Program Area Responses to Completer and Administrator Surveys
2b.1-2.5: Advisement Satisfaction Survey
2b.1-2.6: Advisement Satisfaction Survey Results
2b.3.2: Link to Colleges’ Grievance Policy: https://www.und.edu/dept/ehd/policy.htm
2c.5.1: Educational Leadership 2006 and 2007 Assessment Day Reports.
Hard Copy Exhibits in Support of Standard 2
2a.1-2.1HC: Communication Sciences and Disorders ASHA Accreditation Materials
2b.1-2.2HC: 2006, 2007 Completer and Administrator Survey Results
2b.1-2.3HC: Advisement Satisfaction Survey Report for Spring 2007
2b.1-2.4HC: Documents File Related to Processing of Student Complaints