In Spring 2012, a faculty task force was formed and charged with investigating alternatives to the Endeavor course evaluation instrument. In Fall 2012, the task force issued a final report (which included a full rationale and notes from meetings with five different vendors) with a recommendation to move forward with IDEA as a potential replacement for Endeavor.
During the Spring 2013, Fall 2013, and Spring 2014 semesters, the IDEA instrument was pilot tested, with a slightly different emphasis each time. In all, IDEA was pilot tested in 274 classes across all four colleges and all three North American campuses, as well as at some off-site locations, in both face-to-face and online/blended classes, and in courses running on both traditional and non-traditional schedules. Approximately 4,800 student responses were gathered through the pilot tests. Surveys of pilot-test faculty indicated overwhelming support for implementing IDEA university-wide.
In Spring 2014, IDEA was approved as the replacement for Endeavor by the Academic Policy and Research Committee (APRC), the faculty senate, and the full faculty.
The plan at that point was simply to change the evaluation instrument and move from paper forms to online administration. The timing and frequency of evaluation remained unchanged from long-standing FDU practice.
In Fall 2014, IDEA was rolled out university-wide. All courses with an enrollment of five or more were evaluated using IDEA’s full Diagnostic Feedback instrument, which provides both summative and formative feedback about student progress on relevant course objectives, instructor teaching methods, and overall impressions of the instructor and course (in keeping with past practice, there were some courses that were not evaluated, such as independent studies courses or co-ops/internships). At the time, the full Diagnostic Feedback instrument contained more than 40 questions, and an additional 12 Endeavor questions were added to the IDEA instrument for tenure-track faculty who started at FDU prior to Fall 2014.
As indicated in the “Fall 2014 IDEA Task Force Activities Report,” the rollout was very successful but revealed some areas where changes could be made to increase response rates and strengthen the effectiveness and efficiency of the evaluation process. While some procedural and administrative changes were made during the Spring and Fall 2015 evaluation process, the frequency of evaluation and the evaluation instrument remained the same (aside from a few changes made to the Diagnostic Feedback instrument by IDEA).
After gathering feedback from faculty, staff, and students, analyzing data from two years of implementation at FDU, and examining research on how similar universities use student surveys of teaching, it became apparent that FDU was collecting evaluation data more frequently, and in more detail, than is typical. Most comparable universities gather student feedback on six to ten courses in total per faculty member during the pre-tenure years, and only periodically assess courses taught by tenured faculty.
In Spring 2016, Campus Labs introduced IDEA2, a new version of IDEA that included changes to the Diagnostic Feedback instrument (IDEA Technical Report 19 discusses those changes and the rationale for them). Because Campus Labs stopped supporting the previous version of IDEA, FDU had no choice at that point but to transition to IDEA2. At the same time, FDU also introduced two short evaluation surveys provided by IDEA: Teaching Essentials and Learning Essentials.
FDU piloted use of the two shorter evaluation instruments in Spring 2016 with the goals of reducing evaluation “fatigue” on the part of students, increasing response rates, reducing the administrative burden (for both administrative staff tasked with deploying IDEA and faculty personnel committees tasked with reviewing IDEA data), and aligning FDU evaluation procedures with best practices. Using the shorter evaluation instruments for some faculty provided a means to achieve many of these goals.
The Dean’s Council and the APRC recommended a plan for the Spring 2016 semester that would provide experience with the Learning Essentials instrument (through a pilot with tenured faculty) and the Teaching Essentials instrument (through evaluation of courses taught by non-tenure-track and adjunct faculty). Courses taught by tenure-track faculty would continue to be evaluated using the full Diagnostic Feedback instrument.
This plan for Spring 2016 allowed for tenure-track faculty, as well as tenured faculty who opted in, to get the full Diagnostic Feedback evaluation. At the same time, the number of evaluations completed by students was minimized (with the goal of reducing evaluation “fatigue” and increasing response rates), and experience was gained with both the Teaching Essentials and Learning Essentials instruments.
Reports of the FDU IDEA Faculty Task Force may be found on the FDU IDEA WordPress site, which was used to communicate IDEA information during the pilot, implementation, and early phases of IDEA at FDU. The site was set up and maintained by Scott Behson, Professor of Management in the Silberman College of Business.
IDEA Implementation Overview
Spring 2012: Faculty task force formed to investigate alternatives to Endeavor
Fall 2012: Task force report issued, recommending IDEA as a replacement for Endeavor
Spring 2013: First pilot test
Fall 2013: Second pilot test
Spring 2014: Third pilot test
Spring 2014: IDEA approved as the new evaluation instrument by the APRC, faculty senate, and full faculty
Fall 2014: University-wide rollout of the IDEA Diagnostic Feedback instrument
Spring 2015: Continued university-wide use of the IDEA Diagnostic Feedback instrument
Fall 2015: Continued university-wide use of the IDEA Diagnostic Feedback instrument
Spring 2016: Continued university-wide use of the IDEA Diagnostic Feedback instrument; introduction of the shorter Teaching Essentials instrument; pilot of the Learning Essentials instrument