TRANSCRIPT
Academic Senate
Agenda for the Meeting of
September 7, 2010
West Campus Community Center
10:00 - noon
1. Call to Order and Roll Call
2. Approval of Minutes
A. April 27, 2010 minutes – 10 a.m. session, 11:00 a.m. session
3. Open Forum
4. Reports
A. Senate President – Douglas Haneline
B. Senate Vice President – Michael Berghoef
C. Senate Secretary – Sandy Alspach
D. Course Management Software Executive Update – Mary Holmes
5. Committee Reports
A. General Education Task Force – Don Flickinger
B. HLC Update – Roberta Teahen
C. University Curriculum Committee – Leonard Johnson
D. APRC Report – Matt Wagenheim
Roll Call
6. New Business
A. Appoint Senate Rules Committee
B. APRC Guidelines Revised July 2010 – for Senate approval
C. Establish Senate Committee Review Task Force
7. Announcements
A. FSU President - David Eisler
B. Provost – Fritz Erickson
C. Senate President – Douglas Haneline
8. Open Forum
9. Adjournment
DRAFT
Ferris State University Academic Senate Meeting
April 27, 2010, 11:00 a.m.-12:00 noon Session
West Campus Community Center
Minutes
I. Action Items: Elections were conducted by Sen. Beistle, Elections Committee chair, who welcomed the newly elected Senators. She thanked in advance the members of her counting committee:
A. Douglas (Doug) Haneline (nominated by Sen. Abbasabadi) was elected President of the Academic Senate, without opposition.
B. Michael (Mike) Berghoef (nominated by Sen. Griffin) was re-elected Vice President of the Academic Senate. There were three nominations. Maureen Heaphy (nominated by Sen. Thapa) declined the nomination. Keith Jewett (nominated by Sen. Alspach) and Mike Berghoef spoke to their candidacy. The vote was 28 for Berghoef, 6 for Jewett.
C. Sandra (Sandy) Alspach (nominated by Sen. Heaphy) was re-elected Secretary of the Academic Senate, unopposed, when Maureen Heaphy and David Hanna declined the nomination.
D. The following Senators were nominated for positions on Executive Committee as Member-at-Large. Melinda Isler (nominated by Sen. Sun) and Khagendra Thapa (nominated by Sen. ??) declined nomination. The ballot count for each nominee is recorded.
23 Maureen Heaphy (nominated by Sen. Skrocki)
20 Marilyn Skrocki (nominated by Sen. Heaphy?)
18 David Hanna (nominated by Sen. Thapa)
16 Keith Jewett (nominated by Sen. Haneline)
15 Piram Prakasam (nominated by Sen. Griffin)
Attendance:
Senators present Abbasabadi, Alspach, Beistle, Berghoef, Bokina-Lashaway, Boncher, Brandly, Colley, Compton, Cook, Dakkuri, Daugherty, Drake, Gillespie, Griffin, Hancock, Haneline, Hanna, Heaphy, Isler, Jewett, Joyce, Klatt, Liszewski, Lukusa Barnett, Luplow, Maike, Marion, McNulty, Nash, Prakasam, Rewers, Sanderson, Skrocki, Sun, Thapa, Wagenheim
Senators absent with cause Jorsch, Taylor
Senators absent Nagel
Ex Officio and Guests Teahen, Flickinger
II. Open Forum
A. Pres. Haneline made observations about the coming year for the Academic Senate.
1. He reminded Senators that "we are in the advice business". Since the full Senate will meet only ten times for approximately 2 hours at a time, totaling about 20 hours for deliberation, he suggested that we use our time together, and our committee structure, effectively to give the best advice possible to the administration.
2. He anticipated that "big things" will be happening in the next year for the Senate to consider. He urged the body to begin training new leaders at all levels of Senate activity now, since "we won't be around forever".
3. He encouraged Senators to carefully monitor the changes that are happening in higher education in the state and in the nation.
B. Several issues arose during the election process.
1. Sen. Beistle invited candidates nominated for offices to speak to their candidacy. Following the election of the Vice President, discussion ensued concerning the process of campaigning for an office. Several candidates had emailed statements of their willingness to serve to the new Senators prior to the election meeting. The Charter does not address campaigning practices. Sen. Prakasam expressed concern about campaigning prior to the election meeting.
Because there was only one nominee for President, Sen. Alspach asked to elect "by acclamation"; however, Parliamentarian Loesch advised that this procedure was out of order since the Charter specifies that a ballot vote must be taken for all elected positions.
During the election of Senate officers, Sen. Abbasabadi asked to identify candidates by their colleges. He expressed concern that all officers are from the same college.
During the election of Members-at-Large, questions arose about the process of listing three names on one ballot. It was noted that this procedure had been recommended by the Charter Revision Committee and adopted by the 2009-2010 Senate.
It was noted, and supported by the Parliamentarian, that the Rules Committee should review the Election Procedures described by the Charter and make recommendations.
C. Sen. Prakasam encouraged the new Senate, and the new Vice President, to establish a process for assessing the Senate's committee structure.
D. Pres. Haneline thanked the Senate for its confidence in him and promised that he would seek to justify it. He invited Senators to email concerns to him directly at [email protected]. He offered three remarks:
1. The Senate should be proud of the work that has been done by this body.
2. The Senate should be frustrated that more work has not been done by this body.
3. The Senate should focus on the most important things that it should be doing in the coming year.
He asked the newly elected Executive Committee members to stay briefly after the adjournment.
III. Announcements
A. Sen. Hanna observed that the new roster of Senators and committee assignments included three representatives from the College of Engineering Technology on the Distinguished Teacher Committee. Seeing this situation as inappropriate, he offered to seek an alternative committee assignment.
B. Sen. Prakasam requested that committee assignments be made earlier, ideally within the first week of the fall semester.
Pres. Haneline noted that this process is governed by the Charter language, and he suggested that the Rules Committee address Sen. Prakasam's concern.
C. Administrative Assistant Hadley asked all Senators to review the 2010-2011 roster for any additions or corrections.
The meeting was adjourned at 11:35 AM.

Sandy Alspach, Secretary
Richard Griffin, President
Rules Committee Establishment Motion
Excerpt from the Ferris State University Academic Senate Meeting January 12, 2010, West
Campus Community Center Minutes
D. Sen. Alspach moved, seconded by Sen. Berghoef, to establish a standing Charter and
Procedures Review Committee (see handout).
Parliamentarian Loesch suggested that this committee is usually called the Rules
Committee.
Sen. Haneline suggested that the chair of the committee should be the Secretary of the
Senate.
1. Sen. Hanna moved, seconded by Sen. Dakkuri, to amend the proposal to define the
committee as five (5) Senators, representing five (5) separate units.
The amendment passed unanimously.
2. Sen. Abbasabadi moved, seconded by Sen. Dakkuri, to postpone discussion of the
proposal to the February meeting.
The motion to postpone passed unanimously.
Excerpt from the Ferris State University Academic Senate Meeting February 2, 2010, West
Campus Community Center Minutes
Old Business
I. The motion, postponed from the January meeting, to create a standing committee to
maintain and review the Charter, Policies and Procedures, and any other rules the Senate
may establish (“Rules Committee”) was passed as amended: 23 ayes and 2 abstentions.
A. Sen. Alspach explained that the motion had been reviewed by Parliamentarian Loesch
and he had recommended the inclusion of the phrase “and any other rules the Senate
may establish”.
1. Sen. Dakkuri asked for justification for the naming of the Secretary of the Senate
as chair of this committee.
2. Sen. Alspach reminded Senators that the motion had been amended at the
January meeting to name the Secretary of the Senate as the chair of the
committee. This procedure is also “standard operating procedure” for most
bodies like the Academic Senate, according to Loesch.
Academic Program Review: A Guide for Participants –August, 2010 Approved by the Academic Senate
Questions? Please contact the APRC chair
http://www.ferris.edu/htmls/administration/academicaffairs/vpoffice/senate/progreviewcounc/
ACADEMIC PROGRAM REVIEW:
A GUIDE FOR PARTICIPANTS
Division of Academic Affairs/Academic Senate
Ferris State University Big Rapids, Michigan 49307
Initiated 1988. Revised 1994-95, 1998, 2000-01, 2004-05, 2005-06, 2010-11.
ACADEMIC PROGRAM REVIEW:
DEGREE PROGRAMS, NON-DEGREE PROGRAMS, PRE-PROGRAMS, AND OTHER NON-DEGREE CURRICULAR ENTITIES
I. Program Review Mission and Goals
A. Mission Statement of Ferris State University
Ferris State University prepares students for successful careers, responsible citizenship, and lifelong learning. Through its many partnerships and its career-oriented, broad-based education, Ferris serves our rapidly changing global economy and society.
B. Program Review Process Origin and Philosophy

Academic program review has been present at Ferris since 1988. It fulfills one of the criteria that the University must meet for regional accreditation by the Higher Learning Commission (HLC) of the North Central Association (NCA). According to the Handbook of Accreditation, 3rd edition, Core Component 2c of Criterion Two (Preparing for the Future) is as follows: "The organization's ongoing evaluation and assessment processes provide reliable evidence of institutional effectiveness that clearly informs strategies for continuous improvement." As part of a larger institutional system that collects, disseminates, and evaluates institutional information, an effective academic program review process thus provides evidence that the University meets this criterion. Academic program review processes across the United States are administered by administration, faculty, or both.

Career-oriented education is the core of the mission of Ferris State University. The instruction that meets this goal occurs primarily at the program level, and it is essential to maintain and improve the quality of the programs in the University. Therefore, an effective academic program review process is essential for the health of the University's degree programs, non-degree programs, pre-programs, and other non-degree curricular entities.
Definitions

Degree Programs: A set of courses the completion of which leads to the awarding of a degree.

Non-Degree Programs: Programs that do not grant degrees but do have set coursework requirements. These programs are typically cross-disciplinary and apply to students throughout the University. Examples include the Honors Program and the University General Education Program.

Pre-Programs: Degree programs that are not offered by a specific department or program. Typically associate degrees, they are most frequently granted to students after completion of a set
number of credit hours or courses. These credit hours or courses may not be specifically identified and may be general in nature. In some of these curricula relatively few students actually finish these degree programs; most transfer into other programs when they are eligible or ready to do so. Examples include Career Exploration, Liberal Arts, Pre-Law, and Pre-Pharmacy.

"Stand-Alone" Minors and Certificates: Baccalaureate degree minors and certificates for which there is no corresponding major program. Examples include Spanish and Religious Studies. "Attached" minors and certificates are reviewed with the degree program of which they are part.

Note: unless otherwise stated, throughout this document the word "program" refers to degree programs, non-degree programs, pre-programs, and other non-degree curricular entities.

Any complex organization such as a university is composed of a number of constituencies with different responsibilities and perspectives. Three major constituencies in the University are the students, the faculty, and the administration. The primary responsibility of the students is to obtain an education. The faculty provides instruction and guides the learning of those students. The administration is responsible for the management of the University and for providing an environment and the resources necessary for the faculty to carry out its responsibilities to students. Clear and continuing communication among these constituencies is essential for optimal function of the University and for an effective academic program review process. It is the obligation of the faculty and administration to ensure quality education for students enrolled at the University. At Ferris State University, academic program review is a collaborative process that is largely faculty driven.
The process described in this document requires the formation of a program review panel (composed predominantly of faculty), which is charged with collecting data concerning the program, evaluating those data, and making recommendations regarding the future direction of the program based on its findings. The Program Review Panel (PRP) report is submitted to the Academic Program Review Council (APRC), a standing committee of the Academic Senate composed of faculty representing all academic units. The APRC evaluates the report and meets with the PRP for a discussion of the report. The APRC then makes recommendations to the Academic Senate, which is composed of faculty representing all academic divisions of the University. The recommendations of the Academic Senate are submitted to the Provost. Based on the recommendations of the Academic Senate, the PRP report, the APRC recommendations, and any other documentation, the Provost makes recommendations to the University President concerning each program. The University President may accept the recommendations of the VPAA or disagree with them.

The central role of the faculty in the academic program review process does not diminish the importance of input from, or supplant the responsibilities of, other constituencies in the University. During the process of preparing its report, the PRP solicits input from other stakeholders, including current students, alumni, employers of graduates, advisory committee members, faculty members who teach in the program, the Department Head/Chair, and the Dean. The Department Head/Chair and the Dean are involved with the development and writing of the report throughout the process. They are also invited to present their views by meeting with the APRC.
Implementation of the recommendations made by the Provost and approved by the President with respect to curricular matters is the responsibility of the faculty in the program, the Department Head/Chair, and Dean of the College. Allocation of fiscal and human resources necessary to implement the recommendations is at the discretion of the administration.
II. An Overview of the Academic Program Review Process

A. Goals of Academic Program Review

It is at the program level that the mission of Ferris State University to "…prepare students for successful careers, responsible citizenship, and lifelong learning" is truly accomplished. As a consequence, programs must respond to advances in knowledge and changes in the workplace and technology if the University is to maintain its vitality. The academic program review process provides an opportunity for the faculty and administration to evaluate the goals and effectiveness of the program and make appropriate changes that will lead to improvement in the quality of instruction, improved career and life preparation for students, and effective use of University resources. The goals of academic program review include:
1. Assist programs in identification, evaluation, and assessment of their mission and goals
2. Assist programs in determination of their relationship to the Mission of Ferris State University
3. Assist programs in evaluation of their effectiveness in preparing students for a career or further education
4. Assist programs in assessing the quality of instruction, evaluating instructional methodology, and identifying strengths and weaknesses in their curriculum
5. Assist programs in identification of existing resources and determination of the resources needed to carry out their mission and goals
6. Assist programs in the development of clearly defined and measurable student learning outcomes at both the course and program levels
7. Contribute to the effort of the University to build a culture of academic quality and excellence, including the goals of good citizenship and understanding of diversity
8. Assist the University in evaluation of the viability, value, quality, effectiveness, and efficient use of resources for the academic programs at Ferris State University
9. Provide direction and priorities for the University that can be used for needs assessment, resource allocation, and planning
10. Provide structure, a plan of action, and information for continuous program improvement
B. Report Guidelines Summary

The following guidelines should be used in conducting program reviews. These guidelines should help (1) reduce the amount of documentation required in the program review process and (2) focus
the review on program goals, how well the program has done to date in meeting these goals, and the future actions needed to meet the goals. These guidelines are:
1. The report will be goal-oriented. Specific goals should be stated for the program and the attainment of those goals should be the focus of the program review report. The goals should reflect the University's mission and the departmental, college and divisional strategic plans.
2. The report will look at the program as a whole. The focus will be on the program, not on individual courses.
3. The report will be forward-looking. It will focus not only on where the program has been but also on where the program wishes to go (its goals). Using data provided to or generated by the department, it will analyze and assess whether the goals are appropriate to the discipline, the needs of students in the program, etc.
4. The focus of the report will be both descriptive and assessment-oriented. The report will evaluate progress toward program goals rather than merely document the status of the program. It will analyze available data, both quantitative and qualitative, that has been provided to or generated by the department, to assess the program's progress in meeting its goals. (For example, do responses from employers indicate the program is successfully preparing its graduates for the workplace, if such preparation is one of the goals of the program?)
5. Recommendations will be expressed in terms of action. Recommendations for action will indicate who will do what specific tasks, and when.
6. The Program Review process will be continuous.
C. Summary of Procedure
1. As part of the general APR process, the review begins in the fall semester of the year before the program report is due.
2. In the absence of a faculty group, the college Dean (or other senior administrative officer) appoints a representative of the degree-granting college or the designated program coordinator to be part of the review process.
3. The next steps in the APR process are the completion of the review panel, development of a review plan, and the creation of a budget.
4. Following approval of the makeup of the panel by the APRC chair and approval of the budget by the Office of Academic Affairs, the review panel develops surveys for the following groups (where they may be identified):
Alumni
Current students
Advisory committee members
Relevant faculty (i.e., those directly involved in offering the program's requirements)
5. The panel then conducts the surveys. In sessions that include both the Dean and the Dean-appointee or designated program coordinator, information gathered from the surveys is summarized, analyzed, and used to draft the various sections of the report.
6. The final report includes sections written by the Dean and the Dean-appointee or designated program coordinator. The Program Review Panel (PRP) Chair (typically the Dean-appointee or the designated program coordinator) develops a schedule that delineates responsibility and deadlines for completion of writing the APR report.
Initial submission of the final report is due to the APRC Chair the second Monday in June, the summer after the process is initiated. It is the responsibility of the PRP Chair, the Dean, and the Department Head/Chair to see that the deadline is met. The APRC Chair will read the report and return it to the PRP Chair with recommendations for improvement no later than the second Monday in July. The final report (including the required number of copies for the APRC members) is due to the APRC Chair the second Monday in August.

The APRC reads and discusses the report. Following initial review of the written description of the program, the APRC meets with the PRP to discuss the report and the program. Information gathered from the report and the interview with the panel is used to formulate a recommendation for the program, including suggestions for program improvement. This recommendation is forwarded to the Academic Senate, usually in mid-November, for its consideration. (At the same time, the Department Head/Chair, Dean, and Provost receive copies of the recommendation.)

After the Senate acts on the recommendation, it is passed on to the Provost. Steps to implement the recommendation and program improvements are considered and acted on by the VPAA, President, and Board of Trustees of the University. Implementation of the recommendations is the responsibility of the faculty of the program, the Department Head/Chair, the Dean of the College, and the Provost.
III. Policy: The Academic Program Review Process

A. Academic Program Review Council

Members of the Academic Program Review Council (APRC) are appointed for three-year renewable terms by the Executive Committee of the Academic Senate. The Council shall include eleven faculty members, preferably tenured:
one from each college,
one FLITE librarian, and
two at large.
No more than two faculty members from any one college should serve on the APRC. The APRC Chair is appointed by the Executive Committee of the Academic Senate for a one-year term. The APRC normally operates as a committee of the whole. To facilitate timely and effective review, however, the APRC can at its discretion divide itself into subcommittees. Though some reviewing
work may be split among subcommittees, decisions made by the subcommittees will be ratified by the APRC as a whole.
B. Creating the Program Review Panel

Each summer the APRC Chair notifies the programs which are scheduled to begin the review that academic year. Faculty members, the Department Head/Chair, and the Dean for the programs under review will be invited to attend a program review orientation facilitated by the APRC Chair. This meeting is typically held during the week prior to the beginning of fall classes.

Each program (or cluster of programs) which is scheduled for review must form a Program Review Panel (PRP). The Department Head/Chair will convene a meeting with the faculty to provide input on membership selection for the PRP, of which he/she is a member. The panel shall consist of the following:

1. A faculty member, preferably tenured and from the program, to chair the PRP. The APRC will seek the advice of the Department Head/Chair and faculty in appointing the Chair. The Chair has principal responsibility for writing the report. It is suggested that the Chair be available during the summer.
2. The Program Coordinator, and the Head/Chair of the department in which the program is located.
3. Two program faculty, where possible.
4. An individual with special interest in the program. This person could be an alumnus/alumna, an advisory committee member, an adjunct faculty member, or an interested faculty member from outside the program.
5. A faculty member from outside the college.

Note: The names of PRP members should be submitted to the APRC Chair as soon as the panel is formed.
C. Preparing the Budget

The VPAA will annually set aside a designated amount of funds for each program panel (see appendix for a breakdown of expenses). The Department Head/Chair will notify the VPAA of the account number into which the funds will be transferred. If the PRP believes that its process will exceed the designated amount, it may submit to the VPAA (with a copy to the Chair of the APRC) a budget containing all anticipated expenses the panel may incur in the process of gathering data and preparing the report. Typically these expenses include such items as copying, telephone, clerical,
and postage. After the VPAA's office has approved a budget, the necessary funds will be transferred from Academic Affairs into the account from which the department will pay the expenses of the review.
D. Preparing the Evaluation Plan

The PRP should prepare an Evaluation Plan using the format of the sample document (see Appendix) and submit it to the Chair of the APRC for approval. With non-degree programs, pre-programs, and other non-degree entities, information about some survey items may be difficult or impossible to acquire. However, where such information can be obtained and would serve to make a point, inclusion is required.

The Program Review Panel (PRP) will meet as soon as possible after its formation to undertake the following tasks:

1. Review the information contained in the Administrative Program Review document.
2. Develop a statement in which the purpose and scope of the review are articulated.
3. Assign a leader and a target date for each of the activities in the list that follows:
Graduate follow-up survey
Employer follow-up survey
Student (graduating and current) evaluation of program
Faculty perceptions of program
Advisory committee perceptions of program
Labor market analysis
Evaluation of facilities and equipment
Curriculum evaluation
4. Determine data collection techniques and information sources. The survey instruments must be designed and distributed, in consultation with Institutional Research and Testing, to reflect general aspects of program review as well as the specific nature of the program itself. The panel must determine the number of individuals in each category to be surveyed. It is important that the results of these surveys be statistically valid.
The APRC Chair will review the plan using criteria of soundness and ability to generate sufficient data to support conclusions. The Chairs of the APRC and the PRP will work out any plan deficiencies.
E. Style Guide Suggestions

The program review process can be made more efficient and effective by presentation of a well-written and well-organized report. The Chair of the APRC has examples of past reports; these may provide ideas for presentation and organization. The following suggestions may also help in preparing the report.
Include a table of contents including section names, subsection names, and page numbers.
Use tabs or dividers to indicate sections.
Use consecutive page numbers within each section of the report.
Label pages with both section and page numbers.
Present all numerical data in both text discussion and table formats. Include analysis/interpretation of all data. Include both raw numbers and percentages.
F. Writing the Program Review Panel Report

Each PRP will conduct its review in accordance with the approved plan; the report should include the elements described in Section IV: Report Content Guidelines. The development and writing of the report should follow these guidelines:
After the PRP collects the data, it will provide the results from the data collection phase to the Dean.
The PRP will invite the Dean to attend a meeting in which the results of the analyses are discussed and input is solicited from all individuals in attendance regarding the general health of the program, future goals, adequacy of resource allocation, and recommendations for program rating.
The PRP Chair will coordinate the development of a schedule that delineates responsibility and deadlines for completion of writing the APR report.
The PRP Chair will call meetings during the report writing phase to provide members of the PRP an opportunity to critically discuss and edit the draft as needed throughout the compilation of the document.
The Department Head/Chair will submit a draft of his/her analysis of the health of the program, future goals, and adequacy of resource allocation for inclusion in the APR report. This draft will include a discussion of his/her perception of the relationship of the program to the Ferris State University mission; the program's visibility and distinctiveness; the program's value; the characteristics, quality, and employability of students in the program; the quality of curriculum and instruction; the composition and quality of the faculty; and the adequacy of facilities and equipment. Necessary supporting data should be included.
The PRP will provide a near-final draft of the report, including the Department Head/Chair's analysis, to the Dean for review.
The Dean will submit a draft of his/her analysis of the health of the program, future goals, and adequacy of resource allocation for inclusion in the report. This draft will include a discussion of his/her perception of the relationship of the program to the Ferris State University mission; the program's visibility and distinctiveness; the program's value; the characteristics, quality, and employability of students in the program; the quality of curriculum and instruction; the composition and quality of the faculty; and the adequacy of facilities and equipment. Necessary supporting data should be included.
The PRP will invite the Dean to attend a meeting in which the report is discussed by all individuals in attendance.
The PRP is responsible for editing and submitting the final report to the APRC. It is the responsibility of the Department Head/Chair and the Dean to ensure that the report is submitted by the designated deadline.
Non-Degree Programs: Every attempt should be made to address the issues identified in the Guide for Participants. Where items or issues are deemed not applicable, or the information is unavailable, a response of Not Applicable should be stated. APRC will review these responses and request additional information if judged pertinent.
Pre-Programs: Where specific students within the program cannot be identified because students do not declare their major prior to applying for the degree, the information should be based on the students receiving the degree since the degree's inception or over the last four years, broken down by year. In addition, since pre-programs are not conventional degree programs, questions relating to program activities should be ignored.
G. Submitting the Report
Each PRP will conduct its review in accordance with the approved plan and will submit 14 copies of the report, each contained in a 3-ring notebook, to the Chair of the APRC. The report should fit in a notebook no thicker than 1 ½" to 2". A complete copy of the report will be transmitted electronically to the Chair of the APRC so that it may be posted on the Academic Senate webpage and transmitted to the Provost. At the same time the report is sent to the APRC, a paper copy will be sent to the Department Head/Chair and the Dean of the College. A functioning academic program review process is a requirement of the University's institutional accreditation, and in the event that a PRP fails to submit a report, or submits an unsatisfactory report, APRC will review available data and make appropriate recommendations regarding the future of the program.
H. APRC Review of the PRP Reports
After the APRC reviews and analyzes the PRP report, which includes sections written by the Department Head/Chair and Dean of the College, the APRC will meet with members of the PRP to discuss the report. The APRC may submit written questions concerning the report to the PRP prior to the meeting in order to clarify information presented in the report. The PRP is encouraged to invite the Dean to this meeting. If the PRP elects to meet separately with the APRC, the Dean will be invited to meet at another time with the APRC.
I. APRC Recommendation
The APRC will submit to the President of the Academic Senate its recommendation regarding the program under review. The recommendation should do the following:
1. Assign one of the ratings (see the list in J) to the program with respect to its future status.
2. Articulate the determinants involved in the assignment of a particular rating to the program. The strengths and deficiencies of the program should be elucidated in such a fashion that their impact in arriving at the assigned rating is clear.
3. In cases other than discontinuation of the program, specify actions needed to correct the weaknesses of the program and enhance its strengths. Additionally, measures to be taken that are consistent with the assigned rating must be presented. In the case of a program slated for enhancement, the APRC should specifically state the actions it recommends to arrive at such an outcome.
The PRP Chair, the Department Head/Chair, the College Dean, and the VPAA shall receive copies of the APRC's recommendations at the same time they are sent to the President of the Academic Senate. The APRC will meet with the Executive Committee of the Academic Senate and the VPAA prior to the dissemination of the recommendations to the full Academic Senate.
J. Program Review Ratings
Ratings are assigned based on the program's status with regard to the following categories (found in IV: Section 5):
Relationship to FSU Mission
Program Visibility and Distinctiveness
Program Value
Enrollment
Characteristics, Quality, and Employability of Students
Quality of Curriculum and Instruction
Composition and Quality of the Faculty
Continue the Program: The program's status with respect to the categories in Section 5 of the report merits continuation. Minor modifications may be needed.
Continue the Program with Enhancement: The program's status with respect to the categories in Section 5 of the report merits continuation. The program's status with regard to several of the categories is significantly high, and its less satisfactory status with regard to the other categories could be significantly improved by the allocation of additional resources. Such a program enhancement may involve additional faculty/staff, equipment, or other resources, and/or expansion in enrollment.
Continue the Program with Reporting: The program's status with respect to the categories in Section 5 of the report merits continuation. However, documented problem areas exist, and the faculty and administration of the program will be asked to report on their progress in solving these problems.
Continue the Program with Redirection: The program's status with respect to the categories in Section 5 of the report merits continuation. However, the program needs a curricular redirection, and the faculty and administration of the program will be asked to report on their progress in carrying out this redirection.
Continue the Program with Reduction: Although the program's status with respect to the categories in Section 5 of the report merits continuation, the program lacks visibility and distinctiveness, the job market for its graduates is diminishing, or enrollment is declining precipitously. It should therefore be reduced in enrollment capacity or resources.
Discontinue the Program: The program's status with respect to the categories in Section 5 of the report is such that the evidence suggests the program should be terminated.
K. Academic Senate Recommendation
The Academic Senate will discuss the recommendation submitted by APRC. At the conclusion of Academic Senate deliberations on each program, the President of the Academic Senate will submit the Senate's recommendation to the VPAA.
L. VPAA's Recommendation
The VPAA will review the recommendation of the Academic Senate, the PRP report, and any other relevant documentation compiled through the APR process and Academic Senate deliberations. Prior to sending his or her recommendation to the University President, the VPAA may choose to discuss the recommendation with the Executive Committee of the Academic Senate. No recommendation can come from the Executive Committee that is different from the one voted by the Academic Senate.
M. University President's Recommendation
The University President may accept the recommendation of the VPAA or disagree with it. He or she must inform the President of the Academic Senate of his/her decision regarding the program under review. If the University President's decision is in conflict with the Academic Senate's recommendation, and if the decision involves the reduction or discontinuation of a program, a conference committee shall be formed in accordance with Section 8 of the Charter of the Academic Senate.
N. Implementation of Recommendations
Academic program review cannot be effective without appropriate implementation of recommendations. This requires feedback from and accountability by both faculty and administration. Failure to address and/or follow through on recommendations brings into question the value of the process of academic program review. For programs with ratings of Continue, Continue with Enhancement, Continue with Reporting, or Continue with Redirection, the following steps are to be taken, using the provided database for ease of communication:
1. Creation of an action plan
a) The Department Head/Chair will invite the Dean and the PRP to discuss the development of an action plan based on the recommendation of the VPAA. The action plan will include a delineation of action steps, the individuals responsible for completing the action steps, a timeline for completion of the action steps, and the budgetary and other resources needed.
b) The PRP will complete the development of a draft action plan and submit it to the Dean within ten working days after the start of the semester when the APR report was submitted.
c) The Dean will meet with the PRP within ten working days to discuss the draft action plan, including recommendations for revision.
d) The PRP will forward a final action plan to the Dean and the Chair of APRC within ten working days after discussion of the draft.
e) The Dean will make a recommendation to accept, reject, or modify the action plan within ten working days and forward the recommendation, and his/her rationale for the recommendation, to the PRP, the Chair of APRC and the VPAA.
f) The VPAA will make a recommendation to accept, reject, or modify the action plan within ten working days and forward the recommendation, and his/her rationale for the recommendation, to the PRP, the Chair of APRC, the Dean, and the President.
2. Request for approval of resources needed to implement an action plan
a) The Department Head/Chair will convene the PRP to review the final action plan approved by the VPAA.
b) The Department Head/Chair will submit a request to the Dean by the appropriate deadline for approval of budgetary and other resources needed for implementation of the action plan.
c) The Dean will submit a request to the VPAA by the appropriate deadline for approval of resources needed for implementation of the action plan.
d) The Department Head/Chair will convene the PRP to modify the request for budgetary and other resources if circumstances warrant such a modification.
2. Implementation of an action plan
a) The Department Head/Chair will meet at least monthly with the Program Coordinator and/or faculty to review completion of steps within the action plan until the plan is completed.
b) The Department Head/Chair will meet with the Dean at least monthly to review completion of steps within the action plan.
c) The Dean will meet at least monthly with the VPAA to report completion of steps within the action plan.
d) The VPAA will make a biannual report to the Academic Senate on completion of action plan steps.
e) APRC will post progress made toward completion of action plan steps on the Academic Senate webpage.
For programs with ratings of Continue with Reduction or Discontinue the Program, the creation and implementation of an action plan is the same, except that for programs in which reduction is recommended, Steps 2 b)-d) should be omitted. For programs in which discontinuation is recommended, the following should be addressed in Step 1 a) of the creation of an action plan:
Assessment of the status of current students in the program
Delineation of a plan for students to complete the program of study
Determination of a schedule of classes to allow students to complete the program
Determination of a process whereby technical equipment and supply inventory can be liquidated
O. Review Schedule
The Chair of APRC and the VPAA (with the advice of the College Deans) will update the program review schedule annually, listing the programs to be reviewed over a six-year period. Programs with curricular links (for example, associate and baccalaureate programs in the same area, or all teacher education programs) should be combined into a single review. Reviews of programs with external accrediting bodies should be scheduled so that the information developed can be used for both institutional and external reviews. The Department Head/Chair should inform the Dean of the dates when programs will undergo accreditation review so that the Dean can communicate that information to the VPAA, who will adjust the review calendar so that the program can coordinate completion of review documents for both the accreditation and APR processes. As much as possible, the reviews should be evenly spaced over the six years of the cycle. During the sixth year of each cycle, the VPAA should prepare a new schedule for the next six-year cycle. The VPAA will add new programs into the cycle as they are approved.
P. Reviews Outside of the Established Schedule
Should circumstances arise such that an unscheduled review is thought to be necessary, such a review can be requested by either the program faculty or the VPAA. When an unscheduled review is requested by either party, the appropriate justification and documentation supporting the need for, depth of, and timetable required for a review must be communicated to the APRC. The APRC must advise the VPAA, the President of the Academic Senate, and the program faculty of its decision to make an unscheduled review and the timetable for that review. The APRC's decision to allow an unscheduled review of a program is final and may not be appealed. However, if the APRC does not agree to an unscheduled review, it must justify its refusal to the Academic Senate. The Academic Senate may override the APRC's refusal to make an unscheduled review. If the Academic Senate concurs with the APRC's refusal to make an unscheduled program review, it must advise the University President of its decision. The University President may override this refusal and a review will be scheduled within a reasonable timeframe.
IV. Report Content Guidelines
Note – Underlining indicates items that may not be relevant to all non-degree programs, pre-programs, and other non-degree curricular entities.
Section 1: An overview of the program that addresses broadly the areas of the program included in the Administrative Program Review document. This section should acquaint the reader with the program: mission, history, impact (on the University, state, and nation), expectations, plans for improvement, and any other items that would help the reader fully appreciate the remainder of the report.
A. PROGRAM GOALS.
1) State the goals of the program.
2) Explain how and by whom the goals were established.
3) How do the goals apply to preparing students for careers in, and meeting employer needs in, the community/region/marketplace?
4) Have the goals changed since the last program review? If so, why and how? If not, why not?
5) Describe the relationship of the program goals to the University's mission and the departmental, college, and divisional strategic plans.
B. PROGRAM VISIBILITY AND DISTINCTIVENESS
1) Describe any unique features or components of the program.
2) Describe and assess the program's ability to attract quality students.
3) Identify the institutions that are the main competitors for prospective students in this program.
a) How are these programs similar to and different from the FSU program?
b) What can be learned from them that would improve the program at Ferris?
C. PROGRAM RELEVANCE.
1) Provide a labor market demand analysis: This activity is designed to assess the marketability of future graduates. Reports from the Department of Labor and from industry are excellent sources for forecasting demand for graduates. Request information from your Library Liaison.
2) Describe and assess how the program responds to emerging issues in the discipline, changes in the labor force, changes in employer needs, changes in student needs, and other forces of change.
3) Assess why students come to FSU for the program. Summarize the results of the graduate exit survey and the student program evaluation.
a) How well does the program meet student expectations?
b) How is student sentiment measured?
D. PROGRAM VALUE. Please refer to the faculty survey.
1) Describe the benefit of the program, facilities, and personnel to the University.
2) Describe the benefit of the program, facilities, and personnel to the students enrolled in the program.
3) What is the assessment by program personnel of the value of the program to employers? Explain how this value is determined.
4) Describe the benefit of the program, faculty, staff, and facilities to entities external to the University (services that faculty have provided to accreditation bodies, and to regional, state, and national professional associations; manuscript reviewing; service on editorial boards; use of facilities for meetings, etc.).
5) What services for extra-University general public groups (e.g., presentations in schools or to community organizations) have faculty, staff or students provided? Describe how these services benefit students, program, and community.
Section 2: Collection of Perceptions. The survey sections must include, among other things, a discussion of the techniques used in collecting the information, difficulties encountered during the surveying process, the number and percent of respondents, and an analysis of data in accordance with established methodologies. The survey instruments must be designed and distributed, in consultation with Institutional Research and Testing, to reflect general aspects of program review as well as the specific nature of the program itself. All comments should be included, but the names of individuals mentioned should be deleted.
A. Graduate follow-up survey: The purpose of this activity is to learn from the graduates their perceptions and experiences regarding employment based on program outcomes. The goal is to assess the effectiveness of the program in terms of job placement and preparedness of the graduate for the marketplace. A mailed or e-mailed questionnaire is preferred; however, under certain conditions telephone or personal interviews can be used to gather the data.
B. Employer follow-up survey: This activity is intended to aid in assessing the employers' experiences with graduates and their perceptions of the program itself. A mailed or e-mailed instrument should be used to conduct the survey; however, if justified, telephone or personal interviews may be used to gather the data.
C. Graduating student exit survey: Graduating students are surveyed every year to obtain information regarding quality of instruction, relevance of courses, and satisfaction with program outcomes based on their own expectations. The survey must seek student suggestions on ways to improve the effectiveness of the program and to enhance the fulfillment of their expectations. This survey is mandatory for all program graduates.
D. Student program evaluation: Current students are surveyed to obtain information regarding quality of instruction, relevance of courses, and satisfaction with program outcomes based on their own expectations. The survey must seek student suggestions on ways to improve the effectiveness of the program and to enhance the fulfillment of their expectations. This survey should be conducted during the year before the PRP report is submitted.
E. Faculty perceptions: The purpose of this activity is to assess faculty perceptions regarding the following aspects of the program: curriculum, resources, admissions standards, degree of commitment by the administration, processes and procedures used, and their overall feelings. Additional items that may be unique to the program can be incorporated in this survey.
F. Advisory committee perceptions: The purpose of this survey is to obtain information from the members of the program advisory committee regarding the curriculum, outcomes, facilities, equipment, graduates, micro- and megatrends that might affect job placement (both positively and adversely), and other relevant information. Recommendations for improvement must be sought from this group. In the event that a program does not have an advisory committee, a group of individuals may be identified to serve in that capacity on a temporary basis.
Section 3: Program Profile. Include the Administrative Program Review document in this section. Provide the number and percentage for each variable addressed for each of the years since inception (for new programs) or since the last program review.
A. PROFILE OF STUDENTS.
1) Student Demographic Profile.
a) Gender, race/ethnicity, age (use annual institutional data).
b) In-state and out-of-state.
c) Full-time and part-time.
d) Attend classes during the day, in the evenings, and on weekends.
e) Enrolled in classes on- and off-campus.
f) Enrolled in 100% on-line and/or mixed delivery courses.
g) Discuss how the information presented in (a) through (f) impacts the curriculum, scheduling, and/or delivery methods in the program.
2) Quality of Students.
a) What are the range and average GPA of all students currently enrolled in the program? ACT scores? Comment on this data.
b) What are the range and average GPAs of students graduating from the program? ACT scores? Comment on this data.
c) In addition to ACT and GPA, identify and evaluate measures that are used to assess the quality of students entering the program.
d) Identify academic awards (e.g., scholarships or fellowships) students in the program have earned. Comment on the significance of these awards to the program and students.
e) What scholarly/creative activities (e.g., symposium presentations, other presentations or awards) have students in the program participated in? Comment on the significance of these activities to the program and students.
f) What are other accomplishments of students in the program? Comment on the significance of these accomplishments to the program and students.
3) Employability of students.
a) How many graduates have become employed full-time in the field within one year of receiving their degree? Comment on this data.
b) What is the average starting salary of graduates who have become employed full-time in the field since inception (for new programs) or the last program review? Compare with regional and national trends.
c) How many graduates have become employed as part-time or temporary workers in the field within one year of receiving their degree? Comment on this data.
d) Describe the career assistance available to the students. What is student perception of career assistance?
e) How many graduates continue to be employed in the field? Comment on this data.
f) Describe and comment on the geographic distribution of employed graduates.
g) How many students and/or graduates go on for additional educational training? (Give annual average.) Comment on this data.
h) Where do most students and/or graduates obtain their additional educational training? Comment on this data.
B. ENROLLMENT.
1) What is the anticipated fall enrollment for the program?
2) Have enrollment and student credit hour (SCH) production increased or decreased since the last program review? Supply a table and comment on any enrollment trends.
3) Since the last program review, how many students have applied to the program annually?
4) Of those who apply, how many and what percentage are admitted?
5) Of those who are admitted, how many and what percentage enroll?
6) What are the program's current enrollment goals, strategy, and efforts to maintain/increase/decrease the number of students in the program? Please explain.
C. PROGRAM CAPACITY
1) What is the appropriate program enrollment capacity, given the available faculty, physical resources, funding, accreditation requirements, state and federal regulations, and other factors? Which of these items limits program enrollment capacity? Please explain any difference between capacity and current enrollment.
D. RETENTION AND GRADUATION
1) Give the annual attrition rate (number and percent of students) in the program.
2) What are the program's current goals, strategy, and efforts to retain students in the program?
3) Describe and assess trends in the number of degrees awarded in the program.
4) How many students who enroll in the program graduate from it within the prescribed time? Comment on any trends.
5) On average, how long does it take a student to graduate from the program? Please comment.
E. ACCESS
1) Describe and assess the program's actions to make itself accessible to students. Use examples such as off-site courses, accelerated courses or other types of flexible learning, use of summer courses, multiple program entry points, e-learning, mixed delivery courses, scheduling.
2) Discuss what effects the actions described in (1) have had on the program. Use examples such as program visibility, market share, enrollment, faculty load, computer and other resources.
3) How do the actions described in (1) advance or hinder program goals and priorities?
F. CURRICULUM. The curriculum review section must also contain appropriate check sheets and example syllabi, which may be attached as an appendix.
1) Program requirements. Describe and assess the program-related courses required for graduation.
a) As part of the graduation requirements of the current program, list directed electives and directed General Education courses. Provide the rationale for these selections.
b) Indicate any hidden prerequisites (instances where, in order to take a program-required course, the student must take an additional course; do not include extra courses taken for remedial purposes).
2) Has the program been significantly revised since the last review, and if so, how?
3) Are there any curricular or program changes currently in the review process? If so, what are they?
4) Are there plans to revise the current program within the next three to five years? If so, what plans are envisioned and why?
G. QUALITY OF INSTRUCTION
1) Discuss student and alumni perceptions of the quality of instruction.
2) Discuss advisory committee and employer perceptions of the quality of instruction.
3) What departmental and individual efforts have been made to improve the learning environment, add and use appropriate technology, train and increase the number of undergraduate and graduate assistants, etc.?
4) Describe the types of professional development faculty have participated in to enhance the learning environment (e.g., Writing Across the Curriculum, the Center for Teaching and Learning).
5) What efforts have been made to increase the interaction of students with faculty and peers? Include such items as developmental activities, seminars, workshops, guest lectures, special events, and student participation in the Honors Program Symposium.
6) Discuss the extent to which current research and practice regarding inclusive pedagogy and curriculum infuse teaching and learning in this program.
7) What effects have actions described in (5) and (6) had on the quality of teaching and learning in the program?
H. COMPOSITION AND QUALITY OF FACULTY. Describe and assess the composition of the faculty teaching courses in the program.
1) List the names of all tenured and tenure-track faculty by rank.
a) Identify their rank and qualifications.
b) Indicate the number of promotions or merit awards received by program faculty since the last program review.
c) Summarize the professional activities of program faculty since inception or the last program review (attendance at professional meetings, poster or platform presentations, responsibilities in professional organizations, etc.).
2) Workload
a) What is the normal, annualized teaching load in the program or department? Indicate the basis for determining a "normal" load. On a semester-by-semester basis, how many faculty have accepted an overload assignment?
b) List the activities for which faculty receive release time.
3) Recruitment
a) What is the normal recruiting process for new faculty?
b) What qualifications (academic and experiential) are typically required for new faculty?
c) What are the program's diversity goals for both gender and race/ethnicity in the faculty?
d) Describe and assess the efforts being made to attain the goals in (c).
4) Orientation. Describe and assess the orientation process for new faculty.
5) Reward Structure: e.g., salary, professional development funds, travel funds, UCEL and FSUGR incentive money
a) Describe the reward structure in the program/department/college as it relates to program faculty. Indicate the type of reward and eligibility criteria.
b) Does the existing salary structure have an impact on the program‟s ability to recruit and retain quality faculty?
c) Is the reward structure currently in place adequate to support faculty productivity in teaching, research, and service? If not, what recommendations would you make to correct the situation?
d) Is enhancing diversity and inclusion a component of the reward structure? Please explain.
6) Graduate Instruction (if applicable)
a) List all faculty teaching graduate courses.
b) What percentage of graduate courses is taught by non-tenure-track faculty? Please comment.
c) What are the program's (or department's) criteria for graduate faculty?
d) Have all graduate faculty (including non-tenure-track faculty) met the criteria? Please comment.
7) Non-Tenure-Track and Adjunct Faculty.
a) Please provide a list, for the last academic year, of full-time non-tenure-track and adjunct faculty who taught courses in the program. For full-time non-tenure-track faculty, indicate the length of their appointments and the number of years of service at the University. Comment on the program's ability to retain non-tenure-track faculty.
b) What percentage of program courses is taught by the faculty in (a)? What courses are they teaching? Please comment.
c) Describe the required qualifications (academic and experiential) for the faculty listed in (a). Indicate whether all faculty have met the criteria; if not, what is being done to resolve the situation?
d) Does the program consider the current use of non-tenure-track faculty to be appropriate? Why or why not?
e) If the program is accredited, what position if any does the accrediting body have regarding the use of non-tenured and adjunct faculty?
I. ASSESSMENT AND EVALUATION. Describe and evaluate the program’s assessment
mechanisms. Note: Each program review must be accompanied by a TracDat report designed for Program Review that provides information about the results of assessment implementation at the program level. The TracDat system makes the APR Report available to everyone within the university, and this report must be included. Program Review panels may also elect to produce additional TracDat reports that demonstrate the effectiveness of the program.
1) List and describe student learning outcomes at the course level.
2) List and describe student learning outcomes at the program level.
3) Submit a curriculum map and an explanation of how program outcomes are achieved through course curriculum.
4) Identify how learning outcomes at the course level are measured. Include analysis regarding how well students are meeting course level outcomes.
5) Identify how learning outcomes at the program level are measured. Include analysis regarding how well students are meeting program level outcomes.
6) Describe how assessment results at the course and program levels have assisted in making decisions about pedagogy, learning outcomes, and other course and/or program level actions.
7) List and describe what variables are tracked and why when assessing the effectiveness of the program (e.g. mastery of essentials of subject area, graduation rates, employment rates, pass rates on professional exams).
8) Provide trend data for the variables listed in (7). Compare the data to accreditation benchmark standards if applicable, or provide some other type of assessment of the data.
9) Describe how the trend data in (8) is used to assess the rigor, breadth, and currency of the degree requirements and curriculum.
10) Describe how the trend data in (8) is used to assess the extent to which program goals are being met.
J. SERVICE TO NON-MAJORS. Describe and assess the impact that delivery of service courses offered by the program or the department has on the program.
a) Identify and describe the General Education service courses provided by the program faculty for other departments at FSU.
b) Identify and describe any non-General Education service courses or courses required for other programs. Comment on your interaction with the departments or programs for which the courses are provided.
c) Discuss the impact that the provision of General Education and non-General Education courses has on the program.
d) Does the program plan to increase, decrease, or keep constant its level of service courses? Explain.
K. DEGREE PROGRAM COST AND PRODUCTIVITY DATA. Submit Institutional Research and Testing data. Comment on the data.
L. ADMINISTRATION EFFECTIVENESS
1) Discuss the adequacy of administrative and clerical support for the program.
2) Are the program and/or department run in an efficient manner? Please explain.
3) Are class and teaching schedules effectively and efficiently prepared? Please comment.
4) Are students able to take the courses they need in a timely manner? Please comment.
Section 4: Facilities and equipment
A. INSTRUCTIONAL ENVIRONMENT
1) Are current classrooms, labs, and technology (both on-campus and at off-site locations) adequate? Explain.
2) How does the condition of current facilities impact program delivery? Explain.
3) Describe the program's projected needs with respect to instructional facilities.
4) Describe current plans for facilities improvements and indicate their status.
5) Describe how proposed changes or improvements to facilities would enhance program delivery.
B. COMPUTER ACCESS AND AVAILABILITY
1) Outside of computers in faculty and staff offices, identify the computing resources (hardware and software) that are allocated to the program.
2) Discuss how these resources are used.
3) Discuss the adequacy of these resources and identify needed additional resources.
4) Does an acquisition plan to address these needs currently exist? Describe the plan. Has it been included in the department's or college's planning documents?
5) Discuss the efficacy of online services (including WebCT) available to the program.
6) Discuss the adequacy of computer support, including the support for on-line instruction if applicable.
C. OTHER INSTRUCTIONAL TECHNOLOGY
1) Identify other types of instructional technology resources that are allocated or available to the program.
2) Discuss how these resources are used.
3) Discuss the adequacy of these resources and identify needed additional resources.
4) Does an acquisition plan to address these needs currently exist? Describe the plan. Has it been included in the department's or college's planning documents?
5) Discuss the impact of the adequacy of other types of instructional technology resources, and of the support for these resources, on the program.
D. LIBRARY RESOURCES
1) Discuss the adequacy of the print, electronic, and other resources available through FLITE for the program.
2) Discuss the service and instruction availability provided by the Library faculty and staff with respect to the needs of the program.
3) Discuss the impact of the budget allocation provided by FLITE to your program. Is the budget allocation adequate? Explain.
Section 5: Conclusions based on data analysis derived from Sections 2-4 and on the collective wisdom and judgment of the PRP. In arriving at these conclusions, the PRP should summarize the relationship of the program to each of the following specific categories and any other categories it deems appropriate:
A. RELATIONSHIP TO FSU MISSION
B. PROGRAM VISIBILITY AND DISTINCTIVENESS
C. PROGRAM VALUE
D. ENROLLMENT
E. CHARACTERISTICS, QUALITY AND EMPLOYABILITY OF STUDENTS
F. QUALITY OF CURRICULUM AND INSTRUCTION
G. COMPOSITION AND QUALITY OF THE FACULTY
V. Appendix
A. Academic Program Review Calendar Note: The Academic Program Review Calendar dates change annually. Users should access the current calendar on the APR website at the following URL: http://www.ferris.edu/htmls/administration/academicaffairs/vpoffice/senate/progreviewcounc/
B. Sample Forms
1. Program Evaluation Plan (sample document)
PROGRAM EVALUATION PLAN
LEGAL ASSISTANT PROGRAM
Degrees Awarded: A.A.S. in Legal Assisting
Program Review Panel:
Chair and Program Coordinator: John Kane
Program faculty and Assistant Coordinator: John Vermeer
College of Business faculty: Michael Cooper
Individual with special interest in the Program: R. Dale Hobart
Faculty member outside the College of Business: Sally Krumins
Management Department Chair: Vivian Nazar
Purpose: To conduct a study of the Legal Assistant Program to evaluate its needs and effectiveness so the University can make informed decisions about resource allocations.
Data Collection Techniques
1. Graduate surveys completed in 1989 and 1994
2. Employer surveys from 1995
3. Student evaluation of program and courses from 1994 and 1995
4. Faculty perception of program from surveys to both Legal Assistant faculty and College of Business faculty
5. Advisory Committee perceptions of the program from questionnaire to advisory board members
6. Labor market analysis information from current market indicators
7. Evaluation of facilities and equipment by reviewing the law collection in the library and the adequacy of classrooms and computer facilities
8. Curriculum evaluation information will be taken from the American Bar Association self-study completed in 1995 and the ABA assessment of that information.
Schedule of Events
Activity                          Leader     Target Date
Graduate Survey                   Kane       November 15
Employer Survey                   Kane       November 15
Student Evaluation                Vermeer    December 1
Faculty Perceptions of Program    Vermeer    December 1
Advisory Committee Perceptions    Kane       December 1
Labor Market Analysis             Krumins    December 1
Evaluation of Facilities          Krumins    December 1
Curriculum Evaluation             Kane       December 1
2. Budget (sample document)
MEMORANDUM
TO: Doug Haneline, Chair, Academic Program Review Council
FROM: Bill Killian, Associate Professor, Industrial Chemistry Technology Program
      Dave Frank, Department Head, Physical Sciences
SUBJECT: Proposed budget for Industrial Chemistry Technology program review panel
DATE: October 30, 1995
Attached is the proposed budget for the Industrial Chemistry Technology review panel. Please contact us if you have any questions.
Student Surveys (375)
  Copying Costs              $  28.13
  Mailing Costs                206.25
  Return Envelope Printing      25.50
  Return Mailing Costs         146.25
Advisory Board Surveys
  Copying and Mailing            7.00
Student Wage Support
  40 Hours at $4.25/hour       170.00
Phone Expenses                  50.00
Final Document
  Copying Costs                 90.00
TOTAL                       $  723.13
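The sample budget's line items can be cross-checked with a short script (a sketch; the item names and amounts are taken from the memo above):

```python
# Cross-check the sample budget: line items should sum to the stated total.
line_items = {
    "Copying Costs": 28.13,
    "Mailing Costs": 206.25,
    "Return Envelope Printing": 25.50,
    "Return Mailing Costs": 146.25,
    "Advisory Board Copying and Mailing": 7.00,
    "Student Wage Support (40 h at $4.25/h)": 40 * 4.25,
    "Phone Expenses": 50.00,
    "Final Document Copying": 90.00,
}
total = round(sum(line_items.values()), 2)
print(f"Total: ${total:.2f}")  # matches the memo's TOTAL of $723.13
```

The figures check out: the eight line items sum exactly to the $723.13 total stated in the memo.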
C. Six-Year Academic Program Review Cycle
Note: The Academic Program Review Cycle is updated annually. Users should access the current cycle on the APR website at the following URL:
http://www.ferris.edu/htmls/administration/academicaffairs/vpoffice/senate/progreviewcounc/
Last Update: 08/02/2010
FERRISCONNECT 2.0 LEARNING MANAGEMENT SYSTEM COMMITTEE REPORT
COMMITTEE MEMBERSHIP
Doug Blakemore, MISM Faculty
Kim Hancock, Pharmacy Faculty
Sue Hastings-Bishop, Recreation & Leisure Management Faculty
Mary Holmes, E-Learning Systems Administrator
Bill Knapp, Coordinator of Instructional Technology
Meegan Lillis, Instructional Technologist
Rebecca Sammel, Languages & Literature Faculty
SUMMARY In 1998 Ferris State University adopted WebCT Standard Edition version 1.3 as its first learning management system (LMS) and has continued to use versions of the WebCT platform as its primary LMS ever since. In 2006 the University began the migration from WebCT Campus Edition to WebCT Vista Enterprise, around the same time WebCT was acquired by Blackboard, Inc. When Ferris first began using the Blackboard Vista product there were numerous problems concerning product reliability, customer support, and vendor responsiveness. The current system has had a history of known issues which have prompted Ferris to purchase and maintain additional nodes within our clustered environment, adding to the complexity and cost of administering our Blackboard Vista LMS installation. As of 2013, Blackboard Vista 8 will no longer be supported. The emerging Blackboard Learn 9 product is technically an upgrade of Blackboard's Academic Suite. It is an early iteration in Blackboard's "NG" unification strategy of converging all of their LMS products (Academic Suite, Vista/CE, and Angel). Migrating to Learn 9 from Vista will require an entirely new installation. Our current license with Blackboard expires summer 2011. The decision to renew with Blackboard, and for how long, will depend on whether we intend to continue using their products beyond 2013. Regardless of which platform we choose to replace Vista, the transition to another system could take up to two years. There may be no better time to consider alternative LMSs than the present.
SYSTEMS CONSIDERED The most frequently referenced learning management systems (LMS) in higher education include Angel, Blackboard, Desire2Learn, Moodle, and Sakai. Although numerous other LMSs are available,
most are used for business and industry and are not specifically designed to integrate with Banner or other educational information technology systems.
Angel has been acquired by Blackboard and will be phased out under Blackboard's unification strategy. For this reason the committee did not include Angel in the LMS review process. Desire2Learn is the only other proprietary system with significant market share in higher education. Open Source (AKA Open Community) products represent the fastest growing market share of LMS in K-20 education. Moodle and Sakai are the leaders in open source LMS solutions.
OPEN SOURCE Open source systems are licensed under the GNU scheme, and are therefore free to download, install, and use without charge. The software may be modified and customized to suit the user's needs without concern over licensing violations. Nevertheless, operating an open source system is not free, and may, in some cases, actually cost more than supporting a proprietary system once potential technical support costs are taken into consideration. The support model for open source is significantly different from that of proprietary systems; rather than going to the vendor, adopters of open source solutions go to the open community for support: those developers and institutions that have likewise adopted the LMS for their campuses. There are hundreds of modules and plugins created and shared by the open source development community. See http://moodle.org/mod/data/view.php?id=6009. Many educational institutions elect to purchase hosted open source services from one of several vendors. The license is free, but the hardware, installation, upgrades, and technical support are managed by the vendor. This option has the potential to be the most cost effective.
FERRISCONNECT Over the past several years Ferris has experienced a significant increase in online enrollments, a notable share of which come from on-campus students. As more courses and programs are delivered either fully online or in a blended format, there is increased demand for tools that support student-to-student collaboration (see features comparison).
REVIEW PROCESS The subcommittee formed four teams, each responsible for investigating one LMS. Teams collected information from vendor and institutional websites, as well as through telephone and email contact. Each team was asked to identify higher-education institutions currently using the platform and individuals from those organizations who would be willing to provide a demonstration of the features and functionality from a teaching and learning perspective. Hour-long presentations were scheduled and announced campus-wide, inviting faculty, staff, and students to attend. Feedback was gathered and the presentations were posted online: http://wiki.ferris.edu/elearning/index.php/FerrisConnect2 Feedback from the presentations can be found in Appendix A.
FEATURES COMPARISON A survey of faculty using FerrisConnect (spring 2008) for all modalities (online, blended, enhanced) indicates the top essential tools include the syllabus, gradebook, document sharing, web links, assignments, calendar, announcements, learning modules, and quizzes. Breaking out responses from faculty teaching online and blended, we learn that the use of group discussion is another top feature (this is important as online enrollments are increasing).
Blackboard Desire2Learn Moodle Sakai
Syllabus X * * X
Documents X X X X
Gradebook X X X X
Web Links X X X X
Assignments X X X X
Calendar X X X X
Learning Modules X X X X
Quizzes X X X X
Announcements X X X X
Discussions X X X X
All of the CMS reviewed include features which are comparable to the most used tools: syllabus*, gradebook, document sharing, web links, assignments, calendar, announcements, learning modules, and quizzes, as well as group discussions.
*The syllabus tool in Blackboard offers the ability to build a syllabus or to link to a file. Each of the other CMS also provides this functionality by using either a learning module or a file link. Other tools: Mail, Grading Forms (Rubrics), Goals, Who's Online (instant messaging)
Blackboard Desire2Learn Moodle Sakai
Group Assignment X X X (workshop) ?
Group Discussions X X X X
Self-test X X X ?
Surveys X X ?
Chat X X X X
Whiteboard X X X
Media Library X X X
Blogs X X X (database) X
Journals X X
Rubrics X
Goals X
Instant Messaging X
Mail X X X (messages) X
Student Webpage X X X ?
Least used tools (used rarely or never): According to the survey, the FerrisConnect features least used include student web pages, blogs, journals, chat/whiteboard, media library, surveys, and self-tests. Should a CMS that lacks one or more of these tools rate highly in all other areas, we may still wish to consider it if the advantages elsewhere are great. None of the systems reviewed had all of the features of Bb Vista 8 (our current FerrisConnect version), including Blackboard Learn.
Each of the alternative LMS reviewed offered additional features not available with the Vista product.
Blackboard Desire2Learn Moodle Sakai
Wiki X X
RSS News Feed X X
ePortfolio X X
Repository X X
INTEGRATION WITH EXISTING SYSTEMS Respondus Products: Respondus, StudyMate Class Server, and Respondus Lockdown Browser integrate with Blackboard and Moodle, and with Desire2Learn (except StudyMate Class). Tegrity can be integrated with all four systems: Blackboard, Moodle, Sakai, and Desire2Learn. e-Instruction / CPS currently works only with Blackboard/WebCT; however, vendor documentation indicates they plan to offer integration with Moodle, Sakai, and Desire2Learn in the near future.* Plagiarism prevention services: SafeAssign is a proprietary product of Blackboard, and the service is included with the Blackboard license fee. In the event a different product were selected, the University would need to consider the cost of another plagiarism prevention program such as Turnitin.com. Turnitin.com is integrated with Blackboard, Desire2Learn, and Moodle, but not with Sakai at this time (http://turnitin.com/static/integration.html).
Blackboard Desire2Learn Moodle Sakai
Respondus X X X
StudyMate Class X X
Lockdown Browser X X X
CPS X * * *
Turnitin.com X X X
Tegrity X X X X
Other Issues
Blackboard Desire2Learn Moodle Sakai
SSO X X X X
R-T Banner * X X
Crosslisting X X
Single Sign-On (SSO) is currently accomplished using the Luminis portal. Luminis supports Blackboard and Desire2Learn for the MyCourses feature; the CPIP connector in Luminis allows all four systems to use LDAP, so a direct link (icon) to any of the CMSs works without a second login. Real-time (RT) integration with Banner is not currently available for Blackboard Learn 9, but is a planned feature for 10/NG. Banner integration with Desire2Learn is accomplished via the Luminis Message Broker. Real-time integration with Banner is available for Moodle as an Open Community solution. Crosslisting is supported for Blackboard. Moodle has a built-in crosslisting feature which allows faculty/designers to crosslist or uncrosslist their courses at their own discretion.
FINDINGS Concerning features, all of the LMS considered met the requirements for the most frequently used features. Each had some unique features such as e-portfolio, repository, wikis, and widgets, which offered added value.
Desire2Learn was perhaps the most feature-rich solution, but lacked some essential integration features, including the ability to crosslist course sections, which many of our faculty use extensively. Desire2Learn is also not compatible with StudyMate Class Server, an increasingly popular application used with our current system.
Sakai met the basic features list but otherwise was the least feature-rich solution. Sakai lacks essential integration with Banner and the ability to crosslist courses. Sakai also appears to have the least market share in higher education.
The consensus of the FerrisConnect 2.0 committee is that the two systems which meet the overall requirements, including critical integration functionality, are Blackboard Learn 9.1 and Moodle.
Blackboard is likely to offer the look and feel most consistent with what we are accustomed to, and therefore provide the easiest transition for faculty.
Moodle is the fastest growing open source solution, and many campuses have already migrated, or are in the process of migrating, from Blackboard and other proprietary LMS to open source. The use of hosted services such as Moodlerooms and Remote-Learner may offer significant cost savings with the open source LMS. The realized savings in license fees, hardware, and systems administration would be offset to some degree by the costs of hosting service fees and staff training for application administration.
RECOMMENDATIONS As next steps, the subcommittee recommends running limited pilot projects for the two systems (Blackboard Learn 9.1 and Moodle 1.9), to be evaluated by the faculty and students participating in the pilots, as well as by the staff supporting the project.
APPENDIX A: PRESENTATION FEEDBACK
BLACKBOARD LEARN
WHAT DID YOU LIKE ABOUT BLACKBOARD LEARN? Gradebook - looks good with the running totals
I like the hot spot question addition
The ability to the first day of a course
Running totals
Choose first page students see
Improved HTML Editor
Appears to be a clean conversion from 8.0 to 9.1 – faculty can do their own import
Ability to customize is improved – entrypoint
Mashups: Flickr, Slideshare, Twitter
Able to rename tools
Discussion, wiki, blogs are separated out
Has some nice enhancements to assessments
Grade calculation much improved
WHAT DID YOU DISLIKE ABOUT BLACKBOARD LEARN? Grading form – not being the same [as] in Vista
Still seems very click-intensive
Grading rubric is not as powerful – requires additional tool of outcomes
What else has been removed to separate products? I don’t like that features are taken away in 9 and put back in 9.1 Not sure what to expect in 10.
WHAT OTHER QUESTIONS DO YOU HAVE ABOUT BLACKBOARD LEARN? We could still have a library tab right?
DESIRE2LEARN
WHAT DID YOU LIKE ABOUT DESIRE2LEARN? Customizable toolbar and rich media embeds
Grading looks good
Feature rich
Easy to use
Good system – could work for us
Appears easier to use for faculty & students
The robust file management system – especially for consolidation of student assignments
The gradebook feature, which allowed for the complete display of all students and assignments, with the opportunity to enter information without having to go assignment by assignment
The test question analysis provides more features of value
Widget displays
WHAT DID YOU DISLIKE ABOUT DESIRE2LEARN?
WHAT OTHER QUESTIONS DO YOU HAVE ABOUT DESIRE2LEARN? Stay signed in to the system?
What is the ease of forming groups and later changing them for the same project?
MOODLE
WHAT DID YOU LIKE ABOUT MOODLE? I could use it
Not much different from what we have now ( so the question is…. Why change?)
Can easily copy material from one course to another.
Easier to switch roles – instructor/editor/student
Context sensitive help
Allows creation of Moodle webpage & create a book of content resources
Voting & db – more activities
Used in many high schools, so students have familiarity – various districts around the state
Calendar
Email notices & popups
Students flexibility in organizing how they see their content
Calendar looks good
Flexible design
Lessons have branching options
Built in wiki
Announcements go to email too (which email?)
Nested outline for all courses in left hand pane
Students help requests compared to whatever they used before
Teacher glossary/hot potato could work like studymate
Calendar will popup so you can see what is due for all courses, then drill down to specific course
All related information saved within a book – evidently books are similar to [learning] modules
News, announcements could work similarly to what we have in [Vista 8]
All instructors teaching online are listed
It was reported there are fewer student complaints
System was created by educators
The IM functionality seems useful
Seems to be easy to copy course content
Ability to easily see assignment outside the modules
Ability to turn off small areas within course content
Help pages can be modified to fit the institution
Instant messaging turned to an email
Glossary word a day
Video tutorial repository
Flexibility
Navigation straight-forward
Fewer student issues/questions
Help files are editable/customizable
Customizable look & feel
Organizes information in chunks (Moodlebooks)
Can embed a conference tool like Adobe Connect
Build a wiki into a course
Flexible choices
Rss feeds
Quiz-building WYSIWYG
WHAT DID YOU DISLIKE ABOUT MOODLE? Visually, not very appealing. In the state of the art, icons (graphics) tend to be expected.
Reading the title for every entry or option slows down processing (the human decision-making)
No built in student view
Flexible design – too many options for faculty may result in too many different “look and feel” for students
Can't direct students as specifically to content compared to [Bb Vista] learning modules
Extreme flexibility almost makes it too cluttered
Required on-campus sessions would be tough for some of our students
Questions saved from a chat session when you were not available go directly to your email
Labels can’t easily be changed
Not having three views with one logon
Survey tool is limited re: formats
Scrolling linear format
WHAT OTHER QUESTIONS DO YOU HAVE ABOUT MOODLE? Concerns about how calculated questions are handled and how robust the engine for that may be
What would it take for the hundreds of faculty and several hundreds of courses to be transferred/transformed to Moodle?
More about technical infrastructure & support
Is there integration for the other related applications (e-instruction, tegrity, respondus, etc.)
How easy is it to copy forward the materials from a prior offering of the course?
Is a “book” a learning module?
Is instructor access same as designer?
How do you input questions into assessment tools?
What format do files need to be saved in (same as our current platform)?
Can you forward email?
Can you repeat calendar items easily?
Does it interact with Banner (automatically populate)?
Authentication through LDAP?
Staff support needed & related costs?
Integration capabilities?
Student tracking and reporting tools?
39,127 users? How was that determined? Are the reporting tools accurate?
How long did it take to create training materials?
How many staff support Moodle Interface Design work (not course design)?
What O/S’s does Moodle run on?
Does Moodle have failover features? How many users per server?
Is there a back-end database? How many staff are needed to support it?
Is Moodle integrated with portal & student support system?
Are third-party tools integrated w/ Moodle (Respondus, Tegrity, etc.)?
Can you integrate with portals and student information systems?
If integrated w / campus systems how many staff to support?
SAKAI
WHAT DID YOU LIKE ABOUT SAKAI? Announcement tool that can be emailed
Student participation graph
Dropbox folder / email
Forums / threaded discussions
Easy to create groups
Link to campus email can be added [to menu]
Built-in wiki
Can set [up] own development site
Help files – instructions created by the TS folks - outsourced
Can add attachment to announcement & send email
Students can download folders & resources
Easy to move to & from instructor view.
Very simple (maybe too simple).
Syllabus tool allows to focus on current activities.
Has long list of available tools, including blogs and podcasts.
Can click on courses from central location
Nothing except its open source
Stats
Adding non-student participants ( I assume this could work for embedding librarians in a course)
I like the screen layout
WHAT DID YOU DISLIKE ABOUT SAKAI? (Resources) Folders are easier than Modules? Lacking robust organization of content
1-way email – sent out
Forums / threaded discussions can [become] complex & confusing w/ large & lengthy discussions
Gradebook learning curve
Modules only allow file content, not discussions, assessments, etc.
No grading rubric
I don’t think the tools (assignments, gradebook, etc) are as good as our current system
No grading form
WHAT OTHER QUESTIONS DO YOU HAVE ABOUT SAKAI? Is there a rubric tool? Outside the gradebook?
Can my current FerrisConnect course structure migrate simply to Sakai? I would want a 1-click migration
Recordings of Presentations and Demos
RESOURCES: Blackboard NG (next generation) http://www.blackboard.com/sites/projectng/
Campus Computing project (November 2009) http://www.campuscomputing.net/sites/www.campuscomputing.net/files/CampusComputing2009_2.pdf
EDUCAUSE Core Data FY 2008 Survey Report http://net.educause.edu/ir/library/pdf/PUB8006.pdf
SUNY-Wide LMS Response https://confluence.delhi.edu/display/CIS/SUNY-Wide+LMS+Response
Banner integration with Moodle http://to./47ws http://www.sungardhe.com/microsite/index.aspx?&siteid=1924&id=1924&menuid=448
Luminis Message Broker integration with Moodle https://confluence.delhi.edu/display/CIS/Moodle+and+Banner
Moodle 2 Integration with Web 2.0 http://mfeldstein.com/moodle-2-file-import-from-and-export-to-cloud-apps/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+mfeldstein%2FyyMY+%28e-Literate%29&utm_content=Google+Reader
FerrisConnect Advisory Board FerrisConnect 2.0 Recommendation
Prepared by the FerrisConnect 2.0 Pilot Subcommittee
in collaboration with the FerrisConnect Advisory Board
July 27, 2010
History
The FerrisConnect 2.0 Team reported out to the FerrisConnect Advisory Board
(FAB) at the July 8, 2010 meeting. At that time, it was determined that a new
subcommittee would be formed to complete the recommendation for the Provost,
Associate Provost, and Chief Technology Officer. The subcommittee is led by Meegan
Lillis and includes Kimn Carlton-Smith, Humanities Faculty; Mary Holmes,
e-Learning Systems Administrator; and Brad McCormick, Core Health Sciences Faculty
and Core Curriculum Coordinator.
Proposed Pilot
FAB would like to propose a pilot project be developed where a small cross-section
of Ferris faculty will teach one course in Spring 2011 using Moodle 2.0, followed by
one course in Blackboard Learn 9 during Fall 2011. The pilot participants will
represent a cross-section of FerrisConnect users (novice to early adopters) and
would evaluate the two products using a single survey instrument.
Objectives of the Pilot
The objectives of this pilot are to:
1. Evaluate the ease or complexity of use of each course management system
(CMS):
a. IT’s administration and tech support (in-house TAC support and
external hosting service).
b. Course design potential in accordance with current FSU Best Practices
and Quality Matters Rubrics.
c. Ability to build course with integrating current Web2.0 learning tools
(Tegrity, Respondus Tools, CPS, etc.).
d. Faculty instructional management and teaching.
e. Student navigation and end-user experience using a variety of
internet connections (e.g. dial-up to T1 network).
2. Evaluate the ease or difficulty of migrating previously developed content to
the two proposed CMS.
3. Evaluate the ease or complexity of training administrators, support staff, and
instructors to use each proposed CMS.
4. Evaluate the benefits or shortcomings to hosting a CMS externally.
Timeline
Fall 2010:
Selection of faculty for a two semester pilot project
Development of evaluation tools (e.g. faculty evaluation, student evaluation,
and administration evaluation)
Training in Moodle (no later than October)
Development of 1 course per faculty in Moodle
Spring 2011:
Moodle pilot
Summer 2011:
Collect evaluations from Moodle pilot (no later than May 31)
Training in Blackboard Learn (no later than June)
Development of 1 course per faculty in Blackboard Learn
Fall 2011:
Blackboard Learn pilot
Spring 2012:
Collect evaluations from Blackboard Learn pilot (no later than Dec 31)
Review evaluations, summarize data and report out findings (deadline Spring
Break)
Costs Involved in the Pilot
Moodle Hosting with Remote-Learner:
Hosting: $1,995.00
Faculty training: $295.00/faculty × 15 participating faculty = $4,425.00
Administrator training: $395.00/administrator × 2 administrators (the
recommended minimum) = $790.00
Total: $7,210.00
Blackboard Learn Hosting with Blackboard:
At this time, John Urbanick is negotiating with Blackboard on the current
contract. Included in the negotiation is free hosting for the Fall 2011 pilot as
well as free training for the faculty (administrators?) included in the pilot.
Additional Costs:
A small stipend has been recommended by FAB for the faculty who
participate in the pilot, as it will be a major time commitment. The faculty
who participate must design, build, and teach a course in both course
management systems; following the proposed timeline, this would require
roughly a 15-month commitment.
$500 stipend for 15 faculty = $7,500.00
Faculty would also receive a recognition letter from the Provost’s
Office.
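The cost figures above are simple products and sums; as a minimal arithmetic check (using only the rates and headcounts quoted in this proposal), they work out as follows:

```python
# Check of the pilot cost arithmetic quoted in the proposal above.
# Rates and headcounts are those stated in the report.

moodle_hosting = 1995.00               # Remote-Learner hosting fee
faculty_rate, faculty_n = 295.00, 15   # training per faculty, 15 participants
admin_rate, admin_n = 395.00, 2        # training per administrator, 2 trained

moodle_total = moodle_hosting + faculty_rate * faculty_n + admin_rate * admin_n
print(f"Moodle pilot total: ${moodle_total:,.2f}")  # $7,210.00

stipend_total = 500 * 15               # $500 stipend for each of 15 faculty
print(f"Stipend total: ${stipend_total:,.2f}")      # $7,500.00
```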
Ferris State University prepares students for successful careers, responsible citizenship, and lifelong learning. Through its many partnerships and its career-oriented, broad-based education, Ferris serves our rapidly changing global economy and society.
HLC SELF-STUDY AND BEYOND
YOUR QS ABOUT HLC & CONTINUING
ACCREDITATION
What are HLC and continuing accreditation all about?
HLC (the Higher Learning Commission):
Is one of six regional institutional accreditors in the United States
Accredits colleges and universities in the 19-state North Central region
Awards continuing accreditation status to Ferris State University
Every 6-10 years, HLC evaluates how well each institution is meeting its goals, focusing on these 5 criteria:
1. Mission & Integrity
2. Preparation for the Future
3. Student Learning & Effective Teaching
4. Acquisition, Discovery, and Application of Knowledge
5. Engagement & Service
How does FERRIS STATE UNIVERSITY earn continuing accreditation?
We conduct a self-study, evaluating how well we meet each of the HLC criteria
We collect data and information that provide evidence of our successes
We report this information to HLC
A team of HLC peer reviewers visits Ferris to talk to us about how we’re doing
The team recommends that HLC (1) renews our accreditation status,
(2) discontinues our accreditation, or (3) renews our status with required
areas of improvement
When does the HLC site team visit – and what will they look for?
The team of HLC peer reviewers will visit Ferris on April 18-20, 2011
They will meet with groups from across campus, at several of our off-campus
sites, and around the community
They will ask you what you think about how Ferris is doing and how well Ferris
is meeting the HLC standards based on the self-study research and findings
What are Ferris’ goals for the self-study process? In addition to receiving continuing accreditation status, Ferris set these goals for the self-study process:
MISSION-FOCUSED
To expand the focus on Ferris’ mission and goals, including student success, our values of collaboration, excellence, diversity, being an ethical community, learning, and opportunity
ACTION To move to address the challenges identified in the self-study process through existing or ad hoc University processes
DATA-DRIVEN
To increase the transparency of, and emphasis on, using data and evidence to inform planning, service, and program improvements
AWARENESS To enhance the knowledge among all University stakeholders concerning our services, programming, strengths, challenges, and opportunities for improvement
CELEBRATION To celebrate a successful continuing accreditation effort by recognizing our successes, exemplary practices, and outstanding programs and services
What does continuing accreditation mean for me?
Ferris State University’s commitment to quality and to exceeding HLC’s standards of excellence is recognized, validated, and celebrated
Ferris students are eligible for federal financial aid only if the University is accredited
The course credits that students earn at Ferris can transfer to other colleges and universities
Ferris alumni hold college degrees that are more valuable in the marketplace
Ferris faculty and staff are eligible for federal research grants and
development opportunities
Take a Chance! Enter an original video and (potentially) earn some cash!
What:
This video competition is designed to produce videos that will be featured in our communication efforts in support of the Higher Learning Commission continuing accreditation process.
Enter a video that is between 30 and 180 seconds (3 minutes maximum) in length.
Creativity, significance of message, and humor are encouraged.
A scoring rubric will be available here (by September 1): http://www.ferris.edu/hlc/video
Videos will be scored by a team of professionals from across the university.
Only submissions from Ferris students who are enrolled in fall 2010 will be considered.
Purpose:
Winning videos will be used to educate the Ferris communities about the purposes of the upcoming visit and/or the Ferris self-study findings. Primary audiences include either or both students and Ferris personnel. Learn more to inform your submission at http://www.ferris.edu/hlc/
Dates: Submissions are due on CD, DVD, or flash drive by 5 p.m. on October 28, 2010. (Minimum
standards are 600 x 800 and 30 frames per second). Submit to Sherry Hayes in RAN 255.
Prominently display your name, telephone, and e-mail on your submission so we can reach you. You will be asked to sign a form confirming the originality of your work and authorizing its use for the intended purposes. No use of copyrighted material (music, film) may be incorporated into your submission. Submissions may be from individuals or teams.
A special invitation-only “film fest,” where the top 15 entrants will have the opportunity to present their videos and explain their development strategies, will be held at 5:30 p.m. on November 22. This presentation will be considered in selecting the final winner.
Winners will be announced by December 8, 2010.
Video producers must grant permission for the videos to be used in Ferris promotional activities from time of submission through calendar year 2011. Originator(s) retain(s) ownership of the work product.
Videos will primarily be used by Ferris from January through April 2011.
The Rewards:
1st Prize - $1,000, frequent viewings with credits, award certificate, and press release
2nd Prize - $750, frequent viewings with credits, award certificate, and press release
3rd Prize - $500, occasional viewings with credits, award certificate, and press release
Honorable Mentions (max. of 10) - $50 each, occasional viewings, and award certificates
Ferris reserves the right to not offer the awards if the videos do not meet expectations. Questions? E-mail us at [email protected] or contact Marilyn Bejma at 591-2300 or Sherry Hayes at 591-2612.