Assessment Report 1997 Appendix B

1997 Annual Report

University of Nebraska-Lincoln
April 1998

Appendix B
Highlights of College Reports

Unless otherwise noted, the following highlights have been taken from 1997 college interim and final reports that were submitted to the University-wide Assessment Coordinator.

Colleges
  • College of Agricultural Sciences and Natural Resources
  • College of Architecture
  • College of Arts and Sciences
  • College of Business Administration
  • College of Engineering and Technology
  • College of Fine and Performing Arts
  • College of Human Resources and Family Sciences
  • College of Journalism and Mass Communication
  • Teachers College

College of Agricultural Sciences and Natural Resources

The College of Agricultural Sciences and Natural Resources (CASNR) "...prepares professional leaders in the food, agriculture, and natural resources sciences, human resources and in agribusiness through its undergraduate and graduate programs" (UNL Undergraduate Bulletin 1996-97, p.41). Its goals include providing professional development for students and faculty, as well as personal, cognitive and behavioral development for students, along with continuing education services to the citizens of Nebraska.

Some departments, such as Agronomy and Animal Science, have begun to administer senior exit interviews and senior and alumni surveys, although no data have yet been synthesized and analyzed. In Agronomy, some students must pass a standardized test to become a Certified Crop Adviser, and there has been discussion of developing a competency test for all students. Agronomy students also complete a capstone course project in which they design and present a farm management plan.

Biochemistry has developed a capstone course--Biochemistry 435--and in spring of 1997 asked students in Biochemistry 432 to take the 1992 Biochemistry Examination prepared by the American Chemical Society to test their knowledge base. The median score for majors in the course was at the 80th percentile of the national norms.

The Department of Horticulture has held senior exit interviews each year since 1988. Students also complete a survey made up of seven questions regarding courses, advisement, and quality of facilities. Generally, students are satisfied with course quality although courses in business and management skills are perceived to need improvement. Work experiences and internships are highly valued by a significant number of students. The department evaluates curriculum content partly in light of the results of exit interviews and senior surveys.

The Department of Biological Systems Engineering, through two instructional improvement projects, began to renew its curriculum through a College Task Force. The Task Force developed ten outcomes for student learning which serve as the basis for new curriculum development. A three-dimensional model of learning was developed, with dimensions representing technical, non-technical, and life-long skills.

The Mechanized Systems Management Major took the model and began to integrate the technical and non-technical skills across its curriculum. Five knowledge levels are used to assess outcomes:

  • Awareness
  • Awareness-Literacy
  • Literacy
  • Literacy-Competency
  • Competency

The Mechanized Systems Management Major believed that it needed a thorough overview of student outcomes, broken out to match course-by-course objectives, each with a different level of proficiency, so an outcomes matrix was developed. As courses are reviewed using the outcomes matrix, it is expected that students will benefit from "...receiving planned preparation for successful employment" and that the curriculum will be revitalized. One area developed to address a perceived need is the teaching of basic problem-solving skills using situations specific to Mechanization Science.

In 1996, a graduate student in the Department of Agricultural Leadership, Education and Communication (ALEC) developed an instrument used to survey the 334 employers who hired CASNR graduates from 1990-93. The survey's purpose was to elicit from employers a sense of the adequacy of the training CASNR graduates had received. Employers were also asked which competencies or clusters of competencies they considered essential for employment of graduates: the highest ratings were given to personal qualities, communication skills, leadership, and computer, quantitative, and management information skills. When rating adequacy of student preparation, these same sub-scales (with the exception of personal qualities) received below-average mean scores, regardless of academic department. It was recommended that course work and co-curricular activities emphasize the development of those skills considered to be important, and that employers be surveyed on these issues every three to five years.

The college has two tools that it can use to begin college-wide assessment activities. The theoretical model that was developed by the College Task Force and the Employer Survey can provide both model and structure for outcomes assessment leading to curriculum review. Many departments currently have well-developed student activities that could form the basis for assessment. However, none of the departments have yet written formal assessment plans. Leadership and support from the Dean's office coupled with Departmental consultation should now create the cultural shift necessary to make outcomes assessment a coherent, integral part of college life that is valued by all its members.

College of Architecture

Assessment in the College of Architecture is driven by the criteria developed by the three national, independent accreditation agencies that oversee the Planning, Architecture, and Interior Design Departments. Each of the accrediting agencies places considerable emphasis upon "...satisfying achievement-oriented performance criteria." Performance standards reflect three levels of cognitive complexity:

  • Awareness - familiarity with basic concepts, information and procedures; the ability to recall and correctly associate knowledge with appropriate circumstances.
  • Understanding - specific and detailed knowledge; a thorough comprehension of concepts and the ability to demonstrate their interrelationships.
  • Competency - successful application of concepts and information to complete specific tasks.

Student portfolios are archived as part of the accreditation process and analyzed in terms of how the performance criteria are met along a continuum from low pass to high pass.

The college requires that course prospectuses describe course objectives in considerable detail and indicate specifically, in the case of Architecture, which NAAB requirements apply to the course and at what level of mastery. This gives the college a fine-grained basis for reviewing student performance and improving the curriculum.

Beyond assessing how students have satisfied accreditation performance criteria within their courses, a variety of assessment instruments are used by the departments, including internship evaluations, portfolio reviews, juried reviews, and terminal projects. As well, the Interior Design Program conducts a senior survey to evaluate student academic experience and the perceived strengths and weaknesses of the program. Data are examined within the college, although no results are available for inclusion in this report.

Special projects have been initiated to assess how well department goals related to diversity and advising are being met. Architecture faculty have also been awarded a Teaching Council grant to develop evaluation tools that would "help students evaluate their learning and encourage them to revise their learning strategies during the semester."

The college representative to the University-Wide Assessment Committee is an enthusiastic participant in the development of Comprehensive Education Program (CEP) assessment plans, with the aim of comparing the performance of College of Architecture students in the CEP with that of students in other colleges and of the university's student body as a whole, thus completing the outcomes portrait of students in the College of Architecture.

The College of Architecture might consider adding some indirect measures to its accreditation-based assessment processes, such as a senior survey in all the departments, an employer survey, or an alumni survey. It might also consider formalizing reporting on assessment in those years when accreditation or academic program review do not occur.

College of Arts and Sciences

The mission of the College of Arts and Sciences as stated in the UNL Undergraduate Bulletin, 1996-97, includes the following:

  • "To educate undergraduate students of the College of Arts and Sciences to a high level of competence in their major fields through instruction that integrates formal course work with experience in research and creative activity."
  • "To provide all undergraduate students with a range of knowledge and a broad intellectual experience that can form the basis for critical and imaginative thinking, thereby enabling them to become tolerant and responsible members of a global society."
  • "To provide undergraduate and graduate students across the campus with courses in the arts, humanities, social sciences, and sciences to meet their academic needs in their major programs."

As the University of Nebraska's largest college, with 4500 undergraduate students, the College of Arts and Sciences not only serves its own students but also provides general education courses to the entire undergraduate population. It is also committed to serving students through the learning objectives of the Comprehensive Education Program (CEP).

The process of developing a student outcomes assessment program was originally led by the Dean's office in 1995 when chairs and directors were asked to inventory what assessment measures were currently in place and to suggest further measures that may be of use in the future. A committee structure was developed and charged with encouraging and guiding departments in the development of meaningful assessment plans, evaluating those plans, supporting instructional change resulting from evaluation of the data and providing forums for faculty discussion about assessment issues.

It was decided that the college would assess the majors as its first priority. In its guidelines for major assessment, the college required that each program identify the learning objectives for the major. Second, it required that multiple measures be used and that at least one be a direct measure of how well students are achieving the learning objectives. Lastly, each major was required to set out the process by which the data would be analyzed and the results used for program and instructional review.

As is the case for the university as a whole, departments vary enormously in their commitment to the assessment process, their state of preparedness to begin assessment activities, and the assessment measures chosen. Some departments have already administered tests to their graduating majors. It should be noted that in many cases the actual number of majors assessed within a department is very small. For example, Anthropology held an oral exam for its two majors and Great Plains Studies assessed its one graduating major. In the case of the larger departments, however, the numbers, and therefore the challenges for assessment, are larger. Progress reports have not yet been received from European Studies, Latin American Studies, or Political Science.

Notwithstanding these gaps, the departments of Arts and Sciences have made a genuine effort to tie learning objectives to measures, to develop measures appropriate to the discipline, and to complete some assessment activities during the semester. A complete roster of the progress reports and the responses and suggestions from the College Assessment Committee may be found in the accompanying Arts & Sciences documentation; highlights follow.

Biological Sciences is administering the Major Field Test in March, 1997 and is developing an exit interview instrument.

Chemistry has experimented with administering several commercial standardized tests for general, organic, and physical chemistry. It has also begun exit interviews of graduating seniors, a survey of students and TAs to evaluate laboratory experiments, and informal tracking of undergraduate professional presentations and publications.

Classics faculty evaluated the achievement of their graduating seniors by completing a competency checklist for each student, using materials collected from all the 300-level courses in the major. In addition, Greek and Latin majors completed a translation exam. An alumni survey is also being developed.

English surveyed graduating seniors and paired the survey with a faculty assessment of student performance by means of a competency checklist. It has analyzed the results and will fine tune the instruments and protocols for use with the next group of graduating seniors.

Environmental Studies pre-tested a senior exit survey in the spring semester, 1996. In December, 1996 and May, 1997, the department thesis advisor and a second faculty advisor assessed the senior thesis and administered an oral exam to the department's eight graduating seniors.

Geography and Meteorology has assessed two of three core knowledge areas: Human Geography and Geographic Information Analysis. The remaining area, Meteorology, will be undertaken in the 1997-98 academic year. Human Geography administered an essay exam while GIA used a multiple-choice measure. Both also held a faculty-student feedback session after the exams to determine student perceptions of the strengths and weaknesses of the program. Results have been reported to the Assessment Committee.

History has developed and administered a portfolio review that uses a quantitative assessment protocol based on learning objectives. The assessment of the majors is based upon both individual composite achievement and a general composite of all outcomes assessment results. The results have been discussed by the Department's Undergraduate Committee. One resulting observation has been the perceived need to focus on giving the students the "skills to do history."

Mathematics and Statistics is engaged in the development of course portfolios and an assessment checklist linked to learning objectives. A senior exam was piloted in the spring semester, 1997.

Modern Languages and Literatures relied upon assessment of portfolios as their major student outcomes measure. Although there is some variation across majors, the portfolios generally included essays, research papers, exams, tapes of presentations, and results of a standardized oral proficiency interview. Spanish majors were also required to present one of their papers before a faculty panel.

Philosophy is in the process of evaluating materials from its 200-, 300-, and 400-level courses by means of competency checklists. It has conducted an exam evaluation for Philosophy 110 and is analyzing student course evaluations in terms of skills and content questions. The department is developing a majors survey to be administered annually.

Physics & Astronomy has developed course-specific learning objectives for all upper division majors courses. Portfolios have been collected and partly assessed. A voluntary comprehensive exam will be administered in spring, 1997, and senior exit interviews will be held as well. An alumni survey will be carried out in connection with the next Academic Program Review in 1999.

Psychology assessed a random sample of 30 senior psychology majors against certain performance criteria, applying a rigorous methodology in which student products were rated on a three-point scale. Each semester, a senior survey will be administered. An annual report will be made to the department, and the faculty will discuss the results and make curriculum changes as necessary.

Sociology developed course grid assessments that were independent of course grades. All majors were assessed in the fall semester, 1996 and in the spring semester, 1997. An assessment of anonymous senior projects was completed. This assessment used a grid format completed by faculty. A second assessment of the grids was then done by the Undergraduate Committee. A senior exit survey was piloted in spring, 1996. An alumni survey will be completed in 1997-98 as part of a five-year rotation.

The College of Arts and Sciences has asked programs to develop plans for assessing program roles in graduate education in the fall of 1998.

Notwithstanding the college's exemplary leadership and success in instituting a comprehensive assessment program for its majors within such a large and unwieldy college, many challenges still face it. The College of Arts and Sciences bears the largest burden in the university for providing general education courses to non-majors. Long-range planning should include the development of a system for the assessment of non-majors. Such a system would require the active support and cooperation of all undergraduate colleges as well as the Office of the Vice-Chancellor for Academic Affairs to ensure that a task of this magnitude not fall solely on the shoulders of the College of Arts and Sciences but be appropriately shared and supported.

College of Business Administration

The plan for student outcomes assessment in the College of Business Administration is one component of the comprehensive indicator approach the College employs to monitor outcomes from its mission. The mission is "...to foster intellectual curiosity and business insight by providing high quality instruction, research and service to our students, the citizens of Nebraska, and to the national and international communities we serve."

The Standards for Business Accreditation state:

The processes used to strengthen curriculum, develop faculty, improve instruction, and enhance intellectual activity determine the direction and rate of improvement. Thus, these processes play an important role in accreditation, along with the necessary review of inputs and assessment of outcomes. As part of each school's effort to prepare its students for future careers, the school should provide a total educational experience that emphasizes conceptual reasoning, problem-solving skills, and preparation for life-long learning. (AACSB, 1994, pp. 1-2)

Both the College's own mission and the AACSB's standards set out some of the learning objectives that should be assessed in terms of student outcomes.

Assessment issues are considered and discussed by the Ad Hoc Committee on CBA Assessment, which has, over the last year, focused on the development of college-wide assessment plans. Its priorities have been at the college level because it was assumed that the professional exams routinely taken by students in some majors (e.g., the CPA exam in Accounting) act as direct measures of student learning and would provide departments with solid data in that regard. In reality, however, only a small portion of CBA's students sit for professional exams. Nevertheless, the committee's focus does have merit because all CBA students participate in a common business core and capstone course, which permits faculty to evaluate both their level of knowledge and the development of required skills.

The committee has instead focused its efforts on the comprehensive surveying of groups both within and outside the college that affect its mission: graduating seniors, alumni, employers, faculty, staff, and graduate students.

The first of these surveys to be administered was the senior survey in spring, 1996. The survey was administered to seniors in the 1996-97 academic year and will be done annually. The college is working with the Alumni Association to include learning outcomes questions in the Alumni Association surveys and is collaborating with the University's Career Services Center to collect employer data. Survey instruments have been developed for faculty, graduate students, and staff that will be administered in spring, 1997. It is expected that the survey data will be used to construct additional indicators for the comprehensive assessment process.

The 1996 senior survey produced generally positive findings. Students hold an overwhelmingly positive view of their education in CBA. They were asked about skill and knowledge development in 13 areas and their positive responses ranged from working cooperatively in groups to using new technologies. A large majority of students participated in computer projects using the CBA lab, but their involvement in other college activities, such as non-credit independent study and participation in college organizations, was very low. The survey's results highlighted the need for departments to encourage greater student-faculty interaction in educational and advising sessions.

Additionally, substantial funds were earmarked and used for the administration of the Business Major Field Test (Educational Testing Service) as an exit exam for a sample of graduating seniors, thus providing an additional direct measure of student outcomes. This exam was taken by 21 graduating seniors in the winter of 1997. It is nationally normed and was developed specifically for assessing student achievement in major subjects. The results will be reviewed by administrators and faculty and decisions will be made regarding the expanded use of the exam in the future. The college will consider its use on a pre-test/post-test basis to measure knowledge acquisition over time.

At the department level, assessment instruments tend to conform closely to those indicators set out by the college, with professional examinations providing data on knowledge and skill development within the majors, and a break-out of senior survey results providing attitudinal data by department. All departments use employer surveys and student placement data to determine outcomes.

Due to its unique culture and position in the College of Business Administration, the Department of Economics has included some additional measures tied to learning objectives (APR Self-Evaluation Report, Econ., 1996, p. 14). It plans to use surveys and interviews annually at the sophomore, junior and senior levels and to administer a nationally normed examination such as the Test of Understanding in College Economics. A set of guidelines is being developed so that faculty can undertake a systematic evaluation of student writing and research with some degree of uniformity. The Undergraduate and Graduate Committees are responsible for overseeing assessment activities and are required to submit an annual report. The report will describe the process, will analyze data and comment on findings, and will make recommendations for improvements in the assessment.

Overall, the College of Business Administration has begun an excellent initiative in planning and undertaking assessment activities and is committed to assessment for the long term. Its college-wide surveys and Business Major Field Test pilot are well-considered, well-focused initiatives, though increasing participation in the latter should be a priority. The Economics Department's intention to employ multiple measures is to be encouraged and supported. Departments should aim to become more active participants in their own assessment activities and plan to test one new assessment measure next year.

On the college level, the Ad Hoc Committee on CBA Assessment could be made a standing committee and given a mandate to coordinate and oversee assessment activities in the college and to report them annually to the Dean, the faculty, and the University-Wide Assessment Committee.

College of Engineering and Technology

The College of Engineering and Technology is coming late to the process of comprehensive student outcomes assessment, although as the college points out in its planning document, "Each undergraduate degree granting program within the College of Engineering has been involved in outcomes assessment for the past several years, although in a rather piecemeal fashion."

Direct measures include the capstone design projects, which are reviewed by practicing engineers and build upon many of the fundamental concepts required in mathematics, sciences, engineering sciences, and communication, and the Fundamentals of Engineering examination, a national exam taken by undergraduate engineering majors after 90 credit hours. The exam is not mandatory for UNL students, but those who take it tend to pass at a higher rate than the national norm.

Indirect measures include periodic alumni surveys and engineering data drawn from the Career Services survey.

The new engineering accreditation criteria (ABET Engineering Criteria 2000) link learning objectives to outcomes assessment. The college has adopted the eleven criteria outlined in Engineering Criteria 2000 as the learning objectives around which the assessment process will be built. Measures being considered include portfolios, design projects, nationally normed content exams, alumni and employer surveys, and placement data.

The college program needs to be coordinated and implemented beginning next year in order to coincide with the larger accreditation phase-in schedule.

College of Fine and Performing Arts

Assessment activities for programs in the College of Fine and Performing Arts have been driven by the expectations of their accrediting agencies. The following excerpt from the 1997-98 handbook of the National Association of Schools of Music (p. 70) is representative:

"Music units have available a broad range of evaluation techniques such as juries, critiques, course-specific and comprehensive examinations, institutional reviews, peer reviews, and the performance of graduates in various settings. The indicators chosen shall be analyzed and organized to produce a composite picture of the extent to which the educational and artistic goals and objectives of the music unit are being attained. In turn, this information is used as an integral part of planning and projection efforts. The music unit shall be able to demonstrate that students completing programs have achieved the artistic and educational levels and competencies outlined in applicable NASM standards."

Because their accrediting bodies have clearly defined the competencies students are expected to demonstrate, the programs in FPA have historically focused on direct evaluation of student performance. However, the three programs differ in the extent to which they have developed a formal assessment plan focused on program improvement.

In Theatre Arts and Dance, class work typically is performance oriented, involving extensive critiques by the student, peers, and faculty. Beginning in fall 1997, students in the Performance Emphasis will be required to maintain a portfolio of their audition, performance, and production activities, including self and faculty evaluations. In the Design/Technical Production Emphasis, students conduct a portfolio presentation and review before the faculty and their peers each semester. The work of students who complete major design projects for stage shows is evaluated by all the design faculty. Students also prepare an exit portfolio of their best work that may be used in seeking employment. Faculty are considering requiring submission of a portfolio prior to acceptance into the program, which would enable further analysis of a student's growth as they move through the program. Development of a team-taught capstone course is also under discussion.

In the Department of Art and Art History, project critiques are extensively used in each studio art course. Undergraduates participate in an annual exhibition of their work from that year, as well as an individual senior exhibition, open to the public, at the end of their program. Faculty are considering adding a portfolio requirement for admission to the program, as well.

The School of Music has integrated an extensive array of assessment activities into its undergraduate program. In addition to auditioning for admission to the program, students are given entrance exams in theory, aural skills, sight singing, and keyboard. Prior to graduation they must pass a piano proficiency exam; if they are taking applied lessons, they must also pass an upper-division qualifying exam after four semesters and participate in applied juries (before a faculty panel) at the end of each semester. A degree recital is also required for performance degrees. In addition, the school plans to implement the Educational Testing Service Major Field Exam in Music, covering music theory and music history.

With such a strong performance orientation, all departments in the college have tremendous resources to draw upon for assessment. The college should encourage documentation of methods used to evaluate student work and development of a process that would promote use of that outcomes information in program improvement.

College of Human Resources and Family Sciences

The College of Human Resources and Family Sciences, under the leadership of the Dean's office, began planning a comprehensive, coherent college-wide assessment plan in 1994. From the outset the college built the assessment plan around its educational goals as set out in the Undergraduate Bulletin, 1996-97 (p. 265). These include the development of oral communication skills (including listening), writing, critical thinking, understanding of research data, understanding of cultural diversity, and a foundation of knowledge according to major.

Dean Craig established a College Assessment Committee in January of 1994 and charged it with developing an assessment plan that would:

  • articulate the goals of each academic program
  • gain feedback on each program's progress towards achieving those goals and
  • use the feedback to modify the academic programs to ensure that the goals are effectively achieved.

The assessment programs were organized by establishing the desired outcome, matching it with assessment measures that related to different learning domains (cognitive, behavioral or affective), and with expectations of outcomes.

At the college level, assessment instruments were developed to survey new graduates and graduates 3, 5, 10, and 20 years past graduation. The survey was first administered to 1995-96 graduates. Fall 1996 graduates were surveyed in March, 1997. Overall, graduates indicated a high level of satisfaction with curriculum and instruction. They were well satisfied with the level of their own skills development in terms of critical thinking and verbal and written communications. Survey results indicated the need for improvement in advising in some program areas in Family and Consumer Sciences and for general improvement in career counseling in all departments. At the level of the major, departmental surveys are planned for one year after graduation. Textiles, Clothing and Design surveyed all 1995-96 graduates and received high ratings in all areas other than computer training and career counseling. Nutritional Science and Dietetics will survey all 1996-97 graduates. The survey for Family and Consumer Sciences has been designed and was administered in May, 1997.

Departments were also charged with developing two assessment instruments, one of which must involve direct evidence of student achievement or performance.

Students in the Department of Nutritional Sciences and Dietetics take the American Dietetics Registration exam. In 1996, the department's pass rate was 92%, well in excess of both the national pass rate of 75% and the department's own goal of at least 80%. However, in 1997, the exam and its passing standards were changed by the ADA. Of the UNL students who took the exam, 71% passed, compared to the national average of 79%. Faculty have considered curricular explanations for the lower performance and are discussing the possibility of restructuring the clinical nutrition courses to include laboratory sections.

Nutritional Sciences also requires an oral senior seminar presentation that is judged by the instructor and student peers. Senior student writing is assessed by the senior seminar instructor and one external reviewer for demonstration of a variety of skills. All students met the standards set. The Restaurant/Food Option is planning an External Practicum Supervisor's Evaluation instrument and has begun analysis of a small number of student portfolios.

Textiles, Clothing and Design holds a juried undergraduate student exhibition. Eighty percent of the student work met or exceeded the criteria set by the department assessment committee, and approximately half of the entries were juried into the exhibition where the evaluation was external.

Portfolio reviews are undertaken by the internship director both before and after internship. The portfolios at the end of the internship were judged for presentation, organization and content on a scoring scale developed by the department. All students met the department's standards. Five students had internship experiences in the summer of 1996 and were judged by their external supervisors to be superior in performance, resourcefulness, and responsibility, among other criteria.

In the Family and Consumer Sciences Department, all graduating students in FACS 480 and 10% of all other students have their senior papers reviewed by one external examiner. FACS 480 also requires an oral presentation based on the senior paper, reviewed by peers and the instructor. Student performance in both areas exceeded department goals of having at least 80% of students receive a rating of "good" or higher.

The College of Human Resources and Family Sciences has developed a coherent, comprehensive assessment plan based on learning objectives that combines college-wide survey instruments with instruments developed by the departments. Currently, outcomes assessment is concentrated at the senior level, and a comprehensive picture of student outcomes is being built as assessment instruments are developed, tested and refined. It may be useful to incorporate some external assessment strategies into the processes used in senior seminars; assessment by the instructor and the peer group alone would not seem to be sufficiently rigorous.

The college may, at a later date, consider developing further assessment instruments that relate to 200- and 300-level courses in order to be able to track student progress.

 


 

College of Journalism and Mass Communications

The mission of the College of Journalism and Mass Communications is "...to graduate highly competitive young professionals who have acquired communication and critical thinking skills appropriate to the practice of journalism: writing, editing and design in print and broadcast media" (UNL Undergraduate Bulletin, 1996-97, p.285). Additionally, through the college's own mission and the accreditation criteria set out by the Accrediting Council on Education in Journalism and Mass Communications (ACEJMC), journalism education is joined with a course of study in the liberal arts and sciences in "...a reasonable balance."

As a professional college, the College of Journalism and Mass Communications balances theoretical and practical learning experiences in a curriculum where learning is cumulative, that is, where one course builds upon another. Accrediting criteria require that professional courses have a student/faculty ratio of no more than 15:1. Consequently, student-faculty contact tends to be personal and the tracking of student learning can be done on an individual level. Some of the components of the University Comprehensive Education Program fit naturally into the curriculum of the College of Journalism and Mass Communications, particularly in the areas of human diversity and competence in writing.

There are three departments in the College: Advertising, Broadcasting, and News-Editorial. In keeping with the college's stated interest in producing competitive professionals, each department has its own advisory board made up of professionals in the field who can help relate college outcomes to the professional environment. These boards meet at least every two years and provide feedback to the departments regarding student performance and curricular strengths and weaknesses by means of an oral report to the Dean and a written report to departmental and college faculty.

The college's assessment tools rely upon both internal (faculty) and external (advisory board, client, employer, alumni) evaluations. Senior exit interviews and student portfolio reviews were done concurrently by the departments beginning with the fall semester, 1996.

Exit interviews in the three departments provided much positive feedback, especially regarding hands-on, experiential components of the program, such as labs and internships. Students also made suggestions for curriculum improvements. For example, Advertising students called for more rigor in core journalism courses and suggested additional emphasis on account servicing and management. Broadcasting students found hands-on experience with the equipment to be crucial and expressed some "...frustration with equipment problems and in some cases lack of equipment...". News-Editorial students called for more emphasis on small publishing enterprises and stronger, more accurate advising.

Portfolio review in December was informal and individual and used largely as an opportunity for career advising. All three departments are working on ways to formalize and standardize assessment in this area so that data may be analyzed. It may be useful to design a portfolio review process that is developmental in nature, assessing student skills acquisition over time as it relates to the learning objectives of the department/college.

Internships and capstone courses provide immediate, high-quality feedback from potential employers and clients on student performance. These data are available to the student-practitioners and to their faculty. There is potential here for assessing a sample of student work across the curriculum as well, perhaps as part of curriculum review.

The Advertising Department is in the process of developing competency checklists for use in outcomes assessment. A college-wide alumni survey is also in development; alumni surveys have been sent out only occasionally in the past.

The College of Journalism and Mass Communications collects a wealth of information on its students' performance. Its connections to external evaluators are particularly important for tracking professional skills development against the market. Due to its small professional class sizes, CJMC tracks student progress on a more individualized basis than some other colleges do. The college's challenge now is to look at the data it collects in new ways: to connect its assessment activities more explicitly to learning objectives, to refine its methodologies, and to further develop reporting mechanisms within departments and across the college.

 


 

Teachers College

The mission of Teachers College is to educate teachers, administrators and specialists who will provide "...outstanding educational leadership in communities across the state and the nation in teaching, administration, vocational and adult education, communication disorders, special education, health and human performance, and educational psychology" (UNL Undergraduate Bulletin, 1996-97, p. 305).

Over the past year, Teachers College has engaged in a highly collaborative, developmental process of student outcomes assessment. The process has largely been driven by the Dean's office, but consultation and discussion across the college have been its cornerstone.

Initially, the faculty were asked to reflect upon the current status of student learning in their programs and to provide evidence in terms of the following areas:

  • the conceptual model serving as the program's foundation
  • specific objectives for exit outcomes
  • multiple teaching modes that relate to the conceptual model
  • data providing evidence of student outcomes
  • examples of how these data have been used in program development
  • program evaluation tasks to be undertaken
  • program development priorities related to student learning

The resulting documents were discussed at department faculty meetings and then shared electronically throughout the entire college. This exercise produced a set of 23 core learning themes that will serve as a framework for assessment and program development across the college. An across-course theme matrix was developed for the majors at the end of the fall, 1996 semester in order to provide an overview of student progress in key areas.

A student survey dealing with the learning themes was administered in December, 1996, and these data have been analyzed by program in order to illustrate how students view the curriculum. Student focus groups were held in the fall semester of 1996, and two more were conducted in the spring 1997 semester.

First-year teacher focus groups are held on the occasion of college-sponsored dinners and provide feedback to faculty about the effectiveness of their preparation for teaching.

First-year teacher and employer surveys have been conducted by the Dean's office for many years. The survey is made up of 25 core items that appear each year, as well as questions on a topic of particular interest that vary from year to year. The special questions for the 1996 survey focused on classroom management and school violence. Results may be found in the Teachers College Assessment Report. The 1997 survey focused on instructional technology; results are not yet available.

Other plans for 1997-98 include surveys of cooperating teachers for practicum and student teaching, third-year teacher education graduates, all K-12 superintendents and principals in Nebraska, current graduate students, and current juniors. A study of practices in the assessment of student teaching is also underway.

Each program has also developed an advisory committee composed of professionals and employers to examine the performance of program graduates. Most of the committees have met at least once.

Teachers College has built an appropriate and thoughtful process for student outcomes assessment in terms of its college-wide assessment instruments; consultation process; and dissemination of results to programs, departments, and across the college for purposes of program review. It combines a balance of external and internal assessment processes to assure that a complete picture of student performance is created.

The process is led and supported by the Dean's office, but at the departmental level, there is considerable variation in the progress perceived to be made. For example, in the Health and Human Performance self-study for Academic Program Review of November, 1996, the department stated that "...we have anecdotal evidence and very informal feedback relative to [outcomes]; a systematic process is not yet in place..." It does, however, have some program evaluation measures in place: teacher evaluations, exit interviews with graduating seniors, and program advisory committees. It is currently developing a process for portfolio assessment and is taking part in the assessment activities initiated by the Dean's office. This is an example of how programs often do not recognize that they already have a good deal of assessment data that they have not yet analyzed or shared within the department and beyond.

As its next developmental step, Teachers College has asked programs to add one new measure of student outcomes assessment, to identify current program development priorities, and to identify at least one other program with which to have a detailed discussion of their reflection documents, thus boosting cross-college interaction and engagement with assessment issues. Department assessment reports are due to the Dean's Office by May 1, 1998.

Challenges for Teachers College, as for others, include resource planning for assessment generally and for data analysis in particular. The development of faculty awareness of and participation in assessment activities will continue to be a college goal.