On this page you can find readings related to the ABET accreditation project. The repository is organized as follows:
General Information
Outcomes and attributes
Assessment methods
Some documents may appear in more than one section according to the topics they cover. This list does not claim to be fully comprehensive, but it may point the reader to related material that improves the understanding of the accreditation process. Although the documents are listed in no particular order, the ones most relevant to the accreditation process at Javeriana University are highlighted.
Authorized users of this wiki are welcome to add more material to complete this page. We strongly advise adding a short description for each document to ease the reading and maintenance of the page.
The criteria for the computer science accreditation process can be found here.
A succinct description of the accreditation process and the general assessment model can be found in “R. Felder and R. Brent. Designing and Teaching Courses to Satisfy the ABET Engineering Criteria” (web-site). This paper also clarifies the terms and vocabulary associated with the ABET accreditation process.
“D. Haws. Ethics Instruction in Engineering Education: A (Mini) Meta-Analysis” (PDF): presents a survey of pedagogical approaches used to convey an understanding of ethics to students.
The assessment of the program and courses is the most critical phase of the accreditation process. Showing evidence of the attainment of the program objectives and outcomes requires both indirect (e.g., surveys, reports) and direct (e.g., student grades) evaluation instruments. The following documents show approaches to carrying out this phase of the process.
“R. Felder and R. Brent. Designing and Teaching Courses to Satisfy the ABET Engineering Criteria” (web-site) clarifies the terms and vocabulary associated with the ABET assessment process. The paper provides a comprehensive list of references on this subject.
“Assessment Tips with Gloria Rogers, Ph.D. Death By Assessment” ({{::abet:death-by-assessment.pdf|PDF}}) gives practical tips to take into account in the assessment process. It reports on the balance between direct and indirect methods of assessment and gives good guidance on how to make the process practical and effective.
“Gloria Rogers. Do Grades Make the Grade for Program Assessment?” (available here) explains how much we can learn from students' grades when assessing the program.
The Computer Science Department at Iowa State University defines in PDF its approach to carrying out the assessment of the program. The assessment cycles are described along with the instruments used to evaluate the attainment of the objectives.
“Grading vs. Assessment of Learning Outcomes: What’s the Difference?”, Carnegie Mellon University (available here), is a nice reflection on the differences between grading and assessment.
“M. Trevisan et al. Designing Sound Scoring Criteria for Assessing Student Performance” (PDF) gives some advice on how to deal with performance assessment, more specifically with the development of scoring criteria.
Stanford University, Rubrics (PDF): performance assessments require a standardized scoring procedure, usually involving a rubric. A rubric is a matrix that maps the expected outcomes of performance on a task to the respective levels of performance along those outcomes. This document gives a good example of how to build rubrics.
Carnegie Mellon University: Final Assessment (PDF) gives an example of a final assignment rubric.