Abstract
In a cooperative arrangement between Westinghouse and Carnegie-Mellon University, a test version of CMU's LISP Intelligent Tutoring System (LISPITS) was installed on a Westinghouse VAX 11/785 that could be accessed by engineering personnel from company sites anywhere in the country. The object of this research was to evaluate LISPITS's performance in a more industrial environment than heretofore attempted. More specific research questions concerned (a) dialog structure, (b) computer resource requirements for large numbers of students, (c) rule-base applicability to students of different backgrounds, and (d) LISPITS's effectiveness as measured by student performance. Four classes of data were collected: (1) computer usage (accounting data), (2) 34-item questionnaires, (3) mid-term and final exams, and (4) computer-readable files of activity in both the exercise and coding windows.
Results suggest (a) overall, this group's experience with LISPITS was a positive one, and the technology basically works in an industrial environment; (b) dialog management, while adequate, could be further optimized on the current 80-column x 24-line display; (c) dialog could be improved still further with a more advanced display (e.g., more and larger windows, high resolution, bit-mapped text and highlights); and (d) two aspects of the interaction appear to have salient impact on learning and user acceptance for students with professional engineering experience: (1) the tutor's flexibility in dealing with potentially valid student solutions, and (2) the level of analysis that governs error detection, diagnosis, and intervention strategies.
