Intelligent tutoring methodology using consistency rules to...

Education and demonstration – Question or problem eliciting response


Details

U.S. Classification: C434S30700R, C706S927000, C706S014000

Type: Reexamination Certificate

Status: active

Patent Number: 06540520

ABSTRACT:

BACKGROUND
1. Field of the Invention
The present invention relates to the use of artificial intelligence and its application to tutorial programs. In particular, a computerized methodology is disclosed for a tutoring curriculum that responds to the misconceptions of a student.
2. Description of the Related Art
Tutoring relies heavily on the tutor to infer what the student was thinking at the time he or she made a mistake in a problem. Personal tutors assist a student in learning a subject by determining that a solution to a problem is wrong and then showing the student a correct solution. However, it is often difficult to show the student why he or she is wrong. Understanding why a mistake was made allows the student to acquire a more rational understanding of a problem.
An intelligent tutoring system is defined as an educational program capable of humanlike thought processes, such as reasoning and learning. A typical application of intelligent tutoring is a computer running educational software derived from expert-system programming. Expert knowledge in a field is important for programming artificial intelligence systems. Similarly, expert knowledge in a subject, as well as expertise in teaching that subject, is necessary for the development of intelligent tutoring systems.
Expert systems and artificial intelligence systems are known in the art. See U.S. Pat. No. 4,670,848 to Schramm. This system is characterized by its interaction with a user, gathering statements through inquiries and matching them against a database to develop the most specific understanding possible. See also U.S. Pat. No. 5,386,498 to Kakefuda, which discloses an expert system that expresses the knowledge of a human expert using knowledge modules. An inference result is based on a certainty factor determined during execution of the process.
Intelligent tutoring systems utilizing artificial intelligence have also been developed. See, for example, Bloom et al., U.S. Pat. No. 5,597,312 and, in particular, a computer-assisted instruction method taught in “Computer-Assisted Instruction in Chemistry” by Lower et al. in the Journal of Chemical Education.
Intelligent tutoring involves justifying steps by rules as a student works through a problem, ultimately to its solution. An expert can find a correct solution to a problem in the fewest steps, having mastered the rules most helpful for that problem. A student learning a subject is best instructed with a step-by-step method because, as long as the student reaches a correct solution, even by taking a different ‘path’, he or she has still been able to rationalize what he or she knows along the way. What the student did not know during the course of solving the problem is rationalized by the system as the student performs each step.
Currently, intelligent tutoring systems rationalize mistakes made by a student by implementing a direct model of misconception, called buggy rules, which are a form of production rule. A buggy rule anticipates a mistake by a student, so that if the student performs a wrong step in the solution to a problem, the system can target the mistake and take a specific action. This production rule model rationalizes a mistake by matching the student's mistake to the particular rule violation already anticipated and pre-programmed. The prior art intelligent tutoring system not only determines that a performed step in a solution is wrong, but also infers that it is wrong because the student's action matches the pre-programmed rule. The system then correlates the mistake to this common misconception and suggests to the student that the mistake was made because of the misconception associated with the buggy rule.
Thus, in conventional intelligent tutoring systems (ITS) the program is primarily oriented toward helping the student by showing the right next step and explaining why that step is right (using the knowledge of the expert system). In the present invention, the rules serve a much different purpose inasmuch as they explain to the student why a wrong step is wrong. This is far more important to a beginning student in developing the proper mental schemes than studying or memorizing the correct solution. The prior art can achieve this only when the error is anticipated.
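The production-rule approach described above can be pictured with a short, hypothetical sketch. The rule names, data structures, and the example algebra error below are assumptions made for illustration only, not details taken from the present disclosure or from the cited prior art:

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical sketch of the prior-art "buggy rule" approach: each rule
# anticipates one specific, pre-programmed error and carries the
# misconception to report when a student's step matches it.

@dataclass
class BuggyRule:
    name: str
    matches: Callable[[dict], bool]  # does the step fit this anticipated error?
    misconception: str               # feedback tied to that specific error

def diagnose_with_buggy_rules(step: dict, rules: list) -> Optional[str]:
    """Return the pre-programmed misconception if the step matches an
    anticipated error pattern; otherwise the tutor has nothing specific to say."""
    for rule in rules:
        if rule.matches(step):
            return rule.misconception
    return None  # unanticipated mistakes fall through undiagnosed

# Example: a single anticipated algebra error (a term moved across the
# equals sign without flipping its sign).
sign_error = BuggyRule(
    name="move-term-without-sign-flip",
    matches=lambda s: s.get("operation") == "move_term" and not s.get("sign_flipped", True),
    misconception="When a term crosses the equals sign, its sign must change.",
)

print(diagnose_with_buggy_rules(
    {"operation": "move_term", "sign_flipped": False}, [sign_error]))
```

In this sketch, a mistake that the author of the rule set did not anticipate simply falls through with no diagnosis, which is the limitation the consistency rules are intended to address.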
Understanding science and other curricula means utilizing equations, methods, and rules to ultimately find a solution to a problem. Fundamental rules are sometimes overlooked as a student tries to understand more recently studied subject matter. For instance, a student concentrating solely on a single chapter may overlook or forget a fundamental principle learned prior to the lesson. The student may also have made a mistake involving a principle he or she knew before but forgot or did not recognize as relevant. The student might even have made a simple typographical error without realizing that it has led to an unreasonable result.
Thus, certain mistakes may be made that cannot possibly be matched and correlated to an anticipated buggy rule. The sole use of buggy rules for tutoring students targets only a narrow range of possible mistakes made by a student in a step-by-step method of teaching.
There is a need for a methodology that improves the intelligence of the tutor by implementing a rule set that always allows for a meaningful response and which is used even when the production rules fail. Termed herein consistency rules, these rules target the mistakes that cannot be explained through application of buggy rules, thereby providing a more reliable way to determine whether a student's step is “wrong.” This is accomplished by evaluating the inputted solution against an expanded fundamental rule set representing relevant constraints on the solution to assess whether or not the solution is reasonable.
The conventional assessment of “wrong” is that the student's step is not in the conflict set (the set of all possible correct next steps generated by the expert system). In the present methodology, “wrong” is defined as a violation of a consistency rule (CR) in a new tutor rule set. If the set of CRs is complete for the problem domain, then any step that can be proven wrong by a fundamental principle, in the context of the student's work so far, will violate a CR. Violation of a CR guarantees that the step is wrong. A wrong answer not matched or anticipated by a pre-programmed buggy rule can still violate a fundamental principle. The present methodology of using consistency rules allows an educational software program to always say something meaningful when a wrong step in the solution is identified, thereby improving the quality of the diagnosis of a student's mistake.
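A minimal sketch of the diagnostic logic just described follows, assuming that the expert system exposes its conflict set as a set of step identifiers and that each consistency rule can name the fundamental principle it encodes. All identifiers and messages below are illustrative assumptions, not the claimed implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ConsistencyRule:
    principle: str                        # the fundamental principle this CR encodes
    holds: Callable[[dict, dict], bool]   # True if the step is consistent with the work so far

def evaluate_step(step, work_so_far, conflict_set, consistency_rules):
    """Accept a step found in the conflict set (a correct next step); otherwise
    report the first violated CR, which names the principle proving the step wrong."""
    if step["id"] in conflict_set:
        return "Correct next step."
    for cr in consistency_rules:
        if not cr.holds(step, work_so_far):
            return "This step violates a fundamental principle: " + cr.principle
    # Not an expected step, but no fundamental principle is provably violated.
    return "This step is not one the tutor expected; let's re-examine it together."
```

Under this reading, a CR violation always yields feedback tied to a fundamental principle, whereas the buggy-rule model can respond only to errors its author anticipated.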
SUMMARY OF THE INVENTION
The present AI methodology is directed to an improved intelligent tutor utilizing rules that evaluate a constraint on a solution and compare this constraint with an improved, more general rule set. By expanding beyond the misconception model, under which many student errors previously could not be tutored through the sole application of production rules, the consistency rules deliver qualitative, conceptual feedback for intelligent tutors.
This is accomplished by assessing the reasonableness of a solution based on an evaluation of a constraint on the solution imposed by a relevant fundamental principle. The basis of the consistency rule is that any wrong answer must have violated a relevant principle, even when the error is outside those normally anticipated by an artificial intelligence system. Thus, the present methodology accounts for all possible violations to provide a meaningful response.
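One such constraint can be illustrated with an assumed chemistry-style consistency rule; the principle chosen (conservation of mass), the field names, and the numbers are illustrative only and are not taken from the disclosure:

```python
def mass_is_conserved(step, tolerance=1e-6):
    """Constraint imposed by a fundamental principle: the total mass of the
    reactants in the student's step must equal the total mass of the products."""
    return abs(sum(step["reactant_masses"]) - sum(step["product_masses"])) <= tolerance

# A typographical slip (18.0 entered as 1.8) is not an anticipated "buggy"
# error, yet it still violates the constraint and can be flagged as unreasonable.
student_step = {"reactant_masses": [2.0, 16.0], "product_masses": [1.8, 16.0]}
print(mass_is_conserved(student_step))  # False -> the step violates the principle
```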
By expanding the fundamental rule set through a means for evaluating a constraint on the solution, the CRs are further capable of augmenting existing, pre-programmed production rules with the functionality of the consistency rules. The consistency rules used by the system do not all have to
