
Can “Cooperative” Agents Enhance Learning and User-Interface Relationship in Computer-based Learning Environment?

Authors: Lee, Jong-Eun Roselyn; Maldonado, Heidy; Nass, Clifford; Brave, Scott B.; Yamada, Ryota; Nakajima, Hiroshi; and Iwamura, Kimihiko.



co-learner simply gave its own answers upon being asked by the teacher agent (n = 26).
The third condition, which had the same teacher agent and the user avatar but no co-learner
agent, served as a control condition (n = 26).
4.3. Measures
Learning assessment. In order to assess participants’ learning, 11 open-ended
questions were asked on the English idioms that participants learned with the teacher and the
colearner agent upon the completion of the idiom lesson. More specifically, the instruction
was given as: “Please fill in the blanks using the idioms you learned so that the sentences
make sense.” Participants were instructed to fill in the blanks in the given sentences, e.g.,
“How could you believe what he said? He was just ________________” (The correct answer
was “pulling your leg”). A preliminary analysis of the fill-in-the-blank responses showed that
2 questions out of the 11 might have seem somewhat confusing to the participants: more than
80% of the participants gave answers totally irrelevant to the idioms they learned. In order to
minimize a noise in the data, the responses to the 2 questions were eliminated, and only the
responses to the remaining 9 idiom questions were analyzed for hypothesis testing.
Each answer to the open-ended questions was assessed on a 5-point scale ranging from
0 (left blank or completely irrelevant) to 4 (perfect answer); participants received
partial credit depending on the number of grammatical or spelling mistakes they made.
The item scores were then summed.
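
For illustration, the sketch below shows how the total learning score could be computed under the scoring rubric described above. The per-participant item scores are hypothetical; only the 0–4 rubric and the nine retained idiom questions come from the text.

# Minimal sketch of the learning-score aggregation described above.
# The per-participant item scores here are invented for illustration;
# the text specifies only the 0-4 rubric and the 9 retained idiom items.

participant_item_scores = {
    "P01": [4, 3, 0, 4, 2, 4, 1, 3, 4],   # nine items, each scored 0 (blank/irrelevant) to 4 (perfect)
    "P02": [2, 4, 4, 0, 3, 4, 4, 2, 1],
}

# The total learning score per participant is the sum over the nine items (possible range 0-36).
learning_scores = {pid: sum(scores) for pid, scores in participant_item_scores.items()}
print(learning_scores)  # e.g. {'P01': 25, 'P02': 24}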
Feelings of social support. An index of “feelings of being cooperated with” was created
from three items (“Not Alone,” “Praised,” and “Supported”). Participants rated how well
each of these adjectives described their feelings while using the software on a 10-point
Likert scale. A factor analysis showed that the three items loaded on


