
AFIPS

CONFERENCE PROCEEDINGS

VOLUME 41 PART II

1972

FALL JOINT COMPUTER CONFERENCE

December 5 - 7, 1972

Anaheim, California


Library of Congress Catalog Card Number 55-44701 AFIPS PRESS

210 Summit Avenue Montvale, New Jersey 07645

©1972 by the American Federation of Information Processing Societies, Inc., Montvale, New Jersey 07645. All rights reserved. This book, or parts thereof, may not be reproduced in any form without permission of the publisher.

Printed in the United States of America


CONTENTS

PART II

Cognitive and creative test generators .... 649   F. D. Vickers
A conversational item banking and test construction system .... 661   F. B. Baker

MEASUREMENT OF COMPUTER SYSTEMS-EXECUTIVE VIEWPOINT
Measurement of computer systems-An introduction .... 669   A. Goodman

ARCHITECTURE-TOPICS OF GENERAL INTEREST
A highly parallel computing system for information retrieval .... 681   B. Parhami
The architecture of a context addressed segment-sequential storage .... 691   L. D. Healy, K. L. Doty, G. Lipovski
A cellular processor for task assignments in polymorphic multiprocessor computers .... 703   J. A. Anderson
A register transfer module FFT processor for speech recognition .... 709   D. Casasent, W. Sterling
A systematic approach to the design of digital bussing structures .... 719   K. Thurber, E. Jensen

DISTRIBUTED COMPUTING AND NETWORKS
Improvement in the design and performance of the ARPA network .... 741   J. McQuillan, W. Crowther, B. Cosell, D. Walden, F. E. Heart
Cost effective priority assignment in network computers .... 755   E. K. Bowdon, Sr., W. J. Barr
C.mmp-A multi-mini processor .... 765   W. A. Wulf, C. G. Bell
C.ai-A computer architecture for multiprocessing in AI research .... 779   C. G. Bell, P. Freeman

NATURAL LANGUAGE PROCESSING
Syntactic formatting of science information .... 791   N. Sager
Dimensions of text processing .... 801   G. R. Martins
Social indicators from the analysis of communication content .... 811   P. J. Stone

MEASUREMENT OF COMPUTER SYSTEMS-SOFTWARE VALIDATION AND RELIABILITY
The DOD COBOL compiler validation system .... 819   G. Baird
A prototype automatic program testing tool .... 829   L. G. Stucki
An approach to software reliability prediction and quality control .... 837   N. Schneidewind
The impact of problem statement languages in software evaluation .... 849   A. Merten, D. Teichroew


...tive processing .... 859   V. A. Orlando, P. B. Berra
Minicomputer models for non-linear dynamics systems .... 867   J. Raamot
Fault insertion techniques and models for digital logic simulation .... 875   S. Szygenda, E. W. Thompson
A program for the analysis and design of general dynamic mechanical systems .... 885   D. A. Calahan, N. Orlandea

COMPUTER NETWORK MANAGEMENT
A wholesale retail concept for computer network management .... 889   D. L. Grobstein, R. P. Uhlig
A functioning computer network for higher education in North Carolina .... 899   L. H. Williams

SYSTEMS FOR PROGRAMMING
Multiple evaluators in an extensible programming system .... 905   B. Wegbreit
Automated programmering-The programmer's assistant .... 917   W. Teitelman
A programming language for real-time systems .... 923   A. Kossiakoff, T. P. Sleight
Systems for system implementors-Some experiences from BLISS .... 943   W. A. Wulf

MEASUREMENT OF COMPUTER SYSTEMS-MONITORS AND THEIR APPLICATIONS
The CPM-X-A systems approach to performance measurement .... 949   R. Ruud
System performance evaluation-Past, present, and future .... 959   C. D. Warner
A philosophy of system measurement .... 965   H. Cureton

HISTORICAL PERSPECTIVES
Historical perspectives-Computer architecture .... 971   M. V. Wilkes
Historical perspectives on computers-Components .... 977   J. H. Pomerene
Mass storage-Past, present, future .... 985   A. S. Hoagland
Software-Historical perspectives and current trends .... 993   W. F. Bauer, A. M. Rosenberg

INTERACTIVE PROCESSING-EXPERIENCES AND POSSIBILITIES
NASDAQ-A real time user driven quotation system .... 1009   G. E. Beltz
The Weyerhaeuser information systems-A progress report .... 1017   J. P. Fichten, M. J. Tobias
The future of remote information processing systems .... 1025   G. M. Booth
Interactive processing-A user's experience .... 1037   H. F. Cronin


IMPACT OF NEW TECHNOLOGY ON ARCHITECTURE
The myth is dead-Long live the myth .... 1045   E. Glaser, F. Way III
Distributed intelligence for user-oriented computing .... 1049   T. C. Chen
A design of a dynamic, fault-tolerant modular computer with dynamic redundancy .... 1057   R. B. Conn, N. Alexandridis, A. Avizienis
MOS LSI minicomputer comes of age .... 1069   G. W. Schultz, R. M. Holt

ROBOTICS AND TELEOPERATORS
Control of the Rancho electric arm .... 1081   M. L. Moe, J. T. Schwartz
Computer aiding and motion trajectory control in remote manipulators .... 1089   A. Freedy, F. Hull, G. Weltman, J. Lyman
A robot conditioned reflex system modeled after the cerebellum .... 1095   J. S. Albus

DATA MANAGEMENT SYSTEMS
Data base design using IMS/360 .... 1105   R. M. Curtice
An information structure for data base and device independent report generation .... 1111   C. Dana, L. Presser
SIMS-An integrated user-oriented information system .... 1117   M. E. Ellis, W. Katke, J. R. Olson, S. Yang
A data dictionary/directory system within the context of an integrated corporate data base .... 1133   B. K. Plagman, G. P. Altshuler

MEASUREMENT OF COMPUTER SYSTEMS-ANALYTICAL CONSIDERATIONS
Framework and initial phases for computer performance improvement .... 1141   T. Bell, B. Boehm, R. Watson
Core complement policies for memory migration and analysis .... 1155   S. Kimbleton
Data modeling and analysis for users-A guide to the perplexed .... 1163   A. Goodman

TECHNOLOGY AND ARCHITECTURE (Panel Discussion-No Papers in this Volume)


The QA4 language applied to robot planning .... 1181   D. V. McDermott, J. A. Derksen, J. F. Rulifson, R. J. Waldinger
Recent developments in SAIL-An ALGOL-based language for artificial intelligence .... 1193   J. A. Feldman, J. R. Low, D. C. Swinehart, R. H. Taylor

USER REQUIREMENTS OF AN INFORMATION SYSTEM
A survey of language for stating requirements for computer-based information systems .... 1203   D. Teichroew

MEASUREMENT OF COMPUTER SYSTEMS-CASE STUDIES
A benchmark study .... 1225   J. C. Strauss

SERVICE ASPECTS OF COMMUNICATIONS FOR REMOTE COMPUTING
Toward an inclusive information network .... 1235   R. R. Hench, D. F. Foster

TRAINING APPLICATIONS FOR VARIOUS GROUPS OF COMPUTER PERSONNEL
Computer jobs through training-A final project report .... 1243   M. G. Morgan, N. J. Down, R. W. Sadler
Implementation of the systems approach to central EDP training in the Canadian government .... 1251   G. H. Parrett
Evaluations of simulation effects in management training .... 1257   H. A. Grace

ADVANCED TECHNICAL DEVICES
Conceptual design of an eight megabyte high performance charge-coupled storage device .... 1261   B. Augusta, T. V. Harroun
Josephson tunneling devices for high performance computers .... 1269   W. Anacker
Magnetic bubble general purpose computer .... 1279   P. Bailey, B. Sandfort, R. Minnick, W. Semon


ADVANCES IN NUMERICAL COMPUTATION
On the numerical solution of ill-posed problems using interactive graphics .... 1299   J. Varah
Iterative solutions of elliptic difference equations using direct methods .... 1303   P. Concus
Tabular data fitting by computer .... 1309   K. M. Brown
On the implementation of symmetric factorization for sparse positive-definite systems .... 1317   J. A. George


Cognitive and creative test generators

by F. D. VICKERS

University of Florida
Gainesville, Florida

INTRODUCTION

No one in education would deny the desirability of being able to produce quizzes and tests by machine.

A careful and mechanically inclined teacher can build up, over a period of time, a bank of questions which can be used in a computer aided test production system. Questions can be drawn from the question (or item) bank on various bases, such as random, subject area, level of difficulty, type of question, behavioral objective, or other pertinent characteristic.

However, such an item bank requires constant maintenance, and new questions should periodically be added.

It is the intention of this paper to demonstrate a more general approach, one that may require more initial effort but in the long run should almost eliminate the need to compose additional questions unless the subject material covered changes or the course objectives change. This approach involves the design and implementation of a computer program that generates a set of questions, or question elements, on a guided but random basis using a set of predetermined question models. Here the word generate is used in a different sense from that used in item banking systems.

The approach described here involves a system that creates questions from an item bank which is, for all practical purposes, of infinite size yet does not require a great deal of storage space. Storage is primarily devoted to the program.

It appears at this stage of our research that this approach would only be applicable to subject material which obeys a set of laws involving quantifiable parameters. However, these quantities need not be purely numerical, as the following discussion will demonstrate. The subject area currently being partially tested with this approach is the Fortran language and its usage.

The following section of this paper presents a brief summary of a relatively simple concept which has yielded a useful generator for a particular type of test question. This presentation provides background material for the discussion of concepts which are not so simple and which are now under investigation. Finally, the last section provides some ideas for future development.

SYNTAX QUESTION GENERATION

A computer program has been in use at the University of Florida for over six years that generates a set of quizzes composed of questions concerning the syntax of Fortran language elements. See Figures 1 through 5. The student must discriminate between each syntactic type of element as well as invalid constructions. The program is capable of producing quizzes on four different sets of subject area as well as any number of variations within each area. Thus a different variation of a quiz can be produced for each section of the course. Figure 2 contains such a variation of the quiz shown in Figure 1. The only change required in the computer program to obtain the variation is to provide a single different value on input, which becomes the seed of a pseudo random number generator. With a different seed a different sequence of random numbers is produced, thereby generating different variations of question elements.
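In modern terms, the seed-driven variation scheme can be sketched as follows. This is an illustrative re-creation in Python, not the paper's Fortran IV code; `random.Random` stands in for the pseudo random number generator, and the function names are invented for the sketch:

```python
import random

def make_quiz(seed, n_questions=50):
    """One input value (the seed) fixes an entire quiz variation.

    As in the paper, the first draw for each question selects the
    answer category, an integer between 1 and 5.
    """
    rng = random.Random(seed)               # seeded generator
    return [rng.randint(1, 5) for _ in range(n_questions)]

section1 = make_quiz(seed=101)
section2 = make_quiz(seed=202)              # a different seed for each section

assert make_quiz(101) == section1           # the same seed reproduces the quiz
assert section1 != section2                 # a new seed gives a new variation
```

Reproducibility from the seed is what lets the instructor reprint, regrade, or key any section's quiz later without storing the quiz itself.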

For each question, the program first generates a random integer between 1 and 5 to determine the answer category in which to generate the element. As an example, consider Question 27 in Figure 1. The random integer in this case was 2, thus a Fortran integer variable name had to be created for this question. A call was made to a subroutine which proceeds to generate the required name. This subroutine first obtains a random integer between 1 and 6 which represents the length of the name. For Question 27, the result was a 2. The routine then enters a loop to generate each character in the name. Since for integer names the first character must be I, J, K, L, M or N,

[Figure 1: Quiz 1 example. A machine-printed 50-item quiz (CIS 302, Quiz 1, Section 1). Items 1-25 are classified into five categories: (1) a FORTRAN IV special character, (2) a FORTRAN IV constant, (3) a FORTRAN IV symbol, (4) a valid job control language command, (5) none of the above. Items 26-50 use: (1) a FORTRAN integer constant, (2) a FORTRAN integer variable, (3) a FORTRAN real constant, (4) a FORTRAN real variable, (5) none of the above. Scoring formula: RIGHT*2; minimum score 10.]

[Figure 2: Quiz 1 variation. The same 50-item quiz as Figure 1, printed for Section 2 from a different random seed; the categories and scoring are identical, only the generated elements differ.]

[Figure 3: Quiz 2 example. Items 1-25: (1) an expression containing only one mode of operand (integer or real), (2) all other expressions, (3) a valid arithmetic statement, (4) an invalid arithmetic statement (contains an = sign), (5) none of the above. Items 26-50 classify transfer statements: (1) a statement causing an unconditional transfer, (2) a 2-way conditional transfer, (3) a 3-way conditional transfer, (4) a 4-way conditional transfer, (5) none of the above and/or a syntactically incorrect statement.]

[Figure 4: Quiz 4 example. Items 1-25: (1) a valid input statement, (2) a valid output statement, (3) a valid field specification or format code, (4) a valid FORMAT statement, (5) none of the above. Items 26-50: (1) a valid subscript, (2) a valid integer subscripted variable, (3) a valid real subscripted variable, (4) a valid DIMENSION statement for main programs only, (5) none of the above.]

[Figure 5: Quiz 5 example. Items 1-25 concern DO statements: (1) a valid DO statement with implied increment, (2) a valid DO statement with explicit increment, (3) can be either an index, initial value, upper limit or increment, (4) can only be an initial value, upper limit or increment, (5) none of the above. Items 26-50: (1) a valid arithmetic statement, (2) a valid control statement, (3) a valid input or output statement, (4) a valid specification statement, (5) none of the above.]

[Figure 6: Key example. The answer keys (category 1-5 for each of the 50 items) for the five sections of Quiz 1, printed side by side, one column per section.]


the first random number in this loop would be limited to a value between 1 and 6. Subsequent random numbers produced in this loop would be between 1 and 37, corresponding to the 26 letters, 10 digits and the $ sign. Thus, for Question 27, the characters KS resulted.

In similar fashion, the names for Questions 33, 34, and 43 were produced.
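The name-generation loop just described can be sketched compactly. A hypothetical Python re-creation follows (the original subroutine was Fortran IV); the character-set constants come from the text: six possible first letters, then a 37-character alphabet of 26 letters, 10 digits and the $ sign:

```python
import random

FIRST = "IJKLMN"  # an integer name must begin with I, J, K, L, M or N
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789$"  # 37 characters in all

def integer_variable_name(rng):
    """Generate a random Fortran integer variable name, as in the text."""
    length = rng.randint(1, 6)            # random length between 1 and 6
    name = [FIRST[rng.randint(0, 5)]]     # first character: one of 6 letters
    for _ in range(length - 1):
        name.append(CHARS[rng.randint(0, 36)])  # one of 37 characters
    return "".join(name)

rng = random.Random(27)
print(integer_variable_name(rng))  # a short name, e.g. of the form "KS"
```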

As each category for each question is determined by the main program, the values between 1 and 5 are kept in a table to be used as the answer key. This table is listed for each quiz and section, as shown in Figure 6, for use in class after quiz administration is complete. A card is also punched containing the key for input to a computerized grading system which is used to grade tests and homework and maintain records for the course.

To illustrate the scope of this quiz generator in terms of programming effort, the following list gives the name and purpose of each subroutine in the total package. Each routine is written in Fortran IV:

Name     Purpose
MAIN     General test formatting and key production
SETUP    Prints a leader to help the operator set up the printer
QUIZi    Calls routines for the categories in each quiz
ALPNUM   Generates single alphanumeric characters
SYMBOL   Generates a Fortran symbol
CONSTA   Generates a Fortran constant, real or integer
SPECHA   Generates a Fortran special character
JCLCOM   Generates a job command
NONEi    Generates the none-of-the-above entries for each quiz
INTCON   Generates a Fortran integer constant
INTEXP   Generates a Fortran integer expression
REAEXP   Generates a Fortran real expression
MIXEXP   Generates a Fortran mixed expression
MIXILE   Generates a Fortran mixed illegal expression
UNIARY   Generates a Fortran unary operator expression
PAREN    Generates a Fortran expression within parentheses
BINARY   Generates a Fortran binary operator expression
FUNCT    Generates a Fortran function call
ARITH    Generates a Fortran arithmetic statement
GOTON    Generates a Fortran GO TO statement
IFSTAT   Generates a Fortran IF statement
COGOTO   Generates a Fortran computed GO TO statement
INOUT    Generates a Fortran I/O statement
FIESPE   Generates a Fortran format field specification
FORMAT   Generates a Fortran format statement
DOSTAT   Generates a Fortran DO statement
SIZCON   Generates a Fortran constant of a given size
CONTRL   Generates a Fortran control statement
SPESTA   Generates a Fortran specification statement
INTVAR   Generates a Fortran integer variable
REACON   Generates a Fortran real constant
REAVAR   Generates a Fortran real variable
STANUM   Generates a Fortran statement number
SUBSCR   Generates a Fortran subscripted variable
INTSUB   Generates a Fortran integer subscripted variable
REASUB   Generates a Fortran real subscripted variable
DIMENS   Generates a Fortran dimension statement

The only major criticism that can be made of these quizzes is that they fail to test the student on his understanding of the behavior of the computer under the control of these various statements, either singly or in combination. This understanding of the semantics of Fortran, of course, is imperative if a programmer is to be successful. Thus a method is needed for generating questions which will test the student in this understanding. It is this problem whose solution is now being sought. The following sections describe some of the major concepts discovered so far and possible methods of solution.

SEMANTIC QUESTION GENERATION

Work is now under way on designing a system to produce questions which require semantic understanding as well as syntactic recognition of various Fortran program segments. The major difficulties in such a process are the determination of the correct answer for the generation of a key, and the computation of the most probable incorrect answers for the distractors of a question. Both of these determinations sometimes involve semantic meanings (i.e., evaluation of expressions or the execution of statements) which would be difficult to determine in the same program that generates the question element in the first place. As a good illustration, consider the following question model:

Given the following statement:

    IF (X + 2.0 - SQRT(A)) 5, 27, 13    where X = 6.5 and A = 22.7

Transfer is made to the statement whose number is

(1) 5  (2) 27  (3) 13  (4) indeterminant
(5) none of the above as the statement is invalid
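The key for this model follows from the semantics of the Fortran arithmetic IF: control transfers to the first, second or third statement number according as the tested expression is negative, zero or positive. Here X + 2.0 - SQRT(A) = 6.5 + 2.0 - SQRT(22.7), which is about 3.74 and therefore positive, so transfer is to statement 13. A small illustrative check, with Python standing in for the generated Fortran:

```python
import math

def arithmetic_if(value, neg_label, zero_label, pos_label):
    """Fortran arithmetic IF: branch on the sign of the tested expression."""
    if value < 0:
        return neg_label
    elif value == 0:
        return zero_label
    return pos_label

x, a = 6.5, 22.7
assert arithmetic_if(x + 2.0 - math.sqrt(a), 5, 27, 13) == 13  # positive value
```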

[Figure 7: 2nd stage involvement of key. The main test generator also emits a generated key program, which a later stage runs to produce the key.]

Here the generator would have created the expression X + 2.0 - SQRT(A), the three statement numbers 5, 27 and 13, and finally the two values of X and A. The order of the first four answer choices could also be determined randomly. In this particular question, determination of the distractors is no problem, but the determination of the correct answer involves an algorithm similar to the following:

      X = 6.5
      A = 22.7
      IF (X + 2.0 - SQRT(A)) 5, 27, 13
    5 KEY = 1
      GO TO 10
   27 KEY = 2
      GO TO 10
   13 KEY = 3
   10 CONTINUE

[Figure 8: 2nd stage involvement of key and distractors. Blocks: main generator, answer and distractor generator, key generator.]

This problem can be solved by letting the main generator program generate a second program to compute the key, as well as generate the question for the test. This second program would then be passed to further job steps which would compile and execute the program and determine the key for the question.

Figure 7 illustrates this concept.
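The two-stage pipeline of Figure 7 can be imitated in a few lines: the generator emits the source of a small key-computing program, and a separate step executes that source to recover the key. The sketch below uses Python and `exec` in place of the paper's generated Fortran and compile-and-run job steps; every name in it is illustrative:

```python
import math
import random

def generate_question(seed):
    """Emit a question string plus the source of its key-computing program."""
    rng = random.Random(seed)
    x = round(rng.uniform(1.0, 9.9), 1)
    a = round(rng.uniform(1.0, 99.9), 1)
    n1, n2, n3 = rng.sample(range(1, 100), 3)   # three statement numbers
    question = (f"IF (X + 2.0 - SQRT(A)) {n1},{n2},{n3}"
                f"   where X = {x} and A = {a}")
    # The generated second program: it computes which answer choice is correct
    # (negative -> first label, zero -> second, positive -> third).
    key_source = (
        "import math\n"
        f"value = {x} + 2.0 - math.sqrt({a})\n"
        "key = 1 if value < 0 else (2 if value == 0 else 3)\n"
    )
    return question, key_source

def run_key_program(key_source):
    """Stand-in for the later job steps that compile and execute the program."""
    namespace = {}
    exec(key_source, namespace)
    return namespace["key"]

question, key_source = generate_question(seed=7)
print(question)
print("key:", run_key_program(key_source))
```

The essential point, as in the paper, is that the question generator never evaluates the expression itself; correctness is delegated to the generated program.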

As an illustration of a question involving more difficult determination of answer and distractors, the following question model is presented.

Given the statement:

    I = J/2 + X    where J = 11 and X = 6.5

the resulting value of I is

(1) 11.5  (2) 11  (3) 12  (4) 6.5  (5) 6

The determination of the five answer choices would have to be determined by an algorithm such as the following:

    J = 11
    X = 6.5
    ANS1 = J/2 + X
    IANS2 = J/2 + X
    IANS3 = J/2. + X
    ANS4 = X
    IANS5 = X

[Figure 9: No 2nd stage involvement. The main generator alone produces both the test and the key.]

[Figure 10: Semantic question examples. A generated page of six questions covering DO-loop final values and iteration counts, computed GO TO transfers, and arithmetic IF transfers, each with five answer choices.]
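What makes this model interesting is Fortran's mixed-mode arithmetic: J/2 is integer division (11/2 = 5), J/2. is real division, and assigning a real value to an integer variable truncates. The five candidates of the algorithm can be reproduced in Python, where // is integer division and int() truncates toward zero (a sketch, not the paper's code):

```python
J, X = 11, 6.5

ans1 = J // 2 + X        # 5 + 6.5 = 11.5: integer division, real result
ians2 = int(J // 2 + X)  # truncated on integer assignment -> 11 (the key)
ians3 = int(J / 2 + X)   # as if J/2. were written: 5.5 + 6.5 = 12.0 -> 12
ans4 = X                 # distractor: 6.5
ians5 = int(X)           # distractor: X truncated -> 6

assert (ans1, ians2, ians3, ans4, ians5) == (11.5, 11, 12, 6.5, 6)
```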

In this problem not only does the determination of the key depend on further computation, but also the distractors and the correct answer. Thus the second program generated by the first program must be involved in the production of the test as well as the key.

Figure 8 illustrates this concept.

Some questions are very simple to produce, as neither key nor answer choices depend on a generated algorithm. An example is:

Given the following statement:

DO 35 J5 = 3, 28, 2

The DO loop would normally be iterated N times where N is

(1) 13  (2) 12  (3) 14  (4) 28  (5) 35

Here the answer choices are determined from known algorithms independent of the random question elements. No additional program is therefore required for producing this test question and its key. Figure 9 illustrates this condition.
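The "known algorithm" here is just the DO trip-count formula: with initial value m1, limit m2 and increment m3, the loop body executes (m2 - m1)/m3 + 1 times (integer division, assuming at least one pass, in the pre-1977 Fortran style the paper uses). A sketch:

```python
def do_trip_count(initial, limit, increment=1):
    """Iteration count of  DO n I = initial, limit, increment."""
    return (limit - initial) // increment + 1

# DO 35 J5 = 3, 28, 2 runs for J5 = 3, 5, ..., 27: thirteen times.
assert do_trip_count(3, 28, 2) == 13
```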

It would then appear that a general semantic test generator would have to satisfy at least the conditions exhibited in Figures 7, 8 and 9.

Figure 10 illustrates results obtained from a working pilot program utilizing the method illustrated in Figure 8. This program is a very complicated one and was very difficult to write. To produce a Fortran program as output from a Fortran program involved a good deal of tedious work, such as writing FORMAT statements within FORMAT statements. It has become obvious that a more reasonable method of writing the source program is needed.

[Figure 11: TOSL language environment. A test oriented source language (TOSL) is translated by a TOSL processor, written in SNOBOL, into a Fortran program.]

FUTURE INVESTIGATION

An attempt will be made to design a source language oriented toward test design, which will then be translated by a new processor into a Fortran program. See Figure 11.

This new language is visualized as being composed of a mixture of languages, including the possibility of passing simple English statements (for the textual part of a question) through the entire process to the test. Fortran statements could be written into the source language where such algorithms are required.

Finally, statements to allow the specification of random question elements and the linkage of these random elements to the algorithms mentioned above will be necessary.

Several special source language operators can be introduced to facilitate the writing of question models.

Certain special characters can be chosen to represent particular requirements such as question number control, random variable control, answer choice control, answer choice randomization, and key production.

It is anticipated that SNOBOL would make an excellent choice for the processor language, as it allows rapid recognition of the source language elements and operations and, in a natural way, generates and maintains strings which will find their way into the Fortran output program and finally into the test and key. The possibilities of such a system look very promising, and hopefully such a system can be made applicable to other subject fields as well as the current one.
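As a rough illustration of the envisioned source language, the sketch below expands an invented random-element operator, #R:name(a,b). This notation is a made-up stand-in, not the planned TOSL syntax, and the sketch is in Python rather than SNOBOL:

```python
import random
import re

def expand(template, rng):
    """Replace each #R:name(a,b) with a random integer in [a, b],
    recording the drawn values so the key can be computed from them."""
    values = {}
    def sub(match):
        name, a, b = match.group(1), int(match.group(2)), int(match.group(3))
        values[name] = rng.randint(a, b)
        return str(values[name])
    text = re.sub(r"#R:(\w+)\((\d+),(\d+)\)", sub, template)
    return text, values

question = "THE DO LOOP DO 10 I = 1, #R:limit(10,30) ITERATES N TIMES WHERE N IS"
text, values = expand(question, random.Random(1))
key = values["limit"]   # here the key follows directly from the random element
```

Linking the drawn values to key-producing algorithms, as the paper proposes, would be a natural extension of this expansion step.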


A conversational item banking and test construction system

by FRANK B. BAKER University of Wisconsin Madison, Wisconsin

INTRODUCTION

Most conscientious college instructors maintain a pool of items to facilitate the construction of course examinations. Typically, each item is typed on a 5" X 8" card and coded by course, book chapter, concept, and other such keys. The back of the card usually contains data about the item collected from one or more administrations of the item. To construct a test, the instructor peruses this item bank looking for items that meet his current needs. Items are selected on the basis of their content and further filtered by examining the item data on the card; overlapping items are eliminated, and the emphasis of the test is balanced. After having maintained such a system for a number of years, it became obvious that there should be a better way. Consequently, the total process of maintaining an item bank and creating a test was examined in detail. The result of this study was the design and implementation of the Test Construction and Analysis Program (TCAP).

The design goal was to provide an instructor with a computer based item banking and test construction system. Because the typical instructor maintains a rather modest item bank, the design emphasis was upon flexibility and capabilities rather than upon capacity.

In order to achieve the necessary flexibility, TCAP was implemented as a conversational system using an interactive terminal. Considerable care was taken to build a system that had a very simple computer-user interface.

The purpose of the present paper is to describe the TCAP system. The order of discussion proceeds from the file structure to the software to the use of the system.

This particular order enables the reader to see the underlying system logic without becoming enmeshed in excessive interaction between components.


SYSTEM DESIGN

File structure

The three basic files of the TCAP system are the Item, Statistics, and Test files. A record in the Item file contains the actual item and is a direct analog of the 5" X 8" card of the manual scheme. A record in the Statistics file contains item analysis results for up to ten administrations of a given item. Test file records contain summary statistics for each test that has been administered. The general structure of all files is essentially the same, although they vary in internal detail. Each file is preceded by a header (see Figure 1) that describes the layout of the record in the file. Because changing computers has been a way of life for the past ten years, the header specifies the number of bits per character and number of characters per word of the target computer.

These parameters are used to make the files word-length independent. In addition, the header contains the number of sections per record, the number of characters per record section, the characters per record, and the number of records in the file. The contents of the headers allow all data items within a record to be located via a relative addressing scheme based upon character counts.

This character-oriented header scheme enables one to arbitrarily specify the record size and layout at run time rather than at compile time, enabling several different users of the system to employ their own record layouts without affecting the TCAP software.
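The relative-addressing idea can be sketched in a few lines: every field position is derived from the character counts stored in the header, so the layout is a run-time parameter. The header field names and sizes below are assumptions for illustration:

```python
# Hypothetical header for one file; section sizes are illustrative only.
header = {
    "chars_per_record": 432,
    "section_chars": [24, 72, 288, 24, 8, 8, 8],
}

def section_offset(header, record_index, section_index):
    """Character offset of a given section of a given record within the file,
    computed purely from header counts (no compiled-in layout)."""
    base = record_index * header["chars_per_record"]
    return base + sum(header["section_chars"][:section_index])

# Section 2 of record 3 starts this many characters into the file:
off = section_offset(header, 3, 2)
```

Because nothing here is compiled in, two instructors with different record layouts can share the same program, exactly as the text describes.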

A record is divided into sections of arbitrary length, each preceded by a unique two-character flag and terminated by a double period. Subsections within a section are separated by double commas. These flags serve a number of different functions during the file creation phase and facilitate the relative addressing scheme used to search within a record. Figure 2 contains an item


File Header Element    Contents
1                      Name of file
2                      Number of bits per character in the target computer
3                      Characters per word in the target computer
4                      Characters per record in the file
5                      Number of sections in the record
6-15                   Number of characters in section i, where i = 1, 2, ..., 10

Figure 1-Typical file header

file record that represents a typical record layout. The basic record layout scheme is the same in all files, but the files differ in the contents of the sections. A record in the Item file consists of seven sections: Identification, Keyword, Item, Current item statistics, Date last used, Frequency of use, and Previous version identification.

The ID section contains a unique identification code for the item that must begin with *$. The keyword section contains free-field keyword descriptors of the item separated by commas. The item section contains the actual item and was intended primarily for multiple choice items. Since the item section is free field, other item types could be stored, but this has not been tried to date. The current item statistics section stores the item analysis information from the most recent administration of the item. The first element of this section is the identification code of the test from which the item statistics were obtained. The internal layout of this section is fixed so that the FORTAP item analysis program outputs can be used to update the information.

The item statistics section contains information such as the number of persons selecting each item response, item difficulty, and estimates of the item parameters. The next section contains the date of the most recent administration of the item. The following section contains

Item File Record

*$ STAT 01 520170 ..

ZZ EDPSY,STATISTICS,ESTIMATORS,MLE ..

a count of the total number of times the item has been administered. These two pieces of information are used in the test construction section to prevent overuse of an item. The final section of the item record contains the unique identification code of a previous version of the same item. This link enables one to follow the development of a given item over a number of modifications.
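The flag-and-terminator record format described above can be parsed very simply: sections end with a double period and begin with a two-character flag, and subsections split on double commas. The sketch below is an assumption about the parsing details, mirroring the flags shown in Figure 2, not the actual TCAP code:

```python
def parse_record(text):
    """Split one record into sections keyed by their two-character flag;
    each section's value is its list of double-comma subsections."""
    sections = {}
    for chunk in text.split(".."):       # ".." terminates a section
        chunk = chunk.strip()
        if len(chunk) < 2:
            continue
        flag, body = chunk[:2], chunk[2:].strip()
        sections[flag] = [s.strip() for s in body.split(",,")]
    return sections

# A shortened item-file record in the style of Figure 2:
rec = "*$ STAT 01 520170 .. ZZ EDPSY,STATISTICS,MLE .. YY 006 .."
parsed = parse_record(rec)
```

Note that single commas (keyword separators) survive the split, while double commas mark subsection boundaries, which is why the two delimiters can coexist in one free-field format.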

A record in the Statistics file contains 11 sections: an item identification section and 10 item statistics sections identical in format to the current item statistics section of the item record. These 10 sections are maintained as a first-in, last-out push-down stack, with an eleventh data set causing the first set to be pushed off the end. Records in the Test file are similar to those of the Item file and have five sections: Identification, Keywords, Comments, Summary statistics of the test, and a link to other administrations of the same test. The comments section allows the instructor to store any anecdotal information he desires in a free-field format.

The link permits keeping track of multiple uses of the same test, such as occurs when a course has many sections.
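One way to model the 10-deep statistics history described above is a bounded deque: the newest administration is pushed on the front, and an eleventh entry silently drops the oldest off the end. This is a behavioral sketch, assuming the net effect described in the text, not the fixed-size record layout TCAP actually uses:

```python
from collections import deque

history = deque(maxlen=10)               # at most 10 statistics sections
for administration in range(1, 12):      # 11 administrations of the item
    history.appendleft(administration)   # newest first

# After the 11th data set arrives, administration 1 has been pushed off:
oldest_kept = history[-1]
```

A deque with maxlen gives exactly the "eleventh pushes the first off the end" behavior without any explicit shifting code.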

The record layouts were designed so that there is a one-to-one correspondence between each 72 characters in a section and the punched cards used to create the file. Such a correspondence greatly facilitates the ease with which an instructor can learn to use the system.

Once he has keypunched his item pool, the record layouts within each file are quite familiar to him, and the operations upon these records are easily understood.

This approach also permitted integration of the FORTAP item analysis program into the TCAP system with a minimum of conversion effort.

It should be noted that the file design allows many different instructors to keep their items in the same basic files. Alternatively, each instructor can maintain

QQ ONE OF THE CHARACTERISTICS OF MAXIMUM LIKELIHOOD ESTIMATORS IS THAT IF SUFFICIENT ESTIMATES EXIST, THEY WILL BE MAXIMUM LIKELIHOOD ESTIMATORS. ESTIMATES ARE CONSIDERED SUFFICIENT IF THEY,,

(A) USE ALL OF THE DATA IN THE SAMPLE,,
(B) DO NOT REQUIRE KNOWLEDGE OF THE POPULATION VALUE,,
(C) APPROACH THE POPULATION VALUE AS SAMPLE SIZE INCREASES,,
(D) ARE NORMALLY DISTRIBUTED.

WW TEST 01 220170 ..
1 1 0 0014 .18 -.21 -01.36 -0.22,,
1 2 1 0054 .69 +.53 -00.93 .63,,
1 3 0 0010 .12 +.64 -01.77 -0.83 ..
VV 161271 ..
YY 006 ..
$$ STAT 02 230270 ..

Figure 2-A record in the item file



his own unique set of basic files, yet use a common copy of the TCAP program. The latter scheme is preferred, as it minimizes file search times.

Software design

The basic programming philosophy adopted was one of cascaded drivers with several levels of utility routines. Such an approach enables the decision making at each functional level to be controlled by the user interactively from a terminal. It also enables each level of software to share lower level utility routines appropriate to its tasks. Figure 3 presents a block diagram of the major software components of the TCAP system. The main TCAP driver is a small program that merely presents a list of operational modes to the user: Explore, Construct, and File Maintenance. Selection of a particular mode releases control to the corresponding next lower level driver. These second level drivers have access to four search routines that form a set of high level utility routines. The Identification search routine enables one to locate a record in a file by its unique identification code. The Keyword search routine implements a search of either the item or test file for records containing the combination of keywords specified by the user. At present a simple conjunctive match is used, but more complex logic can be added easily. The Parameter search utility searches the item or statistics files for items whose item parameter values fall within bounds specified by the user. The Linked search routine allows one to link from a record in one file to a corresponding record in another file. For example, from the item file to the statistics file or from the item file to the test file.

Figure 3-TCAP software structure

Due to the extremely flexible manner in which the user can interact with the three files, it was necessary to access these four search routines through the Basic File Handling routine. The BFH routine initializes the file handlers from the parameters in the headers, coordinates the file pointers, and handles certain error conditions.

Such centralization relieves both the mode implementation routines and the search routines of considerable internal bookkeeping related to file usage. The four search routines in turn have access to a lower level of utility routines, not depicted in Figure 3. These lowest level utilities are routines that read and write records, pack and unpack character strings, convert numbers from alphanumeric to integer or floating point, and handle communication with the interactive terminal.
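The simple conjunctive keyword match described above can be sketched as a set-containment test: a record qualifies only if it carries every keyword the user specified. The record shape and field names here are assumptions for illustration:

```python
def keyword_search(records, keywords, start=0):
    """Return (index, record) of the first record at or after `start`
    whose keyword set contains every requested keyword; None at EOF."""
    wanted = set(keywords)
    for i in range(start, len(records)):
        if wanted <= set(records[i]["keywords"]):   # conjunctive match
            return i, records[i]
    return None

items = [
    {"id": "*$STAT 01", "keywords": ["EDPSY", "STATISTICS", "MLE"]},
    {"id": "*$AAAC 02", "keywords": ["SKEWNESS", "MEAN", "MEDIAN"]},
]
hit = keyword_search(items, ["MEAN", "SKEWNESS"])
```

Passing the previous match index plus one as `start` gives the CONTINUE behavior described later: the search resumes from the last item found rather than the file origin.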

The purpose of the EXPLORE routine is to permit the user to peruse the three basic files in a manner analogous to thumbing through a card index. The EXPLORE routine presents the user with a display listing seven functions related to accessing records within a file. These functions are labeled: Identification, Keyword, Parameter, Linked, Restore, Mode, and Continue. The first four of these correspond to the four utility search routines. The Restore option merely reverses the linkage process and causes the predecessor record to become the active record. The Mode option causes an exit from the EXPLORE routine and a return to the Mode display of the TCAP driver. The Continue option allows one to continue a given search using the present set of search specifications.

The Test Construction routine is used to assemble an educational test from the items in the item file. Test construction is achieved by specifying a set of general characteristics all items should have and then defining subsections of the test called areas. The areas within the test are defined by user-supplied keywords and the number of items desired in an area. The Test Construction routine then employs the Keyword search routine, via BFH, to locate items possessing the proper keywords. This process is continued until the specified number of items for an area is retrieved or the end of the item file is reached. Once the requirements of an area are satisfied, the user is free to define another area or terminate this phase. Upon termination, certain summary data, predicted test statistics, and the items are printed.

The function display of the File Maintenance routine presents the user with three options: Create, FORTAP, and Single. The Create option is a batch mode process that uses the File Creation from Cards subroutine (FCC) to create any of the three basic files from a card deck. To use this option, it is necessary to simulate, via cards, the interaction leading to this point. The FORTAP option is interactive, but it assumes that the FORTAP item analysis routine has created a card image drum file containing the test and item analysis results.

The file contains the current item statistics section for each item in the test accompanied by the appropriate


identification sections and test links. A test file record for the test is also in this file. The File Maintenance routine transfers the current item statistics section of the item record of each item in the test to the corresponding record in the Statistics file. It then uses the FCC subroutine to replace the current item statistics section of the item records with the item statistics section from the FORTAP generated file. If an item record does not exist in the Item file, a record is created containing only the identification sections and the current item statistics. The test record is then stored in the Test file and the header updated. The Single option is used to perform line item updates on a single file. Under this option, the File Maintenance routine assumes that card images are stored in an update file and that only parts of a given record are to be changed.

OPERATION OF THE SYSTEM

The preceding sections have described the file structure and the software design. The present section describes some interactive sequences representing typical uses of the TCAP system. The sequences contained in Figure 4 have had the lengthy record printouts deleted.

The paragraphs below follow these scripts and are intended to provide the reader with a "feel" for the system operation.

Upon completion of the usual remote terminal sign-in procedures, the TCAP program is entered and the mode selection message, TYPE IN TCAP MODE = EXPLORE, CONSTRUCT, FILE MAINTENANCE, is printed at the terminal. The user selects the appropriate mode, say EXPLORE, by typing the name. The computer replies by printing the function display message. In the EXPLORE mode, this message is the list of possible search functions. The user responds by typing the name of the function he desires to perform, Keyword in the example. The computer responds by asking the user for the name of the file he wishes to search.

Next, the user is instructed to type in the keywords separated by commas and terminated by a double period. The user must be aware of the keywords employed to describe the items and tests in the files.

Hence, it is necessary to maintain a keyword dictionary external to the system. This should cause little trouble, as the person who created the files is also the person using the system. Upon receipt of the keywords, the EXPLORE routine calls the Keyword search routine to find an item containing the particular set of keywords.

The contents of the item record located are then typed at the terminal. At this point the system asks the user for further instructions. It presents the message FUNCTION DISPLAY NEEDED. A negative reply

causes a return to the Mode selection display of the TCAP driver. A YES response causes the EXPLORE function list to reappear. If one wishes to find the next item in the file possessing the same keyword pattern, CONTINUE is typed and the search proceeds from the last item found. In Figure 4 this option was not selected. Returning to the Mode selection or reaching the end of the file being searched causes the Basic File Handler to restore the file pointers to the file origin.

The next sequence of interactions in Figure 4 links from a record in the Item file to the corresponding record in the Statistics file. It is assumed that one of the other search functions has been used to locate a record prior to selection of the LINKED option; in the present example, it is the last item found via the Keyword search.

The computer then prompts the user by asking for the name of the file from which the linking takes place, Item in the present example. It then asks for the name of the file the user wishes to link to, Statistics in the example. There are several illegal linkages, and the Linked search routine checks for a legal link. The Linked search routine extracts the identification section of the item record and establishes the inputs to the Identification search routine. This routine then searches the Statistics file for a record having the same identification section. It should be noted that a utility routine used a utility routine at this point, but the cascaded control was supervised by the EXPLORE routine. When the proper Statistics record is found, its contents are printed at the terminal. Again, the system asks for directions, and the user is asked if he desires the function display.

In the example, the user obtained the function display and selected the Restore option. This results in the prior record, the item record, being returned to active record status, and the name of the active file being printed.

The system allows one to link and restore to a depth of three records. Although not shown in the example sequences, the other options under the EXPLORE mode operate in an analogous fashion.

The third sequence of interactions in Figure 4 shows the construction of an examination via the TCAP system. Upon selection of the Construct mode, the computer instructs the user to supply the general item specifications, namely the correct response weight and the bounds for the item parameters X50 and beta. These minimum and maximum values are used to filter out items having poor statistical properties. The remainder of the test construction process consists of using keywords to define areas within the test. The computer prints AREA DEFINITION FOLLOWS: YES, NO. After receiving a YES response, the computer asks for the number of items to be included in the area. The user can specify any reasonable number, usually between 5 and 20.
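The statistical screen applied before keyword matching can be sketched as a simple bounds check on the two item parameters. The field names, and the particular bounds in the example, are assumptions for illustration:

```python
def passes_bounds(item, x50_min, x50_max, beta_min, beta_max):
    """True if the item's X50 and beta both fall inside the user's bounds;
    items failing this screen are excluded from test construction."""
    return (x50_min <= item["x50"] <= x50_max and
            beta_min <= item["beta"] <= beta_max)

# Two hypothetical items, screened with bounds -1.5..1.5 on X50
# and .20..1.5 on beta (tighter than the Figure 4 example):
item_ok  = {"x50": -0.93, "beta": 0.53}
item_bad = {"x50": -1.77, "beta": 0.64}   # X50 below the floor used here

ok = passes_bounds(item_ok, -1.5, 1.5, 0.20, 1.5)
bad = passes_bounds(item_bad, -1.5, 1.5, 0.20, 1.5)
```

Chaining this predicate with the keyword match reproduces the two-stage selection the text describes: statistical screening first, then content selection by area.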

The program then enters the normal keyword search



TYPE IN TCAP MODE = EXPLORE, CONSTRUCTION, FILE MAINTENANCE
EXPLORE
FUNCTION DISPLAY
TYPE KIND OF SEARCH DESIRED
IDENT,KEYWORD,PARAMETER,LINKED,RESTORE,CONTINUE,MODE
KEYWORD
TYPE IN FILE NAME
ITEM
TYPE IN KEYWORDS SEPARATED BY COMMAS TERMINATE WITH ..
SKEWNESS,MEAN,MEDIAN ..
(THE ITEM RECORD WILL BE PRINTED HERE)
*$AAAC 02 230270 ..
FUNCTION DISPLAY NEEDED YES,NO
YES
FUNCTION DISPLAY
TYPE KIND OF SEARCH DESIRED
IDENT,KEYWORD,PARAMETER,LINKED,RESTORE,CONTINUE,MODE
LINKED
LINKED SEARCH REQUESTED
TYPE NAME OF FILE FROM
ITEM
TYPE NAME OF FILE LINKED TO
STAT
(THE STATISTICS RECORD WILL BE PRINTED HERE)
*$AAAC 02 230270 ..
FUNCTION DISPLAY NEEDED YES,NO
YES
FUNCTION DISPLAY
TYPE KIND OF SEARCH DESIRED
IDENT,KEYWORD,PARAMETER,LINKED,RESTORE,CONTINUE,MODE
RESTORE
ITEM RECORD FILE RESTORED
FUNCTION DISPLAY NEEDED YES,NO
YES
FUNCTION DISPLAY
IDENT,KEYWORD,PARAMETER,LINKED,RESTORE,CONTINUE,MODE
MODE
TYPE IN TCAP MODE = EXPLORE,CONSTRUCTION,FILE MAINTENANCE
CONSTRUCT
TYPE IN WEIGHT ASSIGNED TO ITEM RESPONSE
1
TYPE IN MINIMUM VALUE OF X50
-2.5
TYPE IN MAXIMUM VALUE OF X50
+2.5
TYPE IN MINIMUM VALUE OF BETA
.20
TYPE IN MAXIMUM VALUE OF BETA
1.5
AREA DEFINITION FOLLOWS YES,NO
YES
TYPE IN NUMBER OF ITEMS NEEDED FOR AREA
10
TYPE IN KEYWORDS SEPARATED BY COMMAS TERMINATE WITH ..
CHAPTER1,STATISTICS,THEORY,FISHER ..
AREA DEFINITION FOLLOWS YES,NO
YES
TYPE IN NUMBER OF ITEMS NEEDED FOR AREA
10

Figure 4-Operational sequences
