Question types, Testing Receptive Skills, Lexis, item analysis

ITI, 13th April 2013
by Andrew Bosson
on 11 November 2015


Word / Semantic
paragraph
text (within)
text (external)
inter text
CEFR B
CEFR C
Paragraph level - reorder sentences into a paragraph
scoring is very difficult for teachers to mark, but easy for computers
Selective Deletion Cloze test at text level
learners have to understand most of the text to complete it, but eye-tracking software shows people rarely read beyond sentence level for this type of task.
Whole text level. 7 paragraph text - identify paragraph headings
substantial texts
integrate sentences into 1 text
integrate 2 texts
Selecting
Establish Quantitative Parameters

-Coh-Matrix
-Complete Lexical Tutor
-Lexile.com
vocab profile (BNC-20)
however, this assumes lexical knowledge is related to frequency
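The kind of band-based profile these tools produce can be sketched roughly as follows. This is a toy illustration only: the tiny band lists and the sample sentence are invented placeholders, not the real BNC frequency lists.

```python
# Toy sketch of a frequency-band vocab profile like the BNC-based tools above.
# The band word lists here are placeholders; a real profiler loads full
# frequency lists (1k most frequent words, 2k band, and so on).

bands = {
    "1k": {"the", "was", "a", "to", "of", "and", "test", "read"},
    "2k": {"measure", "level"},
}

def profile(text):
    """Return the proportion of tokens falling in each band (or off-list)."""
    tokens = text.lower().split()
    counts = {band: 0 for band in bands}
    counts["off-list"] = 0
    for tok in tokens:
        for band, words in bands.items():
            if tok in words:
                counts[band] += 1
                break
        else:
            counts["off-list"] += 1
    return {band: c / len(tokens) for band, c in counts.items()}

print(profile("the test was a measure of reading level"))
# 1k: 0.625, 2k: 0.25, off-list: 0.125 ("reading" is not in the toy lists)
```

A profile like this is what underlies the 90% known-lexis guideline: if too much of a text falls outside the learners' known bands, the text is too hard for the test.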
http://www.slideshare.net/marcomed/fundamentals-of-language-assessment-manual-by-coombe-and-hubley
What to test?
-skimming for gist
- scan for specific details
-understand text organisation
Expeditious Reading
Careful Reading for:
- main idea
- supporting details
- author's argument & purpose
- relationships between paragraphs
- fact vs opinion
Information transfer from non-linear texts
"Major" reading skills...
"Minor" reading skills...
Understanding at sentence level
- syntax
- vocabulary
- cohesive markers
Understanding at inter-sentence level
- reference
- discourse markers
Word Level
- spelling
- semantic
Understanding components of nonlinear texts
- meaning of graph or chart labels, keys, etc.
- ability to find and interpret intersection points
What?
prose texts
diagrams
lists
graphs
notes
advertisements
(any appropriate texts)
Considerations
Topic
controversial
unbiased
background knowledge needed
Lexis
90% should be known (Nation)
Text length
readability
lexis
Tools
Texts
Test
format
Multiple Choice
Matching
Sentence Completion
Short Answer
Information transfer
emphasis usually on meaning (not spelling)
Complete a text at sentence level - multiple choice options for lexis
easy to mark
CEFR A
Specifications
cultural knowledge /
Grammar
Genre
should be familiar
Questions
can you write the number & type you want?
Operationalising tasks
Checklist
-Are there enough items to be useful?
-Do the items test what is supposed to be tested?
-Do the items sample the syllabus?
-Is there a sufficient variety of task types?
-Are the questions at the right level?
-Are the learners familiar with the question types?
-Are the questions clear?
-Are the tasks in the appropriate order?
-Can the questions be answered with "world knowledge"?
-Are the questions independent?
-Is the paper well laid out?
-Is the time sufficient to answer the questions?
-Are the questions grammatical?
-Is the spelling correct?
-Do the learners know the assessment criteria?
Rubrics
-Is it clear what the learners have to do?
-Is it written in the shortest possible sentences?
-Is it grammatically correct?
-Is it spelled correctly?
-Is the rubric familiar to the students?
-Is the timing clear?
-Will the task require different types of response?
If so, give specific instructions.
Marking Scheme
-Is there only one clear answer?
-Is the key complete and correct?
-Is it easy to compute marks?
-What is the pass score? How did you decide?
-Are marking & markers reliable?
-Do you agree on ignoring errors in mechanical accuracy (spelling, punctuation, grammar)?
-Is the number of acceptable answers limited?
Final Check
-Have you proof-read the test?
-Have trustworthy & interested colleagues reviewed the tests?
Only used to test real things
Cannot be used for creative tasks
often biased against stronger students, who suspect a trick in the questions; weaker students do not
Below are the names of four animals. Draw a line around the name of each animal that is useful on the farm.
cow, tiger, rat, wolf
item
stem
options
key
distractors
Criteria / key
Practice
Monitoring
Standard-setting
Define the Minimally Acceptable Person
(MAP)
- for each question: would the MAP answer it correctly?
options for finding a CUT SCORE:
a. use this as the cut-off
b. imagine 100 students - how many would pass each question? Take the average
c. use an expert panel
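Option (b) amounts to an Angoff-style calculation. A minimal sketch, assuming a single judge's per-item estimates (the numbers below are invented for illustration):

```python
# Angoff-style cut score, as in option (b): for each item a judge estimates
# how many of 100 minimally acceptable persons (MAPs) would answer it
# correctly; the cut score is the sum of those proportions across items.
# The estimates below are illustrative, not from any real standard-setting.

def angoff_cut_score(estimates):
    """estimates: per-item counts out of 100 MAP candidates expected to pass."""
    return sum(e / 100 for e in estimates)

judge_estimates = [80, 65, 90, 50, 70]  # one estimate per item, out of 100
cut = angoff_cut_score(judge_estimates)
print(f"Cut score: {cut:.2f} out of {len(judge_estimates)}")  # 3.55 out of 5
```

With an expert panel (option c), each judge's cut score would be computed this way and the panel's scores averaged.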
Key Issues in Writing MC
Stem
Problems with options
common errors
confounding skills
e.g. testing English using a mathematics question
Clues in the answer
word or phrase replicated in stem and answers
a. ground b. area c. pound d.stadium
non-plausible answers
even one reduces the odds when guessing.
Use of background knowledge
knowledge that may be gained other than in lessons
Word Matching
Too much detail in options
More than one correct option
(avoid which is the best)
Options connected
c. answers a & b. All items must be independent
Options not grammatically acceptable
(non-plausibility)
Length of option
inconsistent numbers of distractors or too many distractors
no correct answers
more than one correct answer
ask for the correct answer
show just stems and ask for answer
checking items
never write / compose tests alone
Analysing
Standard-setting
Post-test: confirm that the test worked as you planned (scale / key / items)
Decide at least on the major boundaries before the test (pass / fail; distinction / pass)
Marking the Test
must contain essentials for identifying correct option
must relate to the skill or ability we are trying to test
must elicit viable options
should not be too long or complex
no correct option or more than one correct option
distractors not plausible
some options may have the same meaning
clues in the options, e.g. "all of the above" and "none of the above"
The position of the correct option is too regular
Another view on MCQ
http://books.google.com.tr/books/about/The_Tyranny_of_Testing.html?id=shpqlwNVyt8C&redir_esc=y
Answer Key
Using the answer key before you start marking
Use overlays & bubbles
Always count - NEVER add
Take stock during a marking event - or ask somebody to double-check your work
Deciding what to test
The claim I wish to make about the candidate
How can I be sure that the test allows me to make this claim?
Carefully consider the test format
The Construct
Clear Link
The Test Items
As teachers we place a lot of importance on testing, so we should do it right
useful to measure ability to reproduce specific information
Students complete a statement or BRIEF phrase.
must be fully standardised, do not give partial scores (without CLEAR guidelines).
More cognitively challenging than MCQ.
You have an idea why the student gives the response
3-5 words
Short Answer Format (SAF)
make sure the instructions are clear
word items so that students CLEARLY understand exactly what information they are expected to supply
ensure that the required response is brief and specific
write questions so there is either one answer or a limited number of answers possible
if more than one answer is acceptable, ensure that all are reflected in the Answer Key
Do not use the same language in the item as is used in the input.
never give the students the opportunity to copy a chunk of the text
try to avoid partial credits
used to measure a student's understanding of the relationship between words, events and ideas
True / False / (not mentioned)
usually used to measure knowledge or comprehension
come in different formats (e.g. yes/no)
VERY dubious as guessing is clearly very easy
consider an additional option: "how certain are you?"
Advantages
allows testing of relationship between words, events or ideas
In vocabulary tests allow for depth rather than breadth to be assessed
objective scoring
easy to administer and score
limits impact of writing
allows for item analysis
difficult to write initial list
guessing a problem
as with MCQ recognition only assessed (students may not 'know')
Limited use
Advantages
can be used to test knowledge or comprehension
easy to write
objective scoring
easy to administer and score
limits impact of writing
allows for item analysis
Disadvantages
guessing a problem (true is correct 60% of the time!)
do not discriminate well between weak and strong students
can limit test to relatively trivial tasks or information
where possible DO NOT USE (except with confidence)
avoid using exact language from the input text (leads to memorisation)
unlike MCQ (where long options tend to be true), here long statements tend to be false - make all the same length where possible, and avoid double negatives
ensure roughly equal numbers of true & false options (though be aware that students tend to go for TRUE when randomly guessing)
consider asking students to correct false statements
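The guessing problem above is easy to quantify. A back-of-envelope sketch, assuming a student who knows nothing and always answers TRUE; the 20-item test size is illustrative, and the 60% figure is the one quoted above:

```python
# Expected true/false score by blind guessing: a student who always answers
# TRUE scores, on average, the proportion of items keyed TRUE. This is why
# roughly equal numbers of true & false keys matter.

def expected_guess_score(n_items, prop_true_keys, guess="TRUE"):
    """Expected raw score for a student who gives the same answer throughout."""
    hit_rate = prop_true_keys if guess == "TRUE" else 1 - prop_true_keys
    return n_items * hit_rate

print(expected_guess_score(20, 0.6))  # ~12 of 20 if 60% of keys are TRUE
print(expected_guess_score(20, 0.5))  # ~10 of 20 with balanced keys
```

With unbalanced keys a know-nothing student can pass on guessing alone, which is exactly why T/F discriminates poorly between weak and strong students.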
can be difficult to score
limited to knowledge & comprehension questions
requires higher cognitive abilities
responses can be modeled & understood
guessing is less likely
good for detailed response
relatively easy to construct.
Matching items
use clear, accurate and simple language when writing descriptions etc. (i.e. below the students' level)
ensure items on both lists belong to the same broad category or topic (no 'tricks')
ensure items have only one correct response
ensure items are independent of each other
never allow a page break with a matching item
have all items independently checked for language (the rater should identify what you are trying to test) and for bias
Cloze Tests
Regular
- Words are deleted at regular intervals from a text. Learners complete with appropriate word.
Variation 1
(controlled response)
- MC options
Variation 2
(C-test)
- Clue included e.g. first letter of word. Uses shorter texts.
- easy to construct, score & analyse
- Doubts about what is being tested
- Native speakers may not get full marks
- May get different scores with different deletions
- (As a reading text - not sure if learners understand the text after performing the task)

- easier to score than regular cloze
Controlled Response
Regular
As above
C-test
As above
- Shorter texts mean a variety of possible texts.
- More control of potential answers = more valid.
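A regular cloze of the kind described above can be sketched as follows; the deletion interval, starting word and sample passage are illustrative choices, not fixed conventions:

```python
# Regular cloze: delete every n-th word from a text; learners supply the
# missing words. The answer key is collected as the gaps are made.

def make_cloze(text, every=7, start=2):
    """Replace every `every`-th word (from index `start`) with a gap."""
    words = text.split()
    answers = []
    for i in range(start, len(words), every):
        answers.append(words[i])
        words[i] = "____"
    return " ".join(words), answers

passage = ("The test was designed to measure reading comprehension "
           "at sentence level and beyond")
gapped, key = make_cloze(passage, every=5)
print(gapped)  # The test ____ designed to measure reading ____ at ...
print(key)     # ['was', 'comprehension', 'beyond']
```

Variation 1 (controlled response) would offer multiple-choice options for each gap; a C-test variant would keep the first letters of each deleted word as a clue.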
Listening
Audiolingual method
- phonemic discrimination
- paraphrase recognition
- response evaluation
e.g. minimal pairs
e.g. listen & select closest statement
e.g. listen to question & choose best answer
Integrative Approach
-dictation
-cloze
Communicative Approach
- listener must comprehend the message then use in context.
- must be authentic
Historical Tasks
Setting
-Good acoustics, minimal background noise
Rubric
-Should be very clear - one level below the learners' level, or in L1.
-"Skill contamination" vs "Skill integration"
Input
-Learners should have a clear communicative purpose / reason to listen.
Speech
-For fairness should match the teacher population
-Speed of delivery should be consistent with learner level and input materials.
Structure
- GB: easier to more difficult through the test & mixed formats in sections.
- USA: sections have the same formats but difficulty changes within a section (30% easier, 40% medium, 30% more difficult). This can ease test anxiety.
Suitable texts
-Reading texts do not easily transfer to authentic listening texts. They miss features important to understanding speech.
Timing
-Learners should be familiar with text length.
-Number of repetitions?
Main idea - once?
Detail - twice?
Authenticity? e.g.Lectures?
Lexis
-90-95% should be familiar to learners.
- "Lexical overlap": the use of words from the text in questions and responses.
Easier - word used in correct answer.
More difficult - word from passage used as a distractor
- Unknown lexis should not be part of the correct answer.
"Testing is always a compromise"
Validity is holistic - it comes from the whole process
"Testing is validity"
Item Analysis
important for grading & refining items / tests
Item Facility
Item Discrimination
Basic Questions
Receptive or Productive
Breadth (number of words) vs depth (usage etc. of words)

What this might mean
Reading
Question Forms
Lexis
Discrete
A measure of vocabulary knowledge or use as an independent construct. The focus is clearly on vocabulary knowledge (the developer decides on the specific items' focus).
Embedded
A measure of vocabulary which forms part of the assessment of some other, larger construct. The focus is on a broader measure of which vocabulary forms part (e.g. in a rating scale).
Selective
A measure in which specific vocabulary items are the focus of the assessment. All receptive tests of vocabulary are selective (the developer decides on the focus).
Comprehensive
A measure which takes account of the whole vocabulary content of the input material (reading / listening tasks) or the test-taker's response (writing / speaking tasks), where we are making statements about the candidate's overall vocabulary (as is the case in writing & speaking tests).
Context-Independent
A measure in which the test-taker can produce the expected response without referring to any context. The candidate does not need to engage with the contextual presentation of the target item to respond.
Context-Dependent
A measure which assesses the test-taker's ability to take account of contextual information in order to produce the expected response (i.e. the test-taker must engage with the text in which the target item is presented).
Read's dimensions of vocabulary assessment
What to test
Form
spoken
written
word parts
Meaning
form & meaning
concept & referents
associations
Use
grammatical functions
collocations
constraints on use
register, frequency...
Nation 2001
Scoring
-Attitude to grammar and spelling