Test-Taking Strategies?

Hello friends, we are moving toward the end of our semester, which means it will soon be time to begin post-testing with the NRS-approved testing instruments -- for the students I'm currently working with, that's TABE reading. I'm wondering what -- if anything -- teachers are doing to prepare students for this type of assessment. Are you teaching test-taking strategies? If so, how are you doing this? What strategies have you found to be most helpful to students?

Please share your ideas and strategies with us here.

Cheers, Susan

Moderator, Assessment CoP

 

Comments

Hello Susan,

 

I would be happy to share with you the Teacher Instructional Resource section of the TABE Teachers Guide for Reading and Language so you can see whether these exercises will help your students.

Please send me an email and I will reply with the PDF file.

 

Regards,

Mike Johnson
National Adult Education Manager, CTB
McGraw-Hill Education 
630-995-6712
mike.johnson@ctb.com

Hi everyone, I just saw some testing results that have me scratching my head. A student was tested on TABE Level M for reading and got a score of 6.2 GLE. A week later -- by mistake -- someone administered TABE Level E to the same student. The student's score on Level E for reading was 4.1 GLE. Of course, it was a mistake to administer two tests a week apart, and it was also a mistake to give the student two different levels of TABE reading. But these results are strange. What do you all think?!

Happy Thanksgiving!

Susan, Moderator Assessment CoP

 

Hi Susan,

There are many possibilities. For example, one (or both) of the tests may not have been properly administered. I would also ask the student what the experience was like the first time, and what it was like the second time, probing for feelings about the test-taking experience each time. For example, if the student was frustrated at being given the test a second time, s/he may have blown it off; or if s/he was stressed by external factors the second time, that could explain a lower GLE. There could be other explanations, too. (In a few weeks, ask the student's reading teacher what s/he would estimate as the student's GLE, based on direct instruction. That's often more reliable.)

Sometimes a student's self-assessment is more reliable than a standardized test. I recall a study many years ago (from the University of PA, I think) in which the TABE Locator was found to be as good a predictor for placement purposes as any of the longer TABE level assessments. The same study found that giving new students leveled reading materials to look at, along with an opportunity to talk with students currently enrolled in the various class levels, produced the most reliable placements. Of course, standardized tests like the TABE are now used not only for placement but also for reporting NRS level gains.

David J. Rosen


I agree with David that a student's condition and other factors can greatly influence testing results. If the student is a non-native speaker, there is another possible explanation.

When we used to administer the TABE to our ESL students, I noticed that occasionally (not often) students with high academic skills would actually score higher on the more difficult forms of the TABE (M and D) than on the E, and I think the reason was partly cultural and partly linguistic. At the time (pre-Form 9), the readings for the E level were more functional and less academic. So, even though the vocabulary and structures used on Forms M and D might seem more sophisticated, students who had learned and used English in an academic setting were more familiar with this style of writing. Depending on their native language, they might also have found more cognates in these readings.

I didn't do any research in this area, and I'm not familiar with any research that's been done, but occasionally students did report that they felt more comfortable with the readings at the higher levels. Of course, if the student is a native English speaker, you can throw that theory out the window. :-)

If nothing else, this student's experience is a reminder that no one test is a measure of a student's ability.

Dorothy Taylor

 

Thank you, David and Dorothy, for your thoughtful comments. These tests were administered several months ago, so I'm not sure that asking the student how she felt during testing would be very helpful at this point. Our test administrators are trained, so I would not expect the testing conditions to have been a factor.

In fact, the student is an advanced English learner whose primary language shares few cognates with English, so cognates would not have helped her. The student is not highly educated in her primary language either, but I think it's still possible that a higher-level test may seem easier -- somehow!

You are so right that the way a student feels on any given day can have a huge impact on test results. I agree completely that "no one test is a measure of a student's ability." It's always disappointing for teachers and learners when scores decrease, but there are many ways to assess a learner's progress besides a standardized test. This point underscores why formative assessment, in which students participate in assessing themselves, is so valuable.

It would be great to hear from others on this issue.

Cheers, Susan

Assessment CoP


Hi David and all, I wanted to give an update on the odd testing results I reported in my previous message. It turns out that the information was incorrect. Thankfully, the student was not given another test at the wrong level a week later; the results for the higher-level test actually belonged to a different student. This particular mystery is solved, but it doesn't solve the mystery of why students sometimes lose points after studying for months. As you noted, there can be many complex reasons for that.

Best, Susan

Moderator, Assessment CoP