ABE Learning Gains Issues

Colleagues: I recently visited the OVAE/DAEL website and found that it now has copies of the annual reports to Congress for the three Program Years 06-07, 07-08, and 08-09. In the learning gains section of the National Reporting System reports, I noticed that the national average percentage of adults moving from one level of learning up to the next was about 40 percent for ABE/ASE combined.

 

Interested in how certain states compared to the national average, I looked at California and Massachusetts as two states that have contributed a great deal to innovation in ABE/ASE. California introduced the widely used Comprehensive Adult Student Assessment System (CASAS) in the 1980s, when the movement toward competency-based education was in full swing. Massachusetts, in its turn, conducted the National Center for the Study of Adult Learning and Literacy (NCSALL) for ten years and, through the work of David Rosen, introduced the first national adult literacy advocacy discussion list on the internet. Politically, both states are considered blue, or Democratic, states. For California, the average percentages of adult learners making a gain of one level or more in ABE/ASE for the three program years 06-07, 07-08, and 08-09 were 26%, 26%, and 28% respectively. For Massachusetts the corresponding percentages were 23%, 21%, and 23%. Both states report learning achievement well below the national average of 40%.

 

For comparison, I looked at two states that are not especially known for innovations in the adult basic education system and are typically classified politically as red, or Republican, states: Texas and Mississippi. Interestingly, both states reported much higher rates of learning achievement than California or Massachusetts. For Texas, the three program years showed gains of 38%, 39%, and 49%. For Mississippi, the corresponding gains were 38%, 47%, and 57%. Setting aside the political make-up attributed to these states by various commentators, I am wondering why there are such big differences between the states generally considered innovators in adult basic education and those that are not. Indeed, in national assessments of reading in the K-12 system, Mississippi has stood at or near the bottom of educational achievement for decades. Yet it appears to be in the top tier in adult basic education.

 

Any ideas about what is going on here?

 

Tom Sticht

Comments

Tom and everyone -

Why the difference in outcomes for ABE gains? Two thoughts.

1) Most states use the TABE to measure progress. You can "teach to the test" - that is, practice only those skills that appear on the tests. The California and Massachusetts tests may be testing for concepts and the application of concepts in new situations.

2) Who is monitoring the testing? Does every site in every state do a rigorous job of enforcing tight testing procedures at every site? Are students given extra time? Do instructors give "hints"? Any of these situations could boost scores. Tight enforcement could lower scores.

You asked for some ideas...

Dorothea Steinke

 

 

Being from California, I find these statistics interesting indeed. I wonder if we should be looking at the tests used to gauge improvement. Are standardized tests really the best measure of performance? That is my big question.

As an ABE trainer for Mississippi, I can attest to the training every teacher receives in regard to the TABE. Every new teacher has a TABE Administrators' manual and is taught to follow the script to a "T". Every instructor is well versed in the procedures for testing. Many locations have group enrollments where there are several proctors in the room. At our last summer conference, a McGraw-Hill representative presented a workshop on TABE testing and standards. Both sessions were standing room only.

Since the inception of AEMS, we have worked diligently to meet our annual performance targets. Our instructors are trained to read and interpret their Table 4 and to pinpoint those students in need of post-testing and encouragement. Many of our classes are grouped by TABE scores so we have a more leveled classroom. I know that at my program, during every staff meeting we scrutinize our AEMS data. It is reviewed by instructors, directors, and state-level personnel on a monthly (if not daily) basis. We take our numbers pretty seriously down here.

Many of our students start out at a very low level and have quite a mountain to climb. I don't know if they start out lower than other states. It looks like Mr. Sticht's numbers are an overall gain; they aren't broken down by skill levels. We might find some insight if we looked at the data by skill levels.

I can't say we do anything different than any other state; we are all trying our best for our students' future. It's our future, too.

 

Linda Letherwood, Jackson Public Schools

Jackson, MS

I just now reread the messages in this item. Something struck me about the higher percentage in the Federal report compared to the low percentage in California.

Are the Feds adding up the percentage of progress from each state (and DC) and then dividing by 50 (or 51 with DC)? If so, the lower percentage of progress from California (a high-population state) counts no more than the percentage from a small state, so California's large numbers are under-represented in the national average.

If the Federal report is instead weighting by the NUMBER of people in each state (say, 3 million from California - I don't know the real number - and 150,000 from Mississippi), then the overall percentage of progress is accurate. I don't know which way the report to Congress did the numbers.
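The two averaging methods described here really can give different national figures. A minimal sketch below uses the PY 06-07 gain percentages quoted earlier in the thread for four states; the enrollment counts are invented purely for illustration, since the actual state enrollments are not given in this discussion:

```python
# Gain percentages are the PY 06-07 figures quoted in the thread.
# Enrollment counts are HYPOTHETICAL, chosen only to show the effect
# of a very large state (CA) on the two kinds of averages.
states = {
    "CA": {"enrolled": 600_000, "gain_pct": 26},
    "MA": {"enrolled":  25_000, "gain_pct": 23},
    "TX": {"enrolled": 100_000, "gain_pct": 38},
    "MS": {"enrolled":  20_000, "gain_pct": 38},
}

# Method 1: simple mean of state percentages (each STATE counts equally)
simple_mean = sum(s["gain_pct"] for s in states.values()) / len(states)

# Method 2: enrollment-weighted mean (each LEARNER counts equally)
total_enrolled = sum(s["enrolled"] for s in states.values())
weighted_mean = sum(
    s["enrolled"] * s["gain_pct"] for s in states.values()
) / total_enrolled

print(f"simple mean of state percentages: {simple_mean:.2f}%")
print(f"enrollment-weighted mean:         {weighted_mean:.2f}%")
```

With these made-up enrollments, the weighted mean is pulled down toward California's 26% while the simple state mean sits higher, so which method the report used could noticeably change the national figure.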

Another thought -

The Federal reports use the TABE because it is an easy test to measure with. People can pass the TABE (largely procedural, computation questions) and still not be able to pass the GED (application, "word problem" questions). That is because those students are missing an understanding of NUMBER RELATIONSHIPS.

If you - or your colleagues - are coming to COABE next week, I'm giving a presentation on NUMBER SENSE on Wednesday morning. You will get a 10-minute test that shows you what really basic ideas about number relationships your students may be missing.

Dorothea Steinke

 

Dorothea: No one knows what is being tested on the TABE, CASAS, ABLE, NAAL, IALS, or other general reading or math-related tests. These tests all purport to assess reading or math or literacy but are thoroughly confounded with working memory assessment, differences in background knowledge, and obscure, little-understood thinking and reasoning processes. The tests are incommensurate, yet all are purportedly measuring whatever a "level" is supposed to represent in learning. The percentages of level gains from the states do not represent the same thing on the TABE, CASAS, etc., yet they are averaged and reported to Congress as indicating gains in learning, though what has been learned is not clear.

The assessment business is caught up in the use of standardized tests that do not assess whether what has been taught is what has been learned. Instead, the tests presuppose that something called "generalization" or "transfer" will occur: even though the specific knowledge assessed on a test is not the specific knowledge taught in the classroom (teaching to the test is not proper!), for some reason and by some unknown mechanism(s) students will improve their performance on the standardized tests. Those are the strange circumstances within which programs are evaluated for producing gains in learning. Perhaps at some future time, students will be assessed to see if they have learned what was taught, instead of what was not taught, to report learning gains.

Tom Sticht 

Do you think that there could be a difference in the student population in these states?

It may be that in states where a diploma is more obviously valuable (to a teenager), students who have middling skills and a little bit of endurance are more likely to stick with it long enough to graduate. That would leave only the hardest-to-serve students for the ABE/GED programs later on--the ones who have very low skills, very difficult family backgrounds, difficulty finishing a thing once started, etc. Meanwhile, a student in another region who dropped out of high school and got a job (that maybe even required a little bit of reading or math on a daily basis) might be more likely to succeed if he or she decided to return to the classroom later on.

I also wonder how many ABE students in the country as a whole are ESL students who have improved their language skills to the point that they are tested on the TABE. I know that in our program, we have a total number of ABE students and an unofficial "true" ABE number (a "true" ABE student being one who needs to learn to read, write, do math, and/or pass the GED, as opposed to a student whose goal is still mostly to improve his or her English language comprehension and production). I would expect that the proportion of ABE students who are actually high-level ESL students varies widely across the country, though I'm not sure what effect that would have on the percentages of success in different regions. I do know that these are very different students who often end up lumped together in reports.

Or, maybe the TABE is just easier if you live in the South? ;)

Rachel