
TABE 11-12: Have you started? Impressions and suggestions?

SC has delayed starting TABE 11-12 until January 2019.


Has your program started?

Paper-Based or Computer-Based?







Christina Blair:

I am in Texas, and we are researching whether there are other options besides TABE 11-12. We are worried about the length of the test. We would be using paper-based testing. I would welcome hearing about other assessments being used.

Donna A:

Thank you for keeping this discussion going. I'm eager to hear from practitioners using the assessment. Ohio has postponed implementing TABE 11-12 (planning full implementation July 2019) until we learn more. 

Donna Albanese

State Director, Aspire Adult Ed

S Jones:

Our adult ed folks just started using it. I'm ducking across the parking lot to help out. Yes, it takes a lot of time :( The upside of that is that those willing to stick with it are probably more likely to persist.

The test itself is pretty onerous, but it's required for funding here. We're using the paper ones, I believe.

Stanley Schauer:

All the Adult Learning Center sites that are run in connection with the State office have moved to TABE 11/12. We started making the jump at the beginning of 2018 and have not given any 9/10 tests in a couple of months. We are a computer-based testing State, with some paper-based testing still sprinkled in where needed. I believe we were one of the first States to fully commit to the move from 9/10, and we did so because we felt that assessment was fairly inaccurate and not aligned to standards. Most of our students have a goal of earning a GED, and the 9/10 was not tied to that goal.

There is some adjusting that needs to happen with the new assessment. One big thing that folks here have discussed is the times. However, as you know and as we have found with our students, the full time is rarely used when a group of students is testing. We are learning how best to group our testing sessions at the site level. Once we home in on the scheduling piece, the times become a non-issue. It is more a matter of changing some processes, e.g., orientation. We used to complete the locator and the assessment on the same day or on back-to-back days. We now run the locator during orientation or student on-boarding and use it as a piece of educational background to help formulate a plan. The actual TABE test is then given during the first week of instruction, and tests two and three (if part of the plan) can be administered the following week. I will choose finding time for a better locator over using one that is inaccurate, or part of an inaccurate test, any time I am given that choice.

There are many differences between the old and the new, and becoming familiar with them was important as we transitioned. We are not experts; we are still learning and finding more effective and efficient methods.

One big thing the field (teachers, site directors, non-adult-ed folks who work with our students, etc.) was worried about was the NRS level in which students were being placed. We noticed that with the 11/12 it was a bit lower than with 9/10. The test is harder, and it needs to be to match harder standards and a GED test that also became harder. The TABE is used to measure EFL gains; it is not so important where the student starts, but rather the gains made between tests. We did have to adjust some things with partners like WSI and Job Service so they were aware of the difference in scores from old to new.

A suggestion if you are going to make the move soon: use the Locator instead of the Auto-locator until you get a feel for the exam and a chance to analyze some data. We did get some dreaded O/R (out-of-range) scores right away. That was part of the reason for giving the locator, then looking at cut scores and other educational history items we could collect on students, and then placing them in what we thought was the best test. I also highly recommend the online training that is available and the vast number of PDFs on the Insight website. These helped immensely (along with the DRC helpdesk folks) in making the move.

In summary, we are making great progress with the transition, and the determining factor (as always) was that we felt it was best for our students.


Kathy_Tracey:

Hi Stan and all, 
I think the larger question is how you use the results of the TABE test to guide class placement and instruction. I love your model, and as with anything new, we need to adjust processes. But can you share how you use the results to guide instruction, and have you seen any changes in learner outcomes with the 11/12? These outcomes might come from better classroom placement or more targeted instruction.

I'd love to hear your thoughts. 


Stanley Schauer:

Good Morning Kathy, 

This is a process that is still being worked on. As with any assessment, we are still getting that "feel." At some of our sites, and ideally at all of them, we have classes by subject, and they are leveled. Eventually it would be nice to use the TABE as a strong predictor for placement into, for example, Math 1, 2, or 3. The classes are divided by skills that come from the standards, so Math 1 might end at the Numbers and Operations domain; if a student is showing proficiency on the items that would be covered in Math 1, we would place them in Math 2. Also, with more data and analysis, we can start to look at how the scale scores relate to placement. Please keep in mind this is one piece, or tool, used in placement and guiding instruction.

Even at ALCs or sites that do not have the resources for leveled classes, I think the order in which the standards and domains are laid out in the TABE gives a solid sequence to follow for content-area progression. A good number of our students have what I call spotty or un-linked background knowledge/skills in multiple areas. I feel the TABE can give us a narrowed-down look at where that break or knowledge gap is and allow our teachers to build that connection. So yes, more targeted instruction.

I do not feel we have enough data yet to determine whether 11/12 has increased or decreased outcomes. That will come later down the road, when we can run some good data sets against each other. I feel that right out of the gate it raises expectations, similar to how the 2014 GED raised expectations with increased rigor. We are also starting to put together some crosswalks with other assessments we use to see if we can find any loose correlations (e.g., GED Ready, official GED). I also feel that alignment with core standards will only assist in raising learner outcomes.

Hope I somewhat answered your question! 

Rebekah Wagner:


Thank you for your insight on how you are utilizing the TABE 11/12. It is helpful, if for nothing else, to see that other institutions are facing the same challenges and looking toward the same future goals: achieving better and fuller student outcomes. Raising the bar for students, though it may initially deter some, I think allows us to help students reach those long-term goals. I think the wane in attendance will change when those long-term outcomes for the persistent begin to be realized and the process is more fully understood by our students. Again, thank you for the information; as you begin to make correlations between the tests, please follow up and post on that specific subject.

Jeff Goumas:

Hi Kathy, 

Great question, and one I've been trying to understand myself! The blueprints provide a breakdown of which standards are tested on each of the tests, indicating the emphasis level of each. These are organized by the College & Career Readiness domain levels, with specific standards listed within. The problem is that, in identifying proficiency levels, the TABE test report only breaks things down to the domain level and is organized by "TABE Skills." These are great because they are written in teacher- and student-friendly language, but they don't indicate which standards from the blueprints fall under each skill.

I'm providing a link to a graphic you can view/download that has two parts: the top lists the TABE Skills from one domain (Key Idea and Details) as shown on the TABE Test Report, and the bottom shows the standards breakdown of that same domain from the standards blueprints. The color-coded underlining is mine, an attempt to connect the standards to the TABE Skills.

CrowdED Learning is working with a number of publishers to aggregate their standards alignments so that instructors can see and devise learning plans for students organized by this hierarchy (Subject > Level > Domain > TABE Skill > Standards). The tool, called SkillBlox, will help instructors see which lessons and activities, including free and open resources, align to the standards. Here is a link to a downloadable image that shows a mockup of how this will work. The hierarchy shown here is Reading > Level E > Key Idea and Details > Recall Details, and the listing of resources includes a hypothetical resource from a publisher ("Reading in the Workplace") and then free and open resources that have been entered into the SkillBlox database.

Not sure how all this will go, but it's going to be our attempt to support curriculum alignment as folks transition. In the meantime, I would love to pull together folks who are interested in helping to create an "open map" that we can all share and that provides instructors with something showing the breakdown of Subject > Level > Domain > TABE Skill > Standards. I've got to believe that if we get enough folks to volunteer, we could pull that together very quickly and have a really helpful tool available to the field.
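To make the "open map" idea concrete, here is a rough sketch (in Python, with invented sample data; the standard codes and skill entries below are illustrative placeholders, not official alignments, and this is not SkillBlox's actual schema) of how a shared Subject > Level > Domain > TABE Skill > Standards map might be represented and queried:

```python
# A toy "open map" of the hierarchy:
# Subject > Level > Domain > TABE Skill > Standards.
# Every entry below is an illustrative placeholder, not an official alignment.
open_map = {
    "Reading": {
        "Level E": {
            "Key Idea and Details": {
                "Recall Details": ["RI.1.1", "RL.1.1"],
                "Main Idea": ["RI.1.2"],
            },
        },
    },
}

def standards_for(subject, level, domain, skill):
    """Return the standards aligned to one TABE Skill, or [] if unmapped."""
    try:
        return open_map[subject][level][domain][skill]
    except KeyError:
        return []

print(standards_for("Reading", "Level E", "Key Idea and Details", "Recall Details"))
```

Even a flat spreadsheet with one row per (subject, level, domain, skill, standard) combination would serve the same purpose; the point is that one shared map would let any program connect a TABE Skill on a score report back to the standards behind it.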

brettstaylor:

Thank you Stanley.  It is great to hear from someone who has been pioneering it.  I appreciate your suggestions (Locator use and interpretation with care) and have passed your comments on to others.


AmbroseRichardson:

I am in adult corrections in Idaho. One of my sites has been chosen to pilot TABE 11/12. We will be using paper/pencil. I hope to start testing next week. We are still in the planning stages but the time aspect seems to be the most challenging issue so far.

Kathy_Tracey:

I understand the length of the testing can be intimidating for our learners. It can be frustrating for someone whose goal is to get a GED; since learners are not familiar with the process, the testing can be overwhelming. I have had many students come in the first day, never to be seen again.

I believe students need to understand the 'why' of the test. They need to know that the longer the test, the better the results that can guide us in selecting the right curriculum so they can meet their goals more quickly. I am not as concerned about the length as about how we use the results. I'd love to hear more about your challenges and successes so we can continue to brainstorm and learn together.


Stefanny Chaisson:

Our program uses paper/pencil 95% of the time.

The reports take a while to get used to.

The time isn't as much of an issue as we thought. But we are only giving the Language and Math tests to start off with.

I have bombarded the help desk with any and all suggestions and questions.

If you get a "document failed" notification try turning the answer booklet upside down. 

DRC showed up at our state conference and answered some questions. It doesn't look like the test or the answer booklets will change.  My program is under the impression that they are trying to "encourage" online testing. 

Stephanie Lindberg:

Yeah, I feel the same way about the "encouragement" of online testing, but DRC hasn't really been much help in getting our computer testing going! We also have off-site locations that need paper tests. 

Lacey Miller:

Colorado implemented TABE 11/12 on July 1. Our program used the paper-based TABE 11/12 for the first time last week. We spent time before our orientation preparing materials and practicing as a staff. Our orientation process is two days, so we split the testing between them. We had a significantly smaller orientation than anticipated, but it allowed us to learn more about the testing process for the future and think through our orientation procedures. We developed in-house scoring materials that helped speed up the testing process. Most individuals completed the tests before the maximum allotted time expired.

Stephanie Lindberg:

Lacey, I'm in Colorado too. We started in July with the hope of using the computer version, but our testing center is still not able to use it. This has been pretty frustrating, as it now takes a lot more work to score; as of now, I don't know how to help students interpret scores, and we don't have a diagnostic system to focus in on specific student gaps. With 9/10, we had a diagnostic for the paper version, so after scoring we could use a diagnostic sheet to say, ah yes, this student missed 8 of the 10 questions on fractions, or whatever. We can't seem to find a similar thing for the 11/12 paper version. Does anyone know if this exists? I know that the computer version prints out a diagnostic, but what do we do with the print version? Students and instructors have found this really helpful in finding the gaps in students' knowledge. Any ideas?



Lacey Miller:

We have had many of the same questions, Stephanie. We spoke with the state about the possibility of going online with our testing, but realized it would take more time than we had available this year. We chose to stick with PBT for now. The diagnostic tools you are looking for are the Individual Diagnostic Profile forms. There is a profile for each level, which you can purchase in sets of 25. We purchased one of each, but using them also takes time. Each level is broken down by test, then by domain and skill. Item numbers are listed for each domain. You circle the items that were correct and then determine how many points were achieved. This is meant to tell you whether the individual is not proficient, partially proficient, or proficient. It really is a helpful tool for instruction and for discussing results.

Will you be at CAEPA in October? Which program do you work with in Denver? 
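For programs hand-scoring many of these profiles, the tallying Lacey describes is easy to script. Here is a minimal sketch in Python; the item-to-domain mapping, point values, and proficiency cut-offs are all made up for illustration, so use the thresholds printed on the actual Individual Diagnostic Profile:

```python
# Toy version of tallying a paper diagnostic profile.
# Item lists and cut-offs below are invented placeholders, not real TABE data.
DOMAIN_ITEMS = {
    "Key Idea and Details": [1, 4, 7, 12, 15],
    "Craft and Structure": [2, 5, 9, 13],
}

# (minimum points for "Partially Proficient", minimum points for "Proficient")
CUTOFFS = {
    "Key Idea and Details": (2, 4),
    "Craft and Structure": (2, 3),
}

def score_domains(correct_items):
    """Given the set of item numbers a student answered correctly,
    return {domain: (points, proficiency label)}."""
    results = {}
    for domain, items in DOMAIN_ITEMS.items():
        points = sum(1 for item in items if item in correct_items)
        partial, full = CUTOFFS[domain]
        if points >= full:
            label = "Proficient"
        elif points >= partial:
            label = "Partially Proficient"
        else:
            label = "Not Proficient"
        results[domain] = (points, label)
    return results

print(score_domains({1, 4, 7, 2}))
```

This mirrors the circle-and-count process: each circled item number adds a point to its domain, and the total is compared against the profile's cut scores.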

Bill Ritter:

We have exclusively used computer-based TABE 11/12 for pre-testing since April 1, 2018. We continued to use TABE 9/10 for post-testing through the end of the 17/18 fiscal year. We are now entirely on 11/12. Student scores and achievement have dropped dramatically. As you know, the test is long, about 6 hours for each version (with the Locator). I am sure we will navigate this like we did the change in GED testing, but so far it's been a rough road. I would be very interested in speaking with others about "+" and "-" scores and what your program is doing about them (entering them in NRS, re-testing, etc.). I need more information about scores falling outside the ranges the levels are supposed to cover. I have reviewed everything I can find from the publisher and can't seem to find what I am looking for.

Stefanny Chaisson:

We received an email saying that they would be doing away with the O/R scores but that we should be retesting the +/- scores; we didn't retest on 9/10, and I don't see us retesting on 11/12. We are only testing Language and Math.

Natalie Reigle:

Was the email you received from DRC?

Stefanny Chaisson:

I think it came from my state. I'm sorry, I have been unable to recover it.

Karen Kirchler:

Hello all,

In Georgia we will be required to move to full implementation of TABE 11/12 by April 2019. Computer-based testing is required, with paper-based testing being the exception. We are trying to think through concerns around logistics for testing, training and prepping instructors, and identifying the best resources. I'm curious which states have implemented TABE 11/12 and what insights/lessons learned can be shared. We'd appreciate any feedback. Feel free to contact me directly if you feel like having a conversation. We have significant concerns, especially regarding logistics.


brettstaylor:

Have you got access so you can read all the test directions, etc.?

Thanks for posting on this topic.

I am leaving my position as Training Specialist for South Carolina Adult Ed in December.  If you might have a need for help around Georgia between January and April implementation I am available. My personal email is and my cell is 803-230-1069.  Currently, I do not have anything set to do come January, but I am looking around.


Laura Granelli:

I am an adult educator at a local community college in NJ. The new TABE is awful! Classes are held on a M/W or T/Th schedule, and between the intake form, the Locator, and the TABE, it is sometimes 2 weeks before everything is finished, IF the students return. Imagine being someone who has decided after 20 years to return, and all I do is test them! It is very frustrating! We are using paper and pencil; if we had to code in the front page for the computer to score, I cannot imagine the additional time! Our supplies and classroom space are limited (I even spend 2 days a week at the public library), so there are other stressors. My classroom student list is constantly changing: who finished the test, who is still testing, and who can go on with instruction. VERY FRUSTRATING!

David J. Rosen:

Hello Laura,

Thanks for joining the Program Management group, and for seeking some help with the challenges you face in using the new TABE 11 - 12 test. I hope some TABE users here can offer help.

I am not an expert on the new TABE, so I can't offer any advice on the test itself or how the paper and pencil version is supposed to be administered; however, I wonder if you could explain which of the challenges you experience have to do with the test itself, or with regulations from the test-maker on how it must be administered, and which, you believe, are a result of how the test is administered at your community college. It might help to know if the challenges are the result of community college decisions rather than requirements from the test maker, although if you are a test center administrator you may already know this. If not, it might help you to understand what the test maker does require of test center administrators. Perhaps a TABE representative could help clarify this for you and others in this discussion; for example, if the TABE 11-12 test administrator guidelines are public, by posting a link to them in this discussion.

One argument for computer-based testing is that it can reduce the time for getting test results to administrators and students. I wonder what others' experience might be with the computer-based version of the new TABE 11-12. If Laura's community college switched to the computer-based version, do you think it would be helpful in addressing any of the issues she raised?

Are others here experiencing some of the challenges that Laura raises, for example, that it is difficult to keep track of which test-takers have started, which are still testing, and which ones have finished? If so, how do you handle that problem? Do you use a spreadsheet? If so, do you have a customized version that you could share with others here who may benefit? Do test administrators have access to an online tool from the test-maker that makes this easier?

Laura, and others who use the paper and pencil version of the test, can you tell us why you chose this option? What are the constraints you face that make it difficult or impossible to use the computer-based version?


David J. Rosen, Moderator

LINCS CoP Program Management group

Laura Granelli:

We had one meeting with someone from DRC, who explained the new TABE; however, we did not get to see the new booklets. To me the biggest problem is that, for a student to complete the Locator and the TABE, we are talking about at least 6 hours... and then they have intake paperwork and a short orientation, and all of this testing does NOT count toward the GED. And I thought the PARCC was ridiculous! I could teach my students with only the info from the Locator. However, I believe the Dept. of Labor needs the additional info; I do not.

Stefanny Chaisson:

While computer-based testing does streamline the process, I personally have found it quite a feat to get students enrolled online and then assign the test. If a student doesn't show, you have to reassign them. You have to make sure they are comfortable with the computer... I have people who do not know how to operate a mouse. In addition, with limited funding and room, I would have to have a full-time person doing nonstop testing for the 1,100 students I served last year. I can test 40 people in one day with paper and pencil. I could only test 16 a day (that is 2 testing sessions) with my current computer situation.

We are locating everything, but only testing Math and Language.

Laura Granelli:

In reading through this chain of replies, I am very interested in how you found out which test questions test which skill. Other than sitting with a question booklet and comparing how each student answered each question, how is this done? I am testing paper and pencil; the testing times are arduous, and the process can take me weeks. I am the lone instructor and have limited computer access, so even if the students were taking the test online, it could be an adventure, to say the least. I am following suggestions and think I will switch to the Locator and just Math and Language...

I have lots of experience teaching, 30+ years in the public schools, but I have not taken the time to look at each individual question. As it is, I have an ABE section on M/W and move into GED on T/Th. But I allow the students to show up when they can. With so many returning after years away, I try to make it as easy as I can.

I am also dealing with a large ESL population. Some feel they can complete one year of ESL and go right to GED. They are allowed to test in Spanish, but I have no bilingual materials and I cannot speak Spanish.

I can prepare anyone for the test without the TABE; the test is simply for funding and to prove growth, in my humble opinion.

Laura Granelli:

OK, people are finding the same stressors about time... however, I have not been able to figure out how to know what skills an individual student might be lacking. For instance, let's just say a student missed #5 on the reading test. The question dealt with the skill of main idea. If they got it wrong, then I know to go over that skill. Now, how do I do this? We are paper-and-pencil, and I am the only instructor. I cannot level my classes, as there are no other rooms available. I teach whole-class lessons on the 4 subjects, then they break out for individual practice in math. To me, nothing has changed from 9/10 to 11/12 other than the frustratingly long pretesting period.

Natalia Devlin:

Laura, I respectfully disagree that TABE is only useful for funding: workforce centers use TABE to determine someone's eligibility for vocational training or apprenticeship. In Colorado, Denver Public Schools uses TABE to identify highly-qualified talent for Teacher's Aide positions. And I am aware of at least two community colleges in Colorado that accept TABE scores instead of Accuplacer, not for all programs, but nevertheless.

Natalia Devlin:

I am in Commerce City, CO, and we use computer-based TABE 11/12 through DRC. I would say it has not been that big of a deal for us. Our staff took significant chunks of their time last year to learn about TABE 11/12, but also to analyze the competencies and create learning plans that align our instructional resources to TABE 11/12.

Sure, computer-based testing presents some administrative challenges, but this is our second year doing computer-based testing across the board. Last year, we used TABE 9/10 for Adult Basic and Secondary Ed classes, and we used CASAS for ESL courses and for Bridge to ABE courses. But this fiscal year, we moved all Bridge to ABE, as well as ABE/ASE students to TABE 11/12. The challenging part was to ensure that our computers are updated in a timely manner and that our partners provide us with WiFi access, which they do. Another factor to consider is how many computers you have and how many students you typically register. Our Program Director has purchased additional Chromebooks this year to use in the classroom and for testing. We typically have about 10 students registering per day, and so, we try to have 10-20 computers available.  

Of course, occasionally, we enroll adults with low levels of literacy or specifically low levels of computer literacy. This is where what you say before the test is important. We explain that students would not need to type a lot or do any complicated tasks. They just have to point and click. And knowing that someone is potentially a low-literacy student, we watch them more closely as they start the test.

Our students complete the initial testing in 2 days. On the first day, they complete the intake form, attend orientation and start testing. That day they manage to get through the Reading parts of the test. Reading includes practice items, the Locator and two parts of the test. We ask them to stop after they complete Reading - Part II. On the second day, students complete all parts of Math. (Our state does not require the Language section of the test). The entire pre-testing on Reading and Math takes about 4 hours.

We print out the Individual Portfolio form from DRC for each student and their teacher. The first week of classes, students review the syllabus for their course, as well as their assessment results. In fact, we designed our syllabi to be learner-centered: students can review their TABE results and check off on their syllabi the topics they have demonstrated proficiency in. The first week is also the goal-setting week. We use TABE results to formulate our academic goals. This is the practice we adopted last year in all our classes, and it seems to work well. Students are more aware of what their TABE scores mean. And I am actually excited about the Individual Profiles for TABE 11/12, as they provide a detailed list of the competencies students were tested on.

One important observation: students score lower on TABE 11/12 than they did on 9/10. This causes some frustration in both learners and instructional staff. However, we just keep explaining that TABE 11/12 is more aligned with today's learning outcomes. This is what K-12 graduates are expected to know and do, so why set a lower bar for adult learners?

So, in summary, I think it is important to prepare for TABE 11/12, especially if you are adopting computer-based testing. I would make sure all proctors and admins are trained in time, scripts and presentation slides are ready to go and all stakeholders buy into computer-based testing as an effective assessment tool. Without that, testing will be frustrating to all parties involved.

Laura Granelli:

As I read the 2 posts above, my guess is that my most extreme frustration is that I AM IT, I AM THE LONE INSTRUCTOR. There is no one else in the building when I teach GED who does what I do. One other instructor takes one of the night classes, but we are it.


I am also reading along and am not clear on some of the procedures, acronyms (like: what is O/R?), and materials you have available.


This makes me sad for adult learners in NJ, where the passing score is higher than in 47 other states, trying to get ahead with just me.


I pretest while teaching, and I differentiate as best I can. I score and keep track of my own paperwork and attendance; it is a three-hour stint on roller skates.

NOW, everything said, I love what I do. I had about 200 students pass through the doors... maybe 10% made it! Some came once or twice, some were absent for months, some practiced on their own...

David J. Rosen:

Hello Laura and others,

From my limited experience with high school equivalency exams, I believe the accepted practice is to separate formal pre- and post-testing from instruction, at least to be sure that the person handling the testing is not the teacher of the same students. I am not clear from what you have said if that is not the case in your situation, but perhaps we could have Mike Johnson from DRC describe or point us to their recommended or required procedures regarding testing. 

David J. Rosen, Moderator

LINCS CoP Program Management group



brettstaylor:

"Examiners must not be involved in any related preparatory course."

This quote is from page 44 (p. 50 of the PDF) of the TABE 11 & 12 Test Administrator Manual posted on the site, updated 4/12/18.

(username and password required)

I was told that someone at DRC gave oral permission for a state to view this as a suggestion, a preferred situation. Many programs have their teachers giving the TABE tests, but yes, David, the standard I was taught was no learning and testing in the same room at the same time. This is a different scenario, and clarification from DRC would be welcomed.



Natalia Devlin:

How is it even possible to get an accurate result if students are testing in the same room where instruction is taking place? How do such programs report instructional time?

I think it makes sense to dedicate some time to just testing. I have heard that other programs hire temporary staff for proctoring. They train them and everything, but only use them for testing.

In our program, we have extra staff trained on both assessments we use (CASAS and TABE 11/12).

Natalia Devlin:

Hello David,

That is how we do it, as well: We do pre-testing before we place students in classes.

Sounds like Laura has minimal support, and that explains why it is so challenging for her program.

Rachel Donelson:

Hey Natalia,

We're looking into using the online TABE 11/12, but we don't need all subject areas for our program. I'm curious whether you are using the Locator or the Auto-Locator, and whether you're able to tailor the locator to level students only in the subjects they need. It's not clear to me from the online training materials what the online locator consists of, and whether the manually set-up locator includes different content than the Auto-Locator. We've also had some issues getting consistent information from the reps.

So, do you (or anybody else reading this thread) know if it's possible to give them the Auto-Locator in just Reading, for example, kind of like how the CASAS eTest is set up? Also, what is your experience with the Locator being an accurate predictor of student level? Our adult ed students often rush through the locator on the CASAS and then end up with an out-of-range score on the level test because they're taking a level test that's too easy. Have you noticed that problem with the TABE, especially since it's such a long test? Are they itching to get through it?


Stefanny Chaisson:

I have used the online briefly. You should be able to auto-locate with reading only. 

Pam Young:


West Virginia will be making the transition July 2019. There is no choice but to move to TABE 11/12 due to NRS requirements. 

We have had meetings with Mike Johnson from DRC about TABE 11/12.  It is important to note that the Literacy Level for TABE 11/12 is absolutely not at a beginning  literacy level.   We will continue to use CASAS for literacy purposes.  

Pam Young


Adam Woodrow Nathanson:

Hi Pam,

Was it Level L or Level E that you found to be above beginning literacy?



Rachel Baron:

I'm waiting to see if CASAS gets their math test approved before we need to buy tests to start pre-testing for next program year. It rubs me the wrong way to have no choice of test at all (especially since whatever we choose, we'll likely be sticking with it for 5-10+ years). I want to at least be able to pick my poison.

Whichever test we choose, we're almost certainly going to switch from all paper to (mostly) computer-based testing. Our state administration is strongly encouraging computer-based because there's less room for administrative error. It's also the only way that I can imagine giving different TABE 11/12 levels simultaneously, since there are so many different timings.

Our program has teacher-taught classrooms and volunteer tutors who mostly work one-to-one out in the community (libraries, coffee shops, etc.), so we have a lot of thinking to do about how much group administration we'll be able to do. As it stands, a lot of our tutoring students get post-tested individually at their tutoring site or in a coordinator's office. That could be difficult/impossible if the test takes 3-4 hours to administer. We might need to start requiring students to come to post-testing sessions, but we'll still end up doing some individual administrations, I'm sure. Some of our tutoring students have complicated availability, which is why they're in the tutoring program to begin with. At least the coordinator can catch up on paperwork while giving the test...

Currently, we set aside a couple of classes at the end of our managed enrollment sessions to test students who have their hours (and to give practice HSE tests if appropriate). We might need to reconsider this, too, since it could start seriously cutting into the session.

We're going to have to lengthen our orientation and spread it out over more days to break up the testing, but we might actually have to cut back the amount of time we have to talk about the program and the students' options. That's also up in the air.

I will say that I am excited to finally be able to use a test that is better aligned with our students' HSE goals. I'm just ready to be on the other side of the transition!

Susan Finn Miller's picture

Hello Rachel, Thank you for sharing your thinking process as your program transitions to TABE 11/12. I am certain that many other providers are going through the same thing, including my own. I do think that computer-based testing has benefits. Moreover, I agree that having a test that is better aligned is positive. However, the length of the test definitely requires careful thought about needed changes to orientation.

It would be interesting to hear from others regarding their experiences thus far with the question of alignment with the HSE tests as well as how programs are structuring orientation.

Cheers, Susan Finn Miller

Moderator, Teaching & Learning CoP

Rachel Baron's picture

I just got notice that the CASAS GOALS Math has finally been approved.  It's too late for me, since we already decided to go with TABE, but for those of you who haven't made a decision yet, you can at least look at two options.

David Reynolds's picture

Hello all

I am the assessment coordinator for a small program.  We have been using the 11/12 on the computer for a bit over six months now.

In regards to the time component, while the new test is taking longer, it is not as bad as I initially feared.  

The real frustration that we are feeling is the reports, which leads to my observations and wondering if anybody else has seen the same:

1. We are getting post tests with low out-of-range (O/R) scores.  DRC is redacting scale scores and NRS levels when this happens.  We have no idea if the student is one point or twenty-one points off.  The redacted scores are also starting to cause a logjam of data, as we have no scores to report in our state system for progress tests.

2. The Performance on Domains section is far too obtuse.  We have had students make gains yet the check marks do not move from test to test.  Others have check marks rapidly swinging with no real movement in scale score.  It also does not help that a single check mark can cover six or more skills.

3. Recently I noticed that (-) and (+) notations are proving to be unreliable.  One student tested in range on an initial and O/R on a progress.  The difference was one point yet there was no indication that the student was on the edge.

4.  While digging into the above issue I found two more issues - first, the DRC report numbers on points v. scale score in some cases are not matching up with DRC published charts.  Second, I found eleven initial tests where the reports in no way indicated the student would benefit from taking a higher level test, yet they were maxed out in the possible NRS level, making it impossible for that student to achieve an NRS level gain in that subject.  Eleven tests doesn't sound like much until you realize that I have only 32 students that have taken the 11/12.

These last two findings are what sent me to the internet looking to see if anybody else has seen this.


Thank you


Susan Finn Miller's picture

Hello Dave, Thank you for sharing the challenges and confusion you are facing with the TABE 11/12 reports. I'm certain others will weigh in on this important issue. Members, please let us know if you are seeing anything similar. Do you have any guidance for Dave and the rest of us?

Cheers, Susan Finn Miller

Moderator, Teaching & Learning CoP

Stefanny Chaisson's picture

1. We get one or two  O/R scores each testing session. We have 3 or 4 sessions a month. We are retesting them after they complete orientation. We are only testing Language and Math; rarely are they O/R on both. When you look at other scores or have a conversation, you can get a good idea if they gave up or if they somehow strangely tested high on the locator. Most of the time, it is a borderline Locator score.

2. We agree the standards are too obtuse. TABE is nervous we will "teach the test" and are reluctant to narrow the scope.

3. We are ignoring the -/+. There is no point if you are paying attention to the Form and NRS Level.

4. a. I haven't noticed the points/ss not matching the charts. Make sure you are in the correct form, level, and subject. b. For example, Language M level and an NRS of 4? Is that what you mean? Yes it happens. We are trying not to place students in the subjects that have this issue. Especially since we have just received the few aligned materials.



David Reynolds's picture

Hi Stefanny

Sorry for the late reply; I've had my head down in other projects.  This image shows what I am talking about with things not matching.  Granted, a single digit off is not exactly groundbreaking, but it makes me wonder what other details are getting missed.

Stefanny Chaisson's picture

That is a great example! Now that you've said that, we have found the same thing! I remember being told that some questions have more weight than others, especially the multiple-answer ones, so I chalked it up to that. 

KenZutter's picture

We are computer-based only and have fully transitioned from 9/10 to 11/12 this year.

We wholeheartedly agree with the O/R, +/-, and diagnostic issues.  However, my biggest gripe is with the DRC Insight Portal. This has to be the most USER UNFRIENDLY program in the world. Click and wait, click and wait. Even if you made no changes to a dialog and click Cancel, you have to wait. The whole concept is backward from how we deal with testing. It is testing-session based: you create sessions and add students to the session. I am sure this works for DRC's whole K-12 district-wide testing, but not for how we test in Adult Ed. We have students that we assign tests to, not tests that we assign students to.

Length of testing has always been an issue, but the TABE was the only game in town. This is now changed with CASAS GOALS Math coming on board. We have our fingers crossed that the State can get it fast-tracked into their assessment policy and our database vendor can get it onto their system before July 1. Of course, we will have to determine if it meets our instructional needs, but since our teachers have little use for the limited TABE diagnostics, I think it will come out ahead.

Stefanny Chaisson's picture

"We have students that we assign tests to, not tests that we assign students to".

I have told my admin that we personally do not have the capacity for computer-based testing (we only have 7 computers in the room), and it is so time-consuming to enter new students. I would love to know what your procedures are for entering students and testing them.

sadkins2009's picture

West Virginia is working to train and start implementation of TABE 11/12 beginning on July 1, 2019. We are encountering some questions from partner agencies about setting new entry scores for their programs, and we aren't quite sure how to help them since TABE 11/12 is so new. Additionally, feedback that I am reading in this discussion says that users are seeing lower scores than on TABE 9/10. 

For example, many Workforce programs set a cutoff for entry at a 9.0 grade level. So, with TABE 11/12 not having a "grade level" and it seemingly being more difficult, what are your Workforce or CTE programs using for cutoff scores?

Please share, we'd love to have your feedback!

Nicole Worley's picture

Are any of you using the Level L test?

We have students who tested at this level in TABE 9/10 who we expect will need level L for 11/12.

Upon reviewing the Level L in TABE 11/12, it appears--based on Page 7 of the Scoring Guide--that a student can get a Raw Score (NC) of 0 and be given a Scale Score of 300!?  Confused--Based on the NRS table, 300 appears to be a valid score? Is that true?  I do not see an out of range chart anywhere. Can someone really get 0 questions correct on Level L and be given a valid score? I feel I must be missing something.


Rachel Baron's picture

Hi Nicole,

I just got the Scoring Guide for the 11/12, and it looks like the scores are on a 300-800 scale, so 300 is basically the same as 0. Since the L is the lowest level test, there would be no point in having an out of range score, since there is no lower test that could be administered to a student.

If you have a student who really got 0 on the L, have you done any informal assessment to determine whether that seems accurate to you? (Ex: word list, writing sample, etc.) Is this a native English speaker? Did the student understand how the test works? We haven't tested anyone yet, and I'm curious how common a score of 300 might be...

I'm curious to hear your observations.


brettstaylor's picture

Despite the many issues associated with administration and specific results reporting, how have the post-test results been, a lot of GAINS??

Any thoughts/observations on HSE test performance in relation to TABE levels?


PS- I was laid off as SC Adult Ed Training Specialist at the end of, just in case...seeking a new position.

Stefanny Chaisson's picture

We are showing gains. Not like we were before! However with the lack of materials available, I am impressed at how many gains we are making!

I find TABE is also more reliable when it comes down to predicting success on the HiSET test. 

Julie Neff-Encinas's picture

We've been using TABE 11/12 since May 1, 2018 for those who had already shown a gain in FY18 using the 9/10 and for new enrollees.  As of 1/1/19, it is the only test allowed under any circumstance.  We saw a drop in NRS level almost completely across the board for all.  With progress testing we're really struggling to get gains.  One thing we're noticing is that when the locator and test put someone at the top of the test level, such as at an NRS 3, and that test only goes that far, the progress test, which has to be the same test level, won't allow a gain!  They simply give us the highest scale score for the level and the + sign, but retesting at a higher level isn't considered the right action for a progress test.  (Are we alone in seeing and thinking this?)

We've also seen things like students who score 37 out of 47 points and get scale score 533 and then on the retest they get 41 out of 47 points and yet a scale score of 529... yes lower.  (I'm not looking at the exact report but giving sample figures that demonstrate the concept.)  How is this real?  How is this fair to the student?

Frustration is our major emotion lately.  We do not believe in buying materials geared directly to the TABE.  We teach holistically for real life improvements and to achieve on the GED(R) test.  Anyone seeing what we are?


Stefanny Chaisson's picture

"One thing we're noticing is that when the locator and test put someone at the top of the test level such as at a NRS 3 and that test only goes that far, the progress test, which has to be same the test level, won't allow a gain!  They simply give us the highest scale score for the level and the + sign, but retesting at a higher level isn't considered the right action for a progress test."

Am I right in understanding that if the pretest is an E Level and an NRS of 3, you are NOT testing M Level? It is impossible to get a 4 on an E level. You HAVE to test an M. Which means you have to teach the M level standards. This holds true for each level. The purpose is to show they are progressing through the levels, why wouldn't it be ok to post test them on the next level?

I understand not "teaching the TABE" however, the TABE test (from what I understand, we do HiSET) is more closely geared towards HSE. We find a way to hit on common standards on the TABE and HiSET.

Julie Neff-Encinas's picture

I will not attempt to defend this policy, but it is as I am saying.  We must test at the same level as the initial test.

Stefanny Chaisson's picture

What did you do when this happened on the 9/10? Is this a state guideline? I am fascinated now!

Julie Neff-Encinas's picture

Here are the exact words from the assessment policy:

Progress tests must be administered in the same content areas as the initial assessments. A progress test must be given in the alternate test form but same test level as the initial assessment. For instance, if the initial assessment was TABE 11, Level M, the progress test must be TABE 12, Level M. Measurable Skill Gain by pre-/post-assessment is determined by comparing the learner’s initial assessment with a valid post assessment.


We don't recall hitting the same problem with 9/10 as much.  Not sure why and my assessment coordinator is gone to a conference this week, so I can't double check with him.

David Reynolds's picture

With the 9/10 a student could always show a gain provided the pre-test they took was "in range".  If they maxed out on the subsequent post test the system would produce an NRS level above what the student was at on the pre-test.  However, the spread on the 11/12 can set up a situation where it is possible for a student to get a better score that is in range without the possibility of an NRS gain.

The Math M scoring guide is sitting in front of me, so that is what I will use.  This test covers an NRS spread of Level 2 through 4.  If a student gets a raw score of 27 through 33 (scale score of 541 through 588), they will be scored at NRS Level 4.  This point spread is all within what is considered in range.  Scores from 34 to 39 (scale scores up to 595) should trigger the +, indicating the student should take the next level up, yet still produce an NRS Level 4.  This sets up a situation where the student is showing growth yet the program cannot show it until the next progress test if they assign the next level, in this case a D.  

For example:  JimBob McFreeny comes in and takes his pretest.  On the Math section, the locator places him in the M level and he takes that test.  JimBob has some math skills, scores a 542 scale score, and gets an NRS Level 4.  This score is in range and therefore must be used.  40 hours later, JimBob takes his progress test, which must be at the same level according to state rules, and scores a 595+.  Even though he has not only improved his score but maxed out the M level math test, JimBob does not get a gain until I can give him the D test on his second progress test in another 40 hours.

This is true across all subjects and levels of the TABE 11/12.
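David's walkthrough can be sketched as a few lines of code. This is a rough illustration only, built from the figures in his post (raw 27-33 maps to scale 541-588, in range at NRS 4; raw 34-39 runs up to 595+ but still reports NRS 4); the function name and the NRS 3 cutoff used here are hypothetical, not DRC's actual scoring tables.

```python
# Toy sketch of the "gain-proof" zone on TABE 11/12 Math Level M.
# Scale-score figures come from the post above; the NRS 3 floor is
# a made-up placeholder purely for illustration.

def nrs_level_math_m(scale_score):
    """Hypothetical NRS mapping for Math Level M only (NRS 2-4 per the post)."""
    if scale_score >= 541:    # in-range floor of NRS Level 4 on this form
        return 4
    elif scale_score >= 500:  # hypothetical NRS 3 floor, for illustration
        return 3
    return 2

pretest = nrs_level_math_m(542)   # JimBob's in-range pretest score
posttest = nrs_level_math_m(595)  # maxed-out Level M progress test ("595+")
gain = posttest > pretest

print(pretest, posttest, gain)    # 4 4 False -- no NRS gain is possible
```

Because both ends of the 541-595 band sit inside NRS Level 4, any pretest that lands in that band leaves no higher level the same form can report, which is exactly the black hole being described.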

Stefanny Chaisson's picture

What would you do if they scored a 9.0 on the M level TABE 9/10?


Right! We give JimBob the D level test for his progress test. 

Is the mandate about the level of the progress test a state or NRS requirement? We have been specifically told that we can NOT test on a lower level but we can test a higher level.

David Reynolds's picture

What would you do if they scored a 9.0 on the M level TABE 9/10?

This question does not relate well to the conversation on 11/12 as grade equivalency is not included in the reports.  Having said that, if while taking a progress test a student maxed out a level on the 9/10 it would always produce a gain on the NRS level from the pre-test.  In other words, it was always possible to get an NRS gain.


Right! We give JimBob the D level test for his progress test. 

Is the mandate about the level of the progress test a state or NRS requirement? We have been specifically told that we can NOT test on a lower level but we can test a higher level.

Unfortunately, due to state rules, we cannot change the test level on the first progress test.  Because JimBob's pre-test was in range on the M, he must take the M on his first progress test.  In the scenario I presented (which has happened a few times already), it is impossible for JimBob to earn an NRS/EFL gain on his first progress test.  This raises a problem from a program standpoint because, other than passing all GED tests, NRS/EFL gains are the only metric that Arizona uses when evaluating a program.

Yes, I can give JimBob a D level test on his second progress test, however this means another 40 hours of seat time (now 80 hours from the pre-test) before he has an opportunity to earn a gain.

Stefanny Chaisson's picture

That is horrific! It sounds as if y'all need someone in high places who understands the groundwork. I'm sorry for the 3rd degree; I am relatively new and trying to figure out what is a state vs. federal mandate!

susan.morss's picture

Now that we are aware of these "Black Holes" where a student's initial test places him/her in an NRS level where he/she cannot make a gain, we are retesting at initial testing time, moving the student up to the higher TABE level so that he/she is able to make a gain. DRC just sent out their Scoring Best Practices this week, which suggests allowing a progress test to be in the next level up from the initial. Until the State Assessment policy is revised to state this, I am retesting during initial placement testing. It is pretty awful, and we were led to believe that there would be fewer retests due to the longer Locator. But no one told us about the black holes. It wasn't until we saw a student on her second progress test still scoring 595+ that I started researching what was going on. Illinois and Florida seem to be ahead of this curve. I recommend you look at page 10 in this presentation. It has the scale scores and NRS levels clearly marked.


David Reynolds's picture

Thanks for the info, I missed the notice from DRC on the best practices.

It is pretty awful and we were led to believe that there would be fewer retests due to the longer Locator. 

On the longer locator test: I recently attended a DRC session where they stated that one thing they think is throwing off the locator is that students are taking too long, and they are looking at shortening the time available.  The presenter also recommended telling students to skip problems they don't know on the locator rather than attempt them.


Danielle Cox's picture

But no one told us about the black holes. It wasn't until we saw a student on her second progress test still scoring 595+ that I started researching what was going on. Illinois and Florida seem to be ahead of this curve.


Glad to hear others are having the same issues as we did in MD. We noticed these "black holes" last May when we were setting up our certificate levels to coincide with NRS levels, prior to test implementation on July 1, 2018. It was pretty crazy the first few months. We decided that if the student received a negative scale score or O/R on the initial placement test, we would immediately retest them at the next lowest level (e.g., if a student scores negative or O/R on the D test, we give them the M). If they scored in the positive range, we immediately tested them on the next level up (e.g., if they score positive on the M test, we give them the D). We then used the score on the second test for class placement. This gave the student an opportunity to make a gain on their post (or progress) test. We made this policy, in addition to many other procedures related to TABE 11/12, and trained all teachers. 
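The retest rule described above boils down to a simple one-step move through the TABE level ladder. Here is a minimal sketch of that logic, assuming the standard TABE 11/12 level sequence; the function name and result labels are hypothetical, not anything from DRC.

```python
# Sketch of the MD-style immediate-retest rule: a negative/O/R result on the
# initial placement test drops the student one level, a positive result bumps
# them one level, and the second test is used for class placement.

LEVELS = ["L", "E", "M", "D", "A"]  # TABE 11/12 levels, lowest to highest

def immediate_retest_level(initial_level, result):
    """Return the level for an immediate retest, or None if none applies."""
    i = LEVELS.index(initial_level)
    if result in ("negative", "O/R") and i > 0:
        return LEVELS[i - 1]   # e.g. O/R on the D -> retest on the M
    if result == "positive" and i < len(LEVELS) - 1:
        return LEVELS[i + 1]   # e.g. positive on the M -> retest on the D
    return None                # in-range score, or no lower/higher level

print(immediate_retest_level("D", "O/R"))       # M
print(immediate_retest_level("M", "positive"))  # D
```

Note that Level L has no lower level to fall back to, which lines up with the earlier observation in this thread that L has no out-of-range chart.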

Stephanie Lindberg's picture

We are just starting to complete post-testing on TABE 11/12 for the first time. The first student to post test was Located into Level M originally. The student scored O/R on the post test in Level M, so had to be re-tested in Level E. We suspected other students who tested in the low range on Level M could also potentially score O/R on a post test. It's a small sample size so far, but that's what has happened. 

We have more of an issue of people in the pre-test process testing O/R on the low end, not the high end. For someone who tested in the + range, we would expect to choose the next higher level on a post-test. That's what we did with 9/10 as well (though, it was also a small portion of our post-testers). 

We did see NRS drops with students between 9/10 and 11/12 (the majority one level drop, i.e. NRS 4 to NRS 3), like we thought might happen because of increased rigor in the levels themselves. A Level M book on 11/12 is more difficult than a Level M 9/10. I'm curious to see what gains are like on 11/12 post tests. 

I don't know whether it's a better predictor of HSE readiness yet. I have students who dropped in NRS on the 11/12 but have been successfully passing HSE tests with high scores. Could this be attributed to students not taking the TABE 11/12 seriously? Possibly. I don't know if this is a problem in other programs or not, but there are definitely students who do not try their hardest, and when they get into a classroom, it's often evident that the TABE was not an accurate assessment of their skills. Students who experienced the changeover from 9/10 to the 11/12 (so not students who started the program on 11/12) could have just thought it wasn't a big deal and didn't try as hard as they could have. Or, it could be that 11/12 is more difficult, so it is a truer measurement of skills at a specific level. 

I'm excited for the increased rigor of TABE 11/12, but I'm also anxious to see if students will actually show gains in the post-test. 



Rachel Baron's picture

I just completed the first part of the Pennsylvania TABE 11/12 training (haven't done the observation part yet), and we are instructed that if a student gets a + score on the pretest, they should be posttested on the next level test. So, there's no need to retest immediately; we can still wait until the student has the required hours of instruction.

If someone gets an O/R score, we have to retest immediately on a lower level test. If they get a - score, then we're advised to pay special attention to teaching the student the material needed at that level. I know that timing tests can be tricky enough as it is, but I'm thinking that folks with - scores might do better if they get more hours of instruction before retesting, so if I'm on the fence about whether to test a student and I see that they have a - score, I might decide to wait. I'm curious to hear if you are seeing these students move up into the regular score range on the same test, or if you're retesting them...

It's interesting to hear that they're considering shortening the locator--If it means more accurate test placement and shorter initial testing times, I'm in favor! It's a pain  to retest students on the 9/10. I can only imagine that it would be worse with the 11/12, since it's so much longer. Given the way retesting works in PA, it seems like it would be preferable to have students in a slightly easier test rather than a harder one. They can always move up next time.

It's always interesting to me to hear how different states write their policies. I feel bad for those of you who have to retest both + and - scores... and I especially feel bad for your students! I hope that your states decide to follow DRC's advice in this matter. Things are crazy enough as it is without adding more hoops to jump through.

Laura Granelli's picture

A couple of questions. I understand that if they pretest in Level D, they should post-test in Level D. And yes, I agree that if they earned the highest score in D on the pretest, it will be difficult to make a gain. When the TABE itself was less time-consuming, you could re-pretest students at the next level. However, the new TABE is way too long.


My question is... the student pretests at D4. They attend classes for 6 months or more with great success. They complete the GED READY with better-than-average scores and complete their GED in month #7. Their attendance is spotty, so as the instructor, I take what I can get, and in 6 months they have only accrued maybe 90-100 hours, so a post test is needed. However, since they earned their high school diploma, they do not return.


Now, I have been told that will not count as a gain! This really wracks my brain! I have been told they must be level 5 or 6 for the GED to count as a gain @&#$÷!



Sarah Michaels's picture


Can you post-test at 50 hours? We are allowed to in our state, and we almost always do. If we waited until 90-100 hours, many of our students would already be done with their HSEC, too.

Laura Granelli's picture

Yes, I certainly could test in fewer hours, but I forget. I am all by myself, and I enter the classroom and start my "act". There have been multiple times that I have set up the TABE booklet and answer sheet and neglected to give it to the student, or they are absent and then I really forget. The administration is in another location, and I do get emails and info from them reminding me.


There is also the rare instance where a student has spotty attendance but manages to earn their GED in less than 50 hours.


I still am frustrated that earning your high school diploma is not a gain.