SC has delayed starting TABE 11-12 until January 2019.
Has your program started?
Paper-Based or Computer-Based?
I am in Texas, and we are researching whether there are other options to use besides TABE 11-12. We are worried about the length of the test. We would be using the paper-pencil version. I would welcome hearing about other assessments being used.
Thank you for keeping this discussion going. I'm eager to hear from practitioners using the assessment. Ohio has postponed implementing TABE 11-12 (planning full implementation July 2019) until we learn more.
State Director, Aspire Adult Ed
Our adult ed folks just started using it. I'm ducking across the parking lot to help out. Yes, it takes a lot of time :( The up-side of that is that those willing to stick with it are probably more likely to persist.
The test itself is pretty onerous but it's required for funding, here. We're using the paper ones, I believe.
All the Adult Learning Center sites that are run in connection with the State office have moved to TABE 11/12. We started to make the jump at the beginning of 2018, and we have not had any tests with 9/10 in a couple of months. We are a computer-based testing State, with some paper-based testing still sprinkled in where needed. I believe we were one of the first States to fully commit to the move from 9/10; we did so because we felt that assessment was fairly inaccurate and not aligned to standards. Most of our students have a goal of earning a GED, and the 9/10 was not tied to that goal.
There is some adjusting that needs to happen with the new assessment. One big thing that folks on here have discussed is the testing times. However, as you know and as we have found with our students, the full time is rarely used by a group of students testing. We are learning how to best group our testing sessions at the site level. Once we hone in on the scheduling piece, the times become a non-issue. It is more a matter of changing some processes, e.g., orientation. We used to complete the Locator and the assessment the same day or on back-to-back days. We now administer the Locator during orientation or student on-boarding and use it as a piece of educational background to help formulate a plan. The actual TABE test is then given during the first week of instruction, and tests two and three (if part of the plan) can be administered in the following weeks. Given that choice, I will take finding time for a better locator over using one that is inaccurate or part of an inaccurate test, any time.
There are many differences between the old and the new, and becoming familiar with these was important as we transitioned. We are not experts; we are still learning and finding more effective and efficient methods. One big thing the field (teachers, site directors, non-adult-ed folks who work with our students, etc.) was worried about was the NRS level in which students were being placed. We noticed that with the 11/12 it was a bit lower than with 9/10. The test is harder, and it needs to be to match harder standards and a GED test that also became harder. The TABE is used to measure EFL gains; it is not so important where the student starts, but rather the gains made between tests. We did have to adjust some things with partners like WSI and Job Service so they were aware of the difference in scores from old to new.
A suggestion if you are going to make the move soon: use the Locator instead of the Auto-Locator until you get a feel for the exam and a chance to analyze some data. We did get some dreaded O/R (out-of-range) scores right away. That was part of the reason for giving the Locator, then looking at cut scores and other educational history items we could collect on students, and then placing them in what we thought was the best test. I also highly suggest the online training that is available and the vast number of PDFs on the Insight website. These (along with the DRC helpdesk folks) helped immensely in making the move.
In summary-we are making great progress with the transition and the determining factor (as always) was that we felt it was best for our students.
Hi Stan and all,
I think the larger question is how do you use the results of the TABE test to guide class placement and instruction. I love your model, and with anything new, we need to adjust processes. But, can you share with me how you use the results to guide instruction and have you seen any changes in learner outcomes with the 11/12? These outcomes might come from better classroom placement or more targeted instruction.
I'd love to hear your thoughts.
Good Morning Kathy,
This is a process that is still being worked on. As with any new assessment, we are still getting that "feel." At some of our sites, ideally, we have classes organized by subject and leveled. Eventually it would be nice to use the TABE as a strong predictor for placement into, for example, Math 1, 2, or 3. The classes are divided by skills that come from the standards, so Math 1 might end at the Number and Operations domain; if a student is showing proficiency on the items that would be covered in Math 1, we would place them in Math 2. Also, with more data and analysis, we can start to look at how the scale scores relate to placement. Please keep in mind this is one piece, or tool, used in placement and in guiding instruction. Even at ALCs or sites that do not have the resources for leveled classes, I think the order in which the standards and domains are laid out in the TABE gives a solid sequence to follow for content-area progression. A good number of our students have what I call spotty or un-linked background knowledge and skills in multiple areas. I feel the TABE can give us a narrowed-down look at where that break or knowledge gap is and allow our teachers to build that connection. So yes, more targeted instruction. I do not feel we have enough data yet to determine whether 11/12 has increased or decreased outcomes; that will come later down the road when we can run some good data sets against each other. I do feel that right out of the gate it raises expectations, similar to how the 2014 GED raised expectations with increased rigor. We are also starting to put together some crosswalks with the other assessments we use (GED Ready, official GED, etc.) to see if we can find any loose correlations. I also feel that alignment with core standards will only assist in raising learner outcomes.
Hope I somewhat answered your question!
Thank you for your insight on how you are utilizing the TABE 11/12. It is helpful, if for nothing else, to see that other institutions are facing the same challenges and looking toward the same future goals: achieving better and fuller student outcomes. Raising the bar for students, though it may initially deter some, I think allows us to help students reach those long-term goals. I think the wane in attendance will change when those long-term outcomes for the persistent begin to be realized and the process is more fully understood by our students. Again, thank you for the information; as you begin to make correlations between the tests, please follow up and post on that specific subject.
Great question and one I've been trying to understand myself!! The blueprints (available at www.tabetest.com) provide a breakdown of what standards are tested on each of the tests, indicating the emphasis level of each. These are organized by the College & Career Readiness Domain levels, with specific standards listed within. The problem is, in identifying proficiency levels, the TABE test report only breaks things down to the domain level, and is organized by "TABE Skills", which are great because they are written in teacher- and student-friendly language, but they don't indicate what standards from the blueprints fall under each skill.
I'm providing a link to a graphic you can view/download that has two parts: at the top lists the TABE Skills from one domain—Key Idea and Details—shown on the TABE Test Report. The bottom shows the standards breakdown of that same domain (Key Idea and Details), from the standards blueprints. The color-coded underlining is mine—trying to connect the standards to the TABE skills.
CrowdED Learning is working with a number of publishers to aggregate their standards alignments so that instructors can see and devise learning plans for students that are organized by this hierarchy (Subject > Level > Domain > TABE Skill > Standards). The tool, called SkillBlox, will help instructors see what lessons and activities, including free and open resources, align to the standards. Here is a link to a downloadable image that shows a mockup of how this will work. The hierarchy shown here is Reading > Level E > Key Idea and Details > Recall Details, and the listing of resources includes a theoretical resource from a publisher ("Reading in the Workplace") and then free and open resources that have been entered into the SkillBlox database.
Not sure how all this will go, but it's going to be our attempt to support curriculum alignment as folks transition. In the meantime, I would love to pull together folks who are interested in helping to create an "open map" that we can all share, one that provides instructors with something showing the breakdown of Subject > Level > Domain > TABE Skill > Standards. I've got to believe that if we got enough folks to volunteer, we could pull that together very quickly and have a really helpful tool available to the field.
Thank you Stanley. It is great to hear from someone who has been pioneering it. I appreciate your suggestions (Locator use and interpretation with care) and have passed your comments on to others.
I am in adult corrections in Idaho. One of my sites has been chosen to pilot TABE 11/12. We will be using paper/pencil. I hope to start testing next week. We are still in the planning stages but the time aspect seems to be the most challenging issue so far.
I understand the length of time can be intimidating for our learners. It can be frustrating to have a goal of getting a GED but since the learners are not familiar with the process - the testing can be overwhelming. I have had many students come in the first day, never to be seen again.
I believe students need to understand the 'why' of the test. They need to know that the longer the test, the better the results that can guide us in proper selection of curriculum so they can meet their goals quicker. I am not as concerned about the length as much as how we use the results. I'd love to hear more about your challenges and successes so we can continue to brainstorm and learn together.
Our program uses paper/pencil 95% of the time.
The reports take a while to get used to.
The time isn't as much an issue as we thought. But we are only giving the Language and Math to start off with.
I have bombarded the help desk with any and all suggestions and questions.
If you get a "document failed" notification try turning the answer booklet upside down.
DRC showed up at our state conference and answered some questions. It doesn't look like the test or the answer booklets will change. My program is under the impression that they are trying to "encourage" online testing.
Yeah, I feel the same way about the "encouragement" of online testing, but DRC hasn't really been much help in getting our computer testing going! We also have off-site locations that need paper tests.
Colorado implemented TABE 11/12 on July 1. Our program utilized the paper-based TABE 11/12 for the first time last week. We spent time before our orientation preparing materials and practicing as a staff. Our orientation process is two days, so we split the testing between them. We had a significantly smaller orientation than anticipated, but it allowed us to learn more about the testing process for the future and think through our orientation procedures. We developed in-house scoring materials that helped speed up the testing process. Most individuals completed the tests before the maximum allotted time expired.
Lacey, I'm in Colorado too. We started in July with the hope of using the computer version, but our testing center still is not able to use the computer version. This has been pretty frustrating as it now takes a lot more work to score, and as of now, I don't know how to help students interpret scores and we don't have a diagnostic system to focus in on specific student gaps. With 9/10, we had a diagnostic for the paper version so after scoring, we could use a diagnostic sheet to say, ah yes, this student missed 8 of the 10 questions on fractions, or whatever. We can't seem to find a similar thing for the 11/12 paper version. Does anyone know if this exists? I know that the computer one prints out a diagnostic, but what do we do with print? Students and instructors have found this really helpful in finding the gaps in students' knowledge. Any ideas?
We have had many of the same questions, Stephanie. We spoke with the state about the possibility of going online with our testing, but realized that it would take more time than we had available for this year. We chose to stick with PBT for now. The diagnostic tools you are looking for are found in the Forms Individual Diagnostic Profile. There is a profile for each level that you can purchase in sets of 25. We purchased one of each, but it also takes time. Each level is broken down by test, then by domain and skill. Item numbers are listed for each domain. You circle the items that were correct and then determine how many points were achieved. This is meant to tell you if the individual is not proficient, partially proficient, or proficient. It really is a helpful tool for instruction and discussing results.
Will you be at CAEPA in October? Which program do you work with in Denver?
We have exclusively used computer-based TABE 11/12 for pre-testing since April 1, 2018. We continued to use TABE 9/10 for post-testing through the end of the 17/18 fiscal year. We are now entirely 11/12. Student scores and achievement have dropped dramatically. As you know, the test is long, about 6 hours for each version (with Locator). I am sure we will navigate this like we did the change in GED testing, but so far it's been a rough road. I would be very interested in speaking with others about "+" and "-" scores and what your program is doing about them (entering them in NRS, re-testing, etc.). I need more information about scores falling outside the ranges their levels are supposed to produce. I have reviewed everything I can find from the publisher and can't seem to find what I am looking for.
We received an email saying that they would be doing away with the O/R but we should be retesting the +/- scores; we didn't retest on 9/10, I don't see us retesting on 11/12. We are only testing Language and Math.
Was the email you received from DRC?
I think it came from my state. I'm sorry, I have been unable to recover it.
In Georgia we will be required to move to full implementation of TABE 11/12 by April 2019. Computer-based testing is required, with paper-based testing being the exception. We are trying to think through concerns around logistics for testing, training and prepping instructors, and identifying the best resources. I'm curious which states have implemented TABE 11/12 and what insights/lessons learned can be shared. We'd appreciate any feedback. Feel free to contact me directly if you feel like having a conversation. We have significant concerns, especially regarding logistics.
Have you got access to DRCeDirect.com so you can read all the test directions, etc.?
Thanks for posting on this topic.
I am leaving my position as Training Specialist for South Carolina Adult Ed in December. If you might have a need for help around Georgia between January and April implementation I am available. My personal email is firstname.lastname@example.org and my cell is 803-230-1069. Currently, I do not have anything set to do come January, but I am looking around.
I am an adult educator for a local community college in NJ. The new TABE is awful!!!!!! Classes are held in a M/W or T/Th time frame; between the intake form, the Locator, and the TABE, it is sometimes two weeks before everything is finished, IF they return. Imagine being someone who has decided after 20 years to return, and all I do is test them!!!! It is very frustrating!!!! We are using paper and pencil; if we had to code in the front page for the computer to score, I cannot imagine the additional time!!!! Our supplies and classroom space are limited (I even spend two days a week at the public library), so there are other stressors. My classroom student list is constantly changing based on who finished the test, who is still testing, and who can go on with instruction. VERY FRUSTRATING!!!!!
Thanks for joining the Program Management group, and for seeking some help with the challenges you face in using the new TABE 11 - 12 test. I hope some TABE users here can offer help.
I am not an expert on the new TABE, so I can't offer any advice on the test itself or how the paper and pencil version is supposed to be administered. However, I wonder if you could explain which of the challenges you experience have to do with the test itself, or with regulations from the test-maker on how it must be administered, and which, you believe, are a result of how the test is administered at your community college. It might help to know whether the challenges are the result of community college decisions rather than requirements from the test-maker, although if you are a test center administrator you may already know this. If not, it might help you to understand what the test-maker does require of test center administrators. Perhaps a TABE representative could help clarify this for you and others in this discussion; for example, if the TABE 11-12 test administrator guidelines are public, they could post a link to them here.
One argument for computer-based testing is that it can reduce the time for getting test results to administrators and students. I wonder what others' experience might be with the computer-based version of the new TABE 11-12. If Laura's community college switched to the computer-based version, do you think it would be helpful in addressing any of the issues she raised?
Are others here experiencing some of the challenges that Laura raises, for example, that it is difficult to keep track of which test-takers have started, which are still testing, and which ones have finished? If so, how do you handle that problem? Do you use a spreadsheet? If so, do you have a customized version that you could share with others here who may benefit? Do test administrators have access to an online tool from the test-maker that makes this easier?
Laura, and others who use the paper and pencil version of the test, can you tell us why you chose this option? What are the constraints you face that make it difficult or impossible to use the computer-based version?
David J. Rosen, Moderator
LINCS CoP Program Management group
We had one meeting with someone from DRC, who explained the new TABE. However, we did not get to see the new booklets. To me, the biggest problem is that for a student to complete the Locator and the TABE we are talking about at least 6 hours.....then they have intake paperwork and a short orientation, and all of this testing does NOT count towards the GED......and I thought the PARCC was ridiculous. I could teach my students with only the info from the Locator! However, I believe the Dept of Labor needs the additional info; I do not.
While computer-based testing does streamline the process, I personally have found it is quite a feat to get the students enrolled online and then assign the test. If a student doesn't show, then you have to reassign them. You have to make sure they are comfortable with the computer....I have people who do not know how to operate a mouse. In addition, with limited funding and room, I would have to have a full-time person doing nonstop testing for the 1,100 students I served last year. I can test 40 people in one day with paper and pencil. I could only test 16 a day (that is 2 testing sessions) with my current computer situation.
We are locating everything, but only testing Math and Language.
In reading through this chain of replies, I am very interested in how you found out which test questions tested which skill. Other than sitting with a question booklet and comparing how each student answered each question, how is this done? I am testing paper and pencil; the testing times are arduous, and the process can take me weeks. I am the lone instructor and have limited computer access, so even if the students were taking the test online, it could be an adventure, to say the least. I am following suggestions and think I will switch to the Locator and just Math and Language.....
I have lots of experience teaching, 30+ years in the public schools, but I have not taken the time to look at each individual question. As it is, I have an ABE section on M/W and move into GED on T/Th. But I allow the students to show up when they can. With so many returning after years away, I try to make it as easy as I can.
I am also dealing with a large ESL population. Some feel they can complete one year of ESL and go right to GED. They are allowed to test in Spanish, but I have no bilingual materials and I cannot speak Spanish.
I can prepare anyone for the test without the TABE-the test is simply for funding and to prove growth, in my humble opinion
OK, people are finding the same stressors about time....however, I have not been able to figure out how I learn what skills an individual student might be lacking. For instance, let's just say a student missed #5 on the reading test. The question dealt with the skill of main idea. If they got it wrong, then I know to go over that skill. Now, how do I find this out? We are a paper-and-pencil test site and I am the only instructor. I cannot level my classes, as there are no other rooms available. I teach whole-class lessons on the 4 subjects, then they break out for individual practice in math. To me, nothing has changed from 9-10 to 11-12 other than the frustratingly long pretesting period.
Laura, I respectfully disagree that TABE is only useful for funding: workforce centers use TABE to determine someone's eligibility for vocational training or apprenticeship. In Colorado, Denver Public Schools uses TABE to identify highly-qualified talent for Teacher's Aide positions. And I am aware of at least two community colleges in Colorado that accept TABE scores instead of Accuplacer, not for all programs, but nevertheless.
I am in Commerce City, CO, and we use computer-based TABE 11/12 through DRC. I would say it has been not that big of a deal for us. Our staff took significant chunks of their time last year to learn about TABE 11/12, but also to analyze the competencies and create learning plans that align our instructional resources to TABE 11/12.
Sure, computer-based testing presents some administrative challenges, but this is our second year doing computer-based testing across the board. Last year, we used TABE 9/10 for Adult Basic and Secondary Ed classes, and we used CASAS for ESL courses and for Bridge to ABE courses. But this fiscal year, we moved all Bridge to ABE, as well as ABE/ASE students to TABE 11/12. The challenging part was to ensure that our computers are updated in a timely manner and that our partners provide us with WiFi access, which they do. Another factor to consider is how many computers you have and how many students you typically register. Our Program Director has purchased additional Chromebooks this year to use in the classroom and for testing. We typically have about 10 students registering per day, and so, we try to have 10-20 computers available.
Of course, occasionally, we enroll adults with low levels of literacy or specifically low levels of computer literacy. This is where what you say before the test is important. We explain that students would not need to type a lot or do any complicated tasks. They just have to point and click. And knowing that someone is potentially a low-literacy student, we watch them more closely as they start the test.
Our students complete the initial testing in 2 days. On the first day, they complete the intake form, attend orientation and start testing. That day they manage to get through the Reading parts of the test. Reading includes practice items, the Locator and two parts of the test. We ask them to stop after they complete Reading - Part II. On the second day, students complete all parts of Math. (Our state does not require the Language section of the test). The entire pre-testing on Reading and Math takes about 4 hours.
We print out the Individual Portfolio form from DRC for each student and their teacher. The first week of classes, students review the syllabus for their course, as well as their assessment results. In fact, we designed our syllabi to be learner-centered: students can review their TABE results and check off on their syllabi the topics they have demonstrated proficiency in. The first week is also the goal-setting week. We use TABE results to formulate our academic goals. This is the practice we adopted last year in all our classes, and it seems to work well. Students are more aware of what their TABE scores mean. And I am actually excited about the Individual Profiles for TABE 11/12, as they provide a detailed list of the competencies students were tested on.
One important observation: students score lower on TABE 11/12 than they did on 9/10. This causes some frustration in both learners and instructional staff. However, we just keep explaining that TABE 11/12 is more aligned with today's learning outcomes. This is what K-12 graduates are expected to know and do, so why set a lower bar for adult learners?
So, in summary, I think it is important to prepare for TABE 11/12, especially if you are adopting computer-based testing. I would make sure all proctors and admins are trained in time, scripts and presentation slides are ready to go and all stakeholders buy into computer-based testing as an effective assessment tool. Without that, testing will be frustrating to all parties involved.
As I read the 2 posts above, my guess is my most extreme frustration is that I AM IT, I AM THE LONE INSTRUCTOR. There is no one else in the building who does what I do when I teach GED. One other instructor takes one of the night classes, but we are it.
I am also reading and am not clear on some of the procedures and acronyms (for example, what is O/R?), or on the materials you have available.
This makes me sad for adult learners in NJ, where the passing score is higher than in 47 other states, who are trying to get ahead with just me.
I pretest while teaching, and I differentiate as best I can. I score and keep track of my own paperwork, attendance, and LACES.....it is a three-hour stint on roller skates.
NOW everything said, I love what I do. I had about 200 students pass through the doors...maybe 10% made it!...some came once or twice, some were absent for months, some practiced on their own.....
Hello Laura and others,
From my limited experience with high school equivalency exams, I believe the accepted practice is to separate formal pre- and post-testing from instruction, at least to be sure that the person handling the testing is not the teacher of the same students. I am not clear from what you have said if that is not the case in your situation, but perhaps we could have Mike Johnson from DRC describe or point us to their recommended or required procedures regarding testing.
David J. Rosen, Moderator
LINCS CoP Program Management group
Examiners must not be involved in any related preparatory course.
This quote is from page 44 (p. 50 of the PDF) of the TABE 11 & 12 Test Administrator Manual on the DRCeDirect.com site, posted/updated 4/12/18.
(username and password required)
I was told that someone at DRC gave oral permission for a state to view this as a suggestion, a preferred situation. Many programs have their teachers giving the TABE tests, but yes, David, the standard I was taught was no learning and testing in the same room at the same time. This is a different scenario, and clarification from DRC would be welcomed.
How is this even possible to get an accurate result if students are testing in the same room where instruction is taking place? How do such programs report instructional time?
I think it makes sense to dedicate some time to just testing. I have heard that other programs hire temporary staff for proctoring. They train them and everything, but only use them for testing.
In our program, we have extra staff trained on both assessments we use (CASAS and TABE 11/12).
That is how we do it, as well: We do pre-testing before we place students in classes.
Sounds like Laura has minimal support, and that explains why it is so challenging for her program.
We're looking into using the online TABE 11/12, but we don't need all subject areas for our program. I'm curious whether you are using the Locator or the Auto-Locator, and whether you're able to tailor the locator to level students only in the subjects they need. It's not clear to me from the online training materials what the online locator consists of, or whether the manually set-up locator includes different content than the Auto-Locator. We've also had some issues getting consistent information from the reps.
So, do you (or anybody else reading this thread) know if it's possible to just give them the Auto-Locator in Reading, for example, kind of like how the CASAS eTest is set up? Also, what is your experience with the locator being an accurate predictor of student level? Our adult ed students often rush through the locator on the CASAS and then end up with an out-of-range score on the level test because they're taking a level test that's too easy. Have you noticed that problem with the TABE, especially since it's such a long test? Are they itching to get through it?
I have used the online briefly. You should be able to auto-locate with reading only.
West Virginia will be making the transition July 2019. There is no choice but to move to TABE 11/12 due to NRS requirements.
We have had meetings with Mike Johnson from DRC about TABE 11/12. It is important to note that the Literacy Level for TABE 11/12 is absolutely not at a beginning literacy level. We will continue to use CASAS for literacy purposes.
Was it Level L or Level E that you found to be above beginning literacy?
I'm waiting to see if CASAS gets their math test approved before we need to buy tests to start pre-testing for next program year. It rubs me the wrong way to have no choice of test at all (especially since whatever we choose, we'll likely be sticking with it for 5-10+ years). I want to at least be able to pick my poison.
Whichever test we choose, we're almost certainly going to switch from all paper to (mostly) computer-based testing. Our state administration is strongly encouraging computer-based because there's less room for administrative error. It's also the only way that I can imagine giving different TABE 11/12 levels simultaneously, since there are so many different timings.
Our program has teacher-taught classrooms and volunteer tutors who mostly work one-to-one out in the community (libraries, coffee shops, etc.), so we have a lot of thinking to do about how much group administration we'll be able to do. As it stands, a lot of our tutoring students get post-tested individually at their tutoring site or in a coordinator's office. That could be difficult/impossible if the test takes 3-4 hours to administer. We might need to start requiring students to come to post-testing sessions, but we'll still end up doing some individual administrations, I'm sure. Some of our tutoring students have complicated availability, which is why they're in the tutoring program to begin with. At least the coordinator can catch up on paperwork while giving the test...
Currently, we set aside a couple of classes at the end of our managed enrollment sessions to test students who have their hours (and to give practice HSE tests if appropriate). We might need to reconsider this, too, since it could start seriously cutting into the session.
We're going to have to lengthen our orientation and spread it out over more days to break up the testing, but we might actually have to cut back the amount of time we have to talk about the program and the students' options. That's also up in the air.
I will say that I am excited to finally be able to use a test that is better aligned with our students' HSE goals. I'm just ready to be on the other side of the transition!
Hello Rachel, Thank you for sharing your thinking process as your program transitions to TABE 11/12. I am certain that many other providers are going through the same thing, including my own. I do think that computer-based testing has benefits. Moreover, I agree that having a test that is better aligned is positive. However, the length of the test definitely requires careful thought about needed changes to orientation.
It would be interesting to hear from others regarding their experiences thus far with the question of alignment with the HSE tests as well as how programs are structuring orientation.
Cheers, Susan Finn Miller
Moderator, Teaching & Learning CoP
I just got notice that the CASAS GOALS Math test has finally been approved. https://www.casas.org/social-media-newsroom/2019/03/07/math-goals-nrs-approval It's too late for me, since we already decided to go with TABE, but for those of you who haven't made a decision yet, you can at least look at two options.
I am the assessment coordinator for a small program. We have been using the 11/12 on the computer for a bit over six months now.
In regards to the time component, while the new test is taking longer, it is not as bad as I initially feared.
The real frustration we are feeling is with the reports, which leads to my observations. I'm wondering if anybody else has seen the same:
1. We are getting post-tests with low out-of-range (O/R) scores. DRC redacts the scale score and NRS level when this happens, so we have no idea whether the student is one point or twenty-one points off. The redacted scores are also starting to cause a logjam of data, as we have no scores to report in our state system for progress tests.
2. The Performance on Domains section is far too obtuse. We have had students make gains yet the check marks do not move from test to test. Others have check marks rapidly swinging with no real movement in scale score. It also does not help that a single check mark can cover six or more skills.
3. Recently I noticed that (-) and (+) notations are proving to be unreliable. One student tested in range on an initial and O/R on a progress. The difference was one point yet there was no indication that the student was on the edge.
4. While digging into the above issue, I found two more problems. First, the points-to-scale-score numbers on some DRC reports do not match DRC's published charts. Second, I found eleven initial tests where the reports in no way indicated the student would benefit from taking a higher-level test, yet the student was already maxed out at the highest possible NRS level, making it impossible for that student to achieve an NRS level gain in that subject. Eleven tests doesn't sound like much until you realize that only 32 of my students have taken the 11/12.
These last two findings are what sent me to the internet looking to see if anybody else has seen this.
Hello Dave, Thank you for sharing the challenges and confusion you are facing with the TABE 11/12 reports. I'm certain others will weigh in on this important issue. Members, please let us know if you are seeing anything similar. Do you have any guidance for Dave and the rest of us?
Cheers, Susan Finn Miller
Moderator, Teaching & Learning CoP
1. We get one or two O/R scores each testing session. We have 3 or 4 sessions a month. We are retesting them after they complete orientation. We are only testing Language and Math; rarely are they O/R on both. When you look at other scores or have a conversation, you can get a good idea if they gave up or if they somehow strangely tested high on the locator. Most of the time, it is a borderline Locator score.
2. We agree the standards reporting is too obtuse. TABE is nervous we will "teach to the test" and is reluctant to narrow the scope.
3. We are ignoring the -/+. There is no point if you are paying attention to the Form and NRS Level.
4. a. I haven't noticed the points/scale score not matching the charts. Make sure you are looking at the correct form, level, and subject. b. For example, a Language Level M test capped at NRS Level 4? Is that what you mean? Yes, it happens. We are trying not to place students in the subjects that have this issue, especially since we have just received the few aligned materials.
Sorry for the late reply; I've had my head down in other projects. This image shows what I am talking about with things not matching. Granted, a single digit off is not exactly groundbreaking, but it makes me wonder what other details are getting missed.
That is a great example! Now that you mention it, we have found the same thing. I remember being told that some questions carry more weight than others, especially the multiple-answer items, so I chalked it up to that.
We are computer-based only and have fully transitioned from 9/10 to 11/12 this year.
We wholeheartedly agree with the O/R, +/-, and diagnostic issues. However, my biggest gripe is with the DRC Insight Portal. This has to be the most USER UNFRIENDLY program in the world. Click and wait, click and wait. Even if you made no changes in a dialog and click Cancel, you have to wait. The whole concept is backward from how we handle testing: it is session-based, where you create sessions and add students to them. I am sure this works for DRC's K-12 district-wide testing, but not for how we test in Adult Ed. We have students that we assign tests to, not tests that we assign students to.
Length of testing has always been an issue, but the TABE was the only game in town. That has now changed with CASAS GOALS Math coming on board. We have our fingers crossed that the State can get it fast-tracked into their assessment policy and that our database vendor can get it onto their system before July 1. Of course, we will have to determine whether it meets our instructional needs, but since our teachers have little use for the limited TABE diagnostics, I think it will come out ahead.
"We have students that we assign tests to, not tests that we assign students to".
I have told my admin that we simply do not have the capacity for computer-based testing (we only have 7 computers in the room), and it is so time-consuming to enter new students. I would love to know what your procedures are for entering students and testing them.
West Virginia is working to train staff and start implementation of TABE 11/12 beginning July 1, 2019. We are encountering questions from partner agencies about setting new entry scores for their programs, and we aren't quite sure how to help them since TABE 11/12 is so new. Additionally, feedback in this discussion suggests that students are scoring lower on TABE 11/12 than they did on TABE 9/10.
For example, many Workforce programs set a cutoff for entry at the 9.0 grade level. So, with TABE 11/12 not reporting a grade level, and the test seemingly being more difficult, what are your Workforce or CTE programs using as cutoff scores?
Please share, we'd love to have your feedback!