GED® Completion Rates?

Hello friends, A while back David Rosen posted a message asking about GED® (and other high school equivalency test) completion rates. I wanted to pose the question again since I'm hearing that the completion rates for the new GED® are shockingly low in PA. I'll share some stats from Pennsylvania in my next message. We are also experiencing much lower numbers of students in our GED® classes. I'm wondering how widespread this phenomenon is -- or isn't -- for those who are using the new GED® test. How about for other high school equivalency tests?

Thanks, Susan

Moderator, Assessment CoP

 

Comments

Thanks, Jackie, for posting the PBS video. David, I think the reason that Randy Trask did not mention the two-level cutoff score -- one for HSE and one for college readiness -- is that the bi-level cutoff score does not exist, at least not as I thought it would. What appears to be the case is that passing students can earn either a passing grade on the test or a passing grade with honors. These seem to be the cutoff scores. Regular passing would indicate high school equivalency. I suspect the devil is in the details of the norming process GEDTS utilized. Several months back I asked Martin Kehe if the norming process was going to be essentially the same as for the 2002 GED test, where the norm was set at the level where 60% of high school seniors passed and 40% did not. Martin said that it would be essentially the same, combined with some sort of focus group or other strategy. The test is not normed at a level anything close to 60/40 in my view. One need only look at the ridiculous Math question in the Cleveland Scene article to see that the wheels have fallen off the bus.

I thought the Cleveland Scene article was an excellent overview of the whole situation. When one thinks about it, one might say, with a certain level of confidence, that testing to the Common Core standards is at the heart of the Pearson business model. The company is heavily invested in the Common Core, and any commitment to testing toward anything else is not viewed as profitable. As a result, the valued intent and historical legacy of the GED test -- trying to account for and somehow "measure" the life experiences of adults, starting with the returning WWII GIs -- is totally absent. And that is a shame. From what little I can glean regarding the other HSE options out there, I believe that the HiSET is the HSE test most appropriate for the adult education population in the US.

 

Best Regards,

Jon Engel

jengel@communityaction.com

 

Thanks for posting this, Jackie. It's good to see this issue in the national conversation. As David pointed out, we have been discussing this in the Assessment Community, so folks may want to check there to add further comments. The piece in the Cleveland Scene magazine (from Cleveland, Ohio) is especially relevant. (Thanks to Di Baycich for posting this.) One point that is raised in the article, among several other important ones, is that many learners do not have credit or debit cards to pay for the online test. This may be a fairly minor issue, but it is real.

Also relevant is the fact that the American Federation of Teachers (a union) has endorsed the HiSET over the GED due to issues around equity.

Cheers, Susan

Moderator, Assessment CoP
 

Good morning, everyone,

Thanks, David, Jon, and Susan. I'll be sure to catch up on the Assessment Group discussion as well. I'd like to add another national article to this cross-posted discussion that just came out this morning on NPR. You can listen to or read the story here:

A 'Sizable Decrease' In Those Passing The GED

The audio version of the NPR report includes quotes from a master's level teacher of 30 years who said that she took the practice test but did not pass it. Have others here in this discussion taken the practice test? If so, what was your experience of it?

Thanks all, for the very timely discussion.

Jackie

Jackie Taylor, SME, Evidence-based Professional Development Community of Practice

I believe we are seeing significant decreases in completion rates for many reasons. Increased test costs, longer preparation time due to the background knowledge expected in some of the content areas, and higher expectations for students' content knowledge are some of them. However, I am more intrigued by the comment about the master's level teacher not being able to pass the practice test. This statement speaks volumes about the struggle many teachers are feeling. Throughout my professional development experiences, I have heard from teachers across the country, and their main concern is the struggle of teaching content areas in which they are not experts. For example, teachers who are strong in math are often not experts in the social studies or science areas, and reading teachers no longer feel comfortable teaching the higher-level math.

I have taken the practice tests and passed the content areas, but I will admit my scores were not as high as I would have preferred. 


Hello Kathy and all, I think teachers taking practice tests is a good idea. We need to understand the structure and content of the test we are preparing students for. When I was teaching for the GED®, I took several practice tests so I could support students in their preparation. Since I am not currently teaching for the GED®, I have not taken any of the new practice tests. I hope to find time to do that, especially in light of the discussion we've been having.

Thanks for sharing your experience with this. I welcome other teachers to share their experience with taking practice tests themselves.

Cheers, Susan

Moderator, Assessment CoP

 

Is there a way to take this online? 

I looked at it and really struggled myself with understanding the questions and the way they were asked. I actually need to cover all of this for my new class. I have never had HiSED students before; I'm coming from ELL and moving into HiSED. The questions are laid out oddly, and I did struggle at first glance. I thought about scanning a sheet and projecting it, simply to show the students how to read the questions before they ever get to answering them, but I didn't know if that was allowed. If there's something online that I can show them and practice with them, please let me know. Thanks.

There is a free 1/4-length practice test available on the GEDTS website. (It's pretty much the same thing as the item sampler, so you can search for either.) Students with MyGED accounts should also be able to access it, although it gets moved around occasionally.

I am interested in the implication here that some of the drop in completion rates may be in part because the instructors are struggling to teach the new material. As a GED instructor, I know firsthand that this new test has presented a steep learning curve to teachers--even a year later, I am still working hard to adapt my teaching to the new test. I am alternately overwhelmed by everything I have to teach my students to do and excited by the higher level of work that they can accomplish.

As the difficulty of the test has increased, there has been a corresponding increase in the difficulty of teaching students who are preparing for the test. We have to become more familiar with a wider array of topics, and we have to learn material well enough to teach it. We are also still getting a feel for the relative importance of different skills. It's possible that as teachers gain experience, there will be a rise in our success rates. It's also possible that, due to factors like the ones Kathy cited (cost, length, content knowledge), we won't ever fully regain the ground we lost.

As far as I can see, the biggest challenge is simply time. It takes a student more time to prepare for a harder test. Many of our students have limited time to devote to education. There will be more of them who simply don't have the time they would need in order to succeed. Can that gap be narrowed by effective teaching? Probably. Can it be closed? That remains to be seen.

Running out of time on the essay part is what another instructor in our area said her students stressed over the most. She said they were able to do the writing, but not in the time allowed. 

Rachel, You have raised an issue that has not yet been brought up here, and one that I'm certain is a factor -- i.e., teacher preparation. It would be interesting to hear how states and programs have been supporting their teachers to teach the new test effectively.

What professional development have GED® teachers found helpful? What kind of additional support would you like to be able to access?

Cheers, Susan

Moderator, Assessment CoP

Colleges interested in the HSE,

The Cleveland Scene published another article on the GED® 2014 exam on December 17th, 2014, "Nearly 500,000 Fewer Americans Will Pass the GED in 2014 After a Major Overhaul to the Test. Why? And Who's Left Behind?" by Daniel McGraw.

http://www.clevescene.com/cleveland/after-a-major-overhaul-to-the-ged-test-in-2014-18000-fewer-ohioans-will-pass-the-exam-this-year-than-last-along-with-nearly-500000-across/Content?oid=4442224

David J. Rosen

djrosen123@gmail.com

Good Morning Everyone!

I have read the comments on the new GED and the drop in students who are passing it. We are experiencing the same kinds of drops at my school. After reading the article, I would be interested in hearing from someone on the collegiate side about whether the GED represents an assessment that they would want their incoming freshmen to have passed.

As for the job-seekers, I try to get my students to understand that conquering this test will set them up for success in terms of learning new skills for their jobs. Also, can anyone speak to what employers look at (if anything at all) when they ask whether an employee has a GED?

Lastly, thank you to those who responded about GED software. It is much appreciated.

-Alfons

Hi Alfons and all, My understanding is that the new GED® was designed to be in line with the Common Core to address the gap in GED graduates' preparation for college. The large number of students in college developmental courses is evidence of this gap. I am less certain about the role employers played in the design of the new GED®. 

What do other members know about the development process?

Cheers, Susan

Moderator, Assessment CoP

I thought it would be helpful to add a bit more information about the change in the GED pass rate between 2014 and the last time a new GED test series was released in 2002. 

Here's some more detail about how the pass rate has changed: Overall, nationally, the individual subject test pass rates for 2014 are closely aligned with subject test pass rates in 2002 (despite common perceptions). The significant change from 2002 to 2014 was the elimination of the compensatory model. In the past, test-takers didn't need to reach a full HS equivalency level in Math, because they could make up extra points in a different subject to boost their overall passing score total. In fact, if the 2002 cut score for Math had been 450 instead of 410 -- which would be more comparable to today's Math test's 150 HS equivalency passing score -- the Math pass rate in 2002 would have been 60.3%. However, due to the compensatory model, the overall 2002 pass rate jumped to 70%. The 2014 test is different because test-takers must demonstrate a HS equivalent level of knowledge and skill in all subject areas before passing the GED test. Additionally, more than 75% of test-takers who fail Math are within 10 points of passing.

Professional Development Resources

We have put a number of professional development resources on the GED Testing Service website (http://www.gedtestingservice.com/educators/professionaldevelopment), as well as a list of frequently missed skills/concepts on the 2014 GED test (released this past summer). On that page you'll find a number of resources, including: an eight-week, self-paced free course; a content comparison between the 2002 and 2014 tests; instructional videos; training courses; and other resources. Staff and consultants are frequently traveling to states to present professional development sessions, so be on the lookout for those in your area. You can expect us to continue our professional development activities over the next year, and look for a new version of the "frequently missed" document with some examples included in the coming months.
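To make the difference between the old compensatory model and the new all-subjects requirement concrete, here is a minimal illustrative sketch in Python. The numbers and the four-subject setup are made up for illustration, not official GEDTS scoring rules; it is only meant to show how the two models can produce different pass/fail outcomes for the same set of scores.

    # Illustrative sketch only -- not official GEDTS scoring rules.
    def passes_compensatory(scores, subject_floor=410, average_min=450):
        # 2002-style: each subject has a low floor, but the *average* must clear the bar,
        # so strong subjects can offset a weak one.
        meets_floor = all(s >= subject_floor for s in scores.values())
        meets_average = sum(scores.values()) / len(scores) >= average_min
        return meets_floor and meets_average

    def passes_conjunctive(scores, subject_min=450):
        # 2014-style: every subject must independently clear the bar.
        return all(s >= subject_min for s in scores.values())

    example = {"Math": 420, "RLA": 520, "Science": 480, "Social Studies": 500}
    print(passes_compensatory(example))  # True: strong subjects offset the weak Math score
    print(passes_conjunctive(example))   # False: Math alone falls below the bar

Under the compensatory model, the hypothetical test-taker above passes because strong subjects offset a weak Math score; under the all-subjects model, the same scores fail. That is one reason overall pass rates can shift even when the individual subject tests are no harder.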

Hello,

Do you know what points are allocated for the Extended Responses in RLA, Social Studies and Science? I have heard that the GED Ready does not count those responses in its practice score because the software cannot be attached to a $6 test. If I knew (I suspect that they are weighted) what the score range was for the various Responses, I could adjust my teaching to either emphasize or de-emphasize the curriculum accordingly. I have seen some poor writers pass the RLA with room to spare, so I suspect that less emphasis is given to the essay than in years past.

As a writer, it breaks my heart to de-emphasize the writing aspect of the test, but if that is what it takes to allow my students to pass efficiently, then so be it. Again, I am not looking for testing secrets, just a range of points.

And, thanks for all your information about the test. I pass it along to team members and part-time instructors so we can all help our students.

Hey Doug, I don't know if this will help you, but I understand the scoring for the extended responses and short answers is as follows: RLA is worth 6 points; Social Studies, 4 points; and Science, 3 points per short-answer response. All points are doubled if the student answers appropriately. For instance, if the student gets 6 points on the RLA portion, he is given 12 points for that response. So yes, the extended response portions on the RLA and Social Studies tests, as well as the short answers on the Science test, are critical. I hope this information is useful to you.


I wanted to follow up on this comment, and find out whether it is fact or opinion and whether it can be located on the GED TS site: "I understand the scoring for the extended responses and short answers is as follows: RLA is worth 6 points; Social Studies, 4 points; and Science, 3 points per short-answer response. All points are doubled if the student answers appropriately. For instance, if the student gets 6 points on the RLA portion, he is given 12 points for that response."

It is contrary to what I have learned about the writing portion(s) of the GED exam, so I want to make sure I am understanding it correctly. Thank you!

 

Doug, Twillis, and JN... those points seem to be right on target. You can find the information on the GEDTS FAQ page at http://www.gedtestingservice.com/educators/2014-faqs, or find this, plus other detailed information about what's covered on the 2014 GED test, in the Assessment Guide for Educators at http://www.gedtestingservice.com/educators/assessment-guide-for-educators

I would not be surprised. If you look at the test strategically, the 45-minute Extended Response is only a 6-point loss. If a student struggles with writing, then those points may not be worth the extra effort. On the GED 2002 version, if the student answered all the multiple choice questions correctly but wrote a substandard essay, he or she failed the GED Language Arts: Writing test. That would not happen now.

I'm not saying the writing element is unimportant (especially if the student has any aspirations of going on to a post-secondary credential), but it is no longer a base requirement. 

Hi Douglas, Thanks for your comment. I would say what you describe is accurate. I agree that adult learners who have aspirations to go on to post secondary absolutely need to be able to write, so it's disappointing to me that the writing component is not essential.

At the same time, I would say that the type of writing featured on this test is more in line with the type of writing adults need to do in their lives as well as in post-secondary education -- that is, compared to the old version of the test.

What do members think about the type of writing featured on this test? How do the 2014 GED® writing expectations compare with those of the other high school equivalency tests?

Cheers, Susan

SME, Assessment COP

The RLA writing and its time limit make sense; the SS time limit is dreadful despite GEDTS's contention that 25 minutes is plenty. I applaud the fact that students now need to be much more familiar with key historical documents, thinkers, and events and be able to synthesize material in a sophisticated way, but good lord--25 minutes?! We are dooming most of our students to failure and feeling like failures. Extend this time by an hour, and I can expect my students to grapple with the prompt, consider prior knowledge and how best to incorporate it, and then write and revise.

Here is some info just released by Larry Breeden, Adult Education Administrator, NJ Office of Certification and Induction
 "Below is NJ’s 2014 TASC  statistics.   Looks like we are at a 67%  pass rate, which is just about the same as we were under the old GED.  I don’t have the final numbers in front of me, but I believe it is pretty close to what the Hi-Set numbers are as well.  For the GED we are at 59 %."

 

My personal experience with students in Bergen County (across the bridge from Manhattan) taking the TASC and HiSET is that they are passing at an even higher rate, but very few are willing to try. They are afraid of failing and put off by the $92 fee (up from $50 in 2013).

I found this thread interesting -- we are all dealing with a new test, GED or TASC, and seeing varying results. Here in NY, the test is aligned with the Common Core, and yet high school is not (working toward it, but not rolled out yet). So people who drop out today and take the test are being tested at a higher level than they would be if they had just stayed in high school. Right now, the test is harder than high school. I try to tell them that, but they don't believe me.

I don't know the real numbers on pass/fail rates, but my students are passing, many of whom never passed before.

I believe, because they haven't had that many people take the test, the curve is big, and you need to get very few right in order to pass.

While I don't find this right- I know that sooner or later, the curve will work itself out.

I think the new test is good- we need a harder test, High School is harder, and it should not be easier to drop out and take the test- or more will do it. If it is discouraging, it was made to be that way so that people (kids) do not choose this route.

 

Thanks for sharing your experience with TASC, Margie. It's good to hear that students are passing even though you are saying that it is a harder test.

I'm not sure I understand what you mean when you describe the curve "and you need to get very few right in order to pass." How does this work exactly?

Cheers, Susan

Moderator, Assessment CoP

[My comments here are focused on math because that is my expertise. I have worked in adult education for 10 years, across many states—as a math teacher, educational researcher, professional developer, curriculum developer, and as a consultant giving feedback on a number of federal adult education projects.]

In my view, there are deep tensions lying underneath the Common Core State Standards (CCSS), the OCTAE-recommended subset of those standards for adult education, and the new HSE exams that are being discussed here.

On the one hand, it makes sense that there should be a relationship between what adult learners are doing, and the standards that we imagine are guiding instruction in the K-12 system. It’s important to remember, though, that the CCSS remains an aspiration. We do not have a large share of K-12 students meeting these standards, and we shouldn’t expect to see this any time soon.

The tensions I see in adult education stem from three competing expectations. Firstly, federal/state governments and testing companies want to project that adult numeracy teaching and HSE math tests are “aligned” or are “aligning” over time with the CCSS. Secondly, practitioners and governments believe that adults should not be held to a higher standard than high school learners (who are largely not meeting CCSS standards at present). And thirdly, practitioners and governments don’t want to see pass rates plummet on HSE exams. Simply put, these goals cannot be achieved at the same time.

The testing companies are in a heated battle for market share around the country, and some or all of them are likely tinkering with their tests to reach a "sweet spot" in the pass rate—not too high, and not too low. They can do this on the math test in a few ways. One way is to change the average difficulty of the questions, but there is a tension here because they want to project the image of a rigorous test aligned with the CCSS. Another way is to keep lots of difficult content but change the number of correct answers needed to pass (the cut score). Normally, I would not expect any of the companies to admit that they are tinkering with the average difficulty level of the questions, with the cut scores, or with both of these to reach a particular pass rate, but I think we have seen evidence of this when one of the companies apparently claims that its pass rate will mirror the pass rate on the old GEDTS exam.

Even when company representatives say their math test is guided by a norming study done of high school students, the way those studies are designed gives the company lots of flexibility in how they ultimately set average question difficulty and the cut score.

What we do know is that none of the companies can afford to have a test that becomes known as one that students can’t pass. To avoid this, a company will likely make adjustments so that they are not very different from the pass rates on the other HSE exams. [Of course, there also is an incentive to keep a challenging test that students need to take multiple times to pass, because that leads to more profit from the extra tests students pay for. However, there has to be a limit to how low the pass rate can go before students just give up trying and states decide the test isn’t viable.]

When companies include math questions that reach far beyond the content that has been tested in the past (and that lies beyond the content knowledge of current high school seniors and the adult numeracy teaching force), the only way they can keep pass rates up is to reduce the cut score. A concern is that the cut scores in one or more of these HSE math exams could get (or are already) so low that students will pass the test even when they guess randomly. We don’t want a test that has become so challenging (in terms of the content) that it becomes easy (because a non-trivial percentage of students will pass even when guessing). A math test that can be passed by guessing should never be considered a rigorous test, or one that signals that a student is ready for college. National press reports and what we’ve seen from the companies themselves suggest this may already be happening.
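To illustrate the guessing concern with a rough sketch: the item count, number of answer choices, and raw cut scores below are made-up values, not the actual parameters of the GED, TASC, or HiSET math tests. The probability of clearing a raw cut score by random guessing on four-option multiple-choice items follows a binomial distribution.

    # Hypothetical numbers only: 45 four-option multiple-choice items.
    # Probability of getting at least `cut` items right by guessing alone.
    from math import comb

    def p_pass_by_guessing(cut, n_items=45, n_choices=4):
        p = 1 / n_choices
        return sum(comb(n_items, k) * p**k * (1 - p)**(n_items - k)
                   for k in range(cut, n_items + 1))

    for cut in (10, 12, 15, 20):
        print(cut, round(p_pass_by_guessing(cut), 3))

With these made-up numbers, a raw cut score around 12 out of 45 would be cleared by roughly half of pure guessers, while a cut around 20 would almost never be. That is the sense in which a cut score that drifts too low can make a very hard test easy to pass.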

Unfortunately, most state government offices in charge of adult education do not have experts who could demand information from the testing companies and analyze it from a critical perspective. Government-employed assessment and content experts should decide when items test meaningful content, not just when they are statistically reliable. Assessment experts should identify instances when cut scores get unreasonably out of whack. And content specialists should guide the companies on the appropriate subset of CCSS content to assess, and how to do that. Sadly, these decisions have been and are being made almost entirely by private companies, with government officials and the public on the sidelines. We are mostly stuck with talking points from salespeople.

I would like to think the field has improved a great deal in one year, and that this explains why current pass rates on some or all of the HSE math tests might look similar to the pass rates on the old GEDTS exam. That’s not realistic, though. I think the pass rates (and how they may be changing) have much more to do with what the companies are doing behind the scenes. And this is unfortunately where we are now in adult education—private companies are in charge of high stakes assessments, and folks in the field (governments included) are on the outside trying to figure out what is going on, and what the best course of action is in our teaching. It shouldn’t be this way.

 

Steve Hinds

Director, Active Learning in Adult Numeracy (alanproject.org)

Adult Numeracy Educator, Truman College (The City Colleges of Chicago)

Hello Steve,

We have never met, but I have heard much about you from former colleagues in Kentucky. I want to thank you for articulating your insight and thoughtful arguments about the current situation in adult education. You stated your position well. I think that the low numbers of individuals who have passed the new test speak loud and clear. The companies have discredited the students and instructors who work so hard every day. It is very disheartening.

Meryl Becker-Prezocki, SME College and Career Standards

Steve, Thank you for sharing your thoughts specifically about the HSE math exams. I do not have a background in math, but I have taught HSE math in the past, though not for a couple of years. From reviewing the GED® Ready practice test, it is clear to me that the level of mathematical understanding needed to pass the new test has increased quite a bit. Math has always been the most challenging test for most learners, so that is nothing new. However, with this new test, the challenge is greater.

I'm not familiar with the other HSE tests currently being used since these are not available in Pennsylvania, so I'm curious if learners are facing similar challenges in math with those tests.

I hope those who are teaching math and preparing students for the different HSEs will weigh in on this important discussion.

Cheers, Susan

SME Assessment CoP

 

Oscar Mireles

Executive Director/Principal of Omega School

Fewer finish GED tests

By Hillary Gavan hgavan@beloitdailynews.com | Posted: Thursday, January 15, 2015 4:00 pm

Because of a new test and increased fees, the number of Blackhawk Technical College (BTC) students receiving a high school equivalency certification this year has decreased in line with state trends.

According to the Department of Public Instruction, 912 people graduated from the state’s General Education Development (GED) program in 2014, compared to 11,378 people in 2013, representing a 92 percent decrease.

The number of BTC students starting the test hasn’t changed significantly. In 2014, there were 596 students attending orientation for testing, similar to previous years, according to the Blackhawk Technical College Director of the Student Success Center and GED Chief Examiner Terese Tann.

Although the number of students who began the test process was strong, the number of students actually completing it has dropped significantly during the past year. For example, in 2012 there were 223 BTC students who completed the test, and 188 who finished it in 2013. In 2014, however, there were only 14 BTC students who completed the test.

In January of 2014 the GED test changed, with students having to take it on a computer. It also went from five to four tests as it combined the reading and writing test to become a language arts test. The math test was also revised to include algebra.

Test fees also increased. Previously, it cost $75 and it’s now $33.75 per exam for a total of $135. Tann, however, noted students can pay per test as opposed to having to pay it all up front like in the past.

Although the number of BTC students completing the test was up in 2012 and 2013, Tann attributed that to a huge push for students to finish up by the December 2013 deadline.

Tann said there are 195 students in the pipeline to take the test in 2015, and she anticipates numbers of those who complete the test will grow.

She said GED preparatory classes, offered at BTC Beloit Center at 50 Eclipse Boulevard, are free of charge. Courses are offered during the day as well as evenings. For more information people can call 608-757-7741.

In addition to free preparatory courses, there is financial assistance for the GED for those who qualify. The BTC Foundation has supported 38 tests per year, or $1,500. If a student gets 1-2 tests covered, it would bring the total cost down to around what it was previously.

“We try to remove those barriers,” Tann said.

In the past BTC offered 45 courses throughout Rock and Green counties through the year. In 2014 it offered 61 courses to help students prepare for the new test. BTC has also increased the number of students it can test at a time.

“We’ve remodeled our Student Success Center and that includes our testing service area,” Tann said. “We just encourage people. The fear is not worth putting it off. We have the resources, and so many of them are at no charge.”

She said instructors also work to pre-test students to evaluate what areas they need help with and to form an individualized course of action for them.

Nationally, Tann said 40 million people still need their high school diploma. Prior to the test change 8 million people a year would begin their GED.

Nationally, 60 percent of students are passing the new test. At BTC 61 percent of students are passing it, and the state average is 65 percent of students.

Tann said she spoke to a student who started the 2013 test and wasn’t able to complete it. He then tried the new one in 2014. Although it was more challenging, he said he did OK as he was well prepared through his BTC courses.

“He went through our classes and that was a benefit. He knew what to expect,” she said.

One nice thing about the new test, Tann said, is that scores are calculated automatically.

“By the time they drove home they can pull up their online account and see where they are at,” Tann said.

Thank you, Oscar, for sharing the information about GED® completion rates at Blackhawk Technical College and in Wisconsin. It's good to see the rates improving. These improvements are in line with a recent report that was posted earlier in this thread by CT Turner. I know we are all eager to see more students earn a high school equivalency credential.

I hope others in our community are starting to see similar improvement in pass rates. Let us know how things are going in your area.

Cheers, Susan

Moderator, Assessment CoP

Colleagues,

If you continue to follow the state-by-state results of the new GED, this July 9th Washington Post article focusing on Texas, "The big problems with Pearson’s new GED high school equivalency test," may be of interest: http://www.washingtonpost.com/blogs/answer-sheet/wp/2015/07/09/the-big-problems-with-pearsons-new-ged-high-school-equivalency-test/.

David J. Rosen

djrosen123@gmail.com


Colleagues,

In 2014-2015, there was a lot of concern about the new GED® tests and their higher standards. Now, three or four years later, is this still a problem? Or have 1) changes by the GED Testing Service and Pearson, lowering minimum passing scores on some tests, 2) changes in teaching practices, and 3) teachers, administrators, and students adjusting to the new computer-based format and the new test content made this no longer an issue? Are your HSE preparation students who are preparing for these tests: 1) passing them at about the same rate that they were passing the GED® 2002 series tests? 2) doing nearly as well? or 3) still doing poorly on the new test? If they are doing as well as, nearly as well as, or better than on the former tests, what do you believe accounts for the improvement?

David J. Rosen

Good afternoon. I thought I'd post some stats related to this topic to give folks a more national line of sight into things like pass rates, average passing scores since the GED scores were adjusted, and postsecondary outcome measures to date:

Pass Rates for the GED Test:

Overall Pass Rate in 2013 (last year of testing on the 2002 Series GED Test) nationally was: 76%
National Pass Rate in 2017 (the most recent full year of testing data) across all GED test-takers was: 79%

Average GED Scores in 2017:
Average passing score: Math, 153; RLA, 155; Science, 156; Social Studies, 155
Average non-passing score: Math, 141; RLA, 140; Science, 141; Social Studies, 140
Average overall score: Math, 150; RLA, 152; Science, 154; Social Studies, 153

*It is important to note that both the average passing scores and the overall average scores are consistently above the HSE cut score for the GED test and continue to trend upward.

Other outcomes measures:

Attendance in a college-level program after earning a GED credential since launch of the new exam in 2014 (based on annual data matches with the National Student Clearinghouse - latest data match was Fall, 2017):
35% enrolled within 1 year.
45% enrolled within 3 years.

Persistence
More than 90% of GED grads in the postsecondary data match continue to be enrolled semester to semester.
Persistence rate of GED grads in postsecondary on the 2002 Series GED Test was 29%.

CT Turner
Senior Director, GED Testing Service

 

Thank you, CT, for sharing these encouraging stats. I hope you will continue to post here as new data become available. I am especially interested, if you have data on this, in knowing not only about GED grads' persistence in postsecondary education, but also their completion rates in one-year certificate programs, two-year AS and AA programs, and four-year undergraduate programs. Given the significant increase in postsecondary persistence rates, I expect you will also have good news to report on completion rates. It appears from these data that the response from adult learners to the GEDTS raising the bar, often with the help of GED preparation programs, has been to get over the bar, pass the exam, and do better in postsecondary education. In addition to percentages, however, can you also share a comparison of the national number of GED test-takers in 2013 and 2017?

David J. Rosen, Moderator

LINCS CoP Program Management group


Hi all, just in a quick survey of the graduates from our program in the last year, half took the HiSET and half took the GED. But I see something similar to the scores GEDTS posted here. Only two graduates taking GED tests scored 146. All the rest had scores of 150 or above, with the highest score being 165. The average score on a passing GED math test for these students was 153. Of our HiSET graduates, where the passing score is a scale score of 8, only one student scored a 9; the highest score was 19 (out of 20), and the average was 12. I don't have scores specifically for students who have taken the test and not passed, as that is relatively uncommon in our program. But we have changed our class structure to better match specific levels of study, in which we move students through to the highest level of our classes (using TABE and alternative assessments). This highest level of class has students who are at a high school grade equivalent, 9th through 12th grade, on the TABE. This is where we really encourage students to go take tests. I think this has increased the scores on HSE tests, no matter which test students take. At this level, we are covering a lot of higher-level math that enables students to pass with higher scores. When students progress through our levels, we are seeing success in the testing center. They are persisting and staying in class longer to build these higher-level skills.

-Stephanie

Thanks for posting this inquiry, David. And thanks to CT Turner and Stephanie for sharing statistics. It's good to see improvement in HSE pass rates, and to hear that more adults have enrolled and are staying in post-secondary programs. It would be great to hear from more members on these issues.

I do wonder about lower-level learners. It seems obvious that it would take much longer for learners at lower levels to prepare for these tests. It would be interesting to hear stories from teachers who work with lower-level learners who have HSE goals.

Cheers, Susan Finn Miller

Moderator, Teaching & Learning CoP

Colleagues,

I would like to call your attention to some new data about the high school equivalency (HSE) exams, and to thank our colleague JoAnn Weinberger for calling my attention to this column from the Hechinger Report newsletter, "GED and other high school equivalency degrees drop by more than 40% nationwide since 2012": https://hechingerreport.org/ged-and-other-high-school-equivalency-degrees-drop-by-more-than-40-nationwide-since-2012/

Here are some highlights from the short article:

  • "Decline linked to 2014 change in exam and adult ed budget cuts, researcher says"

  • "Little is known about what has happened to adult learners seeking high school degrees since the old GED exam disappeared because annual data is no longer published as it used to be every year. But thanks to a data collection effort by an expert in adult education at a nonprofit research organization in New York, Center for an Urban Future, we now have evidence of a sharp decline in new high school equivalency degrees in almost every state between 2012 and 2016."

  • Specifically, the annual number of test takers who completed one of the three exams has fallen more than 45 percent from more than 570,000 in 2012 to roughly 310,000 in 2016. The number passing the exam and earning a diploma has decreased more than 40 percent from almost 400,000 in 2012 to just over 225,000 in 2016.

  • “Every state has fewer people obtaining high school equivalencies. We need to have alternative routes for people who don’t graduate from high school. Communities and states that have large populations of people who lack a high school credential are places that will have heavy users of public services, whether welfare or Medicaid.” (from Tom Hilliard, a senior fellow at the Center for an Urban Future)

  • A map is offered, with a number of red states where the number of people obtaining an HSE dropped by more than 50% between 2012 and 2016.  You can go to the map from the article and click on your state to see what the drop there has been.

What do you think?

What should be done about this? Should more states offer alternative competency-based credentials like the National External Diploma Program? Should states create their own equivalency exams? Should HSE test-makers, as some have recently done, lower the cut scores for some of their tests, for example the math or writing test? If we had 2017 data, would we see a different picture? Has this disparity in opportunity been recently narrowed, or not? Perhaps you do not see this as a problem, and view it as an inevitable consequence of raising the testing bar so that more people with an HSE can not only enter post-secondary education but now, with a higher level of knowledge and skills, can succeed. If so, does this mean that we need to increase the intensity of instruction at pre-HSE or even more basic levels? If so, how can programs do this without significantly increased funding? What's your perspective on this news?

David J. Rosen


David and all,

I think this information paints an incomplete picture. From the article, "Specifically, the annual number of test takers who completed one of the three exams has fallen more than 45 percent from more than 570,000 in 2012 to roughly 310,000 in 2016. The number passing the exam and earning a diploma has decreased more than 40 percent from almost 400,000 in 2012 to just over 225,000 in 2016." 

Without enrollment data, this information may or may not be accurate. Did enrollment also decline at that same time? Are there fewer students completing their high school equivalency certificate because there are fewer students enrolled in programs?

I'd love to see if anyone can shed some light on this. 
Thanks, 
 Kathy 


Indeed, the number of test takers has greatly decreased from 2012 to 2016, perhaps for many reasons such as less money for adult ed, higher HS graduation rates, etc. However, given the reported numbers of test takers and passers, the pass rates have not changed much. Actually, there's a slight rise in the pass rates. Using the numbers provided, in 2012 70% passed the test. In 2016, 72.6% of those taking the test passed.
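For anyone who wants to check that arithmetic, here is the quick calculation using the rounded figures quoted from the Hechinger Report article:

    # Rounded figures from the article quoted above.
    completers_2012, passers_2012 = 570_000, 400_000
    completers_2016, passers_2016 = 310_000, 225_000

    print(round(100 * passers_2012 / completers_2012, 1))  # -> 70.2 (percent of completers passing in 2012)
    print(round(100 * passers_2016 / completers_2016, 1))  # -> 72.6 (percent of completers passing in 2016)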