Recent Research on Technology and Adult Basic Skills: Day 1

Technology and Learning Colleagues,

Today we begin our discussion with three panelists who have led studies on the use of technology for adult basic skills: Dr. Robert Murphy of SRI International, Principal Investigator for the Technologies for Adult Basic Literacies Evaluation (TABLE) study; Dr. Jill Castek of Portland State University (Oregon), with the study Exploring How Problem Solving in Technology-Rich Environments Can Be Used to Design Responsive Programming; and Adam Newman of Tyton Partners, a market research firm in Boston, with the survey Learning for Life, the Opportunity for Technology to Transform Adult Education.

I have asked our panelists to describe their research today and, if they wish, to say a little more about themselves. I hope that they all might be able to answer these questions in describing their studies:

1.    What are the most important findings from your study for adult basic skills administrators, teachers and tutors and professional developers?

2.    From your study what have you learned about adult basic skills learners’ use of technology, for example their digital literacy skills, their ability to do online learning, their persistence in online courses, and their interest in using technology?

3.    What have you learned about using the PIAAC Education and Skills Online assessment for literacy, numeracy and/or PSTRE skills?

4.    During your research, what are some important questions that emerged, for which further research is needed regarding adult basic skills learners’ use of technology and online content?

As discussion participants read the panelists’ descriptions of their studies here, I hope they will suggest additional questions. If so, please post them at any time this week, the sooner the better.

If you haven’t yet, you may wish to read these featured resources that will provide some background on the studies:

Learning for Life, the Opportunity for Technology to Transform Adult Education, Adam Newman, Tyton Partners

Using PIAAC's Education and Skills Online to Examine Adults' Skills Locally, Dr. Jill Castek, Portland State University

Description of the TABLE study, and Slides from a COABE 2016 Presentation on the TABLE Study, Dr. Robert Murphy, SRI

Here is some background on our panelists:

Dr. Jill Castek is the Director of the Literacy, Language, and Technology Research Group at Portland State University. As an active researcher, she has connected adult education with innovations in health care, uses of technology, libraries, and K-12 and postsecondary education. Dr. Castek's work explores educational opportunities for adults, especially members of economically vulnerable and socially excluded populations. One facet of this work involves improving library practices, programs, and services for adult patrons, especially economically vulnerable adults, seniors, English learners, and others lacking basic digital literacy skills. A second facet aims to increase patient engagement with health care by improving the digital literacy skills of patients in community health settings. A third facet involves describing how vulnerable adults acquire digital literacy. By working with a range of educational and community-based organizations, these projects have given her valuable experience working with local adult education providers and practitioners in multiple states. Implications of this work can help underserved populations cross the digital divide, use broadband Internet services, and acquire the knowledge, skills, and attitudes needed for personal, social, and economic success in the digital world of the 21st century.

Dr. Robert Murphy is the Director of Evaluation Research in SRI International's Center for Technology in Learning. His research focuses on the design and implementation of formative and summative evaluations of educational programs and technologies. His research combines studies of program implementation across different populations and contexts with studies of program outcomes using experimental and quasi-experimental designs. Currently Dr. Murphy is the Principal Investigator (PI) of a study, funded by the Joyce Foundation, of online learning within adult basic education programs serving low-skilled adults. He is also a co-PI of a U.S. Department of Education-funded study of an online math homework platform. He was also the PI on two recently completed projects that studied how K-12 schools are blending online and offline instruction to support teaching and learning. This work, funded by the Bill & Melinda Gates Foundation and the Michael and Susan Dell Foundation, included a study of Khan Academy. Dr. Murphy has a Ph.D. from the School of Education and Social Policy at Northwestern University. He also completed a program in public sector analysis at the University of Dublin - Trinity College. He has an M.S. in mechanical engineering from Rensselaer Polytechnic Institute. With SRI's Barbara Means and Marianne Bakia, Dr. Murphy is co-author of the book Learning Online, published in April 2014.

Adam Newman is a founding partner of Tyton Partners. He has more than 15 years of experience in consulting and market research, management, banking, and teaching across all segments of the education sector. He began his professional career as a K-12 educator and athletic coach at schools in Boston, MA, and New Orleans, LA. He holds an AB in English from Duke University.

David J. Rosen

Moderator, Technology and Learning CoP

djrosen123@gmail.com

Comments

I appreciate the document links that Adam Newman provided. In going through them, I found a telling graphic showing who was surveyed about using technology with adult education students. In the second document (which is aimed more at getting materials providers to understand that there is a market in adult ed), one graphic gives an "Illustrative Distribution of Customer Sites in Adult Education". The 18,700 total sites on the graphic included 1,100 community colleges (remedial education for people with a HS diploma), 4,500 correctional institutions, 8,900 public libraries, 1,300 K-12 school districts, and 2,500 American Job Centers (where only 1% of students actually receive ABE-type educational services, per the first of these documents). Only 400 sites are community-based organizations, with this added caveat:

"Data for Community-Based Organizations represents the number of sites receiving WIOA Title II funding; there are thousands of other CBOs providing adult education services without Title II federal funding."

Who has spoken with or surveyed these thousands of other CBOs that do not have Title II funding? What has been learned about their technology needs, teacher (or volunteer) familiarity with technology, and student accessibility?

These are the questions I would hope to have answered in this discussion. This is where the large unmet need is, both in student access and professional development.

Dorothea Steinke

Lafayette, Colorado

Thanks for the question, Dorothea, and it's a great one. In our efforts, we had a number of conversations with national CBO leaders, most of whom lamented the lack of visibility into efforts across the diversity of CBOs active in the space. Subsequent to this national work, we have worked with funders in a specific state to understand similar dynamics and have had somewhat better luck understanding the CBO experience. Generally speaking, the CBO segments in our research are equally bullish on the role of technology, but tend to lag in technology infrastructure and access relative to programs in K-12 districts and postsecondary institutions.

Dorothea and Adam,

There are some important reasons that may account for CBOs having a greater need for technology (hardware, software, high-bandwidth Internet connectivity) and for instructor training in how to use technology well:

  • CBOs are not eligible for the federally subsidized E-rate that public schools and libraries benefit from, which makes it easier for schools and libraries to provide high-bandwidth access to students and patrons
  • CBOs tend to be smaller than public school systems and many urban and suburban library systems; because they are smaller, they often lack the resources to support large-scale hardware and software purchases, hardware and software maintenance and upgrades, and large-scale technology professional development for teachers
  • Because CBOs are often smaller, and sometimes not part of larger networks, they don't learn about opportunities that are available to them.

I am very glad you raised this concern, Dorothea, about how CBOs in many states may be "3rd class citizens". If adult basic skills providers in public schools and community colleges are, as I have sometimes heard them described, "2nd class citizens" in the education world, then perhaps CBOs are "3rd class citizens". There are some interesting exceptions, of course, where CBOs are strong and well-supported adult basic skills providers: some adult charter schools, some not-for-profits that are part of larger national umbrella organizations, and perhaps others.

It might be useful to have a LINCS discussion -- perhaps in the Program Management or Diversity CoP -- on how to strengthen CBOs in the adult basic skills provider system.

Adam, I will certainly be eager to learn more about the results of your one-state study of CBOs' use of technology.

David J. Rosen

Moderator, Technology and Learning CoP

djrosen123@gmail.com

Hi Everyone,

In Ontario, Canada, government funding provides for online adult literacy programming via live web-conference delivery of math, literacy, and other skills under the e-Channel banner. Learners can register with Anglophone, Francophone, or Aboriginal cultural stream providers to undertake programming this way. (ASL-communicating students work with mentors via video links through registration and assessment, and once a training plan is developed they follow asynchronous programming using an LMS.)

A guided tour provides a view of how the synchronous delivery works: https://youtu.be/J0PPZ_GjAM4. In the lower right-hand corner of the page http://e-channel.ca/students you will see the current course list and get an idea of which math, literacy, and other courses are offered in live classes as well as in asynchronous options.

Hi All,

I'm excited to be a part of this thread of conversations. I've been involved in digital literacy research for several years. Our research team at Portland State just completed a 3-year study that addresses digital literacy acquisition among vulnerable adult learners. For more information, research briefs, and case studies, see http://pdxscholar.library.pdx.edu/digital_literacy_acquisition/, and to access an executive summary or to read the research brief series, go to http://pdxscholar.library.pdx.edu/dla_research_briefs/. Our team is very proud of this work, conducted within a national network of partnerships, and we wanted to build on it in our local Portland, Oregon, community to support adult learners' digital skills. These skills are vital to participation in personal, civic, and workplace spheres.

Adult learners often turn to the local library as a context for both access and learning opportunities.  Our local library provides computer training for individuals with very limited digital literacy experience, but has less programming designed for those individuals who have some initial skills, but require more support in learning to navigate the digital world to complete tasks such as looking for a job, accessing health care options, searching for information, and completing taxes to name a few. For this group of learners, the skills of digital problem solving are key.

Our research group at Portland State, in collaboration with the Multnomah County Library, has been funded by the Institute of Museum and Library Services through a 2-year national leadership grant to examine the digital problem-solving skills of library patrons. The project is entitled Advancing Digital Equity in Libraries: Examining Patrons' Digital Problem Solving Skills (see https://www.pdx.edu/linguistics/pstre). One goal of the project is to bring libraries into the conversation about PIAAC, joining national and international efforts. As a result, we've begun using the PIAAC online assessment (known as Education and Skills Online; see http://www.oecd.org/skills/ESonline-assessment/abouteducationskillsonline/) to examine patrons' digital problem-solving skills. Collecting and analyzing these data provides the library with the information it needs to design responsive programming that meets patrons where they are in their acquisition of digital skills.

While data collection has been underway for only a few short months, we've begun exploring initial patterns in our data. First, it appears that digital problem solving requires more than digital skills -- it requires perseverance and stick-with-it-ness (in short, the traits to work through several parts of a task, even when the task is challenging or difficult to accomplish, or when there are few signposts to tell you whether you're on the right track). While digital literacy skills are teachable, teaching traits such as perseverance requires experience and practice in addition to support and guidance.

In the initial stages of work with the library, we've been exploring instruction that takes a more collaborative learning approach. This approach draws on multiple resources, and multiple networks, to work through digital problems, since no one person can be an expert in all areas. Distributed expertise is based on experience drawn from a variety of situations. Learning materials that situate learners in real-life scenarios, where they can observe experts and talk through the options and steps involved in problem solving, can be helpful instructional strategies.

New questions that are emerging require us to look at the design of library interfaces as well as learner skills when it comes to digital problem solving. By examining things from both perspectives, we're able to work simultaneously from a resource perspective and a learner's perspective, and use what we've learned about the challenges learners face to make those interfaces more intuitive to use.

I hope this forum post opens the door to discussion of these ideas: a collaborative approach to teaching digital skills, embedding instruction in real-life scenarios, and interface design, to name a few. I also hope we can discuss the use of Education and Skills Online to learn about adults' skills using a valid and reliable framework from PIAAC.

I look forward to your ideas and thoughts!  I'm excited about engaging in discussions with you!

Hello Jill,

Thanks for helping us kick off this discussion with such interesting observations. I am hoping to see questions about your study(ies) from others here – both participants and other panelists; here are mine:

  1. To answer this question you may want to draw on the previous study you mentioned, that I believe is the Institute for Museum and Library Services (IMLS)-sponsored BTOP programs study.  You refer to the importance of digital skills not only for personal and workplace spheres but also for the civic sphere. What have you learned from the current study, or your previous studies, about the digital skills needed in civic life?
  2. Since both your study and Robert Murphy’s study have used the ETS Education and Skills Online assessment, based on the PIAAC Survey of Adult Skills, I wonder if you could describe in some detail, as he has done, the opportunities and challenges of using this assessment, especially with participants who lack high school graduate level reading, numeracy/math and/or digital literacy skills.
  3. It is especially interesting to me that you have found that a so-called "non-cognitive" skill -- perseverance, or as you also named it, "stick-with-it-ness" -- is critical for problem solving in a technological environment, especially when, as you put it, there are no signposts to tell you if you are on the right track. You wrote, "While digital literacy skills are teachable, teaching traits such as perseverance requires experience and practice in addition to support and guidance." I wonder if you could tell us more about that, either from this study or from previous studies you have done.
  4. Please tell us more about what a “learning collaborative”  or “collaborative approach to teaching digital skills” approach means in the context of working with the Multnomah County Library. Perhaps give us some examples of what you described.
  5. It sounds like "distributed expertise" may be a particular strategy. Is it? And if so, can you describe it?
  6. What are the "library interfaces" you refer to? Are these library web pages, the physical environment of the library, library personnel, or something else?

I look forward to seeing your responses.

David J. Rosen

djrosen123@gmail.com

Hi All,

I've found the threads within the wider conversation very intriguing and thought provoking.  I appreciate the opportunity to engage with this online community and share ideas.   

David, you had several great questions (thanks for thinking deeply about what I said in my first post). I'd like to take your list of questions in pieces across the week's interactions because I'd like to leave room for others to chime in with their own experiences and thoughts. I'm a researcher, and by nature that means I have more questions than answers ;-)  Others' experiences are just as relevant as my own (if not more so).

For now, I'll tackle your first question about people in our study of vulnerable adult learners' acquisition of digital literacy skills and what they seemed to be seeking to learn vis-a-vis civic participation.  I'm so glad you asked this question!!

Our population was made up of 12,000 learners from 6 locations in 5 states. All sought to learn digital literacy skills through programming in their communities that was offered as a result of Broadband Technology Opportunities Program (BTOP) funds. Learners who took part set their own goals, which they pursued through a self-access, tutor-facilitated learning system called Learner Web (see learnerweb.org). Perhaps not surprisingly, many of these individuals were looking for ways to volunteer in their communities through libraries, senior centers, shelters, and other social service institutions. Since learners were receiving support to learn digital literacy skills in the community computer labs they attended, many wanted to give back and help others (as they had been helped). For some, the confidence found in learning translated, in essence, into spreading the love of learning to others in a variety of different contexts. Online listings are a primary means of finding volunteer opportunities and civic networking. More detail about civic and other forms of volunteerism is available in the case studies entitled Volunteers in an Adult Literacy Library Program: Digital Literacy Acquisition Case Study and Community Connections: Digital Literacy Acquisition Policy Brief. Both are available at http://pdxscholar.library.pdx.edu/dla_research_briefs/

Other digital skills that connected to civic engagement included ways to be a more savvy consumer of technology products and services (such as purchasing Internet services), and the desire to advocate for affordable and accessible Internet access for all.  The current legislation around expanding the Lifeline program to include Internet access can be seen perhaps as an outgrowth of this advocacy.  

I look forward to what others might see as civic pursuits related to digital literacy.  And more thoughts in response to your other questions to come!

I went through the Education and Skills Online demo, and my immediate impression was that the content could just as well have been delivered through a print book instead of digitally. For most people, the digital experience is limited to the user interface, which is frequently modeled after non-digital artifacts. For example, as I am writing this, I am reading the PC version of the New York Times site in a browser. The layout is similar to the print version. The biggest difference is that to continue reading an article, you need to click (or tap) on it instead of flipping to the print page on which it is continued. Because I was familiar with reading print newspapers, my learning curve was virtually nonexistent. Someone who had never read a print newspaper, however, might well have a problem navigating the site.

The same holds true for online research. Whether they realize it or not, anyone who has ever written a research paper for a class has almost all the skills needed to research a question in a browser. And with semantic search capabilities, the biggest issue is knowing how to formulate your question correctly and vet your results. 

I have often noticed a misperception that because something is done on a computer, it is completely different from what is done outside the computer and is something only "very smart people" and "geeks" can do. In reality -- and, again, I'm speaking at the user-interface level -- 80% or 90% of what are termed digital literacy skills are non-digital skills that are enhanced through using a computing device and the Internet. I would much rather spend 5 minutes finding an answer to a question using a search engine than a day in a library.

In addition to perseverance and stick-with-it-ness, I would add curiosity and self-motivation to the list of soft skills needed for digital problem solving. But these are character traits needed for any kind of problem solving, not just digital ones.

Hi rwessel51 - 

Your post pushed me to think about problem solving, and specifically the PSTRE assessment, in a new light. You likely know that the impetus for the assessment is the assumption that, because of technological ubiquity (increased access to information and distributed knowledge via the Internet, and/or demands for the use of technology in many contexts), increasingly more tasks and problems require the use of some digital technology to solve. The requirement may be one of necessity (having to use an online venue for tasks like filling out a job application, signing up for health care resources, or addressing issues with immigration status) or one of expediency (your library example). Regardless, I do think it is important to consider the impact of digital technologies on the way we solve problems because, as you say, "very smart people" will make use of those technologies to solve problems. I think our job as education practitioners, advocates, and researchers is to consider ways to ensure that ABE learners - often newcomers to technology use - have access to support, direct instruction, and opportunities to practice such problem solving, so that when they need to accomplish a task in the real world they have the confidence to try to accomplish it without an intermediary.

Thanks for the post!

Jen

As I read and listen to discussions about digital learning, I hear much energy centered on barriers or challenges to learning or adoption. This gets me scratching my head a bit. On the one hand, I know many adult learners I work with have struggled with technology and with learning through technology. On the other hand, projects like "Hole in the Wall" demonstrate that humans have an incredible ability to self-teach, correct, modify, and then teach others, regardless of how foreign the material seems (there is a good TED video on Hole in the Wall here).

Additionally, I look at how my learners self-identify their comfort and competency with any new technology when they first come to me and again toward the end of their educational journey with me, and I find quite drastic changes. The main shift is that by the end of their time with me, technology is not as scary or intimidating. It is just another possible way to learn or interact with learning.

Many learners share that they come in expecting education and learning to be "about getting things right," and this need to be correct, or not wrong, is a huge challenge in learning. Digital learning opportunities can offer many safe ways for learners to "fail" in positive ways that non-digital failure does not. A simple example is doing multiple drafts of a piece of writing. With pen and paper, I detest writing because every revision requires me to write the entire piece over again; drafting is laborious because I fear the work of reproducing the whole thing over and over. In a digital version, my first draft is often anxiety-free because I know that all my errors will be easy to find and fix.

These thoughts get me wondering: how is problem solving affected by fear and anxiety? If these fears and anxieties derive from test-heavy forms of assessment, is it possible to open doors to learning by changing assessment? Is it possible that fear and anxiety may help some problem solvers and hinder others?

Real-world problem solving is not scripted the way it often is in educational settings. In the real world, the most meaningful learning comes when, after making your best effort, you fall flat on your face and then take the time to understand why you failed. In the process, you develop critical thinking skills, problem solving skills, a more thorough and intuitive understanding of whatever it was you failed at, and, as JenVanek said, “the confidence to try to accomplish it without an intermediary.”

How to effectively use technology in problem solving can’t be taught like a chemistry-class experiment in high school, where your grade depends on how well you followed the “recipe.” After you’ve learned the basics of the interface, you need to experiment. You need to ask yourself, “What does this do? What happens if I try this?”  You’re not going to ruin the experiment or blow up the building if you make a mistake, and I think much of the anxiety and fear some people have about using technology comes from having a little teacher inside their heads who keeps saying, “Don’t touch that! Don’t do that! Wait until I tell you what to do!”

The test here is not getting it right the first time but how you went about recovering from and learning from your failure. If you -- or your teacher -- will be punished for your failures, you are going to have a strong disincentive to take the kinds of risks needed for learning, and the assessments will tend to close the doors to learning instead of opening them.

I watch the effect of fear and anxiety in our math courses every day. It's huge.

I have also watched the "Hole in the Wall" talks and wonder about the *rest* of the people. Who are the individuals who seize the learning opportunity? Who doesn't? (And where does anxiety fit into that picture?)

I understand and appreciate your comment about problem solving being similar in digital and non-digital contexts. I agree with all that you said. However, I think one major difference in the digital realm is that the context online is never constant and is always changing. For that reason, digital problem solving is constantly required of any individual as they work with and figure out interfaces that are constantly updated, with new tools added and new features included. This happens each time an update occurs, but also when sweeping changes occur in the way an online site operates. For example, while many of us knew how to work with Google Docs, Google Drive came along and we needed to problem solve to learn the new features and layout in order to work with it optimally. Problem solving digitally is a mindset, and it requires learners to know that constant change is to be expected (and embraced).

I appreciate your comments and ideas; they made me really evaluate all that's "new" and constantly changing online.

Hi All,

I am the PI of the Technologies for Adult Basic Literacies Evaluation (TABLE) study funded by the Joyce Foundation. You can find details about the study in the two links David included in the description of this week’s panel discussion.

We are studying the use of 5 online learning products in 14 different adult basic education sites. The 5 products are GED Academy, Core Skills Mastery, Reading Horizons, MyFoundationsLab, and ALEKS. The participating sites include adult education programs within public school districts, community-based organizations, and community colleges. The goal of this research is to learn what it takes to effectively use online learning products like these to support teaching and learning with "low-skilled" adult learners (functional math and/or literacy levels from 4th to 9th grade), and the potential impacts of the use of these products on student learning and program outcomes. Some of the products focus on both literacy and math skill development, while others focus exclusively on math or literacy skills. Some products are meant to be used independently of the core curriculum, while others are meant to be integrated into the curriculum.

Most of the research sites started their use of the products during fall 2015, while a few started during the first quarter of 2016. We are currently in the middle of our major data collection activities, including site visits to each of the sites (involving observations of classrooms and interviews with administrators, instructors, and students), instructor and student surveys, and accessing student academic records. We are also collecting students' system use data from each of the vendors. In many sites, we are also administering the Education and Skills Online assessment (developed by ETS and based on the PIAAC).

I presented some of our early findings at COABE in April (see the link to PowerPoint slides in the panel description). Since then we have completed some additional site visits. I highlight some of the general findings in my responses to David's questions below. One thing to remember is that many of the instructors and students in the study are using online learning tools in a systematic way for the first time. All of these individuals are at the start of the learning curve, and for some that curve might be pretty steep, depending on the product and the individual. So for these instructors and students, their responses reflect their initial reaction to their first experience of a new type of instruction and learning.

1. What are the most important findings from your study for adult basic skills administrators, teachers and tutors and professional developers?

Many students told us they like the opportunity to learn independently of others in the classroom, to be able to struggle and make mistakes "in private", and to learn at their own pace. Many students found the online environments engaging. However, other students told us that they prefer learning directly from an instructor and did not like their initial experience with their online learning product.

Many instructors said they enjoyed the opportunity to allow their students to learn independently, and to learn how to learn independently, using online tools. Many instructors mentioned that the use of the products allowed them to add variety to their instruction and more easily differentiate instruction for students with a wide range of incoming abilities. However, some instructors expressed concern that the products would not serve their students well (particularly older students), and that students would get lost within the online environment and waste precious instructional time. Other instructors expressed concern that the reading level of their particular product's content was too difficult for many of their students.

Some instructors told us that they regularly use the system-generated reports of student progress to monitor students and inform interventions with individual students as well as their direct whole class instruction. Other instructors reported that they rarely view the reports provided by the online system.  Some instructors reported that they believe students’ ability to monitor their progress via system reports and receive immediate feedback was a significant motivating factor.

Some administrators and instructors with GED prep programs expressed concern that some products did not seem to be aligned with the new GED exam, or did not make it easy for instructors to identify and assign content that was aligned with what they were teaching in the classroom. Some administrators and instructors who were using products designed to be used by students independently and in a self-paced mode felt uncomfortable that students might be spending instructional time on content that was different from what was being covered in the classroom.

2. From your study what have you learned about adult basic skills learners’ use of technology, for example their digital literacy skills, their ability to do online learning, their persistence in online courses, and their interest in using technology?

Thus far, digital literacy has not come up as a key challenge for most instructors and administrators interviewed. Many instructors told us that the adult learners look forward to their time spent learning with the products. And based on a small sample of initial survey responses, students tell us that they are using the products when they are away from campus, often at home. However, instructors in at least two sites expressed concern that some students, particularly some older students in the program, may not feel comfortable learning in an online environment, “staring at a screen.” These instructors raised the issue that these students might feel the ABE program is not fulfilling its obligations to students by moving instruction online – “this is not what we signed up for.”

3. What have you learned about using the PIAAC Education and Skills Online  assessment for literacy, numeracy and/or PSTRE skills?

The administration of the Education and Skills Online (ESO) assessment has been a learning experience for all (including our partners at ETS and us at SRI Education). As many of you know, ESO was released for public use in August of 2015, around the time many of our sites began implementing their selected products. Our TABLE study represented the largest “pilot” of ESO at the time (this may still be true). We worked with ETS on the development of communication materials for instructors to use with students taking the test. Given that ESO is an online assessment, and this was its inaugural rollout, there were some technical glitches that led to some frustrating debuts at some sites, but nothing out of the ordinary. ETS staff were very responsive and resolved the issues quickly. Also remember that in almost all cases, sites were being asked to administer ESO on top of the testing that is required of them and/or part of their existing practice, including use of TABE, COMPASS, Accuplacer, etc.

Some of the initial feedback from site administrators and instructors on ESO includes the following:

  • Time for test administration is too long for many learners and takes away from instructional time. Some administrators and instructors expressed concern that due to the duration of the test, some students may suffer “test fatigue” and not try their best to answer questions correctly. With assistance from ETS, we analyzed data from an initial sample of test takers. We found that approximately 90% of students who start the test complete it. The time to complete the core ESO literacy and numeracy assessment for the average student was 1 hour 10 minutes, plus an additional 25 minutes if students were asked to also complete an optional reading component (those who scored below a minimum threshold on the core literacy assessment) or the problem solving with technology component (all students who scored above the minimum literacy threshold on the core). Due to a concern about testing time, we gave sites the option of administering the additional test components. According to ETS, the core literacy and numeracy component was designed to be completed in 1 hour.
  • A general concern of administrators and instructors is that students are not taking the test seriously since it has no impact on their success within their ABE program. While students are encouraged to do their best on the test, some respondents felt that many students understand that the test has no bearing on how well they perform in the program and therefore do not take it seriously, moving through the test too quickly without taking time to really think through the solutions or check their work. We will work with ETS to determine if we can identify indicators within the ESO system data of students who might fit this profile, and examine whether these students score significantly differently from their peers on the ESO assessment and the programs' own assessments.
  • Administrators and instructors within some GED prep programs want score reports to reflect progress on skills that are aligned with the new GED test. Some administrators and instructors within GED prep programs mentioned that they felt the reports provided by ESO on individual students are of limited use since they do not provide data that can be directly interpreted in terms of the progress students are making towards the likelihood of passing the GED exam.
  • Best administration practices from PIAAC need to be leveraged and communicated to sites for the administration of ESO. According to ETS staff, during the administration of PIAAC for the OECD, all test sessions are proctored, and proctors are told to encourage students not to spend too much time on any one problem and to proceed to the next problem if they get stuck – it is OK if you don’t know the answer to a particular problem. This is an attempt to keep frustration from building in those students who are struggling and anxious about answering a problem incorrectly. Remember, historically, many of these students have not had much success in formal learning and testing environments. While all ESO testing for the study was done on the campuses of the participating sites with an instructor present, instructors were not trained or provided with instructions on best proctoring practices.

4. During your research, what are some important questions that emerged, for which further research is needed regarding adult basic skills learners’ use of technology and online content?

As I mentioned, we are still in the middle of our data collection activities for the TABLE study. Below are a few questions that we feel are important to explore within the TABLE study or a follow-on research study.

  • How do the use of the products and its impacts vary for students from different age cohorts and with different levels of digital literacy?
  • For a single session of use, what duration of use is optimal for learning and how does this vary for different types of learners?
  • What human, product-embedded, and ABE program supports are needed to ensure students are making progress within the online environment?
  • What level of training, ongoing support, and resource materials are needed for ABE instructors to use a product effectively, and how does this vary for instructors with different levels of experience using educational technologies?
  • Are the products being used off-campus and how intensively?
  • What student characteristics, and product and ABE program practices are associated with more off-campus use?
  • For those products that are compatible with mobile devices, are students accessing the courseware on their smartphones, particularly when they are away from campus or away from a stand-alone computer? If not, why?

I look forward to your questions and interesting discussion this week. 

Bob

Thank you, Bob, for a great introduction to the TABLE study. I have a few follow-up questions for you, but first two comments:

  • This is a large and complex study with five products, 14 sites (in schools, community colleges, and community-based organizations), and several levels of literacy and math skills, further complicated by the use of a new assessment, the ETS Education and Skills Online, and other data-gathering methods. It is unusual, for the adult literacy education field at least, to have a study of this breadth and complexity. I hope it has yielded a great deal of rich data that can be mined to answer the important questions you have posed, and perhaps others.
  • My second observation is about the range of instructors’ comfort and competence in using online learning products and integrating them with their face-to-face instruction: it reflects the range I see in the field. I wonder if it reflects what other participants in the discussion see in their programs or states.
  1. You mentioned that some instructors expressed concern “that students would get lost within the online environment, and waste precious instructional time.” I wonder if, from the data you have collected, that concern has been found to be warranted. I have heard instructors express similar concerns, but wonder, given that these are all closed content management systems, not the wild and woolly Internet, if this is in fact what students experience.
  2. You mentioned that “Other instructors expressed concern that the reading-level of the content of their particular product was too difficult for many of their students.” Has this concern been found to match what students experienced? If so, I wonder if there is any evidence that students were not well matched to the products, and/or if there appear to be other reasons.
  3. You mentioned that some instructors said they frequently use system-generated reports of student progress, and that some of these said their students like and use the report data too, but that other instructors rarely use them. I am interested to know if you have found any instructor characteristics that account for these differences: age, gender, professional development in integrating technology, or other variables.
  4. You have provided some great information on the use of the ESO assessment. I expect that many readers are not familiar with this new assessment. Perhaps you could suggest some resources for those who would like to learn more about it, who may be considering using it in their own program or state.

I hope others who have questions for you will begin to post them now.

David J. Rosen

Moderator, Technology and Learning CoP

Djrosen123@gmail.com

David, thank you for your questions. 

As you know, we are still in the midst of data collection and the early stages of data analysis, so these findings are very preliminary. And unfortunately we have not analyzed enough data from instructors and students to know whether the reading levels of the content within each product were a good match for all students in the sample (Question #2), or what instructor characteristics were associated with more effective uses of progress reports (Question #3). However, in regard to Question #3, based on our prior research on the use of online learning products in the K-12 space, one of the universal findings is the significant underutilization of system-generated progress reports by instructors to monitor student progress and to inform their instruction outside of the online system. The reasons for this vary, and vary by product, but in general we hear from instructors that they don't have the time to review and interpret the reports, that the information reported is not directly aligned with the core curriculum so it's of little value, and that they don't trust the information provided; therefore they feel more comfortable relying on their own formative and summative assessments outside of the system to make judgments about student progress.

Regarding your general questions about instructor perception of the products versus how students experience them, based on the data we've collected and analyzed so far, the general feedback from students regarding their experience with the products has been pretty positive. Yes, some students tell us they don't like learning online for various reasons, but these students haven't been in the majority. That said, instructor perceptions, whether they reflect actual student experience or not, are important, for ultimately it is the instructors who are tasked with integrating and supporting the use of the online products in their classrooms. If an instructor doesn't feel a product is appropriate for or benefiting their students, then you can be assured that they will look for ways to reduce its use in their classroom and/or communicate to students that they believe their time using the products would be better spent elsewhere. Without an instructor's firm belief in the potential benefits of an online product and commitment to its use, it's highly likely the product, regardless of how well designed it is, will not get a fair test in that instructor's classroom. This puts the onus on developers to be able to effectively communicate to programs and instructors the value of their offerings, based on rigorous evidence collected on programs, students and instructors who are very similar to their own. This also highlights the need for rigorous outcome research on the use of online learning products in the ABE area, coupled with rich understandings of implementation and how products are being used and experienced.

Finally, if you want more information and resources on the use of the Education and Skills Online Assessment, you can check out their website (http://www.oecd.org/skills/ESonline-assessment/). 

Hello Dr. Murphy & others.

Thank you for this discussion!  These questions caught my eye.

  • What human, product-embedded, and ABE program supports are needed to ensure students are making progress within the online environment?
  • What level of training, ongoing support, and resource materials are needed for ABE instructors to use a product effectively, and how does this vary for instructors with different levels of experience using educational technologies?

You mention that the TABLE study includes classroom observation and interviews with teachers. I would love to know what data you've found that might support or contest the findings of a study done by Zhao et al. (2005). The meta-analysis revealed instructional involvement as a key to successful instruction in distance learning. They defined it as the "extent to which instructor is involved in actual delivery of content and available for interactions with the students" (p. 1846). They found that instructional involvement varied in the distance-learning programs included in the analysis and that this variance had an effect on the quality of the student experience: increased teacher involvement meant increased success (p. 1857). I put the full reference at the bottom.

In my work supporting ABE teachers as they develop their expertise as instructors in distance ed and blended learning environments, I look for ways to emphasize the significance of that study and try to guide programs to create space for instructor involvement. As you observed, many of the (new to distance) teachers you interviewed don't even feel they have the time to view the reports available to them in the proprietary curricula they use. That's disheartening! Feel free to respond here; please also know that I think there is value in reporting on the impact of non-proprietary efforts to provide instruction or to complement the instruction found in those proprietary curricula.

One more note, about this part of the first bullet above: "are making progress within the online environment". I would love to see how the TABLE study data connect progress within a learning environment to the broader learning goals of the students. I am guessing those goals are much broader than the progress benchmarks defined by the online environment. I mention it because I see new teachers getting hung up on getting learners through a particular online curriculum - losing sight of more personalized learning goals and potentially creating blinders that make it hard to recognize a need for outside/supplemental instruction or learning resources.

I look forward to reading more when the TABLE data are published.

Thanks!

Jen Vanek

Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What Makes the Difference? A Practical Analysis of Research on the Effectiveness of Distance Education. Teachers College Record, 107(8), 1836–1884. doi:10.1111/j.1467-9620.2005.00544.x


Wendy thank you for your comments, and everybody for the great discussion this week. 

Given all the discussion about distance learning and the supports and non-academic factors needed for success, I wanted to make one final clarification about our TABLE study funded by the Joyce Foundation. Our study focuses on the use of blended models of instruction, combining face-to-face lectures and instructional activities with time spent in online learning environments while on campus. It is not a study of distance learning. Clearly there are differences in the demands placed on products and program supports for students in distance learning courses compared to blended courses, with greater demands for support required of distance learning courses.

Almost all the participating sites (N=14) and instructors in the TABLE study are using the technologies in a blended model with their students - students are on campus or in classrooms meeting with their instructors each week and spending up to 30% of instructional time working within the online products. Instructors are present while students are working online and available to address questions and provide additional tutoring. Although students have access to the online products when they are off campus, none of the sites have made the use of the products outside of regular class time mandatory, due to concerns that some students may not have access to devices or the internet. We will have good usage records from most of the products that will allow us to understand how much the products were used outside of the campus-based instructional time and, in some cases, even the extent to which they were used on mobile devices. We also have data from student surveys regarding their use of the products outside of class time.

Hi everyone,

What an exciting discussion! I have been a distance-only teacher in the Massachusetts distance learning program for a number of years, and an ABE classroom teacher for 20+ more years before that. My students are at the HSE level, although I suspect some of their reading levels -- and certainly most math levels -- are below ninth grade. They are preparing now for the HiSET test, as Massachusetts chose not to go with the newest GED version. I found Jill Castek's early remark particularly relevant to us: "digital problem solving requires more than digital skills -- it requires perseverance and stick-with-it-ness (in short, the traits to work through several parts of a task, even when the task is challenging or difficult to accomplish, or when there's few signpost to know if you're on the right track). While digital literacy skills are teachable, teaching traits such as perseverance require experience and practice in addition to support and guidance." We find that perseverance is the biggest issue we face. We have used five different products over the years -- in the past MySkillsTutor, GED-i and one from Houghton Mifflin (I can't remember the name), and currently KET's FastForward and Khan Academy. The issue remains, no matter the platform. None of these products provides for direct teacher involvement with students, which my fellow instructors and I view as a major part of the problem; communicating with students requires extra steps on both sides for phone calls or emails. This would seem to substantiate the observation that teacher involvement is critical to success with these products, but interestingly those students who are enrolled in both distance and classroom-based learning generally use the products the least. (Of course, this is likely because the teachers are different and at a distance from each other, and there's no effective means of coordinating DL with what is being covered in the classroom.)

To get back to Jill Castek's comment, we find over and over that students want to learn and for one reason or another -- schedule, transportation, social anxiety -- don't find regular classes possible. What they consistently underestimate is the discipline involved in solo study, especially in the face of competing life demands. I can only echo her observation that teaching those essential habits of mind is a difficult battle. I'd love to hear any experiences or information about improving persistence in a program like mine.


--Wendy Quinones

Hi Wendy,

I echo your thinking and wanted to contribute a few thoughts. I think that praise and encouragement go a long, long way toward combating frustration. Recognizing that many tasks take time to think through and time to complete is also important to convey to students. Sometimes, I've found that students get discouraged because they think that they should be quicker and faster -- and that others around them are quicker and faster. Providing encouragement to work through multi-step processes one bit at a time, I've found, builds confidence. While encouragement and working time are vital, perhaps the most critical component is the student achieving a goal or task successfully. That builds a can-do attitude that can transcend a specific situation and carry over to new contexts.

What ideas or experiences might be supportive of helping to build persistence? 

Great ideas, Jill. I wonder what people think? Could the work of Andy Nash from World Ed on the Drivers of Persistence be adapted to the use of technology? What has been your experience with the following?

  1. Sense of belonging and community
  2. Clarity of purpose
  3. Agency
  4. Competence
  5. Relevance
  6. Stability

Opportunity to make a plug here if anyone is interested.  World Ed offers courses on persistence, some of which are free. http://elearningpd.worlded.org/courses/

Steve Quann


I've been keeping one eye on this week's conversations while juggling flaming alligators, but this post drew me in. #1 on your list, sense of community and belonging, is a BIG issue in distance learning. Rafts of research papers have been written about how critical it is to successful online learning design and instruction. That said, it sounds like a number of the implementations of technology being discussed here have students working primarily "alone" or independently on "skill and drill" type practice programs. While engaging in this type of work, learners are more likely to feel a sense of disconnection from peers and instructors. This is especially true if instructors don't have time to check on student progress and provide feedback on their technology-based work.

While work done completely at a distance is different from that which is blended, or from simple technology work in a face 2 face classroom, the "Community of Inquiry" model http://www.apus.edu/ctl/faculty/community-of-inquiry/ might be a good tool with which to examine the factors affecting "sense of community/belonging." Adapting this model to a broader discussion of "technology-based learning," one might ask the following question(s): "How well does the technology-based learning address..."

Social presence? This would address how welcome and connected to others learners feel when engaging in the technology-enhanced learning process, how well they apply that learning to other experiences, and how they connect to peers through and about the technology-enhanced learning process.

Teaching presence? This would address how the instructor connects with learners during and about the technology-enhanced learning process. How well is it integrated with other learning activities? What kinds of guidance and feedback are provided (including encouragement, celebrations, etc.)? This model recommends an approach based not on helping "just when the student asks for help" but on proactively planning assistance, encouragement, check-ins, etc.

Cognitive presence? This would address how students move through the learning process – approaching problems, seeking out new knowledge, gaining new levels of understanding, and sharing that understanding with the learning community. While some of this is based in the software chosen (and good software incorporates elements that effectively build cognitive presence), this issue would address goal setting, measuring progress against goals, applying learning in more real-world situations, etc. It also asks the question "Are learners just 'going through the motions' in the software, or are they engaging higher cognitive functions?"

For anyone working in learning primarily at a distance, or in a relatively balanced blend of distance and face to face learning, research suggests that this model is *critical* to online learning success. Applying this model more generically like this - to technology activities in more face 2 face settings - would be an innovation.

What do you think - are these helpful questions to add to the research process? 
Are they helpful to administrators making program structure/purchasing decisions or instructors planning learning experiences?

Duren Thompson
Center for Literacy, Education and Employment
Region II PD Center Staff
& LINCS National Technology & Learning Trainer

Wendy, Jill and others,

I think Wendy’s reflection on Jill’s comments, and her experience and observations about working with her students on different platforms, is one of the most fascinating aspects of this discussion. It brings home a common issue with teachers using technology, and that is letting the technology lead the instruction. Whether it is using a packaged product, OERs, courses in learning management systems, or the apps soon to be developed under the XPrize, some teachers (and learners) want to direct the learning, while others can tend to abdicate control and assume these products know best. This is a tension that I have noticed with some studies, since to test a platform’s efficacy, sometimes teachers are asked to step back. Yet we know (for now) that a blended approach works best, going back more than a decade to the research Dr. Jere Johnston did under Project IDEAL and to more recent findings on the efficacy of blended learning.

Wendy’s post was a call for us to keep reminding ourselves that the allure of the technology, with all the benefits of vetted content, standardization, engaging multimedia and adaptive learning, will fall short if tools are not in service to educators. (Those of us integrating technology should keep forward in our minds that “e-ducere,” derived from Latin, means "to pull out" or "to lead forth.") I think we need to see ourselves, and not the products, as the central facilitators of learning (and keep a balance of still being “guides on the side” of learners). And the key here is that I believe there does need to be “direct teacher involvement with students.” The best facilitators I know are communicating with students about content but also providing crucial non-cognitive support through discussion forums, email, screencasts and, yes, as Wendy mentioned, often on the phone. As Jill says, “teaching traits such as perseverance requires experience and practice.”

I hope panelists (or others) can say more about what our field is already doing well in this regard, and more about what they think, based on their findings, we as a field need to offer in our professional development to best train educators in using ready-made products, particularly in blended and distance learning.

Thank you all for a wonderful week,

Steve Quann
Director, EdTech Center @ World Education, Inc. 


Thanks for sharing your comments, Wendy. One observation I would make to build on yours is that suppliers often struggle to design products and services that fit seamlessly into the workflows of teachers and instructors. This is a problem across the education ecosystem, but perhaps even more acute in the adult education space based on our review of active suppliers. Designing products so that they work as seamlessly as many of the devices we use in our personal lives is not a hallmark of educational technology... yet. Unless and until suppliers deeply map and understand the needs and contextual teaching/instructional practices of these professionals, there will always be an "implementation hurdle" when integrating new tools and resources. I would add that this holds true to a certain extent whether we're talking about analog or digital solutions, although it often seems/feels more pronounced vis-a-vis digital ones.


Thanks for the opportunity to participate in this discussion, David. I lead a consulting and investment banking firm that works broadly across the education sector assisting a diverse set of organizations and leaders to address growth questions facing their organizations and make investments to catalyze their mission objectives most effectively. We have had the pleasure of partnering with several foundations - both national and more locally focused - to understand the aspirations of adult education and workforce development leaders, instructors, and practitioners vis-a-vis instructional technologies and other innovative teaching and learning models. One of the goals of our foundation partners has been to raise awareness among policy stakeholders of the appetite for technology and innovation among practitioners in adult ed and workforce development programs, and to catalyze interest in enhancing the ecosystem of instructional resources and tools that can specifically address the needs and gaps for adult learners (Note: On this latter front, the recent Adult Literacy XPrize is a good example of trying to stimulate development of new tools and solutions.) 

1.    What are the most important findings from your study for adult basic skills administrators, teachers and tutors and professional developers?

We felt that there are several key findings from our work. First, it is clear that practitioners in the field are desirous of enhanced availability of technology tools and resources to support the needs of adult learners. This is a community that is eager to benefit from the broader investments and interest in thoughtful applications of technology and education, but is largely not benefiting from dynamics in more traditional, institutional (i.e., K-12, postsecondary) segments. Second, mobile is a powerful, broadly available technology among the adult learner population, with roughly 55-75% of them owning or having access to a smartphone. This dynamic represents significant future potential for supporting adult learners. Finally, instructors seem bullish on the use and value of "free" instructional and non-instructional technologies (e.g., Google Apps, YouTube, Facebook, Khan Academy) and are already making active use of these tools. This trend raises a number of potential questions regarding the selection and efficacy of these types of resources adopted by instructors, and broader issues of resource quality, selection, and instructor professional development.

These issues all bear further investigation and discussion, as do others that emerged during our efforts. 


Hello Adam,

Thanks for your introduction to your research. You wrote, "This is a community that is eager to benefit from the broader investments and interest in thoughtful applications of technology and education, but is largely not benefiting from dynamics in more traditional, institutional (i.e., K-12, postsecondary) segments."

I have several follow-up questions for you:

1. What are some of the benefits you were thinking of? Online content designed specifically for the subjects and ages of the students? Public subsidies for schools' broadband access to the Internet?  Professional development for teachers in using and integrating technology in instruction? Research devoted to the questions of concern to adult basic skills teachers? Other benefits?

2. What do you think accounts for this difference, and what can researchers, adult basic skills practitioners, and/or adult basic skills policy advocates do to broaden the technology applications benefits for adult basic skills programs and schools?

3. From your research, and perhaps from other adult basic skills and technology research you may be familiar with, what can you tell us about how adult basic skills (including ESOL/ESL) learners are using mobile devices for learning?

4. Many adult basic skills teachers have found some software developed for or widely used by the K-12 market to be useful in adult basic skills education. I am thinking, for example, of Schoology, Edmodo, Kahoot, and, as you mentioned, Google Apps and Khan Academy. There may be other applications as well. Do you see ways that other useful software products developed for the K-12 market could be inexpensively customized or adapted to be as useful in the adult basic skills market? Are you aware of efforts to do that?

David J. Rosen

Moderator, Technology and Learning CoP

djrosen123@gmail.com

Hi, Adam.

I appreciated reading your reports and your introduction to them above.  I have two questions or maybe a question and a comment:

1 - The survey was delivered online, correct? How much do you think that introduced some bias in favor of those who use technology? 

2 - I agree with the observations made about tech-engaged teachers wanting more support and resources for using tech in their instruction, and that those teachers are not benefiting from the traditional model of schooling evident in most ABE programs. This is something that those working in ABE and doing PD with ABE teachers already know. I would really love to see policy research that spells out what is required in programmatic and funding environments to better support those teachers - especially in light of the current transition to complete WIOA implementation.

Thanks!

Jen

Thanks for your question, Jen. As regards your first question, yes, the survey was delivered online. I think it's fair to assume that this may have introduced some bias in our sample. At the same time, in situations like this across the primary research work we do, we try to account for the prevailing trends, recognizing we will always have some selection bias based on our approaches. In this case, I think it's fair to say that while the "entire" adult education practitioner community may not be as bullish as our sample, both the prevalence of instructional technologies in our learning environments and the ubiquity of technology in our lives more generally will require practitioners and programs to bring technology-infused instruction into adult learning models to remain contextually relevant.

As regards your second comment, I can only speak to it based on some work we've done in a Midwestern state. In a follow-up survey that we did - preserving many of the national questions, while also introducing new ones - relevant PD for adult education instructors in the effective, relevant use of technology emerged as a gap and an opportunity. As a result, the state agencies involved in drafting the WIOA application, along with state-specific foundations and CBOs, have flagged investments in this area as a priority moving forward. Obviously it is early days, but it is promising to see PD understood as fundamental to transitioning program practices.

Good Morning Everyone,

I am enjoying this discussion, as I am a math instructor in an adult public charter school that uses blended learning. We experience firsthand many of the things that have been mentioned. I think blended learning will work as more students get used to using the Internet in an educational setting. I think when blended learning programs are being designed, the designers should make sure that time when students can be off task is strictly limited. Most students haven't used the Internet to actually learn, so when left to navigate online learning on their own, in my experience, students will get lost because they revert to what they're used to doing on the Internet, which has been recreational. If we can help students make that switch and become more comfortable, blended learning will take off.

-Alfons I. Prince

Hello Alfons,

Thanks for your great observations. I would like to know more about how you help your students get more comfortable in using the Internet to learn. Do you demonstrate for them how to use specific applications? Do you give them Internet-based assignments? Do you try to build these new Internet learning skills upon Internet skills they might already have? Do you use an online content management system or learning management system and, if so, how do you help them to get comfortable with it? Do you find that other strategies work better for you? If so, what are they?

I would also love to hear from other adult basic skills teachers about their strategies for helping students get comfortable using the Internet for learning.

David J. Rosen

Moderator, Technology and Learning CoP

djrosen123@gmail.com

Good Morning David,

I let students work in the software on their own. My reasoning goes back to the fact that many people don't read the manuals for the technology they use. I then make the comparison that students figure out and use technology in their everyday lives, and I make sure that they understand that technology enhances the learning opportunities that are out there. The software provides Internet-based opportunities, so I do not have to do anything but check the scores and see how the students are performing on those assignments.

I've found that letting students work through things on their own works well, but I make sure that my students feel comfortable enough to ask for help when they inevitably face an issue. To me, that creates the lifelong learning environment that will help change the lives of my students. Let me know if that was a good start, or if you have any more questions. I'd be happy to answer them.

Thank you and have a great day!

Alfons

Thanks Alfons,

You may be aware that in the top-performing countries on international numeracy and math assessments (sadly the U.S. is not one), an important part of the approach to teaching numeracy and mathematics is providing a problem, at a carefully chosen level of difficulty, that will challenge but not overwhelm students. Then students are encouraged to work together to solve the problem, and often a teacher will ask students to share their solutions with each other. There are several important elements to this math and numeracy teaching and learning approach that I think apply to solving other kinds of problems in technology-rich environments and more broadly, as you have suggested, in lifelong learning. Here they are:

  • Learners are provided with a problem or task that is challenging, just beyond their reach, but not impossible.
  • They struggle to solve it. The struggle is important because it creates engagement, a desire to solve the problem.
  • They have an opportunity to help each other, often working in teams; even when working independently in the same space, they can provide each other with peer support. Those who have solved a problem offer tips and clues, but not "answers." Students are coached by a teacher on how to provide useful peer support.
  • When there is more than one way to solve a problem, this is acknowledged. Even when learners have already learned one way to solve a problem, they are encouraged to learn other ways, to develop versatility in their approach to problem solving, or at least to appreciate that there are other successful approaches.

In your description of what you do to help your students learn how to use technology for learning, Alfons, I see several of these elements. I wonder if you could tell us more about what kinds of issues students inevitably face when they are learning online and what your approach is to offering them help.

I hope others will join in this conversation too. One of the most important aspects of adult basic skills teaching is the art of helping students to become successful, resourceful, confident, and persistent lifelong learners. I am very interested in learning more about what adult basic skills teachers have found to be effective in helping students become successful lifelong learners.

David J. Rosen

djrosen123@gmail.com

Technology and Learning Colleagues,

With only two days left of our rich discussion on Research on Technology and Adult Basic Skills, if you have a question or comment, please post it today.

This would also be a good time for researchers to ask each other questions about their research, and the research implications.

I would like to reference a California study conducted by OTAN on how adult learners use the Internet, and to ask our researchers in this discussion, especially Adam Newman, to comment on how the findings from that study may be similar or different from the Tyton Partners Learning for Life Study or from other adult learner technology use studies you may be familiar with.  The link to the 2015-2016 OTAN study in California is https://adulted.otan.us/info.cfm?fuseaction=studentResults&yr=201516

If I understand this correctly, over 43,000 adult learners in California have participated in this survey, which asks them how and where they access the Internet and how they use it. 82% of those responding say they access the Internet using smartphones. More than 60% use the Internet for a variety of information searching and research tasks. There are other findings that may be of interest as well. I believe the participants are a sample of learners from publicly funded adult basic skills (mostly ESL) programs. Colleagues from OTAN who are participating in this discussion may have some corrections to my understanding or some additional information.

David J. Rosen

Moderator, Technology and Learning CoP

djrosen123@gmail.com

Thanks for highlighting the OTAN work, David. Our efforts have centered on program administrators rather than instructors, so we don't have access to adult learner data as rich and specific as this. What I can offer is that for the state-based work we conducted, we asked respondents to identify where their students have "consistent access to the Internet within their community." We received the following data:

  • Libraries - 97%
  • Cafes and restaurants - 72%
  • Adults' workplace - 66%
  • Adults' home - 56%

What is interesting in comparing perceived access (our data) and use (OTAN data) is thinking about how we can leverage those locations - and community partners - as an extension of our more traditional, formal settings for adult education and workforce development programs.

Technology and Learning Colleagues,

I would like to express deep appreciation to our research colleagues Jill Castek, Robert Murphy and Adam Newman, and to all of those in the Technology and Learning Community who contributed their questions and comments. Our panelists were fabulously generous with their time and their thoughtful comments, as were those of you who engaged them in important conversation about their research. As I expected, I learned a lot myself, and especially appreciate the panelists' genuine interest in engaging in this kind of discussion. I hope it was as useful to them as it was to many of us. I welcome their continuing to join us in this community of practice and to contribute their comments as they can. I also look forward to additional reports on their studies that are currently in progress, and to new studies they may undertake that may be relevant to adult basic skills educators.

David J. Rosen

Moderator, Technology and Learning Community of Practice

djrosen123@gmail.com