Teaching Problem Solving in Technology Rich Environments

Introducing our Discussion on Problem Solving in Technology Rich Environments

Colleagues,

The United States has participated in an international study that has brought attention to teaching adults how to use computer technology in problem solving. Sponsored by the Organisation for Economic Co-operation and Development (OECD), the study is known as the Programme for the International Assessment of Adult Competencies (PIAAC), and its data collection instrument is called the Survey of Adult Skills (SAS). "The survey measures adults' proficiency in key information-processing skills - literacy, numeracy and problem solving in technology-rich environments - and gathers information and data on how adults use their skills at home, at work and in the wider community." http://www.oecd.org/skills/piaac/

The PIAAC SAS was administered on a computer to a broad-ranging sample of American adults, not only adult basic skills learners. The survey is now also available for a fee as Education and Skills Online, which may be of interest to adult basic skills programs, adult basic skills researchers, and others. http://www.oecd.org/skills/ESonline-assessment/. The PIAAC SAS has shifted the discussion in our field about what technology skills adult basic skills learners need to a new realm that many of us do not yet fully understand. This problem solving domain assumes basic technology or digital literacy skills, and assesses higher-order thinking skills applied in environments that offer and expect the regular use of computers and possibly also portable digital devices.

We are fortunate this week to have with us prominent researchers who have looked at and written about the PIAAC PSTRE data: Dr. Jill Castek, formerly with Portland State University in Oregon and now with the University of Arizona; Dr. Stephen Reder, a professor emeritus at Portland State University; and Jenifer Vanek, a doctoral candidate in Curriculum and Instruction at the University of Minnesota and the Director of the IDEAL Consortium for distance and blended learning sponsored by World Education. We also have with us two experienced practitioners in numeracy and math and in technology skills, Edward Latham from Maine and Kenneth Tamarkin from Massachusetts, each with many years of experience in teaching adults both digital literacy and problem solving.

I hope we will have a lively discussion this week, with many good questions from participants in our LINCS Communities of Practice. Let's begin today with our panelists introducing themselves, followed by Jill Castek's description of key findings from the Portland State University Learner Web project's Institute of Museum and Library Services-funded research, and from other Learner Web research on digital literacy and problem solving. On Tuesday we will focus on questions about the PIAAC Survey of Adult Skills and in particular the PSTRE domain, and we will also hear from Jen Vanek about how to apply the findings from PSTRE research in adult basic skills (including ESL/ESOL) classrooms, research that will be published soon in a PIAAC paper. On Wednesday we will hear Kenny Tamarkin's and Ed Latham's thoughts in response to Jen Vanek's findings and recommendations; I hope we will also hear from many other teachers with questions and comments about how to teach problem solving skills in an environment that offers and expects the use of technology. Later on Wednesday or early Thursday, Steve Reder will describe the framework he has proposed in his PIAAC paper on PSTRE. On Thursday we will hear Ed Latham's and Kenny Tamarkin's reactions to Steve's recommendations, and I hope we will have reactions from other teachers participating in this discussion. Thursday will be the last opportunity for participants to pose questions that our panelists can respond to. On Friday panelists will have an opportunity to answer remaining questions and to reflect on the week's discussion, both on questions they have been asked and perhaps also on what they have not been asked.

I am greatly looking forward to this discussion, to our panelists’ presentations of their research, to practitioner and research panelists’ reactions to the research, and to questions and comments from members of the LINCS Technology and Learning, Program Management, and other Communities of Practice.

And now, panelists, please introduce yourselves.

David J. Rosen, Moderator

Technology and Learning, and Program Management CoPs

djrosen123@gmail.com

Comments

Hi, everyone.

Thanks, David, for inviting me to this discussion! Let me start with an introduction. I have worked in the field of adult education for nearly twenty years, including teaching adult English language learners in many different settings, creating online learning curricula for adult learners, providing professional development and technical support in the areas of digital literacy and distance learning, and conducting qualitative research in educational settings. In my current work with the IDEAL Consortium I help member states grow or revise their distance and blended learning programming by connecting practitioners and administrators across the country to share expertise and make the best use of relevant research.

My other job is serving on the leadership team of the Northstar Digital Literacy Assessment. Our latest accomplishment with Northstar is the release of an information literacy assessment, an assessment module that draws heavily on the PS-TRE framework (which I'll get into more tomorrow!). You can see this if you review the Northstar Information Literacy standards on this webpage. Finally, and most relevant here, I have written a research-to-practice brief on PS-TRE for the American Institutes for Research, which is in its final stages of review. The purpose of the brief is to frame the PS-TRE components in terms that can support classroom instruction. Given the impact and prevalence of information and communication technologies in our daily lives, our adult learners need to have some proficiency with PS-TRE, and what better place to begin than in adult education and literacy classrooms? I look forward to sharing more about my brief tomorrow.

Good day everyone, and thank you, David, for the invitation. I have been a technology integration mentor and mathematics teacher for over twenty years, working with K-12 and adult education teachers to increase learning opportunities and problem solving skills for learners in many locations in Maine. I am part of the State of Maine College and Career Readiness Standards team as a mathematics specialist and have independently been developing technology systems that help address many of the challenges in CCRS implementation.

I am an avid gamer and often explore how online digital gaming and offline face-to-face tabletop gaming can be used to improve learners' problem solving abilities. By creating local, community gaming groups, I have observed growth in problem solving skills in people of all ages. Structured, face-to-face, socially engaging game play offers an informal educational environment that can develop problem solving skills that are often very difficult to foster in other, more formal, environments. My exploration of the problem solving growth experienced in tabletop game play offers insight as to how our more formal education efforts might be improved to increase PSTRE proficiency.

I am looking forward to the discussions this week and hope we can all learn some actions that improve our learners' proficiency with problem solving in technology rich environments. 

 

Hello Everyone!  

I appreciate being invited to this discussion to exchange ideas with you. I am a faculty member in the department of Teaching, Learning, and Sociocultural Studies at the University of Arizona. In collaboration with others across campus, I'm exploring opportunities for technology-enhanced language learning. Prior to joining the faculty at the University of Arizona I was the Director of the Literacy, Language and Technology research group at Portland State University. I earned a Ph.D. from the University of Connecticut, where I was a fellow in the New Literacies Research Lab.

I am an active researcher and experienced educator whose work explores digital literacies, digitally mediated learning, and problem solving in K-12 classroom and community contexts. My work in disciplinary literacies explores how reading, writing, communicating, and collaborating digitally can support knowledge building and content learning. I am co-editor of a column in the Journal of Adolescent & Adult Literacy (JAAL) focused on digital literacies for disciplinary learning. I'm interested in designing learning environments, instructional design, implementing supports for diverse learners, and developing partnerships with a range of educational and community-based organizations.

Supported by a National Leadership Grant from the Institute of Museum and Library Services (IMLS), and together with colleagues at Portland State University, I am conducting a study of library patrons' digital problem solving in collaboration with the Multnomah County Library in Portland, OR. This research is designed to improve library practices, programs, and services for adult library users, especially economically vulnerable and socially isolated adults, seniors, English learners, and others lacking basic digital literacy skills. This work builds on a three-year study, also funded by IMLS, focused on the digital literacy acquisition process. To explore findings from this work, visit the Digital Literacy Acquisition and Equity Research Hub. I will share more details about this research in subsequent posts. I am honored to be part of this discussion thread on digital problem solving and its implications for instruction. Looking forward to sharing ideas with you!

Introducing Kenny Tamarkin

Greetings everyone, and a special thanks to David Rosen for organizing and facilitating this discussion and inviting me to participate. I have been involved in Adult Basic Education since 1972 and have taught computer skills since 1979. I have written or co-written a number of books for McGraw Hill Contemporary Books, including Number Power 6, Word Problems, and Social Studies preparation books for previous versions of the GED. For years, in my role as a regional technology coordinator for the System for Adult Basic Education Support (SABES), I worked in staff development. I recently retired from the Asian American Civic Association in Boston, Massachusetts, where I returned to the front lines as a curriculum developer and teacher of computer skills. My approach to teaching using technology can best be summarized by a single word: empowerment.

As someone who has a great interest in exploring how research can inform practice and how practice can inform the direction of research, I am looking forward to this week's discussion.

Hi everyone! It's wonderful to be part of this discussion; thanks for inviting me, David. I'm Steve Reder, a professor emeritus from Portland State University and a long-time researcher in adult literacy, digital literacy and second language acquisition. For me, as for many adult learners, these are not three separate topics; they are richly interwoven in terms of how we understand them and use them.

Early in my career I focused on understanding literacy itself as a technology of language and communication, and on the ways in which adults and communities not familiar with literacy learned and began to use it. I've since worked with numerous communities around the world and here in the U.S. to understand literacy development in this way. Digital literacy is in many ways an extension of this framework to a new technology, with, of course, some novel features not often seen with earlier technologies. Collaborating with colleagues at Portland State and across the country (including several of my co-panelists in this discussion), I participated in a network of community-based digital literacy programs in a range of settings and contexts across the country. Working in more than 100 computer labs in these diverse settings, in which learners were supported by both technology and face-to-face tutors, we learned some important things about how adults acquire and develop digital literacy. We'll be drawing on some of these insights and related research in our discussion this week.

I'm pleased there will be an opportunity to talk more about a paper I wrote about one way to understand digital literacy using the PIAAC data and PS-TRE framework. As you'll see, the paper posits a series of early stages in digital literacy acquisition, some of which are measured by PS-TRE scores within PIAAC and some of which are indexed by other information in PIAAC.

I’m really looking forward to this week’s discussion.

Thanks to our five panelists for their introductions. I would like to begin the discussion with these two questions for our panelists, and I encourage others who have questions to post them beginning now, and any time this week through Thursday.

Questions for the panelists:

1. Tell us (more) about how you got interested in digital literacy skills, and in problem solving in technology rich environments.

2. How does PIAAC distinguish digital literacy skills from problem solving skills in technology rich environments (PSTRE)? How do you distinguish these if you think about this differently?

David J. Rosen, Moderator

Technology and Learning, and Program Management CoPs.

djrosen123@gmail.com

1. Tell us (more) about how you got interested in digital literacy skills, and in problem solving in technology rich environments.

When I was 12, my family moved from the East Hartford, Connecticut, area to Upper Frenchville, Maine, to create a new home. Neither of my parents had ever built a house, but they had spent hours reading about it, so it was determined that it could not be so hard. We lived in tents while we spent the entire summer scavenging for recycled lumber and materials. This shift in lifestyle set the stage for my life-long study of problem solving. In middle school, our school purchased half a dozen Apple IIe computers. I immediately jumped into trying to learn to program on my own while getting bored in Computer Literacy class. The instruction in class was all centered on which buttons to press and how to carry the standard typewriter conventions into the digital realm. I wanted to learn what could be done and worry about how to do it as the need arose. I learned so much about programming (remember, the Internet was not available yet) and could manipulate the word processors and spreadsheets required by the teacher.

As I taught digital literacy skills at the high school, community college and adult education levels, I was frustrated that the focus in each organization was so centered on how to do procedures rather than on what the digital tools could do. There were a number of times I would get in trouble for going "off script" by trying to get to the creative uses and problem solving options any given digital tool might offer. As we often see in mathematics, the pressure was to "cover the material" so that the next study in a sequence would not be disrupted in the next course. "Keeping to the script" has caused many challenges for learners as they are exposed to different environments, different timelines, and different expected outcomes in the real world.

I was fortunate to live in Maine, where 1:1 computing for middle school (and later other grades) was introduced at a good time in my career. This state effort offered free laptops to students with the expectation that teachers would enhance the learning in schools. Professional development for teachers often consisted of full-day workshops sprinkled throughout the school year, but the content taught in the professional development was difficult for teachers to fully integrate. As part of a Title IID grant that brought the eMINTS training program into classrooms, I was hired to be one of the state trainer mentors. This expensive training centered on mentoring teachers in their classrooms, with the focus on how the tools can increase learning options and much less on drilling in the procedural operation of the tools. It was a transformative experience for all the teachers I had the joy of working with, and I learned a great deal from them as well. As the grant money dwindled, the state worked to move this same program into the adult education programs in the state. As we often see in adult education, the amount of money to support the participating programs was much less! Still, a good number of the programs in the state were able to benefit from some form of exposure to the idea that technology can enhance learning opportunities for students.

After years of funding and human hours spent trying to instill the spirit of Seymour Papert into classrooms, we have seen some successes and some progress. There remains much room for further growth at all levels in this work.

2. How does PIAAC distinguish digital literacy skills from problem solving skills in technology rich environments (PSTRE)? How do you distinguish these if you think about this differently?

Mark Twain said, "To a man with a hammer, everything looks like a nail." In many cases, digital literacy training focuses on how to operate a tool in isolation from the natural environments in which the tool might be used. Learning a tool simply to know what all its parts do does not prepare a learner for the world of uses to which that tool might be put. In PSTRE, a real life situation or challenge is presented, and the learner must use problem solving strategies to resolve the challenge using technology as at least part of the solution. If all a learner has experienced is how the technology operates (what buttons do what functions), the learner is missing the application and problem solving skills needed to be successful. If we look back to the hammer quote, it would be interesting to see how one might approach opening a tin can of food on a survival trip. One trained that a hammer's function is to pound nails will certainly open the container, but there may be quite a mess as a result and very little usable food left over. With problem solving skills in place and a good exposure to real applications of how a hammer might be used to pry, lift, grasp, lever or puncture efficiently, one is much better equipped to open that survival food. Learning any tool without appropriate exposure to possible applications, and without the flexibility of problem solving skills to adapt to challenges, produces very good industrial-age factory workers who comply well and do their one job well. Our current times require a much more even balance of problem solving flexibility, adaptation to different applications or uses, and proficiency with the operation of any available tool (digital, in our discussion).

Great hammer metaphor! I think it might be useful to extend it a bit. To capture the core concerns of PS-TRE, picture a digitally-controlled hammer. Knowing when to use that one versus a regular hammer, and then figuring out what to do when it doesn't work the way you had hoped, are important considerations with PS-TRE.

Panelists, and discussion participants, a theme of Ed’s reply to my questions about his interest in digital literacy skills, and problem solving in technology rich environments, is motivating situations that present problems that he (or his students or colleagues) care about solving: Ed mentioned that for him it was the need to build a house in Maine, or to learn how to write programs for an Apple II computer.

Ed, and others, as you have worked with adult education teachers to help them solve problems in a technology-rich environment, what are some of the most engaging problems -- for them -- that you have seen? Have your colleagues faced problems that are as engaging to them, Ed, as building a house and programming a computer were to you? If someone wanted to build a technology professional development program for adult basic skills teachers based on the problems that teachers typically face, the solutions to which involve using digital technology, what might some of those problems be?

Panelists and others, what are some of the most engaging problems or challenges that you, or teachers or students you have worked with, have seen or faced, whose solutions required or benefited from the use of digital technology?

David J. Rosen, Moderator

Technology and Learning, and Program Management CoPs

djrosen123@gmail.com

Hi, David.

I haven't worked directly with learners, but when I was actively teaching in a classroom I tried to boost relevance by better understanding learners' interests and needs. I think that this is no different; teachers can create activities that make visible the problems that learners are likely to encounter daily, or they can just ask.

A proverb states that "necessity is the mother of invention," and I think this provides much insight as to which situations will be most motivating. Simply put, situations that are important to the learner's life will provide all the motivation an educator could wish for. Getting to those individual-level details can be very difficult in teacher-centric settings, but it becomes much easier in settings that have a more student-centric focus.

There are some key common goals that most of my adult learners have appreciated. Many of the situations fall into the category of "Would you like to save some money?" Cell phone plans are a classic example that appears often online. I like to extend that one step further with students and ask whether there are ways to remove the need for a cell phone plan if one has access to free Internet services. This immediately gets interest levels up and leads to research and exploration of how one could use existing free resources to replace all the uses the individual values in having a cell phone.

Another common theme here in Maine is cutting down heating costs through our long winter seasons. Most of my adult education students live in substandard housing that is often poorly maintained and hardly built for cold weather climates. The question posed is, "What is the most cost effective way to cut down on your heating costs?" Since each person's situation is different, rich sharing of solutions is valued by everyone, as so many wonderful solutions are found. Sometimes learners will focus on local agencies discovered online that were later communicated with through email. In other instances, learners found DIY videos that demonstrated different ways to weatherize mobile homes and were able to use local social media tools to find salvaged or bartered materials locally to approximate the solutions in the videos. I have had good results with explorations into electricity costs, having learners discover not only how to do energy audits for themselves, but how to find the cheapest ways to get local support in enacting solutions for each energy sucker they found in their local area.

I wish to add a cautious thought. While there are some general prompts that may generate some interest, like those offered above, there is much more motivation with situations that are specific to an individual's situation. This requires building an environment of trust so that learners feel comfortable sharing situations that are often difficult to share with others. Sometimes just knowing a learner's career aspirations can offer a teacher some fertile options to offer situations. The teacher needs some flexibility when thinking of options and needs to be observant to catch when a learner is not latching on to any situation or option. 

Do others have some "general" situation challenges that might help a good number of learners get into a problem solving through technology situation?

You say that the training went beyond  drill and procedures for "here's how to work the machine" and delved into "how can we use the technology to enhance learning?"  

Did any of that apply to the learning for the students?   

I'm looking at my rambling notes for creating online lessons and the idea of "connecting representations"  -- getting students to connect, say, showing amounts with number bars and number lines and physical things and how they're different ways of showing the same thing to build the abstract concepts of quantities.   I'm wondering how I can get tech to help with that :)   

Thank you so much for sharing the question about how tech might help with representing amounts. A spreadsheet tool might be the first thought for many. After an exploration of how spreadsheets can produce many types of graphs, learners can be challenged to determine which spreadsheet graph would best represent given sets of data. It may be even more powerful to have data that might be represented by different graphs for different purposes. Taking a look at the annual electricity usage data that is often included at the bottom of monthly electric bills might be a good example. Using a pie graph of that data helps the learner get a good image of the expenses each month and see which months are the cheapest and most expensive, so that creating a budget can be easier. Using a bar graph can help compare how costs change from month to month to try to identify seasonal changes that might trigger the increases and decreases. This learning can help pinpoint where in the household the learner might focus on alternative ways to cut down electricity use.
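For teachers who want to see the same idea outside a spreadsheet, here is a minimal Python sketch (using matplotlib; the monthly cost figures are invented purely for illustration) that renders one set of data both as a pie graph and as a bar graph:

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
# Hypothetical monthly electricity costs in dollars (made up for illustration)
costs = [93, 88, 82, 73, 64, 57, 54, 55, 62, 71, 84, 92]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Pie graph: each month's share of the annual bill, which helps with budgeting
ax1.pie(costs, labels=months, autopct="%1.0f%%")
ax1.set_title("Share of annual cost")

# Bar graph: month-to-month changes, which helps spot seasonal swings
ax2.bar(months, costs)
ax2.set_title("Cost by month ($)")
ax2.tick_params(axis="x", rotation=45)

plt.tight_layout()
plt.show()
```

The point of the exercise is the same either way: the data do not change, but each representation answers a different question, and choosing between them is itself a small problem solving task.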

Another idea in spreadsheets might be to compare monthly electric usage with monthly costs to derive a mathematical expression that best approximates how the two values are related. In this instance, the learner needs to know how formulas are entered into a spreadsheet cell and how data needs to be arranged in the spreadsheet. From there the learner experiments with different formula ideas and algorithms to find a "best fit" formula that represents the data. The spreadsheet allows for quick testing and comparison of each formula, and very advanced users can even have the spreadsheet help determine the best fit among the different formula attempts.
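Here is a minimal sketch of that "best fit" exercise in Python rather than a spreadsheet; the usage and cost figures are invented for illustration, and the straight-line model is just one of the formula ideas a learner might test:

```python
import numpy as np

# Hypothetical monthly data (invented for illustration):
# kilowatt-hours used each month and the dollar cost of that month's bill
usage_kwh = np.array([620, 580, 540, 470, 400, 350, 330, 340, 390, 460, 550, 610])
cost_usd = np.array([93, 88, 82, 73, 64, 57, 54, 55, 62, 71, 84, 92])

# Least-squares fit of a straight line, cost = a * usage + b, which is the
# same kind of "best fit" a spreadsheet trendline computes behind the scenes
a, b = np.polyfit(usage_kwh, cost_usd, 1)
print(f"Best-fit formula: cost = {a:.3f} * kWh + {b:.2f}")

# Check the fit by comparing the formula's predictions with the actual bills
predicted = a * usage_kwh + b
print(f"Largest error across the year: ${np.max(np.abs(predicted - cost_usd)):.2f}")
```

A learner could then try a degree-2 fit (np.polyfit(..., 2)) or a simple per-kWh rate model and compare the errors, which mirrors the formula experimentation described above.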

Beyond the spreadsheet thoughts, I am brought to the idea of using other tools. Suppose the learner is given two or three graphic representations of the same data and is charged with determining how the graphs relate to each other and what they might represent. No labels are given on each graph, but the learner is told that each graph is a different representation of the same data. At first, I would let students brainstorm what online tools or resources might be used. If ideas were not coming from those discussions, I might offer a few suggestions like, "Are there ways you can connect with other people that might know more about graphs?" "Are there ways you can find examples that would help give you perspectives on the graphs you were given?" "Does the Internet contain ways you can look up information just with images?" These prompts don't specifically point to a tool or solution, but may help students determine that social media might be a great way to get opinions from others, or maybe just a simple email to me asking "Ed, what do the graphs mean?" might be a strategy that gets some information. Doing good searches for bar graphs, circle or pie graphs, and other graphing examples may help the learner discover online examples that offer ideas, or they may even discover the exact graphs they were given because the teacher just pulled them from a site. Possibly learners discover the power of reverse image searching on the Internet. There are so many ways learners may discover technologies that help solve this problem.

Those are three ideas that come to mind reading your comment tonight. I am sure there are others. What ideas do others have on how technology might help with representing data and/or building to abstract concepts? 

1. I first became interested in supporting digital literacy skills of ABE learners through work with Steve Reder, David Rosen, and Jill Castek in the early days of the Learner Web. At the time I was teaching at the Hubbs Center/St. Paul ABE; we joined the Learner Web early use group to craft online learning opportunities for students who wanted to intensify learning. At the time St. Paul was still trying to make use of the National Work Readiness Credential and we wanted an online resource learners could use to prepare. I had never designed online content before and soon recognized how difficult it was to create quality online content for low-literacy adults. At the same time, I was involved in working with the St. Paul Community Literacy Consortium to come up with a list of Digital Literacy Standards. You can read about the community engagement process here, if you like.  The standards became the foundation of the LW Digital Literacy learning plans funded by the IMLS project that Steve Reder mentioned in his introduction. This work, I think, naturally evolved as we, like you perhaps, began to understand the importance of teaching computer skills in a contextualized manner - not as isolated skills.  Because we had always worked closely with the library system in St. Paul, we were aware of the importance of supporting proficiency with "information literacy".  When PIAAC released the results in 2013, it just gave us a new way to talk about it and showed an imperative to be more explicit about it.

2.  I think that PIAAC does not explicitly distinguish digital literacy skills from problem solving skills in technology rich environments (PSTRE). I just did a search for "digital literacy" in the PS-TRE Conceptual Framework document published by the OECD (the work defining PS-TRE) and it didn't come up, though Gilster's (1997, p. 8) seminal work on digital literacy is referenced: "Evaluation and critical judgment of information are core aspects of literate Internet use (Gilster, 1997)". The conceptual framework does suggest that proficiency with basic computer skills is necessary for accomplishing problem solving tasks. Perhaps PIAAC avoided the term digital literacy because, IMHO, it is somewhat murky, interpreted differently depending on context (and over time!). Kathy Harris provided a nice distinction between PS-TRE and digital literacy in her LINCS article "Integrating Digital Literacy into English Language Instruction". Harris shared the digital literacy definition put forth by the US Dept of Education: "...the skills associated with using technology to enable users to find, evaluate, organize, create, and communicate information". I think Harris situates PS-TRE at the most complex end of digital literacy, recognizing that, in order to be truly digitally literate, ABE learners must have basic computer skills, time to practice in contextualized settings, and the ability to apply those skills when solving problems.

I agree with this interpretation. PIAAC suggests that PS-TRE involves three distinct components: a task, technologies, and cognition (the mental process employed when solving a problem). If you look at the discrete skills listed in the cognition component of the PS-TRE framework, you can see some similarity with the US DoE definition of digital literacy:  

  • Goal setting and progress monitoring
  • Planning, self-organizing
  • Acquiring and evaluating information
  • Making use of information

While both share an assumption that simple competency with computers does not make one digitally literate, and both focus on finding and making use of information, PS-TRE goes further by being even more explicit about the cognitive processes involved when applying those computer skills. According to PIAAC/PS-TRE, in our highly technological world more and more everyday tasks demand the use of technology. Part of the problem to be solved is deciding what technologies are needed, knowing how and when to use them, and, importantly, having an awareness of progress toward accomplishing the task.

1.  How did I first become interested in digital literacy?

I first started thinking about these issues before the era of widespread use of computers, smartphones, etc.   I was doing research in rural Africa, native villages in Alaska and in immigrant and migrant communities in the U.S., looking at the interplay between literacy and technologies of communication available in different settings.  This was at a time when the concept of literacy practices was first starting to develop and the work of Marshall McLuhan and colleagues (“the medium is the message…”) was also prominent.  Researchers were looking at how speech and writing differed, the distinctive features of the content and interactions in electronic media such as TV and telephone, and so forth.  My later thinking about digital literacy initially evolved within these earlier frameworks.

2. How does PIAAC distinguish problem solving (in technology-rich environments) from digital literacy? 

I agree with Jen Vanek's earlier posting that PIAAC really doesn't distinguish them directly. It's useful here to look at the history of PS-TRE within the broader skills framework developed by the OECD. That framework has guided the design of both PIAAC and PISA, the international skills assessment of 15-year-olds. PISA, which has conducted periodic assessments since well before the first PIAAC assessment, included assessments of literacy, numeracy and problem-solving (but not specifically in technology-rich environments as in PIAAC). In my view, PIAAC essentially added the technology-rich specification to a pre-existing problem-solving framework that had informed earlier PISA assessments, which is one reason why there seems to be some fusion in PIAAC between problem-solving and navigation of technology-rich environments. In future cycles of PIAAC, we may see a similar fusion of technology into literacy and numeracy tasks.

Another potentially important consideration in thinking about differences between digital literacy and PS-TRE is that PIAAC only referenced use of laptop/desktop computers both in assessing PS-TRE and in the background questionnaire that asked about adults’ uses of digital technology in work and non-work settings.  Use of tablets, mobile phones and other hand-held devices was not included (though likely will be in future cycles).

Greetings, panelists and all. I am wondering about the value of distinguishing between digital literacy skills and problem solving skills in technology rich environments (PSTRE). It would seem that the first leads to the ability to do the other. In my view, regardless of the difference, adult learners entering adult education in our programs need to solve problems in any environment. They often lack the confidence and experience to do that. Using digital literacy to help them experience problem solving  might offer two benefits among others: (1) provide engaging and varied ways to solve problems independently and (2) offer practice in using digital technologies. Both will serve them well for a lifetime.

Adding to David's questions, I would like to know how you approached helping adult learners with little experience in solving problems to even start solving them in technology-rich environments. What kinds of bait worked, and how quickly did learners gain confidence to go at it on their own?

Thanks, Leecy! I agree with your view on problem solving, that digital literacy and problem solving go hand in hand. Having walked many adult learners through digital problem solving activities, I have seen an engaging quality to these scenarios that keeps them relevant and interesting. Some of the adult learners I worked with shared that the activities feel like puzzles or brain teasers. I think the key is knowing when to let the learner struggle and ultimately find a solution independently, and when to step in to provide support to avoid frustration. That balance, of course, shifts dynamically as learners become more comfortable and confident with their digital skills and problem solving strategies -- and sometimes this can occur in the course of a single session.

I'm always an advocate for contextualized learning, and I think combining work on digital skills and problem solving feels authentic and provides an engaging context for learning. What do others think?

I agree that problem solving is important in any context, including those where there is no technology present. The research behind PIAAC's perspective on problem solving suggests that a problem is encountered when one cannot complete a task using a habitual action or comfortable routine. The increased presence of technology in our society means that more and more tasks require problem solving rather than routine.

Leecy and others, I agree that no matter the environment, we need to increase learners' problem solving capacity. So many of our learners come to us lacking faith in themselves to be in control of their education or of areas of their life. There is a huge difference in engagement between my students who have lowered their defenses and begun to trust a bit and some of the newer students who may come in carrying many negative experiences and a lack of trust in others. Success is a wonderful builder of trust! When I give students an activity that is relevant to their immediate needs and they find success in that activity, trust builds that I am there for them and not for any other purpose. It is not always easy to hit success on the first attempts, but even in unsuccessful ventures trust builds when I help the learner assess why things did not work out and we come up with another option. This sends the message that I am not going to give up and introduces the idea that "failures" are simply ways we learn more about ourselves and about which options we might find more success with.

I have found that face-to-face tabletop gameplay with integrated social elements can be very effective in building some of the problem solving elements that are key to any environment. Not every learner will immediately latch onto the idea of "play" being part of an academic process. For those who do latch on, I have observed much improvement in problem solving skills.

Thanks for the great questions, David!  

1. Tell us (more) about how you got interested in digital literacy skills, and in problem solving in technology rich environments.

I come at examining these constructs having developed, with colleagues at the University of Connecticut in the ORCA project, several scenario-based assessments that determine proficiency with the new literacies of online reading and research. This work has yielded a very strong professional development suite of materials for understanding the underlying constructs of online reading and research. The PD modules include both "Show Me" and "Let Me Try" components to help educators connect with a variety of examples of different skills and strategies involved in digital literacy, online reading, and digital problem solving. Although digital literacies and digital problem solving were not terms used in this work, I have come to see these constructs as highly related and overlapping in some ways. Participating in this ORCA work with adolescents and their teachers paved the way for me to look at the digital problem solving skills assessed in PIAAC.

2. How does PIAAC distinguish digital literacy skills from problem solving skills in technology rich environments (PSTRE)? How do you distinguish these if you think about this differently?

I think digital literacy skills traverse multiple contexts, including (but not limited to) gathering online information of all types (e.g., health information, information related to topics of interest, or to address daily needs such as bus schedules), communicating with others, and using online information in transformative ways. However, the PSTRE digital problem solving elements are situated in only three domains: personal, workplace, and civic. These three contexts are important but don't necessarily address the full range of contexts available. For example, creating online information is a common online activity for many of us (sharing photos, creating content to share with others, expressing creativity through audio, video, or other non-linguistic forms); however, it's not represented as part of PSTRE skills. I think these constructs are evolving, and we have a responsibility to ensure they ultimately come to represent a fuller, broader range of online activities.

1. Tell us (more) about how you got interested in digital literacy skills, and in problem solving in technology rich environments.

In 1979, while working at SCALE (the Somerville Center for Adult Learning Experiences), I volunteered to write a proposal for a computer skills/office skills training program. After we received funding and hired our computer specialist, he told me we needed to hire a second teacher since he could not do everything himself. I hired myself. The program went well, and I was hooked. I used Wang BASIC, a programming language, to write a payroll program that we used for a few years until the city of Somerville implemented its own system. I got WordStar, one of the first word processing programs, and VisiCalc, the first spreadsheet, and in addition to teaching them to students, used the programs to write proposals and budgets, as well as my first book, Number Power 6, Word Problems. I taught these skills to our full-time staff and we became a proposal writing factory that tripled in size during a period when other ABE programs were cutting. One of our new funders insisted that we develop internships for all of our students as part of the training process. We soon started getting back reports that our students were rapidly being recognized as the word processing experts of their offices, even if that office used a different word processor than what we had used in our training. I believe that this happened because I never taught rote memorization, but instead tried to help our students develop understanding and fluency so that they could use any word processor as a tool to accomplish what they needed to do.

Fast forwarding over a decade, I was the Technology Coordinator for Northeast SABES, and offered a workshop in creating a website. This was in the early days of the Internet, and most websites were created using HTML. At the time, no ABE centers in Massachusetts had their own websites, so I had three programs each send a team of two or three staff to our lab at Northern Essex Community College in Lawrence. I taught them basic HTML in the context of creating a draft website for their agency. We met three times, and every program ended up with a website of a few basic screens. It was a more trusting time, but it was still unusual that I was able to install and set up a web server to host all the agency websites, and to help them with updates for a few years until they were all gradually replaced by more high-powered websites created and maintained by website professionals.

Fast forward again, and I am now a computer skills instructor at the Careers in Banking and Finance program at the Asian American Civic Association. I was using their skills-based curriculum, which was very similar in approach to what I had done years before, when my focus had at least partially still been skills based. However, it became clear to me that most of the students already knew the skills we were teaching and that they were just going through the motions. At the end of that cycle, I started creating a project-based approach to replace the outdated skills-based approach. For example, I had them do a report on the bank of their choice. I then asked them to use that info to create a PowerPoint presentation of at least five slides. Beyond asking them to use PowerPoint, I gave no more instruction. Some students had some background in PowerPoint, so they took the lead. It was OK for students to work either in a group or alone, or to ask other students or me for help. I had not used PowerPoint in years, so I ended up learning along with the students. When some of the students had completed their drafts, they sent them to me and I displayed them on our SmartBoard and we critiqued them. The suggestions helped inspire other students to revise their work. I think of this as a Project Runway approach, where the instructor lays out the challenge and makes a few suggestions regarding approach, tools, and materials, and the group collectively critiques the results.

The Institute of Museum and Library Services (IMLS) funded a two-year study in which our research team at Portland State and public librarians at Multnomah County Library in Portland could collaborate on PIAAC research designed to improve library practices, programs, and services for adult library users, especially economically vulnerable and socially isolated adults, seniors, English learners, and others lacking basic digital literacy skills. Data were collected using Education and Skills Online (ESO), a PIAAC innovation that enables researchers to assess PIAAC skills in specific populations and settings and to compare these scores with other measures of performance in given contexts. In this study, we assessed the PSTRE skills of adult library users in relation to their performance on problem-solving tasks encountered in the technology-rich environment of libraries. This research will ultimately yield a data-grounded learning progression that describes all levels of digital problem solving using observable strategies. This taxonomy can be used to design effective learning sequences aligned to learners' needs.

Approximately two hundred library users completed the PSTRE and a background questionnaire that included demographics, Internet use metrics, and perspectives about the library's digital resources. To further interpret the PSTRE scores, we administered 17 verbal protocols, asking individuals and pairs of adults to think aloud as they worked through the PSTRE items, sharing their decision making, voicing questions and confusions, and documenting their processes. These verbal protocols were audio recorded and screen-captured. In addition, these participants completed five digital problem solving tasks using the library's website. The library tasks were designed using the PSTRE framework, which considers the intrinsic complexity of the problem and the task directions. The five library tasks provided a crosswalk between skills assessed on the PSTRE and those that library users would use in the technology-rich environment of the library.

Inferential statistics were run to examine the scoring patterns of the different demographic groups who participated. Coding of the verbal protocols is being mapped onto the cognitive dimensions of goal setting and monitoring progress; planning; accessing and evaluating information; and selecting, analyzing and transforming information. In addition to these areas, a suite of additional problem solving skills emerged through inductive analysis.

Preliminary analyses have revealed a set of observable strategies that address questions such as: what does the planning phase of digital problem solving look like? How do individuals self-monitor? Verbal protocol observations provide a grounded view of what digital problem solving processes actually look like within the PSTRE items and other problem solving tasks, across proficiency levels from low- to highly-skilled. We conducted these analyses to help interpret the scores gathered from the PSTRE and to deepen the field's understanding of the PSTRE framework, digital literacy, and problem solving skills for libraries and their users.

I offer this information as background so that you can better understand some of the emergent findings I will share in my next post and how these insights were gathered. Because this research is still in progress, the insights I will share are preliminary, as data analysis is still underway.

In our sample of 200 library users...

  • Individuals who access the Internet primarily at home had higher PSTRE scores than those who access the Internet primarily from other locations (school, work, library, other).
  • Individuals who access the Internet primarily from the library had lower PSTRE scores than those who access the Internet primarily from home or work.
  • Individuals who access the Internet primarily from their cell phone had lower scores than those whose access is centered at the library, home, or work.

This pattern of initial results suggests that at-home Internet connections allow users frequent and flexible use of online resources and digital tools, which may be a factor in higher PSTRE scores. Those who rely primarily on public-use computers have fewer opportunities to engage with digital resources, social networks, and digital tools. Less time online and fewer opportunities to use the Internet flexibly may be a factor in PSTRE scoring patterns. With regard to cell phone access, these initial results suggest that screen real estate may play a role in getting the full picture of the online environment and the resources available. Limited screen real estate may limit navigation choices and make digital problems more difficult to solve.

While these are still preliminary results, do they align with your experiences? How might these patterns be useful in your work?

Jill (& co-authors),

This work is going to be very useful! My reaction is based on the difficulty I faced when writing the PS-TRE research-to-practice brief. As described previously, PS-TRE is a process defined by three components: a task, the technologies required to complete the task, and the cognitive steps required to apply the technologies to solve the problem and accomplish the task. The most challenging part for me when writing the brief was taking the cognitive process and trying to break it into teachable steps. As you can see if you scroll up to the list in my post (or in Jill's post), the steps are fairly broadly framed and represent cognitive processes rather than easily observable actions. What this research has done (particularly the taxonomy and the metacognition exposed during the think-aloud activities) is make more explicit the actions required to accomplish the parts of the cognitive dimension of PS-TRE.

The observable strategies listed in Jill's post (addressing questions such as "what does the planning phase of digital problem solving look like?" and "How do individuals self-monitor?") will provide useful guidance for defining skills and creating instructional activities.

Based on our observations, problem solving in technology rich environments appears to be a mix of...

  • The willingness and ability to be systematic and thorough in reading the instructions and understanding the task

  • Being able to plan for problem solving, which includes

    • surveying the territory to know what to attend to and what to let go

    • understanding the resources available before starting

  • Being flexible enough to try a different strategy when the direction you’ve headed in didn’t yield the results you expected

  • Working in a trial-and-error manner to discover a way to accomplish the goal

  • Having the ability to remember how the interface worked in a previous task and applying that when moving from one task to another (developing cognitive flexibility)

  • Having the practice of double-checking one's work

  • Being able to self-monitor, go back, and rework when new information is gathered or when a mistake has been made

  • Having an intuitive or experience-based sense of how a site/platform is organized

  • Having the willingness and confidence to experiment

  • Having the willingness and confidence to let go of what one already knows if it doesn’t work in the new environment

  • Being able to judge when a task is complete, and thus when to move on (confidence)

Many of these observations represent cognitive or meta-cognitive skills rather than technology or digital literacy skills. Are there other factors we should consider (e.g., environment, genre/context, engagement/authenticity, etc.)? There are many ways to explore these ideas, and your insights are highly valuable to our process. Thanks for sharing your reflections and thoughts in this community forum.

Jill has put together a useful list of early and preliminary observations from verbal protocol analysis. In particular, she points out that "Many of these observations represent cognitive or meta-cognitive skills rather than technology or digital literacy skills." I think that we are looking too narrowly when we only look at technology as computers and other electronic devices. I play the guitar (not very well), and think that positioning my hands for a G chord can be considered a literacy skill, where the objective is to be able to produce the desired sound almost automatically without giving it a conscious thought. On the other hand, putting in an extra pause followed by playing the next chord of my favorite song, "Get a Job," loudly always gets the audience reaction I am looking for. That I would consider to be a problem solving task.

While at the AACA, I taught two cycles of the TechGoesHome program, which seeks to give Boston residents an introduction to computers and a new Google Chromebook for just $50. There were some basic skills that I wanted them to have, such as using a mouse and keyboard, creating and using an email account, and creating and sending an email attachment. The most reliable project that I came up with I called Going Home. Just about everybody wanted to send a picture of their home village, city, or country. I had one student who did not want to do the project, but I convinced her to give it a try. She created a PowerPoint with just two slides. The first slide showed a beautiful garden with the caption "This is the Iraq my grandfather told me about." The second slide had a burning building in the background while in the foreground there was a bleeding dead body. The caption said, "This is the Iraq that I know."

Hmmmm. I'm wondering if those results might also point to differences in economic and educational achievement. In my region, the poor and often under-prepared residents do not have Internet access at home. They would have to access the Internet through bullets 2 and 3. I would expect those who can afford connectivity at home would be much better prepared to learn from it. Just wondering. Leecy

Leecy, I was having similar thoughts about economic situations and how those factored in. It seems obvious that lower economic status is correlated with lower access to technology, but I am not convinced that lower economic status negatively affects problem solving skills.

I have two points of reference for this doubt. In my own personal experience as a teenager, I had no phone, no electricity and certainly no Internet at home all through high school, and yet the nature of having to "make do" in a moderately impoverished environment forced me to constantly problem solve for very simple life things. For example, when hauling drinking water from half a mile away from our house, it was vital to troubleshoot ways to easily haul water back home without spilling, using the recycled containers we had available. Spilling water meant more hauling, so I developed many tricks that I would not otherwise have explored if I could just turn a tap.

My second point of reference that brings doubt to a negative correlation between economic status and problem solving is the Hole in the Wall experiment. If you have not heard of this, or you don't have time to watch the video, the idea is that a computer station with Internet access was given to a population of non-English speakers. In very short periods of time, much of the community not only taught each other technology skills, they learned English fairly well. I hear much focus on a lack of computer access being a deterrent to learning, but I think it is important to consider what learning is being assessed, and how it is assessed, in those discussions. Clearly, the community in the Hole in the Wall experiment was able to learn a great deal of technology, and those people did not even have English fluency when they started, which implies there may not be simple, easy correlations between access to technology and some kinds of learning.

I have shown Sugata Mitra's TED Talk on the Hole in the Wall Project to a number of groups, and am glad to see Ed mention the project. What I took away from this TED Talk is the importance of the decision making power being totally in the hands of the children. It suggests that power to use the tool might be more important than economic status. 

In reading the sample results in the bulleted items, one thought kept coming to mind: is there a correlation between PSTRE scores and how restrictive a user's computer environment is? At home, there are probably very few restrictions on computer use. In a library, computers come with loads of restrictions. Cell phones bring not only the limitations of screen size and the frequent lack of good web design for phones, but also significant interface differences between phones and computers that might negatively influence PSTRE scores.

These thoughts bring me back to all the formal digital technology training I have experienced. In situations where instructors restricted me to "stick to the script at my pace and my direction," I don't remember any significant learning. I simply remember the frustration of the restrictions. Alternatively, I can point to those experiences that had almost all doors open to me as I learned. The teacher was always there to offer new tools or options on an "as needed" basis as I worked through projects I chose to engage in.

In some computer labs, I experienced almost stifling control measures: sites were blocked, permission was needed to use any application, and computer access was removed if a user was caught diverging from the assigned task at hand. In contrast, in wide-open labs with almost no restrictions, more exploration and "what if" thinking were present. This leaves me wondering if the level of restriction or control over environments or learners contributes to a deficiency in problem solving skills. I suspect that with higher restriction or control, learners learn compliance well. My thoughts here seem to resonate with employers' observations that employees today are good if given instructions, but suffer when having to make decisions for themselves.

I love the use of "think aloud" in the data collection. Learners often are uncomfortable expressing their thought process, and I find I need to personally model my thinking through solutions with students for a good week or so before they start getting more comfortable with the idea. Jill, did you encounter any awkwardness from participants in their sharing of thoughts as they tried to process? Were prompts needed at times to keep the thoughts verbally flowing?

I use think alouds constantly when teaching community members how to play table top games. During my turn I explain all the thoughts I have about the moves others have made and what I perceive as my options this turn. I extend this verbal thinking to what each option might lead to and how I conclude that a specific move is in my best interest in the short or long term. Then I make my game move. After a few games I start asking players to share their thoughts during their turns, and all of the players start to open up. Many start to do this verbal thinking on their own after a while. Everyone shares how helpful it is to hear the thoughts, perspectives, and strategies each person offers. It would be interesting to hear from Jill whether she noticed any correlations between PSTRE scores and how easily participants engaged in verbal thinking. Of course, it may have been difficult to record participants' ease of engaging in verbal thinking.

Ed,

Great comments relating restrictive access to and use of technology to problem solving. "In some computer labs, I experienced almost stifling control measures in place. Sites were blocked, permissions needed to use any application and computer access was removed if a user was caught diverging from the assigned task at hand."  In PS-TRE lingo, these restrictions raise the complexity of the task by creating impasses in the problem solving process. We might think of a linear process for problem solving (set a goal, make a plan, find/use information), but when faced with an impasse, a problem solver needs to start planning a different approach. This common occurrence in real life makes problem solving a non-linear process.

Jen, those restrictions certainly do add complexity to a task! I am curious: does the PS-TRE exam include artificial restrictions like blocked sites? If so, it would be very interesting to see how responses to artificial complexity elements compare to problems that are otherwise more complex. When humans respond to challenges posed by other humans, the reaction is often different from how they respond to challenges posed by "things." For example, if your mother tells you that you can't take the car out to go see your friends, that invokes a different reaction than going out to the car, turning the key, and hearing only a "click" because the car is not working for some reason. Both are challenges, but the human response to each can vary quite a bit. I can imagine some raised voices in frustration in both situations, but in the human situation those raised voices may be enough to "fix" the challenge, while cars, in my experience, have never been fixed by yelling at them.


I yell at my cars and my kids - it doesn't work in either case!  But seriously....

Unfortunately, the Conceptual Framework document does not provide any more information on what an impasse might look like. Your question and example sparked what I've heard many consider a validity issue with the PSTRE assessment: that the items cannot possibly replicate situations that occur in real life. That being said, I think PSTRE (maybe not the assessment itself) is useful for at least providing another means by which to describe a problem solving process.

One of the best ways I've found to capture "authentic" dialogue that represents true problem solving is to ask two or more individuals to work together and discuss their decision making as they go.  We conducted our verbal protocols this way, even though there was sometimes a large skill gap between individuals.  It was useful to see the problem solving process and collaboration used in tandem.  The interactions and explanations occurred as a natural back-and-forth communication exchange.  Ed, I love the idea of modeling think alouds.  I look forward to trying this strategy!

Colleagues,

We are already about halfway through our discussion, and it feels like we have just begun. I am fascinated with the findings that Jill Castek has reported, and I'm looking forward to hearing Jen Vanek's observations today about how the PIAAC PS-TRE findings can be applied in the classroom. I am also looking forward to tomorrow, and Steve Reder's PIAAC PS-TRE research and recommendations. However, I would like to see more questions, and examples, from the Technology and Learning, Program Management, and other members in the LINCS Community, for example in reply to questions 2 and 3 below.

I have more questions for our panelists.

1. I think we need some background on the PIAAC Survey of Adult Skills.

  • Although many people in our field know about the PIAAC Survey of Adult Skills, for example that there are three domains: Literacy, Numeracy, and PS-TRE, they don’t necessarily know why the creators of PIAAC decided to include this third domain, and what they were trying to measure. Steve, can you shed some light on that?
  • Steve, Jill and/or Jen, can you summarize the key PS-TRE findings for us? What is it that teachers of adult basic skills should know about them?

2. Kenny and Ed, you are experienced teachers of technology and digital literacy skills. How and why has your teaching grown to include PS-TRE skills, or have you included them all along? Some people believe that students should learn digital literacy skills first, and then learn problem solving using these basic and perhaps more advanced technology skills. What are your thoughts? I hope other members of the panel and the LINCS community will also tell us what they think, and why.

3. Kenny, Ed and LINCS community members: tell us about your experience teaching digital literacy and/or PSTRE skills. Share anecdotes, aha moments, and questions you have.

David J. Rosen, Moderator

Technology and Learning, and Program Management CoPs

djrosen123@gmail.com


Technology, and by extension digital literacy, is changing at increasing rates. To assume learners will always be current with the latest and greatest options is not practical. In contrast, a focus on problem solving and the flexibility of thinking inherent in those skills allows learners to work through changes and different environments. As I mentioned in an earlier post, I grew up in a household that was constantly a problem solving laboratory. I think about all of the situations I have been in where "experts" and highly trained people sat there scratching their heads while a solution or suggestion just popped into my head, as if it were a habit my brain had adopted. Problem solving and flexible thinking can be taught and trained, but it is not easy to do in many environments.

How many students come into class environments asking a teacher, "What do you want me to do today?" Education has been such a passive experience for many, something done to them. The teacher is in charge of direction, most choices, goal setting, reflecting on progress, and adapting pace, while the student simply shows up and complies with requests. This may be a bit of a simplification, but in my interviews with dozens of adult ed learners each semester, I continually hear comments that echo how out of control learners have felt in their educational experiences.

One possible solution has found much success in the Fort Kent (ME) Adult Education program. Their adult education staff spent considerable hours developing what they call curriculum guides. For every credit class, e.g., English 1, the curriculum guide lists three components. First, learners see a list of the projects or artifacts that must be completed. This might include: write four essays, each of a different type from a given list; do two article reviews; provide a transcript of an interview you conduct with someone in a career field of your interest; and so on. The second element on the curriculum guide is a list of standards (from the CCRS) that the learner must demonstrate. The language has been simplified for learners, but teachers have each standard code for reference. Finally, a learner sees a suggested layout of the semester broken into three parts (approximately five weeks each). In each of these three chunks, the learner finds a suggestion of which projects or artifacts might be completed by the end of the five-week period. With this guide in place, the teacher's role becomes one of navigator and adviser. Learner and teacher work together to create the combinations of standard and product within the context of five-week blocks. As learners get used to this format, it becomes easy to work with each individual to develop the combinations of standards and activities that best fit the individual's experiences, goals, and existing skills. Some learners take a long time to break the dependency of having the teacher make all the choices for them, which is understandable because most of their educational experiences have been dictated to them for so long. Most students adapt very well within a semester.

Learners in this environment continually comment in semester reviews that they feel in control of their learning and that learning experiences seem relevant to their needs. After a semester or two in this environment, learners become much better able to advocate for themselves, and their flexibility of thinking is very evident. "Well, I just tried to find information to do what we were talking about, but I ran into these challenges... The way I see it, I could try this or this or that from here, and out of those three choices I feel I might find more success with this choice because.... What do you think?" This kind of exchange is a stark contrast to my new learners coming in with, "What do you want me to do today?"

Interestingly, in the Fort Kent adult program, there are no digital literacy classes as part of the HiSET or high school credit programs. Even the most technophobic student learns the technology as part of the mentorship environment. The teacher or other students are always available to help a learner get started with a technology or to provide "just in time" support when the learner struggles with a digital function. Technology is introduced as the student needs it. As the teacher sees comfort and competency with one digital system, it is an easy matter to offer suggestions in the next activity that will extend the student's experience into another digital experience using different tools. Throughout the experience, the ability to find and filter reliable information is an almost daily exercise, as there are frequently few textbooks involved in most learners' work.

Please don't think this is a quick process. I have seen some learners take three or four semesters within this student-centric model before they started demonstrating real autonomy, flexibility, and problem solving skills. With most learners, half a semester is an average amount of time before the gears kick in and a teacher sees a student transform from passive to active learner. Typically after the first semester, teachers start to see more self-advocacy, and the learner begins asking about a wider range of options within the work being done. For those concerned about retention rates, the data from the last decade in Fort Kent has been very positive. Learners report that they show up because everything they do is about them or related to their lives or situations. There is much more motivation because the learner knows that when they complete the listed activities for the class, the class is complete and they are ready for the next opportunity. There is still a semester structure in place in the program, but that is usually there simply for when new learners go through the admissions and intake processes. Learners in the system may complete one to four classes a year, all within the same learning lab schedule.

Having participated in this model as a teacher for six years, I can share that I felt more success facilitating the growth of individuals in those years than at any other time in my 20+ years in education. Learners came out of those experiences with much improvement in thinking, decision making, reflection, resilience, and ability to adapt.

I would be curious to hear of other models or experiences people have had that have helped learners build their problem solving experience and proficiency. Perhaps others have found more success by focusing on digital literacy skills first and then moving to problem solving? I would be interested to learn how that worked in your experience.


David and panelists,

David, thanks for organizing this discussion, and many thanks to our panelists for all of your valuable information. I was wondering if someone could address the online PIAAC survey that is available to the public: how widespread is its use, who is using it, how is it being used, and are people finding it helpful, and how? I know, lots of questions... Sorry about that! I have been considering using this survey in my research, but I am waiting to see how and if it is being used in the field.

Daphne Greenberg

Georgia State University

Hi,

Here's a quick overview of US performance. I put the references in footnotes in case anyone wants to follow up.

The PIAAC surveys were given to nearly 166,000 adults aged 16-65 in 24 countries and measured literacy, numeracy, and PS-TRE. PS-TRE measured technology use and the higher-level cognitive skills required by the prevalence of non-routine tasks now common in daily life. These were represented in PS-TRE as test items requiring proficiency with 1) accessing information through information communication technologies (ICTs) and/or 2) solving problems that exist because of the presence of ICT itself.

US participants performed poorly on the PS-TRE; the average of US participants’ scores was lower than the overall average across all participating countries (OECD, 2013, p. 11).

Of the 5,000 adults who completed the PIAAC survey in the US in 2013 (the original sample), many could not even take the PS-TRE, which was only available on the computer-based version of the assessment. Those participants who took the paper-based test did so because they had no computer experience (5%), did not have basic computer skills (4%), or opted out of the computer-based assessment for other unknown reasons (6%).

Of the adults who did complete the PS-TRE, the US had the highest percentage of participants (15.8%) scoring below Level 1, the minimum proficiency level required to succeed with simple problem solving tasks encountered in daily life (OECD, 2013, p. 21). Other troubling stats:

  • 70% of adults aged 35-64 had low PS-TRE skills[1]
  • 58% of Millennials (young adults born after 1980, between the ages of 16 and 34) tested at the low-skill level despite spending 35 hours per week using digital media[2]
  • Scores for Millennials in the US were among the lowest reported among all participating countries[3]
  • Of the 13% who took the paper version of the assessment, 30% reported being out of the work force and 41% reported educational attainment below a high school level,[3] suggesting a correlation between proficiency with the skills required to complete the computerized version of the assessment and employability.

I’m sure the results are not surprising to practitioners who see learners struggle daily with tasks like communicating with their children’s teachers, using public transportation, or finding information that is only available online. Such struggles are likely evident in the classroom, too.

Teachers also know that adult learners need PS-TRE skills for work, and there is an interesting study that nicely illustrates this: the skills articulated in PS-TRE mirror those most rewarded by employers (Shatkin, 2012). The study explored the statistical correlation between thirty-five articulated skills and the occupations included in the O*NET database to determine which skills best support employability. The top five were, in this order:

  1. Judgment and Decision Making
  2. Complex Problem Solving
  3. Active Learning
  4. Reading Comprehension
  5. Critical Thinking

Jill, please fill in where I missed something.

Jen

[1] Change the Equation. (2015). Does Not Compute: The High Cost of Low Technology Skills in the U.S. and What We Can Do About It.

[2] Goodman, Sands, & Coley. (2015). America's Skills Challenge: Millennials and the Future. ETS.

[3] OECD. (2013). Time for the U.S. to Reskill?

Jen, thank you for sharing these stats. I found myself reading with many start-and-stop moments. I would read a stat, feel a bit shocked, and then start reflecting on my personal experiences with learners, seeing elements that might contribute to the stat. Then I would go on to the next with a similar shock-then-connection experience.

I would like to start with the last tidbit you shared, about the top five skills. I have been playing with Habits of Mind and thinking about how we might promote each of these habits and track progress. I made up some Habits of Mind cards, printed out 10 copies, and cut them into sorted stacks. My goal was to identify each habit in my students, hand the appropriate card to the learner, and ask them to record the action I had just observed, with the date. I was hopeful that at the end of a month I would have used up my stacks of cards and learners would have a nice collection of cards documenting their positive growth or patterns. I was partially successful. I did not come close to using up my cards, but that was mostly due to my unfamiliarity with many of the habits. Some cards were used up quite well, and I felt comfortable identifying examples of those habits in my students. I questioned whether my learners were not demonstrating some habits or whether I just did not know how to catch the learners in the act.

Learners became much more aware of the habits and started asking, "What does this one look like? Can you give me examples?" These were exciting discussions, and they ended with much exploration and trial by some students. The experience got me wondering: what would the effect on adult success in employment and life be if the Habits of Mind were our assessment tools rather than content standards? What do you think the results of such a switch would be?

  • 58% of Millennials (young adults born after 1980 and between the ages16–34) tested at the low-skill level despite spending 35 hours per week using digital media[2]
  • Scores for Millennials in the US were among the lowest reported among all participating countries[3]

This one did not shock me. If we think about how much of life is controlled in our Millennials' lives, it is astounding how disabled we have made this population. This generation was not allowed to "go out and play" because they would be trespassing on someone else's land, loitering with friends where businesses did not want them, unsupervised because a parent was nowhere to be seen or heard from, or probably vandalizing another's land if they chose to build a fort or pursue any other fun activity; there is a long line of violations that would be imposed on parents "irresponsible" enough to allow such behavior today. Heck, even playing with peers is arranged by parents, as are sporting events. Gone are the pick-up games that never had scores. It comes as no surprise to me that such a generation should lack problem solving skills or be willing to explore options outside what they know they won't get in trouble for. This generation focuses on compliance and doing only what is safe, which is what many of our social systems have set the stage for. There are certainly exceptions, but most Millennials struggle with independence and problem solving because they have not been allowed to be in situations that build these skills.

These were a couple of thoughts that came to mind reading those stats. What thoughts came to others?


Yes, we keep coming back to the problem-solving aspect of PS-TRE in this discussion. It's been a useful reminder to me not to focus so much on the tech side. Since my work has been focused on digital literacy for so long, my lens for thinking about PS-TRE has been how it relates to digital literacy. I've been thinking about it as a more focused means by which to provide opportunities for application of computer skills, but I appreciate what I see as an emphasis on problem solving over computer skill.

I'm also noticing an interesting tension between teaching a process for problem solving, as suggested in the chart I posted yesterday, and the less structured approach to learning that you suggest Millennials missed out on. Do we need the structure (and worked examples) to teach the process, or would allowing an approach of trial and error be sufficient?

In looking at mathematics, there is much study and discussion suggesting that we offer too much information in our current math problems. Take geometry and volume, for instance. A typical problem gives many of the dimensions; the learner looks up the formula, plugs in values, computes, and offers "the answer," and everything is right in the world of math. The problem is that the learner had to do very little problem solving or actual thinking beyond deciding which formula to use and following the appropriate computation. In contrast, when given a sketch of the same figure with no dimensions, the learner has tons of problem solving opportunities. If the problem is about gallons of maple syrup that have to fit in the container, the sketch is obviously drawn to some scale. Possibly we could derive the scale by using proportions, but what measure should I use: metric or standard? Even after picking a scale and doing some measurements and proportions, I have to check the reasonableness of my work when I go back to the context of packing syrup into this container.
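To make that reasoning concrete, here is a minimal worked sketch of the syrup-container problem. Every number in it is an assumption for illustration only: the rectangular shape, the measurements taken off the drawing, and the chosen scale are all invented, not part of any actual problem set.

    # A sketch of the scale-and-volume reasoning described above.
    # All inputs are hypothetical: a rectangular container, measurements
    # taken off the drawing in inches, and a scale the learner chose.

    GALLONS_PER_CUBIC_FOOT = 7.48052  # US liquid gallons per cubic foot

    def real_length_ft(sketch_inches, feet_per_inch):
        """Convert a length measured on the sketch into a real-world length."""
        return sketch_inches * feet_per_inch

    # Suppose we measured the drawing at 3 in x 2 in x 1.5 in
    # and settled on a scale of 1 inch = 2 feet.
    dims_ft = [real_length_ft(x, 2.0) for x in (3.0, 2.0, 1.5)]  # [6.0, 4.0, 3.0]
    volume_ft3 = dims_ft[0] * dims_ft[1] * dims_ft[2]            # 72 cubic feet
    volume_gal = volume_ft3 * GALLONS_PER_CUBIC_FOOT             # about 539 gallons

    print(f"Estimated capacity: {volume_gal:.0f} gallons of syrup")

The reasonableness check is the last step: 539 gallons is an enormous amount of syrup, so a learner might go back and question the chosen scale, which is exactly the kind of monitoring of progress that problem solving requires.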

With that math example in mind, I am strongly of the opinion that Millennials need opportunities to discover more and to have less procedural compliance. The tricky part is that they have become so compliance-based that they can get lost, frustrated, and lack persistence on even simple tasks with a few unknown areas. If the instructions do not clearly point to a set procedure exactly as they have experienced in lecture, Millennials suffer as a group. I have been exploring options to help bridge this gap with limited success. I have found one method that has made some progress, but even that method runs into the constant barrier of lack of persistence. I volunteer to run an informal education experience every Saturday at my local library, where I use complex, highly socially interactive table top games to try to develop many learning objectives. My Millennials struggle even in this "fun" environment in that they find it difficult to focus on any given game. I have used multiple genres of games and many play styles, but there is still that period of time (usually 15 minutes) at which disengagement starts to happen. Ironically, my Millennials even share, "This is fun and all, but how much longer until the end of the game?" This response blows me away every time. I am working on persistence strategies by introducing stopping points at which all players stop and quickly report in writing what is going on, what choices they recently made, and how they think those choices are turning out. I have not tried this yet, but I also see the need to experiment with a short break, perhaps one that lasts as long as a TV commercial, to see if that helps at all with persistence.

I do see value in having the structure and examples available and I think those should always be available as a reference. I just don't think they should be the primary resource or focus as learners become so dependent on the spoon feeding and lack of individual problem solving inherent in these supports. 

I think that rather than hoping that learners develop a sense for how to engage in problem solving through informal exposure during classroom activities, teachers should consider some explicit instruction of the problem solving process. So, it might not be effective to simply show learners how to use a technology and then give them scenarios where they need to use it; rather, teachers could provide some amount of direct instruction of the process by which they plan, select, and employ relevant technologies for a specific task.  This could be done with “worked examples[1]”, scaffolded tasks that deliberately require learners to build proficiency with each individual stage of the problem solving process.

The steps for making this happen could look like this:

1) Teaching Learners why PS-TRE is important

Some class time early on should be designated for explaining to students the significance of problem solving and its connection to the use of Information Communication Technologies (ICT). Make it interactive by asking students to participate, sharing past experiences of not being able to accomplish a task because they didn’t know how to use an ICT or which ICTs were required. A teacher could also lead the class in a discussion about technology tools and their uses, documenting responses in a graphic organizer (e.g., mind map, chart, or table) to get learners to start thinking about how different tools are called for in different situations.

2) Determining Needs

In a PS-TRE unit, teachers would need to determine the technology skills and learning needs of the students in the class, using the information to better understand which technologies need to be explicitly taught before learners could draw on them to solve a problem. I think that learner success with PSTRE is supported by achieving a balance between technology challenges and the level of difficulty of the problems being solved. In a typical mixed-level classroom there are usually learners with very low computer skills. With these learners, teachers might decide to use simple problem solving scenarios as context for practicing new computer skills. I think it makes sense for learners to work toward simultaneously strengthening digital literacy skills and gaining opportunities to practice the problem-solving process.

For example, say you pose a problem that requires learners to organize responses to an email about job applications. In a “worked example” you could actually send students emails that contain different responses, like “You’re hired!” “Send more information.” “No, thanks; we can’t hire you.”  Students could be instructed to organize the responses. For students who have mastered email, the teacher could work a second ICT into the task, like Excel. For students new to email, a teacher could ask them to organize the email messages into folders.

3) Spell out the process.

I think that some students will benefit from having some structure or steps to follow as they try to solve less controlled tasks. For steps, we might look to the cognitive dimension of PS-TRE. In the brief that should be released early next year, I provide suggestions for instruction and practice of each step in the process. I’d suggest gradually introducing the steps as learners gain familiarity with each one.

In the interest of space, I’ll close this post by sharing an illustration of how the email task above might be mapped onto the cognitive dimensions, or steps. You might consider this a way to structure a think aloud. The example is here, in this Problem Solving Chart, in case you want to copy it more easily.

Problem Solving Chart - Note that when constructing a classroom activity for PS-TRE, you should include all three components: task, technologies, and the cognitive dimension (the steps).

Task: Keep track of responses to job applications.

Technologies: Email and folder sorting; Microsoft Word or Excel for making a list.

There is so much more I could say here, but I think this is a good place to start.


Step: Set a goal
What’s Involved: “Problem finding.” What do I want to happen so that I can complete the task? What is the end result?
Student Notes: Determine which of the employers I contacted might hire me.

Step: Plan and organize
What’s Involved: “Problem-shaping.” Create a plan for solving the problem. What subgoals, strategies, technology resources, or sort of information is critical for accomplishing the goal?
Student Notes: First, I need to read the email I have received and then I need to organize the information according to how each employer responded.

Step: Subgoal
What’s Involved: What is the first action?
Student Notes: Reading email from employers, creating email folders, and filing email in the correct folder: “Will Hire” or “Won’t Hire.”

Step: Monitor progress
What’s Involved: Pay attention to your progress. Did you make a mistake in your planning and need to reassess the tasks and technology resources?
Student Notes: I correctly made two folders! Now I can sort the email.

Step: Acquire & evaluate information
What’s Involved: While locating and after finding information, consider: Is this what I need to know? Can I trust the sources? Do I understand what it says?
Student Notes: Re-read each email as I file it to be sure it is going into the correct folder.

Step: Monitor progress
What’s Involved: Pay attention to your progress. Did you find the right information? Do you need more?
Student Notes: While reading email I see a message from an employer that asked for more information. I see that I need an additional folder called “Maybe Hire.”

Step: Use the information you found
What’s Involved: Consider the task required to make the information useful: Does it need to be organized, combined with information from another source, put into a different format? Consider how it will be best presented or shared.
Student Notes: Now that I have the mail sorted, I want to make a list.

Step: Subgoal
What’s Involved: What is the next action?
Student Notes: I will use Microsoft Word to make a list of employers who “will,” “might,” or “won’t” hire me.

Step: Monitor progress
What’s Involved: Pay attention to your progress. Did you make a mistake in your planning and need to reassess the tasks and technology resources?
Student Notes: I realize that I have no idea how many employers I’ll have to contact before I get a job, and I want to keep all of their information in one place. I know how to use Excel, so I’ll make a spreadsheet instead of a Word doc so that I can keep their contact information and other information about them in one place and be able to sort and organize it easily.

Step: Acquire & evaluate information
What’s Involved: While locating and after finding information, consider: Is this what I need to know? Can I trust the sources? Do I understand what it says?
Student Notes: Carefully read each email to gather all the important contact info included and add it to the spreadsheet.

Step: Monitor progress
What’s Involved: Consider whether or not you solved the problem. If not, go back to the beginning and set a new goal or add a subgoal.
Student Notes: In a few cases, I am missing the employer’s phone number. Must remember to search for it later if I need to call them.

Step: Use the information you found
What’s Involved: Consider the task required to make the information useful: Does it need to be organized, combined with information from another source, put into a different format? Consider how it will be best presented or shared.
Student Notes: I sort the spreadsheet to make a list of all of the employers who said they will or might hire me. I get a short list that also includes their contact information!

Jen


[1] Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86.


Hi,

The examples you provide, Jen, are a great way to support and nurture the skills of digital problem solving. I really like how your chart spells out the key processes involved.  I can see this resource being a really helpful guide to educators and learners alike.  Bravo, Jen!

Hi all – I’ll post a summary and thoughts about my digital literacy/PS-TRE paper a little later today.  For those of you who’d like to glance at it, it’s available here:  Digital Inclusion and Digital Literacy in the United States: A Portrait from PIAAC.

Meanwhile, let me comment on one of the questions David posed for today: Although many people in our field know about the PIAAC Survey of Adult Skills, for example that there are three domains: Literacy, Numeracy, and PS-TRE, they don’t necessarily know why the creators of PIAAC decided to include this third domain, and what they were trying to measure. Steve, can you shed some light on that?

The problem-solving domain was first added by OECD (the agency running PIAAC) to its PISA survey, an international survey of 15 year-olds’ cognitive skills.  An important point is that the initial problem-solving assessments in PISA did not specifically reference technology-rich environments.  That focus came with the introduction of PS-TRE in PIAAC, which may leapfrog back into the next cycle of PISA.  In the long run, there may well be further elaborations of the domain such as assessing skills in *collaborative* problem solving or the merger of problem-solving into other domains like literacy and numeracy.

The development of technology has made it much more possible to holistically integrate a project-based approach with instruction in some skills. For example, with almost every group I work with, I make a pitch for them to learn and master typing so that their attention focuses on the problem to be addressed, not on the location of a key they need. There are a number of pitfalls with a skills-first approach. One is the point that has been made so well by the participants in this discussion, but especially Ed: such an approach decontextualizes learning, and often leads to student passivity and compliance. Also, there is a tendency to deemphasize core skills while putting unwarranted attention on rarely used skills.

I like to use a carpentry class as an example. You would expect that an instructor would say something like "Today, we will design and build a table." rather than "Today's class is about hammers. Next week's class will be about screwdrivers." or "In this class we only use Craftsman tools. If you have Home Depot tools, we won't be able to use them."

I also think of all the people who took classes in Word when a new version came out, when actually, if you are comfortable with Word 97, you could probably figure out how to use the newer version to complete your project at hand.

When I teach word processing, I ask who has a home computer that does not have the Microsoft Office suite and would like a comparable tool for free (my favorite price). I introduce them to the Google suite, including Google Docs (web-based), and to Apache OpenOffice (installed on the computer). In class, students can use whichever of the tools they choose.

I also ask my students if they ever go to Macy's or another large department store and find that the item they were looking for has been moved. In that situation, do they look for a Macy's training program, or do they just figure it out?

When I present a group of students with a project to do, I tell them what the required elements will be. I also sometimes suggest additional options they could add, and I am always open to their suggestions.

One of my favorite projects I called "Let's Go Shopping." The students went on a virtual shopping spree on the Internet. They selected the stores and the items they wanted and calculated their savings by designing and using a spreadsheet. Last year, we did this project the week before Black Friday. I told them that I needed them to carefully document all their purchases because we were going to return to the "Let's Go Shopping" spreadsheet the week of Christmas Eve and the week after New Year's, adding the current prices of all those items "purchased" on Black Friday. Their research question was: "When was the best time to buy: Black Friday, Christmas Eve, or after New Year's?"
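For anyone who wants to sketch the underlying calculation before (or alongside) building the spreadsheet, here is a minimal example of the comparison logic. The items and prices are invented for illustration, and in class the same logic would live in spreadsheet columns rather than code.

    # A sketch of the "Let's Go Shopping" price comparison.
    # Item names and prices are hypothetical; each tuple holds the list
    # price followed by the price observed on each shopping date.

    items = {
        "Tablet":       (199.99, 149.99, 179.99, 159.99),
        "Headphones":   ( 59.99,  39.99,  49.99,  29.99),
        "Coffee maker": ( 89.99,  69.99,  89.99,  59.99),
    }
    dates = ("Black Friday", "Christmas Eve", "After New Year's")

    # Per item: which date gave the biggest savings off the list price?
    for name, (list_price, *sale_prices) in items.items():
        savings = [list_price - p for p in sale_prices]
        best = dates[savings.index(max(savings))]
        print(f"{name}: best bought on {best} (saved ${max(savings):.2f})")

    # Totals per date answer the research question for the whole cart.
    totals = {d: sum(v[i + 1] for v in items.values()) for i, d in enumerate(dates)}
    cheapest = min(totals, key=totals.get)
    print(f"Cheapest date for the whole cart: {cheapest} (${totals[cheapest]:.2f})")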

Thanks, David, for the invitation to summarize some research I’ve done with the PIAAC data from the U.S.  A full description is available in my 2015 paper:  Digital Inclusion and Digital Literacy in the United States.  Here are a few comments about what the paper looks at.

As we think about the PSTRE data in relation to issues of digital inclusion (i.e., individuals’ access to and use of digital technologies), it’s important to consider non-PSTRE information PIAAC gathered about adults’ access to and uses of computers (the only digital devices considered in the initial cycle of PIAAC).  PIAAC’s background questionnaire included information about individuals’ prior use of computers and the frequency with which they use computers to perform various tasks in the workplace as well as outside of the workplace.  Individuals who had previously used computers were asked if they preferred to take the skills assessments with paper and pencil or with a laptop computer.  Those who chose computers were asked to perform a small number of tasks demonstrating use of the mouse and keyboard, and those who passed this computer screening or readiness test were administered the literacy, numeracy and PSTRE assessments by laptop.  All others were given paper versions of the literacy and numeracy (but not PSTRE) tasks.

Based on these data, I categorized individuals as being in one of four stages of what I called a digital inclusion pathway, depending on the immediate barrier they faced in becoming digitally literate:

  Access stage – had never used a computer;

  Taste stage – had used a computer but chose to use paper and pencil rather than the computer;

  Readiness stage – chose to use the computer but lacked basic mouse and keyboarding skills;

  Digital literacy stage – had basic mouse and keyboarding skills and varying levels of PS-TRE skills.

Breaking up the digital inclusion pathway in this way can help us better understand and meet the needs of diverse adults having disparate kinds of issues working with technology.  Those in the digital literacy stage are at different stages of development as reflected by their PSTRE scores and indexes of ICT use at work and outside of work settings.   The paper uses this framework to focus on two major digital literacy issues along points of this inclusion pathway: digital equity and digital embedding.
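One way to read the four stages is as a simple decision rule over the screening information described above. The sketch below uses invented function and flag names to make that rule explicit; it is an illustration only, not code from the paper, which derives the stages from PIAAC background-questionnaire and ICT-screening responses.

    # A sketch of the decision rule implied by the four pathway stages.
    # Function and flag names are invented for illustration.

    def digital_inclusion_stage(ever_used_computer: bool,
                                chose_computer_version: bool,
                                passed_ict_screening: bool) -> str:
        """Return the stage implied by a respondent's immediate barrier."""
        if not ever_used_computer:
            return "Access stage"        # barrier: no computer experience
        if not chose_computer_version:
            return "Taste stage"         # barrier: opted for paper and pencil
        if not passed_ict_screening:
            return "Readiness stage"     # barrier: basic mouse/keyboard skills
        return "Digital literacy stage"  # PS-TRE score locates them further

    # Example: a respondent who has used a computer but opted for paper.
    print(digital_inclusion_stage(True, False, False))  # -> "Taste stage"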

Digital equity is defined in terms of differences in groups’ distribution in these pathway stages after accounting for differences due to other variables such as age, education and employment status. Digital equity is examined with regard to gender, race/ethnicity, and national origin. The analyses show distinct patterns of digital inequities across the inclusion pathway associated with each of these characteristics.

The digital embedding of an economic or social outcome is defined as a specific pattern of association between that outcome and an index of ICT use at work or outside of work after statistically controlling for differences due to demographic characteristics, educational attainment and assessed PSTRE proficiency. So it is the underlying relationship between patterns of ICT use in everyday life and a given social or economic outcome that reflects the “digital embedding” or not of that outcome.

The paper examines digital embedding of six economic and social outcomes in the PIAAC data: individuals’ earnings, employment, social trust, volunteerism, political efficacy and general health. Digital embedding of earnings is found among workers but no digital embedding of current employment status is found among the general adult population. Digital embedding is found for each social outcome in the general adult population: social trust, volunteerism, political efficacy and general health.  These results suggest that digital literacy development should have goals of building broad usage of ICT technologies in both workplace and non-workplace settings, and will likely be associated with a wide range of important social and economic outcomes.  It's important to note that these results occur after any effects of PSTRE scores are already taken into account, so that these outcomes are embedded in the everyday *uses* of ICT technologies.
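For readers who think in code, the "statistically controlling" step amounts to a regression in which the ICT-use index retains explanatory power after the control variables enter the model. The sketch below is only a simplified illustration of that idea, with invented variable and file names, a plain OLS model, and none of the survey weights or plausible-value machinery a real PIAAC analysis would require.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analysis file: one row per respondent, with an outcome
    # (social_trust), an index of ICT use outside work (ict_use_home), a
    # PS-TRE score (pstre), and demographic controls. All names invented.
    df = pd.read_csv("piaac_analysis_file.csv")

    # "Digital embedding" of the outcome would show up as a meaningful
    # ict_use_home coefficient even with PS-TRE proficiency, age, education,
    # employment, and gender held constant.
    model = smf.ols(
        "social_trust ~ ict_use_home + pstre + age + educ_years + employed + female",
        data=df,
    ).fit()

    print(model.summary())  # inspect the ict_use_home coefficient and p-value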

Thanks for the article synopsis.  The finding about digital embedding being evident in the social outcomes is heartening for teachers who have a sense that digital literacy skill development should be supported by classroom work across a range of contexts. The absence of digital embedding for current employment status was surprising to me. I wonder if you could share your thoughts about this finding? 

Jen