Day three of our discussion on “How can technology transform adult education and current practice?” focusing on Assessment

Good morning, and welcome to day three of our discussion on “How can technology transform adult education and current practice?” The section of the report we are concentrating on today is Assessment (discussion will be cross-posted to the Formative Assessment group).

Art Graesser and David Rosen will continue to share their reflections on the draft report Connected Teaching and Personalized Learning: Implications of the National Education Technology Plan (NETP) for Adult Education, produced through a contract with the American Institutes for Research (AIR).

We also look forward to hearing from you, whether in response to the Assessment section of the report, to Art's and David's posts, or to other group members' posts.

 

Related Documents and Resources:

Connected Teaching and Personalized Learning: Implications of the National Education Technology Plan (NETP) for Adult Education

National Education Technology Plan 2010: Transforming American Education: Learning Powered by Technology.

Comments

Thanks, Nell.

 

As we begin to discuss the Assessment section of the NETP Report -- and continue our discussion of the Learning section -- I have several questions I would like to ask of teachers and other participants here. I hope we have lots of responses, especially from those who have joined us today in the LINCS Assessment CoP, but also from others who may be interested.

 

1.     The Assessment section of the report begins: “Many people think of assessment as a test. Although testing can play an important role, assessment is a purposeful process that involves collecting information, analyzing that information, and making decisions on the basis of that analysis. In education, a test is one source of information that is analyzed to inform decisions about student learning and achievement. Other sources include prior learning opportunities; work samples such as papers, presentations, or products; and observations of students as they perform a task, such as rebuilding an engine or conducting a water-quality test.”

 

This introduction focuses on what is frequently called formative (rather than summative) assessment.  How do you use technology for formative assessment in your teaching? That is, how does technology help you and your students know in what areas they are progressing and in what areas they may need further help or a different learning strategy or method?

 

2.     The NETP establishes five goals for using digital technologies to enhance educational assessment. At a high level, these goals aim to improve the quality of information provided by assessment instruments and to enhance educators’ ability to use assessment information to improve teaching and learning.

  • The first goal focuses on providing timely and actionable information about student learning that is designed to improve achievement and instructional practices.
  • The second goal aims to use technology to improve the quality of assessment materials for both formative and summative purposes.
  • The third and fourth goals address opportunities to improve assessment by using technology to improve student engagement and motivation during assessment, particularly by embedding assessment into learning activities and making assessments more accessible by applying the principles of Universal Design for Learning (UDL).
  • The final goal recognizes the importance of protecting privacy while also capitalizing on the volume and variety of data that can be collected, analyzed, and shared through digital learning and assessment environments.

 

Do you think all five goals are important for adult education assessment? Are some more important than others? Which ones?

 

3.     The report includes short-term strategies in these areas:

a.     New kinds of test items that technology makes possible, ones that may better measure the range of learning we expect of adults as well as children, and that can be used for formative as well as summative assessment

b.     Authentic, performance-based assessments created with technology

c.      Cognitively diagnostic (formative) assessments that could help analyze cognitive misunderstandings

d.     Accessibility and UDL

 

Which ones are important or intriguing to you and why?

 

4.     The report includes long-term strategies in these areas:

a.      Peer matching, where, in online or distance learning with hundreds or thousands of students, technology can make matches based on ability or interests to improve retention and increase support

b.     Learning opportunity tracking analysis where “creating digital records of learning opportunities for individual learners has great potential to inform future learning opportunities. For example, the growing interest in portable learner profiles means that adult learners can manage information about their learning and achievements while personalizing their records beyond what each school might track.”

 

Do you think these strategies would be useful to you as a teacher, or to the adult education field?

 

5.     In the implications section, concern is raised about the need for policy and practice to assure students' privacy and security.

 

Is this an issue you are concerned about? What policies would you like to see?

 

David J. Rosen

djrosen@newsomeassociates.com

Colleagues,

 

Using technology in adult education opens up exciting new possibilities for assessment, both formative assessments -- informal learning-progress checks that show students and teachers how the learner is doing -- and summative assessments, which show whether or not the learner has mastered the content or skills. Thanks to Larry Ferlazzo, a never-listless California adult education colleague (Larry produces a lot of great adult education resource lists!), I have learned about Beyond the Bubble, a new kind of contextual history assessment from Stanford University: http://beyondthebubble.stanford.edu/. You will find a quick and amusing animated introduction to it at https://plus.google.com/110822531939544743586/posts/FAsUdq5V9fv?cfem=1

 

Yesterday I also learned about a free iPad (and, I believe, soon-to-be Android) application called TouchCast that enables you to create videos with embedded “video apps” that allow you to draw on a video, add background pictures, and embed maps, images, Twitter feeds, and other touch-screen tools. Imagine formative assessments built by teachers using a tool like this!

 

These are just a few of the kinds of tools that are now available for assessment.

 

What tools are you eager to use, or have you already used, to make your own learning assessments? What tools would you recommend?

 

David J. Rosen

djrosen@newsomeassociates.com

I could imagine, at a basic level ... I remember in my American History course grabbing pictures from scenes and having students identify the era from which they came (Great Depression, Roaring Twenties, etc.) and how they knew.

I'm thinking that being able to draw on a video or image would let a student do more formative things, like notice things that they might not yet understand the relevance of... ask questions... and share and collaborate.  

Susan,

I like that you see TouchCast as a tool students could use to make things: presentations, project reports, assessments, reflections. I wonder if you, or others, see TouchCast as a useful tool for students' e-portfolios. I also wonder if manipulating the V-Apps in TouchCast would help students get comfortable with the skills they need to manipulate various windows for the GED 2014 Math test. Do you, or others, think so? I hope that you and others who make something with TouchCast will post the URL here so we can all take a look, learn to use this tool together, and be inspired by each other's work.


David J. Rosen

djrosen@newsomeassociates.com

I don't have an iPad; the overwhelming majority of my students don't have iPads.   (I'm disinclined to click on things that say "free iPad!" too :P )

I do, however, have GIMP and the Adobe Creative Suite (which I'm trying to wean myself off of, toward more open-source things, having had several bad experiences dealing with Adobe), and I'm taking Java in the fall and our app development class in the spring (from a teacher who's given me lots of encouragement and support).

On my ride in this morning, I was considering (a) how in the world I'm going to find time and focus once students come back Monday, and (b) some of the concepts that really, really could benefit from some visual/conceptual frameworks.   One that jumped out was "rounding."  Too many of my students simply don't, honestly, understand the basic concept of rounding ... or they understand it at some level, but can't apply it in "math classes" because it doesn't translate into "round to the nearest 100" or "round to the nearest tenth."  So, we teach them this complex maneuver of, oh, underline the place named in the problem and circle the next one and ask whether it's five or more ... and copy ... and ... the problem has lost **all** meaning by then.   These folks walk away from class still thinking 0.022 is bigger than 0.1, but ... they got those rounding questions right ... at least for this test ...

    Couldn't we have a way for students to see the numbers, at least as visual as a number line (or something even more concrete, like a big ol' graduated bucket), that could zoom in and out so they could *see* the idea of "which one is it closest to?"  And ... with something touchable ... trace and feel that yeah, one is longer?   Not so much because they *can't* understand the abstraction of the numbers, but because, oh, sometime about five or ten years ago they gave up trying to make those connections, so they need it proven?
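    Just to make that "closest to" idea concrete -- this isn't any particular app, just a rough Python sketch with made-up numbers -- both the comparison question and the rounding question can be framed as "which candidate is closer on the number line?":

```python
def nearest(value, candidates):
    """Which candidate is the value closest to on the number line?"""
    return min(candidates, key=lambda c: abs(value - c))

# 0.022 is only 0.022 away from 0, while 0.1 is 0.1 away -- so 0.022 is smaller.
print(abs(0.022 - 0), "<", abs(0.1 - 0))

# "Round 347 to the nearest hundred" is the same closest-to question.
print(nearest(347, [300, 400]))   # -> 300
print(nearest(0.022, [0, 0.1]))   # -> 0 (to the nearest tenth, 0.022 rounds to 0.0)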


 

Here in Maine we have been working on some professional development for adult educators using interactive video conferencing (ITV) for teaching and learning. It has been a real learning experience for all involved. One of the most valuable things has been simply getting participants comfortable with using the technology. In between our face-to-face (using ITV) sessions, participants were asked to schedule conferences with a partner and practice, practice, practice -- using the remote, the pre-sets, the scheduler, and conferencing with another person in this new way. We have learned the importance of getting teachers comfortable before we can begin talking about how to use the equipment for teaching.

I also wanted to share a neat app called Poll Everywhere, http://www.polleverywhere.com/, which allows you to create a poll (multiple-choice or open-ended questions) so that students/participants can reply in real time with answers, opinions, or questions. Some teachers have used this with ITV to assess students' understanding as they go along, as well as to measure how they themselves are doing with disseminating information.

Hello Megan,

Sounds like you are doing some great professional development in Maine.

In many states, a blended model -- face-to-face at the beginning and end, and sometimes also once or twice in the middle -- has worked well, especially as it enables more substantive PD over time with, as you point out, lots of opportunity for teachers to practice in between PD sessions and to reflect on how using the new methods has worked for them. Sometimes this is a group reflection with other teachers who have also tried out the same methods.

Poll Everywhere (still free if you post one question at a time?) is a great tool. Some teachers use it in their classrooms as a low-cost alternative to "classroom clickers." Instead of asking, "Does everyone understand that? If not, raise your hand," they might say, "Answer this poll question on your cell phone: My understanding of polynomials is 1) fine thanks, let's move on; 2) pretty good; 3) not too good, I need help, please; 4) terrible, I haven't a clue, help please." The data are then displayed on a computer so the teacher can pretty quickly see how the class is doing. Megan, I think there are different ways to display the data, right? If I recall correctly, the aggregate data for a question can be displayed to the whole class by response, but -- if the teacher knows the students' IDs -- s/he can group students who need more help and work with them, or have a student who gets it work with them as a peer tutor.  Some teachers use Poll Everywhere outside class, or in distance learning classes.

Poll Everywhere is a good example of a free or inexpensive tool that -- if students bring their cell phones (even basic feature phones) to class, or if the teacher pairs up those who don't have phones with those who do, or provides a classroom lending library of cell phones for students who don't have them -- gives the teacher a way of formatively assessing what the students think they understand. That may be better than asking them to raise their hands, which for some students is embarrassing.
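For teachers who want to go a step further with the results, here is a rough, generic sketch -- this is not Poll Everywhere's API, just ordinary Python over a made-up list of responses -- of how an exported set of student IDs and answers could be tallied for the class display and quietly grouped so the teacher knows whom to help:

```python
from collections import Counter, defaultdict

# Hypothetical export: (student ID, response) pairs from one poll question.
responses = [
    ("s01", "fine thanks, let's move on"),
    ("s02", "pretty good"),
    ("s03", "not too good, I need help"),
    ("s04", "not too good, I need help"),
    ("s05", "terrible, I haven't a clue"),
]

# Aggregate view to show the whole class.
print(Counter(answer for _, answer in responses))

# Private grouping, if the teacher knows the IDs, of who needs extra help.
groups = defaultdict(list)
for student, answer in responses:
    groups[answer].append(student)
needs_help = groups["not too good, I need help"] + groups["terrible, I haven't a clue"]
print("Work with:", needs_help)
```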

Anyone else use Poll Everywhere? If so, how do you use it?

 

David J. Rosen

djrosen@newsomeassociates.com

 

 I saw "Poll Everywhere" at the South Carolina educational technology conference; we were all asked the question "Where does the solid stuff of a stick come from:   water, air, the sun, or soil?  "   just so we could watch the pretty results come up on the screen.  

    Then we were given a few minutes to talk to each other about our answers, and the poll was taken again.

     Now, besides showing us the nifty technology, the point of this was also to show what Minds of Our Own showed, because, like the Harvard grads they interviewed, the overwhelming majority of us got it wrong.   (It's "air," because it's the C from the CO2 that turns into the C in the wood molecules.)  This emphasized that humans *don't* get rid of basic misconceptions about stuff (like the fact that 'air' really is matter and thus can change from gas to solid) with practice answering questions about it, even when we get all the questions right. Questions on a test are a different thing than the Real World.

    I think Poll Everywhere would be fun to use in a math class to help students see things like the fact that there are many ways to look at a problem ( http://youtu.be/cy3UCaAiXQE  has a "dot card" exercise that does this at a basic level).   It would be neat to have five *different* right answers to choose from :)

Clicker technologies can be motivating when polls are taken and votes are displayed. Early research indicated that they are motivating but don't help learning, because shallow questions are typically asked.  The current emphasis in some circles is to expand the clickers to include short justifications of answers, or more open-ended questions in which learners type in content (a word, a sentence, or 2-3 sentences).  Then representative answers can be presented on the screen by the teacher.  I know of one system that conducts computer analyses on the distribution of answers and then displays a couple of good answers from the students; students get motivated when their answers are displayed in front of the class (protecting anonymity, of course).  Typical answers can be presented when there is no right answer.  And faulty answers can be presented, with the teacher explaining why each is not a good answer.  So there are facilities for promoting deeper learning by expanding the clickers to answers with verbal content.
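I am not describing the internals of that particular system, but a rough sketch of the general idea is easy to imagine. Here is a minimal, hypothetical Python example (using scikit-learn, with invented student answers) that groups short open-ended responses and surfaces one representative answer per group for the teacher to display:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np

# Invented student answers to an open-ended clicker question.
answers = [
    "the solid comes from carbon in the air",
    "co2 from the air is turned into wood",
    "it comes from nutrients in the soil",
    "minerals and water from the soil",
    "mostly from the water the tree drinks",
]

vectors = TfidfVectorizer().fit_transform(answers)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vectors)

# For each cluster, display the answer closest to its center as the "representative".
for label in range(kmeans.n_clusters):
    members = np.where(kmeans.labels_ == label)[0]
    distances = np.linalg.norm(
        vectors[members].toarray() - kmeans.cluster_centers_[label], axis=1)
    representative = answers[members[np.argmin(distances)]]
    print(f"Cluster {label} ({len(members)} answers): {representative}")
```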

The NETP Report was very comprehensive in covering the many different types of formative and summative assessment.  The assessment world has moved beyond traditional multiple-choice and short-answer questions.  In addition to the many assessment methods and measures in the report, there are a few others to point out.  Pearson Education and Educational Testing Service have developed automated essay graders that can grade essays very accurately (as well as or better than human experts in composition).  Intelligent tutoring systems (ITSs) track the mastery of skills, concepts, and misconceptions automatically while the student solves problems.  The emotions of the learner (such as boredom, confusion, frustration, and persistence) can be tracked during learning with modern educational data mining algorithms.  Some assessment methods are embedded in games that attempt to be motivating to the learner.  The hope is that assessment will be fun, just as games are being designed to enhance motivation in the learning of difficult subject matter.
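As one concrete illustration of how an ITS can track mastery while a student works -- not a description of any particular commercial system -- here is a minimal Python sketch of Bayesian Knowledge Tracing, a commonly used model: after each right or wrong answer it updates the estimated probability that the learner has mastered the skill. The parameter values are invented for the example.

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """Update the probability that a skill is mastered after one observed answer."""
    if correct:
        posterior = (p_known * (1 - p_slip)) / (
            p_known * (1 - p_slip) + (1 - p_known) * p_guess)
    else:
        posterior = (p_known * p_slip) / (
            p_known * p_slip + (1 - p_known) * (1 - p_guess))
    # The learner may also pick up the skill from this practice opportunity.
    return posterior + (1 - posterior) * p_learn

p = 0.3                                   # prior: 30% chance the skill is mastered
for answer in [True, False, True, True]:  # a hypothetical sequence of responses
    p = bkt_update(p, answer)
    print(f"correct={answer}  estimated mastery={p:.2f}")
```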

The breadth and accuracy of assessment methods will continue to grow and be integrated increasingly in learning technologies. In my view, the major challenge is to find ways for the adult learners, tutors, and teachers to use the results of these assessments that accrue in their E-Portfolio. So my central question: How will these formative and summative assessment measures be used?

In an ideal world, the adult learners would look at the dozens of measures (or qualitative content) in their E-Portfolios and then use these scores to guide them in selecting the next learning task -- in a self-regulated fashion that considers their personal interests and career goals. We know that this ideal learner is extremely rare. What is needed is a tutor or teacher to help the adult learner try to use the measures in the E-Portfolios.  Unfortunately, a very small percentage of teachers/tutors have been trained to assist adult learners in self-regulated learning.  Professional development programs need to fill this gap.  Perhaps technology can be designed to automatically make recommendations to the adult learner on (a) how to interpret the measures in the E-Portfolio and (b) sensible next learning tasks, with a justification as to why.  The teacher/tutor could work with adult learners on these scaffolding technologies and thereby enhance their own professional development in addition to helping the adult learners.
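A system like that could start very simply. Here is a hypothetical, rule-based Python sketch of the kind of recommendation I have in mind -- the skill names, mastery scores, and task catalog are all invented -- where the suggestion comes with a plain-language justification that the learner and tutor can inspect together:

```python
# Invented e-portfolio mastery estimates (0 to 1) and a made-up task catalog.
portfolio = {"fractions": 0.85, "decimal place value": 0.40, "reading graphs": 0.65}
next_tasks = {
    "fractions": "Apply fractions in a recipe-scaling project",
    "decimal place value": "Practice locating decimals on a zoomable number line",
    "reading graphs": "Interpret the usage chart on a utility bill",
}

# Simple rule: recommend work on the weakest skill, and explain why.
weakest = min(portfolio, key=portfolio.get)
print(f"Suggested next task: {next_tasks[weakest]}")
print(f"Why: your current mastery estimate for '{weakest}' is "
      f"{portfolio[weakest]:.0%}, the lowest in your portfolio.")
```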

I wonder whether some measures in the E-Portfolio should be hidden from the learner.  Does it make sense for the learners to know how they score compared with other learners?  Should they know how creative they are?  Should an adult learner know everything that a computer has stored about them?  These are complex questions that need some attention.

The peer matching approach mentioned in the NETP Report has considerable potential.  Peers learn a lot by teaching other peers, as the research shows.  The peer matching approach also addresses another challenge in helping adult learners: the attrition problem.  Dropout may slow down if students have a community of peers, on various topics, that they communicate with.  These peers in social networks have considerable potential.  The field of computer-supported collaborative learning (CSCL) has developed a research base for exploring this avenue.
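To show how lightweight the matching itself can be, here is a minimal Python sketch -- the learner names, interests, and ability scores are made up -- that pairs learners who share interests but differ in ability, so that one can support the other:

```python
from itertools import combinations

learners = [
    {"name": "Ana", "interests": {"health", "math"},   "ability": 0.8},
    {"name": "Ben", "interests": {"math", "history"},  "ability": 0.4},
    {"name": "Cho", "interests": {"health", "civics"}, "ability": 0.3},
    {"name": "Dee", "interests": {"civics", "math"},   "ability": 0.7},
]

def match_score(a, b):
    shared = len(a["interests"] & b["interests"])  # reward common interests
    gap = abs(a["ability"] - b["ability"])         # reward a tutor/tutee gap
    return shared + gap

pairs = sorted(combinations(learners, 2), key=lambda p: match_score(*p), reverse=True)
for a, b in pairs[:2]:
    print(f"Match {a['name']} with {b['name']} (score {match_score(a, b):.1f})")
```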

Hi everyone,

These are very interesting points and questions regarding assessment.  All of them deserve a thorough vetting.  But I am intrigued by these questions that Art poses:

 

"I wonder whether some measures in the E-Portfolio should be hidden from the learner.  Does it make sense for the learners to know how they score compared with other learners?  Should they know how creative they are?  Should an adult learner know everything that a computer has stored about them?  These are complex questions that need some attention."

 

I have found in my experience as an adult educator that keeping information of any kind from the person who generated that information feels uncomfortable at best and unproductive at worst.   Years ago I thought that I should not give "quizzes" because adult students 1) had failed at that before, so let's not present it to them again; and 2) should not be interested in their "score" -- they need to be focused on their personal progress. But humans do not work like this.  Humans need to know how they are doing vis-a-vis their peers (and non-peers).

What high school student does not (finally) receive their SAT scores and see how they've stacked up against their fellow test takers?  What person does not want to know what information a computer has stored on them?  Only people who do not know that this is how high-stakes tests function, and that computers (and humans too) store vast amounts of information about them.  This is part of what I believe is our mission: to educate our clientele on these very questions that Art poses, and subsequently help them navigate this inevitable and contemporary system.  Why do we treat adult students differently than the rest of the population?  They should be integral to answering the questions posed within this report, so part of what their education demands is an understanding of the results of their efforts, and not just what they need to prepare for some test.  We don't do that.

So I agree whole-heartedly that we need professional development for our teachers and tutors so that they can guide adult students through the process of understanding data.  It is not enough to prepare our workforce on the "front side" of the equation - they also need to be very prepared to work with the results side.  We don't do this either.

Finally, I am surprised at this question: "Should they know how creative they are?"  Art, why this question?  I'm thrown off guard: isn't helping people discover their talents, creativity, and skills a giant part of what we are supposed to be doing with our clientele -- with all the people we interact with?  What would possibly be the downside of helping people shine?  Thanks for your response.

 

 

 

I just want to note that I do know of, and have visited, programs in which the staff IS very engaged in data education and use with students (and of course these programs' staff are highly skilled at interpreting and using their data results).  But this is episodic; it is not cultural to our profession.  I would like to see the episodic become an indispensable part of what happens in classrooms and programs.

Marie.

Thanks so much for your views on whether to hide normative data from the adult learner.  I struggle with the question of whether to help learners do their personal best versus showing how they perform compared with others (normative data comparison).  Some learners (including many of those who play games) are energized by seeing how well they do compared to others -- it's a motivator.  Others give up because their performance is so low.  Dweck, Elliott, and others have good theories about this.  I also resonate with the importance of learners knowing how the real world is.

What about the situation where a learner has a trait attribution model which says, "If I am not good at this, I'll give up, because it's not worth the effort to try harder"?  Early low performance will lead them to give up, but a personal-best model will keep them going.  Perhaps the personal-best model is worthwhile early in the training, with a shift to a normative comparison model later on, after some practice.  We need more data on this for adult learners, who I believe tend to adopt a trait model (but maybe I'm wrong).

One note on the creativity issue.  What about the adult learner who is creative in 2 out of 20 skills/talents?  That learner is not creative overall as a generic trait; the creativity is knowledge/skill specific.  If we have a generic creativity metric, it may be misleading.  If we have 20 creativity metrics, they will become cumbersome and create confusing feedback. Perhaps there is an intermediate grain size (like 4-5 categories).

These are tough questions.  Your views are helpful.

Hi Art, super interesting comments, thank you for this.  It is true that if at first you don't succeed, you don't always try, try again. I suppose I am a case in point, since I am not at all skilled at online gaming (for example) and so I don't do it.  However, neither am I interested in doing it.  So I wonder where the intersection of skill and motivation lies -- it would be different for each person, I guess.

I really like your thought about starting with the personal-best model and then moving to (or adding in) the trait attribution model.  This makes a lot of sense to me.  I really feel that both of these models need to be a part of adult education curricula -- they both serve important and different purposes.   It strikes me that our K-12 system mostly works on the trait attribution model -- is that true?  Or am I making that up?  It feels that way.  What would it mean to build curricula based solely on a personal-best model? Do Dweck or Elliott address this at all?

Creativity: OK, you are not talking in general about stuff people can do really well; you are talking about how to build a model to measure creativity.  This seems like the most challenging piece of all.  Hmmm ... can you talk about why it's important to measure creativity in the first place?  Thanks, Art.

 

 

Marie,

As you suggest, not everyone likes games, and the question of how skill and motivation intersect has not been resolved.

Most games do not tap important skills, and most learning technologies do not have good game elements.  Everyone wants to connect games and learning, but the challenge is difficult and many are trying to figure it out.  What game features will appeal to adult learners?  The game features people speak of are choice, challenges that get conquered, well-paced interaction with feedback, narrative, fantasy, and competition.  Would adult learners like a game with a lot of challenge? Or competition?

There are ways to measure creativity from online interaction during learning.  The measures include the diversity of solutions, the novelty of content, and the usefulness/aptness of what gets created. I still wonder whether an adult learner should be receiving information on their creativity scores.

I agree with your intuition that most adult learners would probably operate on the trait attribution model.  I wonder if there is any evidence on this.