Week 8: Exploring evaluation options

It appears that many were not in the mood to go shopping; it must be this unusual weather we are all experiencing this year :) One of the goals of going shopping was to unearth what people liked and did not like about the many ways current shopping platforms share evaluations with potential buyers. When you think about it, our finished reports and evaluations really do highlight what tools are good for and what challenges exist, in a way that informs the public and helps open doors to learning options. I hope you all don't mind that I drafted the following two options for our discussion this next week.

This week we will start looking at options for evaluating the items we have found (and hopefully continue to find) in our Diigo tagging. I have created two sample forms we might use for evaluation. Both are designed to keep our focus on the teacher and the learner and on how well each tool may meet their needs. Here are more details on each sample form.

Form 1: A comprehensive form that could be fleshed out later to include many other resources and online "things" not currently included in our categories of focus. "Online Tools" is the only choice that is currently fleshed out, and I would probably "hide" the other choices if we stuck with this model so that the other categories are not a distraction.

Form 2: This form may have a bit less to it, but that might be an advantage. It would be a form we would use for this work only; future evaluation groups would want to develop their own.

Please look over both forms and offer your feedback on each. It would be ideal to hear what you think each does well and what challenges, changes, or additions you might like to see. Some very deliberate decisions went into each form, and I expect there may be questions asking for clarification.

This week you are shopping for an evaluation tool. No money required, but it may take some time to look things over in a thoughtful manner. Your time and contributions are greatly appreciated in this effort. If you have not "caught up" with Diigo tagging and all the rest, please don't let that hold you back from diving in and helping with this phase of things! It is never too late to share your thoughts and opinions.

 

Comments

I forgot to mention that you should be going through the forms, but please don't submit data at this time. There will likely be many changes before we start evaluating, and I would hate to have people waste time entering things now.

Form 1 has two parts, so it is OK to hit Continue on any page... just don't hit the final Submit button on either form.

Hi Ed and all, I appreciate your putting together these forms so we have something to bounce ideas off of. I like your categories. Would you mind giving us permission to make comments and suggestions in the form docs? That would make it easier to offer input. I don't know how that comes out on a form, so if it doesn't work well, pasting the questions in a regular Google Doc would do it.

Based on teachers' situations within their programs, I suggest adding a few more criteria that I think they are likely to need:

- No Internet required
- Suitable for the one-computer classroom (projector required)
- Suitable for computer beginners (alternatively: Computer skill level required: beginning, intermediate, advanced) [have definitions]
- Suitable for multilevel classes (in terms of content)
- App available for mobile devices (and/or functions on mobile devices)
- Skill(s) addressed (listening, reading, etc.)
- CCRS anchor(s) addressed

I'll propose that we figure out a way to streamline the evaluation of resources that don't meet a minimum on certain 'dealbreaker' items. For example, if the evaluator rates a resource below a 4 on effectiveness, we don't bother fully evaluating it and put it in a special pile to possibly be reviewed by someone else, if time allows.

Question: Apologies if I've overlooked something, but where will the final product live? I'm wondering how the info we're generating on Diigo and Forms will get there, if 'there' is someplace else. (Thinking workload.)

Diana and all, I played around with the settings a bunch this a.m. and found that I can't really set things up to allow comments and suggestions in the forms themselves. I could share the sheets that are attached to the forms, but that does not allow us to leave comments either. I like the idea of copying/pasting the questions into a doc so we can edit and comment there. I hope to do that in the next day or so if I can find the time.

I like the idea of streamlining the evaluation in some ways, and in other ways I see some concerns. With almost any tool, there are many different implementation ideas. Just because someone rates something low does not mean the tool can't inspire an approach or implementation that others may not have thought of. I guess I am feeling that "one person's trash is another person's treasure" applies to the items out there. Reflecting on the shopping exploration I did last week, I can see how a few really bad early reviews might have prevented thousands of other reviews from ever being written. I often look at the bad reviews to see what the biggest challenge was for that person so I can gauge how important that concern is for what I am thinking of doing with a tool. Does that make sense?

I love the suggestions you offered, and I have thoughts on many of them, but I need to run to meetings this a.m. I hope to get back to those thoughts and suggestions this afternoon or early tomorrow morning.

If others have thoughts on what Diana shared, please chime in so we can work together to prepare a good evaluation tool.

 

Hi Ed and all, Thanks for your responses. I understand and agree about using negative reviews while shopping. (I do that too!) My thinking is that the service of this repository is to provide recommended resources. I think as a user, I'd wonder why a resource with a poor rating was included. My idea was that something would be excluded if a number of us said it didn't meet some minimum requirements. As an example, way back, the LINCS ESL Special Collection group developed evaluation criteria: http://www.literacynet.org/esl/aboutus.html

Diana, I think I may have had a bit of a different focus in terms of what our "end product(s)" are. I was hoping this group could help establish the following three things:

A set of categories aligned to functions adult educators might easily use to find content that is relevant to their needs. This would include us collecting a bunch of items for people to review at some point.

An evaluation tool/system that educators can use consistently to evaluate the materials out there.

A report we publish to the field with evaluated resources. I was not anticipating that we would be culling the good from the bad in this report. 

I worry about one person's trash being another person's treasure. If the first two reviews of an item on Amazon are poor, Amazon doesn't shut things down; in fact, after tens of thousands of later reviews, there may be quite a good number of people who find the item useful. This group will not likely have the time or the person-power to evaluate all the resources that fit in our categories. The hope is that our evaluation system, and the report sharing the results of our evaluations, will set the stage for the rest of the field to continue evaluating and sharing evaluations using the methods we use in this group. It seems quite a different focus, in terms of end product, from the collection you shared in the link.

I hope that clarifies a bit why the poor ratings have been included in the evaluation tool. We would still want to know what is poor even if we end up filtering our results to report only on the positively evaluated items. It may be that the poor materials are simply not included in the final report or collection, but the data collected would still let anyone know the specifics of why a resource was rated poorly.

Ed and others,

I prefer Form 2 for this project. It's tailored to what we joined this group to do.

I hadn't thought about the additional considerations Diana raised, all but one of which I think would be useful to rate and fairly easy for a reviewer to answer. Although it could be useful for a user to know the CCRS alignment, I think that would be difficult for most of us to assess. Perhaps, Ed, you know another group that might be able to handle that question more easily and efficiently?

My understanding is that we will publish our reviews to at least the Technology and Learning CoP, if not beyond, and that these are reviews of tools and resources we recommend. If two reviewers sink a tool or resource, I don't think it is useful to include it in our published resource. If two reviewers differ widely in their responses, one way to resolve this would be to assign the resource to a third reviewer and include the responses of all three in the review. The Science Videos group did that and it worked well. Perhaps that would work for evaluations of web-based tools and resources, too.

David

djrosen123@gmail.com

 

CCRS is, of course, a criterion many in the field would like to see identified in digital collections. I think many in the country are still trying to unpack what all the standards are saying. Given our focus, I think the majority of the tools we are looking at could not be related to the CCRS well. Things like lessons can more easily carry CCRS tags, but how do you tag a communication tool, for instance? If we went with the Form 1 option, which is more comprehensive, we could fit CCRS into some of the major categories where it applies more easily.

The feedback from last week indicates we should stick with Form 2, and I will be editing it promptly to include the additions that Diana offered.

Hi Ed,

What a super job on the forms. I like them both but opt for Form 2, since it seems simple to use, and I think with this kind of work that will be crucial. As much as I liked some aspects of Form 1, such as the categories of the rubric, Form 2, with everything on one page, will take less time to fill out and will be easier to scroll on mobile. I think I would just have one free-text area at the end and add challenges under usage.

Thanks for all your work on this!

Steve

 
Steve, thank you for sharing which form you prefer and why. We are able to modify either form, and if we end up going with Form 2, please feel free to remind me of the changes you'd like in case I forget to look back here :)

How about everyone else? Any preference between Form 1 and Form 2? Changes you would like to see to either? Omissions, deletions, and modification suggestions are always welcome, as is your rationale for each. I hope to hear from others before the end of the weekend so I can polish up whichever form we choose, and then we can start evaluating a few resources to test the form and system out a bit. After that short test run, we can really start cranking out evaluations as we all get time.