Week 10: Time to evaluate

Unfortunately, some people have been a bit overzealous and dived into evaluation prematurely. Last week, we were aiming to review the tool to look for areas of improvement, and there were suggestions that we add some skills assessment to the evaluation tool. I have now made those modifications, and the form has three skills areas that evaluators can use. I tried to put them all in one question, but there was no practical way to fit that much into one question without it becoming cumbersome. 

You can find the updated form here.

For those who have already done an evaluation, you may wish to look over these new skills items and decide whether you want to redo the evaluation(s) you tried this past week. 

If anyone catches a spelling error, please let me know and I can get it fixed ASAP. 

We have a list of items to evaluate, and we can add to it later as we wish, but right now it is time to start up your evaluation engines and let 'er rip! Please feel free to evaluate as many items as you feel inspired to do. If you have difficulty with the tool or with accessing any of the sites to evaluate, please let us know. 

There are some resources on our list that are paid services. Please DO NOT pay for these services simply to evaluate them. If you have a personal interest in buying any of these services, or you have already purchased them in the past, we would all appreciate your concentrating on those paid services when you find time to evaluate. 

Our aim is to have at least two people evaluate every resource we can. At the end of the week, I will aim to have a compiled list of which resources have received how many evaluations. I hope to have that automated soon, if I can just get a few more hours squeezed into this whole "daylight savings" junk.
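For anyone curious about what that automation might look like, here is a minimal sketch, assuming the form responses are exported from the spreadsheet as responses.csv and that the reviewed item sits in a column called "Resource Name" (both the filename and the column name are placeholders, not the actual form wording):

```python
import csv
from collections import Counter

# Count how many evaluations each resource has received so far.
# Assumes the Google Form responses were exported as responses.csv
# and that a "Resource Name" column identifies the reviewed item.
counts = Counter()
with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        resource = row.get("Resource Name", "").strip()
        if resource:
            counts[resource] += 1

# List the resources, most-evaluated first, so we can see which ones
# still need a second reviewer.
for resource, n in counts.most_common():
    print(f"{n:>3}  {resource}")
```

Something like this could also live in a script attached to the response sheet itself, but a quick export-and-count is enough to show which resources still need a second set of eyes.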

 

As always, please share your questions, observations, suggestions and thoughts. Happy Evaluations!

Comments

March can be such a long month with many things going on. I was hoping I could share some data from early attempts at using our evaluation form on resources in our Diigo library. Unfortunately, we don't have any data to share yet. 

If anyone gets a chance to evaluate one or two resources this weekend, we can take a look at the data coming in. If we need to make any adjustments, it is always better to do so while we have only a few data sets to work with. 

To review, here are the links again:

Our resources are stored on Diigo in this group: https://groups.diigo.com/group/lincs-educational-resources

Our evaluation form is at this link: https://docs.google.com/forms/d/1HhEakN1y4R9z41c-MH4oLRGeBEzP9wSJK2Xv0yzs2RQ/viewform

As always, please feel free to post any questions or email me if you need assistance. I hope you all have a wonderful weekend!

 

Good day all! While waiting for some evaluations to start up, I have been building a system behind the scenes. Our data comes into a spreadsheet right now, and I have set up some routines that aggregate the data into a format that displays what the reviewers felt about each tool/resource.
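To give a rough idea of the kind of aggregation I am describing, here is a minimal sketch, again working from a CSV export of the response sheet; the column names used below ("Resource Name", "Ease of Use (Student)", "Skills") are stand-ins for whatever the actual form questions are called:

```python
import csv
from collections import Counter, defaultdict

# Summarize, per resource, what the reviewers reported.
# Column names here are placeholders for the real form questions.
ease_by_resource = defaultdict(Counter)   # tallies answers like "Only a little time"
skills_by_resource = defaultdict(set)     # union of all skills reviewers checked off

with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        resource = row.get("Resource Name", "").strip()
        if not resource:
            continue
        ease = row.get("Ease of Use (Student)", "").strip()
        if ease:
            ease_by_resource[resource][ease] += 1
        # Checkbox answers typically land in the sheet as one
        # comma-separated string, so split them back apart.
        for skill in row.get("Skills", "").split(","):
            skill = skill.strip()
            if skill:
                skills_by_resource[resource].add(skill)

for resource in sorted(set(ease_by_resource) | set(skills_by_resource)):
    print(resource)
    print("  Ease of use tallies:", dict(ease_by_resource[resource]))
    print("  Skills noted:", ", ".join(sorted(skills_by_resource[resource])))
```

The actual behind-the-scenes setup is built around the spreadsheet itself, but this captures the idea: tally the multiple-choice answers and pool the checked-off skills so a reader can see at a glance what reviewers felt about a tool.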

Thus far, three evaluations have been done, but they were completed prematurely, before we had finished our revisions of the evaluation tool. We are all good to go for evaluating resources at this point. Please take a moment this week to try at least one resource from our list. 

Our resources are stored on Diigo in this group: https://groups.diigo.com/group/lincs-educational-resources

Our evaluation form is at this link: https://docs.google.com/forms/d/1HhEakN1y4R9z41c-MH4oLRGeBEzP9wSJK2Xv0yzs2RQ/viewform

By Sunday this weekend, I will report on which resources have been evaluated and how many reviewers have evaluated each one. Everyone is encouraged to really explore the resources/tools before evaluating. Fortunately, after you have taken the time to explore a resource, filling out our evaluation form should be a quick and painless activity. 

Please let me know if you encounter difficulties in this evaluation process. 

Hi Ed and all, 

You are a very kind and patient man :-)

I've evaluated a couple of resources. I like the form a lot! It's easy to fill out and covers useful info. A couple of thoughts:

I evaluated tools (Google Forms and Kahoot) that in and of themselves don't touch on many/any skills but have the potential to be used to touch on a range of skills. I'm not sure how that could be captured, or if it needs to be. For example, Forms could be super for teamwork and presentation skills if the teacher has small groups surveying other classes and then sharing the results with the class. I'm just thinking of how folks might search for a resource, and whether they'd come across wonderful tools like these if they put in 'teamwork' or 'presentation skills'. Should we mark all possible uses?  

About 'ease of use': I found myself wondering how to mark these because I know teachers who don't have rudimentary know-how and would find it difficult, and I know those who are more fluent at tech-ing around and would find it easy. A thought: have the categories be beginner, intermediate, and advanced instead? 

Diana and all, thank you for making the time to try out a few evaluations and to get some feedback in! There is much to mull over in the thoughts and discussions around the potential uses of any of these tools/resources. The skills categories were not even included in the original draft, specifically because there are many tools that can be applied to the development of every skill. We decided to add the tags as an extra way to highlight potential uses of a tool. If you feel something applies to all the skills listed, by all means check off all that apply.

I personally feel that many of the tools on our list in Diigo could have a majority of the skills checked off, but I think users will probably find more value in the Usage and Challenges notes fields. Those two areas really allow the evaluator to make notes like, "This tool can really be used to enhance any lesson or goals a teacher may have by providing ....", which highlights an interpretation of how broad or narrow the scope of a tool is. Some may note under Challenges that the strength of the tool might not be realized unless the teacher is very well versed in all the options, or that a tool may have too many uses, which could be overwhelming for those new to technology. Effectively, the narrative parts will offer the end users more details that might be of value. So if something applies to all skills, check them all off and be sure to highlight in Usage and Challenges your thoughts on how that tool can address so many diverse needs. 

Ease of use is, of course, going to be quite a subjective value. We did consider beginner, intermediate, and advanced, but those are somewhat judgmental in that there is no clear delineation between the terms. Instead, concentrating on the perceived timeframe someone might need to start using a tool with some competency is a less judgmental way of letting teachers know what might be necessary if they wish to try the tool. Try thinking of a body of teachers and what timeframe the majority of the staff would need to get going with a tool. It is just like Olympic skating scoring: cut out the bottom 10% and the top 10%, and go with your average gut feeling on this one.

One of my personal goals for this project was that our tools, categories, and resource lists might someday become available for the general population to adopt and use, much like Amazon.com's system of evaluation. If we had hundreds of people evaluating any given tool, there would be trends we could display that would probably offer a more reliable indication of how "easy" something might be. For the short-term goals of this project, the evaluators are simply our group members in this micro group. Of course, a month or so after we started this project, Amazon went and announced they are building an education sharing service that will include their star system and comments! Sharing questions or thoughts as you did in your post is an excellent practice because it opens the discussion for us all on how we might address some of the ideas you share. 

Google is a huge umbrella of tools, and something like that is probably very difficult for us to capture. The potential power is vast, and there could be an incredibly steep learning curve to really master all that these tools can do. By the same token, if one simply thinks of the Google suite as a cloud-based MS Office alternative (horribly understating its capabilities, of course), then the tools are quite easy to get into and get going with. Because the intent, focus, and expectations of users will vary a great deal, evaluators will want to spend a good amount of their evaluation time thinking about how their narrative input fields can best offer useful information without being too verbose. We will have multiple people evaluating each resource where we can, and if there are differences of opinion, discussions will be encouraged to find some middle ground or to clearly articulate the points of view to the consumers. 

Do those thoughts help at all? Do others have some thoughts to share? Has anyone else found some time to try things out and have some experiences to share with those who have not yet taken things for a spin?

Hi Ed and all, 

Thanks - what you say makes sense. 

Here's some input editing-wise:

  1. In Resource Location: so that others can accurately get to the resource. - Instead: so that others can get to the resource. 
  2. Typo in Basic Skills: Skils
  3. Typo in General Skills: Self-Confience
  4. Typo in Recommendation: Recomendation 
  5. In Recommendation: In general, how likely would you recommend - Instead: In general, how likely you would be to recommend 
  6. Repetition of 'Teamwork' in 'Job Readiness' and 'General'. Other skills in these categories may possibly be so similar that some could be combined.
  7. Are 'Study Skills' (General) and 'Self Directed Learning Skills' (Job Readiness) different or the same?
  8. In 'General': There may be something I'm not discerning: what's the difference between 'How to stay focused' and 'How to stay on task'?
  9. Since Job Readiness is a category, is it worth adding College Readiness?

I'm putting my picky editor hat on for the rest: 

  1. For category choices, choose whether to capitalize only the first word or all words. Ex: Self Discipline or Self discipline. It varies among categories.
  2. For both of the 'Ease of Use' questions: delete 'is needed to learn and use the tool.'

Thank you very much for all your work on this! If Amazon doesn't address adult ed, this could be a much-appreciated resource for the field. 

Thank you, Diana, for the suggestions. I think I have implemented all of those suggestions except for the few I have thoughts or comments on below. Everyone, please share your thoughts if you have any on these edits to the form.

7. Self Directed Learning Skills vs. Study Skills: I see these as two different elements. Study skills imply the ability to review and prepare for a line of study; this usually includes reviewing materials that have already been provided by someone else. Self directed learning skills are centered more on habits of mind, in which one realizes one does not know all that is needed and can ascertain what resources might be most appropriate without directed instruction from another. Does that clarify a bit?

Picky Edit #2: I am reluctant to remove the suggested text, because the results would then read "Only a little time", "A fair amount of time", and "Much time". When we look at how the directions are written (e.g., "How easy is this digital tool or resource to use from a student's perspective?"), it may be confusing to some as to what those time-based responses refer to. Perhaps we should ditch the word "easy" from the prompts altogether and instead use something like, "How much time do you perceive it would take for a student to learn to use this tool or resource?" We could of course change the other prompt to say "teacher" instead of "student". Thoughts?

Anyone else have any thoughts?