Accessories
    • #1269
      Byron
      Participant

      I’ve been trying to determine whether using ActivExpression2 SDRs with ActivInspire can be an effective tool for administering quizzes and tests. Everything I’ve played around with so far leads me to believe that it can, except for one small aspect.

      Once I’ve administered a “test” through ActivInspire, using the SDRs to test students’ retention and, most importantly, having generated my questions with the option for students to go back and check their answers, I export the test results into an Excel spreadsheet. The displayed results are fantastic (I love the green/red distinction between right and wrong answers), except that for every question the student went back and checked, Excel generates an additional row showing the answer the student gave after re-checking.

      As you can see, even when a student went back and changed what was once a correct answer into an incorrect answer (Q4), the Excel spreadsheet still displays that additional line of information that does nothing but confuse someone trying to grade this assessment. There’s simply no quick way to determine just how many questions the student truly got wrong.

      Furthermore, if you look at the second image, you’ll see that even though there are technically only 4 questions, ActivInspire tells Excel there were actually 6 (only because this particular student went back and checked 2 of the questions, hence the “additional” 2). And even though the score is technically correct, a 50%, the Excel spreadsheet is treating that as 3 correct answers out of 6 (again, because it auto-populates duplicate questions with the new answer instead of just overriding the first answer). Excel SHOULD be calculating that 50% from the fact that the student answered 2 of the 4 questions correctly. Now, this is a bad example because both equal 50%, but in a situation (which I have tested) where a student answered 3/4 questions correctly but the computer thinks he really got 4/5 because he went back and checked his answer to just one question, that percentage is going to be off (75% vs. 80%).

      There’s just no way to make this an effective grading tool unless we remove the option for students to go back and check their answers… but that isn’t fair.

      Now, I know these aren’t meant to be true grading tools, something that can replace Scantron or anything, but if anyone knows a way to work around this so Excel displays a true percentage of questions answered correctly… maybe we CAN turn these devices into true grading tools. I’m not sure if this is an ActivInspire thing or an Excel issue, or if there’s even a way to program Excel as a workaround, but all I know is that teachers would love it if they could find a way to speed up and enhance any kind of grading… and I thought this might be able to do it.
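
      If it helps anyone picture what I mean, a workaround outside of Excel might look something like this rough sketch (Python/pandas rather than Excel itself, and with guessed column names, since I don’t know the export’s exact layout):

      import pandas as pd

      # Hypothetical layout: one row per response, with columns "Student",
      # "Question", and a 1/0 "Correct" flag. The real export's headers will
      # differ, so these names are guesses.
      results = pd.read_excel("activinspire_export.xlsx")

      # Keep only each student's final answer to each question, dropping the
      # extra rows created when they went back and changed an answer
      # (assuming the re-checked answer appears on the later row).
      final = results.drop_duplicates(subset=["Student", "Question"], keep="last")

      # Recompute each student's percentage from the true number of questions.
      scores = final.groupby("Student")["Correct"].agg(correct="sum", total="count")
      scores["Percent"] = 100 * scores["correct"] / scores["total"]
      print(scores)

      The idea is just to drop the duplicate rows before computing any percentage, so 2/4 stays 2/4 instead of becoming 3/6.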

      So so close…

    • #6495
      Anonymous
      Inactive

      One of my few complaints about the ActivExpressions is the Excel file that is generated. It doesn’t seem like it would be difficult to produce an editable file that could support the grading system you need. The only solution I’ve found is to turn off the option for students to navigate the questions on their own. Usually, if I’m grading solely from the ActivExpression results, I give them a “mulligan” in case they enter the wrong answer. I strongly agree that the Excel file could be improved to give teachers more control over the grading. Still, I love these tools and would encourage everyone to use them in their classroom!

    • #6496
      Anonymous
      Inactive

      I have access to another reporting tool for ActivInspire called Inside. If you can share your flipchart containing the results with me, I can see how it copes.

      My email is dave,grosvenor@cannco.co.uk.

    • #6497
      Anonymous
      Inactive

      Oops – that should have been a dot between the “dave” and “grosvenor” – but that should fool the hackers eh!

    • #6498
      Anonymous
      Inactive

      Mike,

      Totally agree. It seems extremely simple to integrate an Excel filing system that would spit out a “graded” report and actually let a teacher record grades quickly and efficiently. At the moment, a test with 50 questions taken by 3 sections of 30 students each, with the option for them to go back and check their answers (which seems like a must), would be a nightmare to grade. It’s depressing because I feel like the ActivExpression devices are SO CLOSE to doing what we really want them to do. They’re right there, but still too far at the moment to replace good old-fashioned Scantron.

      Do you have any tips on how you use the ActivExpressions, by chance?

    • #6499
      Anonymous
      Inactive

      Dave,

      Just out of curiosity before I send anything, what is this “Inside” you mentioned? I did a quick Google search and nothing came up. I don’t think you’re a spammer or anything, but I always like to do my due diligence before I just send something to someone. That said, I’d be very curious to see if something can be integrated between ActivInspire and the ActivExpression devices to resolve this grading issue.

      Thanks

    • #6500
      Anonymous
      Inactive

      Byron,

      If you go to http://inside.learningclip.co.uk/ you’ll see a description, the manual and a link for a 30-day trial.

      The way it works is you drop the flipchart with results into the “inside” program – and that produces an insightful report of who is struggling.

      I’ve run a few trials and it really helps me to see where the problems are in my classes.

      Please let me know how you get on.

      Cheers,

      Dave (dave.grosvenor@cannco.co.uk)

    • #6501
      Anonymous
      Inactive

      Dave,

      Looks pretty helpful. Unfortunately, when attempting to download the manual for Inside itself, I get an error that makes me think it’s on their end. I’ll keep checking back to see if that gets resolved.

    • #6502
      Anonymous
      Inactive

      Byron,

      Would you try here instead whilst the developers fix that problem?

      Cheers,

      Dave

    • #6503
      Anonymous
      Inactive

      I just stumbled upon this thread and figured I’d give my two cents on how to use self-paced questions.

      I teach elementary/middle, and have to submit test scores weekly for parents to sign off on. I created a standard answer document, and I create a class set of questions that I align to a self-paced flipchart. I disable maneuvering through the question set and multiple attempts at the correct answer.

      When I give my test, students mark their answers on the answer sheet, and after 5-10 minutes I turn on the self-paced flipchart and students have time to enter their answers. I enable the reporting that tells students whether an answer is correct or incorrect, but not what the correct answer is. In the space next to each answer they place a check if it was correct, an x if it was incorrect, or a star if they put in the wrong answer on the device or want me to grade the written answer instead.

      After teaching the kids the routine over a couple of tests, grading multiple choice (I teach science and history) is dead simple: the kids have already entered their answers, I manually grade only 3-5 responses out of 70 students a week, and the kids mark their own answer sheets so their parents know how they did.

      What Promethean should have done is allocate memory on the host computer for students to enter their answers for a set, review them, and then submit them all for grading when done. The way a Promethean software engineer explained how the LRDs work, ActivInspire pushes everything to the ActivExpressions, so a firmware update would just need to add the review ability (e.g. list the answers for however many questions are in the set) via a button that asks the host to send back the answers for display, plus a command to have the host grade them all at once.
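
      Just to make that flow concrete, here is a purely hypothetical sketch of what I’m picturing (none of these class or message names exist in Promethean’s actual software):

      class Host:
          """Stand-in for ActivInspire on the host computer (hypothetical)."""

          def __init__(self, answer_key):
              self.answer_key = answer_key   # question -> correct answer
              self.responses = {}            # (device, question) -> latest answer

          def record(self, device, question, answer):
              # The host already stores every answer as a device sends it.
              self.responses[(device, question)] = answer

          def review(self, device):
              # Proposed "review" button: the device asks the host to send back
              # its stored answers so the student can look them over.
              return {q: a for (d, q), a in self.responses.items() if d == device}

          def grade(self, device):
              # Proposed "submit all" command: the host grades the final set in
              # one pass, so only the last answer per question ever counts.
              answers = self.review(device)
              correct = sum(1 for q, a in answers.items() if self.answer_key.get(q) == a)
              return correct, len(self.answer_key)

      host = Host({"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C"})
      host.record("device-07", "Q1", "A")
      host.record("device-07", "Q1", "B")   # student reviews and changes Q1
      host.record("device-07", "Q2", "D")
      host.record("device-07", "Q3", "A")
      host.record("device-07", "Q4", "B")
      print(host.grade("device-07"))        # (3, 4) - graded once, no duplicate rows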

      Considering I seriously doubt they’ll make an ActivExpression Series 3, it’ll sadly never see the light of day.

    • #6504
      Anonymous
      Inactive

      Hi Loren,

      I’m fascinated by the ingenuity of your work-around. Could you possibly send me a copy of one of your flipcharts – and the grading scheme you’re applying. I *may* have a better solution for you. My email address is dave.grosvenor@cannco.co.uk.

      Cheers,

      Dave

    • #6505
      Anonymous
      Inactive

      I agree. If only there were an option, when creating the self-paced test, to count only the last answer a student gave for each question; it would make grading so much easier. I use my ActivExpressions v2 daily for bellwork, classwork, quizzes, and tests. When grading tests or quizzes I spend a good deal of time manipulating the spreadsheet to get the information I want, mostly deleting all the answers students gave before their final answer. I also like to use multiple versions of the test, which requires sorting students by which version they had. I usually enter the key for each version by manipulating the spreadsheet.
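
      If anyone wants to automate that same cleanup, this rough sketch (Python/pandas, with made-up column names and example keys, so it would need adjusting to the real export) is essentially what I do by hand: keep only the final answers, then grade each student against the key for their version of the test.

      import pandas as pd

      # Made-up column names (Student, Version, Question, Answer); the real
      # export's headers would need to be mapped onto these first.
      responses = pd.read_excel("quiz_export.xlsx")

      # Keep only each student's final answer per question, assuming the later
      # rows hold the re-checked answers.
      final = responses.drop_duplicates(subset=["Student", "Question"], keep="last").copy()

      # One answer key per version of the test (example keys only).
      keys = {
          "A": {"Q1": "C", "Q2": "B", "Q3": "D"},
          "B": {"Q1": "D", "Q2": "A", "Q3": "B"},
      }

      # Mark each response right or wrong against that student's version key.
      final["Correct"] = [
          keys[v].get(q) == a
          for v, q, a in zip(final["Version"], final["Question"], final["Answer"])
      ]

      # Percent correct per student, based on the true number of questions.
      print(final.groupby("Student")["Correct"].mean().mul(100).round(1))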

    • #6506
      Anonymous
      Inactive

      Hi,

      I think that the program Inside might help you. Download it from here: http://inside.learningclip.co.uk/Downloads/2/Inside_v1.1.2.zip – and see the manual here: http://inside.learningclip.co.uk/Downloads/1/Inside_manual_V3.pdf.

      It provides two measures for students:
      1. Total = correct responses – incorrect responses
      2. Score = 100 x correctly answered questions x % correct responses / questions answered

      It also divides the students into 5 groups – that *might* marry up to your grading scheme.

      Cheers,

      Dave dave.grosvenor@cannco.co.uk
