- This topic has 12 replies, 7 voices, and was last updated 4 years, 7 months ago by Anonymous.
February 4, 2016 at 6:12 pm · #1269 · Byron (Participant)
I’ve been trying to determine whether using ActivExpression2 SDRs with ActivInspire can be an effective tool for administering quizzes and tests. Everything I’ve played around with so far leads me to believe that it can, except for one small aspect.
Once I’ve administered a “test” through ActivInspire, using the SDRs to test students’ retention, and (most importantly) having generated my questions through ActivInspire with the option for students to go back and check their answers, I then export the test results into an Excel spreadsheet. The displayed results are fantastic (I love the green/red distinction between right and wrong answers), except for the fact that for every question the student went back and checked, Excel generates an additional row of information that displays the answer the student gave after they went back and checked their answer.
As you can see, even when a student went back and changed what was once a correct answer into an incorrect answer (Q4), the Excel spreadsheet still displays that additional line of information that does nothing but confuse someone trying to grade this assessment. There’s simply no quick way to determine just how many questions the student truly got wrong.
Furthermore, if you look at the second image, you’ll see that even though there are technically only 4 questions, ActivInspire tells Excel that there were actually 6 (only because this particular student went back and checked 2 of the questions, hence the “additional” 2). And even though the score is technically correct, a 50%, the Excel spreadsheet is calculating that as 3 correct answers out of 6 (again, because it auto-populates duplicate questions with the new answer instead of just overwriting the first answer). Excel SHOULD be calculating that 50% from the fact that the student answered 2 out of the 4 questions correctly. Now, this is a forgiving example because both come out to 50%, but in a situation (which I have tested) where a student answered 3/4 questions correctly and the computer thinks he really got 4/5 because he went back and checked his answer on just one question, that percentage is going to be off (75% vs 80%).
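If the export really does just append an extra row each time a student re-checks a question, with the revisit rows appearing after the original answer, the duplicates can be collapsed with a short script before grading. A minimal Python sketch, assuming hypothetical "Question"/"Correct" fields (the real export headings will differ):

```python
# Sketch: collapse the duplicate rows the export produces when a student
# revisits a question, keeping only the student's FINAL answer per question.
# The field names "Question" and "Correct" are assumptions, not the actual
# export headings -- adjust them to match the real spreadsheet.

def final_score(rows):
    """rows: list of per-answer dicts, in the order they were given.
    A later row for the same question replaces the earlier one."""
    final = {}
    for row in rows:
        final[row["Question"]] = row["Correct"]   # last answer wins
    total = len(final)
    correct = sum(1 for ok in final.values() if ok)
    return correct, total, round(100 * correct / total)

# Example mirroring the post: 4 real questions, 6 exported rows, because
# Q2 was re-checked (answer kept) and Q4 was changed from right to wrong.
rows = [
    {"Question": "Q1", "Correct": True},
    {"Question": "Q2", "Correct": True},
    {"Question": "Q3", "Correct": False},
    {"Question": "Q4", "Correct": True},
    {"Question": "Q2", "Correct": True},   # re-checked, answer unchanged
    {"Question": "Q4", "Correct": False},  # changed to a wrong answer
]
print(final_score(rows))  # (2, 4, 50) -- 2 of 4 correct, a true 50%
```

The same idea works on the raw Excel file via a spreadsheet library or by round-tripping through CSV; the key point is simply "keep the last row per question, then count".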
There’s just no way to make this an effective grading tool unless we remove the option for students to go back and check their answers… but that isn’t fair.
Now, I know these aren’t meant to be true grading tools, something that can replace Scantron or anything, but if anyone knows a way to work around this so Excel displays a true percentage of questions answered correctly… maybe we CAN turn these devices into true grading tools. I’m not sure whether this is an ActivInspire issue or an Excel issue, or whether there’s even a way to program Excel to work around it, but I do know that teachers would love anything that speeds up and enhances grading… and I thought this might be able to do it.
So so close…
February 5, 2016 at 3:45 pm · #6495 · Anonymous (Inactive)
One of my few complaints about the ActivExpressions is the Excel file that is generated. It doesn’t seem like it would be difficult to produce an editable file that supports the grading system you need. The only solution I’ve found is to turn off the option for students to navigate the questions on their own. Usually, if I’m grading solely from the ActivExpression results, I give them a “mulligan” in case they enter the wrong answer. I strongly agree that the Excel file could be improved to give teachers more control over the grading. Still, I love these tools and would encourage everyone to use them in their classroom!
February 5, 2016 at 10:10 pm · #6496 · Anonymous (Inactive)
I have access to another reporting tool for ActivInspire called Inside. If you can share your flipchart containing the results with me, I can see how it copes.
My email is dave,firstname.lastname@example.org.
February 5, 2016 at 10:11 pm · #6497 · Anonymous (Inactive)
Oops – that should have been a dot between the “dave” and “grosvenor” – but that should fool the hackers eh!
February 8, 2016 at 1:33 pm · #6498 · Anonymous (Inactive)
Totally agree. It seems extremely simple to integrate an Excel filing system that would spit out a “graded” report and actually let a teacher record grades quickly and efficiently. At the moment, a test with 50 questions taken by 3 sections of 30 students each, with the option to go back and check answers (which seems like a must), would be a nightmare to grade. It’s depressing because I feel like the ActivExpression devices are SO CLOSE to doing what we really want them to do. They’re right there, but still too far at the moment to replace good old-fashioned Scantron.
Do you have any tips on how you use the ActivExpressions, by chance?
February 8, 2016 at 1:37 pm · #6499 · Anonymous (Inactive)
Just out of curiosity before I send anything, what is this “Inside” you mentioned? I did a quick Google search and nothing came up. I don’t think you’re a spammer or anything, but I always like to do my due diligence before I send something to someone. That said, I’d be very curious to see whether something can be integrated between ActivInspire and the ActivExpression devices to resolve this grading issue.
February 8, 2016 at 1:55 pm · #6500 · Anonymous (Inactive)
If you go to http://inside.learningclip.co.uk/ you’ll see a description, the manual and a link for a 30-day trial.
The way it works is you drop the flipchart with results into the “inside” program – and that produces an insightful report of who is struggling.
I’ve run a few trials and it really helps me to see where the problems are in my classes.
Please let me know how you get on.
February 8, 2016 at 2:45 pm · #6501 · Anonymous (Inactive)
Looks pretty helpful. Unfortunately, when attempting to download the manual for Inside itself, I get an error that makes me think it’s on their end. I’ll keep checking back to see if that gets resolved.
February 8, 2016 at 2:56 pm · #6502 · Anonymous (Inactive)
Would you try here instead, whilst the developers fix that problem?
April 19, 2016 at 12:52 am · #6503 · Anonymous (Inactive)
I just stumbled upon this thread and figured I’d give my two cents on how to use self-paced questions.
I teach elementary/middle, and have to submit test scores weekly for parents to sign off on. I created a standard answer document and create a class set of questions I align to a self-paced flipchart. I disable maneuvering through the question set and multiple attempts at the correct answer.
When I give my test, students mark their answers on the answer sheet, and after 5–10 minutes I turn on the self-paced flipchart so students have time to enter their answers. I use the reporting option that tells students whether an answer is correct or incorrect, but not what the correct answer is. In the space next to each answer they place a check if it was correct, an x if it was incorrect, or a star if they entered the wrong answer or want me to grade the written answer instead.
After teaching the kids the routine and running a couple of tests, grading multiple choice (I teach science and history) is dead simple: the kids have already entered their answers, I manually grade only 3–5 responses out of 70 students a week, and the kids mark their own answers so their parents know how they did.
What Promethean should have done is allocate memory on the host computer for students to enter their answers for a set, review them, then submit them all for grading when done. The way a Promethean software engineer explained it, the LRDs work by having ActivInspire push everything to the ActivExpressions, so a firmware update would just need to add the review ability (e.g. list the answers for however many questions are in the set) via a button that asks the host to send back the answers for display, then a command to have the host grade them all at once.
Considering I severely doubt they’ll make an ActivExpression Series 3, it’ll never see the light of day sadly.
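The "buffer answers, review, then submit in one batch" flow proposed above can be sketched in a few lines. To be clear, this is purely illustrative Python, not Promethean's actual firmware or protocol; the Host/Device classes and their methods are invented for the sketch:

```python
# Illustrative sketch of the proposed flow: the device buffers answers,
# can ask for them back to review, and the host grades the whole set only
# on submit -- so revising an answer never creates a duplicate row.
# Nothing here reflects Promethean's real protocol; it is a thought model.

class Host:
    def __init__(self, key):
        self.key = key                        # {question: correct answer}

    def grade(self, answers):
        correct = sum(1 for q, a in answers.items() if self.key.get(q) == a)
        return correct, len(self.key)

class Device:
    def __init__(self, host):
        self.host = host
        self.answers = {}                     # buffered until submit

    def answer(self, question, choice):
        self.answers[question] = choice       # revising simply overwrites

    def review(self):
        return dict(self.answers)             # "list answers for the set"

    def submit(self):
        return self.host.grade(self.answers)  # graded once, at the end

host = Host({"Q1": "a", "Q2": "c"})
dev = Device(host)
dev.answer("Q1", "a")
dev.answer("Q2", "b")
dev.answer("Q2", "c")       # student reviews and changes Q2
print(dev.submit())         # (2, 2) -- only the final answers are graded
```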
April 25, 2016 at 1:47 pm · #6504 · Anonymous (Inactive)
I’m fascinated by the ingenuity of your work-around. Could you possibly send me a copy of one of your flipcharts – and the grading scheme you’re applying. I *may* have a better solution for you. My email address is email@example.com.
April 30, 2016 at 11:26 pm · #6505 · Anonymous (Inactive)
I agree. If only there were an option, when creating a self-paced test, to count only the last answer a student gives for each question; it would make grading so much easier. I use my ActivExpressions v2 daily for bellwork, classwork, quizzes, and tests. When grading tests or quizzes I spend a good deal of time manipulating the spreadsheet to get the information I want, mostly by deleting all the answers students gave before their final answer. I also like to use multiple versions of the test, which requires sorting students by which version they had. I usually enter the key for each version by manipulating the spreadsheet.
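The manual clean-up described above (keeping only final answers, then grading each student against the key for their test version) is mechanical enough to script. A hedged Python sketch; the version labels, answer keys, and response layout are all hypothetical, since the real export will be shaped differently:

```python
# Sketch: grade responses against per-version answer keys, keeping only
# each student's LAST response per question. Version labels "A"/"B" and
# the keys below are made-up placeholders for a real test's keys.

KEYS = {
    "A": {"Q1": "b", "Q2": "d", "Q3": "a"},
    "B": {"Q1": "c", "Q2": "a", "Q3": "d"},
}

def grade(version, responses):
    """responses: [(question, answer), ...] in the order given;
    a later entry for a question replaces the earlier one."""
    last = {}
    for question, answer in responses:
        last[question] = answer               # final answer wins
    key = KEYS[version]
    correct = sum(1 for q, a in last.items() if key.get(q) == a)
    return correct, len(key)

# A student on version A who revised Q2 before settling on the right answer:
print(grade("A", [("Q1", "b"), ("Q2", "a"), ("Q2", "d"), ("Q3", "c")]))
# (2, 3): Q1 right, Q2's final answer right, Q3 wrong
```

Sorting students into versions first, then running each group through the matching key, replaces the row-deleting pass entirely.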
May 4, 2016 at 12:12 pm · #6506 · Anonymous (Inactive)
I think that the program Inside might help you. Download it from here: http://inside.learningclip.co.uk/Downloads/2/Inside_v1.1.2.zip – and see the manual here: http://inside.learningclip.co.uk/Downloads/1/Inside_manual_V3.pdf.
It provides two measures for students:
1. Total = correct responses – incorrect responses
2. Score = 100 x correctly answered questions x % correct responses / questions answered
It also divides the students into 5 groups – that *might* marry up to your grading scheme.
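Under a plain reading of those two measures (Inside's exact definitions live in its manual, so treat this only as an illustration of the idea), they could be computed from a student's counts like this:

```python
# Illustrative only: one plain reading of the two measures described above.
#   Total = correct responses - incorrect responses
#   Score = percent of answered questions that were answered correctly
# Inside's actual formulas may differ from this reading.

def inside_measures(correct, incorrect):
    answered = correct + incorrect
    total = correct - incorrect
    score = 100 * correct / answered if answered else 0.0
    return total, score

print(inside_measures(7, 3))  # (4, 70.0)
```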