Multiple Response Q - Limitations?

jhauglie5350 Community Member Posts: 25
Hi y'all - we're pushing out our training package on Monday and are doing final user testing. No one has been able to successfully get past one question near the middle of the training. I finally tried something and it seems to work, so I am posing a question here: do multiple response questions have a limit on how many correct answers are possible?

The item in question asks the users to select all topics discussed in the (long and meandering) section. I compiled a list of 8 choices, 4 correct and 4 incorrect. Yet the LMS consistently failed to advance past the question, regardless of which - or how many - options were selected. So I cut the list to 6 choices (3 of each) - and now the LMS processes the page with no problems. (None of the answer options was anywhere near 255 characters.)

Has anyone else seen this behavior with multiple response questions before, or is this further evidence that there is extant alien life somewhere influencing software behavior (in other words, gremlins with coronavirus or something)?

Thanks in advance!

Comments

  • cainam Community Member Posts: 290 ☆ Superstar ☆
    I think the longest one I've used had 10 choices - no issues. So the only time you have problems is when the course is loaded into the LMS? If you publish as HTML and try it out locally, does it have any problems?
    - Adam Cain
  • brobertson4402 Community Member Posts: 24
    As far as I know there is no limit on the number of answers to a multi-select question. I suspect there is some other issue with the question that was resolved when you reduced the number of possible answers. How is it processing feedback, etc.? I don't think your LMS processes the page; any page processing should depend on the Lectora output. What happens when you preview in Lectora? Does it work there?
  • jhauglie5350 Community Member Posts: 25
    Trying the question locally, and previewing in Run mode, shows nothing unusual. All behaviors (including correct answers) are as expected. I am thinking that there are other culprits at work somewhere, because earlier multiple response (and other format) questions show no such problems.

    Thanks for the tips!
  • timk Community Member Posts: 1,176 ☆ Superstar ☆
    Do you retain the value of the question variable(s) between sessions? The space to save variables in the LMS is limited. This could explain why it works with a "shorter" question.

    Are there any errors in the console?
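
    To see how close you actually are to that limit, you can read cmi.suspend_data from the browser console while the course is running in the LMS. A rough sketch, assuming a standard SCORM 1.2 API object (the usual window.API lookup) is reachable from the content frame; your LMS may wrap it differently:

    ```ts
    // Minimal sketch: find the SCORM 1.2 API object by walking up the
    // frame hierarchy, then report how much of cmi.suspend_data is in use.
    // Assumes a standard SCORM 1.2 runtime that exposes LMSGetValue.
    function findAPI(win: any): any {
      let w: any = win;
      while (w) {
        if (w.API) return w.API;    // standard SCORM 1.2 API object
        if (w === w.parent) break;  // reached the top window
        w = w.parent;
      }
      return win.opener ? findAPI(win.opener) : null;
    }

    const api = findAPI(window as any);
    if (api) {
      const data: string = api.LMSGetValue("cmi.suspend_data") || "";
      // SCORM 1.2 only guarantees 4096 characters for the whole course
      console.log(`suspend_data: ${data.length} of 4096 characters used`);
    } else {
      console.log("No SCORM 1.2 API object found in this frame hierarchy");
    }
    ```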
  • jhauglie5350 Community Member Posts: 25
    Tim - thanks - Yes, question variable values are retained. And it's intriguing that a selection pool of 8 items would cause some type of "overflow" when a drag-and-drop selection pool of 12 items (in a different question) does not.

    Are we seeing errors? MULTIPLE. And frankly it's driving me nuts (OK, more so than usual). The error we're seeing when testing in the LMS (a platform called Journey, from a vendor named Red Vector) is usually the one attached. Sometimes it is preceded by a window that drops open and warns that stack value limits have been exceeded. The problem is that the errors appear only during the quizzes (called knowledge checks in this module).

    Each of the five sections ends with a 10-12 question quiz involving a mix of true/false, multiple choice, multiple response, and drag-and-drop items. The errors do not appear in the first quiz, but as the user proceeds into the second quiz (after the second section) they start appearing more or less randomly. I've rebuilt the question items several times, validated that the question numbers and variables correspond correctly, and have just about stood on my head while publishing (no errors). But the errors continue to pop up. We've submitted a ticket to the vendor and asked them to investigate on their side, but so far there's been no response.

  • cainam Community Member Posts: 290 ☆ Superstar ☆
    Years ago I remember modifying a test because I ran into some variable storage limits... I can't remember the exact scenario, or whether this will help - but what I did was not include the actual text in the answers themselves, just "1a", "1b", "1c" (or a shortened version of each answer), and then added text blocks outside the test question with the real answers/distractors (just hide the text for the actual question and use your own). That cuts down the variable length for each question...
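
    Roughly speaking, the saved value of a multiple response question grows with the text of every selected option, so swapping in short codes shrinks it considerably. A quick sketch with invented option texts (the comma join is just a stand-in for however the selections actually get concatenated):

    ```ts
    // Illustrative only: compare the stored value for a multiple response
    // question when full option text is saved vs. short answer codes.
    const fullTextSelections = [
      "First correct option written out as a full sentence of answer text",
      "Second correct option written out as a full sentence of answer text",
      "Third correct option written out as a full sentence of answer text",
      "Fourth correct option written out as a full sentence of answer text",
    ];
    const shortCodeSelections = ["1a", "1c", "1e", "1g"];

    const joined = (opts: string[]) => opts.join(","); // stand-in for the real concatenation
    console.log(`Full text:   ${joined(fullTextSelections).length} characters`);
    console.log(`Short codes: ${joined(shortCodeSelections).length} characters`);
    ```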
    - Adam Cain
  • timk Community Member Posts: 1,176 ☆ Superstar ☆
    cmi.suspend_data is indeed the field in the LMS where all of the values that are "retained between sessions" are saved. In SCORM 1.2 the field is limited to 4096 characters (for the whole course, not per question). As you can see in your screenshot, a lot of that space is already used by the page tracking.

    By the way: the variable of a drag and drop question with 12 items is 53 characters long, while the variable of a multiple response question is as long as all of the selected option texts, so it can easily be much longer.

    But all values are stored in the same field, so it is not a problem with this individual question; it is just the one that usually pushes the total over the limit. It could just as well be a later question, if the answers to the previous questions happen to be very short, e.g. if you select only one option for each question.
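
    A hypothetical budget makes that pattern easier to see: page tracking and every retained question variable share the same 4096-character pool, so whichever question pushes the running total over the edge is the one that appears to break. All of the numbers below except the 4096-character limit are invented:

    ```ts
    // Hypothetical SCORM 1.2 suspend_data budget; every figure except the
    // 4096-character limit is made up for illustration.
    const LIMIT = 4096;

    const usage: Array<[string, number]> = [
      ["page tracking / bookmarking", 1800],
      ["quiz 1 variables (short answers)", 900],
      ["quiz 2 drag-and-drop variable", 53],
      ["quiz 2 multiple response (full option texts)", 1500],
    ];

    let total = 0;
    for (const [item, chars] of usage) {
      total += chars;
      const status = total > LIMIT ? "<-- over the limit here" : "ok";
      console.log(`${item.padEnd(46)} +${String(chars).padStart(5)}  total ${total}  ${status}`);
    }
    ```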
  • jhauglie5350 Community Member Posts: 25
    Tim and Adam - THANKS!! This is great information, and Adam, a great workaround! I will see how I can modify these questions as soon as I can get v18 up and running. (Yeah, I thought I could get it done fairly easily...boy was I wrong...)