### Question

Can I use a comma as a decimal separator in my SCORM Course?

### Answer

1. The first step is to dive into the SCORM 1.2 and SCORM 2004 specifications.

SCORM is fairly silent on the subject. In SCORM 1.2, the definition of the data type is:

CMIDecimal - A number that may have a decimal point. If not preceded by a minus sign, the number is presumed to be positive.

Examples are "2", "2.2", and "-2.2".

In SCORM 2004, the definition of the data type is:

Real (10, 7): The real(10,7) data type denotes a real number with a precision of seven significant digits.

Neither of these definitions is conclusive. One could argue that they slightly favor a period alone, since that is the only separator shown in the examples; however, one could also argue that the absence of the comma separator was simply an oversight by someone unfamiliar with internationalization issues.
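Under the period-only reading of CMIDecimal, the format can be captured with a simple pattern. The function and regex below are our own illustration of that reading, not part of either specification:

```javascript
// A sketch of a validator for the period-only interpretation of CMIDecimal.
// The regex is our own; the specs define the type only in prose.
function isCmiDecimal(value) {
  // Optional minus sign, digits, then an optional single decimal point
  // followed by more digits.
  return /^-?\d+(\.\d+)?$/.test(value);
}

// The spec's own examples all pass:
console.log(isCmiDecimal("2"));    // true
console.log(isCmiDecimal("2.2"));  // true
console.log(isCmiDecimal("-2.2")); // true
// A comma-separated value fails under this reading:
console.log(isCmiDecimal("2,2"));  // false
```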

2. The next step is to look down a layer at the ECMAScript (JavaScript) standard to see how numbers are supposed to be formatted and interpreted there.

Stack Overflow has a good discussion of this issue.

It's pretty clear that in ECMAScript, only a period is a valid separator.
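You can verify this directly in any JavaScript console; ECMAScript's built-in string-to-number conversion only recognizes the period:

```javascript
// ECMAScript's own conversion of these strings to numbers:
console.log(Number("2.2"));     // 2.2
console.log(Number("2,2"));     // NaN -- the comma is not a valid separator
console.log(parseFloat("2,2")); // 2 -- parsing stops at the comma, silently dropping the fraction
```

Note the parseFloat behavior in particular: a comma-formatted score doesn't just fail, it can be silently truncated to the integer part.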

3. Back to the SCORM spec, looking at the definition of the second argument to the SetValue call, it says:

parameter_2 – The value to which the contents of parameter_1 is to be set. The value of parameter_2 shall be a characterstring that shall be convertible to the data type defined for the data model element identified in parameter_1.

That phrase indicates that a comma MIGHT be permissible. A number with a comma separator is certainly convertible to a real decimal. But what is the definition of "convertible," and how far is the LMS expected to go? The values "3+7/2.8^3" and "forty-two point six" are technically convertible to a decimal, but nobody would expect an LMS to accept those. A logical interpretation of "convertible" would then fall back to the formats ECMAScript will automatically convert, and the Stack Overflow discussion shows that a comma would not automatically be converted. On the other hand, it would seem reasonable to assume that an LMS could correctly interpret the comma in a locale where it was known to be the correct format.
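Given that ambiguity, the safest course for content authors is to normalize locale-formatted values before sending them, rather than hoping the LMS will interpret a comma. The helper below is a defensive sketch of our own (the `toCmiDecimal` name and its assumptions are ours, not from the spec):

```javascript
// Defensive sketch: normalize a locale-formatted number to the period form
// before passing it to the LMS via SetValue/LMSSetValue.
function toCmiDecimal(localizedValue) {
  // Assumes a comma, if present, is a decimal separator (e.g. "87,5"),
  // not a thousands separator.
  var normalized = String(localizedValue).replace(",", ".");
  if (Number.isNaN(Number(normalized))) {
    throw new Error("Not convertible to a CMI decimal: " + localizedValue);
  }
  return normalized;
}

// Usage (SCORM 1.2 style):
// API.LMSSetValue("cmi.core.score.raw", toCmiDecimal("87,5")); // sends "87.5"
```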

4. The next step is to look at the SCORM Conformance Test Suite and see what the LMS test looks for. Unfortunately, it doesn't look like there are any tests where a comma is sent as part of a decimal value. That probably means it is simply not tested. But it could also mean that the tests would behave differently when run in a locale where commas are used.

5. After that, the next step would be to run the SCORM Conformance Test Suite on the content and see what happens. Does it fail when a US locale is used? If so, try setting the OS locale to one that uses commas. Does it still fail?

After discussing this internally, we gathered our notes and shared them with the ADL. Their response was:

"Very interesting stuff. The real (10,7) notation seems to originate in an IEEE specification, which, as you pointed out and SCORM has a requirement, runs on ECMAScript. Thank you very much for verifying that only a period is valid in ECMAScript, because that is the same place I would have gone with this if you had just asked the question.

In terms of the intent of the notation, I would say it is almost certainly an oversight of international number notation. I think the only thing looked at in terms of international standards was that of language within the specification. Had they been thinking about numerical notation, I think you would have a Data Model field to specify a numerical notation as well. I would say that the deepness of the mathematics used in this specification is about as deep as your typical Java math library, which as far as I know doesn't get into international locales by default. I would reason that the "real" type defined in the specification is more often implemented closer to float or double in Java since a real number isn't really confined by decimal places."

So the official answer from ADL would be that the only notation you should see within a number is the negative sign and a single decimal point.


Let us know if you have any questions at support@rusticisoftware.com