The Java SDK allows Unicode query collations to have a null locale, and this test expects them to be null in the JSON output. But the ObjC SDK transforms a null locale for Unicode collations into the device’s current locale. Which is the correct behavior?
According to the documentation, Java is behaving correctly here. The Obj-C documentation has a note saying it should fall back to en_US as a default when the passed locale is null, but it looks like Obj-C is not actually doing that. Pinging @jayahari.vavachan
In all fairness, this was just brought to my attention very recently. I just fixed it within the last month.
Will track it here:
Hi, @jayahari.vavachan. Thanks for creating a ticket to track this. To be clear, the issue is not that “ObjC Unicode collation locale returns null”. Returning null is actually what the Java SDK does; it’s the ObjC behavior that differs.
Instead, ObjC returns the device’s current locale when the Unicode collation is created with a null locale. This matches neither the Java SDK’s behavior of allowing null nor ObjC’s own documented behavior of defaulting to en_US.
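To make the three behaviors concrete, here is a small self-contained Java sketch. The helper names are hypothetical, not actual SDK APIs; each one models how a null locale would be resolved under one of the three behaviors described above.

```java
import java.util.Locale;

public class CollationLocaleDemo {

    // Java SDK behavior (per the docs): a null locale passes through
    // unchanged and appears as null in the JSON query spec.
    static String javaSdkBehavior(String locale) {
        return locale;
    }

    // ObjC documented behavior: a null locale falls back to "en_US".
    static String objcDocumentedBehavior(String locale) {
        return locale != null ? locale : "en_US";
    }

    // ObjC observed behavior: a null locale is replaced with the
    // device's current locale (modeled here with Locale.getDefault()).
    static String objcObservedBehavior(String locale) {
        return locale != null ? locale : Locale.getDefault().toString();
    }

    public static void main(String[] args) {
        // A non-null locale is untouched by all three.
        System.out.println(javaSdkBehavior("da"));
        // With null, the three behaviors diverge:
        System.out.println(javaSdkBehavior(null));        // null
        System.out.println(objcDocumentedBehavior(null)); // en_US
        System.out.println(objcObservedBehavior(null));   // whatever the device locale is
    }
}
```

The divergence only shows up for the null case, which is why a cross-platform test comparing the JSON output catches it.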
It’d be good to have these APIs behave the same. Is there a reason ObjC shouldn’t just return null like Java does?