One problem I had this week was that a piece of code which had been perfectly behaved for the previous few years suddenly started crashing for a new customer. This was the first time I’d seen the code run on a Finnish system, so a problem with internationalization seemed like the obvious place to start.
Running the software on a fi-FI system, I found that a call to Double.Parse was throwing a FormatException: “Input string was not in a correct format.”
The issue was that the code was attempting to parse a string containing a software version number, e.g., “2005.100”. In a US English environment this works fine – the period is the decimal separator, so the string parses as 2005.1. In a German environment it doesn’t throw either – there the period is the thousands separator, so the string parses (as 2005100 rather than 2005.1). Finnish, however, uses the period as neither the decimal separator nor the thousands separator, so when the system tried to parse the string, it didn’t recognize it as a number at all.
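If you want to see the difference for yourself, a small sketch along these lines (the hard-coded culture names are just for illustration) reproduces the behavior:

using System;
using System.Globalization;

class CultureParseDemo
{
    static void Main()
    {
        string version = "2005.100";

        // en-US: '.' is the decimal separator, so this parses as 2005.1
        Console.WriteLine(double.Parse(version, CultureInfo.GetCultureInfo("en-US")));

        // de-DE: '.' is the thousands separator, so this parses too, but as 2005100
        Console.WriteLine(double.Parse(version, CultureInfo.GetCultureInfo("de-DE")));

        // fi-FI: '.' is neither separator, so this throws a FormatException
        try
        {
            Console.WriteLine(double.Parse(version, CultureInfo.GetCultureInfo("fi-FI")));
        }
        catch (FormatException e)
        {
            Console.WriteLine("fi-FI: " + e.Message);
        }
    }
}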
The solution is to pass in the second parameter to Parse, which (in .NET) is an IFormatProvider. When no IFormatProvider is given, Double.Parse defaults to the current culture (CultureInfo.CurrentCulture), which means it was trying to interpret the string according to the system’s regional settings. By passing in CultureInfo.InvariantCulture, I force it to ignore those settings and interpret the input using the invariant (essentially English, no country) conventions.
Double.Parse(version, CultureInfo.InvariantCulture)
As a result, this value – which is always stored in the same format – is also always read in the same format, and the end user sees the same behavior regardless of the computer on which the code is being run.
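For completeness, here is a rough sketch of what that round trip looks like; the Save and Load helpers are hypothetical stand-ins for wherever the version string actually gets written and read:

using System;
using System.Globalization;

static class VersionStore
{
    // Hypothetical stand-ins for the real storage code.
    public static string Save(double version)
    {
        // Always writes "2005.1", whatever the machine's locale is.
        return version.ToString(CultureInfo.InvariantCulture);
    }

    public static double Load(string stored)
    {
        // Always reads that same format back, so fi-FI behaves like en-US.
        return double.Parse(stored, CultureInfo.InvariantCulture);
    }
}

If the stored string might not always be well formed, double.TryParse(stored, NumberStyles.Float, CultureInfo.InvariantCulture, out value) avoids the exception entirely.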
I have learned that you should always set the culture to InvariantCulture. OK. I understand why. But why does Double.Parse default to the current culture? Isn’t that a flaw in the framework?
No, it’s just a different use case.
In this instance, the string to be parsed was always written in en-US form, so it’s correct to use InvariantCulture. But in other situations – parsing input typed by the user, for example – you would expect the string to be formatted appropriately for the current locale, and the current culture is the right choice, as in the sketch below.
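A quick sketch of the two use cases side by side (the console-input part is purely illustrative, not from the original code):

using System;
using System.Globalization;

class ParseUseCases
{
    static void Main()
    {
        // Machine-to-machine data (version strings, config files): invariant culture,
        // because the stored format never varies with the user's locale.
        double version = double.Parse("2005.100", CultureInfo.InvariantCulture);
        Console.WriteLine("Version: " + version);

        // Human input: current culture, so a Finnish user can type "2,5"
        // and a US user can type "2.5".
        Console.Write("Enter a number: ");
        string input = Console.ReadLine();

        double value;
        if (double.TryParse(input, NumberStyles.Float | NumberStyles.AllowThousands,
                            CultureInfo.CurrentCulture, out value))
        {
            Console.WriteLine("Parsed " + value + " using " + CultureInfo.CurrentCulture.Name);
        }
        else
        {
            Console.WriteLine("That isn't a number in this locale's format.");
        }
    }
}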