We do both server and client side development at iZettle. This rant is about a server side phenomenon.
Why, oh why, do all programming languages have 'system default' or 'platform default' methods? They're destined to be forever abused and misunderstood! Below are three example areas where that's a problem in Java.
Let's consider Java's Locale class as an example. There's a method, Locale::getDefault, that you can use if you want a quick value while you're stressing to get your feature out the door. Seems legit, right? The thing is that the value of this call varies between runtimes. Why is that a problem? You might just wanna log something, and basically all your servers are in en_US anyway! What if they aren't? What if someone decides to host your application on a completely new kind of machine? And what about all the poor developers running tests against this code? Do we require them to have the exact same locale on their development machines as you have in production? I totally understand the utility of this when doing client-side programming: of course you normally want application texts presented using the same locale you once set on your laptop/phone/toaster. The thing is, servers serve clients that don't give a damn about the locale you decided on for your server. So, let's stop using Locale::getDefault, shall we?
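To make the portability problem concrete, here's a minimal sketch: the same formatting call produces different strings depending on which locale is in effect, so a no-arg call that silently picks the platform default will behave differently across machines.

```java
import java.util.Locale;

public class LocaleDemo {
    public static void main(String[] args) {
        double price = 1234.5;
        // Explicit locale: identical output on every machine.
        System.out.println(String.format(Locale.US, "%,.2f", price));      // 1,234.50
        // On a server whose platform default happens to be German,
        // String.format("%,.2f", price) (no locale argument) would
        // produce this instead:
        System.out.println(String.format(Locale.GERMANY, "%,.2f", price)); // 1.234,50
    }
}
```

Passing the locale explicitly is a one-argument change, and it makes the output independent of where the code runs.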
This one might be less important than the others (because it seems like UTF-8 is finally becoming the default on most platforms), but hey, we have Windows devs as well! So why even have a String.getBytes overload that doesn't accept any arguments? It just lures devs into writing code with poor portability!
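A short sketch of why the no-arg overload is a trap: the byte count (and the bytes themselves) of a non-ASCII string depend on the charset, so getBytes() without arguments returns different results on machines with different default charsets.

```java
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        String s = "héllo";
        // Portable: always UTF-8, regardless of the platform default.
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        System.out.println(utf8.length); // 6: 'é' is two bytes in UTF-8
        // s.getBytes() without arguments uses the platform default charset;
        // on a machine defaulting to a Latin-1-style charset it would
        // return this instead:
        byte[] latin1 = s.getBytes(StandardCharsets.ISO_8859_1);
        System.out.println(latin1.length); // 5: 'é' is one byte in ISO-8859-1
    }
}
```

(Note that since Java 18, the default charset is specified to be UTF-8 everywhere, which fixes exactly this class of bug for newer runtimes.)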
Now we're coming to a pet peeve of mine: time zones. In Java, both the old (java.util.Date-based) and the new java.time.* libraries are full of these 'system default' annoyances. Take LocalDate.now() as an example. This call basically asks: "Given the time zone settings of your laptop, what day is it now?" That question might seem OK as long as the user experience takes place on your laptop as well, but consider the logical extension of it: "Hey, I'm a customer from Brazil. Oh, and BTW, what day is it now?" It might not be apparent to everyone, but to know what day it is right now in Brazil, you need to know not only the absolute point in time, but also the time zone of the spectator. Basically, the time zone of the server is never interesting. My belief is that most people use these method calls believing one of two things:
1. That the actual time zone used will be resolved to UTC anyway, which is exactly what they wanted but didn't express.
2. That the time zone doesn't really matter in this context.
1 -> No it won't. The result of the execution will vary between machines, notably production machines and development machines. This is an extremely common cause of intermittently failing tests (typically at the hours of the day when there's an actual date difference between two time zones).
2 -> Well, it does matter. Here are two statements, at least one of which is almost guaranteed to be a bug, because it does not do what the developer intended:
LocalDate.now()
    .atStartOfDay(ZoneId.of("Europe/Stockholm"));

LocalDate.now(ZoneId.of("Europe/Stockholm"))
    .atStartOfDay();
The difference between these two is subtle but important (can you spot the nonsensical one?).
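To see why mixing the server's zone into a date question goes wrong, here's a deterministic sketch using a fixed Clock (the instant and zones are chosen for illustration): at the very same moment, "today" in one zone is "tomorrow" in another, so a LocalDate obtained from the server's default zone is simply the wrong input for a computation about some other zone.

```java
import java.time.Clock;
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneId;

public class WhatDayIsIt {
    public static void main(String[] args) {
        // One single instant in time, fixed so the demo is deterministic.
        Instant instant = Instant.parse("2024-03-01T22:00:00Z");

        // Pretend the server runs in Tokyo (UTC+9): it's already March 2nd there.
        Clock tokyoClock = Clock.fixed(instant, ZoneId.of("Asia/Tokyo"));
        System.out.println(LocalDate.now(tokyoClock));     // 2024-03-02

        // A spectator in Stockholm (UTC+1 at this date) is still on March 1st.
        Clock stockholmClock = tokyoClock.withZone(ZoneId.of("Europe/Stockholm"));
        System.out.println(LocalDate.now(stockholmClock)); // 2024-03-01
    }
}
```

Passing an explicit ZoneId (or, even better, a Clock, which also makes tests trivial to write) forces the caller to answer the question "what day is it for whom?".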
A (non-exhaustive) list of some other methods in the same problem sphere:
LocalTime.now()
LocalDateTime.now()
Timestamp::toLocalDateTime
ZonedDateTime.now()
ZoneId.systemDefault()
TimeZone.getDefault()
Clock.systemDefaultZone()
Calendar.getInstance()
new SimpleDateFormat()
Date::getYear
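Most of these have overloads or drop-in replacements that take an explicit zone, clock, or locale. A few illustrative pairs (a sketch, not an exhaustive mapping):

```java
import java.text.SimpleDateFormat;
import java.time.Clock;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.util.Calendar;
import java.util.Locale;
import java.util.TimeZone;

public class ExplicitAlternatives {
    public static void main(String[] args) {
        // Instead of ZonedDateTime.now():
        ZonedDateTime utcNow = ZonedDateTime.now(ZoneOffset.UTC);

        // Instead of LocalDateTime.now() — taking a Clock also lets
        // tests inject a fixed one:
        LocalDateTime ldt = LocalDateTime.now(Clock.system(ZoneOffset.UTC));

        // Instead of Calendar.getInstance():
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"), Locale.US);

        // Instead of new SimpleDateFormat("yyyy-MM-dd"):
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd", Locale.US);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));

        System.out.println(utcNow.getZone()); // Z
    }
}
```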
Another problem with these methods is that they hide the inherent complexity of time zones and offsets, which they shouldn't. Why not? Because programmers need to understand what they're doing when writing time-related code. Offering these seemingly easy methods just postpones the developer's necessary understanding of these matters.
So, while I understand the need for these methods for client-side programming, they should never be used for server-side programming!
So, what's the solution?
I would suggest a couple of different approaches to the problem:
1. Keep the methods, but remove the no-args versions, so that the caller is always forced to explicitly pass in a parameter stating 'system default, please'. Say there were a phony time zone, TimeZone.PLATFORM_DEFAULT, that could be passed in instead of a proper time zone. This would force developers to explicitly say that they're using whatever the platform chooses. It would discourage the shortcuts taken today, but still not be overly burdensome for actual client devs.
2. Have some default methods, but make them use standardized values instead of a 'platform default'. This would mean that UTF-8, UTC and 'en_US' would be official standard values. 99.9% of the time, you only want UTF-8: let the exceptional cases be the explicit ones. Same goes for time zones: while it's probably incorrect, developers believe that UTC is what they want 99% of the time, so at least make the code do what the developer believes. As for locales, 'en_US' is not better or worse than any other locale; it just happens to be the most common one on default server installs today. These standard values might be a bit confusing for developers as well, but at least you would get consistent results from your code, no matter where you're running it.
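Approach 2 can be approximated today with a tiny in-house facade. This is a hypothetical sketch (the class and constant names are made up): its no-arg methods resolve to fixed, documented standards instead of whatever the host platform happens to use.

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Locale;

// Hypothetical facade: one documented standard per category,
// so "the default" means the same thing on every machine.
final class Std {
    static final Charset CHARSET = StandardCharsets.UTF_8;
    static final ZoneOffset ZONE = ZoneOffset.UTC;
    static final Locale LOCALE = Locale.US;

    private Std() {}

    // Same result on every machine, unlike LocalDate.now().
    static LocalDate today() {
        return LocalDate.now(ZONE);
    }
}

public class StdDemo {
    public static void main(String[] args) {
        System.out.println(Std.today());
    }
}
```

Teams can then ban the raw platform-default calls in review (or via static analysis) and point everyone at the facade instead.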
3. Add a compiler flag, noSystemDefaults, that forbids any system-default method calls whatsoever. This flag should be used by everyone doing server-side development. There's nothing better than build-time prevention of bad code. If you have a compiled language, use its USP for what it's good for! This one is my clear favorite. So, Oracle (or the JCP): please do it!
To remedy the most basic violations, we've added a couple of regexp rules to our central checkstyle configuration.
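As an illustration (these are not our exact rules), a Checkstyle RegexpSinglelineJava module can flag the no-arg calls at build time:

```xml
<module name="RegexpSinglelineJava">
    <property name="format" value="(LocalDate|LocalTime|LocalDateTime|ZonedDateTime)\.now\(\)"/>
    <property name="message" value="Pass an explicit ZoneId or Clock instead of relying on the system default time zone."/>
    <property name="ignoreComments" value="true"/>
</module>
```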