this post was submitted on 29 Jun 2024
901 points (94.9% liked)
Programmer Humor
you are viewing a single comment's thread
I’m probably going to get a lot of hate for this, and I do recognize there have been problems with it all over the place (my code too), but I like null. I don’t like how it fucks everything up. But from a data standpoint, how else are you going to treat uninitialized data, or data with no value? Some people might initialize an empty string, but to me that’s a valid value in some cases. Same for using -1 or zero for numbers. There are cases where those values are valid. It’s like using 1 for true, and zero for false.
Whoever came up with the null coalescing operator (??) and optional chaining (?->) made strides toward handling null more elegantly.
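For instance, here's how the two operators combine in JavaScript/TypeScript (where they're spelled `??` and `?.`) — the `greet` helper and `User` shape are just illustrative:

```typescript
interface User {
  name: string;
  nickname?: string | null; // may be absent, or explicitly null
}

function greet(user: User | null): string {
  // Optional chaining (?.) short-circuits to undefined when user is null;
  // nullish coalescing (??) falls through only on null/undefined —
  // unlike ||, it would keep a "" or 0 on its left-hand side.
  return user?.nickname ?? user?.name ?? "stranger";
}
```

`greet(null)` falls all the way through to `"stranger"`, while `greet({ name: "Ada", nickname: null })` skips the null nickname and returns `"Ada"` — no explicit null checks anywhere.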
I’m more curious why JavaScript has both null and undefined, and of course NaN. Now THAT is fucked up. Make it make sense.
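For what it's worth, the three really are distinct values with distinct behavior — a quick sketch:

```typescript
// JavaScript's three "not a value" values are not interchangeable.
const nothing: string | null = null;
const missing: string | undefined = undefined;

console.log(typeof nothing);            // "object" — a decades-old quirk of typeof null
console.log(typeof missing);            // "undefined"
console.log(nothing == missing);        // true:  loose equality conflates null and undefined
console.log(nothing === missing);       // false: strict equality distinguishes them
console.log(Number.NaN === Number.NaN); // false: NaN is the only value not equal to itself
console.log(Number.isNaN(Number.NaN));  // true:  the reliable way to test for NaN
```

Roughly: undefined means "never assigned", null means "deliberately empty", and NaN means "a numeric operation failed" — three different kinds of nothing.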
To offer a differing opinion, why is null helpful at all?
If you have data that may be empty, it's better to explicitly represent that possibility with an `Optional<T>` generic type. This makes the API more clear, and if implicit null isn't allowed by the language, prevents someone from passing null where a value is expected.

Or if it's uninitialized, the data can be stored as `Partial<T>`, where all the fields are `Optional<U>`. If the type system is nominal, it would ensure that the uninitialized or partially-initialized type can't be accidentally used where `T` is expected, since `Partial<T>` != `T`. When the object is finally ready, have a function to convert it from `Partial<T>` into `T`.
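A minimal TypeScript sketch of that pattern, assuming a hand-rolled tagged `Optional` (the `Config`/`finalize` names are purely illustrative):

```typescript
// A tiny tagged Optional, standing in for a library type such as
// Rust's Option or Java's Optional.
type Optional<T> = { kind: "some"; value: T } | { kind: "none" };
const some = <T>(value: T): Optional<T> => ({ kind: "some", value });
const none: Optional<never> = { kind: "none" };

// T: the fully-initialized shape that the rest of the code is allowed to see.
interface Config {
  host: string;
  port: number;
}

// Partial<Config> is TypeScript's built-in "all fields optional" type —
// the right shape for a value that is still being built up.
function finalize(draft: Partial<Config>): Optional<Config> {
  // Only hand out a Config once every field is actually present;
  // an incomplete draft can never escape as a Config.
  return draft.host !== undefined && draft.port !== undefined
    ? some({ host: draft.host, port: draft.port })
    : none;
}
```

Because `Partial<Config>` and `Config` are different types, code that takes a `Config` simply cannot be handed a half-built draft — the conversion is forced through `finalize`.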
Ignoring the fact that a lot of languages, and database systems, do not support generics (but do already support null), you’ve just introduced a more complex type of null value; you’re simply slapping some lipstick on it. 😊
Type-safe lipstick :)
In a discussion about whether null should exist at all, and what might be better, saying that Optional values aren't available in languages with type systems that haven't moved on since the 1960s isn't a strong point in my view.
The key point is that if your type system genuinely knows reliably whether something has a value or not, then your compiler can prevent every single runtime null exception from occurring by making sure it's handled at some stage and tracking it for you until it is.
The problem with null is that it is pervasive - any value can be null, and you can check for it and handle it, but other parts of your code can't tell whether that value can or can't be null. Tracking potential nulls is in the memory of the programmer instead of deduced by the compiler, and checking for nulls everywhere is tedious and slow, so no one does that. Hence null bugs are everywhere.
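TypeScript's `strictNullChecks` mode is one example of a compiler doing that tracking: a `string | null` simply cannot be used as a `string` until the compiler has seen the null handled on that code path.

```typescript
function shout(message: string | null): string {
  // return message.toUpperCase(); // compile error: 'message' is possibly 'null'
  if (message === null) {
    return "";                     // the null case is handled here...
  }
  return message.toUpperCase();    // ...so message is narrowed to plain string
}
```

The check is written once, where the type demands it, instead of defensively sprinkled everywhere — which is exactly the tedium the pervasive-null model forces on you.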
Tony Hoare, the otherwise brilliant computer scientist who introduced null references in the first place, called them his "billion-dollar mistake" in a 2009 talk.