I’ve never used Haskell. I won’t claim I’m good at Rust. I mostly work with Ruby and Clojure, both dynamic languages where you don’t really need to worry about types. But then of course that’s not true. Even if you put Rails’s magic aside, it’s way too easy to write code that accidentally works (in an absolutely unintended fashion).


It’s really nice that the “Clojure way” is to use maps instead of monster structs or abstract builder factory facade classes, but it gets less nice when the only thing you know about the structure of the map handed to your function is convention. I’m not arguing for a closed schema (cue Rich Hickey saying “I’m not doing that”), at least not as long as passing around data structures larger than necessary, and extracting the needed data from them, is (close to) free (which, sadly, it is not, but let’s pretend it is).

A type system I’d love to use is one where I can specify a data structure and its properties (a schema?) and the compiler will make sure I don’t accidentally hand things to the wrong function without realizing. This is very similar to what Clojure’s spec does – the idea at least, implementation aside. It’d be great if the language (runtime?) automatically gave me the needed “view” (in the sense of database (materialized) views) of a data structure passed into my process, for free.
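One way to approximate that “view” idea in today’s Rust is a trait that names only the pieces a function needs, so any structure providing them qualifies, no matter what else it carries. A minimal sketch – all names here are mine, invented for illustration:

```rust
// A "view" expressed as a trait: the function below only ever sees
// the name, not the rest of the structure.
trait HasName {
    fn name(&self) -> &str;
}

#[allow(dead_code)]
struct User { name: String, email: String }
#[allow(dead_code)]
struct Company { name: String, vat_id: String }

impl HasName for User { fn name(&self) -> &str { &self.name } }
impl HasName for Company { fn name(&self) -> &str { &self.name } }

// Works for anything that provides the view.
fn greet(x: &dyn HasName) -> String {
    format!("Hello, {}!", x.name())
}

fn main() {
    let u = User { name: "Ada".into(), email: "ada@example.com".into() };
    let c = Company { name: "Acme".into(), vat_id: "HU123".into() };
    println!("{}", greet(&u)); // prints "Hello, Ada!"
    println!("{}", greet(&c)); // prints "Hello, Acme!"
}
```

The difference from the wished-for feature is that the views (and the impls) still have to be declared by hand, rather than derived by the compiler from a schema.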

Is it really such an impossible thing to have spec-like checks statically at compile time? I don’t think so. Even in strictly typed languages, (user) input has to be cast to some known type at some point. This cast might or might not succeed, potentially resulting in an error (at runtime). But in the end it’s my (or the library author’s) responsibility to tell the compiler that something is fine for sure: “take my word for it.” While painful at times, Rust forcing me to consider more cases with its Result type quickly becomes more helpful than annoying.
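That “cast at the edge” is exactly what Rust’s standard library does when parsing untrusted input – a tiny sketch:

```rust
// Parsing untrusted input: the cast from &str to u16 may fail,
// so the standard library hands back a Result instead of throwing.
fn read_port(input: &str) -> Result<u16, std::num::ParseIntError> {
    input.trim().parse::<u16>()
}

fn main() {
    // The compiler forces both cases to be considered.
    match read_port("8080") {
        Ok(port) => println!("listening on {port}"),
        Err(e) => eprintln!("not a port: {e}"),
    }
}
```

Past this one point, every function downstream can take a `u16` and never worry about malformed input again.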

On the other hand, I’ve grown to really dislike exceptions, especially exceptions as flow control. It just feels lazy. Disasters happen and a program might crash, but passing exceptions around all the time shouldn’t be business as usual. I haven’t measured in Ruby, but in Java (and therefore Clojure) it’s ridiculously expensive too. Rust’s Results might feel awkward at points, but having all of that handled quietly for me by the ? operator is really convenient.
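For the record, this is all ? does: on Err it returns early to the caller, on Ok it unwraps, so the happy path stays flat. A small sketch with made-up function names:

```rust
use std::num::ParseIntError;

// With `?`, the error case is propagated to the caller instead of
// being thrown as an exception; no try/catch pyramid needed.
fn sum_of_inputs(a: &str, b: &str) -> Result<i64, ParseIntError> {
    let a: i64 = a.trim().parse()?; // returns early on Err
    let b: i64 = b.trim().parse()?;
    Ok(a + b)
}

fn main() {
    println!("{:?}", sum_of_inputs("2", "40"));    // Ok(42)
    println!("{:?}", sum_of_inputs("2", "forty")); // Err(..)
}
```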

I don’t know how efficient (or not) it’d be to have data descriptions stricter than “just” i64. I don’t know how much of it is theoretically possible to confirm at compile time, but it feels like it should be possible all right. For example, if I have a piece of data that is specified as “an integer between 1 and 36” and I try to assign it a value returned by a function that only says f64, that should just work (and potentially Result in an error that I should handle somewhere). Same with strings that match a certain regular expression.
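Today, the closest Rust gets is spelling the constraint out by hand as a newtype: the range check runs once, at construction, and the type carries the guarantee from then on. A sketch – the name and the f64 conversion are my own invention:

```rust
use std::convert::TryFrom;

// Hypothetical "integer between 1 and 36" as a newtype; the check
// happens once, in try_from, and the type vouches for it afterwards.
#[derive(Debug, PartialEq)]
struct BoardSquare(u8);

impl TryFrom<f64> for BoardSquare {
    type Error = String;
    fn try_from(v: f64) -> Result<Self, Self::Error> {
        // accept only whole numbers within 1..=36
        if v.fract() == 0.0 && (1.0..=36.0).contains(&v) {
            Ok(BoardSquare(v as u8))
        } else {
            Err(format!("{v} is not an integer between 1 and 36"))
        }
    }
}

fn main() {
    println!("{:?}", BoardSquare::try_from(7.0));  // Ok(BoardSquare(7))
    println!("{:?}", BoardSquare::try_from(99.0)); // Err(..)
}
```

The wished-for language would derive this whole impl from the one-line specification instead of making me write it.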

Some of the checking obviously has to be done at runtime, but if it’s only at the edges, then I think it’d be great to have safety everywhere else knowing for sure (or as close to sure as possible) that I don’t accidentally treat the user’s email address as a UUID. On the other hand, I think this should be optional: nothing is more frustrating than Rust telling me I can’t do something because it’s the wrong type when I know it has the right stuff inside (wrapped by six iterators and three boxes maybe).
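The email-versus-UUID mix-up is the classic case for newtype wrappers: both are strings underneath, but the compiler refuses to confuse them. A sketch with deliberately naive validation (a real email parser this is not):

```rust
// Distinct types for things that are all "just strings" at runtime.
struct Email(String);
struct UserId(String);

impl Email {
    fn parse(s: &str) -> Result<Email, String> {
        // stand-in check, not a real email validator
        if s.contains('@') {
            Ok(Email(s.to_string()))
        } else {
            Err(format!("{s} doesn't look like an email"))
        }
    }
}

// This function simply cannot be handed an Email.
fn delete_user(id: &UserId) -> String {
    format!("deleting {}", id.0)
}

fn main() {
    let email = Email::parse("me@example.com").unwrap();
    let id = UserId("123e4567".to_string());
    println!("{}", delete_user(&id));
    // delete_user(&email); // ← does not compile: wrong type
    let _ = email.0;
}
```

The check is at the edge (Email::parse); everywhere else the type system remembers the result for free.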

Let me play around, and when things have to be tightened up, let me tighten them up efficiently. Data specification checks should be enforced at the edges of spec-safety (I’m trying really hard not to use “type”). If “all” of a system is tightened like that, then those edges are the edges of the system (user input, IO, or other “outside world” stuff). If none of it is, then the compiler should just shrug with a disapproving look and let me try that fun-looking new cloud API without having to define all the data structures explicitly beforehand.