this post was submitted on 31 Oct 2023

Programming


All things programming and coding related. Subcommunity of Technology.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.


Cross-posting this here as I saw some misconceptions about the Rust language.

I think the blog post describes well the pros of using a strongly typed language like Rust. You may fight the compiler and get slower build times, but you get fewer bugs because of the restrictions the language imposes on you.

The biggest con of Rust is that it requires up-front learning, even for someone who has programmed before. It's not like Python or Ruby, where you can just dive into a code base and learn as you go. You really need to read the Rust book (or at least skim it) to pick up the core concepts. So it has a higher barrier to entry, with all the misunderstandings that come with that.

[–] TehPers@beehaw.org 3 points 1 year ago (1 children)

My favorite tests are the ones I don't need to remember to write. I haven't needed to write a test for what happens when a function receives null in a while thanks to TS/mypy/C#'s nullable reference types/Rust's Option/etc. Similarly, I generally don't need to write tests for functions receiving the wrong type of values (strings vs numbers, for example), and with Rust, I generally don't even need to write tests for things like thread safety and sometimes even invalid states (since usually valid states can be represented by enum variants, and it's often impossible to have an invalid state because of that).
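To make that concrete, here's a minimal sketch (the types and names are made up for illustration, not taken from the blog post): `Option` forces the caller-facing code to handle the "null" case, and an enum keeps invalid states from existing at all, so there's nothing to test for.

```rust
// Sketch: using Option and an enum so "invalid state" tests become unnecessary.
// `Connection` and `describe` are illustrative names, not a real API.

#[derive(Debug)]
enum Connection {
    Disconnected,
    Connected { session_id: u32 },
}

// The signature says the input may be absent. A forgotten null check can't
// compile: the `match` must cover `None`.
fn describe(conn: Option<&Connection>) -> String {
    match conn {
        None => "no connection object".to_string(),
        Some(Connection::Disconnected) => "disconnected".to_string(),
        Some(Connection::Connected { session_id }) => {
            format!("connected (session {session_id})")
        }
    }
}

fn main() {
    println!("{}", describe(None));
    println!("{}", describe(Some(&Connection::Connected { session_id: 7 })));
}
```

There is also no "connected but with no session" state to test: the session id only exists inside the `Connected` variant.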

There is a point where it becomes too much, though. While I'd like it if the compiler ensured arbitrary preconditions like "x will always be between 2 and 4", I can't imagine what constraints enforcing that would impose on how you actually write the code. Rust does have the NonZero* types, but those are checked at runtime, not at compile time.
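The runtime nature of that check is visible in the standard library API: `NonZeroU32::new` returns an `Option`, because zero can only be rejected when the value is known.

```rust
use std::num::NonZeroU32;

fn main() {
    // The check happens at runtime: `new` returns None for zero...
    assert!(NonZeroU32::new(0).is_none());

    // ...and Some for anything else.
    let n = NonZeroU32::new(5).expect("5 is non-zero");
    assert_eq!(n.get(), 5);
    println!("{}", n);
}
```

The payoff is downstream: a function taking `NonZeroU32` never needs a "what if it's zero" test, because that case was handled once at the boundary.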

There are techniques like abstract interpretation that can deduce the lower and upper bounds a value can take. I know LLVM has a value analysis that does this too; the compiler can use it to help dead code elimination (deducing that a given branch will never be taken because the value can never satisfy the condition, so the branch can be removed).
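The core idea can be sketched in a few lines (this is a toy interval domain written for illustration, not any real compiler's API): track a `[lo, hi]` range for each value, propagate it through operations, and a branch whose condition can never hold within the range is dead.

```rust
// Toy interval analysis: the simplest abstract domain used in
// abstract interpretation. Names here are illustrative.

#[derive(Debug, Clone, Copy, PartialEq)]
struct Interval {
    lo: i64,
    hi: i64,
}

impl Interval {
    // Abstract addition: the sum of two ranges is the range of sums.
    fn add(self, other: Interval) -> Interval {
        Interval { lo: self.lo + other.lo, hi: self.hi + other.hi }
    }

    // Can the condition `x > n` ever be true for a value in this range?
    fn may_be_greater_than(self, n: i64) -> bool {
        self.hi > n
    }
}

fn main() {
    // Suppose the analysis has proved `x` is always in [2, 4].
    let x = Interval { lo: 2, hi: 4 };

    // Then `x + x` is in [4, 8].
    let sum = x.add(x);
    assert_eq!(sum, Interval { lo: 4, hi: 8 });

    // A branch `if x > 10 { ... }` can never be taken: dead code.
    assert!(!x.may_be_greater_than(10));
    println!("branch reachable: {}", x.may_be_greater_than(10));
}
```

A real analysis has to handle joins at control-flow merges, widening for loops, and so on, which is where the "does not work in all cases" caveat below comes from.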

But I think these techniques do not work in all cases. Even if you invented some syntax to say "I would like this value to stay in that range", the compiler would not be able to tell in every case whether the constraint is satisfied.

If you are interested in a language with subrange checks at runtime, the Ada language can do that. But it does come at a performance cost: if your program is compute-bound, it can be a problem.
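In Rust you can emulate an Ada-style subrange with a checked newtype; this is a hand-rolled sketch (the `Dice` type and its 1..=6 range are invented for illustration), and like Ada's constraint checks, the validation happens at runtime:

```rust
// Sketch: emulating an Ada subrange ("type Dice is range 1 .. 6")
// with a Rust newtype whose constructor checks the range at runtime.

#[derive(Debug, Clone, Copy, PartialEq)]
struct Dice(u8); // invariant: value is in 1..=6

impl Dice {
    fn new(v: u8) -> Option<Dice> {
        // Runtime check, analogous to Ada's constraint check.
        if (1..=6).contains(&v) { Some(Dice(v)) } else { None }
    }

    fn get(self) -> u8 {
        self.0
    }
}

fn main() {
    assert_eq!(Dice::new(3).map(Dice::get), Some(3));
    assert!(Dice::new(0).is_none());
    assert!(Dice::new(7).is_none());
    println!("{:?}", Dice::new(3));
}
```

The difference from Ada is ergonomics, not power: Ada inserts the checks for you on every assignment, while here you pay the check once in the constructor and the type system keeps the invariant afterwards.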