> Types, after all, are a very weak ontology, i.e., you still cannot know for sure that just because your code compiles it interprets the values it's getting from other party's code correctly.
This again seems to be throwing the baby out with the bathwater. Just because types can't catch everything in some cases doesn't mean we should do without them entirely. There are myriad ways to mitigate even this: "Parse, Don't Validate" comes to mind [0], Type-Driven Development is another [1], as are runtime type checkers like Zod. This article by Kent C. Dodds is a really good example of the latter; it covers everything you're talking about regarding unknown types [2].
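To make that concrete, here's a minimal sketch of the Zod flavor of this (the schema and field names are mine, purely illustrative):

```typescript
import { z } from "zod";

// Declare the shape we expect from the other party's code.
const UserSchema = z.object({
  id: z.number().int(),
  email: z.string().email(),
});

// The static type is derived from the runtime schema, so they can't drift apart.
type User = z.infer<typeof UserSchema>;

function handleResponse(data: unknown): User {
  // parse() either returns a value that provably matches User, or throws --
  // downstream code never sees an unchecked shape.
  return UserSchema.parse(data);
}
```

The point is that the untrusted boundary is crossed exactly once, at the parse, and everything after it gets to rely on the type.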
> I would even argue that it may create a false sense of safety -- just think of the Mars satellite that crashed because some developer thought a number was in imperial units when, of course, it was in SI.
Actually, that sounds like the opposite case to me. The dev assumed the number was in imperial units precisely because its type didn't say otherwise. Had it been typed as something like `ImperialUnit unit = ...`, the mismatch might have been caught. Now, if the programmer had entered the wrong unit value, that's a different issue; no amount of typing will fix a business-logic typo. And if the satellite's software had been fully untyped, mistakes like this would likely have occurred far more often, so holding up a single failure as a damnation of the entire paradigm doesn't really work.
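A hypothetical sketch of what I mean (not the actual flight software; the names and the branded-type trick are mine), using TypeScript branded types to put the unit into the value's type:

```typescript
// Branding makes the unit part of the type, so mixing systems
// fails at compile time rather than in flight.
type Newtons = number & { readonly __unit: "N" };
type PoundsForce = number & { readonly __unit: "lbf" };

const newtons = (n: number): Newtons => n as Newtons;
const poundsForce = (n: number): PoundsForce => n as PoundsForce;

const LBF_PER_N = 0.2248;
const toNewtons = (f: PoundsForce): Newtons => newtons(f / LBF_PER_N);

function fireThruster(thrust: Newtons) {
  /* ... */
}

const reading = poundsForce(19.1);
// fireThruster(reading);         // compile error: PoundsForce is not Newtons
fireThruster(toNewtons(reading)); // explicit conversion, checked by the compiler
```

The type system can't tell you a raw number is in the wrong unit, but it can force every conversion to be explicit, which is exactly the failure mode here.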
[0] https://news.ycombinator.com/item?id=27639890
[1] https://blog.ploeh.dk/2015/08/10/type-driven-development/
[2] https://www.epicweb.dev/fully-typed-web-apps