JavaScript is weird (jsisweird.com)
65 points by night-rider on Aug 13, 2023 | 105 comments


I agree JS has a high number of quirks, but like 22 out of these 25 are "implicit type conversion is surprising".

It's surprising in almost every language that does it to any degree. JS's is only slightly stranger than, say, PHP. About the only exception is "empty values are falsy", and even that surprises people pretty frequently, and can be used to construct highly confusing snippets of behavior.


Lua has a feel very much like JavaScript, but is much less conversion-happy. It helps that there are separate operators for addition and concatenation, so you won't get 1 + "2" = "12" unless you ask for it.

Sometimes it seems javascript has a mind of its own, and turns all my numbers into strings when I'm not looking, so I have to sprinkle x = x | 0 throughout the code, like hanging garlic.
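For illustration, a minimal sketch (variable names made up) of what that `| 0` garlic actually does - the bitwise OR coerces its operands to 32-bit integers:

    let x = "4" + 2;          // "42" - concatenation snuck in somewhere
    x = x | 0;                // bitwise OR coerces to a 32-bit integer: 42
    console.log(x + 1);       // 43 - numeric addition again

    // Caveat: | 0 truncates and wraps, so it mangles fractions and large values
    console.log(2.9 | 0);     // 2
    console.log(2 ** 32 | 0); // 0 - wraps around at 32 bits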


PHP also uses different operators for addition and concatenation, but I feel that it has just as much, if not more, implicit conversion weirdness as JS.

It seems to me like those two things are orthogonal. (Not that JS wouldn't necessarily be nicer with a separate concatenation operator.)


You can convert to a number with a + before your variable.

const y = +x + 12;

Will be number addition even if x is a string.


Or via the Number constructor:

    const y = Number(x) + 12
Depending on your audience, that might be clearer in intent than `+x`.


Most are not quirks of implicit type conversion; they happen because of the "no errors, at any cost" philosophy of the language.

1 + "five" is an error in PHP, not some random correct-looking value.


In recent PHPs, yes, as they've gotten a lot more strict.

On older ones though, it's not an error: https://onlinephp.io?s=s7EvyCjg5UpNzshXMFTQVlBKyyxLVbIGAA%2C...

Even in the previous major version (7) it's only a warning.

I agree that philosophy is likely the main cause of decisions like this, but JS is hardly unique in having it.


> "no errors, at any cost"

Meaning of course "no errors reported, at any cost".

Programming-language jargon has redefined "error" - at substantial cost to programmers.


When your system gets to a certain size, this is absolutely fatal. Please halt and throw a massive exception; shrugging and carrying on makes bugs impossible to track down and the whole system unpredictable.

It's a philosophy that might be acceptable for a simple-ish frontend, but if you try to build something significant it's just awful.

Source: I inherited responsibility for a > 5m LOC system written in JavaScript. The problems aren’t all JavaScript’s fault, but it does play a major part in it.


> Please halt and throw a massive exception

+1

A case I've been struggling with recently is Fastmail mail search. Given a valid query it cannot handle, rather than show an error, it does its "best" and delivers a false result - at potentially great cost to the unwitting user. Is there even a warning on the doc page? No.


> 1 + "five"

Forget JavaScript, that's not even an error in Java.


Yes, Java has this one glaring inconsistency in its type system. The interesting thing is that the addition operator behaves somewhat unpredictably precisely because the language designers insisted that custom operators reduce a language's predictability.

But anyway, it's much less bad in Java, because you can force an error by declaring what type of value you want back.


No, JS is not weird, your code is weird.

true + ("true" - 0) ?

"" && -0 ?

Like, can you tell me a language where it does make sense to "apply AND to an empty string and a (negative?) zero"?

If you write code like this, you have much bigger issues than whatever you choose to write your code in.


> No, JS is not weird, your code is weird.

My code has bugs occasionally, because I'm a human. Which is why I would like my programming language to return errors when I make obvious mistakes.

> Like, can you tell me a language where it does make sense to "apply AND to an empty string and a (negative?) zero"?

No I cannot, which is why I can't imagine why JavaScript wouldn't return an error for this situation.


Because JS is old and wasn't well designed in the beginning. Its goals include backwards compatibility, so it's like the old Windows API: it won't change any time soon, or ever. Either use a strict subset or use TypeScript.


Sure, except...

How does one enforce a strict subset with an easy setup? Why are people writing non-browser code in this horrible language, often without strict subsets or typescript?

Obviously there are solutions, but the solutions clearly aren't pervasive enough because the problem is still here (and growing, as more JS code is written), so we as an industry still need to keep talking about how problematic and poorly-designed JS is.


> How does one enforce a strict subset with an easy setup?

The old solution was good books like JavaScript: The Good Parts. The new solution would be TypeScript.

It's unlikely the API will get deprecated no matter how you talk about it. It's the Windows of the internet. Any sane shop would use TypeScript. You don't even need to write TS; you can just let the compiler do the hinting for you.
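For example, a minimal sketch (file and function names made up) of letting tsc or your editor check plain JS via a `// @ts-check` comment plus JSDoc annotations, without actually writing TypeScript:

    // math.js - still plain JavaScript, no build step needed for the checking itself
    // @ts-check

    /** @param {number} x */
    function addTwelve(x) {
      return x + 12;
    }

    addTwelve("5"); // flagged by the editor/tsc: a string is not assignable to a number parameter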


> It's unlikely the API will get deprecated no matter how you talk about it.

Deprecation isn't what I want. Equivalent alternatives are what I want.

I still can't do <script type='text/typescript'>. Why?

And look, TypeScript is also hampered by some of JavaScript's bad choices. Actor-model concurrency would be a really good way to handle events and parallelism to boot, but we won't get that in any language that compiles to JS.

WASM isn't an alternative, because you can't use WASM without JS.


> I still can't do <script type='text/typescript'>. Why?

Because typescript is not javascript. Google already tried that with Dart, and they failed. Typescript is still nice-to-have, it's not replacing javascript. MS is not trying to replace JavaScript anyway. If your goal is to have JavaScript replaced, that's fine, but I don't see how it succeeds. If you want to work on a TS codebase on the backend, there is Deno.


>> I still can't do <script type='text/typescript'>. Why?

> Because typescript is not javascript. [...] Typescript is still nice-to-have, it's not replacing javascript.

+1. TypeScript compiles to JavaScript, so it can't replace something it compiles to. Besides, text/typescript isn't even a valid MIME type. The text type is meant for plain-text documents (.txt, .md, etc.), so TypeScript would have been under the application type, following JavaScript (application/javascript). Getting TypeScript into a web browser would be too much work. For starters, they would need to take the TypeScript compiler, slap it into a web browser, where it would compile the TypeScript code and pass the result on to the JavaScript engine, which would finally hand the result to the browser. What's wrong with this? The code has to undergo two compilation layers, and, most importantly, the TypeScript compiler in the web browser can be out of date with mainline TypeScript releases. Your best bet right now is using a web framework with TypeScript support.


I think that it can be both. Yes, weird code will produce weird results but in my opinion, the language should at least help one find the weird code.


TypeScript is great for this, among other things.

It actually aids in writing faster (more JIT-friendly) code by alerting you to these implicit type mutations/coercions.


That wasn't the point of this exercise - nobody is saying that these are actual code examples you'd see in real life.

The point is that in lots of languages these statements would simply result in compile time errors. In JS, though, the type coercion rules are so loose (and in many cases extremely non-obvious) that you can see this head-scratching behavior.

It's the same underlying behavior that leads to very cool/very insane stuff like JSFuck: https://jsfuck.com/


No, JS is weird. You wouldn't write code like that exactly, you'd write "foo + (bar - baz)". Then a few refactors later, you end up with slightly different types for Reasons, then prod starts blowing up and you have the fun of tracking down why.

This weak typing is confined to a few languages like JS and PHP. Python is just as dynamic, but will at least throw an error if you try to accidentally add a string and a number. Static languages catch this even earlier.


Sure, I'll try. Imagine we had two functions already defined:

  get_current_username: Returns a nonempty string iff user is logged in.

  get_balance: Returns a user's "balance" of some sort of credit or currency. Negative balances are possible.
In Python, where "falsiness" is very clearly defined by the language spec, it would be idiomatic to write:

  should_render_balance = bool(get_current_username() and get_balance())


so you get the exact same answer?

in js, Boolean('' && 0) will give you the same answer as bool('' and 0) in python.

if i code golf in python (like the article does in js), i'll also get weird-looking code and "unexpected" outputs.

valid python: 0*"string"or"other_string"


Sure, I was just responding to the question:

> Like, can you tell me a language where it does make sense to "apply AND to an empty string and a (negative?) zero"?

People do stuff like that in Python all the time. Personally, I don't do it in JS, though, because I find the conversion-to-bool rules much less predictable. In this case, though, I guess JS would behave the same.


js being a dynamic language on the web with very "helpful" coercion logic, it is much easier to unintentionally perform such an operation than a non-js developer might expect

typescript helps, but i have fired some of these footguns even with typescript.


Yes, I was about to suggest that instead of "JS Is Weird", the site would be more appropriately called "My inputs are weird".


Indeed, nobody would write this code to get the result shown— weird code is usually an accident, and errors informing you about accidents are way more useful than delivering some weird counterintuitive result just to avoid runtime errors.


The top one is... kinda crazy; the second one is very plausible: `if (name && age) { saveUser() }`.


> 0.2 + 0.1 === 0.3

This isn't just javascript though. It's harder to find a language for which this would be true than the other way around.


Yeah, this is an IEEE 754 floating point issue.

I'm not sure why there seemingly hasn't been any progress on making floating point types less error-prone in the last few decades.


Rounding errors are unavoidable when representing an infinite amount of numbers in a fixed number of bits. But in many cases a decimal type (like in C#) would have more intuitive behavior than binary floating point. A decimal type has been proposed for JavaScript, but I don't know how far along it is.
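Until a decimal type lands, the usual JS workarounds are tolerance-based comparison or doing money math in integer cents - a quick sketch:

    0.1 + 0.2 === 0.3;                             // false, in JS and most other languages
    Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON;  // true - compare with a tolerance instead

    // For currency, integer cents sidestep the problem entirely
    const totalCents = 10 + 20;                    // 30
    (totalCents / 100).toFixed(2);                 // "0.30" for display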


And typeof NaN == "number" is also part of the standard. It is valid to pass NaN to functions that accept numbers, and it's expected to either propagate or collapse back to normal numbers (e.g. NaN^0 = 1).
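A few concrete cases of that in JS (note that `^` is bitwise XOR in JS, so the exponent case is spelled `**` or `Math.pow`):

    typeof NaN;          // "number"
    NaN + 1;             // NaN - it propagates through ordinary arithmetic
    Math.pow(NaN, 0);    // 1 - one of the few operations that absorbs NaN, per IEEE 754
    NaN ** 0;            // 1 as well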


If anything, JS has plenty of problems due to its too literal interpretation of NaN.


I mean, it is a weird statement in that the language apparently needs a === operator.

Of course, after looking up what === is, the behavior is obvious.


That wouldn't change anything here. This behavior is due to the IEEE 754 number spec, which is implemented in many languages.


The floating point behavior is the obvious part, the === operator is the head-scratcher.


> This website is literally about JavaScript. I mean what did you expect, a .NET application? This website is 99.9% poorly optimized and highly questionable JS. And yet, you have JS turned off.

I kinda expected text and code examples, on a blog. I forgot to check that this is a dedicated website about the weirdness of javascript.


My favorite thing about js that I recently learned is this:

    var arr = [];

    arr[2] = 2;
    // now arr is [empty, empty, 2]

    arr[-4] = -4;
    // now arr is [empty, empty, 2, '-4': -4]

    arr.length; // is 3

    arr["foo"] = "bar";
    // [empty, empty, 2, '-4': -4, foo: 'bar']


> // now arr is [empty, empty, 2, '-4': -4]

I'm not sure because I'm not at home, but did you verify this pseudocode? Pseudocode doesn't have to be valid syntax, that's the point of course. But it should bring the point across regarding the behavior of actual programs, no?

AFAIK, unquoted colons inside of array literals in JS are illegal.

I'd understand it better if you had written sth like

  {"0": undefined, "1": undefined, "2": 2, "-4": -4}
It is fairly clear that you are opting out of all optimizations applicable to what

  Array.prototype
is meant for in JS. The distinction between "empty" (in this special case, sparse arrays) and the primitive value "undefined" is similar to the difference between non-declared and undefined variables when not using const/let — which everyone should do IMO, along with ES Modules instead of legacy node- or bundler-specific module syntax. Also, using var in 2023 only makes sense to me in two cases:

- REPL contexts, where you may want to redeclare identifiers other than functions defined using the function keyword

- very legacy browser targets, < latest IE11

Sparse arrays are, just like var, a historic relic.

I have never encountered any code where sparse arrays were intentionally used or appeared at all.

Arrays are objects in JS, yes. You can construct a sparse array using

  Array(5)

rather than Array.from({length: 5}), which actually yields a dense array filled with undefined. "Sparse arrays" are generally confusing and seldom desirable.

And JS is honest that it's an interpreted language where "arrays" are not even required to have a uniform member data type - though keeping types uniform is of course very advisable, for reasons that go beyond code readability.

But let's be honest: is, say, PHP any clearer in this regard?

Personally I hate PHP's mishmash of arrays and maps way more than the JS version. Both are syntactic sugar for memory-managed and heavily optimized "map"-like objects where the keys are restricted to primitive types.

Personally, I like that JS at least has a usable notion of strict equality and makes it easy to understand when values are references (always, unless they are primitive).

PHP's copy-on-write behavior, reference operators and the mishmash of "associative" and "indexed" arrays are much harder to actually understand and bigger footguns IMO.

JS has massive quirks, but what you show is not surprising for anyone doing JS with a modicum of proficiency.

JS is a scripting language.

People tend to forget this because of fast JIT compilers and because of TypeScript, I guess.

Sparse arrays are a relic and the differences between your "empty" and the actual primitive value "undefined" in JS are not of any practical significance in my opinion.

It's really a wart, but not one of practical relevance.

At least this differentiates this from "problems" like

  ["1","2","3"].map(parseInt)
that are really just a lack of basic syntax understanding.


>// now arr is [empty, empty, 2, '-4': -4]

This isn't pseudo-code. It's what your browser will output as the value for arr

What will seem weird and unexpected in this is that you're essentially dealing with dictionaries rather than arrays. They just have a length attribute bolted on that counts the highest key number as a length.


Yes, that's what it is, and I'd still agree with you that this is one of the really interesting warts/quirks of JS. Sorry if my comment sounded antagonistic; that wasn't intended, which is why I was editing some of it before reading your reply.


Highest key number + 1


Who would ever code like this? Those kinds of quizzes can be made for any language.

So tired of those smug "hurr durr language X hehe" posters


All practical languages have quirks, but JavaScript has significantly more quirks than the average language.


    parseInt(0.00005)

    // 0
    parseInt(0.000005)

    // 0
    parseInt(0.0000005)

    // 5
    parseInt(0.00000005)

    // 5
Huh??!


0.00005 -> "0.00005" -> 0

0.00000005 -> "5e-8" -> 5
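Spelled out, the hidden step is the number-to-string conversion that happens before parsing:

    String(0.00005);       // "0.00005" - parseInt reads the leading "0" and stops at "."
    String(0.0000005);     // "5e-7"    - exponential notation kicks in below 1e-6
    parseInt(0.0000005);   // 5 - parsing "5e-7" stops at the first non-digit
    parseInt("5e-7", 10);  // 5, for the same reason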


^ because: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

> The parseInt() function parses a string argument and returns an integer of the specified radix (the base in mathematical numeral systems).

Surprisingly, if you put garbage in, you get garbage out.


The garbage obviously being the language's std library.


Nah, just type coercion.

Why would you parse a float? Parsing is for strings. Converting a float to an int is done with stuff like `Math.floor`.
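If the goal really is float-to-int, a quick comparison of the intent-revealing options:

    Math.floor(4.7);   // 4 - rounds toward -Infinity (Math.floor(-4.7) is -5)
    Math.trunc(4.7);   // 4 - drops the fraction (Math.trunc(-4.7) is -4)
    Math.round(4.7);   // 5
    parseInt(4.7);     // 4, but only by accident of string conversion - avoid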


> Why would you parse a float?

You wouldn't, which is why it should produce an error.


Which would be changing the paradigm away from implicit coercion, yes.

That would make it a different, entirely hypothetical, ill-defined language. A very clear strawman that isn't particularly worth fighting.

It would quite possibly be a better language, but it's not a thing you "just do" to solve footguns - it has pretty big effects on the language and ecosystem.


Yeah, that’s terrible behavior for such a critical function like parseInt. JS apologists can legitimately protest some of these examples as being highly unrealistic or just normal type coercion/floating point behavior, but this one here is an unforgivable bug factory.


Slightly challenging one. Find the potential bug.

   const arr = [1, 2, 3, 4, 5]

   for (let i = arr.length - 1; 0 < i; i--) {
      const j = Math.floor(Math.random() * (i + 1))
      [arr[i], arr[j]] = [arr[j], arr[i]]
   }


Not potential:

  Uncaught ReferenceError: can't access lexical declaration 'j' before initialization
Semicolons aren't that bad.
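For completeness, a sketch of one fix - terminate the statement before the line that starts with `[`:

    for (let i = arr.length - 1; 0 < i; i--) {
      const j = Math.floor(Math.random() * (i + 1));  // <- this semicolon is the fix
      [arr[i], arr[j]] = [arr[j], arr[i]];
    }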


> In general, if a statement begins with (, [, /, +, or -, there is a chance that it could be interpreted as a continuation of the statement before, if a semicolon isn't provided.

While /, +, and - are rare at the beginning of a statement, ( and [ aren't.


That list is missing ` from ES2015+ template literals.

I enjoy teaching this to junior developers as the "winky frown rule" because that is a silly name that sticks in your memory and does help you remember it. Lines that start with an emoticon frown must wink:

    ;(    ;[    ;`


My favorite weird JS thing:

['10', '10', '10', '10', '10'].map(parseInt)


But even that isn't weird, since parseInt() takes two arguments (string and radix) and map expects a function value that receives three arguments (value, index, and the whole array). It's not so much "weird" as it is "I didn't read the docs".
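Spelled out with the actual arguments, the call expands to:

    ['10', '10', '10', '10', '10'].map(parseInt);
    // is effectively
    [parseInt('10', 0), parseInt('10', 1), parseInt('10', 2), parseInt('10', 3), parseInt('10', 4)];
    // => [10, NaN, 2, 3, 4]  (radix 0 falls back to base 10, radix 1 is invalid)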


It's weird because someone coming from a functional language like Haskell would think that the anonymous function in the following code is redundant, while in fact it's necessary:

    ["10","10","10","10","10"].map(x => parseInt(x))


It’s nothing to do with the language being functional or not. It’s because JS supports optional arguments. Haskell doesn’t support optional arguments which is why this is impossible in Haskell, but that is a huge trade off since optional arguments make it easy to evolve an API over time without causing breakage.

It’s a known good practice in JS to always wrap callbacks with a lambda in this situation. Even if you don’t, usually the type checking will catch it, this bug only works because the second argument is a number to both functions.


> optional arguments make it easy to evolve an API over time without causing breakage

Defining functions with new names also makes it easy to evolve an API over time without causing breakage.


It does tend to make code less readable however if you end up with several functions that do the same thing except that they were defined at different times with different numbers of arguments. It’s not always clear from the perspective of the API user whether the difference is just the number of arguments or if there are other differences too. Optional arguments communicate that intention more explicitly.

Looking at, e.g. the Python standard library a large number of functions have arguments that were added after the function was first defined. If Python didn’t have optional/keyword arguments then the number of API calls would be substantially larger.


> It’s not always clear from the perspective of the API user whether the difference is just the number of arguments or if there are other differences too. Optional arguments communicate that intention more explicitly.

They do, but potentially falsely.

> Looking at, e.g. the Python standard library a large number of functions have arguments that were added after the function was first defined. If Python didn’t have optional/keyword arguments then the number of API calls would be substantially larger.

Haskell's inability to have optional arguments isn't too bad, but all in all, I have to concede that you have a fair point. I personally don't miss optional arguments, but I can see how they're beneficial (and I would really like named arguments).


It's weird for multiple reasons unique to JavaScript:

1. Functions can be called with the wrong number of arguments.

2. map() passes multiple arguments to the callback.


oh boy, I haven't seen that one before. that's awesome - so insane looking at a glance


My favorite which wasn’t included is

NaN === NaN -> false

In fact that’s exactly how isNaN(x) is implemented under the hood. It compares the argument to itself.
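A sketch of that self-comparison trick (it matches Number.isNaN; the global isNaN coerces its argument first, so it behaves slightly differently):

    const myIsNaN = x => x !== x;  // NaN is the only value not equal to itself

    myIsNaN(NaN);      // true
    myIsNaN(0 / 0);    // true
    myIsNaN("foo");    // false - whereas the global isNaN("foo") coerces and returns true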


This is from the IEEE 754 spec. The rationale is explained at the link below by a member of the board at the time.

https://stackoverflow.com/questions/1565164/what-is-the-rati...


And as can be seen from the weak provided rationale, the IEEE-754 spec rather sucks. Javascript would probably have been a better language had it diverged from it.


As another comment points out, this isn’t JavaScript related, it’s just how the semantics of NaN in floating point works.

Because, conceptually, 0/0 is not equal to -inf + inf. NaN is not a semantic value (it is a lack of one), so you can’t do semantic comparisons with it.


That's true in most languages.

The opposite is almost more surprising.


The only language I can think of that does the right thing is Eiffel (https://bertrandmeyer.com/2010/02/06/reflexivity-and-other-p...)


Someone should write a book about this [0] and a tool to automate checking your JavaScript code [1].

[0]: https://www.oreilly.com/library/view/javascript-the-good/978...

[1]: https://www.jslint.com/

I'm working on a book called "How to not get your knickers in a twist because you neglected to learn from people who came before you."


this is the reason I pick a language OTHER than js for the backend. Why oh why did the whole world decide Node.js server-side JS was a good direction? For the frontend I get it: unless you are going to mess with WASM files, you need JS.


Cause people wanted to write their front-ends and back-ends in the same language, for various reasons (code sharing, economics of hiring developers, etc).

My issue with JS (particularly but not exclusively server-side) is less that it has some weird aspects and more that its built-in functionality/standard library is incredibly barebones in many respects. IMO that’s the main reason why it’s considered normal in the JS world for a project to pull in hundreds/thousands of largely un-vettable dependencies.


> Why oh why did the whole world decide Node.js server-side JS was a good direction?

Knowledge transfer, context switching, the stuff you already know. But there's also an interesting non-JS reason that's been forgotten because the serverside landscape has changed so much since the early 2010s. Node very heavily popularized single-threaded asynchronous request handling (i.e., not starting up a thread for every request that came into the server).

Of course there is nothing about Javascript that means it specifically as a language is essential for that pattern, but I remember hearing talks from Netflix employees and industry advocates about how much Node had improved their performance; I remember industry people coming to my college and talking about single-threaded serverside code and saying "you all need to check Node out."

And there are downsides to asynchronous code (callback hell was the obvious downside that got brought up a lot before `async` or even promises had arrived in the language) -- but I remember among especially younger serverside developers this trend of people saying, "Node is just faster. That's why you use it, because it's just straight-up more performant than what you're currently doing." And I understand serverside devs from other languages can at this point jump in and say that it's not, that the single-threaded model is possible in other languages, that there are downsides to the single-threaded model, that Node doesn't do it well, whatever. I'm not arguing the point, I'm just describing the cultural shift I saw.

All of the conversations about sharing code happened too, but I remember having conversations where people talked to me about jumping to Node specifically because Express was single-threaded and because it could handle more concurrent requests than other popular server architectures that they were familiar with. And I suspect that it gave the ecosystem a lot of momentum.

Node's original purpose was in part an experiment to see if single-threaded async request handling would be able to handle large numbers of simultaneous requests. My somewhat-limited memory of the conversations that were taking place back then is that running the same language clientside and serverside was just the icing on the cake. The big reason people were excited was because Node was single-threaded.


> single-threaded serverside code and saying "you all need to check Node out."

When Node started to become a thing, Java and C# had already been doing multi-threaded web servers for about a decade.

Oh, and C# already had support for async programming too!

Node become popular in part because Silicon Valley just doesn't seem to be aware of anything that exists outside it.


> Java and C# had already been doing multi-threaded web servers for about a decade.

Node was single-threaded async, not multi-threaded. The entire point was to not spin up multiple threads and to rely on the event loop instead.

> Oh, and C# already had support for async programming too! Node become popular in part because Silicon Valley just doesn't seem to be aware of anything that exists outside it.

> And I understand serverside devs from other languages can at this point jump in and say that it's not, that the single-threaded model is possible in other languages, that there are downsides to the single-threaded model, that Node doesn't do it well, whatever. I'm not arguing the point, I'm just describing the cultural shift I saw.

Right or wrong, you can go back and look at forum posts even only 6, 7 years ago and you'll see people asking about single-threaded performance and asking how Node manages to scale if it's single-threaded. A lot of developers at that time got introduced to single-threaded event-based asynchronous server programming through Node.

I'm not telling you that Node should have won, I'm telling you one of the reasons why it happened to win. Here's Dahl's original slides presenting Node in 2009: https://s3.amazonaws.com/four.livejournal/20091117/jsconf.pd...

Immediately the first thing he's talking about is not unity between the front-end and backend language or sharing code. It's "I/O needs to be done differently." That's what people were talking about back then.


Edit: here's the actual presentation, not just the slides: https://www.youtube.com/watch?v=ztspvPYybIY

What's interesting about this talk is that Ryan doesn't even mention Javascript at all (other than briefly to say that NodeJS is built on it) until nearly 13 minutes into the presentation. And it's clear that he's using Javascript as a means to an end - he's going with Javascript largely just because Javascript by default has a single-threaded event loop that every JS programmer has had to learn to work with.

So he brings up going with JS because JS programmers are used to callbacks already -- and then roughly 2 minutes later he immediately moves back to talking about I/O and request handling again. Very little of his presentation is about Javascript beyond an aside of "JS already does this and JS programmers are used to it, so we might as well go with that and try to copy JS conventions." He almost spends more time talking about common SQL libraries than he spends talking about JS.


The website is 90% problems of very stupid implicit type conversion, and 10% valid floating-point behavior that exists in other languages too (0.1 + 0.2, NaN being a number).

I agree using plain JS anywhere is not a good idea, but TypeScript (or JS with TS annotations) is often ignored, even though it does a very good job of catching most of these issues while offering a very flexible type system in a more mainstream paradigm.


I personally do not know much about JS other than some light projects, but I have worked on an application that was C# both front and backend and the productivity was higher than anything else I've worked on. Using the same language, same tools, same developers, same job interviews was great. I'm thinking of doing back end JS for this reason.


I think Microsoft's web stack is underrated, and was perhaps a factor there as well. I've seen unusually fast products written in the "dreaded" ASP.NET. Visual Studio is a much more powerful IDE than vim/VS Code, and IMO C# is a much more scalable language than vanilla JS. TypeScript bridges the gap pretty well.


This is the reason you don’t pick JS for the backend? I’ve been using JavaScript professionally for over a decade now and I’ve literally never needed to write +!![] or “” - - “” or even a single one of these esoteric edge cases. (Except the floating point weirdness, which is an issue in basically every language.)


No, I mean the fact that JS was invented in 1995 and isn't well designed because we didn't know then what we know now. I'm a professional Golang developer, and when writing Go it just feels obvious: wow, this language was well thought out and fixed a lot of the mistakes from 1995. (In a way that TypeScript doesn't even come close to.)


I really like Go, and I tend to prefer it to JS for server-side stuff, but when I use it there are a lot of things I miss about JS:

- Error handling is onerous; I write a lot of functions where most of the lines are just checking for and returning `err`

- No anonymous structs, and it’s really awkward to define structs inline

- Dealing with collections is really verbose; I guess since generics dropped I could handle this in userland, but it’s not idiomatic

etc etc. No language is perfect!


I think you might like my "go on rails" framework:

https://many.pw/sd/

I make heavy use of map[string]any vs structs. I also find the right balance of checking err vs just making it _


JS was better designed than people give it credit for or it wouldn't have lasted anywhere near as many decades without being replaced by something else.

JS has also had a lot of its "mistakes" fixed since 1995. It does so in a backwards compatible way and it isn't always obvious how much progress has been made, but it has never been a static unchanging, "unfixed" language.

(Personally, I'd much rather work with JS on the backend than Golang. Golang looks to me like a throwback to the 1970s that misses good language design ideas from the 1990s. I understand some of why it has become popular, but there's so much bad design in that language that it turns my stomach.)


> this is the reason I pick a language OTHER than js for the backend.

You choose against a language because of code that any sane person, which I hope includes yourself, would never write intentionally?

Weird flex, but ok.



Node.js is popular precisely because browsers are popular and JavaScript is the language of web browsers.

Reusing code, tooling, training is meaningful.

---

Every language (Java, Ruby, Go, Bash, whatever) has quirks. JS may have more than others, but it's not alone.


The fact that JS can be this stupidly wacky yet still be so enjoyable to work with that nodejs is as popular as it is says more about other languages than about JS, I think.


Are you sure people are using it because they enjoy using it?


Yes, I'm pretty sure. In my and my friends' experience at least. As long as the codebase is of manageable size it's one of the most laid back languages out there, letting you get shit done without having to write pages of boilerplate or being lectured by a compiler. The level of abstraction is just right for the current hardware too.

I'll probably get flamed for saying this, but having all objects be hash maps that can be extended or edited at any point and be simultaneously dumpable and loadable into JSON with one line was an unfathomably big brained move. Being able to just use the data you get from various sources instead of having to create an ad-hoc struct/object that perfectly matches all types and values (and ofc crashes immediately if a change is made on the other end) is so underrated. That's not even mentioning the native async support.
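The one-liners in question, for what it's worth (object contents made up):

    const user = { name: "Ada", tags: ["admin"] };
    const wire = JSON.stringify(user);   // '{"name":"Ada","tags":["admin"]}'
    const back = JSON.parse(wire);       // a plain object again
    back.lastSeen = Date.now();          // extend it on the fly, no struct definition needed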


Like TypeScript, which is an amazingly good language with an absurdly powerful type system that is easier to learn than competing absurdly powerful type systems!


Do they still call it "isomorphic"?



no no no, python is weird


But no mention of "dog".sup()?


Very much why I write my front ends in Elm. It might be a PIA, but it shields me from quirks like this.


Nice! So what is the result of 0.2 + 0.1 in Elm?


Same as it is in javascript, python and julia: 0.30000000000000004, but I fear I'm stepping into quicksand by answering.


I was just curious how Elm protects you from quirks like the 0.1 + 0.2 != 0.3 example.


Get a linter



