The TS type system is incredibly powerful, and I wish I could apply it to other languages out there. The biggest annoyances come from the fact that it isn't its own language: you get runtime errors due to incorrect or out-of-date type definitions (often maintained by people other than the library authors). Also, `any` and `unknown` types tend to sneak in via default generic arguments, vague external module typedefs, etc. These types can spread through your codebase, preventing the type checker from doing its job, and `strict` mode doesn't do much about it!
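To make the "sneaking in" concrete, here's a minimal sketch that compiles cleanly even under `"strict": true`; `ApiResponse` and `fetchUser` are hypothetical names standing in for a library's typings:

```typescript
// Both of these compile without errors under "strict": true in tsconfig.json.

// 1. A built-in escape hatch: JSON.parse is declared to return `any`.
const config = JSON.parse('{"retries": 3}'); // inferred as any
const retries: number = config.retrise;      // typo, no compile error, undefined at runtime

// 2. A defaulted generic argument (common in library typings) does the same.
interface ApiResponse<T = any> {
  data: T;
}
declare function fetchUser(): ApiResponse;   // T silently falls back to any
const user = fetchUser().data;               // user: any
console.log(user.nmae);                      // another typo the checker can't catch
```

Once `config` or `user` gets passed into other functions, the `any` keeps propagating, and the checker stays silent the whole way.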
Generally speaking, I have to be extremely careful when working on a JS/TS project. I use the strictest compiler settings possible and enable as much linting as I can. I also use the type-coverage module and fail the build if the `any` type shows up anywhere.
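Roughly, the CI gate looks like this (a sketch; double-check the type-coverage README for the exact flags, but `--at-least` and `--strict` are the ones I remember using):

```sh
# Type-check with the strictest compiler settings, then lint,
# then fail the build if any expression ends up typed as `any`.
npx tsc --noEmit --strict
npx eslint .
npx type-coverage --at-least 100 --strict
```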
As another general complaint, NPM doesn't do enough to incentivize its users to release quality software. JS/TS has always been very trend-based and social, and even just running a few simple lints and prominently displaying a score on NPM would probably really help the community improve its standards.
Take pub.dev (Dart), for example. Here's a module page: https://pub.dev/packages/firebase_core. The score is displayed prominently, and if you click on it you get a full breakdown of how it was calculated: static analysis, up-to-date dependencies, and documentation (although the documentation coverage is not 100% for this module, so I think it's wrong to award full points there). When you upload to pub.dev, you want a score of 100! It doesn't guarantee that the module is fantastic software, but at least the basics have been taken care of.
In my opinion, the most important piece missing from the pub.dev score is test coverage. Take, for example, a page from MetaCPAN (Perl packages). Here's a module I've released: https://metacpan.org/pod/Algorithm::AM. On the left you can see the automated test results, including test coverage, submitted by volunteers who downloaded and tested the code on a variety of platforms and Perl versions. There's also a link to a "kwalitee" page, which is similar to the pub.dev score (though it could be much improved by computing an actual score and displaying it prominently on the main module page).
Now an NPM module: https://www.npmjs.com/package/react. What info do we have about the quality of the package? Essentially just community activity (open PRs and issues, download counts). We have no idea how well it's tested, what kind of static checks have been performed, whether your editor will be able to show documentation or type information when you hover over a method from the library, etc. (TypeScript is mainstream enough now that it would be reasonable for NPM to give it a bit of special handling.) Most third-party libraries are from individual authors and probably have very little visible activity. When deciding whether or not to use them, right now you need to dive through the code to check whether the basics have been taken care of. Automating some of that would save a lot of time.