You lose the incremental builds of modern bundlers/transpilers, which makes builds too slow for bigger projects. People nowadays expect to see changes almost in real time via hot module replacement etc.
I would say it's partly because of the design of make.
A Makefile consists of separate commands¹ and is heavily file-based¹, which makes it rather slow and doesn't allow keeping state (in memory) between re-runs. Traditionally this wasn't a bottleneck, because compiling C code was a relatively slow operation, but modern web development tools prefer to work with in-memory streams instead of invoking executables for tiny files on disk.
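For what it's worth, the file-based model looks roughly like this — a sketch, assuming Babel as the transpiler (the `src/`/`build/` paths are made up for illustration). Each changed source file spawns its own process, which is exactly the overhead being discussed:

```make
# Per-file transpile rules: make compares timestamps and only
# rebuilds the outputs whose sources changed.
SRC := $(wildcard src/*.js)
OUT := $(patsubst src/%.js,build/%.js,$(SRC))

all: $(OUT)

# One process launch per changed file; assumes the Babel CLI is installed.
build/%.js: src/%.js
	mkdir -p build
	npx babel $< -o $@
```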
I'm not saying you cannot do these things with make, but in the NodeJS/web ecosystem it just doesn't feel as natural/flexible as the "native" NodeJS-based toolchain.
[1] Great for interoperability, but sometimes a more tailored solution is worth it.
Launching a process on Linux only takes about 2ms, and make only needs to do it for the parts that changed. There is no reason you couldn't use a make-based system to get incremental rebuilds with under 100ms latency, which is about as good as most people get with native JS systems.
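You can sanity-check the per-process figure yourself — a rough sketch (the iteration count is arbitrary, and the result varies by machine and kernel):

```shell
# Spawn /bin/true 200 times and report the average per-process cost.
# date +%s%N (GNU coreutils) gives nanoseconds since the epoch.
start=$(date +%s%N)
for i in $(seq 1 200); do /bin/true; done
end=$(date +%s%N)
avg_us=$(( (end - start) / 200 / 1000 ))
echo "average per-process overhead: ${avg_us} us"
```

On typical hardware this lands in the low single-digit milliseconds per process, consistent with the ~2ms claim.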