That single line of code may be using a slow abstraction, may not cover edge cases, may not cache reused values or optimize for the common path, or have any number of other issues. The result is code that's slower, more fragile, or sometimes doesn't even solve the problem it's meant to solve.
More often than not, performance and robustness come at a significant increase in the amount of code you have to write in high-level languages... performance optimizations especially.
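To make that concrete, here's a made-up Python example (purely illustrative, not code from anything mentioned here) of the same task written as a one-liner and as "more code":

```python
# Hypothetical illustration: find the duplicate values in a list.
from collections import Counter

def duplicates_short(items):
    # One line: easy to read, but items.count() rescans the whole list
    # for every element, so this is O(n^2) and returns repeated entries.
    return [x for x in items if items.count(x) > 1]

def duplicates_long(items):
    # More code: counts each element once (O(n)), caches those counts,
    # skips the work entirely when duplicates are impossible, and
    # reports each duplicate only once.
    if len(items) < 2:            # fast path: nothing to compare
        return []
    counts = Counter(items)       # reused values computed once
    return [value for value, n in counts.items() if n > 1]
```

The short version is "less code", but the longer one is the one you'd actually want on a large input.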
A high-performance parser I was involved in writing was nearly 60x the code (~12k LOC) of the smallest solution you could write (~200 LOC), but also several orders of magnitude faster. It covered more edge cases and could short-circuit to more optimal paths during parsing, boosting performance for common inputs that had code optimized just for them.
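The "short-circuit to a more optimal path" pattern, in a deliberately simplified toy sketch (not that parser, just an illustration of the idea):

```python
# Hypothetical sketch: take a cheap fast path for the common case,
# fall back to the general machinery only when needed.
import csv

def split_csv_line(line: str) -> list[str]:
    # Fast path: the overwhelmingly common case of a line with no
    # quoted fields can be handled with a plain split.
    if '"' not in line:
        return line.rstrip("\r\n").split(",")
    # Slow path: quoted fields, embedded commas, escaped quotes, etc.
    # fall back to the full CSV parser.
    return next(csv.reader([line]))
```

That's extra lines and extra branches, and it's exactly what makes the common case fast.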
"More lines of code = slower"
It doesn't. That's a fundamental misunderstanding of software engineering and flawed in almost every way, to the point of being an armchair statement. Often the flaw is even objectively provable...
Hard agree.
Less code is not a positive metric to measure your implementation by, and "it's less code" is not a justification on its own. Increased complexity (again, LOC is not an indicator of complexity), tanked performance, and a harder debugging experience are common results of that mentality. All things that make software worse.
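For example, a dense one-liner versus an expanded version of the same thing (again, a made-up illustration):

```python
# Hypothetical example: same behaviour, very different debugging experience.
def long_words_dense(docs):
    # One expression: nothing to breakpoint, nothing to inspect halfway.
    return sorted({w for d in docs for w in d.lower().split() if len(w) > 3})

def long_words_debuggable(docs):
    # More lines, but every intermediate value has a name you can log,
    # assert on, or look at in a debugger when something goes wrong.
    words = set()
    for doc in docs:
        tokens = doc.lower().split()
        for token in tokens:
            if len(token) > 3:
                words.add(token)
    return sorted(words)
```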
Not all one-liners are bad, of course; that's not the argument I'm making. It's about the mentality that less code is automatically better, where poor decisions get made on a flawed premise.