I see what you mean here. I think part of the problem is online distribution and the sheer breadth of tech makers these days:
The one point that really struck home for me is that games makers can't quality test for every computer build. I have an AMD FX CPU, a Sapphire graphics card, a Gigabyte motherboard, Corsair RAM and Crucial SSDs. So let's say the developer tested this particular rig and got the game working really sweetly on it. But then I swap out that Corsair RAM for some Kingston, and I've invalidated that set of tests. And what if I swap the SSDs for an HDD instead, for more storage? The motherboard, the graphics card... With my one computer alone, there are probably a thousand close matches and a million combinations I could have put together.

Even consoles can have multiple component sources in the same box, with some games actually playing better on consoles built in the same factory but with different brands' internal bits. I can't recall the game, but on the PlayStation 2 it worked fine on all but one specific model, which just bugged out on it.
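Just to put some rough numbers on that combinatorial explosion: here's a quick back-of-the-envelope sketch. The component counts are made-up assumptions, not real market data; the point is only how fast the test matrix grows when you multiply options per slot.

```python
# A rough sketch of the combinatorics, using made-up component counts.
from math import prod

# Hypothetical number of common options per component slot -- these
# figures are illustrative assumptions, not real market data.
component_options = {
    "cpu": 30,          # various AMD FX / Intel models
    "gpu": 40,          # Sapphire, EVGA, MSI cards...
    "motherboard": 25,  # Gigabyte, ASUS, MSI boards...
    "ram": 20,          # Corsair, Kingston kits...
    "storage": 15,      # Crucial SSDs, spinning HDDs...
}

# Total distinct builds is the product of the options per slot.
total_builds = prod(component_options.values())
print(f"Distinct builds to test: {total_builds:,}")  # 9,000,000
```

Even with those conservative made-up numbers you're at nine million distinct builds, and that's before drivers, OS versions and overclocks get involved. No QA team is testing all of that.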
That's one argument for the suck-it-and-see approach that seems to be happening...
But if a game is released part finished or not fully realized, no, I don't see any reason for that. It's almost like a "Minimum Viable Product" is released to see if it'll fly: if it falls, that minimises the losses, and if it flies, they can start recycling some revenue into it to keep the cogs turning and get it nearer to what it should have been. That's my cynical outlook on it, but then I played Star Command Origins, and some wacky lawsuit means I can't play any more of what was planned for the future.
Reacting to the community is okay, but then it raises the question of just who is making the game: the developers, or the online popularity contest that is public opinion?