alright, well after doing some more testing, and hearing the opinions of MANY others:
this game is definitely optimized VERY poorly.
You can test without online play by using the benchmark.
If your hardware is a ton more powerful than a PS4, but still has to run the game in a much worse looking state - that's bad optimization. If you need a 1080 and an i7 just to hit the graphics properly - that supports the bad optimization claims, it doesn't disprove them. A machine with the best tech on the market can generally run anything, so how well a game is optimized for PC isn't measured by that alone, and it might be hard for you to benchmark such a thing if that's what you're working with. For what it's worth, my card is only just behind that, and it's a completely different story. Optimization in this game is decidedly poor. Whether or not that's Denuvo at work remains to be seen.
I don't know about that. The 1080/i7 talk was more about running max settings @ 4K, I thought.
My PC is a new build and is solid, but nothing amazing. People have better rigs, and I've found settings that run the game at a higher resolution than the PS4, and with better textures I think it probably looks better overall.
There is a lot of eye candy in the game, and the PC version isn't directly comparable to the PS4 anyway, since it has a lot more hardcore options and better textures. So it's hard to say what is what without something more scientific than citing a bunch of random people, with god knows what settings combinations, saying they aren't happy. You need side-by-side comparisons running the same resolution and equivalent texture, shadow, motion blur, bloom, etc. settings on rigs with similar RAM and CPU power, and even then you have to figure the OS overhead is going to be different.
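For what it's worth, the "something more scientific" part doesn't take much tooling once the settings are matched. A minimal sketch of comparing two runs from per-frame time logs - the rig names and frame-time numbers here are made up for illustration, not real captures:

```python
# Compare two benchmark runs from per-frame time logs (milliseconds per frame).
# Average FPS and 1% lows together say more than a single peak number.

def summarize(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    worst = sorted(frametimes_ms, reverse=True)        # slowest frames first
    one_percent = worst[:max(1, len(worst) // 100)]    # worst 1% of frames
    low_fps = 1000.0 / (sum(one_percent) / len(one_percent))
    return avg_fps, low_fps

# Hypothetical captures: same scene, same settings, two different rigs.
run_a = [16.7] * 95 + [33.3] * 5   # mostly 60 FPS with occasional 30 FPS spikes
run_b = [16.7] * 100               # a locked 60 FPS

for name, run in (("rig A", run_a), ("rig B", run_b)):
    avg, low = summarize(run)
    print(f"{name}: avg {avg:.1f} FPS, 1% low {low:.1f} FPS")
```

The 1% lows are the part most "it runs worse than PS4" complaints are really about - stutter - and they're invisible if people only quote averages.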
Only after that point, when rigs with somewhat better power are still doing poorly, can you start questioning the optimization of any platform-specific code. "Optimization" is a real word in software, but it's so generic that it's often not useful, especially when tossed around by the public with no actual knowledge of the internal specifics of the code base. It's turned into a word people use to complain about performance in a way that makes it sound like they know the lingo, when it's probably not as true or meaningful as it sounds. On top of that, a lot of the performance-heavy stuff (the graphics) is part of the underlying game engine and not necessarily NRS/WB's code to optimize. (I have heard them say they use a modified engine, so that could factor in.)
Do we know what shadow algorithm the PS4 uses? What about the subsurface scattering level, motion blur, ambient occlusion (all expensive computations)? Do we know if the person complaining is running 4K with shadows on max, but then has to drop to low-to-mid textures, because that sweet boss system he's running isn't automatically a PS4 killer just because it cost more than a PS4? What does person 1 think "looks worse" than a PS4? If I use HQ shadows, maybe I'm forced to use LQ motion blur, which looks terrible - worse than the PS4. But if I use MQ for both, then maybe that looks fine. There are a lot of variables flying around that can't just be shoved into good/bad bags and still mean a whole lot. I'm not sure I have enough real, objective information to say whether someone failed at optimization or not.
The quality bar on the output for the game is pretty high. What I'm seeing suggests it's a demanding game, not necessarily a poorly performing one - which is different, and a difference not everyone is going to be equipped to appreciate.
With that said, I do question my load times. I was able to get the game running well by turning particles to low, and then I could crank up other settings, but even on my SSD the benchmark takes a long time to load. I don't need the 3-second loads people mention, but 6 would be nice.