Hi all,

I've done some extensive testing on all the rigs I have access to.
First one, my desktop rig, is an i7-6700K with GTX1070.
Second one, my GF's rig, is an i3-6100 with GTX1060.
Third one is my office PC which is i7-4770 with GTX960.
All computers have more than enough RAM so that's not an issue. All computers have the latest nvidia drivers.

The game was stuttering in highly populated areas on all computers, and the stuttering got worse the weaker the CPU I tested.
If you think Driftwood is bad in this regard, wait until you get to Arx. Arx square is probably the most demanding location in the game, and the slowdown was noticeable even on the 6700K, which went from well over 120 fps on the Outskirts to just 55-60 on the city square - GPU usage stayed the same while CPU usage skyrocketed.
The 4770 system performed pretty much the same, losing only about 3-4 fps compared to the 6700K + GTX1070, despite having a far slower GPU. And the i3 system just barely held 37 fps in Arx and felt very choppy, despite having a 1060 installed.
On all computers, CPU usage was well above 60% in Arx (97% on the i3) - as a result, gaming on the i7-4770 + GTX960 actually felt more fluid than on the i3 + GTX1060.
This is all with the same settings (1920 x 1080 on ultra). Lowering settings to minimum on both Skylake PCs had absolutely zero effect on performance, as the 10-series GPUs were barely taxed to begin with.

It appears Divinity OS 2 is an absolute CPU destroyer for some reason.