Howdy engineers! I'm struggling with an odd GPU-load issue in my game.

Specs:
- i7-5930K @ 3.8 GHz
- 32 GB DDR4
- GTX 980 Ti 6 GB, driver 382.05
- 144 Hz display

I had been setting the global display refresh rate to 60 Hz in the NVIDIA driver, because at 144 Hz the game uses way too much GPU power: the fans spin at 100% and the card produces a lot of heat. Furthermore, I was not able to set an independent refresh rate or Vsync value for the game, neither in the driver, in the config file, nor in the game itself; it was overwritten by the global settings every time.

I am currently building a fairly large town on a dedicated server, which now demands quite a bit of performance. At 60 Hz I get about 30 fps in game with 25% CPU load and 35% GPU load, using 7 GB RAM and 4 GB VRAM. At 144 Hz I get about 55-60 fps with a GPU load of 70%.

So my question is: why is the game not using the full potential of the GPU to get more fps, at 60 Hz or even at 144 Hz? Is this a problem with the driver (I am aware it is not the latest) or with the game itself? And is there a way to set per-game values independent of the global driver settings?

Cheers!
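In case it helps, here's my rough back-of-the-envelope math (a quick Python sketch using the numbers above; the 57.5 fps figure is just my assumed midpoint of the 55-60 range). The fps delivered per percent of GPU load comes out nearly identical in both modes, which is why the 30 fps at 60 Hz looks to me like an artificial cap rather than the GPU running out of headroom:

```python
# Sanity check on the numbers above: fps delivered per percent of GPU load.
# If the GPU itself were the bottleneck at 60 Hz, I'd expect that case to
# show a much worse ratio than the 144 Hz case -- it doesn't.

def fps_per_gpu_percent(fps: float, gpu_load_percent: float) -> float:
    """Frames per second delivered per percent of GPU utilization."""
    return fps / gpu_load_percent

ratio_60hz = fps_per_gpu_percent(30.0, 35.0)    # 60 Hz case
ratio_144hz = fps_per_gpu_percent(57.5, 70.0)   # 144 Hz case (midpoint of 55-60)

print(f"60 Hz:  {ratio_60hz:.2f} fps per % GPU")
print(f"144 Hz: {ratio_144hz:.2f} fps per % GPU")
```

So both modes get roughly 0.8 fps per percent of GPU load, and the 60 Hz case just stops at 30 fps anyway.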