The game still lags in 4K, even with an RTX 3080
Posts: 2779
New config just arrived this Christmas:
AMD Ryzen 5800X with water cooling
Gigabyte AORUS xTreme Waterblock RTX-3080 with water cooling
64GB DDR4-3200 RAM
PCIe 4.0 NVMe SSD
4K 60Hz screen
Two 2K 170Hz screens
In 4K it is still quite unplayable: 20-30 FPS once the battle starts filling with explosions.
Are you kidding me? Even the far more demanding Cyberpunk 2077 gets 55 FPS with everything maxed out with DLSS.
In 2K it's acceptable, but after a few games there is lag and frame drops while scrolling across the screen. Restarting the game doesn't solve it; I have to restart the whole computer.
And BugSplat crashes still happen once every few games while loading.
Anyone else experiencing these kinds of problems?
Posts: 469
You haven't maxed anti-aliasing, right? Because that is an FPS burner.
Posts: 1515
I don't know if the game is still patched by Relic or just the community, but I still see optimization/graphics issues even with next-gen hardware.
New config just arrived this Christmas:
AMD Ryzen 5800X with water cooling
Gigabyte AORUS xTreme Waterblock RTX-3080 with water cooling
64GB DDR4-3200 RAM
PCIe 4.0 NVMe SSD
4K 60Hz screen
Two 2K 170Hz screens
In 4K it is still quite unplayable: 20-30 FPS once the battle starts filling with explosions.
Are you kidding me? Even the far more demanding Cyberpunk 2077 gets 55 FPS with everything maxed out with DLSS.
In 2K it's acceptable, but after a few games there is lag and frame drops while scrolling across the screen. Restarting the game doesn't solve it; I have to restart the whole computer.
And BugSplat crashes still happen once every few games while loading.
Anyone else experiencing these kinds of problems?
Optimization. The game was written mostly for Intel processors (it's an old game; Intel was king back then). Aside from that, no matter how strong your rig is, if the game engine can't utilize all the cores and resources, you're in for a bad day.
The higher the player count, the worse the performance; the longer the game, the worse the performance. Search Google for COH2 performance tips. There is a great article on Steam for COH2, and some good ones on these sites as well.
Posts: 2779
Uhm, I get over 60 FPS most of the time with a PC from 2012.
You haven't maxed anti-aliasing, right? Because that is an FPS burner.
I don't; AA is useless at 2K/4K.
Launching COH1 with this new config, it stays seamlessly smooth at 170Hz in 2K with everything maxed out, while COH2 runs like garbage. Seriously, it's a 3080.
Posts: 1794
I use a 1080 Ti at UWQHD with max settings and 2x AA, and I never hit 20-30 FPS, even in a late 4v4 matchup.
Posts: 600
Posts: 2148 | Subs: 2
Just for giggles I have been testing a couple PCs at home:
PC 1
i7 3770
Turbo: 3.9 GHz
Cores (Threads): 4 (8)
GeForce 1070ti 8GB
Perf Test LOW: AVG 60 fps @ 1080p no sync
PC 2
i5 8400
Turbo: 4.0 GHz
Cores (Threads): 6 (6)
GeForce 1060 3GB
Perf Test LOW: AVG 90 fps @ 1080p no sync
So the newer i5 8400 is 50% faster even though it has a MUCH slower video card. The main difference is the memory speed and memory interface: the i7's DMI runs at 5 GT/s and the i5's at 8 GT/s.
This points me to the conclusion that a lot of the game's work is done on the CPU and has to be transferred to/from memory or to/from the video card. Obvious candidates are the FOW calculations and pathing, assuming they are done on the CPU.
The old i7 has a memory speed of DDR3-1333.
The newish i5 is DDR4-2666.
So I am curious if it is best to get the newest mobo and fastest ram you can as opposed to throwing money into a better GPU. Of course the GPU may help other games.
Another interesting result: raising the GPU settings to a medium level made almost no change to the FPS in the tests, which indicates the GPU is not a factor at lower resolutions. Maybe at 2K and 4K you will see some differences.
In the old FPS days, the amount of L1 and L2 cache was key: if you could get your larger loops to run inside that amount of memory, you could bypass the slower main memory. Not sure if that matters here, but I note it since the data sets being operated on may be much larger than in a typical FPS game.
The search for fps continues...
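The cache point can be illustrated with a toy traversal test: summing the same array in order (cache-friendly) versus jumping around with a big stride (cache-hostile). This is only a sketch with made-up function names; in pure Python the interpreter overhead hides most of the cache effect, so treat the timing gap as indicative, not a real benchmark.

```python
# Toy illustration of cache locality: same data, same total, different access order.
# Both loops are pure Python so the comparison is apples-to-apples.
import array
import time

N = 1 << 20  # ~1M doubles (8 MB), larger than a typical L1/L2 cache
data = array.array("d", range(N))

def sequential_sum(a):
    # Walks memory in order: each cache line is fetched once and fully used.
    total = 0.0
    for x in a:
        total += x
    return total

def strided_sum(a, stride=4096):
    # Visits every element exactly once, but jumps 32 KB between accesses,
    # touching a new cache line almost every time.
    total = 0.0
    for start in range(stride):
        for i in range(start, len(a), stride):
            total += a[i]
    return total

t0 = time.perf_counter(); s1 = sequential_sum(data); t1 = time.perf_counter()
s2 = strided_sum(data);   t2 = time.perf_counter()
print(f"sequential: {t1 - t0:.3f}s  strided: {t2 - t1:.3f}s  sums equal: {s1 == s2}")
```

In a compiled language the strided version is dramatically slower for exactly the reason described above: the working set no longer fits in cache, so every access pays the main-memory latency.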
Posts: 2779
Just for giggles I have been testing a couple PCs at home:
PC 1
i7 3770
Turbo: 3.9 GHz
Cores (Threads): 4 (8)
GeForce 1070ti 8GB
Perf Test LOW: AVG 60 fps @ 1080p no sync
PC 2
i5 8400
Turbo: 4.0 GHz
Cores (Threads): 6 (6)
GeForce 1060 3GB
Perf Test LOW: AVG 90 fps @ 1080p no sync
So the newer i5 8400 is 50% faster even though it has a MUCH slower video card. The main difference is the memory speed and memory interface: the i7's DMI runs at 5 GT/s and the i5's at 8 GT/s.
This points me to the conclusion that a lot of the game's work is done on the CPU and has to be transferred to/from memory or to/from the video card. Obvious candidates are the FOW calculations and pathing, assuming they are done on the CPU.
The old i7 has a memory speed of DDR3-1333.
The newish i5 is DDR4-2666.
So I am curious if it is best to get the newest mobo and fastest ram you can as opposed to throwing money into a better GPU. Of course the GPU may help other games.
Another interesting result: raising the GPU settings to a medium level made almost no change to the FPS in the tests, which indicates the GPU is not a factor at lower resolutions. Maybe at 2K and 4K you will see some differences.
In the old FPS days, the amount of L1 and L2 cache was key: if you could get your larger loops to run inside that amount of memory, you could bypass the slower main memory. Not sure if that matters here, but I note it since the data sets being operated on may be much larger than in a typical FPS game.
The search for fps continues...
Should we keep Vsync disabled?
Posts: 2148 | Subs: 2
Should we keep Vsync disabled?
That is a good question.
When testing your FPS it should be turned off, so you can see the actual FPS you are getting. I use the Steam overlay for FPS.
Vertical sync was needed in the old days so you didn't see tearing (weird artifacts) on the screen: you would draw new data to the frame buffer while it was still showing old data, so you got half new data, half old data.
Vsync makes the GPU wait until an old CRT monitor is in the process of moving the electron beam from the bottom of the screen back to the top, so no data is being shown while you put the new image into the buffer. Early video cards only had memory you could write to while it was not being read from, so Vsync was the only option. LCDs don't refresh the same way, so it may not be needed in most situations; all it is doing is adding a WAIT period to each frame you draw and slowing down your game.
I always have it on, since I only play COH2 and 60 FPS is fine for me. When I turn Vsync off I get much better FPS, but then my GPU fan starts spinning up and making a lot of heat.
You should definitely turn it off and check how it goes. Also verify that the graphics settings are at a high refresh rate: when I change resolution from 1080p to 4K, Windows switches my refresh rate to 30 Hz and I have to fix it manually. I am only using HDMI cables, so I don't think it can do 60Hz at 4K; I think you need a DisplayPort cable for that. But I am guessing.
SUMMARY
For BEST performance, turn Vsync off, and if the screen starts showing distortions that bother you, turn it back on.
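The "WAIT period" described above can be sketched as a simple frame cap: after each frame, sleep until the next display-refresh deadline. This is a rough, hypothetical sketch of what a Vsync-style cap does on the CPU side; `run_capped` and the simulated frame are made up for illustration and are not the game's actual loop.

```python
# Vsync as a frame cap: never present faster than the target refresh rate.
import time

def run_capped(frames, target_fps=60.0):
    """Run `frames` iterations, sleeping so each frame takes at least 1/target_fps."""
    interval = 1.0 / target_fps
    start = time.perf_counter()
    next_deadline = start + interval
    for _ in range(frames):
        # render() would go here; this sketch simulates a near-zero-cost frame.
        now = time.perf_counter()
        if now < next_deadline:
            time.sleep(next_deadline - now)  # the "WAIT period" Vsync adds
        next_deadline += interval
    return time.perf_counter() - start

elapsed = run_capped(30, target_fps=60.0)
print(f"30 frames at a 60 FPS cap took {elapsed:.3f}s (~0.5s expected)")
```

This also shows why turning Vsync off raises GPU load and heat: without the wait, the loop spins as fast as the hardware allows.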
Posts: 2779
That is a good question.
When testing your FPS it should be turned off, so you can see the actual FPS you are getting. I use the Steam overlay for FPS.
Vertical sync was needed in the old days so you didn't see tearing (weird artifacts) on the screen: you would draw new data to the frame buffer while it was still showing old data, so you got half new data, half old data.
Vsync makes the GPU wait until an old CRT monitor is in the process of moving the electron beam from the bottom of the screen back to the top, so no data is being shown while you put the new image into the buffer. Early video cards only had memory you could write to while it was not being read from, so Vsync was the only option. LCDs don't refresh the same way, so it may not be needed in most situations; all it is doing is adding a WAIT period to each frame you draw and slowing down your game.
I always have it on, since I only play COH2 and 60 FPS is fine for me. When I turn Vsync off I get much better FPS, but then my GPU fan starts spinning up and making a lot of heat.
You should definitely turn it off and check how it goes. Also verify that the graphics settings are at a high refresh rate: when I change resolution from 1080p to 4K, Windows switches my refresh rate to 30 Hz and I have to fix it manually. I am only using HDMI cables, so I don't think it can do 60Hz at 4K; I think you need a DisplayPort cable for that. But I am guessing.
SUMMARY
For BEST performance, turn Vsync off, and if the screen starts showing distortions that bother you, turn it back on.
Thanks, I'll have it off from now on.
Posts: 658
I don't know if the game is still patched by Relic or just the community, but I still see optimization/graphics issues even with next-gen hardware.
New config just arrived this Christmas:
AMD Ryzen 5800X with water cooling
Gigabyte AORUS xTreme Waterblock RTX-3080 with water cooling
64GB DDR4-3200 RAM
PCIe 4.0 NVMe SSD
4K 60Hz screen
Two 2K 170Hz screens
In 4K it is still quite unplayable: 20-30 FPS once the battle starts filling with explosions.
Are you kidding me? Even the far more demanding Cyberpunk 2077 gets 55 FPS with everything maxed out with DLSS.
In 2K it's acceptable, but after a few games there is lag and frame drops while scrolling across the screen. Restarting the game doesn't solve it; I have to restart the whole computer.
And BugSplat crashes still happen once every few games while loading.
Anyone else experiencing these kinds of problems?
I get BugSplats after 2-4 games played, so I have to restart the game often. Hopefully the 64-bit version of the game fixes this issue, as I believe the game is hitting the 32-bit memory limit and crashing.
I get around 80 FPS at 4K/60 with all settings on (minus anti-aliasing, which is off since it's not needed) on a 2080 Ti and a 9th-gen Intel CPU.
It could be that Ampere (the Nvidia 3000 series) needs a new game profile for COH2 in the Nvidia drivers.
Nvidia's latest drivers, for example, don't include the game profile for World of Warcraft: Shadowlands, which causes that game to run at half the FPS and flicker nonstop. I wouldn't be surprised if COH2 or Nvidia needs an update for the game to run properly on that configuration.
Posts: 2148 | Subs: 2
Another thought on running with VSync turned off:
When I turn it off and am running 100+ fps, my keystrokes are updating way too fast and the screen flies around.
Anyone else have this happen?
Posts: 2779
Another thought on running with VSync turned off:
When I turn it off and am running 100+ fps, my keystrokes are updating way too fast and the screen flies around.
Anyone else have this happen?
I don't really understand; can you explain more?
Posts: 469
I get about 100fps in the beginning and 50-60 fps in 4v4 late game
And in 2v2 it never drops below 100 fps.
So something is wrong with your configuration
Posts: 2148 | Subs: 2
I don't really understand; can you explain more?
My game is usually locked to 60 FPS. When I turn off Vsync and get 180 FPS, pressing the arrow keys moves the screen three times as fast, so the arrow keys become almost useless.
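That is the classic symptom of camera movement being applied per frame instead of per second of real time: triple the frame rate and the scroll speed triples too. A hypothetical sketch (not Relic's actual input code; the function names and speeds are made up) of the difference:

```python
# Frame-rate-dependent vs frame-rate-independent camera scrolling.

def scroll_per_frame(fps, seconds, units_per_frame=5.0):
    # Buggy style: a fixed step every frame, so distance scales with FPS.
    frames = int(fps * seconds)
    return units_per_frame * frames

def scroll_per_second(fps, seconds, units_per_second=300.0):
    # Correct style: each frame moves units_per_second * dt, so distance
    # is the same regardless of frame rate.
    dt = 1.0 / fps
    frames = int(fps * seconds)
    return sum(units_per_second * dt for _ in range(frames))

# One second of holding an arrow key, at 60 FPS vs 180 FPS:
print(scroll_per_frame(60, 1.0), scroll_per_frame(180, 1.0))    # 300.0 900.0 -> 3x faster
print(scroll_per_second(60, 1.0), scroll_per_second(180, 1.0))  # ~300.0 either way
```

Uncapping Vsync exposes the bug because the game was presumably tuned around a 60 FPS lock.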
Posts: 7
In 4K it is still quite unplayable: 20-30 FPS once the battle starts filling with explosions.
Are you kidding me? Even the far more demanding Cyberpunk 2077 gets 55 FPS with everything maxed out with DLSS.
So that's with DLSS enabled? Because DLSS improves performance by rendering at a lower resolution and reconstructing "made-up" 4K (although it can look better than native 4K); you'd expect ~15-20 FPS in Cyberpunk with DLSS disabled on a 3080.
The game is relatively old and, AFAIK, relatively poorly optimized; it obviously did not have the dev budget available to Cyberpunk.
Posts: 1794
Posts: 2779
In 32-bit mode, with my Ryzen 3700X, the benchmark gave me a minimum of 55 FPS. What does your 5800X give you?
Gonna check after work.
Posts: 1794
Gonna check after work.
Be aware that the next time you launch, it will update to 64-bit; try launching it offline.
For mine:
32-bit: 55 FPS min
64-bit: 63 FPS min
A clear improvement in performance.