Overclocking with the latest beta of MSI Afterburner worked well on our video card. It does allow Core Voltage control, up to 100%, but we found that raising the voltage wasn't needed. The video card already manages voltage thanks to GPU Boost, and manually raising the Core Voltage put us up against the TDP limit very quickly, which actually hurts overclocking potential. You'll see that the video card hits the maximum TDP, and goes over it, without us even touching the voltage, so trying to manipulate it manually just makes the situation worse.

In MSI Afterburner we can turn the Power Limit up to 115%, which lets us exceed the TDP a bit. The Temp Limit can also be raised to 90°C. We also have control over the fans: you can adjust each fan separately or sync them. We left them synced and manually set the fan speed to 100% just to make sure we were getting the best overclock we could on air. 100% fan speed wasn't required to achieve the overclock we are going to show, but for our testing we wanted to be sure. We later lowered the fan speed to 80%; the overclock still held, the card ran much quieter, and temperatures stayed in the upper 70s.

Core Clock

Adjusting the Core Clock is rather simple: by setting an offset, we add onto the GPU Boost clock. Keep in mind that GPU Boost is in control at all times, no matter what. It isn't going to ramp the clock up unless there is headroom to do it; power, temperature, it all goes into the equation. We played with several different offset settings and found positive performance results all the way up to +100 on the offset. Past that point, the GPU clock would throttle lower because it far exceeded the TDP. Even at +100 we are over the TDP, as you will see.

The really tricky setting to dial in was the memory clock. GDDR6X works differently on the GeForce RTX 3080: when you increase the memory clock, instead of producing artifacts and geometric patterns like other memory types would, the memory error-corrects to keep the image clean, at the cost of performance. It basically keeps lowering performance to keep itself from artifacting. Therefore, to home in on the right memory overclock, you keep increasing the frequency until performance reaches a peak and then starts to degrade.

With our video card, we were able to go all the way up to about +1000 before this started happening. At +1000 the memory runs at 21Gbps (1313MHz x 16), versus the 19Gbps default. That actually seemed to be fine, though we did notice a big jump in power demand. When we increased the memory further, for example to +1500 where it operates at 22Gbps (1375MHz x 16), performance in our game dropped from 58FPS to 52FPS, which indicates the memory is beyond its capability. What we ended up doing was backing down to +900, which sets the memory to 1300MHz x 16 for an effective 20.8Gbps. Backing down slightly ensures the memory won't throttle, helps it last longer than pushing it to 21Gbps would, and brought the TDP down a little so we could maximize our GPU overclock. This brought bandwidth to 832GB/s versus the stock 760GB/s. Our final memory overclock is therefore 20.8Gbps versus the 19Gbps default, and our final GPU offset is +100 with no added voltage.
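The "headroom" behavior of GPU Boost can be sketched as a toy model. Everything below is illustrative: the step size, per-bin power and thermal costs, and the starting numbers are invented for the example and are not NVIDIA's actual GPU Boost algorithm. The point is only that the offset raises the ceiling, while power and temperature headroom decide how far the clock actually climbs.

```python
# Toy model of GPU Boost-style clock ramping (illustrative only;
# step sizes, costs, and limits are made up, not NVIDIA's algorithm).

def boost_clock(base_mhz, offset_mhz, power_w, power_limit_w,
                temp_c, temp_limit_c, step_mhz=15):
    """Ramp the clock in small bins while both power and temperature
    still have headroom; stop at the offset ceiling or at a limit."""
    clock = base_mhz
    ceiling = base_mhz + offset_mhz
    while (clock + step_mhz <= ceiling
           and power_w < power_limit_w
           and temp_c < temp_limit_c):
        clock += step_mhz
        power_w += 5     # each bin costs a little power (assumed)
        temp_c += 0.5    # and a little heat (assumed)
    return clock

print(boost_clock(1710, 150, 200, 340, 60, 90))  # plenty of headroom: 1860
print(boost_clock(1710, 150, 360, 340, 70, 90))  # already over power: 1710
```

This is why a bigger offset alone does not guarantee a higher sustained clock: once the modeled power draw reaches the limit, the ramp stops no matter what the offset allows.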
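Because GDDR6X error-corrects instead of artifacting, the right memory offset is found by sweeping upward until the frame rate peaks and then falls. A minimal sketch of that procedure, with a hypothetical lookup table standing in for a real benchmark run (the numbers are shaped like the article's findings, not actual measurements):

```python
def find_memory_offset(bench, offsets):
    """Sweep memory offsets in ascending order and stop once the
    benchmark score drops below the best seen: past the peak, GDDR6X
    error correction eats performance instead of producing artifacts."""
    best_offset, best_fps = offsets[0], bench(offsets[0])
    for off in offsets[1:]:
        fps = bench(off)
        if fps < best_fps:
            break  # past the peak: error correction is kicking in
        best_offset, best_fps = off, fps
    return best_offset, best_fps

# Hypothetical results: gains up to about +1000, then a drop by +1500.
measured = {0: 55.0, 250: 56.0, 500: 57.0, 750: 57.5,
            1000: 58.0, 1250: 57.0, 1500: 52.0}
offset, fps = find_memory_offset(lambda o: measured[o], sorted(measured))
print(offset, fps)  # 1000 58.0
```

In practice each `bench(off)` call would be a full benchmark pass at that offset, and you would then back off a notch or two from the peak, as the article does by settling on +900.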
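The bandwidth figures check out arithmetically against the RTX 3080's 320-bit memory bus. A quick sketch of the math (the helper name is ours; the 320-bit bus width and the 16x data-rate multiplier for GDDR6X's PAM4 signaling are standard RTX 3080 figures):

```python
def gddr6x_bandwidth_gbs(clock_mhz, bus_width_bits=320, data_rate_mult=16):
    """GDDR6X moves 16 bits per pin per memory clock (PAM4 signaling),
    so effective Gbps per pin = clock in MHz x 16 / 1000; multiply by
    the bus width in bytes to get GB/s."""
    return clock_mhz * data_rate_mult * bus_width_bits / 8 / 1000

print(gddr6x_bandwidth_gbs(1300))            # +900 offset, 20.8Gbps: 832.0 GB/s
print(round(gddr6x_bandwidth_gbs(1188), 1))  # stock ~19Gbps: 760.3 GB/s
```

That matches the article's 832GB/s versus 760GB/s comparison for the final +900 memory overclock.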