
GTX 1070 Overclocking Guide


Overclocking GTX 1070 Guide: Introduction

So, you have acquired an NVIDIA GTX 1070 and you want to get the most out of it. Well, you are in luck: in this guide I will cover the how-to of overclocking a GTX 1070. This guide applies to all GTX 1070 cards, no matter the model or brand.

That being said, some brands cherry pick and bin their GPUs (Graphics Processing Units) so that certain models overclock higher right out of the box. This, however, does not mean a higher priced card, or one with an X, Z, FTW, or Classified in the name, is inherently better. It just means the card will perform according to its target market. Pushing any card to its limits is still "luck of the draw," but a binned card starts from a higher baseline and may reach a higher overall stable overclock. So, let me explain this in detail, but first, here is your generic disclaimer.



WARNING! Overclock at your own risk. Overclocking a video card can void your warranty and cause other problems down the road! OverclockersClub cannot be held responsible for the information provided below. Please, if you have any questions, jump onto the forums and ask away! Overclock at your own risk!


Suggested Software:


MSI Afterburner


Unigine Heaven Benchmark 


Per Brand:

EVGA: EVGA Precision XOC

Zotac: FireStorm Utility

MSI: MSI Afterburner

Gigabyte: Xtreme Gaming Engine

Video Overclocking Guide:

For those who, like me, enjoy watching videos, I have included a video version of this guide. It does not cover anything in depth, but it is an easy way to see visually what I am talking about throughout the guide.


Understanding NVIDIA GPU Boost 3.0:

As NVIDIA reaches the third iteration of GPU Boost, it comes at a cost. Although not all bad, NVIDIA GPU Boost 3.0 has been integrated into all Pascal GPUs (i.e., the GTX 10 Series). Gone are the days of the linear voltage offset and NVIDIA's hands-off approach that let users push the boundaries and easily void warranties. It's not that bad: with each generation NVIDIA refines the technology, and the last few generations of cards have reached a high point for the average user. The ability to click and go is a great addition for getting the maximum performance out of your product.

Enough rambling. What is NVIDIA GPU Boost 3.0? The long-winded version comes in a bit, but the short answer is simple: Boost 3.0 gives users, out of the box, overclocking that adapts to all situations without compromising the video card or the warranty it may carry. It's easy to use, and the user doesn't have to do anything, since it's already built into the drivers.


Pictures courtesy of NVIDIA


The long answer, for those who like to get deep into the technology, is a little too long for an overclocking article. The main points to take away are that the jump from Boost 2.0 (900 Series) to 3.0 means NVIDIA has taken away the ability to mod the card's BIOS (until the encryption is cracked) and added flexible, per-point voltage offsets. The voltage offset is a big deal: it allows the 10 Series to maximize efficiency in terms of megahertz (MHz) per volt. In the past the offset was linear; for example, a +100MHz offset corresponded to +100 millivolts (mV), +200MHz to +200mV, and so on, which meant +150MHz had to be +150mV and nothing else. Boost 3.0 allows anything in between, so the voltage can be set higher or lower at any point on the curve. The downside, and a major factor in the overclocking world, is that NVIDIA has locked the maximum voltage to 1.093V. Each manufacturer can choose to let its customers overvolt up to this limit, or lock it off completely.
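The difference between the two offset styles can be sketched in a few lines of Python. This is an illustrative toy model, not NVIDIA's actual algorithm; the frequency/voltage table and function names are made up for the example, and only the 1.093V ceiling comes from the text above.

```python
# Toy model contrasting the old linear voltage offset (Boost 2.0 era)
# with Boost 3.0's per-point offsets. All curve values are hypothetical.

BASE_CURVE = {1600: 0.900, 1700: 0.950, 1800: 1.000, 1900: 1.050}  # MHz -> V

def linear_offset(curve, volts_per_100mhz, extra_mhz):
    """Old style: every point shifts by the same fixed MHz-to-volt ratio."""
    shift_v = volts_per_100mhz * (extra_mhz / 100)
    return {mhz + extra_mhz: round(v + shift_v, 3) for mhz, v in curve.items()}

def per_point_offset(curve, offsets):
    """Boost 3.0 style: each frequency point gets its own voltage tweak."""
    return {mhz: round(v + offsets.get(mhz, 0.0), 3) for mhz, v in curve.items()}

VMAX = 1.093  # NVIDIA's locked voltage ceiling on the GTX 1070

def clamp(curve, vmax=VMAX):
    """No point on the curve may exceed the locked maximum voltage."""
    return {mhz: min(v, vmax) for mhz, v in curve.items()}
```

With the linear scheme, asking for +100MHz drags every point up by the same 100mV; with per-point offsets you can, say, nudge only the 1800MHz point while leaving the rest alone, which is exactly the flexibility Boost 3.0 adds.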

Secondly, a Boost 3.0 card will automatically downclock from its full potential well before heat becomes an issue, and it will also conform to its power target to avoid overheating and excessive power draw. Reports of cards downclocking at only 70 °C can be found on some forums, but only NVIDIA's R&D team knows the exact parameters, so I will not spread too many rumors. Truly, the days of heavy overclocking and letting the card cook for the sake of a few MHz are over. That is, until someone cracks the encrypted BIOS and another Skyn3t BIOS is created (don't hold your breath)...
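The throttling behavior just described can be caricatured in code. To be clear, the thresholds, the 13MHz step size, and the one-bin-per-unit penalty below are all assumptions for illustration; as noted above, only NVIDIA knows the real parameters.

```python
def effective_clock(requested_mhz, temp_c, board_power_w,
                    temp_limit=83, power_limit=180, step=13):
    """Hypothetical Boost-style throttle: shave the clock in fixed steps
    while the temperature or power target is exceeded. The limits and
    the 13MHz bin size are illustrative guesses, not NVIDIA's values."""
    over_temp = max(0, temp_c - temp_limit)       # degrees over target
    over_power = max(0, board_power_w - power_limit)  # watts over target
    clock = requested_mhz - step * (over_temp + over_power)
    return max(clock, 0)
```

The point of the sketch is only the shape of the behavior: your requested overclock is a ceiling, not a guarantee, and the card quietly steps down from it whenever either target is breached.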


Stable System:

One of the most asked questions about video cards is, "How much power do I need?" Once again, power supplies alone could fill a long article. But remember, the power supply is like a beating heart: you can have the best body in the world, but if the heart is weak, you will feel weak. Before overclocking anything, and just for the sake of your components, make sure you have a good power supply. A quality power supply with lower wattage can go much further than a cheap, off-brand 1000 watt unit. I've seen it plenty of times: just because it's labeled 1000 watts does not mean it can output 1000 watts sustained. A well-built unit from a good brand will output its labeled amount 24/7 without flinching.

That being said, you should still calculate how much power the video card may draw. An easy way is to add up each PCIe power cable (6-pin or 8-pin) plus the wattage supplied by the motherboard. These cables are also known as PEG cables (PCI Express Graphics). By specification, the motherboard should not supply more than 75 watts per PCIe slot, while each 6-pin PEG cable supplies 75 watts and each 8-pin PEG cable supplies 150 watts. Specifications have changed over time, and a 6-pin can deliver more than 75 watts these days, but it is best to think of it the old way for video cards. So, an NVIDIA Founders Edition GTX 1070, with a single 8-pin PEG connector, should not use more than 225 watts. Realistically, the card pulls well under that because of the voltage limits explained in the Boost 3.0 section: NVIDIA lists the GTX 1070 at a 150W Thermal Design Power (TDP) and the GTX 1080 at 180W.
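The budget arithmetic above is simple enough to write down. A minimal sketch, using the old-spec per-connector ratings from the paragraph (the function name is my own):

```python
# Back-of-the-envelope PEG power budget using the old-spec ratings:
# 75W from the slot, 75W per 6-pin, 150W per 8-pin.
SLOT_W = 75
PEG_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Sum the slot's wattage plus each PEG connector's rated wattage."""
    return SLOT_W + sum(PEG_W[c] for c in connectors)

# Founders Edition GTX 1070: one 8-pin connector.
print(max_board_power(["8-pin"]))  # 225
```

A card with a 6-pin plus an 8-pin would budget to 75 + 75 + 150 = 300 watts by the same reasoning.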


The Idea of Overclocking:

I will admit, when I first got my card, NVIDIA GPU Boost 3.0 really had me confused. Right out of the box the card was running at 2.0GHz (2,000MHz), but Gigabyte listed 1822MHz on its website. After a bit of digging, everything I wrote above explains it: Boost 3.0 pushes the card to its "safe" limits based on voltage and heat. Because my house stays fairly cool all year, hovering around 65 °F, NVIDIA's software decided 2,000MHz was perfect for the given voltage. Every brand and model has its own baseline; if the card is not stable above that baseline, it is no big deal, as it was never promised to go higher. Thanks to Boost 3.0's scalability, you may see the clock fluctuate from the base speed all the way up to 2.0GHz. This is normal, and you haven't done anything wrong; it is just NVIDIA trying to get you every extra megahertz in a user-friendly way. After all, relatively few people overclock compared to the number of cards sold.

This is important, because even when you apply a higher overclock, the card may downclock for any number of reasons, from heat to lack of voltage. As stated above, NVIDIA has locked the GTX 1070's voltage to 1.093V. It does not matter what brand you have; you cannot exceed this. In practice, that means your maximum overclock will not surpass about 2.2GHz, because of how Boost 3.0 works and its voltage limitations. Most cards will not exceed 2050MHz, as that is the average figure I've come across browsing the Web.

Have no fear, we are almost at the actual overclocking part of the guide. Just remember: anything over what the manufacturer lists is technically an overclock, even if you cannot push past what Boost 3.0 already provides. Getting a 2000MHz core clock out of the box means NVIDIA gave me a 178MHz core overclock without my having to lift a finger. For my card and model, Gigabyte does not support anything higher than its listed spec of 1822MHz.
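That 178MHz figure is just the observed boost minus the rated boost; as a one-liner (the function name is my own):

```python
def factory_headroom(observed_boost_mhz, rated_boost_mhz):
    """MHz that Boost 3.0 delivered beyond the manufacturer's rated boost clock."""
    return observed_boost_mhz - rated_boost_mhz

# My Gigabyte card: 2000MHz observed vs. 1822MHz rated.
print(factory_headroom(2000, 1822))  # 178
```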

  1. GTX 1070 Overclocking Guide: Getting Down To Basics
  2. GTX 1070 Overclocking Guide: Getting Down To Numbers
© 2001-2018 Overclockers Club ® Privacy Policy