Specifications
Bus Type PCI-E 3.0
GPU Clock 1266MHz
Stream Processors 2304
Memory Bus 256 bit
Memory Clock 7 GHz
Memory Size 4 GB
Memory Type GDDR5
Card Profile Dual
Thermal Solution Blower fan
Outputs
Dual link Support Y
Max Supported Resolution (DIGITAL) 4096 x 2160
Output - Display Port 3
Output - HDMI 1
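For reference, the two memory figures above fix the card's peak bandwidth. A quick derivation (standard GDDR5 arithmetic; the result is implied by the table rather than printed in it):

```python
# Peak memory bandwidth from the spec table above:
# bus width in bits / 8 gives bytes per transfer, times the effective data rate.
bus_bits = 256           # "Memory Bus 256 bit"
effective_rate_gtps = 7  # "Memory Clock 7 GHz" (GDDR5 effective rate)

bandwidth_gb_s = bus_bits / 8 * effective_rate_gtps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 224 GB/s; the faster 8GHz memory on 8GB cards gives 256 GB/s
```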
Features
Display Port ready 1.4
HDMI Ready 2.0b
Requirements
External Power - 6-pins 1
Minimum Power Supply Requirement 500 watt
XFX Recommended Power Supply XFX 550W PSU
Technologies
HDR Ready Y
FinFET 14 Y
AMD FreeSync technology Y
DirectX™ 12 Optimized Y
VR Ready Premium Y
4th Generation GCN Y
AMD LiquidVR technology Y
AMD Virtual Super Resolution (VSR) Y
Radeon Software Y
AMD CrossFire Technology Y
HDMI 2.0b Y
DisplayPort 1.4 ( 1.2 Certified, 1.3/1.4 Ready) Y
Certifications
RoHS Y
Top comments
markpomeroy
23 Jul 16#349
Mine arrived today and it has 8GB on it!
bbfb123
1 Jul 16#163
Please...this graphics card isn't gonna blow your motherboard up. The only people that might run into trouble are those who are running big over voltage on the graphics card, and to be doing that you should be using a high quality motherboard and power supply anyway. If you're running a cheap £50 motherboard and power supply and try running a big overvolt on the graphics card then that's that person's own stupid fault if it goes wrong.
fishmaster
30 Jun 16#129
A single card is the best solution, that's the answer. The whole 2X RX480 versus GTX1070/80 is silly.
Tim1292
30 Jun 16#105
Most reviews I've seen put the 480 slightly ahead of the 970 in DX11 games and ahead of the 980 in DX12 games. Why would you buy a used 970 over this if they are the same price? Or even if it was £10-£20 cheaper?
You would be buying a discontinued product with little or no warranty, poor DX12 support, poor future driver optimisation and very little future resale value. Also 4GB > 3.5GB any day, especially when newer games take advantage of more VRAM.
Latest comments (352)
The_Hoff
25 Jul 16#352
It should also increase the speed on the memory.
markpomeroy
25 Jul 16#351
Nope not yet. I'm going to wait until I really need the extra RAM.
The_Hoff
24 Jul 16#350
Tried flashing it yet?
markpomeroy
23 Jul 16#349
Mine arrived today and it has 8GB on it!
markpomeroy
21 Jul 16#348
Price has gone up to £182.99 now! I'm glad I managed to get one at £173.99.
MaterSpryce
19 Jul 16#347
Dang it!!!!!!!! They went down so quickly!!! I couldn't snag one. :disappointed:
jalexanderevans
19 Jul 16#346
Pre-orders are up again! Just managed to get one through- £10 off via quidco too!
jalexanderevans
17 Jul 16#345
That's kinda ironic... why don't they just remove the button then? Guess this is expired then!
brookheather
17 Jul 16#344
Why isn't this deal expired as there is no stock?
miaomiaobaubau
17 Jul 16#343
are they still selling the faulty batches???
MaterSpryce
17 Jul 16#342
The pre-order stock is out of stock.
jalexanderevans
16 Jul 16#341
Trying to add this to basket as showing as pre-order but it won't add... Anyone else able to add/order it?
The_Hoff
16 Jul 16#340
Whereas the rest of us just think you're both kids :o)
Nate1492
15 Jul 16#339
Many examples?
No, no there isn't. I've owned 3 AMD cards. The 9800 was amazing, it was my favorite card of all time.
The latest offerings from AMD are sub par.
And honestly, just throwing this out there, maybe you are the fanboy here.
MaterSpryce
15 Jul 16#338
I'm sorry you just have to face the facts: AMD will do much better in Vulkan and DX 12, beating Nvidia and the 1060. You are an Nvidia fanboy who has probably never owned an AMD video card. There are many examples.
Nate1492
14 Jul 16#337
Are you joking or what? The article you link clearly shows that is not a 'clear lead'. The title of the article is...
"early directx 12 games show a distinct split between amd and nvidia performance"
How do you interpret that as a "clear lead in dx 12"?
At what point do you just ignore reality and make up a fabricated story?
And none of these benchmarks show the 480!
Do you not realize how ridiculous you sound when you link an article that contradicts your summary?
Look, just because AMD has poor performance in DX11 and poor Open GL performance doesn't mean they should get praise for simply fixing their own problems by moving to DX12/Vulkan.
MaterSpryce
14 Jul 16#336
"You may think that AMD doesn't have DX12 in it's hands, but there is no argument about Vulkan."
What about it? Vulkan isn't DX12, it was the remnants of Mantle.
It's the worst possible example you could use to say "AMD is good with DX 12, look at DOOM!"
Doom uses neither DX 11 nor DX 12. It uses OPEN GL and Vulkan, literally nothing to do with DX 12.
MaterSpryce
13 Jul 16#334
Hey what do you have to say now, look at the AMD results in the new Doom patch. You may think that AMD doesn't have DX12 in it's hands, but there is no argument about Vulkan.
The_Hoff
13 Jul 16#333
I have an idea...
Let's all start discussing the deal. Has anybody received this and can anybody confirm if they got the 8GB model?
Nate1492
11 Jul 16#332
Did you just ignore my entire post?
There are tons of features that neither AMD nor NVIDIA have implemented. Pretty much equal amounts.
Async support is *minor* in terms of all the stuff that AMD hasn't done for DX 12. They just got people to advertise Async support as some awesome feature. It's worked, people have been blinded by Async computation, as if it's what DX12 is. Hint, it's not. GCN has nothing to do with DX12 anyway, it's just what they call their instruction set. It's not a unique concept or anything innovative (yet again, good PR makes people believe 'GCN' is some magical thing.)
MaterSpryce
9 Jul 16#331
What do you mean? Their entire GCN architecture and on-board async support was laying the groundwork and preparing for DX 12 and Vulkan.
Nate1492
9 Jul 16#330
But that's the thing.
They actually haven't been laying the groundwork for DX12. Take a look at the wiki link showing what they actually support.
Yea, I agree. It's just that seeing how AMD has been laying the groundwork on this for years (for now), it's a safe bet to think that AMD has an edge in DX 12.
rev6
9 Jul 16#327
To be honest. It's not a good time to compare AMD to NVIDIA in DX12. Too few AAA titles and simply not enough games out there.
MaterSpryce
9 Jul 16#326
obviously the 1070 and 1080 are better (due to their brute strength); AMD have not even released their competitor.
Also Rise of the Tomb Raider (was) an exception because it lacked Async support until very recently.
Nate1492
9 Jul 16#325
You are linking Ashes of the Singularity as an example of AMD 'beating' NVIDIA in DX12?
Ashes is literally an AMD sponsored title. Very few people play the game, check out how many reviews it has on steam.
The 980 beats the Fury X at 1080p and all of the other cards in the Geforce 10xx range destroy AMD.
So what's the catch? AMD has a game in DX12 that it does well in and Nvidia has a game it does well in. Neither are "winning" DX 12 titles.
What you can say is that AMD's CPU bottleneck they have in DX11 isn't present in DX12. But just because NVIDIA had good DX 11 performance while DX 11 choked AMD big time... That doesn't mean AMD 'did better than NVIDIA' in DX12, they simply fixed their own problems. There was less to fix with NVIDIA moving from DX11 to DX12.
MaterSpryce
9 Jul 16#324
Did I say once that Nvidia doesn't "support" DX12?
Did you read anything I said? I said nothing about "supporting".
If you take the time to look at benchmarks you would see AMD beats Nvidia almost all the time.
AMD's been pushing performance through the merit of the hardware; NVIDIA's been pushing performance through artificial improvements made at the driver level. An example is Async compute capabilities.
So I guess you don't know about DX12 after all.
Nate1492
9 Jul 16#323
I know quite a bit about DX 12.
I think you are making the incorrect assumption that AMD 'supports DX12' and NVIDIA 'doesn't'.
AMD and NVIDIA both have support for some of the features, but they do NOT have support for all of the features.
Only Microsoft's WARP12 has full DX12 support.
And if you actually go by the official DX12 levels... the Geforce 900 series supports Level 12_1 while the 480 only supports 12_0.
So, perhaps next time you ramble on about "AMD has more DX 12 support" you'll take a second on that thought.
dh33r4j
9 Jul 16#322
There is no SLI support. You won't be able to run two of them in SLI. I think 480s in CrossFire would be a better option for those who want to buy one now and then add a second one later on (possibly a used one from eBay for dirt cheap).
MaterSpryce
9 Jul 16#321
That's MSRP (manufacturer suggested retail price). They don't have to follow it; just look at the GTX 1070 and 1080, you can't find any of them for retail price.
brookheather
9 Jul 16#320
UK prices here - FE is £279 and custom boards £239:
They can't even draw a graph correctly :wink:
Rhythmeister
8 Jul 16#318
OOS now that it looks like the 1st batch of the 4GB cards are easily flashable to 8GB :disappointed: Has anyone here flashed theirs?
JS94
8 Jul 16#317
Maybe I am a sceptic, but no chance.
I cannot see the 1060 costing much less than £300 here. But we will see.
My prediction is £270-£280 if it is actually 980 performance in games.
MaterSpryce
8 Jul 16#316
It is an AMD advantage due to their on-board async capabilities.
Aradria
8 Jul 16#315
Of course it has an advantage over older cards in DX12, but that's a generational advantage, not an AMD one. 1060 shouldn't be at a disadvantage as it's the same generation, so it'll probably beat it.
Anyway price/performance is always similar in that bracket so it looks silly to argue so passionately one way or another.. Even if you want a 480 it's a good idea to wait for the 1060 to come out and see how it affects the pricing unless you're desperate for one now.
MaterSpryce
8 Jul 16#314
It's not a double standard; do you know anything about DX 12?
1. AMD is known for smoking Nvidia by a lot in DX12
2. The 480 beats and matches the 980 (mostly beats) in DX 12
3. So it is very likely the 480 will beat the 1060 by a lot in DX 12
First of all, I said "billed at" which, in simple English, means reported at.
Secondly, you can't both say "You don't know it will be better than the 980" and then turn around and say the "480 is bound to smoke the 1060 at DX12".
You do realize how much of a double standard that is, right?
cigbunt
8 Jul 16#311
will be £300 in the UK
dh33r4j
8 Jul 16#310
Nvidia unveils the GTX 1060: Faster than a GTX 980 for $250.
Nvidia claims the GTX 1060 is 15 percent faster and over 75 percent more power efficient than the RX 480, which, if true, would make the eight percent jump in price over the 8GB RX 480 more than worth it.
The benches are not even out; Nvidia is known for not always matching their claims.
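Taking those claims at face value, the value-for-money arithmetic is easy to check. A minimal sketch using only the percentages quoted above (Nvidia's claims, not benchmarks):

```python
# Rough perf-per-pound comparison from the quoted claims.
rx480_perf, rx480_price = 1.00, 1.00   # 8GB RX 480 as the baseline
gtx1060_perf = rx480_perf * 1.15       # "15 percent faster" claim
gtx1060_price = rx480_price * 1.08     # "eight percent jump in price"

value_ratio = gtx1060_perf / gtx1060_price
print(f"GTX 1060 perf/price vs RX 480: {value_ratio:.3f}x")  # ~1.065x, ~6.5% better value -- if the claim holds
```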
markpomeroy
8 Jul 16#307
Just managed to order one!
MaterSpryce
8 Jul 16#306
The benchmarks are not even out and you assume that's correct. Nvidia is known for stating and not delivering, and "better than a 980" means nothing: in some games it may beat or match the 980 and in some it may not (like the 480). Also, the 480 is bound to smoke the 1060 in DirectX 12.
As it happens, some (or perhaps all) launch cards that ship with 4GB of GDDR5 can be unlocked to 8GB. You read that right: vendors apparently shipped these initial cards with 8GB, but simply used a different BIOS to limit them to 4GB. It's a quaint solution, and one that's just begging to be messed around with.
Just as an update to members on HUK: The Stilt, who is a pro overclocker and has good AMD product knowledge, has released a method to reprogram the IR3567B to take more of the loading on the PCI-E plugs, but due to how the VRM is split 50/50 between PCI-E slot/plugs the redistribution yields only a 10W decrease in PCI-E slot usage.
2. The thread with the i2c command fix and, soon, the bios fix release.
3. Anyone interested in a photo showing the phase distribution, or a link to Buildzoid's video testing the RX 480 PCI-E slot/plug setup, view this post by McSteel on TPU.
Given the only 10W reduction on the PCI-E slot and how the PCB is designed, I would assume this headroom would easily be used up when OC'ing. Hopefully when AMD release a fix it will not hamper performance to gain lower PCI-E slot power usage. If there was another controller that could change the PCI-E slot/plug power distribution, or PowerPlay in ROM could, I know The Stilt would have done this.
stone3t
5 Jul 16#285
OOS
tahir_owen
5 Jul 16#284
Apparently undervolting the GPU increases its performance because it's able to sustain the full 1266MHz boost clock permanently.
Apparently the 4GB cards are actually 8GB....:confused:. Needs a BIOS flash
Nate1492
4 Jul 16#283
Absolutely.
And we need to see what the 'fix' exactly is.
Will they simply cut some of the power out? Will this impact games when VRAM usage goes up? Will they drop the stock clocks on GPU/Memory?
Who knows.
iDealYou
4 Jul 16#282
So 1) issue confirmed by AMD 2) awaiting software fix
Is this right?
I like the price of this ... but will wait to see the 1060 from NVidia now.
Agharta
4 Jul 16#281
Since Haswell was released in 2013 and Broadwell was a very limited desktop release in 2014, you are implying that it is designed for Skylake only on Intel platforms!!!
miaomiaobaubau
4 Jul 16#280
thank you, already got a few GTX 970s hanging around
Nate1492
4 Jul 16#279
You're pretty much spot on.
AMD's 290 was a good GPU, no doubt, but they have relied on it for 4+ years and they still aren't leaving it behind. If you have a 290, there is very little reason to switch to the 480.
Also, this is supposed to be a budget GPU. Why on earth do you need a 2014 or newer CPU, high end motherboard, a quality PSU, AND excellent case ventilation?
It sounds like the *reference* 480 is a poorly designed product if it has all of these requirements.
I could easily recommend a reference 970 without stipulating any of those things. Both are blower style, so they SHOULDN'T need good case ventilation. 480 was supposed to be power efficient, but I guess that's out the door.
miaomiaobaubau
4 Jul 16#278
It does not seem you understood or read properly what I wrote. BTW, apart from not having the very latest GPUs, my systems are powerful, hence the right to express what I think about this particular GPU, which seems less than, or at best on par with, what AMD produced about 4 years ago. Also, some games, and usually benchmarks, would not care that much about how powerful a particular CPU is, to an extent of course, because you still need decent power for a particular GPU to push it to the limit in a particular game or benchmark (obviously apart from the final values of the CPU itself, which I would not count at all); it must all be proportionate for a particular need.
BTW, you say it is a must to have a CPU made after 2014. How strange: I found out (I test a lot) that CPUs made nearly 6/7 years ago are still a lot better than what AMD is giving nowadays. Looks like AMD cannot keep up at all, trying instead to discount what they were already producing a long time ago, even so under a different architecture etc... etc... etc...
tempt
4 Jul 16#277
How old is your CPU? This card is designed for newer generation processors (2014 onwards). Also make sure you have a high end motherboard, a quality PSU and a case with excellent ventilation.
miaomiaobaubau
4 Jul 16#276
Seen some private people's Unigine Heaven and Valley tests on YouTube. Compared with what I got, results are disappointing.
I must say, nothing to do with the CPU, which is virtually irrelevant in these tests. Even my R9 290 seems to be better.
And AMD absolutely are trying to lure 970 owners away. They won't be very successful, but they certainly are trying.
2) I won't buy the reference card because it's noisy, hot, terrible OC, and has a huge PCIe voltage issue that will require a 'software' change that will simply throttle the card.
3) I just showed you that the ratio wasn't 1:1. 90 W and 76 W.
That's nowhere near a 1 to 1 ratio!
What it tells me is that the 6 pin physically can't provide much more than 75W, while the PCIe spec can demand more (and potentially then crash the board).
Imagine this simple scenario.
The graphics card wants 170W of power.
It asks all available interfaces for the maximum power available.
It's returned with 150W of power.
The graphics card then asks for more power, overruling the amount of power returned by the PCIe+6 pin.
Asking both for more, if either can provide more via some hardware override, it will pull in more.
To assume this card would ignore double 8 pin connectors and simply keep taking more from the PCIe slot is ludicrous. They would have to be complete boneheads to fail that badly. You'd never be able to draw more than 165-170W before your mobo shuts down or begins to degrade.
4) If you honestly don't think AMD can 'change this ratio by software' then they are even more screwed.
They either have limited all of their 480 series cards to 150W, or they will have to recall their cards.
Agharta
4 Jul 16#274
I don't recall ever reading so much negative comment about a graphics card, it's usually AMD's processors that get all the stick.
I can't help thinking that their lack of resources has pushed them into a corner and when this GPU turned out to be underwhelming they got desperate and cranked it and released a seriously flawed product.
I hope they can address this issue without a fix that tanks the performance as was the case with the Phenom TLB bug.
Hopefully they aren't using GloFo for Zen also!
rev6
3 Jul 16#273
You're right. Most people are in the $200 GPU market. Not that most will buy the 480. 970 users would be silly to buy this.
GAVINLEWISHUKD
3 Jul 16#272
I'm pretty sure AMD haven't said that. Why would, say, the huge number of 970 owners consider it? They won't. They may have said that 85% of people looking for a sub $200 card might be?
Got a link?
Read the reviews: has anybody said "Do not buy this card"? No. If it worries anybody that much, don't buy it and grab a 970. If I was in the market for a 480 I would buy one.
So what is your reason not to buy one? What do you know that the review sites don't?
Once again you seem not to grasp the concept even though you wrote it earlier yourself! It's using almost a 1/1 ratio: the slot supplies 1w as does the 6pin. So logic says that at 75w on the slot it uses 75w of the 6pin, and if the 6pin is supplying 90w so will the slot. Moving that to an 8 pin, it will still be 90w from the slot and 90w from the 8 pin. It will make no difference moving to 8 pin as the problem still exists.
What the world wants to know is whether AMD can change this ratio in software. I think not.
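For what it's worth, the 1:1 split argument can be put into numbers. A toy model assuming the fixed 50/50 slot/plug ratio described above (an assumption from this thread, not a measured board characteristic):

```python
# Toy model: board power split at a fixed slot/plug ratio.
PCIE_SLOT_LIMIT = 75.0  # W, PCI-E slot spec

def split_draw(total_board_power, ratio_slot=0.5):
    """Return (slot_watts, plug_watts) for a fixed slot/plug split."""
    slot = total_board_power * ratio_slot
    return slot, total_board_power - slot

for total in (150, 168, 180):  # stock spec, measured stress figure, mild OC
    slot, plug = split_draw(total)
    flag = "OVER slot spec" if slot > PCIE_SLOT_LIMIT else "ok"
    print(f"{total:>3}W total -> slot {slot:.0f}W / plug {plug:.0f}W ({flag})")

# With a fixed 50/50 split, swapping the 6-pin for an 8-pin only raises the
# plug-side headroom; the slot still carries half of whatever is drawn.
```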
Nate1492
3 Jul 16#271
According to AMD, everyone is considering this budget mainstream upgrade. They mentioned 85% of all users would target this for their rig.
You can 'think' it's not a serious issue. But on what grounds? Your 'gut'?
You are implying the issue with this is the PCIe slot attempting to draw power before the 6 pin. You didn't back it up, you just sorta said it without any proof, and expected us to accept that conclusion.
GAVINLEWISHUKD
3 Jul 16#270
Yes, yes, yes, I don't think it is a serious issue! But I acknowledge it is an issue.
1) Pretty much all the info from the review sites so far? Also by your own admission "I haven't found any place suggesting this is simply a 'prioritization' issue, all the tests suggest the power draw is simply exceeding 150W and it's pulling from whatever it can."
Undervolting uses less from both and overvolting uses more from both. So can't see any reason why it would magically use more from the 8pin and less over the PCI-e lane.
2) OK I'll accept the 10%-15% figure. Of this minority how many were considering buying a 480?
So I'm not sure where you get "Nobody" from? It seems you ask a question and I reply but you seem selective on question answering.
I think it will be closer to nobody than everybody of this small percentage of users.
So I'm happy to leave it here. I'm sure on any points I'm wrong you will gladly correct me in the future.
Nate1492
3 Jul 16#269
I'll address these two points and move on, I think you've biased yourself into a position where you don't believe the PCIe issue is that serious.
1) Absolutely not, give us anything that suggests that the PCIe slot would 'max out first'. That makes no logical or reasonable sense.
You can see plenty of evidence that underclocking reduces this issue and overclocking increases it.
2) I think you're off by a large factor on your estimates of PC age.
Look at how many Geforce 400 and 500 users there still are. Check out how many Radeon 6xxx/7xxx users there are.
Just a quick scan shows nearly 10-15% of users have 5+ year old graphics cards. It's pretty safe to assume these machines also have 5+ year old CPU/Motherboards as well.
So no, I'm not buying the idea that "nobody" uses a CPU from 2010.
The GTX580 isn't a valid comparison as you actually have to OC it for it to become an issue, the 480 can cause damage out of the box.
GAVINLEWISHUKD
3 Jul 16#268
I'm not sure what scale you use, but AdoredTV was "terrible" yet the video you linked to, which I pointed out flaws in, was only "poor". Both videos had good and bad points. The AdoredTV video was in your opinion much worse. For me it was much closer.
90w in the paragraph, 85w on the graph: there is a 5w discrepancy already! The point is one figure is pretty useless. You need 2 of the 3 to be truly useful. 85w at 11.4v is less than 80w at 12.6v. The only testing we have seen with both sets of numbers is Pcper's, where it shows it is slightly less.
On your second point you have answered it yourself! It seems to be pulling it from wherever it can, so it will still be maxing out the PCI-e slot before it can use any extra from either a 6 or 8pin. This is the point: I'm not sure AMD can change this without hardware changes, so adding an 8 pin won't change this.
Mainstream gamers? Forum tags seem to suggest nobody is using a mobo and CPU from 2010. Most seem to be i3, i5 and FX that are far newer. I'm sure when Steam get some survey numbers this will be confirmed.
I'm not making excuses; I have never tried to hide the fact it uses too much power. But until we find out more about the motherboard issues we can't say anything for sure. I can kill a mobo with a GTX580. Does that make the GTX580 automatically a mobo killer? No
I said earlier in the thread that if you intend to overclock probably best not to get one now.
Nate1492
3 Jul 16#267
Why would you throw a 'lol' at the end of that perfectly fine and reasonable statement? It makes the rest of the sentence look... Thoughtless?
Yes, people are 'considering' AMD, but the card is a major let down in performance and this PCIe issue.
It's clear that AMD have essentially pushed as much power as possible into the 480, and the card is splitting at the seams.
They should have left the card at a lower volt and power draw, and just accepted lower-than-970 stock performance.
But that would look terrible to marketing, so they pushed the card to its near stock limits, and this has come back to bite them as it draws over 150 Watts during intense gaming/benching.
rev6
3 Jul 16#266
:smile:
ukez
3 Jul 16#265
You say that, but I personally know a few guys that wouldn't usually look at AMD but they're still considering this card due to the price lol
rev6
3 Jul 16#264
It's not going to help AMD sell the GPU's regardless if it's a serious problem and they sort it out. Damage is done. Pun intended.
Nate1492
3 Jul 16#263
I've grabbed your reply and I will respond to each part. However, #248 was not a post in which you convinced me (or likely anyone) that this video was bad, nor that the PcPer is 'good'.
Nonsense. It's a name, not a claim. He's doing reviews.
Read the "Stress test power consumption" part. They state the PCIe slot is drawing 90 out of the 168W of power!
Simple math shows that the opposite is true. 90W from the PCIe, 78 from the 6 PIN. Both components are being overdrawn, but the 6 PIN is within tolerances.
Do you have any links saying this is the case? I haven't found any place suggesting this is simply a 'prioritization' issue, all the tests suggest the power draw is simply exceeding 150W and it's pulling from whatever it can.
The mainstream gamer. Honestly, you can make all the excuses you want, but there have already been reports of damaged motherboards.
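A quick sanity check of the numbers being argued over here, using the 90W-of-168W stress figure quoted from the review (both 75W limits are the nominal PCI-E slot and 6-pin specs):

```python
# Percentage over spec for each power source, from the quoted stress test.
total, slot = 168.0, 90.0
plug = total - slot                    # 78W left for the 6-pin
SLOT_SPEC, SIX_PIN_SPEC = 75.0, 75.0   # W

for name, watts, spec in (("slot", slot, SLOT_SPEC), ("6-pin", plug, SIX_PIN_SPEC)):
    over = (watts / spec - 1) * 100
    print(f"{name}: {watts:.0f}W vs {spec:.0f}W spec -> {over:+.0f}%")
# slot: +20%, 6-pin: +4% -- the plug copper has huge headroom (see the 18AWG
# maths later in the thread); the slot's edge pins are the weak link.
```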
GAVINLEWISHUKD
3 Jul 16#262
Because of the points I posted in post #248.
There was lots of not good/misleading info.
The only video that I would class as good at this moment in time is the Pcper one.
GAVINLEWISHUKD
3 Jul 16#261
Several?
The ones that looked confirmed are both 3 card systems. This is not a surprise at all. The ATX standard for the P4 four pin is 150w, and ATX only specifies for 2 high power PCI-e cards. There is still 75 watts available via the 20pin.
I suspect AMD will disable 3 way crossfire in next week's update, or down clock it so much in the 3 way profile that it won't be worth running 3 cards.
So the problem on the above is the motherboard manufacturers' issue. They could have included additional PCI-e power directly for the slots. They didn't, as it has never been needed before.
I do remember Asrock doing boards with a 4 pin molex to boost pci-e power.
I'm surprised AMD did not find this issue before launch, because even if they were in spec, adding the 130% powertune would have pushed them well over the top.
Thinking about it I wonder if this is one of the reasons why Nvidia have limited their cards to 2 way?
Nate1492
3 Jul 16#260
How was it a terrible video? You can't simply say "It's a terrible video" and just leave it at that.
The guy described the situation, found the reason, investigated, and provided a lot of good info.
Is it a bad video because it exposes a flaw?
tempt
3 Jul 16#259
AMD's disaster with the 480 hasn't escaped the attention of investors. Expect its share price to tank.
Several people have posted on reddit that the PCIe slot on their motherboards was damaged after a single day of using this card.
GAVINLEWISHUKD
3 Jul 16#257
All I was pointing out was it was a terrible video.
Yes I fully acknowledge that it uses more than the specification.
What I want to know is, is this causing issues with real people buying the card and using it in their system? My speculation is it is not.
Either way AMD will sort this issue by tweaking the performance. The problem is 99% of users will have no idea if AMD have sorted it, apart from a few sites' tests. Some chips out there today will happily undervolt to get them into spec without affecting performance. Others probably won't. AMD just need to find a sweet spot between the two. Presuming they can't change the power ratio, but that would be the best option if they could: take 15% off the available power over the slot and add it back over the 6pin.
Did AMD all along intend 75w to be taken to the GPU from the 6pin and 35w from the slot (a normal sort of level), making this 110w that keeps getting bounced about? This would leave 40w for the fan and, importantly, the memory; but the memory at that clock, when fully utilised, draws more than on the original samples supplied (Samsung supplied top binned chips?). Over 8 chips it would only need 15% over-use and suddenly you're over 80w total from the slot.
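That power-budget hypothesis is easy to put into numbers. A sketch in which the 75w/35w/40w split is the commenter's speculation, not an AMD figure:

```python
# Speculative power budget from the comment above.
gpu_from_6pin  = 75.0  # W, intended GPU draw via the 6-pin
gpu_from_slot  = 35.0  # W, "normal sort of level" via the slot
fan_mem_budget = 40.0  # W, left on the slot for fan + 8 GDDR5 chips

planned = gpu_from_slot + fan_mem_budget         # exactly the 75W slot spec
actual  = gpu_from_slot + fan_mem_budget * 1.15  # memory/fan 15% over budget
print(f"planned slot draw: {planned:.0f}W")      # 75W
print(f"with 15% over-use: {actual:.0f}W")       # 81W -- 'over 80w total from the slot'
```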
rev6
3 Jul 16#256
It's also faster memory in the 8GB variant.
nedford
3 Jul 16#255
thank God it's not made by Mac... it'd be more like £245 extra for 4GB more RAM.
nedford
3 Jul 16#254
they do, but this gfx card is newer and way more powerful than the processor in ps4 / xbox1
Nate1492
3 Jul 16#253
At no point does it need to be suggested that the PCIe or the 6 pin pulls more.
The issue is entirely within the fact that the PCIe lane is pulling 80-82W on average.
I pointed out a video of a reviewer finding problems.
As he said, his highly overclockable motherboard (which costs in excess of 400 quid) didn't black screen.
You are making an assumption that his test rig wouldn't support a GPU if it was under the 750w Limit.
Did you miss out on the fact he plugged the 980 TI into the same motherboard and had zero problems?
tempt
2 Jul 16#252
rev6
2 Jul 16#251
The conclusion was that NVIDIA exceeds the power specification all the time, and that the 480 just isn't powerful enough to make use of all the VRAM.
gupsterg
2 Jul 16#250
PowerLimit is part of PowerTune and contained in the PowerPlay of the ROM, and deals with the GPU only; it does not take into account RAM, other circuitry and electrical elements like losses, etc. It basically gives the GPU parameters:-
i) cooler can dissipate (x) Watts (TDP).
ii) VRM in context of thermal limit can provide (x) Amps (TDC).
iii) connectors on PCB can provide x Watts (MPDL). MPDL has no value in ROM to separate PCI-E slot & plugs.
Be aware there is some hidden "magic" in the OS driver. For example on Fiji, if I do an OC of GPU: 1145 HBM: 545, use (x) PL in ROM and use driver defaults, 3DM FS will stick to 1145MHz. Then if I run Heaven or Valley the card will drop some clocks; this is not due to it throttling on PL/temps but how the PowerTune algorithm senses a moment of low GPU usage/no display and drops clocks to save power/lower temps, etc. But if I switch off a feature in the drivers, "Power Efficiency", then the card will stick to 1145MHz, as the PowerTune algorithm has been relaxed. Perhaps this PowerTune algorithm is what they will modify, which still IMO would mean higher PCI-E slot power draw than past AMD cards.
So PowerLimit via ROM sets up the GPU and does not deal with other circuitry. The IR3567B is controlling phases but can't differentiate between PCI-E slot/plugs, and the mosfets take their power from the source on the PCB. Now if there is a controller on the PCB which deals with PCI-E slot/plugs and has a data interface (I2C, SMBus, PMBus) then they can modify this via driver and/or ROM.
On Hawaii there was great PCB variation; for example there was the Vapor-X 290X with like 10 phases on the rear and 2 up front, and when you compared ROMs between the ref PCB (6 rear, 1 front) and that, the ROMs only differed on PowerPlay and VoltageObjectInfo, which programs the IR3567B. I never found anything dealing with PCI-E slot/plugs in a Hawaii/Grenada/Fiji ROM (been involved over a year now in bios mod), but I don't have all the info.
Now the RX 480 is a PowerPlay 7 card (downloaded a copy of the ROM from TPU), like Tonga and Fiji, so it shares the tonga_pptable.h part of the Linux driver. These are the values in PowerTune of the RX 480:-
Just if anyone is wondering I'm not an nVidia card owner, I have not had nVidia for 6yrs. I have owned Cypress, Hawaii and Fiji.
GAVINLEWISHUKD
2 Jul 16#249
It just seems strange that AMD would include this in this little non-detailed statement, as it has no relevance. My reading was that AMD had set its power target for the GPU on the understanding of what the RAM should be consuming, but actually it consumes more than expected.
Maybe I read too far into the hidden meaning.:laughing:
As for the power split, I guess we will know soon enough if they are controllable values, as the divide will be evident and easy for AMD to sort. Most sites/people don't seem to have a clue. Some think it is possible and I think Hardware.FR and myself are the only ones that think not!
GAVINLEWISHUKD
2 Jul 16#248
But the link you provided is poor too. From a YouTube channel that calls itself "scientific studio", there is no science!
At no point in any of the info we have seen (tomshardware/PcPer) has there been a suggestion that the PCI-e lanes have supplied more power than the 6pin.
Also suggesting an aftermarket card with one or even two 8 pins will solve the problem is a wild suggestion, as it seems at this point it is maximising the PCI-e lanes first. So while it will give you more overall headroom, the issue will still be present.
Besides, who is going to be using a system from 6 years ago with an RX480!? It was a low rent mobo with only one P4 and a 95w CPU overclocking to probably about 125w, plus a power hungry chipset. It would probably struggle even if the GPU was under the 75w limit.
gupsterg
2 Jul 16#247
Not at all blaming Samsung IMO.
All they're pointing out is that besides tuning the GPU to maximise performance they used very high speed RAM to gain performance. RAM tends to use a lot less power in comparison to the GPU, even if high speed/8GB. IIRC from an article, max 40W.
From what I understand the high side mosfet (12V) and low side mosfet (GPU, etc) are controlled by the IR3567B (PWM), see Sin's VRM guide.
For example for GPU VDDC the GPU commands the IR3567B to provide x VDDC and the IR3567B does its "magic" with the mosfets. The phases are independent (relatively speaking), so let's say phase 1 of loop 1 gets knocked out due to a fault on a mosfet: the others carry on working, but it will put extra strain on them and perhaps cause failure later. A member in the Fiji bios mod thread made an error on a capacitor mod on a phase; that phase got wrecked but the rest worked, though the card died later. Another member did lots of experiments with a Hawaii card and deemed the IR3567B a very intelligent voltage control chip.
The IR3567B full datasheet is covered by NDA, so we only have access to a 2 page PDF:disappointed: . But IR3565B is very very similar chip and IR3567B has same features :smiley: , 59 page data sheet. IR3565B (48 pin) is dual output 4+2 phase and IR3567B (56 pin) is dual output 6+2. The increased 8 pin count on IR3567B is due to the 2 extra phases on loop 1, 4 pins make up current sense input/return, 2 pins PWM signal, 2 NC .
My view is in current state RX 480 is a "balls up":disappointed: .
The updated PC Perspective article is the best testing IMO and explains Tom's Hardware Guide's power review sections as well. I collated THG data on the 390X / Nano / Fury X / RX 480, and viewing the average power usage, the RX 480's PCI-E slot draw is high; due to the size of the image, here is a direct link. Even the ref PCB 295X2, which basically used 2x 8 pin PCI-E connectors to their hardware limit, didn't draw a lot from the PCI-E slot, THG review.
The hardware spec for PCI-E plugs is pretty high: most PSUs have a 6+2 pin PCI-E plug, so technically the 6 pin has 3x 12V lines; as they are 18AWG wires, let's say ~8A capacity each, which results in 288W. Here is a direct link to the data I collated. The PCI-E slot is the weaker link and has 5x 12V pins of only 1.1A each; this is in the comments section of the PC Perspective article, by their team member.
We will have to see what occurs with new driver, currently I think IR3567B can not differentiate power from PCI-E slot/plugs. I think other PCB design aspects limit PCI-E slot usage, if AMD change aspects of voltage/PL to GPU via driver/bios to reduce power usage it may still mean a lot of power is still used from PCI-E slot when compared with past cards.
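The connector arithmetic in that post works out as follows (the ~8A-per-18AWG-wire figure is the post's own rough estimate; real ampacity depends on temperature rating and connector quality):

```python
# Raw 12V capacity of the two power sources, per the figures above.
V = 12.0
six_pin_w = 3 * 8.0 * V  # 3x 12V wires at ~8A each -> 288W of copper capacity
slot_w    = 5 * 1.1 * V  # 5x 12V slot pins rated 1.1A each -> 66W

print(f"6-pin plug hardware capacity: ~{six_pin_w:.0f}W (spec only asks 75W of it)")
print(f"PCI-E slot 12V pin capacity:  ~{slot_w:.0f}W (the 12V share of the 75W slot limit)")
# The asymmetry is the point: the plug has enormous physical headroom,
# the slot's edge-connector pins have almost none.
```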
striker33
2 Jul 16#246
I'd still rather spend the extra for the 1060 when it arrives. Nvidia have nailed the low temp/high efficiency game, and while AMD have gotten a lot better in regards to heat and noise, they still can't come close. Also take into account the fact that AMD are slower with drivers and generally have more issues in that regard.
As always though, AMD will be king when it comes to purely budget minded consumers. No question about that.
Then again if Nvidia drop SLI for the 1060 as rumoured, then the RX 480 will get a good following regardless.
Nate1492
2 Jul 16#245
I actually think this is a terrible video.
He's trying to wash the problem away. As he said, this isn't displaying the same issues other cards exhibited.
AdoredTV is pretty in favor of AMD, take a look at his 480 review.
His conclusion was pretty ugly, if you broke it down.
You can't use that 8GB without exceeding the motherboard spec at stock rates.
So, NVIDIA didn't get a pass for 3.5GB of actual VRAM; what about this 8GB of VRAM that you can't actually use?
So it is peaking at 12.3% over PCIe specs and 10.3% over ATX specs.
So at stock I can't see this being an issue. So for the average consumer buying one and loading their game and playing, all should be fine.
Note for the enthusiast it's probably a good idea not to increase the power control up (for now until we get more info from AMD).
So if you are to buy the card or not do so on the basis of stock benchmarks.
If these numbers are good (they are) for the price you pay, then grab one and enjoy your games.
If you intend buying for overclocking then don't bother for now.
As for 'Can AMD fix this' I'm still looking for info on this. Yes I'm pretty sure they can solve the issue by limiting the VRM control to use less power which in turn will cause less power draw. But in turn that will affect the raw performance. I'm not convinced from a hardware level (for this batch of cards) there will be a fix.
Now my view on what could possibly happen (I'm not saying it will, it's just my opinion; before any fanboys start, Nvidia would probably do the same).
Say today your game gives you 60fps and uses 80w on the 12v PCIe slot.
AMD optimise the drivers (as they do) and normally you would see an increase to say 64fps (still using 80w).
Now they will be able to dial back the power to within spec and give you back your 60fps.
So on the face of it the game used too much power to give you 60fps and now it doesn't use too much and gives you 60fps.
AMD fixed it right!?
No they solved it but didn't fix it.
So if driver optimisation doesn't seem to be happening in the coming weeks but the power issue goes away (at stock anyway which is all AMD have to do) then you will know why. :smiley:
If anybody finds any useful info like a PCB layout diagram please send me a PM (more eyes and all)
But please don't post me to links for forums or reddit posts like somebody did last night. Thanks.
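The "solved but not fixed" scenario above reduces to simple proportions. A sketch using the commenter's hypothetical 60fps/80w example (illustrative figures only):

```python
# If a driver optimisation delivers 64fps at the same 80W slot draw,
# the card gets more efficient: fewer watts per frame.
slot_watts = 80.0
fps_now, fps_optimised = 60.0, 64.0

watts_per_fps = slot_watts / fps_optimised   # 1.25 W/fps after the optimisation
slot_watts_dialed_back = fps_now * watts_per_fps
print(f"{slot_watts_dialed_back:.1f}W at {fps_now:.0f}fps")  # 75.0W -- back in spec, same fps as before
# i.e. the optimisation headroom is spent on compliance instead of extra frames.
```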
Everyone hold your horses. Surely this will only take a few days to figure out, but I wouldn't risk my entire system. Besides, the prices should be lower next week!
Nate1492
2 Jul 16#237
I don't think I can take that response with anything more than simply labeling it corporate BS.
They responded... Sure, but it was basically acknowledging the fact there are issues. They don't know why. And they are "Looking into it."
To me, this is exactly what I'd want to avoid in a 'cheap' graphics card. It might nuke my other hardware... Yeah, I'll wait til it is fixed.
a3lawy
2 Jul 16#236
Not much of a response, typical corporate damage control. Every product that has been in stores has passed compliance testing; that doesn't mean the retail units have. Even exploding Takata air bags passed.
I haven't said I showed serious voltage concerns. I've linked to articles saying there are serious voltage concerns.
You can keep on with the "legitimate advice" crap you are going on with, but the truth is, this is a big problem that will absolutely result in reduced hardware life.
There are only a handful of reviews so far. Generally, this type of damage manifests itself in reduced life expectancy of capacitors. You'll just notice one day half of your capacitors have popped and you won't quite know why.
GAVINLEWISHUKD
2 Jul 16#233
The worrying thing is how much power the card (Nvidia) that hit 225w would have gone up to if it hadn't hit the 225w limit of the 24pin! :confused:
Next is who is feeling brave enough to try and power the entire card off the 24pin! :laughing:
Someone posted their XFX card on PCPartPicker; it seems to have a backplate, but that picture is from an 8GB variant.
rev6
1 Jul 16#229
The 192-bit memory bus and low shader count worry me. The 1060 will most likely be the best it can be on release and then eventually fade away, whereas the 480 could get better over time like most GCN cards. Hopefully the 480 overclocks well with custom PCBs.
soza
1 Jul 16#228
I think I'll wait to see the benchmarks and price of the GTX 1060. That plus I want to see what the 3rd party RX 480's are like. I know that Sapphire's got the RX 480 Nitro with an 8-pin PCIe power connector that'll sell for around £249.
I wouldn't buy this card if you've got an old PC or want to run two in CrossFire.
Since it will probably take 2 weeks, I think I am going to get the normal version and then see the reviews of the aftermarket versions. If the aftermarket is worth it then I will return the RX 480 and get the aftermarket card. Do you think this is a good decision?
The_Hoff
1 Jul 16#226
It's pre-order, when do they take payment?
May order one and cancel if something better arrives.
Tim1292
1 Jul 16#225
Wait for the aftermarket version; it will always be better than a reference card.
rev6
1 Jul 16#224
rev6
1 Jul 16#223
Have a look at the full size image. The power connector is on the PCB, not what I highlighted.
Fabmaur
1 Jul 16#222
It says it'll be here on the 31st of July! I don't want to wait that long since I only have an Intel graphics chip. The 8GB version is a lot more expensive, and the aftermarket versions will be out soon and cost more too. What should I do?
mikeespain
1 Jul 16#221
I believe that's the PCB, backplate or not doesn't really matter though tbf
JS94
1 Jul 16#220
Hopefully custom PCBs from other manufacturers will fix this power draw problem.
Really don't want to risk frying my MoBo!
PhilK
1 Jul 16#219
I'm RIGHT :laughing:
rev6
1 Jul 16#218
You're boring.
PhilK
1 Jul 16#217
No, it's when someone doesn't see the glaringly obvious and then tries to defend WHY he didn't. :stuck_out_tongue: (That means a mocking tongue out, by the way.) :wink: (That one means don't take it so seriously, though, at the same time)
rev6
1 Jul 16#216
But it's like only just getting a joke told to you 20 years ago and laughing hysterically about it. Like you only just realised a GPU is mounted upside down, so the label is printed as if it's not.
Let's just both laugh at it and call it a day :laughing:
PhilK
1 Jul 16#215
With the jokey, not-to-be-taken-seriously comment, it has everything to do with it!
rev6
1 Jul 16#214
All GPU's are the same. Where have you been hiding?
The smileys have nothing to do with it :stuck_out_tongue:
PhilK
1 Jul 16#213
No, different smileys mean different things. For example, laughing smileys most often suggest not to be taken seriously, or taking the P..... A clue there I think.
GAVINLEWISHUKD
1 Jul 16#212
Sorry (being old and all) yes I'm fully aware of the difference between BIOS and what is called software drivers today. In past times the BIOS updates were commonly referred to as driver updates (there was much less control back then, most things were fixed function) and software updates which you now call drivers. Strange world.
As for B+C one of us will be correct and only time will tell. :smiley:
GAVINLEWISHUKD
1 Jul 16#211
So are you saying the input voltage regulator chip no longer exists and both the input and output are handled by the IR3567B? If so it must be fused between each VRM, so any power failure pretty much makes your card a doorstop. :disappointed:
If so I would probably be more worried about killing the card than the motherboard. Lol
Pretty interested to know how the power input side is now handled. If you have any links it will make some good bedtime reading. :smiley:
Apologies to anybody as it seems things may have moved on and if the above is correct it may be possible to sort the issue with a BIOS update.
I'm happy to stand by that I don't consider there being any risk to the motherboard.
rev6
1 Jul 16#210
That's a lot of nothing :smile:
rev6
1 Jul 16#209
I thought they're all like that :smiley:
Rhythmeister
1 Jul 16#208
There's nothing else coming from either camp for a good few months yet bar the GTX 1060 and the RX 490 probably!
PhilK
1 Jul 16#207
The....er.....laughing smileys didn't register with you then?
Deetea
1 Jul 16#206
31st July? Wtf. I expect between now and then I'll find something else I fancy.
ReadySetGoGo
1 Jul 16#205
I looked at the graph on that link.
It says 'much faster than a 480', but interpret the chart properly and it's only 15% faster, and they want to charge 50% more for it.
oops
You say you've shown serious voltage concerns, yet there isn't a single verified story showing any actual damage from these PCIe slots using more power. It's amazing the stark contrast between the technical "knowledge" of people posting on here and a decent computer forum such as overclock.net. If you want real, legitimate advice go look on OCN at this "issue"; I'm sure you'll get some sound advice.
gupsterg
1 Jul 16#199
As I've never owned Tahiti I've never searched for pre-made ROMs. You can use VBE7 to modify your stock ROM. I'm not sure if VBE7 is giving access to all PowerLimit values in ROM either, you see on Hawaii/Fiji there is TDP/TDC/MPDL (view OP of Hawaii/Fiji bios mod threads on OCN PowerLimit section).
PowerLimit is part of PowerTune; the PowerTune table is contained within the PowerPlay data table of the ROM. We can create a tables list of a ROM in two ways:-
i) In 2009 a developer working on the Linux driver for AMD cards reverse engineered a program (AtomDis) to parse AtomBios (AMD ROM). The program can only be used in Linux (can be used in a VM as well).
ii) An OCN member created a Windows program (AtomBiosReader), which can also create a tables list for a ROM but not parse the data within the tables.
Once you are in the right section of the ROM using a hex editor, you will need to assess which hex value does what. This is made easy by information in the Linux driver. For example to translate PowerPlay on Tahiti you'd use pptable.h.
Be aware that when you modify PowerLimit in the ROM you will still see 0% in the OS, as the new values in the ROM have become the starting point for any change to PL via the OS. Be also aware that PL in the ROM is limiting GPU "power", not the total board electronics, so RAM, fan, etc do not form part of it. Your actual power draw stats will be higher if you can see monitoring data for the VRM in HWiNFO or use measuring equipment.
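For anyone following along, here is roughly what that workflow looks like once you know where a table lives. A minimal sketch only: the offsets below are illustrative placeholders, not real RX 480 ROM offsets; the real table locations come from AtomBiosReader/AtomDis and the field layout from the matching pptable.h, as described above:

```python
# Read one 16-bit field out of an AtomBios dump, hex-editor style.
import struct

def read_u16(rom: bytes, offset: int) -> int:
    """Little-endian 16-bit read, as AtomBios tables store most fields."""
    return struct.unpack_from("<H", rom, offset)[0]

with open("rx480_stock.rom", "rb") as f:  # hypothetical dump filename
    rom = f.read()

POWERPLAY_TABLE = 0xE000  # placeholder: take the real offset from the tables list
TDP_FIELD       = 0x1C    # placeholder: take the real layout from pptable.h
print(f"PowerLimit/TDP field: {read_u16(rom, POWERPLAY_TABLE + TDP_FIELD)}")
# Remember: values edited here become the new 0% baseline for the OS slider,
# and they cap GPU power only -- RAM/fan draw sits outside this limit.
```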
rev6
1 Jul 16#198
The GPU sits in the PC upside down. What's your point?
Aradria
1 Jul 16#197
A) You don't seem to know the difference between drivers and BIOS.
B) The non-reference cards should have extra power sockets.
C) There is danger of damaging your motherboard, especially if it's a lower-end model.
Coulomb_Barrier
1 Jul 16#196
Depends on price and the stock that Nvidia can bring of their as-yet-to-be-announced 1060 (may release as early as July 14). It will almost certainly be faster than a 480 but land around the £300 mark, so significantly more expensive. Whilst it seems AMD has ample stock of their 480s flying off shelves, Nvidia seem to be only able to manufacture limited quantities of their high-end 1070/80. If the 1060 'launch' is another soft launch and retailer price-gouging hell due to limited stock again, then that throws AMD a massive bone with their 480.
The_Hoff
1 Jul 16#195
I also need a new 144hz monitor and can't afford a decent gsync. I can grab a 24" freesync one for under £200, so expect I'll be AMD for a while.
Any decent 290 vs 480 benchmarks or direct comparisons anybody has links for? Couldn't find anything conclusive.
The_Hoff
1 Jul 16#194
Interesting, though after the 1080/980 Ti marketing they pulled, I'll believe the benchmarks when I see them independently.
Tempted to wait too for the 1060 and 3rd party cards, but I'm not really an OCer and will prob see little benefit.
Price wise, I can't see AMD reductions if Nvidia aren't directly competing with their DX12 cards.
belsibub
1 Jul 16#193
1060 claiming better performance at a bigger price, meh.
I sold my 290 about a month or so ago with the same thing in mind. Thankfully I could almost do a side-grade into the RX 480 at a cost of around £20, but the reference cooler is putting me off. If I had not yet sold my 290, I would more than likely not bother, to be honest; in fact I am considering purchasing a cheap 290, as I'm quite happily surprised at how well my R9 285 is coping with my gaming needs. I have seen 290s sell as cheap as £115 and would seriously consider another in the short term.... But the 1060 is coming next week too, which will muddy the waters further, but with only 3GB and 6GB memory options.
The_Hoff
1 Jul 16#191
I've got a 4GB R9 290, can't decide whether to buy it or not.
Mainly playing Arma and CSGO at the moment but expect BF1 will be a big game for me. Also doing lots of video editing.
I could probably sell the 290 for £130ish(?) and so £40 upgrade seems reasonable, if only for the power consumption reduction.
Thoughts?
zacy5000
1 Jul 16#190
I'm going to wait till it's £160 like I originally thought.
Rhythmeister
1 Jul 16#189
I don't suppose there are any Tahiti-based BIOSes to increase GPU power a bit, are there?
Rhythmeister
1 Jul 16#188
I've emailed Amazon to ask if they'll price match Ebuyer so the rest of the UK can have this for this price :man: Fingers crossed they want to play nice, eh?
gupsterg
1 Jul 16#187
IR3567B (the VRM/voltage control chip) can be controlled by ROM/SW/Driver.
For example when we use MSI AB on a Hawaii/Grenada card to change VDDC offset (GPU core voltage) it modifies IR3567B loop 1 via i2c communication. VDDCI (Aux voltage) is loop 2 on Hawaii/Grenada. Also via MSI AB command line you can modify registers within IR3567B via i2c.
On Fiji "SMC" (on die regulator) is "messaged" via MSI AB "talking" to driver which then modifies IR3567B loop 1 for VDDC via i2c communication. You are not supposed to communicate directly to IR3567B via i2c on Fiji but through driver "messaging" as the "SMC" is communicating with IR3567B virtually all the time, this information was shared by a) Unwinder author of MSI AB on Guru3D b) can be seen when we take i2c dumps on Fiji which take ages compared with Hawaii/Grenada, as "SMC" has priority c) part of Linux driver which has "messaging" parameters for "SMC" and one of those is "releasing i2c bus". On Fiji loop 2 is MVDDC to HBM.
I have been heavily into Hawaii/Fiji bios mod; threads are on OCN. Via firmware you can set the voltage offset for loop 1 & 2 / fSW loop 1 & 2 independently, also OCP, OVP, VMAX. The IR3567B is a very advanced voltage control chip IMO; most of what I/we learned was from a) studying Hawaii/Grenada ROMs b) testing modifications c) shares by a pro overclocker who has been involved with AMD for a lengthy period.
I'm currently doing some bios mods for an OCN member for RX 480 :wink: .
PhilK
1 Jul 16#186
But it's printed upside down !!!! :confused::disappointed: :laughing::laughing::laughing::laughing:
GAVINLEWISHUKD
1 Jul 16#185
A) Pretty sure a driver update won't solve this (apart from removing/limiting power bar in Wattman). There is no software control of the voltage regulator.
B) I don't expect to see cards with 8 pin or it will eat into 490 sales later in the year. They are limiting performance by power draw.
If true it turns out you can reduce power consumption by 20-30% while increasing performance. :-)
A little tinkering in the Watt Manager and it becomes even more efficient. Bonus.
My purchase is even more satisfying now. Thank you AMD for this GPU, I don't have much cash but now I get to play PC games with great performance.
I am excited to receive it. I want to try and save for a Freesync monitor now in time for christmas.
It's outside the upper tolerance limits for motherboard power draw, which makes it dangerous. Of course a BIOS update should fix it, or waiting a couple of weeks for a card with adequate power sockets. It's not the end of the world but it's not "fine"
GAVINLEWISHUKD
1 Jul 16#182
There is no scandal. The specification for the 4gb version was always lower.
If the rumours are true that the stock 4GB cards for now have 8GB on them, lowering the clock was the best move both for AMD (for the performance delta) and for Samsung to clear lower binned chips.
Partner cards will probably use higher clocked DRAM but will be 8*512MB, so true 4GB cards.
But there certainly is no scandal. :smiley:
CampGareth
1 Jul 16#181
Remember that pretty much every connector in a computer has limits that are on the generous side, though I'm having a hard time finding hard information. I do remember things like the pins of a 4 pin peripheral power cable being rated for 8A each, since that's the most they can draw without going above 50 degrees C. It'd take over 100C to melt the plastic, and that rating is met by even the shonkiest of cheap power supplies. I'm not sure what a proper gold plated pin would be capable of, but I'm willing to bet you could run far more current through it without it hitting 50C, and ridiculous amounts before it melts the plastic.
chapchap
1 Jul 16#180
So to sum up - AMD fail?! If so, when Nvidia release the 1060, will it be another nail in the coffin for AMD in the GPU market?
Sir_Didymus
1 Jul 16#179
It's a shame that this GPU is causing issues for older AM2 mobos, drawing too much power from the mobo as opposed to taking it from the PSU, otherwise I would have bought this right now to put together a cheap gaming rig. Easy fix for another manufacturer though: all they really have to do is swap the 6 pin PCIe socket for maybe an 8 pin or greater, and the power should by default be drawn more from the PSU. No doubt someone like Sapphire will release a far better version of the GPU with a better cooler and all the mobo power leeching issues solved. Can't be beaten currently when it comes to bang for buck.
Coulomb_Barrier
1 Jul 16#178
Stop spreading misinformation. The issue is the card drawing 10W+ extra over the rated limit for the PCIE slot. This was found in two review samples. At stock clocks this isn't an issue anyway, and then it's isolated to a few samples out of countless. If you are not overclocking, I would not worry and I say that as someone who has been building PCs for 10 years. AMD will release a driver fix soon as well you'd imagine.
Aradria
1 Jul 16#177
Seems like the second scandal with these cards is the 4GB ones have lower memory clock and bandwidth than the 8GB ones, unless you use a BIOS hack. So even at 1080p the 4GB is a fair bit slower by default.
Remember none of the complaining reviewers have checked if the cards are pulling more power than the mboard negotiated at power up. The electromechanical claim also appears to be BS: there is a 75W hard limit on the 6pin connector due to pin size, but if the slot is allowed to negotiate up to 300W spread across 16 power pins, pin size can't be an issue or the standard would be nonsensical.
When someone tests on a minimum spec mboard and it pulls too much power, that will be a story. Most likely it will just fail to boot and no damage will be done.
Just total BS created by a site that let Nvidia pass on the same observation, eagerly regurgitated by biased fanbois.
Nate1492
1 Jul 16#175
How do you know this? You are just making unsupported claims.
We've shown you serious voltage concerns.
Do not underestimate the power of too much electricity and a computer.
These are stock card issues, not people overvolting.
Rhythmeister
1 Jul 16#174
Now if only it were this price for the UK and not merely GB :disappointed:
The RX 480 performs slightly better than said Nvidia card and the performance of the new ATi card will only increase as drivers mature, it'll be the usual scenario I suspect :sunglasses:
solves that issue; 75W is the start-up default. Still a faux pas by AMD, but not a catastrophe as suggested
dreamager
1 Jul 16#170
Ok, so my ageing mobo is only PCI-e 2.0. Are these things backwards compatible until I upgrade the rest of my rig, or have I got to do a complete overhaul?
gupsterg
1 Jul 16#169
Yes, from several reviews I've read. This is a "sideways upgrade" if you have a 290/X or 390/X. If you have a lower grade card than those two series then it's a performance upgrade. IMO even for the 285/380/380X it could be questionable to buy it.
MadonnaProject
1 Jul 16#168
Is it true though that if one owns a 390 this is slightly less powerful, the advantage however being that it's cheaper?
adv
1 Jul 16#167
stonkingly good value for money... If only I didn't need CUDA.
StrifeyWolf
1 Jul 16#166
No you are right. I think he is talking about the GPU power in this specific model and how it will be in future models not current gen.
You could do a little googling, then you can be clued in, like your friend :wink: .
bobo53
1 Jul 16#165
could you please show us some qualifications supporting what you are saying??
vmistery
1 Jul 16#164
This deal has way too much heat, unless people are referring to the average temps this is giving off :smile:
bbfb123
1 Jul 16#163
Please...this graphics card isn't gonna blow your motherboard up. The only people that might run into trouble are those who are running big over voltage on the graphics card, and to be doing that you should be using a high quality motherboard and power supply anyway. If you're running a cheap £50 motherboard and power supply and try running a big overvolt on the graphics card then that's that person's own stupid fault if it goes wrong.
bbfb123
1 Jul 16#162
So use HDMI then? :smirk:
praevalens
1 Jul 16#161
How long does it typically take for Raptr to have recommended game settings for a new card?
Elevation
1 Jul 16#160
Don't worry, you'll still pay more today than you will the day after tomorrow.
Agharta
1 Jul 16#159
Memory pricing doesn't work like that and the chips aren't 8GB but 8x 8Gb.
Rid1
1 Jul 16#158
Wait a while, most GPUs have hiccups upon launch, wait for better drivers and updates that won't kill your system!
Harryisme
1 Jul 16#157
Except each card likely has 8x 1GB chips, not 1x 8GB chip. I don't think they sell 8GB GDDR5 chips.
AMD leaving 4GB of RAM disabled on the card is confusing; I'd like to see the card opened up to see what's going on there. It would make more sense to just sell all cards at $200 with 8GB; they would literally clear the entire stock in a day if they did that.
That said, after seeing the posts about the card drawing more power from the PCI Express slot than permitted, I think I am going to wait for the custom PCBs before buying. Why AMD didn't just go for an 8-pin connector I don't know; it would have made more sense. Even if the 6-pin was enough, the 8-pin could have allowed third parties to fit a beefier cooler and push the card even further.
tiptop33
1 Jul 16#156
stream0
1 Jul 16#155
Do not buy this card at the moment. There have been reported issues of higher power draw than the stated amount, plus high temperature and noise issues.
A potential downside of the high power draw is that it can damage your motherboard. To control this maybe a BIOS/card update is needed, but it's best to wait it out for now.
If you're lucky to receive certain early reference cards, you *might* be able to find a BIOS unlock for that extra RAM (don't expect any performance improvements tho, especially if it's the slower 7GHz type)...
Bigfoot600
1 Jul 16#153
Oh well brand new station then if that happens
Guys actually think very carefully about this card. It is reportedly pulling nearly double the amount of power through the PCI-e slot than it should be. This can damage your motherboard.
My advice would be to wait and see what AMD say or do about this, and any stories of fried boards.
Bigfoot600
1 Jul 16#152
I've had some issues with 4K frame rates on my R9 280 and I was in need of a new GPU at the right price. It would have been silly to buy an R9 390 at extra cost, so that's my reason for buying a new one.
thegamingkinginfo
1 Jul 16#151
Depends on what games you play and what resolution you play at, really.
adammillett123
1 Jul 16#150
Does anyone recommend upgrading my r9 280 to this? Is the difference significant for my gaming build or not worth it? Thanks
JS94
1 Jul 16#149
Wow.
Guys actually think very carefully about this card. It is reportedly pulling nearly double the amount of power through the PCI-e slot than it should be. This can damage your motherboard.
My advice would be to wait and see what AMD say or do about this, and any stories of fried boards.
ando
1 Jul 16#148
Suppose so, I never thought of that, thanks
a3lawy
1 Jul 162#147
Buying 10,000,000 8GB chips could be cheaper than 5,000,000 4GB + 5,000,000 8GBs.
ando
1 Jul 16#146
I'm confused, did AMD leave the other 4GB there and just disable it?
I can't see that happening. I could be wrong as I don't know too much; I've read about cross-flashes and CPUs coming with disabled cores, but I didn't know manufacturers would just include extra memory on a graphics card and disable it. Surely they're shooting themselves in the foot, no?
rev6
1 Jul 16#145
1070. Avoid multi-GPU setups.
a3lawy
30 Jun 16#144
There IS a difference in product binning; usually Asus/Gigabyte get better ASIC chips than lower-tier brands like PowerColor, although it's debatable what real-world difference can be noticed.
The major difference will be the warranty period and process.
I would avoid the RX 480 till they sort out the power issues; I've read a lot of complaints.
treacle13
30 Jun 161#143
I want to switch to AMD for better video playback. Not impressed with this card.
Anonknowmouse
30 Jun 16#142
As you said, they're all basically the same in different boxes, so I'd just buy the cheapest one you can find. From what I've read the BIOS is locked down tight on these. I'd only pay more for the 8GB version if you plan on running it at 4K, as VRAM only affects the maximum texture quality you can run without stu...tter..ing.
Bigfoot600
30 Jun 16#141
Bought mine from Novatech last night, an RX 480 8GB boxed by PowerColor. Does it matter whether it's PowerColor, Gigabyte, Asus or XFX? Prices going up in £5 steps.
The 970 is a decent card and decent for anything but 4K/60fps. Nvidia unfortunately have supply issues so prices are about 20% higher than they should be for the 1070/1080. Wait a couple of months until they drop and buy then. Plus we should have details on the 1060 soon.
kye1987
30 Jun 16#138
Where are these new 970s for this price?? A quick Google comes up with £220+ cards..
Anonknowmouse
30 Jun 16#137
The RX480 is disappointing after the hype and rumours. Identical performance to a last generation GTX 970 for a tiny bit less cash. I'm sure the RX470 will be slightly better value per frame for slightly less performance overall, so I'd wait for that.
treacle13
30 Jun 16#136
all the 390x's have cleared off ebay. stuck with a 970 for another year. Ridiculous having a budget of £300 and still unable to purchase anything decent.
TesseractOrion
30 Jun 16#135
PcPer have spotted '490' listed along with all their other model numbers on AMD's site, so presume there will be one at some point; don't know yet whether it'll be Vega or Polaris (dual?) based...
belsibub
30 Jun 16#134
Low power card? Putting overclocking back in the hands of the user?
So take the heatsink off, clean the GPU with isopropyl alcohol and apply a decent TIM for lower temps and slower fan speed, making the card run quieter. 166W is only a huge power draw if your PSU is a piece of crap :confused: Why would anyone buy a previous-generation GTX 970 when they can have something current from the company manufacturing all the CPU and GPU hardware in this generation's consoles? I must admit, I considered one myself until I discovered the GTX 1060 shall be released in July, and I truly hope it can compete with the RX 480, which is the best value-for-money card around in my opinion despite its limited overclock.
gowingnator
30 Jun 16#132
If I was to get a display port to DVI-D adapter, would I still be getting 144Hz?
GAVINLEWISHUKD
30 Jun 161#131
While I'm not going to wade in on the 480 because a) I don't have a review sample card, nor am I likely to get one (I'll try and borrow one in a few weeks, depending on AMD not asking for them back*), and b) there are no voltage testing results (as far as I can find).
But what I will say is you have zero chance of 'frying' your motherboard. You may get stability issues (unlikely), but it's only fair to point that out.
The reason for this is that once the current gets to its highest available load the voltage will drop off. The voltage regulator on the GPU will detect the lower voltage figure and pull more from the 6-pin. This will continue until it either hits the maximum specification set by AMD or the voltage drops on the 6-pin supply.
So the most probable reason everybody is hitting the same 82W-ish is because the voltage has dropped and the power request stops.
So I'm hoping to see some more testing from the review sites in the coming days.
*If AMD (you have my details) or another card partner want to send me a review sample, let me know.
Rhythmeister
30 Jun 16#130
Crossfire AND SLI suck the donkey danglies, it's the nature of the game engines. I can't believe people are STILL considering the GTX 970 over the RX 480, that's just silly; W1zzard is the man :man:
fishmaster
30 Jun 163#129
A single card is the best solution, that's the answer. The whole 2X RX480 versus GTX1070/80 is silly.
thegamingkinginfo
30 Jun 162#128
If you're not giving me an answer to my question then shut up. I want to know people's opinions on CFX for the RX 480 or a single 1070. If you are not here to provide me with an answer, then don't answer. They are similar, but CFX has some compatibility issues and some problems overall, whereas the single GPU avoids all of that.
Nate1492
30 Jun 16#127
The problem is that the 970 has already been price adjusted to be much closer to the 480.
There is an NVIDIA 970 selling for 180 (after rebate, base 200).
The conclusion is really strong. Don't buy double 480, it doesn't compare.
gowingnator
30 Jun 16#125
Shame no DVI, got monitors with HDMI and DVI
fishmaster
30 Jun 16#124
I'm disappointed by the RX 480 despite all of the review conclusions. It brings around GTX 970 power for around £75 less than the GTX 970 has been for a long time. It's not much progress really. As for DX12, well, most games are DX11 based and it doesn't provide any significant DX11 performance boost. I bought into the hype a bit, but as ever AMD disappoint. If Nvidia come in at the right price point with the GTX 1060 then the AMD competition was moot. So then we'll have the "well, let's wait for Vega" and then "let's just wait for what AMD have after that" etc etc. I even think AMD Zen will turn out to be a disappointment; Intel have a 10-year lead over AMD, so it will be less of a disappointment.
fishmaster
30 Jun 16#123
Oh behave.
Nate1492
30 Jun 16#122
Or how about Nate1492, someone who looks at reviews and critically analyzes them before suggesting advice to, or name-calling, others?
Nate1492
30 Jun 161#121
They absolutely tested DX12 games. I don't know what you are trying to pull here, but you didn't check the review very well.
It's a problem. It's a serious problem. I'm glad Toms does multimeter readings, it really helps consumers avoid bad hardware.
wildswan
30 Jun 16#116
Thx, was wondering if I could replace the stock HP PSU. It's a midi tower case, but until I slide the side away I don't know if it's a standard PSU form factor. I'll see what the Ti is going for though, good suggestion.
The first graph shows power spikes over 200W, yet no reports of any motherboards being fried.
thegamingkinginfo
30 Jun 16#114
2x RX 480 or a GTX 1070?
Nate1492
30 Jun 16#113
Again, I say this.
Any evidence that the 960 is "sometimes drawing *200 watts off the PCI Express port*"?
You pointed to evidence that shows the spec of PCI allowing for a bit of fudge around the 75W limit.
Also, if you read that post, they suggest "a bios update is all that is needed to 'fix' this problem".
I'm not convinced that's the case. Yes, the specification for PCIe offers some edge-case scenario for high-power-consumption devices, but the AMD 480 does not register itself as a high-power-consumption device, nor does the standard motherboard support more than 75W ± 7.5W.
In fact, in the very thread, you have people telling you just this.
The point? This is a problem. No matter what the specification technically allows, standard, consumer, motherboards do not support high power PCIe.
And more precisely.
PCIe non-high-power devices must be within ±8% on 12V and ±9% on 3.3V... which again works out to around 82W.
So if the AMD card is drawing anywhere over 75W it is a problem, and if it's drawing over 82W it's out of spec and can be blamed for frying your mobo.
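For what the arithmetic is worth, here is a quick sketch of where the "~82W" figure comes from, read as the 75W nominal slot budget scaled by the quoted +9% tolerance (my reading of the post above, not the actual PCI-SIG derivation):

#include <stdio.h>

int main(void)
{
    /* Illustrative arithmetic only: 75 W nominal slot budget scaled by the
       +9% rail tolerance quoted in the post above. */
    const double slot_nominal_w = 75.0;
    const double tolerance      = 0.09;
    printf("slot ceiling with tolerance: %.1f W\n",
           slot_nominal_w * (1.0 + tolerance)); /* ~81.8 W, i.e. "around 82W" */
    return 0;
}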
It is slightly ahead of the 780ti, the 290, and the 970. Behind the 390, 290x, 980, 390x and everything else you'd expect.
It beats cards from 3 generations ago and barely skims past last gen's middle-of-the-road card!
The reference 480 has been a flop. See if the 480 AIB custom cards can actually do something, or give this card a pass.
Rhythmeister
30 Jun 161#107
I'm hovering over the buy button on a RX 480, I'm just hoping it makes nGreedia wise up and sell the GTX 1060 for sensible money and that it can compete with this saucy ATi card :smile:
ReadySetGoGo
30 Jun 162#106
8.9/10 review. There are plenty more like that. I did lots of reading before deciding to get this graphics card.
Not a bad review in sight as far as I could see.
Tim1292
30 Jun 163#105
Most reviews I've seen put the 480 slightly ahead of the 970 in DX11 games and ahead of the 980 in DX12 games. Why would you buy a used 970 over this if they are the same price? Or even if it was £10-£20 cheaper?
You would be buying a discontinued product with little or no warranty, poor DX12 support, poor future driver optimisation and very little future resale value. Also 4GB > 3.5GB anyday, especially when newer games take advantage of more VRAM.
Has anyone realised that stock isn't due for over a month on this?
ST3123
30 Jun 16#99
EDIT: sorry thought you said card for the PSU, the advice still stands if you take that route though.
Unfortunately, you will find a 180-watt PSU extremely limiting, but it could probably just about manage an Nvidia GTX 750 Ti, as that's about the most power-frugal card you can get that is still worthwhile; if you go any lower on spec you might as well stick to the integrated graphics. The 750 Ti is a decent compromise though and should give you at least medium details at 1080p, high in some of the older and less demanding games.
The 950 is a bit newer and better but I think you would be really pushing it with that PSU....
dezontk
30 Jun 16#98
wildswan
30 Jun 16#97
anyone recommend a PSU for an HP Pavilion 530na? It has a 180W one and I guess to run this card it needs a 400W PSU or bigger.
stanlenin
30 Jun 16#96
I don't say they are racists. I say they are wannabe racists as they hate on red for the same reasons racists hate on black or other. Just because they are useless (the racists) and angry people.
Nate1492
30 Jun 16#95
Stop trying to derail this discussion. Add to it with some facts, price suggestions, buyers advice, or something constructive, please!
Nate1492
30 Jun 162#94
Racists? Who here has said anything offensive, apart from yourself? You are trolling on a forum where there are very few trolls.
stanlenin
30 Jun 16#93
I really don't give two burned doughnuts about who buys what card, but damn, this thread attracts so many wannabe racists, who are just so unhappy with their miserable lives and can't live a day without joining other sheep for their daily baaaah baaaah baaaah
miaomiaobaubau
30 Jun 16#92
they cannot get one right, they are behind by far. 4 years now and they've given us the same cards; nothing changes with AMD. Sorry, forgot, finally they added HDMI 2.0
stanlenin
30 Jun 16#91
As I said above, PCIe specs and motherboards allow drawing far more than 75W. It's just that people like you are not keeping up with modern tech; the limit for the PCIe slot is 250-300W, dependent on some factors.
The 480 card has nothing holding it back, unlike the 3.5GB on the 970. And one doesn't have to be an AMD fan to point out that the 970 has 3.5GB of RAM. What's wrong with you?
Are you now taking all your bla bla back?
LazybeatX
30 Jun 16#90
I would wait for the aftermarket versions in a couple of weeks but yes it's a good card.
Nate1492
30 Jun 16#89
The real question is why, though?
Why would a 150W TDP card need to draw more than 75W from the PCIe port when it can get 75W from the 6 pin and 75W from the PCIe?
This was supposed to be a power efficient card.
3.5GB of RAM on NVIDIA was screamed about by AMD supporters for 2+ years. Here's something that should be screamed about just as loudly.
>75W from PCIe? We need a name that will stick, right? PCIe Overvolt.
stanlenin
30 Jun 16#88
You're a sad illiterate little man. Do some research before embarrassing yourself and writing "No".
The motherboard allows 300 watts through the PCIe connector. But of course you cannot read, why am I even trying. Just go and walk your dog, don't forget the plastic bags. You don't wanna be scooping with your bare hands, boy.
hey guys need advice, does anyone recommend this graphics card? I have a Sapphire 11196-09-40G HD7950 3GB Vapor-X graphics card at the moment. What improvement would I see? Just need some advice here thanks
The hype made people go crazy. It's underwhelming as usual.
jaydeeuk1
30 Jun 16#83
That is shocking. Like VW cheating emissions tests!
Wonder if the high end cards will be affected too. AMD could end up with a very expensive recall.
Salfordgirl1
30 Jun 16#82
Why are people going crazy over this?
The 970 outperforms the 480 8GB version and is around the same price, if not cheaper.
You may as well buy a used or new 970, it'll give you more performance.
I'd also wait for a better cooling option, as this GPU has been hitting 80 degrees and overheating quite a lot, which is typical AMD.
Need a card for cheap? Used 970. Despite being years old, it still outperforms AMDs best efforts.
Rumitus
30 Jun 16#81
AMD did very well with this card. I won't be getting it, but if history repeats, this will be stretching further and further past my GTX 970 in the long run.
It is a shame there won't be an RX 490 as I was looking for that sweet spot 1440p AMD upgrade over my 970. I have a hard time trusting Nvidia and/or coughing up for a 1070.
ReadySetGoGo
30 Jun 16#80
Wow, I just heard that Overclockers sold over One thousand of these yesterday.
I wouldn't hang around if you are considering purchasing, just in case there is a shortage of stock.
rev6
30 Jun 16#79
I would just buy a 1070 in that case for fewer headaches. Plus the 1070 can overclock very well.
TheVoice
30 Jun 16#78
No, it's 75W through the connector with the rest via 6-pin or 8-pin power connectors directly from the PSU to a max of 300W. Pulling 300W solely through the motherboard's connector would be insane.
The issue is that many reports now show that the 480 pulls far more than 75W through the PCIe connector, which some motherboards might not be able to withstand.
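For concreteness, a quick sketch of how those budgets stack up under the figures quoted above (75W slot, 75W 6-pin, 150W 8-pin; one plausible route to the 300W ceiling rather than the full spec table):

#include <stdio.h>

int main(void)
{
    /* Connector budgets as quoted in this thread; combinations shown are
       illustrative, not the complete PCI-SIG table. */
    const int slot_w = 75, six_pin_w = 75, eight_pin_w = 150;
    printf("slot + 6-pin         = %d W (RX 480 reference layout)\n",
           slot_w + six_pin_w);
    printf("slot + 8-pin         = %d W\n", slot_w + eight_pin_w);
    printf("slot + 6-pin + 8-pin = %d W (the 300 W ceiling)\n",
           slot_w + six_pin_w + eight_pin_w);
    return 0;
}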
Just Wondering
30 Jun 16#77
I tried a Flubit for an 8GB version, came in at 203 delivered.
londonstinks
30 Jun 16#76
Gonna hang on to my 290, but this is a great deal for 1080p gamers.
enigmatik33
30 Jun 16#75
The 8GB card uses one type of RAM and the 4GB another; previous hacks allowed access to cores and BIOS changes, never magically added memory.
Rid1
30 Jun 16#74
I'll hold out for a better looking one :stuck_out_tongue:
dijital
30 Jun 16#73
This isn't an SD card or regular DDR. This is stupid fast RAM.
miaomiaobaubau
30 Jun 16#72
another fail, if eventually made illegal not even poundland can take them over
LazybeatX
30 Jun 161#71
I don't have the official specs as they haven't released them yet. I do know that the OC cards will feature at least an 8-pin power connector, with DVI ports too. The Asus Strix 480 is the exact same design as the 1080 Strix, but with regards to the overclocking I have no idea; info should be out soon. There is a high-res image of the Nitro now http://puu.sh/pKDbr/76ced3a60c.jpg and also the Strix http://rog.asus.com/media/1467228455226.jpg
fishmaster
30 Jun 16#70
Yes if you can wait then the GTX1060 announce date (not release date) is about a week away. So we'll have better information then.
shkapars
30 Jun 16#69
And it's due to memory clocks, not the VRAM size
xenononon
30 Jun 16#68
This is a CRACKING price btw. Only 1-3 frames below the 8GB version, which can cost £220+.
xenononon
30 Jun 16#67
I was going to buy this yesterday. But then OCUK put up their price so I got a 1070! I do not regret.
cheesemp
30 Jun 16#66
I meant as in designed for 4K TVs, as that and VR are the main reasons for releasing them. What actual framerate and resolution they will do, who knows... Hopefully a bit better than 1080p60, but whether they can hit 2K in some lower-poly games, who knows?
Also, when you say 4K models. What you mean is "can play 4K video" not "can play 4K games" right? :wink:
ttttd
30 Jun 16#63
Do you have links to proper info on the cards? I'm waiting for custom boards as well. I can't find anything on google except photos.
LazybeatX
30 Jun 16#62
Actually there will be 4GB and 8GB aftermarket cards as always. It has been said by the vendors that prices will be the same as reference for the aftermarket cooling solutions. Obviously the Nitro will cost more due to the factory OC, but you should still have no problem picking up a card with a better cooling solution for the same price as reference.
ollie87
30 Jun 16#61
There's no such thing as future proofing in the tech sector. Buy what you need now and spend the money you save in a couple of years time.
LazybeatX
30 Jun 16#60
Yea this has just been blown out of all proportion. Probably the green team paying off the media to sabotage the 480 until the 1060 arrives, I wouldn't put it past them.
stanlenin
30 Jun 16#59
Clearly, this card is £174 while the aftermarket cards will probably have 8GB and cost at least £240; that's nearly 40% more, so this card is a great buy.
stanlenin
30 Jun 16#58
Whoops, I liked your post by accident. Just keep your lies/scaremongering to yourself. Reports suggest that people who don't buy this card will never meet god.
Do you have any proof of any damage caused? If no then don't lie.
PCIE 2.0 specs allow up to 300watts of power through the pcie slot.
LazybeatX
30 Jun 16#57
Now that Asus, MSI and Sapphire have leaked their aftermarket cards, which should hit the market in 2 weeks, I would not recommend buying the reference cards. I am definitely getting the Sapphire RX 480 Nitro. It also has an 8-pin for better overclocking. Here is the official pic http://cdn.videocardz.com/1/2016/06/SAPPHIRE-Radeon-RX-480-NITRO.jpg
Just Wondering
30 Jun 16#56
Not really, no. Depends; if you hunt down good bargains at prices which enable you to resell at a similar price later... I've had a 270, 280X and a 290, each better than this offer. Sold my 290 recently; using my 2nd card, a 285, at present, holding tight for the best offer.
stanlenin
30 Jun 16#55
90% of people buy graphics cards in $100-$300 range. This card offers most bang for your buck of all cards in the world and also falls in 100-300 range.
This statistic also implies that most people have not bought 3 cards with similar performance, because those were quite expensive, so only a few enthusiasts chose them.
Why are you even here?
retrend
30 Jun 16#54
Sounds like a waste of cash; considering we were on 28nm for about half a decade, you probably bought about 3 cards with near-identical performance.
Just Wondering
30 Jun 16#53
Depends. I generally do; depends when you jumped on board with your last card and the range it is in.
retrend
30 Jun 16#52
There's very little point in upgrading that regularly.
Chips just don't get better as fast now.
c-traxx
30 Jun 162#51
Watch out, some reports suggest that this card uses a lot more power than it should. Some motherboards might not handle that. Just a heads up.
Just Wondering
30 Jun 16#50
Depends how far you want to go into the future-proofing side of things... considering you might be flogging the card in 12-18 months and jumping on board with the next range of cards, I cannot see 8GB being needed as a requirement for games in that period of time. I'd save the 45 quid, or wait until the price drops in the next few weeks...
zeromx
30 Jun 16#49
Not for future proofing.
Just Wondering
30 Jun 16#48
Yikes, is this really an offer!!! It is showing July the 31st!!! Is it worth someone's time to ring up and point this out to Ebuyer? They might have got the date wrong etc...
Plus by then non reference cards will be available !!!
seanmorris100
30 Jun 16#47
You can't hack physical memory chips unless you're a wizard...
GAVINLEWISHUKD
30 Jun 161#46
The PS4 is based around GCN 1.1 and its closest relative on the desktop was Bonaire (HD7790).
AMDs pipeline for GCN has changed little over the years. They have spent much of the time adding features more than anything else. But so have Nvidia for the last few generations.
vulcanproject
30 Jun 161#45
I wouldn't be going mad for any of these yet before I had seen what the GTX1060 can do anyway.
Yes, they use custom AMD technology, but let me give you an example that might clarify things:
You can buy a car from Ford with the following engines.
1.0
1.2
1.4
1.6
1.8
2.0
The engines and technology are all by Ford but they perform quite differently from each other, so what I'm saying is the AMD graphics card in this deal is more powerful than the AMD graphics in the current generation of consoles.
The PS4 GPU Teraflop power is around 2 Teraflops, this card is capable of over 5.5 Teraflops.
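Those throughput figures follow from the usual peak-FLOPS formula (shaders x 2 FMA ops per clock x clock speed), using the stream processor count and clock from the card's spec sheet; a quick sketch:

#include <stdio.h>

int main(void)
{
    /* Peak single-precision throughput = shaders x 2 ops (FMA) x clock.
       2304 stream processors and 1266 MHz are from the spec sheet for this card. */
    const double shaders  = 2304.0;
    const double clock_hz = 1266e6;
    const double tflops   = shaders * 2.0 * clock_hz / 1e12;
    printf("RX 480 peak : %.2f TFLOPS (the \"over 5.5\" quoted)\n", tflops);
    printf("PS4 (quoted): ~2 TFLOPS\n");
    return 0;
}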
rev6
30 Jun 16#42
The PS4 and X1 don't have GCN 1.0 hardware, which is from 2011/12.
blaser
30 Jun 16#41
Bah! I don't know what to do. I was holding out for an RX 480 with an aftermarket cooler, but with the state of the pound I don't know if it will be this cheap again...
commenter14
30 Jun 161#40
Much, much older. The Xbox One and PS4 have hardware from about 2011 in them. This is a budget graphics card and it will absolutely annihilate the consoles. Also don't believe the hype about the next generation of consoles. The GTX 1080 is £600 and can only just reach 4k 60fps on ultra for new games. The PS4.5 won't even be close. At best it'll be able to do 1080p 60fps properly...
GAVINLEWISHUKD
30 Jun 161#39
Well I don't expect AMD to drop the price, but it may get a bit cheaper as the £ to $ rate improves.
As for overclocking, I'm not sure there will be huge amounts in it, which won't be due to heat but power. This does leave the door open for AMD to launch a 480X with 2x 6-pin and slightly overclocked DDR5, using binned chips that were better than the 480 but not full 490.
So getting three SKUs out of Polaris 10 makes sense, to go along with three for Polaris 11.
So we can pretty much guess that once the reviews of the 1060 arrive, the 490 will arrive and be tweaked both in performance and price to match/beat it.
Then it's just about waiting for the high-end 5 series, which I expect to beat the 1070 and 1080, and then Nvidia to put out the Ti version to go ahead again. Then I'm expecting a Polaris refresh (4 series rebadged as 5 series but moved up one level).
I'll put my crystal ball down now! Lol
belsibub
30 Jun 16#38
£45+ more seems a little steep for an extra 4gb?
1lluminati
30 Jun 161#37
They do
cheesemp
30 Jun 161#36
Yes, both consoles have Radeon-derived graphics in them; however it's the 4K models due out next year that will likely be based on this chip. The current Xbox One and PS4 are based on older tech.
vulcanproject
30 Jun 16#35
For all those talking about 4GB with locked RAM: apparently, in order for reviewers to test the card's performance with 4GB, AMD delivered the cards with a BIOS that disabled RAM rather than supplying both types of cards.
The retail versions have the memory you pay for and no more.
zeromx
30 Jun 16#34
There is an 8gb model and is worth the extra cost.
1lluminati
30 Jun 161#33
Why would they put 8GB RAM on the card and then lock it to 4GB? RAM costs money. Especially DDR5
ReadySetGoGo
30 Jun 16#32
But I was told that all the consoles use AMD Radeon core technology? Have I been lied to?
I hope not as the guy who told me is usually quite clued up on these things.
TesseractOrion
30 Jun 16#31
Hope they include an adaptor (but doubt it), as otherwise purchasing an active adaptor adds quite a bit (percentage wise) to the cost (vs 970). My Qnix only has one connector, DL-DVI, which is one of the reasons it OCs so well.
TesseractOrion
30 Jun 16#30
Says "available for pre-order, expected 31st July". Hopefully AIB models will have surfaced by then so we can see better cooling & overclocking potential. I'd advise to wait a few weeks. This model may end up around £150 soon than later perhaps...
fishmaster
30 Jun 16#29
Future consoles not the current generation. So this GPU power is definitely not in PS4 and Xbox One.
ST3123
30 Jun 161#28
Unfortunately, aside from the licensing fees, DVI is seen as rather old hat (been around since about 2003/2004 I think) and is becoming increasingly rare on newer monitors. Plus it can't support the 4K resolutions newer screens use. Aside from that it isn't really any use for connecting to TVs either, as they would use HDMI instead, so you will more commonly see lots of DisplayPort and one HDMI, mainly for those wanting to connect to a TV, as I don't believe HDMI has any advantage over DP for monitor use.
I'd say really they should include at least an adaptor for DVI as it was sold on monitors as the primary connection for MANY years and people typically replace monitors FAR less frequently than graphics cards. I personally still use two DVI monitors compared to just one display port compatible monitor I got just a couple of months ago...
7800gtman
30 Jun 161#27
tempted but I'm waiting for zen then I'll make a whole new rig :-D
ollie87
30 Jun 162#26
Absolutely not! You may be able to use them together in a DX12 game but that is down to an individual developer supporting that function, even then the R9 270X is so weak it'd be pointless.
stelo
30 Jun 16#24
A noob here, can I crossfire it with my R9270x 4gb? Thanks in advance to all the experts here.
ollie87
30 Jun 162#25
You pay a licensing fee for DVI and HDMI. DisplayPort is open and free.
ollie87
30 Jun 161#23
XFX are basically the equivalent to EVGA but for AMD now.
stanlenin
30 Jun 16#22
Amazing card. Supports all new tech unlike previous gen AMD and Nvidia cards.
ShootistUK
30 Jun 161#21
I see these come with hdmi and displayport connectors. Whatever happened to having a DVI port? Wonder if they ship with a displayport to DVI adapter or whether this is something else to have to purchase for us old time monitor users
ukez
30 Jun 161#20
I think this is an awesome card...I like these crossfire results seen here
So desperately trying to hold out till the 1060/460/470s hit, so it brings parity to all the new-gen cards / gives all the manufacturers time to come up with their own designs / 2nd-run improvements / a million other reasons...
If they're in stock tomorrow i'll more than likely snap. Must... stay... strong!
TapMyButtons to Sf2rox
30 Jun 16#19
I know the struggle, trying to wait and see what the GTX 1060 has to offer, or better yet the AIB versions of the 480.
I can't remember where I read it but I thought it was the opposite? That they sent out both RAM sizes but it was found accidentally that the boards actually all had 8GiBs but some artificially locked to 4?
Also read that no 4GiB cards are being produced at all... will have to do some googling lol :confused:
TesseractOrion
30 Jun 16#16
Bear in mind memory is 7 vs 8GHz, which may make a small difference. Probably best to wait for custom cooler solutions... and XFX aren't the greatest gfx card vendor IMHO (had a couple previously, but maybe they've upped QC since then)...
ST3123
30 Jun 16#11
Ever so slightly better than Overclockers' original price yesterday (£175.99 I think) before they ramped it up. Not bad for a launch price on something this hot. Personally, I would like to see these drop to ~£150; then I would be seriously tempted, as I think they are a very good balance of price and performance for those who aren't obsessed with the absolute bleeding edge of technology.
If some clever hackers figure out how to unlock the 4GB one to its full 8GB then I'd be completely sold on this card...
enigmatik33 to ST3123
30 Jun 161#12
It's either a 4GB model or an 8GB, can't be hacked or unlocked to change that!
Gkains to ST3123
30 Jun 161#15
Wasn't there a quote from someone at AMD basically saying that what actually happened is that they only sent out the 8GB card for reviews but provided the reviewers with a BIOS which turned it to a 4GB card so they could review both?
Who says it can't? Many previous cards, AMD especially, have been successfully unlocked in the past, like some of the 290 to 290X. Plus a lot have said that the card has 8GB on the board and simply 4GB locked by the BIOS. I'm sure AMD wouldn't want that extra RAM to become unlocked as it would render their dear 8GB version a lot less relevant but I'd like nothing more than someone to do it, teach AMD to artificially limit their cards in the name of profit. And no I'm not an Nvidia fan either, I'm sure they do similar things, just saying it's a really sh**ty practice is all...
YouDealTroll
30 Jun 163#13
you might want to hold off as there are some questions raised about the power draw not being what it should!
bradford_dr
30 Jun 16#10
What's the deal with the backplate? I've never really been aware of this being an important feature before now....
smsmasters
30 Jun 161#9
Just wait for the custom coolers to arrive.
moneybag
30 Jun 16#7
Good deal. Don't forget possible 2% Quidco.
rev6
30 Jun 16#6
I could be wrong though :smile:
smsmasters
30 Jun 16#5
Where's the backplate in the pics?
smsmasters
30 Jun 16#1
Backplate or not?
rev6 to smsmasters
30 Jun 16#4
Looks like it.
praevalens
30 Jun 16#3
Damn. Paid more yesterday.
ReadySetGoGo
30 Jun 16#2
I wanted to say thanks for posting this. I just checked out lots of reviews and all authors say that for the price the 480 is really good and can't be beaten at its price point.
I have also been reassured because similar technology is in the PS4, Xbox One (and future consoles).
"early directx 12 games show a distinct split between amd and nvidia performance"
How do you interpret that as a "clear lead in dx 12"?
At what point do you just ignore reality and make up a fabricated story?
And none of these benchmarks show the 480!
Do you not realize how ridiculous you sound when you link an article that contradicts your summary?
Look, just because AMD has poor performance in DX11 and poor OpenGL performance doesn't mean they should get praise for simply fixing their own problems moving to DX12/Vulkan.
Did you not read it?
Also Vulkan is the future, just like DX 12.
Also here is another thing showing AMD has clear lead in dx 12 http://www.extremetech.com/gaming/226082-early-directx-12-games-show-a-distinct-split-between-amd-nvidia-performance
Other than the 480 matching the 980.
It's the worst possible example you could use to say "AMD is good with DX 12, look at DOOM!"
Doom uses neither DX 11 or DX 12. It uses OPEN GL and Vulkan, literally nothing to do with DX 12.
Let's all start discussing the deal. Has anybody received this and can anybody confirm if they got the 8GB model?
There are tons of features that neither AMD nor NVIDIA have implemented. Pretty much equal amounts.
Async support is *minor* in terms of all the stuff that AMD hasn't done for DX 12. They just got people to advertise Async support as some awesome feature. It's worked, people have been blinded by Async computation, as if it's what DX12 is. Hint, it's not. GCN has nothing to do with DX12 anyway, it's just what they call their instruction set. It's not a unique concept or anything innovative (yet again, good PR makes people believe 'GCN' is some magical thing.)
They actually haven't been laying the groundwork for DX12. Take a look at the wiki link showing what they actually support.
https://en.wikipedia.org/wiki/Feature_levels_in_Direct3D#Direct3D_12
AMD has done zero work on conservative rasterization.
They haven't finished tiled resources or implemented minimum-precision (10-bit) floating point.
Microsoft has done the 'groundwork', AMD and NVIDIA have not. Let's not give either of them undue credit.
£239 is a guess. The same website reports that the GTX 1070 costs "£329" in the UK. Where?!
http://www.trustedreviews.com/news/nvidia-gtx-1070-price-deals-where-to-buy-uk
Also Rise of the tomb raider (was) an exception because it lacked Async support until very recently.
Ashes is literally an AMD sponsored title. Very few people play the game, check out how many reviews it has on steam.
http://store.steampowered.com/app/228880/
There are only around 1k reviews; it's just a tech demo for AMD.
What about Rise of the Tomb Raider?
https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/20.html
The 980 beats the Fury X at 1080p and all of the other cards in the Geforce 10xx range destroys AMD.
So what's the catch? AMD has a game in DX12 that it does well in and Nvidia has a game it does well in. Neither are "winning" DX 12 titles.
What you can say is that AMD's CPU bottleneck they have in DX11 isn't present in DX12. But just because NVIDIA had good DX 11 performance while DX 11 choked AMD big time... That doesn't mean AMD 'did better than NVIDIA' in DX12, they simply fixed their own problems. There was less to fix with NVIDIA moving from DX11 to DX12.
Did you read anything I said? I said nothing about "supporting".
If you take the time to look at benchmarks you would see AMD beats Nvidia almost all the time.
https://www.reddit.com/r/Amd/comments/4kykoh/r9_fury_x_as_good_as_gtx1080_in_dx12_4k_in_aots/
The 390X can match the fricken 980ti in DX12
Please research before you say anything else: https://www.youtube.com/watch?time_continue=100&v=lRphUEh-dUA
AMD's been pushing performance through the merit of the hardware; NVIDIA's been pushing performance through artificial improvements made at the driver level. An example is async compute capabilities.
So I guess you don't know about DX12 after all
I think you are making the incorrect assumption that AMD 'supports DX12' and NVIDIA 'doesn't'.
https://en.wikipedia.org/wiki/Feature_levels_in_Direct3D#Direct3D_12
AMD and NVIDIA both have support for some of the features, but they do NOT have support for all of the features.
Only Microsoft's WARP12 has full DX12 support.
And if you actually go by the official DX12 feature levels... the GeForce 900 series supports level 12_1 while the 480 only supports 12_0.
So, perhaps next time you ramble on about "AMD has more DX 12 support" you'll take a second on that thought.
Nvidia GeForce GTX 1060 – a VR-ready card to rival the RX 480
I cannot see the 1060 costing much less than £300 here. But we will see.
My prediction is £270-£280 if it is actually 980 performance in games.
Anyway price/performance is always similar in that bracket so it looks silly to argue so passionately one way or another.. Even if you want a 480 it's a good idea to wait for the 1060 to come out and see how it affects the pricing unless you're desperate for one now.
1. AMD is known for smoking Nvidia by a lot in DX12
2. The 480 beats and matches the 980 (mostly beats) in DX 12
3. So it is very likely the 480 will beat the 1060 by a lot in DX 12
I hope it is the case: http://arstechnica.co.uk/gadgets/2016/07/nvidia-gtx-1060-specs-price-release-date/
Secondly, you can't both say "You don't know it will be better than the 980" and then turn around and then say the "480 is bound to smoke the 1060 at Dx12".
You do realize how much a double standard that is, right?
Nvidia claims the GTX 1060 is 15 percent faster and over 75 percent more power efficient than the RX 480, which, if true, would make the eight percent jump in price over the 8GB RX 480 more than worth it
Source: http://arstechnica.com/gadgets/2016/07/nvidia-gtx-1060-specs-price-release-date/
https://www.youtube.com/watch?v=h8NUtM4sexc
Considering there is no such thing as a 4GB model, and now everyone knows, there won't be any more 4GB models for a long time (if ever).
It could just be a marketing ploy to call the 480 a '$199' card at launch. Don't expect the next batch to also be 8GB.
So, we have to compare the 1060 6GB to the 480 8GB.
It's $20 difference (less than 10% more expensive) for a card billed at 10-20% better? Seems like it's a really good price to performance winner then.
"extra tenner" don't make laugh the US price is already confirmed to be over $20 more and that's for the 8GB 480 More likely and extra £40
and it's like 10-15% better aswell
As it happens, some (or perhaps all) launch cards that ship with 4GB of GDDR5 can be unlocked to 8GB. You read that right: vendors apparently shipped these initial cards with 8GB, but simply used a different BIOS to limit them to 4GB. It's a quaint solution, and one that's just begging to be messed around with.
1. The post explaining the ref PCB.
2. The thread with the i2c command fix and the soon-to-be-released BIOS fix.
3. Anyone interested in a photo showing phase distribution, and a link to Buildzoid's video testing the RX 480 PCI-E slot/plug setup, view this post by McSteel on TPU.
Due to the only ~10W reduction on the PCI-E slot and how the PCB design is, I would assume this headroom would easily be used up when OC'ing. Hopefully when AMD release a fix it will not hamper performance to gain lower PCI-E slot power usage. If there were another controller that could change PCI-E slot/plug power distribution, or PowerPlay in ROM could, I know The Stilt would have done this.
Apparently the 4GB cards are actually 8GB....:confused:. Needs a bios flash
And we need to see what the 'fix' exactly is.
Will they simply cut some of the power out? Will this impact games when VRAM usage goes up? Will they drop the stock clocks on GPU/Memory?
Who knows.
Is this right?
I like the price of this ... but will wait to see the 1060 from NVidia now.
AMD's 290 was a good GPU, no doubt, but they have relied on it for 4+ years and they still aren't leaving it behind. If you have a 290, there is very little reason to switch to the 480.
Also, this is supposed to be a budget GPU. Why on earth do you need a 2014 or newer CPU, high end motherboard, a quality PSU, AND excellent case ventilation?
It sounds like the *reference* 480 is a poorly designed product if it has all of these requirements.
I could easily recommend a reference 970 without stipulating any of those things. Both are blower style, so they SHOULDN'T need good case ventilation. 480 was supposed to be power efficient, but I guess that's out the door.
BTW, you say it is a must to have a CPU made after 2014. How strange; I found out (I test a lot) that CPUs made nearly 6/7 years ago are still a lot better than what AMD is giving us nowadays. Looks like AMD cannot keep up at all, trying instead to discount what they were producing already a long time ago, even so under a different architecture etc... etc... etc...
I must say, nothing to do with the CPU, which is virtually irrelevant in these tests. Even my R9 290 seems to be better.
And AMD absolutely are trying to lure 970 owners away. They won't be very successful, but they certainly are trying.
2) I won't buy the reference card because it's noisy, hot, terrible OC, and has a huge PCIe voltage issue that will require a 'software' change that will simply throttle the card.
3) I just showed you that the ratio wasn't 1:1. 90 W and 76 W.
That's nowhere near a 1:1 ratio!
What it tells me is that the 6-pin physically can't provide much more than 75W, while the PCIe slot can be made to supply more (and potentially then crash the board).
Imagine this simple scenario (a toy version is sketched in code after this list).
The graphics card wants 170W of power.
It asks all available interfaces for the maximum power available.
It's returned with 150W of power.
The graphics card then asks for more power, overruling the amount of power returned by the PCIe+6 pin.
Asking both for more, if either can provide more via some hardware override, it will pull in more.
To assume this card would ignore double 8 pin connectors and simply keep taking more from the PCIe slot is ludicrous. They would have to be complete boneheads to fail that badly. You'd never be able to draw more than 165-170W before your mobo shuts down or begins to degrade.
4) If you honestly don't think AMD can 'change this ratio by software' then they are even more screwed.
They either have limited all of their 480 series cards to 150W, or they will have to recall their cards.
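For what it's worth, here is that scenario as a toy calculation; the numbers are the illustrative ones above, and this is in no way AMD's actual firmware logic:

#include <stdio.h>

int main(void)
{
    /* Illustrative numbers from the scenario above, not measurements. */
    const double wanted_w    = 170.0;                 /* what the card wants */
    const double slot_w      = 75.0, six_pin_w = 75.0;/* nominal budgets     */
    const double offered_w   = slot_w + six_pin_w;    /* 150 W returned      */
    const double shortfall_w = wanted_w - offered_w;  /* 20 W still demanded */

    printf("offered %.0f W, shortfall %.0f W\n", offered_w, shortfall_w);
    /* With no per-rail enforcement the shortfall lands wherever the board
       lets it; worst case, the slot absorbs all of it. */
    printf("worst-case slot draw: %.0f W\n", slot_w + shortfall_w);
    return 0;
}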
I can't help thinking that their lack of resources has pushed them into a corner and when this GPU turned out to be underwhelming they got desperate and cranked it and released a seriously flawed product.
I hope they can address this issue without a fix that tanks the performance as was the case with the Phenom TLB bug.
Hopefully they aren't using GloFo for Zen also!
Got a link?
Read the reviews; has anybody said "Do not buy this card"? No. If it worries anybody that much, don't buy it and grab a 970. If I was in the market for a 480 I would buy one.
So what is your reason not to buy one? What do you know that the review sites don't?
Once again you seem not to grasp the concept even though you wrote it earlier yourself! It's using almost a 1:1 ratio: for every 1W the slot supplies, so does the 6-pin. So logic says at 75W on the slot it uses 75W of the 6-pin, and if the 6-pin is supplying 90W so will the slot. Moving that to an 8-pin, it will still be 90W from the slot and 90W from the 8-pin. It will make no difference moving to an 8-pin as the problem still exists.
What the world wants to know is can AMD change this ratio by software. Which I think not.
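To make that 1:1 claim concrete, a small sketch (modelling the commenter's claim, not measured behaviour): if the slot and the plug each carry half the board power, the slot busts its 75W budget as soon as total draw passes 150W, regardless of plug type.

#include <stdio.h>

int main(void)
{
    /* Fixed 1:1 split between slot and plug, per the claim above. */
    for (double total_w = 140.0; total_w <= 180.0; total_w += 10.0) {
        const double per_rail_w = total_w / 2.0;
        printf("total %3.0f W -> slot %4.1f W%s\n", total_w, per_rail_w,
               per_rail_w > 75.0 ? "  (over the 75 W slot budget)" : "");
    }
    return 0;
}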
You can 'think' it's not a serious issue. But on what grounds? Your 'gut'?
You are implying the issue with this is the PCIe slot is attempting to draw power before the 6 pin. You didn't back it up, you just sorta said it. Without any proof, and expected us to accept that conclusion.
1) Pretty much all the info from the review sites so far? Also by your own admission "I haven't found any place suggesting this is simply a 'prioritization' issue, all the tests suggest the power draw is simply exceeding 150W and it's pulling from whatever it can."
Undervolting uses less from both and overvolting uses more from both. So can't see any reason why it would magically use more from the 8pin and less over the PCI-e lane.
2) OK I'll accept the 10%-15% figure. Of this minority how many were considering buying a 480?
So I'm not sure where you get "Nobody" from? It seems you ask a question and I reply but you seem selective on question answering.
I think it will be closer to nobody than everybody of this small percentage of users.
So I'm happy to leave it here. I'm sure on any points I'm wrong you will gladly correct me in the future.
1) Absolutely not, give us anything that suggests that the PCIe slot would 'max out first'. That makes no logical or reasonable sense.
You can see plenty of evidence that underclocking reduces this issue and overclocking increases it.
2) I think you're off by a large factor on your estimates of PC age.
http://store.steampowered.com/hwsurvey/
Specifically, take a look at how many Intel G33/G31 express users there are. That's a 2007 CPU.
http://ark.intel.com/products/31914/Intel-82G33-Graphics-and-Memory-Controller
Look at how many Geforce 400 and 500 users there still are. Check out how many Radeon 6xxx/7xxx users there are.
Just a quick scan shows nearly 10-15% of users have 5+ year old graphics cards. It's pretty safe to assume these machines also have 5+ year old CPU/Motherboards as well.
So no, I'm not buying the idea that "nobody" uses a CPU from 2010.
The GTX580 isn't a valid comparison as you actually have to OC it for it to become an issue, the 480 can cause damage out of the box.
90W in the paragraph, 85W on the graph; there is a 5W discrepancy already! The point is one figure is pretty useless: you need 2 of the 3 to be truly useful. 85W at 11.4V is less than 80W at 12.6V. The only testing we have seen with both sets of numbers is PcPer's, where it shows it is slightly less.
On your second point you have answered it yourself! It seems to be pulling it from wherever it can, so it will still be maxing out the PCI-e slot before it can use any extra from either a 6- or 8-pin. This is the point; I'm not sure AMD can change this without hardware changes, so adding an 8-pin won't change it.
Mainstream gamers? Forum signatures seem to suggest nobody is using a mobo and CPU from 2010; most seem to be i3, i5 and FX that are far newer. I'm sure when Steam get some survey numbers this will be confirmed.
I'm not making excuses; I have never tried to hide the fact it uses too much power. But until we find out more about the motherboard issues we can't say anything for sure. I can kill a mobo with a GTX 580. Does that make the GTX 580 automatically a mobo killer? No.
I said earlier in the thread that if you intend to overclock probably best not to get one now.
Yes, people are 'considering' AMD, but the card is a major let down in performance and this PCIe issue.
It's clear that AMD have essentially pushed as much power as possible into the 480, and the card is splitting at the seams.
They should have left the card at a lower voltage and power draw, and just accepted lower-than-970 stock performance.
But that would look terrible to marketing, so they pushed the card to its near-stock limits, and this has come back to bite them as it draws over 150 watts during intense gaming/benching.
Nonsense. It's a name, not a claim. He's doing reviews.
http://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html
Read the "Stress test power consumption" part. They state the PCIe slot is drawing 90 out of the 168W of power!
Simple math shows that the opposite is true. 90W from the PCIe, 78 from the 6 PIN. Both components are being overdrawn, but the 6 PIN is within tolerances.
Do you have any links saying this is the case? I haven't found any place suggesting this is simply a 'prioritization' issue, all the tests suggest the power draw is simply exceeding 150W and it's pulling from whatever it can.
The mainstream gamer. Honestly, you can make all the excuses you want, but there have already been reports of damaged motherboards.
There was a lot of bad/misleading info.
The only video that I would class as good at this moment in time is the Pcper one.
The ones that looked confirmed are both 3-card systems. This is not a surprise at all. The ATX standard for the P4 four-pin is 150W, and ATX only specifies for 2 high-power PCI-e cards. There is still 75 watts available via the 20-pin.
I suspect AMD will disable 3 way crossfire in next week's update, or down clock it so much in the 3 way profile that it won't be worth running 3 cards.
So the problem in the above is the motherboard manufacturers' issue. They could have included additional PCI-e power directly for the slots. They didn't, as it has never been needed before.
I do remember Asrock doing boards with a 4 pin molex to boost pci-e power.
I'm surprised AMD did not find this issue before launch, because even if they were in spec, adding the 130% PowerTune would have pushed them well over the top.
Thinking about it I wonder if this is one of the reasons why Nvidia have limited their cards to 2 way?
The guy described the situation, found the reason, investigated, and provided a lot of good info.
Is it a bad video because it exposes a flaw?
SeekingAlpha
Yes I fully acknowledge that it uses more than the specification.
What I want to know is, is this causing issues with real people buying the card and using it in their system? My speculation is it is not.
Either way AMD will sort this issue by tweaking the performance. The problem is 99% of users will have no idea if AMD have sorted it, apart from a few sites' tests. Some chips out there today will happily undervolt to get them into spec without affecting performance; others probably won't. AMD just need to find a sweet spot between the two. Presuming they can't change the power ratio, but that would be the best option if they could: take 15% off the available power over the slot and add it back over the 6-pin.
Did AMD all along intend 75W to be taken to the GPU from the 6-pin and 35W from the slot (a normal sort of level), making this 110W that keeps getting bounced about? This would leave 40W for the fan and, importantly, the memory; but the memory at that clock and when fully utilised uses more than in the original samples supplied (Samsung supplied top-binned chips?). Over 8 chips it would only need 15% over-use and suddenly you're over 80W total from the slot.
The issue is entirely within the fact that the PCIe slot is pulling 80-82W on average.
I pointed out a video of a reviewer finding problems.
As he said, his highly overclockable motherboard (which costs in excess of 400 quid) didn't black screen.
You are making an assumption that his test rig wouldn't support a GPU if it was under the 750w Limit.
Did you miss out on the fact he plugged the 980 TI into the same motherboard and had zero problems?
i) cooler can dissipate (x) Watts (TDP).
ii) VRM in context of thermal limit can provide (x) Amps (TDC).
iii) connectors on PCB can provide x Watts (MPDL). MPDL has no value in ROM to separate PCI-E slot & plugs.
Be aware there is some hidden "magic" in the OS driver. For example, on Fiji, if I do an OC of GPU: 1145, HBM: 545, use (x) PL in ROM and use driver defaults, 3DM FS will stick to 1145MHz. Then if I run Heaven or Valley the card will drop some clocks; this is not due to it throttling on PL/temps but how the PowerTune algorithm senses a moment of low GPU usage/no display and drops clocks to save power/lower temps, etc. But if I switch a feature in the drivers off, "Power Efficiency", then the card will stick to 1145MHz, as the PowerTune algorithm has been relaxed. Perhaps this PowerTune algorithm is what they will modify, which IMO would still mean higher PCI-E slot power draw than past AMD cards.
So PowerLimit via ROM sets up the GPU and does not deal with other circuitry. The IR3567B is controlling phases but can't differentiate between PCI-E slot/plugs, and the mosfets take their power from the source on the PCB. Now if there is a controller on the PCB which deals with PCI-E slot/plugs and has a data interface (I2C, SMBus, PMBus), then they can modify this via driver and/or ROM.
On Hawaii there was great PCB variation; for example there was a Vapor-X 290X with like 10 phases on the rear and 2 up front, and when you compared ROMs between the ref PCB (6 rear, 1 front) and that, the ROMs only differed on PowerPlay and VoltageObjectInfo, which programs the IR3567B. I never found anything dealing with PCI-E slot/plugs in Hawaii/Grenada/Fiji ROMs (been involved over a year now in BIOS modding), but I don't have all the info.
Now the RX 480 is a PowerPlay 7 card (downed a copy of the ROM from TPU), like Tonga and Fiji, so it shares the tonga_pptable.h part of the Linux driver. These are the values in the PowerTune table of the RX 480 (the comments are my reading of the field names):-
typedef struct _ATOM_Tonga_PowerTune_Table {
    UCHAR  ucRevId;                     /* table revision */
    USHORT usTDP;                       /* thermal design power target */
    USHORT usConfigurableTDP;           /* cTDP override */
    USHORT usTDC;                       /* thermal design current (VRM) limit */
    USHORT usBatteryPowerLimit;         /* power cap on battery (mobile parts) */
    USHORT usSmallPowerLimit;
    USHORT usLowCACLeakage;             /* leakage current estimates */
    USHORT usHighCACLeakage;
    USHORT usMaximumPowerDeliveryLimit; /* MPDL - total board power cap */
    USHORT usTjMax;                     /* maximum junction temperature */
    USHORT usPowerTuneDataSetID;
    USHORT usEDCLimit;                  /* electrical design current limit */
    USHORT usSoftwareShutdownTemp;      /* thermal shutdown trigger */
    USHORT usClockStretchAmount;
    USHORT usReserve[2];                /* reserved/padding */
} ATOM_Tonga_PowerTune_Table;
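A minimal sketch of how those limits could be read out once the table has been located in a ROM dump; dump_powertune is my own illustrative helper, it assumes the struct above is in scope (with UCHAR/USHORT typedef'd to unsigned char/unsigned short), and walking the ATOM BIOS headers to actually find the table is omitted:

#include <stdio.h>

static void dump_powertune(const ATOM_Tonga_PowerTune_Table *pt)
{
    /* Print the raw limit values the thread keeps referring to; units are
       whatever the ROM stores, no interpretation is attempted here. */
    printf("TDP   : %u\n", (unsigned)pt->usTDP);
    printf("TDC   : %u\n", (unsigned)pt->usTDC);
    printf("MPDL  : %u\n", (unsigned)pt->usMaximumPowerDeliveryLimit);
    printf("TjMax : %u\n", (unsigned)pt->usTjMax);
}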
Just if anyone is wondering I'm not an nVidia card owner, I have not had nVidia for 6yrs. I have owned Cypress, Hawaii and Fiji.
Maybe I read too much into the hidden meaning. :laughing:
As for the power split, I guess we will know soon enough if they are controllable values, as the divide will be evident and easy for AMD to sort. Most sites/people don't seem to have a clue. Some think it is possible, and I think Hardware.FR and myself are the only ones that think not!
At no point in any of the info we have seen (tomshardware/PcPer) has there been a suggestion that the PCI-e lanes have supplied more power than the 6pin.
Also, suggesting an aftermarket card with one or even two 8-pins will solve the problem is a wild suggestion, as at this point it seems to be maximising the PCI-e lanes first. So it will give you more overall headroom, but the issue will still be present.
Besides, who is going to be using a system from 6 years ago with an RX 480!? It was a low-rent mobo with only one P4 and a 95W CPU overclocked to probably about 125W, plus a power-hungry chipset. It would probably struggle even if the GPU was under the 75W limit.
All they're pointing out is that besides tuning the GPU to maximise performance, they used very high-speed RAM to gain performance. RAM tends to use a lot less power in comparison to the GPU, even if high-speed/8GB; IIRC from an article, max 40W.
From what I understand the high side mosfet (12V) and low side mosfet (GPU,etc) are controlled by IR3567B (PWM), Sin's VRM guide.
For example, for GPU VDDC the GPU commands the IR3567B to provide x VDDC and the IR3567B does its "magic" with the mosfets. The phases are independent (relatively speaking), so let's say phase 1 of loop 1 gets knocked out due to a fault on a mosfet: the others carry on working, but it will put extra strain on them and perhaps cause failure later. A member in the Fiji BIOS mod thread made an error on a capacitor mod on a phase; that phase got wrecked but the rest worked, though later the card died. Another member did lots of experiments with a Hawaii card and deemed the IR3567B a very intelligent voltage control chip.
The full IR3567B datasheet is covered by NDA, so we only have access to a 2-page PDF :disappointed: . But the IR3565B is a very, very similar chip and the IR3567B has the same features :smiley: , with a 59-page datasheet. The IR3565B (48-pin) is dual output 4+2 phase and the IR3567B (56-pin) is dual output 6+2. The 8 extra pins on the IR3567B come from the 2 extra phases on loop 1: 4 pins for current sense input/return, 2 pins for PWM signals, 2 NC.
My view is that in its current state the RX 480 is a "balls up" :disappointed: .
The updated PC Perspective article is the best testing IMO and explains Tom's Hardware's power review sections as well. I collated THG data on the 390X / Nano / Fury X / RX 480, and looking at average power usage, the RX 480's PCI-E slot draw is high; due to the size of the image, here is a direct link. Even the ref PCB 295X2, which basically used 2x 8-pin PCI-E connectors to their hardware limit, didn't draw a lot from the PCI-E slot, THG review.
The hardware spec for PCI-E plugs is pretty high. Most PSUs have 6+2 pin PCI-E plugs, so technically the 6-pin has 3x 12V lines; as they are 18AWG wires, say ~8A capacity each, that's 288W. Here is a direct link to the data I collated. The PCI-E slot is the weaker link, with only 5x 12V pins rated 1.1A each; this is in the comments section of the PC Perspective article, from one of their team members.
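To put those two numbers side by side, a quick back-of-the-envelope calc (the ~8A per 18AWG wire is my working assumption from above, not a measured figure):

#include <stdio.h>

int main(void) {
    /* 6-pin PCI-E plug: 3x 12V lines at ~8A each on 18AWG (assumed) */
    double plug_w = 3 * 8.0 * 12.0;        /* = 288 W */
    /* PCI-E slot: 5x 12V pins rated 1.1A each (per the PCPer comments) */
    double slot_w = 5 * 1.1 * 12.0;        /* = 66 W */
    printf("6-pin plug 12V capacity: %.0f W\n", plug_w);
    printf("slot 12V capacity      : %.0f W\n", slot_w);
    return 0;
}

So on the 12V side the slot's 66W is the weak point, which is the whole issue.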
We will have to see what occurs with the new driver; currently I think the IR3567B cannot differentiate power from the PCI-E slot vs the plugs. I think other PCB design aspects limit PCI-E slot usage, so even if AMD change aspects of voltage/PL to the GPU via driver/bios to reduce power usage, it may still mean a lot of power is drawn from the PCI-E slot compared with past cards.
As always though, AMD will be king when it comes to purely budget minded consumers. No question about that.
Then again if Nvidia drop SLI for the 1060 as rumoured, then the RX 480 will get a good following regardless.
He's trying to wash the problem away. As he said, this isn't displaying the same issues other cards exhibited.
AdoredTV is pretty in favor of AMD, take a look at his 480 review.
His conclusions were pretty ugly, if you broke them down.
You can't use that 8GB without exceeding the motherboard spec at stock rates.
So, NVIDIA don't get a pass for 3.5GB of actual VRAM, what about this 8GB of VRAM that you can't actually use?
https://youtu.be/rhjC_8ai7QA?t=95
At stock the card peaked (not averaged) at 80.39W ([email protected]). Now the max spec for PCIe is 71.28W ([email protected], i.e. 5.5A with the +8% voltage tolerance) and the ATX standard allows 72.6W ([email protected], i.e. 12V +10%).
So it is peaking at roughly 12.8% over the PCIe spec and 10.7% over the ATX spec.
So at stock I can't see this being an issue. For the average consumer buying one, loading their game and playing, all should be fine.
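For anyone checking my maths, a minimal sketch of that arithmetic (5.5A is the slot's 12V current allocation; +8% and +10% are the PCIe and ATX voltage tolerances):

#include <stdio.h>

int main(void) {
    double peak_w   = 80.39;                 /* measured stock peak        */
    double pcie_max = 5.5 * (12.0 * 1.08);   /* 5.5A @ 12.96V = 71.28 W    */
    double atx_max  = 5.5 * (12.0 * 1.10);   /* 5.5A @ 13.2V  = 72.6 W     */
    printf("over PCIe limit: %.1f%%\n", (peak_w / pcie_max - 1) * 100);  /* ~12.8 */
    printf("over ATX limit : %.1f%%\n", (peak_w / atx_max  - 1) * 100);  /* ~10.7 */
    return 0;
}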
Note for the enthusiast it's probably a good idea not to increase the power control up (for now until we get more info from AMD).
So whether or not you buy the card, decide on the basis of the stock benchmarks.
If these numbers are good (they are) for the price you pay then grab one and enjoy your games
If you intend buying for overclocking then don't bother for now.
As for 'Can AMD fix this?': I'm still looking for info. Yes, I'm pretty sure they can solve the issue by limiting the VRM control to use less power, which in turn will cause less power draw; but that will affect raw performance. I'm not convinced there will be a hardware-level fix for this batch of cards.
Now, my view on what could happen (I'm not saying it will, it's just my opinion; and before any fanboys start, Nvidia would probably do the same).
Say today your game gives you 60fps and uses 80w on the 12v PCIe slot.
AMD optimise the drivers (as they do) and normally you would see an increase to say 64fps (still using 80w).
Now they will be able to dial back the power to within spec and give you back your 60fps.
So on the face of it the game used too much power to give you 60fps and now it doesn't use too much and gives you 60fps.
AMD fixed it right!?
No they solved it but didn't fix it.
So if driver optimisation doesn't seem to be happening in the coming weeks but the power issue goes away (at stock anyway which is all AMD have to do) then you will know why. :smiley:
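Putting rough numbers on that scenario (a toy calc; it assumes performance scales roughly linearly with power at these margins, which is my assumption and not anything AMD have said):

#include <stdio.h>

int main(void) {
    double fps_now = 60.0, watts_now = 80.0;  /* today's game, per the scenario */
    double fps_opt = 64.0;                    /* after a typical driver uplift  */
    /* linear perf-per-watt assumption: dial clocks back to the old fps */
    double watts_after = watts_now * (fps_now / fps_opt);
    printf("power for 60fps after uplift: %.1f W\n", watts_after);  /* = 75.0 W */
    return 0;
}

A ~7% driver uplift handed straight back as a power cut lands you bang on 75W.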
If anybody finds any useful info like a PCB layout diagram please send me a PM (more eyes and all)
But please don't post me to links for forums or reddit posts like somebody did last night. Thanks.
https://www.youtube.com/watch?v=kFuYc2FHgjw
I was just posting it to add to the discussion. I am neutral on it at the moment.
This is a good video https://www.youtube.com/watch?v=kFuYc2FHgjw
Everyone hold your horses. Surely this will only take a few days to figure out, but I wouldn't risk my entire system. Besides, the prices should be lower next week!
They responded... Sure, but it was basically acknowledging the fact there are issues. They don't know why. And they are "Looking into it."
To me, this is exactly what I'd want to avoid in a 'cheap' graphics card. It might nuke my other hardware... Yeah, I'll wait til it is fixed.
https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/d4sy0c3
You can keep on with the "legitimate advice" crap you are going on with, but the truth is, this is a big problem that will absolutely result in reduced hardware life.
There are only a handful of reviews so far. Generally, this type of damage manifests as reduced life expectancy of capacitors: you'll just notice one day that half of your capacitors have popped, and you won't quite know why.
Next up: who is feeling brave enough to try and power the entire card off the 24-pin! :laughing:
Someone posted their XFX card on PCPartPicker; it seems to have a backplate, but that picture is from an 8GB variant.
I wouldn't buy this card if you've got an old PC or want to run two in CrossFire.
https://hardforum.com/threads/rx-480-is-apparently-killing-boards.1903867/
https://www.youtube.com/watch?v=rhjC_8ai7QA&feature=youtu.be&t=97
May order one and cancel if something better arrives.
Really don't want to risk frying my MoBo!
Let's just both laugh at it and call it a day :laughing:
The smileys have nothing to do with it :stuck_out_tongue:
As for B+C one of us will be correct and only time will tell. :smiley:
If so I would probably be more worried about killing the card than the motherboard. Lol
Pretty interested to know how the power input side is now handled. If you have any links it will make some good bedtime reading. :smiley:
Apologies to anybody as it seems things may have moved on and if the above is correct it may be possible to sort the issue with a BIOS update.
I'm happy to stand by that I don't consider there being any risk to the motherboard.
It says 'much faster than a 480', but interpret the chart properly and it's only 15% faster, and they want to charge 50% more for it.
oops
I found this DX12 Total War: Warhammer link that shows over a 25% performance advantage on AMD cards!
http://www.dsogaming.com/news/report-total-war-warhammer-runs-27-slower-dx12-nvidias-hardware/
PowerLimit is part of PowerTune, and the PowerTune table is contained within the PowerPlay data table of the ROM. We can create a tables list of a ROM in 2 ways:-
i) In 2009 a developer working on the Linux driver for AMD cards reverse engineered a program (AtomDis) to parse AtomBios (AMD ROM). The program can only be used in Linux (in a VM as well).
ii) An OCN member created a Windows program (AtomBiosReader), which can also create a tables list for a ROM but not parse the data within the tables.
Once you are in the right section of the ROM with a hex editor, you will need to work out which hex value does what. This is made easy by information in the Linux driver; for example, to translate PowerPlay on Tahiti you'd use pptable.h.
Be aware that when you modify PowerLimit in the ROM you will still see 0% in the OS, as the new values in the ROM become the starting point for any PL change via the OS. Be also aware that PL in the ROM limits GPU "power", not the total board electronics; RAM, fan, etc. do not form part of it, so your actual power draw will be higher if you view VRM monitoring data in HWiNFO or use measuring equipment.
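One practical note if you do patch bytes with a hex editor: the AtomBios image carries a checksum byte that must be re-fixed or the card can refuse to POST. A minimal sketch, assuming the 0x21 header offset the common Polaris bios editors use (treat that offset as an assumption and verify against your own dump):

#include <stdio.h>
#include <stdlib.h>

#define CHECKSUM_OFFSET 0x21   /* common Polaris bios editor convention (assumption) */

int main(int argc, char **argv) {
    if (argc != 2) { fprintf(stderr, "usage: %s rom.bin\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "r+b");
    if (!f) { perror("fopen"); return 1; }
    unsigned char hdr[3];
    fread(hdr, 1, 3, f);                 /* byte 2 = image size in 512B blocks */
    long size = hdr[2] * 512L;
    unsigned char *buf = malloc(size);
    fseek(f, 0, SEEK_SET);
    fread(buf, 1, size, f);
    unsigned char sum = 0;
    for (long i = 0; i < size; i++)
        if (i != CHECKSUM_OFFSET) sum += buf[i];
    buf[CHECKSUM_OFFSET] = (unsigned char)(0x100 - sum);  /* all bytes must sum to 0 */
    fseek(f, CHECKSUM_OFFSET, SEEK_SET);
    fwrite(&buf[CHECKSUM_OFFSET], 1, 1, f);
    fclose(f);
    free(buf);
    return 0;
}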
B) The non-reference cards should have extra power sockets.
C) There is danger of damaging your motherboard, especially if it's a lower-end model.
Any decent 290 vs 480 benchmarks or direct comparisons anybody has links for? Couldn't find anything conclusive.
Tempted to wait too for the 1060 and 3rd party cards, but I'm not really an OCer and will prob see little benefit.
Price wise, I can't see AMD reductions if Nvidia aren't directly competing with their DX12 cards.
http://hexus.net/tech/news/graphics/94127-leaked-nvidia-geforce-gtx-1060-presentation-slides-published/
Mainly playing Arma and CSGO at the moment but expect BF1 will be a big game for me. Also doing lots of video editing.
I could probably sell the 290 for £130ish(?) and so £40 upgrade seems reasonable, if only for the power consumption reduction.
Thoughts?
For example, when we use MSI AB on a Hawaii/Grenada card to change the VDDC offset (GPU core voltage), it modifies IR3567B loop 1 via i2c communication. VDDCI (aux voltage) is loop 2 on Hawaii/Grenada. Also, via the MSI AB command line you can modify registers within the IR3567B over i2c.
On Fiji "SMC" (on die regulator) is "messaged" via MSI AB "talking" to driver which then modifies IR3567B loop 1 for VDDC via i2c communication. You are not supposed to communicate directly to IR3567B via i2c on Fiji but through driver "messaging" as the "SMC" is communicating with IR3567B virtually all the time, this information was shared by a) Unwinder author of MSI AB on Guru3D b) can be seen when we take i2c dumps on Fiji which take ages compared with Hawaii/Grenada, as "SMC" has priority c) part of Linux driver which has "messaging" parameters for "SMC" and one of those is "releasing i2c bus". On Fiji loop 2 is MVDDC to HBM.
I have been heavily into Hawaii/Fiji bios modding; the threads are on OCN. Via firmware you can set the voltage offset for loops 1 & 2 and fSW for loops 1 & 2 independently, plus OCP, OVP and VMAX. The IR3567B is a very advanced voltage control chip IMO. Most of what I/we learned came from a) studying Hawaii/Grenada ROMs, b) testing modifications, and c) shares by a pro overclocker who has been involved with AMD for a lengthy period.
I'm currently doing some bios mods for an OCN member for RX 480 :wink: .
:laughing::laughing::laughing::laughing:
B) I don't expect to see cards with an 8-pin, or it would eat into 490 sales later in the year. They are limiting performance by power draw.
C) There is no danger, as previously explained.
https://www.reddit.com/r/Amd/comments/4qoclm/german_site_explores_the_potential_for/
If true, it turns out you can reduce power consumption by 20-30% while increasing performance. :-)
A little tinkering in the Watt Manager and it becomes even more efficient. Bonus.
My purchase is even more satisfying now. Thank you AMD for this GPU, I don't have much cash but now I get to play PC games with great performance.
I am excited to receive it. I want to try and save for a Freesync monitor now in time for christmas.
https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/
It's outside the upper tolerance limits for motherboard power draw, which makes it dangerous. Of course a BIOS update should fix it, or waiting a couple of weeks for a card with adequate power sockets. It's not the end of the world but it's not "fine"
If the rumours are true that the stock 4GB cards for now have 8GB on them, lowering the memory clock was the best move both for AMD (for the performance delta) and for Samsung to clear lower-binned chips.
Partner cards will probably use higher-clocked DRAM but will be 8x 512MB, so true 4GB cards.
But there certainly is no scandal. :smiley:
http://www.legitreviews.com/amd-radeon-rx-480-4gb-versus-radeon-rx-480-8gb_183576
When someone tests on a minimum-spec motherboard and it pulls too much power, that will be a story. Most likely it will just fail to boot and no damage will be done.
Just total bs created by a site that let nvidia pass on the same observation, eagerly regurgitated by biased fanbois.
We've shown you serious voltage concerns.
Do not underestimate the power of too much electricity and a computer.
These are stock card issues, not people overvolting.
solves that issue; 75W is the start-up default. Still a faux pas by AMD, but not a catastrophe as suggested.
You could do a little googling, then you can be clued in, like your friend :wink: .
AMD leaving 4GB of RAM disabled on the card is confusing; I'd like to see the card opened up to see what's going on there. It would make more sense to just sell all cards at $200 with 8GB; they would literally clear the entire stock in a day if they did that.
That said, after seeing the posts about the card drawing more power from the PCI Express slot than permitted, I think I am going to wait for the custom PCBs before buying. Why AMD didn't just go for an 8-pin connector I don't know; it would have made more sense. Even if a 6-pin was enough, the 8-pin would have allowed third parties to fit a beefier cooler and push the card even further.
The potential downside of high power draw is that it can damage your motherboard. A bios/card update may be needed to control this, so it's best to wait it out for now.
If you're lucky to receive certain early reference cards, you *might* be able to find a BIOS unlock for that extra RAM (don't expect any performance improvements tho, especially if it's the slower 7GHz type)...
Guys actually think very carefully about this card. It is reportedly pulling nearly double the amount of power through the PCI-e slot than it should be. This can damage your motherboard.
My advice would be to wait and see what AMD say or do about this, and any stories of fried boards.
I can't see that happening. I could be wrong, as I don't know too much; I've read about cross-flashes and CPUs coming with disabled cores, but I didn't know manufacturers would just include extra memory on a graphics card and disable it. Surely they're shooting themselves in the foot, no?
The major difference will be warranty period and process.
I would avoid the RX 480 till they sort out the power issues; I've read a lot of complaints.
https://www.youtube.com/watch?v=8Oc7zXhzlzU
But what I will say is you have zero chance of 'frying' your motherboard. You may get stability issues (unlikely), but it's only fair to point that out.
The reason for this is that once the current approaches the limit of what the slot can deliver, the voltage will drop off. The voltage regulator on the GPU will detect the lower voltage and pull more from the 6-pin. This continues until it either hits the maximum specification set by AMD or the voltage drops on the 6-pin supply too.
So the most probable reason everybody is hitting the same ~82W is that the voltage has dropped and the power request stops.
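A toy model of why it would plateau where it does. The 0.05 ohm path resistance and the 11.65V droop threshold below are made-up illustration values, not measurements:

#include <stdio.h>

int main(void) {
    double v0 = 12.0, r = 0.05;   /* nominal rail; assumed slot path resistance */
    double v_min = 11.65;         /* assumed droop point where the VRM shifts load */
    double i_max = (v0 - v_min) / r;    /* slot current when the rail hits v_min */
    double plateau = i_max * v_min;     /* power the slot settles at */
    printf("slot current at droop point: %.2f A\n", i_max);   /* 7.00 A  */
    printf("slot power plateau         : %.1f W\n", plateau); /* ~81.6 W */
    return 0;
}

With those invented numbers it settles right around the ~82W everybody is measuring, which at least makes the mechanism plausible.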
So I'm hoping to see some more testing from the review sites in the coming days.
*If AMD (you have my details) or another card partner want to send me a review sample, let me know.
There is an NVIDIA 970 selling for 180 (after rebate, base 200).
The conclusion is really strong. Don't buy double 480, it doesn't compare.
https://www.techpowerup.com/reviews/AMD/RX_480/19.html
They did Hitman in DX11 because the DX12 implementation was terrible.
I would also call AOTS a joke benchmark, but the 480 manages to exactly tie the 980 anyway.
I mean, AOTS is the poster child for all AMD fanboy benchmarks, or it was, until NVIDIA decimated AMD in those benchmarks.
http://www.guru3d.com/articles_pages/amd_radeon_r9_rx_480_8gb_review,11.html
http://www.guru3d.com/articles_pages/amd_radeon_r9_rx_480_8gb_review,12.html
http://www.guru3d.com/articles_pages/amd_radeon_r9_rx_480_8gb_review,13.html
The motherboard can deal with spikes, but it's not good.
What can't happen is average power draw above spec.
http://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html
http://www.tomshardware.co.uk/nvidia-geforce-gtx-960,review-33113-8.html
It's a problem. It's a serious problem. I'm glad Toms does multimeter readings, it really helps consumers avoid bad hardware.
The first graph shows power spikes over 200W, yet no reports of any motherboards being fried.
Any evidence that the 960 is "sometimes drawing *200 watts off the PCI Express port*"?
You pointed to evidence that shows the spec of PCI allowing for a bit of fudge around the 75W limit.
Also, if you read that post, they suggest "a bios update is all that is needed to 'fix' this problem".
I'm not convinced that's the case. Yes, the PCIe specification offers an edge-case allowance for high-power-consumption devices, but the AMD 480 does not register itself as a high-power device, nor does the standard motherboard support more than 75W +-7.5.
In fact, in the very thread, you have people telling you just this.
The point? This is a problem. No matter what the specification technically allows, standard consumer motherboards do not support high-power PCIe.
And more precisely:
PCIe non-high-power devices must stay within +-8% of 12V and +9% on 3.3V... which again works out to around 82W (66W x 1.08 on the 12V rail plus ~9.9W x 1.09 on 3.3V).
So if the AMD is drawing anywhere over 75W it is a problem, and if it's drawing over 82W it's out of spec and can be blamed for frying your mobo.
And heck, take a look.
https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/
Check update 22 and 23. This is exactly what I've been saying.
So... Yeah, the AMD 480 is absolutely failing spec. No question.
Especially considering the card was only a 120W card... So I mean, are you just... Straight up lying?
It's hot. It's noisy. It doesn't overclock. It has a HUGE powerdraw.
I can think of one reason to buy it. It's cheap and performs on par with last generation's 970.
The 980? Not a chance, it doesn't beat it. Check the review summary.
https://www.techpowerup.com/reviews/AMD/RX_480/24.html
It is slightly ahead of the 780ti, the 290, and the 970. Behind the 390, 290x, 980, 390x and everything else you'd expect.
It beats cards from 3 generations ago and barely skims past last gen's middle-of-the-road card!
The reference 480 has been a flop. See if the 480 AIB custom cards can actually do something, or give this card a pass.
Not a bad review in sight as far as I could see.
You would be buying a discontinued product with little or no warranty, poor DX12 support, poor future driver optimisation and very little future resale value. Also 4GB > 3.5GB anyday, especially when newer games take advantage of more VRAM.
Also, as drivers mature the performance will increase as usual :sunglasses:
Unfortunately, you will find a 180 watt PSU extremely limiting, but it could probably just about manage an Nvidia GTX 750 Ti, which is about the most power-light card that is still worthwhile; go any lower on spec and you might as well stick to the integrated graphics. The 750 Ti is a decent compromise though, and should give you at least medium details at 1080p, high in some of the older and less demanding games.
The 950 is a bit newer and better but I think you would be really pushing it with that PSU....
The 480 has nothing holding it back, unlike the 3.5GB on the 970. And one doesn't have to be an AMD fan to point out that the 970 has 3.5GB of RAM. What's wrong with you?
Are you now taking all your bla bla back?
Why would a 150W TDP card need to draw more than 75W from the PCIe port when it can get 75W from the 6 pin and 75W from the PCIe?
This was supposed to be a power efficient card.
3.5 GB ram for NVIDIA was screamed by AMD supporters for 2+ years. Here's something that should be screamed about just as loudly.
>75W from PCIe? We need a name that will stick, right? PCIe Overvolt.
The motherboard allows 300 watts through the PCIe connector. But of course you cannot read, why am I even trying. Just go and walk your dog, and don't forget the plastic bags. You don't want to be scooping with your bare hands, boy.
https://www.reddit.com/r/Amd/comments/4qmlep/rx_480_powergate_problem_has_a_solution/
Wonder if the high end cards will be affected too. AMD could end up with a very expensive recall.
The 970 outperforms the 480 8GB version and is around the same price, if not cheaper.
You may as well buy a used or new 970, it'll give you more performance.
I'd also wait for a better cooling option, as this GPU has been hitting 80 degrees and overheating quite a lot, which is typical AMD.
Need a card for cheap? Used 970. Despite being years old, it still outperforms AMD's best efforts.
It is a shame there won't be an RX 490 as I was looking for that sweet spot 1440p AMD upgrade over my 970. I have a hard time trusting Nvidia and/or coughing up for a 1070.
I wouldn't hang around if you are considering purchasing, just in case there is a shortage of stock.
The issue is that many reports now show that the 480 pulls far more than 75W through the PCIe connector, which some motherboards might not be able to withstand.
Do you have any proof of any damage caused? If no then don't lie.
PCIE 2.0 specs allow up to 300watts of power through the pcie slot.
This statistic also implies that most people have not bought 3 cards with similar performance, because those were quite expensive, so only a few enthusiasts chose them.
Why are you even here?
Chips just don't get better that fast now.
Plus by then non-reference cards will be available!!!
AMD's pipeline for GCN has changed little over the years. They have spent much of the time adding features more than anything else. But so have Nvidia for the last few generations.
You can buy a car from Ford with the following engines.
1.0
1.2
1.4
1.6
1.8
2.0
The engines and technology are all by Ford, but they perform quite differently from each other. So what I'm saying is that the AMD graphics card in this deal is more powerful than the AMD graphics in the current generation of consoles.
The PS4's GPU is around 2 teraflops; this card is capable of over 5.5 teraflops.
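That 5.5+ figure falls straight out of the shader count and clock:

#include <stdio.h>

int main(void) {
    int    shaders   = 2304;    /* RX 480 stream processors */
    double clock_ghz = 1.266;   /* boost clock, GHz         */
    /* each shader does one FMA (2 FLOPs) per clock */
    double tflops = shaders * 2 * clock_ghz / 1000.0;
    printf("peak FP32: %.2f TFLOPS\n", tflops);   /* ~5.83 */
    return 0;
}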
As for overclocking, I'm not sure there will be huge amounts in it, and that won't be due to heat but power. This leaves the door open for AMD to launch a 480X with 2x 6-pin and slightly overclocked GDDR5, using binned chips that were better than the 480 but not full 490.
So getting three SKUs out of Polaris 10 makes sense to go along with three for Polaris 11.
So we can pretty much guess that once the reviews of the 1060 arrive, the 490 will arrive, tweaked in both performance and price to match/beat it.
Then it's just about waiting for the high-end 5 series, which I expect to beat the 1070 and 1080, and then for Nvidia to put out the Ti version to go ahead again. Then I'm expecting a Polaris refresh (4 series rebadged as 5 series but moved up one level).
I'll put my crystal ball down now! Lol
The retail versions have the memory you pay for and no more.
I hope not as the guy who told me is usually quite clued up on these things.
I'd say they really should include at least a DVI adaptor, as DVI was sold on monitors as the primary connection for MANY years, and people typically replace monitors FAR less frequently than graphics cards. I personally still use two DVI monitors compared to just one DisplayPort-compatible monitor I got a couple of months ago...
But its worth having a read of this: RX480 fails PCI-E specification.
If they're in stock tomorrow i'll more than likely snap. Must... stay... strong!
https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/
Also read that no 4GiB cards are being produced at all... will have to do some googling lol :confused:
If some clever hackers figure out how to unlock the 4GB one to its full 8GB then I'd be completely sold on this card...
Sounds more likely. Unless someone who's bought a 4GB card can confirm otherwise: the TPU GPU BIOS database has the 8GB ROM for the brave...
https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=RX+480&interface=&memType=&memSize=
I could be wrong though :smile:
I have also been reassured that, because similar technology is in the PS4 and Xbox One (and future consoles), games should be well optimised for it.