Carigamers
Tech Talk => Hardware, Tweaking & Networking => Graphics Cards => Topic started by: TrinireturnofGamez on November 30, 2007, 06:07:53 PM
-
http://techreport.com/discussions.x/13703
A high-end GeForce 9 card is due by February 7th, with a mid-range model by June... The 8800GT is a great card, but the lack of availability suggests it's just a holdover until Nvidia's new cards are ready. Only 50,000 8800GTs are being shipped versus 500,000 3800s.
http://techreport.com/discussions.x/13707
AMD will drop prices on their 3800 series cards by $20 USD to compete with the 8800GT 256MB; the 3850 will be available at $159 US MSRP, and the 3870 at $199.
The 3800 series will also compete via a dual-GPU card; hopefully we'll see it with 1GB of RAM at $300 US.
I'm waiting till August to upgrade so I can see how this plays out. Looks like 10.1 is going to hold for a good year or two like SM3.0 did, plus I should be able to get 8800GTX performance for around $200 US like I wanted.
-
Some interesting little snippets:
GeForce 9800 GX2 nabbed
Keeping roadmaps under wraps is [H]ard
By Wily Ferret: Friday, 04 January 2008, 2:35 PM
THE GREEN TEAM is usually pretty good at keeping its future graphics plans under wraps. Usually it shares them only with selected websites and "partners" who are NDA'd up to the eyeballs to avoid the information leaking onto the web.
But the straight-up folks at HardOCP have posted fresh pictures and information about Nvidia's roadmap today, including a look at the dual-GPU beast which will be known as the 9800 GX2.
The card is similar to the 7950 GX2, with two 8800 GPUs on separate PCBs, slammed together and running in SLI. It is due to launch at the end of February or early March, the site reckons, and will take over the crown of fastest card from the 8800 Ultra - at least in games that support SLI.
Heat should be less of a problem than in existing SLI setups, as the new 8800 GPUs are die-shrunk to 65nm.
The rest of the roadmap takes a decidedly unwelcome skew, as Nvidia adopts DAAMIT's tactics of rebranding current-gen products with a new name. The 9800 series will be a small revamp of the 8800, with support for Tri-SLI and some tweaked clocks and specs. The 9600 will do the typical trick of being a new mid-range card that doesn't perform as well, bang for buck, as previous-generation cards.
It seems that we won't have anything genuinely new from Nvidia for quite a while - it's possible that we won't see a truly next-gen graphics architecture for another nine months. If R700 continues to see delays, there will be little incentive for Nvidia to push out anything sooner.
You can read some more details over at HardOCP here (http://enthusiast.hardocp.com/article.html?art=MTQzOSwxLCxoZW50aHVzaWFzdA==) and here (http://enthusiast.hardocp.com/article.html?art=MTQ0MCwxLCxoZW50aHVzaWFzdA==).
-
Hmmmmm I'm disappointed to say the least in that roadmap; however, time will tell if one of these refreshes will deliver enough of a performance increase over my current 8800 @ a reasonable price point to make me want to switch. Also, there's a glimmer of the new next-gen parts coming in by April/May.
-
Preview of the 9600GT: http://en.expreview.com/?p=184 (http://en.expreview.com/?p=184)
-
Almost all of NVIDIA's upcoming products will be delayed to March, according to our sources.
The delayed products include the 780a, 750a, 790i, 790i Ultra, and 9800 GX2, all of which were due to be announced in February 2008. Sources did not say why NV is holding back the releases; our guess is that the factories are shutting down for the Chinese New Year holiday.
The delay will push back the showdown between the GeForce 9800GX2 and Radeon HD 3870 X2, which means the HD 3870 X2 will remain the fastest product on the market, eating into NVIDIA's share for an extra month.
The 790i and 790i Ultra delays also mean they will launch directly against Intel's X48. Of NVIDIA's February line-up, only the GeForce 8200 still sticks to the roadmap for a February release, while its rival, AMD's 780G, went on sale in China a few days ago.
We've also reported that the 9600GT has slipped a week, from Feb 14 to Feb 21, while its competitor, the Radeon HD 3600 series, has been on sale since Jan 23.
-
Some bad news for the green camp... though the last part is a good note for us the consumer
9600GT recalled, 9800X2 delayed
More Nvidian release woes
By Charlie Demerjian: Monday, 18 February 2008, 4:13 PM
IT LOOKS LIKE Nvidia is firing on no cylinders lately. The firm can't seem to get a product out on time. There is one bright spot: the count on the 9600GT delays has gone up by 50 per cent.
That may not be a bright spot, but the so-far-twice-delayed 9600GT (specs here) appears to have been recalled. Some OEMs are getting the call to yank them; others not yet, or not at the time they talked to us. What this means is more and more delay; this time we are not sure how much, though. An educated guess would be CeBIT, because what else are they going to launch there?
Oh yes, we probably should mention that the 9800X2, or whatever they call it, has also moved from CeBIT to the end of March, for now. Before you get all hot and bothered, this is basically a couple of downclocked G92s on 2 PCBs.
It will be decently fast, but it will also be expensive. The cheapest we can find a G92-based GT with 512MB memory is about $210, and a GTS for about $260. That compares to $185 and $210 for the ATI 3850 and 3870 respectively. This means that the NV cards carry a premium over the ATI cards at all points.
Toss in that NV can't make them on one PCB, duals are expensive, as are board cut-outs, and you have a huge price premium over the 3870X2. Add in that OEMs hate dual PCB boards, and you are looking at another losing streak in the making. Are we staring down the barrel of another 7950? NV could not show working boards at CES, so, like Hybrid, things are looking grim.
There is a bit of good news though: the 8800GT is going to have a price drop in short order, as will most of the lower line (8600 and down). We are not sure how much, but look for $10-20.
-
Not that I'm interested in any one of those cards, but the bad part about these delays is that more than likely there will be a trickle-down effect that will in turn delay the next-generation cards as well. I hope they get their act in order soon.
In the meantime, I don't think that they have any other choice than to reduce the price of the 8800GTs in order to make them more attractive in comparison to the 3870s.
-
GeForce 9800 GTX (D9E-20) to launch in Late March
http://www.xtremesystems.org/forums/showthread.php?t=177484
Jedi just now show me this scene. I would have been more excited if this was a next-generation product. The general consensus is that the 9800GTX is gonna be based on G92, which will probably be a more highly optimised version of the 8800GT GPU with higher clocks and more memory, maybe faster.
They real sticking though, whey the new stuff Nvidia?
-
Ppl, slowing the refresh cycle isn't as bad as you make it out to be, it means your EXISTING CARDS will RETAIN MORE VALUE for a LONGER PERIOD :p geez these cry babies. The longer an existing architecture stays around, the more games and software can be optimized for it. I am not saying they SHOULD slow down (well maybe a little) but I AM saying that just because they haven't brought out the next best thing since yogurt in a tube doesn't mean it's the end of the world. Besides, apart from Crysis, could someone name another game that brings the best current gen to its knees? and at what REASONABLE RESOLUTION doh come with not well @ 3900x2200 its making barely 30fps rubbish and with 32xAA and 400FSAA.... yuh get the idea.
-
Its not so much wanting next gen to play Crysis better. Yeah thats a part of it but just a small part. The real scene is to have and to hold a new part to test and look at.
-
Preach on brotha Awesome... :lol:
-
Its not so much wanting next gen to play Crysis better. Yeah thats a part of it but just a small part. The real scene is to have and to hold a new part to test and look at.
This is a reason for a company to spend millions of dollars in R&D to keep up a ridiculous product refresh cycle? that makes NO sense
-
lol @ that rez issue fuh real
this new found absurd obsession with massive resolutions is way outta hand
when i get my new rig, it wont be run in anything higher than 1280
cause its totally unnecessary to do otherwise.
i am quite sure its the monitor manus pushing this rubbish around ooo look 3.144444444 to the 10th degree res on this monitor!!!! buy buy buy buy!! oooooooo
-
the 9600gt's are outtttt. newegg is loaded with them, price is about 170-180 usd..
tweaktown has reviewed one of the cards also, check out the review here >>> http://www.tweaktown.com/reviews/1293/1 (http://www.tweaktown.com/reviews/1293/1)
-
(http://i115.photobucket.com/albums/n310/daaaknite/NVIDIA.jpg)
I've gotta say that a 9800GTX @ ~$400 is pretty sweet given the fact that (if this roadmap is even slightly accurate) this would be the cheapest GTX at release everrrr. I still have my doubts about how much they can do with an optimised current-gen architecture, but if they can get a significant performance boost over the current generation, well HELLS YEA sign me up for one of those babies.
-
Amazon has them too:
US$180 - US$220 (http://www.amazon.com/gp/feature.html/ref=amb_link_6360722_2?ie=UTF8&docId=1000199541&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=top-1&pf_rd_r=0BE7DNZSC1AAWVEXNPXX&pf_rd_t=301&pf_rd_p=364918101&pf_rd_i=GeForce%209600)
-
TigerDirect talks about the EVGA GeForce 9600 GT Superclocked GPU
http://www.youtube.com/watch?v=-gCQXjWwMPk
-
O_O ^^^^ WHAEYYYYYYY Just imagine what the others will do to Half-Life 2 Wheeeyy cause a real Crysis dey I will get 2 in the 9 series if it's worth it!!!
-
Pics of the Albatron 9800GX2.
(http://img401.imageshack.us/img401/8840/albatron9800gx21gxboxth4.th.jpg) (http://img401.imageshack.us/my.php?image=albatron9800gx21gxboxth4.jpg)
http://www.anandtech.com/weblog/showpost.aspx?i=404
-
http://www.nordichardware.com/news,7475.html According to this, the 9800GTX is about the same speed as the 8800GTS is right now.
This means that, if ATI's cards provide the 50% performance jump (50% more stream processors than the 3870) that they are supposed to, ATI might finally be able to compete toe to toe, and perhaps even dominate Nvidia.
I agree with Crixx on the resolution issue; I'm still using an old Dell CRT monitor cause I got it for free, and I'm happy if a game can run at 1280x1024.
When I get a job maybe I'll get a 19-21 inch LCD.. but until then... cheapness ftw
-
Nvidia just sucking more and more with every piece of news that comes out concerning their products. In the first place, I think they should have saved the 9 series name for next-generation parts, but that's beside the point. But thinking that ATI is gonna bring a next-generation card that will even remotely compete with anything that Nvidia has to offer is wishful thinking at best, especially since AMD has already made it clear that they are not looking to compete in the high-performance segment.
-
especially since amd has already made it clear that they are not looking to compete in the high performance segment.
Nothing screams that more than building a single PCB DUAL GPU card that bests any single PCB card Nvidia has... right..
-
LOL im not talking present tense, in that in the FUTURE they wont be fighting the high end card market so much. Besides, any way you spin it, its two cards they essentially made into one. While that in itself is a feat that even nvidia cant duplicate to date, it is still TWO cards coming together to match ONE card from Nvidia. Now if nv get vex and decide to make a single pcb DUAL 8800ultra, what go be amd situation?
-
Considering their driver situation, I would suspect not very much :p Btw, the intention of AMD for future video, esp wrt the higher-end segment, is exactly what they are doing now, and so is Nvidia's for that matter. There is not going to be any new 'monster' 1-billion-transistor 300W card that can spew Crysis at framerates of 1,000,000 fps... rather, both camps are working on a simpler architecture to make it easy to scale their capabilities.... or at least that's AMD's and... INTEL's approach.
You see, with the unified shader tech that ATI used to develop the 2xxx and 3xxx series, it's somewhat similar to Intel's mini-core project. You have a very simple but powerful 'smart' core that can do anything. Then you basically copy and print as many as you need to get the job done. Yes, there will be optimizations along each new gen, but the idea is that you simplify the design process by using basic building blocks for your variations. So if your high end needs 500 shaders, then you use 500 shaders; if your mid-range only needs 250, etc, etc. Also, it was mentioned in an article I read that AMD's approach from now on will be multicore as opposed to MEGASINGLECORE. Which, if you check it, is the way the industry as a whole is going, not just in GPU and CPU. Parallelism. You can only go SO FAST; after FAST is WIDE.
-
I think we can all agree that ATI has superior multi-card technology. Now really and truly, HOW multi-GPU is achieved should not matter, since both NV and AMD use their respective linking tech to achieve the bonding in any case (although single-PCB IS better for lower power consumption and heat generation).
The main ingredient in the scaling of the tech is the sli/crossfire. What nvidia really needs to work on is improving their SLI (that is IF they really intend to go the route of uniting two smaller forces to make a single large one or if they're gonna continue in the vein of making one very powerful part).
I think the next few months will be interesting to see how ati can improve on its recent successes and how nvidia responds to the red team's renewed vigour.
-
Hmm...
9800GX2 is woefully expensive
CeBIT 2008 Sometimes you don't get what you pay for
By Charlie Demerjian: Tuesday, 11 March 2008, 3:34 AM
THE NVIDIA 9800GX2 is turning out to be a pretty huge joke, and it is not even out yet. I can't wait to see the fanbois on the forums bragging about how much they got taken for, prepare for laughs.
Why am I down on this card? Several reasons, the main one is price. It is absurdly expensive for what you get, and what you get isn't all that much.
We printed the numbers a few days ago, here if you missed it, and they are pretty woeful. Let's assume the worst for the 3870X2, call it 13,500 3DMark06, and let's assume the best for the 9800GX2, with a driver drop and other hacks, say it hits 15K. A little math (15,000/13,500 = 1.111) tells us that it will be about 11% faster than the 3870X2. No problem so far.
The funny part? It is going to retail for $599 US, you read that right, $599. The X2 is at $449, 75% of the price. A bit more math tells us that the price performance (3DMark/$) of the X2 is 30/$ and 25/$ for the GX2. That is one hell of a difference, and it is based on best case assumptions. To put it in different terms, you could get a 3870X2 AND a 3870 for a three way for around the same money.
Don't look for this thing to come down in price much, vendors are telling me they are getting charged $425 +/- $25 for kits depending on who they are. When you add a box, trinkets and shipping, you end up with something that can't be sold for much less than that MSRP.
Before you think that the price will go down with volume, don't hold your breath. The numbers I was quoted for allocation are laughably low. Two PCBs, cutouts, bridges and assembly are not cheap, and expensive is not something OEMs like. Neither are dual PCB solutions.
In the end, you have a hot and expensive card that can't justify its exorbitant cost. With much faster and cheaper 3870X2s arriving in short order, there seems little point to the GX2. Who was it that said something about not launching a product unless it was 20% faster than the competition? I guess that quote will have to be airbrushed from the corporate database before the 18th.
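The price/performance arithmetic in the article can be sketched in a few lines of Python. Note that the 3DMark scores here are Charlie's hypothetical best-case (GX2) and worst-case (X2) assumptions, not measured benchmarks:

```python
# Back-of-envelope price/performance check using the article's assumed
# figures: best case for the 9800GX2, worst case for the 3870X2.
cards = {
    "9800GX2": {"score": 15_000, "price": 599},
    "3870X2": {"score": 13_500, "price": 449},
}

# Relative speed: 15,000 / 13,500 = 1.111, i.e. about 11% faster.
speedup = cards["9800GX2"]["score"] / cards["3870X2"]["score"]
print(f"GX2 vs X2: {(speedup - 1) * 100:.1f}% faster")  # 11.1% faster

# Value: 3DMarks per dollar, as quoted in the article.
for name, c in cards.items():
    marks_per_dollar = c["score"] / c["price"]
    print(f"{name}: {marks_per_dollar:.0f} 3DMarks per dollar")
# 9800GX2: 25 3DMarks per dollar
# 3870X2: 30 3DMarks per dollar
```

So an 11% speed lead costs a 33% price premium, which is the whole of Charlie's complaint.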
As with anything, yuh take it with a pinch of salt.. so I'll leave some hope for the green camp (though if the previous-gen 7950GX2 is anything to go by... then yeah, I see a repeat occurrence waiting to happen). If what is stated above is right... then SHAME on Nvidia for not doing better!
-
I was just waiting for you to post this lol. I let go oneee steups when I read this. As though NV have their head in the sand or something. I could believe that this thing would deliver mediocre performance like that and indeed be that expensive. That product would be a total bust if this is true.
-
*W1nTry directs Capt to the 7950GX2 snafu of yesteryear* err you mean if they DOH stick their heads in the sand AGAIN right?
-
lol yeah if they DOH stick their head in the sand. But they have to be total morons to sell a card for $600 US that barely outperforms the $250 8800GT. That would be absolute and utter madness. And not forgetting we have the 8800GTS 512 for $300 that approaches Ultra speeds and performance. I can't understand for the life of me WHY they would even release this thing if those numbers are right. Maybe they have so many excess PCBs that they have to get rid of them this way, moving two of them out at once. But if that is the case, this is NOT the way to do it. Sell 8800GTSs or GTs I mean wtf??
-
Well it will cost a lot since it's a dual PCB with basically 2 GTs... so at the least it's the COST of 2 GTs, which is close to 450USD; then you have to remember they need to recoup R&D costs, and of course it's not a standard layout. The PCB is actually a layered cut-out design... so it's more expensive to make since it's not made in MASS numbers... so I could believe this card could cost close to 500USD... not 600USD but at the least 450... and even then it's probably a loss. They making this product as an ego trip.. they don't need to make this card imho. Their cards are already more or less on top, at least in the single-card dept, and the Tri-SLI does win a lot of benches... but as I said.. it's prolly an ego thing, since AMD doing it makes them look like more of a tech and innovation leader....
-
check it out here>>> http://www.tweaktown.com/articles/1331/nvidia_s_geforce_9800_gx2_early_test/index.html
damn that's large arse card..
-
disappointing. soo damn disappointing. they should have named this card the 8800GX2. it doesn't perform anything like i'd expect a GeForce 9 series card to. but i guess it's largely due to the drivers not being fully optimized. i mean, the card doesn't even support DirectX 10.1 with SM4.1!!! although no games at the moment support it, it would be nice for a new series of card to carry new technology (the 3870X2 has DX10.1 support BTW). Yuh disappointin meh NVidia.
-
Ah well capt... seems I may have been right....we'll call the 9800GX2 the '7950GX2 2nd Edition' kinda like Vista is MS Windows ME II XD
-
This is old news. Also, there's another topic for video card news, but not to worry son, it's all good. *Merged* Nice comparison between cards though.
It's also important to remember that the new name doesn't mean that this is a new generation. ALL Nvidia cards post-November 2006 are based on G80 tech, so you would not get any new features like DX10.1 until next gen comes around.
LOL @ 7950GX2 2nd edition. Nvidia, it's all well and good to make mistakes, in fact it's a necessary thing. But when you make the same mistake twice, you have to ask yourself...wtf??
-
I think most know it's not next gen per se, which is a little troubling since AMD will release the 4xxx series this year.. that would mean that Nvidia's next gen this year would be...??Xxxxx? it's funny to me that the Nvidia naming has caught up with the AMD naming.... at this rate you'll go out to buy a 5800 AMD card and an X1800 Nvidia card.. but yeah... shooting urself in the foot twice can't be a healthy practice...
-
LOL nvidia will catch up to amd and amd will catch up to nvidia too lol. 3xxx ati...to 4xxxx then what do we have? ati 5200?
-
That's the thing eh.. we'll hopefully have 5200 series cards that don't suck, a 5800 series card that's NOT a vacuum cleaner or we could have a X1600 that kicks @$$... it's gonna be weird... I studying what about those poor ppl that mix it up...
*guy walks into bestbuy*
baka: "I'd like to get an X800 please"
Clerk: "Would that be Nvidia or AMD?"
baka: "Wait.. there's two?"
Clerk: "Yes sir"
baka: "Who's the original manu?"
Clerk: "AMD sir"
baka: "I'll have the AMD X800 then!"
Clerk: "Certainly sir, excellent choice"
... O_o
Later that night..
baka: WTF crysis won't even run on this crap...
-
^HAHAHAHAHAHAHA!!!!!!
-
That's the thing eh.. we'll hopefully have 5200 series cards that don't suck, a 5800 series card that's NOT a vacuum cleaner or we could have a X1600 that kicks @$$... it's gonna be weird... I studying what about those poor ppl that mix it up...
*guy walks into bestbuy*
baka: "I'd like to get an X800 please"
Clerk: "Would that be Nvidia or AMD?"
baka: "Wait.. there's two?"
Clerk: "Yes sir"
baka: "Who's the original manu?"
Clerk: "AMD sir"
baka: "I'll have the AMD X800 then!"
Clerk: "Certainly sir, excellent choice"
... O_o
Later that night..
baka: WTF crysis won't even run on this crap...
:sign10: :laughing4:
-
http://translate.google.com/translate?u=http%3A%2F%2Fwww.pconline.com.cn%2Fdiy%2Fgraphics%2Freviews%2F0803%2F1243353.html&langpair=zh%7Cen&hl=en&ie=UTF-8
according to this Chinese site, the 9800 is no faster than the 8800; the only real benefit would be increased overclockability due to 65nm.
There is a definite possibility that ATI's HD 4000 series of cards will 0wn Nvidia's @ss, since they are slated to be at least 40% faster, with twice as much shading power, more texture units, etc..
if only they would launch soon.
Nvidia got lazy; it might be funny to see them having to play the 'value' game when they could easily have taken the performance crown by boosting their cards a bit.
Though there might be the release of a super-clocked '9800 Ultra' that could mitigate ATI's lead a bit, the top-tier ATI card should still be able to maintain a 10% performance margin over Nvidia.
-
So it seems as though the 9800GX2 won't be as pointless as we first thought. (http://www.tweaktown.com/articles/1332/geforce_9800_gx2_overclocked_and_tested/index.html) It does make sense, though, and is a bit of redemption for Nvidia...for now anyway. Possessing decent overclocking headroom, it seems to be a powerful, semi Crysis-busting part, though still with a price out of reach of the everyday joe.
-
I read that review last week/over weekend, honestly I wasn't impressed seeing as it's highlighted as still not being as good as the 8800GT in SLI. They did mention that the CPU overclock strangely enough was seen to give it increased performance... but not everyone can afford a quad core processor, far less to OC it to 3.75GHz... they did mention expecting some improvement in performance with mature drivers but my opinion on this card still stands, its just for show as it WILL be expensive and not really worth the money.
-
Very true. And if the 7950GX2 was anything to go by, the next generation (whatever the name may be) will offer significantly more power in a single-card solution. It really pays to wait and see what happens in the next few months.
-
honestly, looking at the speed, stock and overclocked, i can't say i'm impressed personally at the performance of the card. Especially the fact that it's a dual PCB.. man.. nvidia needs to buy ah new drawing board and start to work again. Knowing it's not really a "next gen" offering from nvidia, but still, the current G92's i.e the 8800gt and the 8800gts are honestly the best buys from the green camp... i will still give ATi the crown basically because of the price of the 3870x2. The price of the 9800gx2 aint gonna be pretty.. so i hope nvidia knows what they are doing, but not everything has to be perfect at first try.. only time can tell...
-
:dropjaw:!!!!!!!!!!!!!!
http://www.amazon.com/e-GeForce-PCI-Express-Graphics-Card-Lifetime-Warranty/dp/B0015U7YCG/ref=sr_1_5?ie=UTF8&s=electronics&qid=1205893704&sr=8-5
cheaper than an 8800 Ultra!?!
-
here's a review of the GX2 in SLi.
http://www.tweaktown.com/articles/1347/geforce_9800_gx2_in_quad_sli_tested/index.html
watch the crysis bench scaling!
-
Those numbers seem to be similar to the ones for the single-card setup with a Q6600 @ 3.7GHz. That SLI performance is unimpressive at best.
-
rumors of the GTX being pushed back.
http://en.expreview.com/2008/03/19/9800gtx-pushed-back-to-april-fools-day/#more-308
if this is true, that April 1st launch date real fishy.....
-
Well the 9800GX2s are out in full force. They are powerful, albeit in certain cases, and they ARE EXPENSIVE...like 600USD expensive... which was the price guessed... it does outperform 2 8800GTs in SLI now by a small margin in some cases, and it has stable driver releases yet to come. Just check out newegg.com and you'll see em... Gosh that's a lot of dosh to play Crysis on medium settings at 100+ FPS... you could buy 3.5 HD3750s for that much money...
-
The e-Geforce 9800 GTX
(http://img223.imageshack.us/img223/5029/geforce20980020gtx20in2dx6.th.jpg) (http://img223.imageshack.us/my.php?image=geforce20980020gtx20in2dx6.jpg)
Here is the GeForce 9800 GTX Characteristics.
(http://img216.imageshack.us/img216/9467/evga20geforce20980020gtdn0.th.jpg) (http://img216.imageshack.us/my.php?image=evga20geforce20980020gtdn0.jpg) (http://img216.imageshack.us/img216/1527/geforce20980020gtx20in2iz7.th.jpg) (http://img216.imageshack.us/my.php?image=geforce20980020gtx20in2iz7.jpg)
GPU-Z Screen
(http://img216.imageshack.us/img216/4684/geforce20980020gtx20in2mf3.th.jpg) (http://img216.imageshack.us/my.php?image=geforce20980020gtx20in2mf3.jpg)
-
A very comprehensive preview of the 9800GTX ppl:
http://www.tweaktown.com/articles/1356/1/page_1_introduction/index.html (http://www.tweaktown.com/articles/1356/1/page_1_introduction/index.html)
-
http://en.expreview.com/2008/03/31/gt200-to-become-9900gtx-and-9900gx2-launch-date-set-to-july/
I'm hoping that this isn't true. I was waiting for the 9800GTX until I realised it wasn't worth it. I jus got my 8800GT to hold off till GT200 and this is what is in store?!?! jeez man! enough with the refreshes already!
-
http://www.pcper.com/article.php?aid=540 <<< here is a review of the 9800gtx.. man wtf nvidia doin.. let me buy my 8800gt eh..
-
well ..they see it that there is no competition, and with no competition there will be no urge to progress, ent? :shakehead:
-
well if it is so.. why waste time making a card that's no significant performer over the others in their line.. i mean the G80 core still keeping up.. and the 9800GTX not that much faster than the 8800 Ultra.. and well ATI has their 3870X2 which seems to be their "spartan" card at the moment, and which seems to be on par with or even a little bit better than the 9800GTX.. i hope the prices are great, because it won't make sense spending more than 350 dollars on this when 8800GTS G92s are at 260US..
-
There's not even a need to go as far as the GTS; the GT delivers enough power to run everything at decent levels (except Crysis, which you have to play on medium). Until a new generation comes out that significantly increases performance without costing too much, this topic is moot.
-
http://xbitlabs.com/news/video/display/20080414131104_Nvidia_Describes_Next_Generation_Graphics_Processor.html
Close to 200 stream processors, probably twice as fast as a single 8800GT.
To be released in July/August. Likely will give AMD's 4800 series a run for its money... let's hope not so much that it runs them out of business.
-
Well that article is as useful as a kick in the nuts...
-
lol @ kick in nuts.. not even a picture, specsheet, nvidia logo, nuttin... bleh!
-
(like a real noob) soo what allyuh really saying is ....the 9800 GTX is a waste of money?? or is not?
-
A very nice review of the 9600GT, with its SLI configuration tested against the 9800GTX
http://www.madshrimps.be/?action=getarticle&number=12&artpage=3412&articID=817 (http://www.madshrimps.be/?action=getarticle&number=12&artpage=3412&articID=817)
Depending on the games you play, the 9600GT in SLI may just be the choice for you, and with a price as low as 120USD (http://www.newegg.com/Product/Product.aspx?Item=N82E16814261002) after mail-in rebate, someone could easily and much more affordably top GATT's benchmarks XD... any takers?
-
interesting results, but my 8800gt already in transit, and well my board aint SLI.. so that's two reasons.. but it is seriously a bess bang for buck card though..
-
More rumours (http://techreport.com/discussions.x/14620) of 9900 series cards in July.
*salivates*
-
Price cuts anyone?? Oh this can only get better. I can just see the price slashing on amazon for the 8800s and then...AND THEN!! muhahahahaaa!!
Man I need money.... *leaves thread pissed*
-
holy shizzle, either those 8800GT and GTS prices must drop really low, or we seeing ah bloody 7-800 dollar card again. I mean, the Ultra is still about what, 519 dollars, the 8800GTXs are at 400 dollars, and the new 9800GX2s are at 579 approx; either those cards have to drop in price, or we seeing one expensive mothaf**king graphics card from nvidia come july.
-
You don't get it do you, there's gonna be a serious price slashing on the "older" cards man. That's why it's always good to wait a lil to get the best deal cause it seems the time you take to wait on what you order to reach down to T&T, the technology advances..
-
^^ i second that.
the 8800GT i have now. I got it in early January for US$270. It is now under US$200!!
-
of course i get it, i'm not concerned about the price slashing on d 8800GTs, it's already happening.. i'm concerned about d price of the new cards meng, something that's supposed to be next gen following the specs on arc's link; i wondering what the price of those cards would be, because if ah 9800GX2 is bout 579, what the price of the new cards would be...
secondly, pc technology continues to advance no matter what; by next year an 8800GT may come like ah 7900GS now, and a couple of us would be looking for something new. it's the general cycle of technology: something new always has to come out, and following that, prices of old parts have to drop. it has happened, it is happening, and will always happen. If you buy something for 300 dollars today and 2 months from now u see it for 200 dollars, yeah it would bun, but u can't be vex for not waiting two months, and you should never regret your purchase.
-
And it's only going to continue Sax, especially with sites like amazon that seem to toy with you juggling prices. I tell you man it's a conspiracy man, the government man lol.
-
Sigh. We're really having this discussion AGAIN?
If I want something NOW, hell i'm going to buy it now. So what if something comes out in a few months that is better than what you got? That's how it goes in the tech world. If you for the wait and see approach, then you'll be in a perpetual state of waiting. You'll wait until u grow old and die. So what if the price drops in some months? Its in my hands to have and to hold NOW.
-
^^^ Ent, and PLAY yuh aint know if the price/performance of that 9900 GTS is right, I just might
buyeth it. :happy0203:
-
Is about time for me to upgrade too...so i'll be getting one as well if all is good.
-
Hmm...July....family coming in then.....
*Seriously ponders depending on the price tag...*
-
Sigh. We're really having this discussion AGAIN?
If I want something NOW, hell i'm going to buy it now. So what if something comes out in a few months that is better than what you got? That's how it goes in the tech world. If you for the wait and see approach, then you'll be in a perpetual state of waiting. You'll wait until u grow old and die. So what if the price drops in some months? Its in my hands to have and to hold NOW.
ENT!!! see it, buy it..
-
Aye Capn and anyone else with 8800s when allyuh ready to sell card let me know yes. I aint mind to be a lil behind, once it wukkin nice :P
-
http://www.trustedreviews.com/graphics/news/2008/04/28/nVidia-Launches-GeForce-9600-GSO/p1
new nvidia budget 9 series card.. 9600gso... 192bit memory bus, 384mb of video ram....
-
I'm all for all dem cards, but will ATI ever be able to master Crysis?
-
crysis isn't God.. why base our video card purchasing on one game? but ati's new cards seem promising and look like they can b17ch slap nvidia, so we can only wait..
-
crysis isn't God.. why base our video card purchasing on one game?
Because its one of if not THE most demanding games. Once there's a card that can handle that nicely, there's nothing out there (until another monstrous title is released) that can't be handled.
So yes, Crysis IS God...the god of all games.
-
someone on Youtube posted a vid of 3DMark 06 scores using a "9900 Ultra". i mean i know it's not real, but do u guys have any idea how he did it? it's not photoshop, cause he manipulates the windows in the video.
http://www.youtube.com/watch?v=QSpgkSUdYs4&feature=related
-
nah.. i believe he did a bios mod or something similar.. making the card register as if it was a 9900ultra....
-
He most likely did it to a 9800gx2.
-
http://www.techreport.com/discussions.x/14734
9900GTX to have around 240 stream processors, 32 raster units, 1GB of GDDR3 and a 512bit memory bus.. for 499 US. Monster single gpu card, will eat lots of power.
Basically it will replace the 9800GX2 and be faster.. probably in the area of 20-30%.
ATI's 4870X2 will probably be its direct competitor in the price bracket. Assuming a single 4870 = 2 3870s, the X2 should provide similar performance.
the single 4870 is going for 350 US... Likely to be a good bit slower, but cheaper, cooler, less power drain.
-
Wow, looks like those cards coming a bit sooner than I thought. I cant wait till the NDAs are lifted and the benchmarks come along.
-
AMD has settled on launch dates and a naming scheme for its next-generation RV770-based graphics cards, according to VR-Zone. The Singapore-based website claims the first of AMD's new line, the Radeon HD 4850, will hit store shelves on June 18. A faster version, the Radeon HD 4870, will reportedly launch on June 25, but retail availability shouldn't follow until July. Later in the third quarter, VR-Zone says AMD will introduce a dual-GPU Radeon HD 4870 X2 graphics card.
According to recent rumors, the 4850 will cost $269 at launch, and the 4870 will be priced a little higher at $349. Both cards will feature 480 stream processors and 256-bit memory interfaces, with the Radeon HD 4870 packing 1GB of 1935MHz GDDR5 RAM and the 4850 only featuring 512MB of slower memory (perhaps of the GDDR3 type).
In related news, several sources including HKEPC and Fudzilla report that Nvidia will dub its next-gen top-of-the-line graphics card "GeForce GTX 280." The lower-end variant will be known as "GeForce GTX 260." This new naming scheme may be tied to Nvidia's recent statements about its lineup being too large and confusing.
Both the GTX 280 and the GTX 260 will be based on the next-gen GT200 graphics processor, the sources claim. Older rumors suggest the GTX 260 could have 240 stream processors, 1GB of GDDR3 memory, a 512-bit memory interface, and a $499 launch price.
http://techreport.com/discussions.x/14745
Yeah...thats GTX 280. A bit different naming scheme in an attempt to deconfusify the current naming system.
They're supposed to be coming out between the 16th and 20th of June so we can expect benchmarks to flood the world wide wibble by the 2nd week of next month.
-
I think its because they ran out of numbers.. I mean you can buy a Radeon 9800 Series OR an Nvidia 9800 series... what were they going to use after? X1xx ??? that would sound too familiar.
-
True, but there's something disturbing about saying GTX280 and GTX 260. It sounds...wimpy. I guess its because we use Dell optiplex GX260's and GX280's almost exclusively in the entire ministry and those machines are, well....shhhh....iiiii
-
LOL @ capt hating on dell. As opposed to?
Introducing the all new Nvidia Geforce Hurricane! it slices, it dices and even makes you breakfast!!! yes folks look no further, the future is TWIMTBP... O_O
-
LOL @ hurricane. I think they should put the GTX suffix on the end like before and use....some OTHER numbers. Anything but 260 and 280. I mean, most people may not be familiar with those dell machines but for me those numbers are synonymous with migraines.
To make matters worse, the models differ only with the inclusion of "T" on the video cards.
-
Later this week NVIDIA will enact an embargo on its upcoming next-generation graphics core, codenamed D10U. The launch schedule of this processor, verified by DailyTech, claims the GPU will make its debut as two separate graphics cards, currently named GeForce GTX 280 (D10U-30) and GeForce GTX 260 (D10U-20).
The GTX 280 enables all features of the D10U processor; the GTX 260 version will consist of a significantly cut-down version of the same GPU. The D10U-30 will enable all 240 unified stream processors designed into the processor. NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.
The main difference between the two new GeForce GTX variants revolves around the number of shaders and memory bus width. Most importantly, NVIDIA disables 48 stream processors on the GTX 260. GTX 280 ships with a 512-bit memory bus capable of supporting 1GB GDDR3 memory; the GTX 260 alternative has a 448-bit bus with support for 896MB.
GTX 280 and 260 add virtually all of the same features as GeForce 9800GTX: PCIe 2.0, OpenGL 2.1, SLI and PureVideoHD. The company also claims both cards will support two SLI-risers for 3-way SLI support.
Unlike the upcoming AMD Radeon 4000 series, currently scheduled to launch in early June, the D10U chipset does not support DirectX extensions above 10.0. Next-generation Radeon will also ship with GDDR5 while the June GeForce refresh is confined to just GDDR3.
The GTX series is NVIDIA's first attempt at incorporating the PhysX stream engine into the D10U shader engine. The press decks currently do not shed a lot of information on this support, and the company will likely not elaborate on this before the June 18 launch date.
After NVIDIA purchased PhysX developer AGEIA in February 2008, the company announced all CUDA-enabled processors would support PhysX. NVIDIA has not delivered on this promise yet, though D10U will support CUDA, and therefore PhysX, right out of the gate.
NVIDIA's documentation does not list an estimated street price for the new cards.
http://www.dailytech.com/article.aspx?newsid=11842
Pros: PhysX stream engine onboard!!!! 3-Way SLi (3Gigs of GPU goodness)!!!!
Cons: No DX10.1, SM4.0 support!!!! WTMC!!!!
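Side note for anyone wondering where that odd 896MB number on the GTX 260 comes from: it falls straight out of the bus width. Rough sketch below; the 32-bit-per-chip and 64MB-per-chip figures are my assumptions (not stated in the article), but they're how the math works out:

```python
# Hypothetical sketch: GDDR3 capacity implied by memory bus width.
# Assumes 32-bit-wide memory chips at 64MB each (my assumption).
CHIP_WIDTH_BITS = 32
CHIP_SIZE_MB = 64

def memory_mb(bus_width_bits):
    chips = bus_width_bits // CHIP_WIDTH_BITS  # one chip per 32-bit channel
    return chips * CHIP_SIZE_MB

print(memory_mb(512))  # GTX 280: 16 chips -> 1024MB (1GB)
print(memory_mb(448))  # GTX 260: 14 chips -> 896MB
```

So chopping the bus from 512 bits to 448 bits drops two memory chips, and 1GB becomes 896MB.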
-
^^^ interesting, i can't wait till these cards come out to see how much "better" they perform...
-
Nah they WILL be faster. Im just wondering by how much.
Im also inclined to see what ati's offerings are since I think that they have a nice surprise in store for us.
-
yea and i do hope ATi brings something worthy to the table, i'm already drawn to the low power consumption they promise, but i do hope the performance is worthy of making a move to their new card... but in due time
-
I, because of my pocket, will be going for the XFX GeForce 9800 GTX Video Card - FREE Black Edition, 512MB DDR3, reasonable overclock there hope its stable, the only thing is.... XFX doh have that neat step up program like EVGA, but I want that game soo bad, original that is, what would be a better choice for me to go with? Cause crysis getting acquired or JK'ed
One more thing.....I was watching a review vid of this card and the normal version on Tigerdirect, and the older vid had the card at a higher framerate 54.2 fps when the Black Edition was at 38 fps, both at the same resolution, the older one with 4X AA,, can someone explain this to me???? I doh know if that is because the XFX team did the benchmarking for him over the freaking phone!!! I was however considering the memory type of the cards are different, the normal card has GDDR3 while the Black has DDR3, this is freaking crysis on high WTMC is going on here!!!
-
Wow. Use some damn sentences will you i'm getting a cerebral infarction just reading that post.
Maybe the "older" card was a sli setup whereas the 9800gtx was working alone. (Whatever that "older card may be).
What game coming free?
What is this "normal" version?
-
1. Sorry bout the sentences, got a 2 in english
2. COD 4 is the free game
3. When i say "normal" i mean a non overclocked version of the series, that was the 'older' card, it not really older per se, but how is it that an overclocked version is running the same games at slower frames per second?
-
IT'S NAKEDDDDDDD... LONG LIVE THE CHINESE!!!
Geforce GTX 280 Baring it ALLLL...LOL (http://www.tweaktown.com/news/9487/nvidia_geforce_gtx_280_stripped_naked/index.html)
-
Changes changes changes, I wonder what will be out next year????? Fuhget that!!!! I'll just Tri-SLI my card and get similar performance, @ cheap price tooo, they'll still be able to run games to come (wont they???)
-
Hmm
Nvidia GTX260 and 280 revealed
Missed targets and low yield
By Charlie Demerjian: Saturday, 24 May 2008, 5:57 PM
NVIDIA STAGED ITS regular editors' day, and once again, we were not invited. Luckily, that means we can tell you about it early.
With the usual flair of spin and statistics which almost no one questions for fear of being cut off, NV talked about two new GPUs, the GTX280 and GTX260. The 280 is the big one, the 260 is the mid range, what used to be the GTS.
The 280 has 240 stream processors and runs at a clock of 602MHz, a massive miss on what the firm intended. The processor clock runs at 1296MHz and the memory is at 1107MHz. The high-end part has 1G of GDDR3 at 512b width. This means that they are pretty much stuck offering 1G cards, not a great design choice here.
The 280 has 32 ROPs and feeds them with a six and eight-pin PCIe connector. Remember NV mocking ATI over the eight-pin when the 2900 launched, and how they said they would never use it? The phrase 'hypocritical worms' comes to mind, especially since it was on their roadmap at the time. This beast takes 236W max, so all those of you who bought mongo PSUs may have to reinvest if they ever get three or four-way SLI functional.
The cards are 10.5-inch parts, and each one will put out 933GFLOPS. Looks like they missed the magic teraflop number by a good margin. Remember we said they missed the clock frequencies by a lot? Here is where it must sting a bit more than usual, sorry NV, no cigar.
The smaller brother, aka low-yield, salvage part, the GTX260 is basically the same chips with 192 SPs and 896M GDDR3. If you are maths-impaired, let me point out that this equates to 24 ROPs.
The clocks are 576MHz GPU, 999MHz memory and 896MHz GDDR3 on a 448b memory interface. The power is fed by two six-pin connectors. Power consumption for this 10.5-inch board is 182W.
This may look good on paper, but the die is over 550mm², 576mm² according to Theo, on the usual TSMC 65nm process. If you recall, last quarter NV blamed its tanking margins on the G92 yields.
How do you fix a low yield problem? Well, in Nvidia-land, you simply add massive die area to a part so the yields go farther down. 576 / 325 = 1.77x. Hands up anyone who thinks this will help them meet the margin goals they promised? Remember, markets are closed Monday, so if you sleep in, no loss.
The 260 will be priced at $449 and go up against the ATI 770/4870 costing MUCH less. The 280 will be about 25 per cent faster and quite likely lose badly to the R700, very badly, but cost more, $600+
I would like to reference this to the Folding thread. Take points guys.
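For the curious, the 933GFLOPS figure in the article is just shaders times shader clock times flops per clock. Assuming Nvidia rates 3 flops per shader per clock (a MAD plus a MUL, same way they counted for G80 - that ratio is my assumption, not the article's), the sums work out like this:

```python
# Back-of-envelope peak shader throughput. The 3 flops/shader/clock
# (MAD + MUL) rating is an assumption about how Nvidia counts it.
def peak_gflops(shaders, shader_clock_mhz, flops_per_clock=3):
    return shaders * shader_clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(240, 1296))  # GTX 280 -> 933.12, just short of 1TFLOP
```

Which is exactly why the missed clock targets sting: a modestly higher shader clock would have put them over the magic teraflop mark.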
-
woah. $450 for a gts part? $600 for gtx?
That's 25% more performance for ~ 100% greater price?
Good god nvidia, I hope this is not true. I'm sorry but theres no way im paying more than $400 for a gts part. Is best I get a 4870 and done.
and just to give you an idea on just how powerful the 4870 could be here's some benchmark shots (http://www.bilgiustam.com/512mb-ati-hd-4870rv770-1gb-geforce-9800-gx2-crysis-ve-3dmark-2006-testleri/). May not be 100% but...if even remotely true could spell big trouble for nvidia this summer.
-
my thoughts exactly, when i saw those prices, it aint making sense nah...
i'm drawn to the 4870 because of:
1) the projected price range
2) the low power consumption
3) the performance for the price.
i think ATi will be sipping some champagne and smoking some cigars after they launch these babies..
-
http://www.hardware-infos.com/news.php?news=2092
If all of this is true , looks like ATI will have a few months to catch up with Nvidia.
If they shrink the GT280 to a 55nm or 45nm process, half these problems will disappear, but until then, ATI FTW.
Competition , good news for everyone.
-
Wow...for a second there I thought i'd forgotten how to read...
Fellas...how's your German? Could sure use your help here.
-
Uhhh...... ???
-
i cant read or understand german, except for "Becks" Lol but based on those charts, those new gpus are on a give and take with the 9800 GTX, kinda making you choose your dependency on their hardware for "specific games". Basically the technologies they promoting will mainly coincide with particular games to get the best out of them, like Crysis is to nvidia and Half Life 2 is to Ati. sort of a mafia illuminati thing going on they, if those stats are true
-
i cant read or understand german, except for "Becks" Lol but based on those charts, those new gpus are on a give and take with the 9800 GTX, kinda making you choose your dependency on their hardware for "specific games". Basically the technologies they promoting will mainly coincide with particular games to get the best out of them, like Crysis is to nvidia and Half Life 2 is to Ati. sort of a mafia illuminati thing going on they, if those stats are true
How on earth you arrive at that conclusion? You sure you watch the right link?
It is clear that the GTX280 will be more powerful than the 9800gtx by leaps and bounds. Also remember that it is a new generation gpu, which means that even watching raw specs cant tell you exactly how its gonna perform, but one thing is for sure, its gonna be faster all around. Don't be confused by the higher clock speeds of the 9800gtx. Just look at the transistor count on those b1tches.
-
Nah boy...I was never a GTX-hoe (http://www.carigamers.com/cms/forums/index.php?action=profile;u=4209) anyway...the most I would spend on a video-card is $300 US (not including shipping).
Them new Nvidia cards might be a little outta my reach. New CPU immc oui. I shall wait for the 'GT' or 'GS' versions to see how they stack up
against the 4870 variants before I make a choice...but GOOD GOD *(c) James Brown* at the 4870 seemingly wining up on Crysis. :awesome:
I aint jumping on no bandwagon. Whoever faster at $300 US gets da money. Its as simple as that. :)
-
never understand how nvidia does what it does with low clock speeds
they never seem too interested in playing the clock speed game
unlike certain red companies...
-
i cant read or understand german, except for "Becks" Lol but based on those charts, those new gpus are on a give and take with the 9800 GTX, kinda making you choose your dependency on their hardware for "specific games". Basically the technologies they promoting will mainly coincide with particular games to get the best out of them, like Crysis is to nvidia and Half Life 2 is to Ati. sort of a mafia illuminati thing going on they, if those stats are true
How on earth you arrive at that conclusion? You sure you watch the right link?
It is clear that the GTX280 will be more powerful than the 9800gtx by leaps and bounds. Also remember that it is a new generation gpu, which means that even watching raw specs cant tell you exactly how its gonna perform, but one thing is for sure, its gonna be faster all around. Don't be confused by the higher clock speeds of the 9800gtx. Just look at the transistor count on those b1tches.
Well received, would be something to see the bacchanal that would come if Ati give nvidia a run fuh dey money!!! the kind of deals that they would bring, happier ppl on planet earth LOL
-
never understand how nvidia does what it does with low clock speeds
they never seem too interested in playing the clock speed game
unlike certain red companies...
Actually that's not altogether true. You see crixx, let's examine the 8800GT vs. HD3870.

                 8800GT     HD3870
Avg Core Clock   600MHz     775MHz
Memory Clock     900MHz     1125MHz
Shader Clock     1500MHz    775MHz
Stream Proc      112        320

Now looking at the table above, you quickly see that to say Nvidia doesn't play the frequency card would be a lie. The primary reason the 8xxx series is faster than the HD3xxx series I would have to say is Shader Frequency. AHA you may exclaim, but the HD3xxx has more than 2x the Stream Proc. VERY TRUE. But you see those 'Stream Processors' in the ATI card are ACTUALLY the shaders. ATI's unified shader architecture. Whereas in Nvidia, the shaders are actually not unified. That's why an ATI card can do GPGPU tasks like folding, but the current Nvidia cards DO NOT. The upcoming GT and GTX will have a more unified approach and hence will be taking on more GPGPU tasks. However as it stands Nvidia's cards are better at gaming and ATI cards are better at GPGPU. You could say Nvidia focused on gaming and ATI focused on processing.
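To put numbers on that shader clock point: if you assume Nvidia counts 3 flops per shader per clock (MAD+MUL) and ATI counts 2 (MAD), the two cards' theoretical peaks land surprisingly close, which is why the shader frequency matters so much. Rough sketch with my assumed ratings, not gospel:

```python
# Peak shader throughput from the spec comparison above. The
# flops-per-clock values (Nvidia 3 = MAD+MUL, ATI 2 = MAD) are my
# assumptions about each vendor's rating; real games won't hit these.
def peak_gflops(shaders, clock_mhz, flops_per_clock):
    return shaders * clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(112, 1500, 3))  # 8800GT -> 504.0
print(peak_gflops(320, 775, 2))   # HD3870 -> 496.0
```

Near-identical on paper, yet the 8800GT wins most games, so raw peak numbers clearly aren't the whole story.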
-
With that many shaders at those kinna speeds, the GTX280 promises to be one hell of a card. The cost may very well be justified for those willing to spend that much.
-
don't forget about the new onboard Physics Processor. that could very well be one of the main reasons why those new cards are so pricey. it is the first of its kind after all. check out some tech demos showing off the PPU.
http://www.youtube.com/watch?v=MoQh4R1-Vjs
http://www.youtube.com/watch?v=X9BGiMvrUrk
http://www.youtube.com/watch?v=x3eenx3E0X4
-
Yawn @ tech demos. Yes they show you what is possible but the actual application never comes or is underwhelming.
Notice how they're NOT using Vista...
And wtf is nvidia bringing out their own social networking services? 3 of them if im not mistaken? WTMC?
-
http://xbitlabs.com/news/video/display/20080528160257_Nvidia_Projects_Long_Lifespan_for_Current_Generation_Flagship_Graphics_Chip.html
The new ATI's keep looking better and better , the G92 chip will be Nvidia's mainstream performance GPU for another 6-12 months.
While there will be price drops , shrink to 55nm and higher clocks , i'd rather buy into ATI's new hotness than a strong, but 2 year old architecture.
The good news may be : All these new Nvidia promises (physics) will probably run on the G92 and G80 through CUDA just fine, i don't think their GPUs have changed much at all in the past 2 years.
ATI's new cards don't look to have any major features over the nvidia's, but i'd still say they are more future proof with DX 10.1 support etc.
-
I am waiting to see how they integrated the PhysX hardware into the card (if indeed they did at ALL). Did they integrate the logic into the core? or did they do something like Core 2 (2 cores on an MCM package)? or did they just add the PPU to the CARD, like an HD3870X2? or 9800GX2? or what. That's my interest in the physics.
Otherwise my take on why Nvidia didn't include DX10.1 on their new hardware is simply THEIR IMPLEMENTATION SUCKED. Yall saw the Assassin's Creed fiasco right? the game publisher pulled DX10.1 support as the AMD hardware got a performance BOOST and the Nvidia doesn't support it! not only that but the game is a TWIMTBP title so yuh know what play off there despite what both claim.
I am currently in the Nvidia camp (8800GT, nforce 4 chipset) and the way things looking i'll move forward on Nvidia as well (750a and another 8800GT perhaps). I like DAAMIT (my X1900XT still folds circles around even a Core 2 Duo, dare I say quad). But ftw Nvidia i'll be. (Unless HD4xxx kicks the pants off GT/GTX.)
-
That "starting at $500" thing sounding kinna ominous. If ati can deliver a competitively performing part @ or ~ half the price, I just may go crossfire in my next rig, which unfortunately will mean that I have to change mainboard.
The speculation is killing me.
-
http://www.nordichardware.com/news,7809.html
Supposed benchmarks for the latest cards. Assuming these are correct, ATI looks like they might even take back the crown; while the 4870 X2 isn't included, it may very well do better than the GT280.
Of course this is pre launch speculation, using pre launch drivers, we'll have to wait and see.
-
Yeah...so benchies arent quite out yet but here's a sneak peek...
http://www.theinquirer.net/gb/inquirer/news/2008/06/12/gt200-scores-revealed
O_o
-
This is... confusing....
Nvidia 9800GTX+ tips up
Speedy, at 738MHz
By Paul Taylor: Thursday, 19 June 2008, 10:17 AM
BREAKING FORMATION from its usual naming conventions, the green goblin is set to launch a new and improved version of its 9800GTX, aka the 9800GTX+, according to PC Perspiration.
The new and improved graphics chip will be built on a 55nm process, a shrink from the GTX’s current G92@65nm.
According to the article, the specs on the GTX+ will be something like 738MHz core, 1836MHz shader, 1000MHz GDDR3 (we’re assuming 2000MHz effective), Tri-SLI support and a unique “kick-in-the-ass” feature which, we presume, is activated when the system detects it is being pitted against DAAMIT cards. No power numbers, yet, but if this really is a 55nm shrink, then there should be some OC’ing headroom…
Pricing and availability seem to be another shocka: $229 MSRP and a mid-July launch, throwing itself right in the path of the upcoming 4850, 4870 and 4870X2. We can’t say DAAMIT wasn’t expecting this, so these 4xxx cards better be good.
PC Per says it’s got one in the lab right now, which comes as a great surprise (and a nice coup) for the masses of graphics enthusiasts, we’re sure. They’re promising numbers by the morrow.
Nvidian PR is still fast asleep, but we’ll try and get the big picture on this later.
http://www.pcper.com/comments.php?nid=5814 (http://www.pcper.com/comments.php?nid=5814)
Unless Nvidia PURPOSELY SPOILED the drivers BIG TIME on their GT/X2xx series.... so that they can increase dramatically once HD4xxx releases... this seems like it'd be a kick in the nuts to... THEMSELVES... the GT so far BARELY beats the 9800GTX... so here we have a FASTER 9800GTX??? err.... can you spell cannibalism....
-
only nvidia know what nvidia is doing, releasing cards that compete against their own tryin to make ATi look obsolete and insignificant. It's like they're operating as if there is no competition and they must compete against themselves. I hope the 4XXX series kicks nvidia in the nuts just for all of this gpu spam..lol
-
http://www.pcper.com/article.php?aid=580&type=expert&pid=1 (http://www.pcper.com/article.php?aid=580&type=expert&pid=1) << a preview of the 9800gtx+
i'm wondering who will pay 200us+ for just a few more fps when the 4850, 8800gt and 9800gtx are priced lower.. i know there are some people who would, but i don't see the worth.
-
Oh Nvidia, those conniving bastards. They release info on this card and simultaneously drop the price of the
'old' 9800 GTX to $199 US just to trump the 4850. A master stroke? Debatable. Good for consumers? Absolutely.
The 'battle' is now squarely joined at the 199-250 US price point. What makes things a whole lot more interesting is
that all Nvidia GPUs from the 9800 series and on can now take advantage of PhysX, on-GPU physics processing.
Based on preliminary results, the 9800 GTX+ is significantly faster than the 4850 as is, and with PhysX enabled,
overall system performance is even faster as well.
Will people pay the extra 30 US for that? Oh hell yes...but then, we haven't seen the 4870 performance numbers as yet,
so only time will tell.
GPU gods...show me the way. :notworthy: