Kyle banned me from the Hardforum!
#61
Guys, let's not bait apoppin into saying something untoward about nVidia. It sounds like it's just the way things are done in this business. IMO nVidia has been absolutely fantastic to BTR and I personally appreciate it. I don't think this site would even exist if it weren't for nVidia's generosity, expertise, and support. It's enough that apoppin has to buy all the AMD cards out of pocket. I don't think it's fair to expect him to suddenly turn his back on nVidia, which has been good to him, just to make a point about the NDA or whatever else Kyle and the AMD trolls are raving about.

I don't think it's fair to say that nVidia has apoppin "over the barrel", either. nVidia gives apoppin the cards for review, and he reviews them. It's that simple. The numbers speak for themselves. apoppin is usually testing at least 25 games. The cards are either going to perform well, or they are not. Editorial portions of articles are mostly fluff IMO and I usually skim them. It's those portions where bias can come into play. But there is no arguing with the largest game test suite of any English-language site, IMO. Really, I have never seen anything posted either here or at ABT that was not 100% the truth. If nVidia really has apoppin over the barrel it's not having much effect at all IMO. Paladin

(08-29-2018, 03:41 PM)gstanford Wrote: Oh, I've read the available info on Turing pretty darn carefully, don't you worry about that!

It's slow at raytracing (as reported by several at the event - it can't maintain 60 fps a lot of the time in the showcase games).

I'm not barking up any wrong trees, despite your cute graphics suggesting otherwise in the other thread.

I know bullshit when I see it and Turing is bullshit.  Overpriced bullshit with heavy handed PR tactics to boot.

Sorry nvidia, I ain't drinking this dodgy koolaid and neither is most of the rest of the internet either!
Greg, seriously, you need to wait for reviews and see. Yes, RTX is in its infancy and yes, it looks like it performs slowly, but really I applaud nVidia for creating this technology that looks to be the future of computer graphics. Are you just ignoring the fact that they are promising a 50% performance uplift in current games? That's pretty good IMO. Yes, the pricing sucks on the Ti and the 2080, but the 2070 looks to me like it could be the dark horse and turn out to be a tremendous value. Titan XP performance for $499 before you even overclock it is fantastic IMO.

I do take issue with the pricing on the higher end models and it sucks that guys like us have no real upgrade path ATM, but I think you are being overly critical of this launch without having seen how the cards perform across a number of current and upcoming games.

We haven't even seen the reviews. Who knows, even the higher end cards could be worth it at the prices they are charging. Early signs are pointing to "no", but we just don't know yet.
#62
(08-29-2018, 04:22 PM)gstanford Wrote: I can already see all I need to see.  This is going to be nv30 mk II.
Wow, that's harsh. You're comparing Turing to a failed architecture that did not support all of the DX features of the time, that also ran hot and loud? Turing looks like it will run cool, overclock well, provide a 50% performance uplift over the previous generation, and add the coolest new graphics feature since DX9. AMD has nothing that even comes close and who knows if and when they will ever support ray tracing in their hardware.

If anything this launch reminds me of the 8800GTX generation. The big card is fast but you have to pay through the nose if you want it.
#63
I don't know what got you up on the wrong side of the bed this morning, but you have attacked me and my integrity, implying that I "owe" something to Nvidia besides a thank-you and fair reviews of their hardware. You never backed down on it, but instead went on to escalate your nonsense with attacks on Nvidia using snippets of what you have gleaned so far.

I am done discussing this with you - until after the reviews are out - or you will have me at an unfair disadvantage. Your mind is already sealed shut and firmly made up, and I don't give a fsck about changing it. I will post my opinion and do nothing different with my upcoming reviews - without any fear of repercussions, as I have *always* done. And the cards will fall as they will. I am not going to walk on eggshells out of fear of what might happen. I didn't back down with AMD over their lack of ethics, and I will be god-damned if I start to wilt now.

I can shut BTR down in a heartbeat if I choose - and I will - before I compromise on ANYTHING that is against my morals and ethics. It is my HOBBY, and not my main source of income. I have already elected to lose thousands of dollars because I refuse to accept sponsored content - I don't do this for money nor do I hold back on my honest opinion and criticism for *any* reason.

At THIS POINT, there is a LOT I don't know - and my mind isn't even made up as to any sort of a conclusion yet. I cannot make any judgment yet and I will not stupidly criticize Nvidia because of my lack of understanding at this point - like one of us is doing.


(08-29-2018, 04:22 PM)gstanford Wrote: I'm not baiting apoppin to criticize nvidia.  Both he and I know very well he can't if he wants current arrangements with nvidia to continue.  He will be out of pocket and late with reviews if he does, that is the reality with the NDA situation.

nvidia will just say "We treated you well with the plane tickets and you abused our good treatment of you with criticism/poor review, no more free hardware or early driver access or reviewer tools for you!".

Quote:Greg, seriously, you need to wait for reviews and see.
I can already see all I need to see.  This is going to be nv30 mk II.
#64
(08-29-2018, 04:33 PM)gstanford Wrote: It's amazing how close turing is to nv30 on certain levels.  Especially the mixed and fixed precisions....

and i'm sure nvidia will "optimize" their drivers to "fix" things like shadow of the tomb raider, just like they did for nv30.....
They have had less than a year to work on these games and this is brand new technology. The way they have shown the games running now is nothing short of remarkable IMO. It looks like buying this generation's cards for the RTX features is the wrong move. But that's only one component of the launch. What about the 50% performance uplift? And seeing as you seem to hate the ray tracing already, maybe you would like the new DLSS AA. You seem to love your AA, and that feature alone would mean a huge performance boost for you in the games that support it. And I'm guessing most if not all of the upcoming games will support the feature. I'm sure nVidia will work with the developers to make sure it has widespread support, like they always do.

In the long run, the RTX ray tracing is going to lead to way better looking games and less work overall for the developers. Take that from someone with some experience in 3D rendering with ray tracing.
#65
(08-29-2018, 04:29 PM)SickBeast Wrote:
(08-29-2018, 04:22 PM)gstanford Wrote: I can already see all I need to see.  This is going to be nv30 mk II.
Wow, that's harsh.  You're comparing Turing to a failed architecture that did not support all of the DX features of the time, that also ran hot and loud?  Turing looks like it will run cool, overclock well, provide a 50% performance uplift over the previous generation, and add the coolest new graphics feature since DX9.  AMD has nothing that even comes close and who knows if and when they will ever support ray tracing in their hardware.

If anything this launch reminds me of the 8800GTX generation.  The big card is fast but you have to pay through the nose if you want it.


You cannot debate with someone whose mind is already made up. At least I won't bother. I have seen his BS before and I won't be trapped by it again.

He is pissed off and lashing out because he can neither afford RT nor use it on his outdated and buggy Windows OS. My advice: stick to what you know - the PS4 is a good match for you, Greg. Forget modern PC gaming; you are out of date and you don't need photo-realism anyway.
#66
(08-29-2018, 04:40 PM)apoppin Wrote:
(08-29-2018, 04:29 PM)SickBeast Wrote:
(08-29-2018, 04:22 PM)gstanford Wrote: I can already see all I need to see.  This is going to be nv30 mk II.
Wow, that's harsh.  You're comparing Turing to a failed architecture that did not support all of the DX features of the time, that also ran hot and loud?  Turing looks like it will run cool, overclock well, provide a 50% performance uplift over the previous generation, and add the coolest new graphics feature since DX9.  AMD has nothing that even comes close and who knows if and when they will ever support ray tracing in their hardware.

If anything this launch reminds me of the 8800GTX generation.  The big card is fast but you have to pay through the nose if you want it.


You cannot debate with someone whose mind is already made up.  At least I won't bother.  I have seen this BS before and I won't be trapped by it again.

No, you shouldn't bother. But I will. Aggressive

Greg likes a good debate. The console gaming thread at ABT is an all time classic.
#67
(08-29-2018, 04:43 PM)SickBeast Wrote: No, you shouldn't bother.  But I will.  Aggressive

Greg likes a good debate.  The console gaming thread at ABT is an all time classic.

Good luck with it. Greg is barely a PC gamer at all, with a deprecated OS, 1080P, and an inability to even run DX12, never mind any kind of RT. He is no expert on this subject, although in his own mind he is, based on what he has gathered from leaks and from snippets of publicly available information.

Greg is probably just pissed off because he painted himself into a corner, and he decided to attack everyone today - including Nvidia, and also me - for accepting travel (as usual) to an Nvidia-sponsored event. He is trying to paint me as "owing" them a kiss-ass review - something I have never done in the past and will never do. If I compromise or kiss ass - like he is implying - I might as well stop reviewing.

And as I say again, wait for the reviews. Then we will see the truth of the matter - and whether or not I have become a sell-out.
#68
Thanks for what you shared with us on IQ. But now there is something new that you are not even close to being an expert on, yet you make a sweeping pre-judgement that RT is "not ready".

I can't comment on price yet. I simply don't know enough, and am glad to admit it. I don't know about performance yet either. Wait for the reviews and see whether mine make me out to be a kiss-ass sell-out, or not.

But carry on. Your opinion is your own and I am not going to say anything other than I disagree with it.
#69
(08-29-2018, 04:55 PM)gstanford Wrote: It isn't a matter of what I can and can't afford (and I can afford Turing if I decide I want it).  Its a matter of bang for buck, which is where Turing falls down and has pissed just about everyone off.

RayTracing likely will revolutionize gaming . . . In time, and its time hasn't arrived yet (Just like VR's hasn't or 3DVision's for that matter).  When consoles can do it with reasonable performance it will take off.

I'm not out of date and I've demonstrated that I know a thing or two about "photo realism" (high quality rendering) and how to extract it from games over the years....  More than most on this forum I'd say.
We largely agree that Turing is a generation to be skipped by both of us. What you just said there was by far the most reasonable comment you have made today. Turing's price is the big issue. Really, IMO it's the only issue, aside from the ray tracing potentially running slowly and/or with an unstable framerate and maybe even hitching.

I'm still willing to reserve judgement until I see the reviews. If Geforce Turing blows me away, I will buy it. It's that simple.

Another option I'm considering is RTX 2070 SLI if nVidia has indeed "fixed" SLI technology with the new connector they have created. If it's universally supported in all games with 100% scaling, that could be very interesting. I'm sure apoppin is going to want a partner card to test this feature, or else team up with someone like Elric again.

The RTX 2070 is the only Turing card that looks to be priced somewhat reasonably to me. $399-449 would have been a killer price IMO. But I understand nVidia needs to make money.
#70
(08-29-2018, 05:01 PM)SickBeast Wrote: We are largely agreed that Turing is a generation to be skipped by both of us.  What you just said there was by far the most reasonable comment you have made today.  Turing's price is the big issue.  Really IMO it's the only issue ...

If you are stuck on 1080P/DX11 because of an old OS, then Turing is a silly upgrade.   I get that Turing is out for Greg.

And for you - with a GTX 1080 Ti - you simply don't need to upgrade at all.  Yes, it *appears* that your upgrade path is quite expensive.  But if you had a Titan Xp, I would say it is a trivial upgrade if performance justifies it (there is good resale value in the Titans).  But I cannot comment yet since I am completely in the dark about performance and don't even know what the new top card is replacing.  

Wait for the reviews

