AMD Making iGPU For Intel Mobile 8th Gen CPU
#1
http://www.tomshardware.com/news/amd-int...35852.html
Interestingly, this was Intel's idea.
Quote:The dawn of the chiplet marks a tremendous shift in the semiconductor industry. The industry is somewhat skeptical of the chiplet concept, largely because it requires competitors to arm their competition, but the Intel and AMD collaboration proves that it can work with two of the biggest heavyweights in the computing industry. Not to mention bitter rivals. Industry watchers have also largely been in agreement that EMIB would not filter down to the consumer market for several years, but the announcement clearly proves the technology is ready for prime time.

DARPA initially brought the chiplet revolution to the forefront with its CHIPS (Common Heterogeneous Integration and Intellectual Property (IP) Reuse Strategies) initiative, which aims to circumvent the limitations of the waning Moore's Law.

Intel plans to bring the new devices to market early in 2018 through several major OEMs. Neither Intel nor AMD have released any detailed information, such as graphics or compute capabilities, TDP ratings, or HBM2 capacity, but we expect those details to come to light early next year.
#2
Quote: Intel confirmed to us that its new arrangement with AMD doesn't include licensing of AMD's technology; instead, Intel is merely purchasing finished semi-custom products from AMD. That means Intel likely isn't abandoning its own internal GPU development programs. Intel also confirmed to us that it approached AMD to begin the project.


(11-07-2017, 01:03 AM)SteelCrysis Wrote: http://www.tomshardware.com/news/amd-int...35852.html
Interestingly, this was Intel's idea.
#3
(11-07-2017, 02:23 AM)gstanford Wrote: One wonders what nvidia makes of it though as it has the potential to squeeze them a little in the laptop arena which I doubt they are happy about.

Then bring on Volta!

Angel
#4
This could be interesting. AMD has said that their own integrated graphics for Ryzen will be weaker than this chip they are providing to Intel, which could mean this Intel/AMD hybrid is quite potent. Intel has already demonstrated that they can produce a pretty decent iGPU on their own; they did it with Broadwell. That they went to AMD anyway tells me they're going after nVidia's market share, as gstanford alluded to. We'll find out soon enough.
#5
(11-08-2017, 03:58 AM)SickBeast Wrote:  Intel has already demonstrated that they can produce a pretty decent iGPU on their own, they did it already with Broadwell.
Intel used Nvidia graphics. Intel is awful at designing their own graphics anything. They are CPU engineers.
Thinking
#6
(11-08-2017, 06:05 AM)gstanford Wrote: And maybe this whole thing is why we haven't seen Volta yet.  nvidia probably got wind of how this was going to play out months before the rest of us and might well be slipping some slight revisions into Volta to hold back any forthcoming intel threats or encroachments.

It's not just laptop graphics intel could target.  They want in on self-driving car AI systems and high-end professional compute, both areas where nvidia currently is and where intel would probably prefer it weren't.

Interesting take here (use Google translate, it's in Italian) - but please note, it's Polaris architecture, not Vega
http://www.bitsandchips.it/9-hardware/91...logia-emib

Quote: UPDATE: According to the latest leaked official information, the GPU will not be based on the Vega uArch but on Polaris, still paired with HBM2 memory.

and

Quote: Finally, according to PC World, AMD will provide video drivers, but Intel engineers will write software that will handle communication between CPU, GPU and HBM2: "Intel's software team plays a critical role both in managing power as well as ensuring that the right drivers are in place for optimizing performance."
#7
If Intel integrates a full Polaris chip onto their CPU that's going to be a really big deal, and a big blow to nVidia. Based on the fact that they are using HBM2, I have a feeling this is exactly what they are doing. It's also going to be great for PC gaming, and a bit of a death blow for the consoles. The consoles will survive, but they will take a big hit if Intel does this.
#8
(11-08-2017, 03:07 PM)SickBeast Wrote: If Intel integrates a full Polaris chip onto their CPU that's going to be a really big deal.  It will be a big blow to nVidia.  Based on the fact that they are using HBM2 I have a feeling this is exactly what they are doing.  It's going to be great for PC gaming also.  A bit of a death blow for the consoles also.  The consoles will survive but they will take a big hit if Intel does this.

I have full confidence that Intel will completely botch this. We have plenty of precedent.

All that Nvidia needs to do is bring out mobile Volta. I am pretty sure they have improved power/performance over Pascal. What else do they need to do? It is a niche gaming market - the ultra-light.

I am guessing - based on no insider information whatsoever - that we will see Volta announced at CES.
Thinking

And then Volta will be a really big deal. It will be a big blow to AMD and Intel. Based on the fact that they are NOT using HBM2. And the consoles are in a world of their own unaffected by what Intel does. And it will be great for PC gaming also - far bigger than anything Intel or AMD will do in 2018.
:doh:
#9
Who really cares? The guy who brought us Vega is heading for Intel.

Good luck, Intel

We'll see something from them in a few years. Every few years, Intel makes a foray into graphics, wastes a few billion dollars, then pulls out again. I hope they do better this time.

Angel

(11-09-2017, 07:53 AM)gstanford Wrote: The plot thickens even more.

As I suggested, Intel has hired Koduri, as their chief GPU architect no less.

So, what are intel's plans?  Well, to extend their existing graphics (note: not the GPU they will be licensing from AMD, but their own existing iGPU) into new markets and product classes, and also to develop discrete (i.e. stand-alone, plug-in) graphics cards "for a broad range of computing segments".  In other words they want both the compute and gaming markets.

https://www.anandtech.com/show/12017/int...-architect
#10
(11-09-2017, 12:44 PM)gstanford Wrote: Well, I hope intel has a go of it.  They have resources at their disposal that ATi and AMD never had.  If nothing else, intel competing in the discrete market should force some lower prices out of nvidia.

That's what I said. I hope Intel finally can break into graphics. AMD certainly isn't giving Nvidia competition.

But no matter what Intel does, it's going to take *years* to see any discrete graphics. We'll probably see incremental improvements right away, however, along with the new Radeon graphics integrated into Intel's CPUs.

I know I don't buy Intel CPUs for the CPU graphics ... although the new Coffee Lake MBs allow using it to record and stream while gaming on PCIe graphics, without impacting desktop gaming performance.
Thinking
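For what it's worth, that record-while-gaming trick boils down to pushing the capture encode onto the iGPU's QuickSync block while the discrete card renders the game. A minimal sketch of the idea as a single ffmpeg invocation, assuming a Windows box with an ffmpeg build that includes QSV support; the framerate, bitrate, and file name are just illustrative:

```shell
# Grab the desktop and encode on the Intel iGPU's QuickSync encoder (h264_qsv),
# leaving the discrete PCIe card free to render the game.
ffmpeg -f gdigrab -framerate 60 -i desktop \
       -c:v h264_qsv -preset fast -b:v 6M \
       recording.mp4
```

This is the same idea OBS uses when you pick QuickSync as the encoder: the encode load lands on the otherwise-idle iGPU instead of the CPU or the GPU running the game.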

