Started By
Message

FRANKENCHIP - This will change how computing is done

Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/6/14 at 9:55 am
quote:

The chip company announced on Wednesday at GigaOm Structure in San Francisco that it is preparing to sell a Xeon E5-FPGA hybrid chip to some of its largest customers.


LINK

Essentially, an FPGA is a reconfigurable gate array that can be configured with software - a "blank computer" if you like (though it can do much more than compute). Adding this to a CPU will enable programmers to literally create custom hardware alongside the CPU.
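If that sounds abstract, here's a toy Python sketch of what "configuring" a gate array means. (The `Lut` class and the bitstream lists are made up for illustration - a real FPGA is thousands of these lookup tables plus programmable routing between them.)

```python
# Toy model of how an FPGA is "configured": each logic element is a small
# lookup table (LUT) whose truth table is loaded from a bitstream.

class Lut:
    """A 2-input logic element: its behavior is just 4 stored output bits."""
    def __init__(self, truth_table):
        self.table = truth_table       # e.g. [0, 1, 1, 0] encodes XOR

    def eval(self, a, b):
        return self.table[(a << 1) | b]

# "Program" the element as XOR, then reprogram the same hardware as AND.
gate = Lut([0, 1, 1, 0])   # XOR bitstream
assert gate.eval(1, 0) == 1
gate = Lut([0, 0, 0, 1])   # AND bitstream: same silicon, new function
assert gate.eval(1, 0) == 0
```

That reload step is the whole trick - the chip doesn't change, only the bits defining what its logic elements do.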



This is going to change the face of computing. The distinction between electrical engineering and software engineering may be largely irrelevant in the future.



quote:


Intel is taking field programmable gate arrays seriously as a means of accelerating applications and has crafted a hybrid chip that marries an FPGA to a Xeon E5 processor and puts them in the same processor socket.


I would seriously recommend any software engineers who engineer software in which performance is crucial to immediately start learning Verilog. Your job may depend on it.


The potential for accelerating games is huge. Also scientific applications. One group was able to get a 75x performance boost for a fluid dynamics code by writing custom circuits to compute the fluid fluxes.

This post was edited on 7/6/14 at 10:00 am
Posted by TigerRagAndrew
Check my style out
Member since Aug 2004
7216 posts
Posted on 7/6/14 at 6:47 pm to
Interesting. I doubt it will make it to the Core i7 anytime soon. I will keep an eye on this
Posted by TigerRagAndrew
Check my style out
Member since Aug 2004
7216 posts
Posted on 7/6/14 at 6:50 pm to
Also, without reading the article, are we talking about extending the ISA to include instructions for accessing this FPGA? Curious as to the technical implementation details.
Posted by Meauxjeaux
98836 posts including my alters
Member since Jun 2005
39887 posts
Posted on 7/6/14 at 8:39 pm to
quote:

I would seriously recommend any software engineers who engineer software in which performance is crucial to immediately start learning Verilog. Your job may depend on it.


There's still guys out there making a living on COBOL.

I think the future for software engineers, whatever they write in, is ok.
Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/10/14 at 2:16 pm to
quote:

Also, without reading the article, are we talking about extending the ISA to include instructions for accessing this FPGA? Curious as to the technical implementation details.

I dunno. That's a good question. I would think they would have to.


I've been doing some more reading and the FPGA companies themselves have been making FPGA+CPUs for a while now.



It may turn out you don't need the CPU at all. There are already several open-source CPU designs that you can download and load onto an FPGA. Think about the implications - rather than buying a particular CPU, you'd just buy an FPGA - plus the software required to program a CPU into the chip - and upgrades to the CPU could be done via software!

Whatever FPGA gates you had left over could be used for specialized hardware or for additional cores - or for a GPU. The same FPGA could be tailored to gamers, science & engineering, database management - etc.


I think it would be interesting to see how many OISCs could be crammed into one FPGA.
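For anyone who hasn't run into OISCs - a one-instruction-set computer really is as minimal as it sounds. Here's a toy SUBLEQ machine in Python (the memory layout and the little add program are my own illustration, not any particular design):

```python
# Minimal SUBLEQ one-instruction-set computer (OISC).
# Each instruction is three words (a, b, c):
#   mem[b] -= mem[a]; jump to c if the result <= 0, else fall through.

def run_subleq(mem, pc=0, max_steps=1000):
    for _ in range(max_steps):
        if pc < 0:                       # negative address halts the machine
            break
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Program: three instructions at addresses 0/3/6, data at 9-11.
# It computes mem[10] = mem[9] + mem[10] via a zero cell Z at mem[11].
prog = [9, 11, 3,     # Z -= a   (Z goes negative, jump to next instruction)
        11, 10, 6,    # b -= Z   (i.e. b += a, positive, falls through)
        11, 11, -1,   # Z -= Z   (result 0 <= 0, jump to -1: halt)
        2, 3, 0]      # data: a=2, b=3, Z=0
assert run_subleq(prog)[10] == 5   # 2 + 3
```

Everything - add, copy, branch - gets built out of that one subtract-and-branch, which is why people like cramming huge arrays of them into FPGAs.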

This post was edited on 7/10/14 at 2:19 pm
Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/10/14 at 2:23 pm to
quote:


There's still guys out there making a living on COBOL.

I think the future for software engineers, whatever they write in, is ok.






I agree - when it comes to actual software programming.

Verilog (and other languages like it) is a hardware description language. Its syntax looks similar - but it works in a very different way. Rather than describing an algorithm - you're describing how a bunch of digital circuits hook up. All the modules in the "program" are "running" at all times, in parallel - essentially. You are describing dataflow rather than an algorithm. You also need to know a thing or two about electric circuits to write efficient code.
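A rough way to see the difference - sketched in Python rather than actual Verilog, with made-up names: imagine every block reading the same snapshot of the wires, and all outputs updating at once on each clock tick, instead of statements executing one after another.

```python
# Sketch of the "everything runs at once" semantics of an HDL: every block
# reads the OLD wire values, and all outputs update together on the tick -
# unlike sequential statements in C.

def tick(wires, blocks):
    """Evaluate every hardware block against the same snapshot of the wires."""
    snapshot = dict(wires)
    updates = {out: fn(snapshot) for out, fn in blocks.items()}  # all "run" in parallel
    wires.update(updates)
    return wires

# Two-stage pipeline: b follows a, c follows b, one clock tick apart.
blocks = {"b": lambda w: w["a"], "c": lambda w: w["b"]}
wires = {"a": 1, "b": 0, "c": 0}
tick(wires, blocks)   # b picks up a's old value; c picks up b's OLD value
assert (wires["b"], wires["c"]) == (1, 0)
tick(wires, blocks)
assert wires["c"] == 1   # the value propagates one stage per tick
```

If the blocks ran sequentially like C statements, `c` would get the new value of `b` in the first tick - that mental shift is most of what software people have to unlearn.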

There are engineers translating C and C++ code into Verilog right now. They'll see 10x-100x performance improvements - you don't get that from switching from COBOL to C++. This is a whole different ballgame. If software engineers don't start playing electrical engineer on the side, the electrical engineers playing software engineer on the side may beat them.

This post was edited on 7/10/14 at 2:25 pm
Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/12/14 at 6:38 am to
These guys made a 1,000-core CPU with an FPGA:


LINK


They processed MPEG video with it 20x faster.
Posted by UltimaParadox
Huntsville
Member since Nov 2008
40839 posts
Posted on 7/12/14 at 10:42 am to
Any decent EE/CPE degree has been requiring VHDL classes for 20 years. Most universities call it advanced digital logic.

FPGAs are making a comeback to provide specialization speed increases, as they just can't add more cores anymore. It's like math co-processors back in the day to give floating-point arithmetic.
Posted by CP3
Baton Rouge
Member since Sep 2009
7401 posts
Posted on 7/12/14 at 10:51 am to
Yup. We had Verilog in Digital Logic I & II at LSU, which is required for both computer and electrical engineering.
Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/17/14 at 11:25 am to
quote:

Any decent EE/CPE degree has been requiring VHDL classes for 20 years. Most universities call it advanced digital logic.



Of course - isn't that basically what most EEs do these days anyway?

I'm talking about software engineers. These guys (me for instance) are going to have to learn to program hardware. At least the ones with apps that require top performance. And I see the EE guys as being competitors - I think it's a lot easier for an EE to learn C++ than it is for a CS to learn Verilog.

quote:


FPGAs are making a comeback to provide specialization speed increases, as they just can't add more cores anymore. It's like math co-processors back in the day to give floating-point arithmetic.


I love it every time I read about these things. I've been doing HPC for years, and every now and then I'd hear of people using ASICs or FPGAs to accelerate something - but now FPGA clusters are becoming more and more common. They will ultimately blow GPUs away, I think, because you can always program an FPGA to act like a GPU.


The FPGA+CPU on the same chip will make it even more useful. The FPGA has direct access to the CPU's cache - this means you can use the FPGA to create custom caching algorithms.




The cool thing about Verilog from my POV is that it is dataflow oriented. Most of the high-performance stuff I write is best expressed as a dataflow (from one short-lived, fine-grained thread to the next).
This post was edited on 7/17/14 at 11:28 am
Posted by euphemus
Member since Mar 2014
536 posts
Posted on 7/17/14 at 11:46 am to
This is the nerdiest active thread currently on the Tech Board.

Let me ask you guys a question:

Is the end of Moore's law here, seeing as how Intel is struggling so much with their latest 14nm node? Once we hit the theoretical physical limit of gate length (a single atom or whatever), where do we go from there?
Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/17/14 at 11:51 am to
quote:

This is the nerdiest active thread currently on the Tech Board.


That's a shame if you think about it.

quote:

Once we hit the theoretical physical limit of a gate length (a single atom or whatever), where do we go from there?



Parallelism is the answer. On all levels.

The FPGA - essentially - is a way to achieve extremely fine-grained parallelism. But parallelism will have to be exploited at all levels to realize full performance potential. The Intel Xeon Phis, for instance, pack ~70 cores on one chip (and I think each runs 4 threads - so ~280 threads).
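At the coarse end, exploiting that looks something like the sketch below - split the work into independent chunks and map them across workers. (Python threads here just to keep it short; for CPU-bound Python you'd use processes, and on a Phi you'd want hundreds of workers to keep every hardware thread busy.)

```python
# Coarse-grained data parallelism: independent chunks mapped across workers.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles one independent slice of the data.
    return sum(x * x for x in chunk)

data = list(range(1000))
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]  # 4 pieces

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))

assert total == sum(x * x for x in data)  # same answer as the serial version
```

The FPGA gives you the same idea at the opposite extreme - instead of 4 workers on chunks, thousands of gates each doing one tiny piece every cycle.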


Posted by Hawkeye95
Member since Dec 2013
20293 posts
Posted on 7/17/14 at 11:53 am to
quote:

I would seriously recommend any software engineers who engineer software in which performance is crucial to immediately start learning Verilog. Your job may depend on it.


The thing is, performance isn't really that relevant right now. For most applications, current hardware provides more than enough processing power.

Even with big data the barrier is often memory addressing more so than processing speed.
Posted by SpidermanTUba
my house
Member since May 2004
36128 posts
Posted on 7/17/14 at 12:02 pm to
quote:


The thing is, performance isn't really that relevant right now. For most applications, current hardware provides more than enough processing power.



This is why I qualified it as applying to software where performance is crucial.


I would point out - however - that I think the landscape is changing in terms of what businesses will demand in software. Now that HPC is available in the cloud, small- and medium-sized businesses can make use of it without the hurdle of buying and maintaining their own machine.

quote:


Even with big data the barrier is often memory addressing more so than processing speed.




You're right that memory bandwidth constitutes a huge hurdle. I think there is still room for improvement in terms of algorithms, though.

Caching, for instance, is hardly ever used in the most optimal way. FPGAs may be able to help with custom caching (I'm just speculating - I haven't read anything that says they can).
Posted by Hawkeye95
Member since Dec 2013
20293 posts
Posted on 7/17/14 at 1:08 pm to
quote:

I would point out - however - that I think the landscape is changing in terms of what businesses will demand in software. Now that HPC is available in the cloud, small- and medium-sized businesses can make use of it without the hurdle of buying and maintaining their own machine.


Yeah, but I think current hardware will meet their requirements. And if it doesn't, the cloud providers will just throw CPUs at the problem.

I don't work on big data stuff but I have a friend who does. He says most of the use cases businesses want to do aren't really big data, they're just BI++, and that big data is overkill for most everything they're looking at. Maybe that changes in 5 years. I dunno.