Intel and others are about to hit the physical limits of silicon, so what's next

Posted by demosa
Member since May 2014
213 posts
Posted on 5/28/14 at 8:18 pm
Intel is apparently having enough issues manufacturing transistors at the 14nm scale, and more problems will only arise as they continue to push the limits of silicon.

So I mean... we're only like 5-7 years away from hitting the wall with silicon, so what else is there? Graphene still seems like a lab project, and it's extremely unlikely that it's going to be in consumer devices by 2020.

what at this point is most likely to replace silicon?
Posted by ellunchboxo
Gtown
Member since Feb 2009
18778 posts
Posted on 5/28/14 at 8:26 pm to
Graphene Valley doesn't have the same ring to it.
Posted by ChuckM
Lafayette
Member since Dec 2006
1645 posts
Posted on 5/28/14 at 8:42 pm to
quote:

Graphene


Beat me to it... it's the fix-all of the future.

Posted by marchballer
The Greatest Country on Earth
Member since Aug 2008
4118 posts
Posted on 5/29/14 at 12:28 am to
It has to be graphene, but I have a buddy doing a PhD on some properties of graphene, and he seems to believe we are pretty far off from having graphene-based transistors. I know guys in a research group working on carbon nanotube transistors, but I think the yield is still way too low to be cost-effective.
Posted by Volvagia
Fort Worth
Member since Mar 2006
51892 posts
Posted on 5/29/14 at 12:43 am to
quote:

So I mean... we're only like 5-7 years away from hitting the wall with silicon, so what else is there? Graphene still seems like a lab project, and it's extremely unlikely that it's going to be in consumer devices by 2020.


True, but hardware has far outpaced software in terms of effective use of resources. Even if processing unit development stopped, there would still be advances in performance.


Also, there is less market drive at the moment for more processing power. Outside of a few exceptions, you only really utilize the power of an upper-mainstream processor when multitasking.
Posted by Hawkeye95
Member since Dec 2013
20293 posts
Posted on 5/29/14 at 9:59 am to
quote:

Also, there is less market drive at the moment for more processing power. Outside of a few exceptions, you only really utilize the power of an upper-mainstream processor when multitasking.

And they have other ways to improve performance.

The real game is in low-power chips, and Intel knows it.
Posted by junkfunky
Member since Jan 2011
33854 posts
Posted on 5/29/14 at 10:11 am to
Actual tit fat.
Posted by Korkstand
Member since Nov 2003
28703 posts
Posted on 5/29/14 at 10:21 am to
quote:

Actual tit fat.
Posted by eScott
Member since Oct 2008
11376 posts
Posted on 5/29/14 at 10:29 am to
[link=(gizmodo.com/this-brain-inspired-microchip-is-9-000-times-faster-tha-1569176926)]LINK[/link]

This looks like we still have a ways to go with silicon.

ETA: I didn't read the title completely, was up all night.
This post was edited on 5/29/14 at 10:35 am
Posted by whodatfan
Member since Mar 2008
21324 posts
Posted on 5/29/14 at 10:31 am to
Posted by foshizzle
Washington DC metro
Member since Mar 2008
40599 posts
Posted on 5/30/14 at 9:26 am to
quote:

ETA: I didn't read the title completely, was up all night.


You also posted a bad link.
Posted by hondurantiger
Portland, OR
Member since Feb 2007
2175 posts
Posted on 5/30/14 at 12:58 pm to
Intel is not the only player. There are also TSMC and the IBM/GlobalFoundries alliance.
But yes... we are starting to hit the wall.
Posted by euphemus
Member since Mar 2014
536 posts
Posted on 6/1/14 at 11:47 am to
quote:

Intel and others are about to hit the physical limits of silicon, so what's next


I think we always knew that we would reach this point some day. After all, when all is said and done, the theoretical limit for transistor gate length is a single atom. We have been heading towards this point (just like the end of oil) for some time now.
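
Just to put rough numbers on how close we already are (back-of-envelope only: silicon's lattice constant is about 0.543 nm, and node names like "14nm" are loose labels rather than literal gate lengths):

[code]
# Rough scale check: how many silicon unit cells fit across a feature of a
# given size? Lattice constant ~0.543 nm; node names are loose labels, so
# treat this as order-of-magnitude only.
SI_LATTICE_NM = 0.543

for feature_nm in (22, 14, 10):
    cells = feature_nm / SI_LATTICE_NM
    print(f"{feature_nm} nm is about {cells:.0f} silicon unit cells across")

# 22 nm is about 41 silicon unit cells across
# 14 nm is about 26 silicon unit cells across
# 10 nm is about 18 silicon unit cells across
[/code]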

Having said that, let me ask some fundamental questions:

Does it really matter if we hit a physical wall in terms of how small transistors/chips can get? Everything that needs a processor and needs to be small (smartphones, smart watches, etc.) is already small enough. The main driver once the physical limit is reached would be to make things more efficient to reduce power consumption and increase battery life. For devices that do need more power, processors can always be stacked for parallel processing - 2, 4, 8, whatever.
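
As a rough sanity check on that 2, 4, 8 scaling, here's Amdahl's law in a few lines of Python (the 90% parallel fraction is a made-up example, not a measurement):

[code]
# Back-of-envelope Amdahl's law: speedup from stacking n processors when
# only a fraction p of the workload actually runs in parallel.
# p = 0.90 is an assumed example, not a measured figure.
def amdahl_speedup(n_procs, p=0.90):
    return 1.0 / ((1.0 - p) + p / n_procs)

for n in (2, 4, 8):
    print(f"{n} processors -> {amdahl_speedup(n):.2f}x speedup")

# 2 processors -> 1.82x speedup
# 4 processors -> 3.08x speedup
# 8 processors -> 4.71x speedup
[/code]

(Purely illustrative - real workloads vary a lot in how well they parallelize.)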

So in essence what I am saying is, does the end of Moore's law even have an effect on the normal day-to-day activities of a lay person? I am not so sure it does.
Posted by Volvagia
Fort Worth
Member since Mar 2006
51892 posts
Posted on 6/1/14 at 1:31 pm to
quote:

Does it really matter if we hit a physical wall in terms of how small transistors/chips can get? Everything that needs a processor and needs to be small (smartphones, smart watches, etc.) is already small enough. The main driver once the physical limit is reached would be to make things more efficient to reduce power consumption and increase battery life. For devices that do need more power, processors can always be stacked for parallel processing - 2, 4, 8, whatever.



I think that is the whole point though.

It's not to miniaturize further.

Small architectures allow for greater parallel processing. This has been the trend for quite some time now... air-cooled processors long ago hit their clock speed ceiling.

If you can't shrink it further, then adding more processors is not realistic. It just rapidly gets too big.
Posted by euphemus
Member since Mar 2014
536 posts
Posted on 6/2/14 at 2:07 am to
I guess what I am saying is, for hardware such as desktops/servers that require large amounts of computing power, you can still use multiple processors to get the job done, since they don't have size limitations. For wearables, smartphones, tablets, etc. on which you want to watch 4K videos at 500-600 ppi or whatever, I'm sure processors at that time will be able to handle it. Beyond that point, where the human eye can't tell the difference, what is the point of increasing specs for specs' sake?
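
For reference on those pixel densities, a quick back-of-envelope for a 4K (3840x2160) panel at a few assumed diagonal sizes:

[code]
import math

# Pixel density (ppi) of a 3840x2160 "4K" panel at a few assumed diagonals.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for diag in (5.5, 10.0, 27.0):  # phone, tablet, desktop monitor (examples)
    print(f'{diag:.1f}" 4K panel: {ppi(3840, 2160, diag):.0f} ppi')

# 5.5" 4K panel: 801 ppi
# 10.0" 4K panel: 441 ppi
# 27.0" 4K panel: 163 ppi
[/code]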

The closest analogy I can come up with is this: the world is running out of oil, gas is at a premium, and everyone's focus is on maximizing fuel economy, barring a few specialized vehicles for certain use cases. If the max speed on the freeway is limited to 65 mph at the time, what is the point of building a car with 400 bhp and a top speed of 155 mph in those circumstances? As we hit the Moore's law wall, I see the same with semiconductors - existing silicon tech will be good enough for most mobile consumer applications with a focus on efficiency and battery life, while the heavy lifting will be done by desktops and mainframes (think cloud server - thin client).
This post was edited on 6/2/14 at 2:08 am
Posted by Volvagia
Fort Worth
Member since Mar 2006
51892 posts
Posted on 6/2/14 at 2:59 am to
I got that.

I didn't disagree with it. In fact, I basically said the same earlier in the thread.

I was just saying that this issue isn't about being able to make processors smaller for mobile applications, as your first post suggested.