As a former Intelite, I've watched the past decade in particular and worried... I hope the reward happens, as Intel has been mired since BK took over in 2013; and honestly, fuck him for what he did to get to CEO... he led the company downhill at a rapidly increasing velocity. In fact, both CEOs from the fab side have done serious damage to the company...
@teachnaduinn3134 18 hours ago
Great explanation of the subject, thanks!
@user-fx2oo3bi9c a day ago
Wait for the 0.0001 nm chip 😂😂😂 No battery 🔋 needed.
@vtrandal a day ago
Nvidia has been slow to adopt low-precision (8-bit or less) multipliers in their GPU products. Google was years ahead of Nvidia in this important need for the advance of deep learning and AI.
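The low-precision scheme this comment refers to can be sketched in a few lines of numpy (all values and scales below are made up for illustration): quantize float inputs to int8, multiply-accumulate in int32, then dequantize — the core of TPU-style inference arithmetic.

```python
import numpy as np

# Quantize a float tensor to int8 with a per-tensor scale (illustrative values).
def quantize(x, scale):
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

x = np.array([0.5, -1.2, 0.8], dtype=np.float32)   # activations (made up)
w = np.array([0.3, 0.7, -0.4], dtype=np.float32)   # weights (made up)

sx, sw = 0.01, 0.01                                # assumed per-tensor scales
qx, qw = quantize(x, sx), quantize(w, sw)

# 8-bit multiplies, 32-bit accumulator — far cheaper in silicon than fp32 MACs.
acc = np.sum(qx.astype(np.int32) * qw.astype(np.int32))
approx = float(acc) * sx * sw                      # dequantized dot product
exact = float(np.dot(x, w))
print(approx, exact)                               # nearly identical results
```

The point of the sketch: the approximate dot product matches the fp32 one closely, while an int8 multiplier needs a fraction of the area and energy of a floating-point one.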
@vtrandal a day ago
@3:25 DWDM comes to mind as you're reviewing the history of photonics. DWDM is dense wavelength division multiplexing, which is an unfortunate name. Fortunately, the entire electromagnetic spectrum up to some limit (wavelengths approaching the Planck length) seems to obey the mathematics of orthogonal decomposition, as in linear systems.
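The orthogonal-decomposition point behind DWDM can be demonstrated numerically: two carriers at different frequencies, summed onto one "fiber", are recovered independently by projection (here a plain DFT). The frequencies and amplitudes below are arbitrary.

```python
import numpy as np

n = 1024
t = np.arange(n) / n
f1, f2 = 50, 80                    # two "wavelength" channels (arbitrary bins)

# Sum of two carriers on one medium, amplitudes 3.0 and 1.5.
signal = 3.0 * np.sin(2 * np.pi * f1 * t) + 1.5 * np.sin(2 * np.pi * f2 * t)

# Projection onto orthogonal basis functions separates the channels exactly.
spectrum = np.abs(np.fft.rfft(signal)) * 2 / n
print(spectrum[f1], spectrum[f2])  # recovers 3.0 and 1.5
```

Because sinusoids at distinct integer frequencies are orthogonal over a full period, each channel's amplitude comes back uncorrupted by the other — the same linear-systems fact that lets dozens of wavelengths share one fiber.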
@user-mi9rl7eu9h a day ago
If you trust the US gov with oxygen, you're already dead.
@brurpo a day ago
Intel really likes the number 14, huh?
@anonymous-nu9vs 2 days ago
2:10
@alaintulinabo8155 2 days ago
Awesome, thanks for breaking everything down. What do you think Intel's stock will be in 10 years, above or equal to NVIDIA in terms of market share?
@gregoryshklover3088 2 days ago
A bit confused by the message... TSMC will start producing this technology in 2026. Intel seems to be the pioneer: Arrow Lake is planned for later this year on the 20A process with RibbonFET and backside power delivery. The clips used in the video are actually from Intel press releases... Intel has a track record of innovation and pioneering, including high-k, FinFETs, and in this case RibbonFETs with backside power.
@apedanticpeasant1447 3 days ago
This was brilliant. Thank you for taking the time to create this.
@peterkimemiah9669 3 days ago
Travel in milliseconds, I believe, not microseconds. Or are both correct?
@widodoakrom3938 3 days ago
Remember, 100 years ago humans only had room-sized computers.
3 days ago
Thank you for this wonderful and informative video about this new tech.
@nimakolahimahmodi2499 3 days ago
What was the name of the memristor chip?
@gaming_for_sanity 3 days ago
Seems incredibly dangerous and dumb to allow 90% of the world’s chips to be made in a place China is planning to invade.
@rogerpha1398 4 days ago
GlobalFoundries was holding AMD back; AMD became a powerhouse once it moved production to TSMC.
@barriewright2857 4 days ago
Brilliant 👏🏿.
@haasandreas 4 days ago
Very interesting, thanks for the video. 😁
@humn_rights 4 days ago
I don't understand this 😂
@tuvoca825 4 days ago
Technology is always both good and bad. When we see the bad... we cringe to think what is next.
@williamwidjaja850 5 days ago
Nope, my ex-employer, and a fallen giant like Boeing.
@adamesd3699 5 days ago
0:18 Wait, what? TSMC doesn’t build 90% of the world’s supply of chips. Not even close. Did you mean almost 90% of leading edge nodes? That would be more accurate.
@thedubdude 5 days ago
I love watching your videos. You are great at explaining things. Keep up the great work. Thanks. More photonics would be awesome.
@adrian80212 5 days ago
Anastasia, are you wearing a Cartier Santos watch?
@phvaessen 5 days ago
What about the ACCEL chip from the Chinese company SMIC? They announced a new photonic chip 3,000 times faster than Nvidia's A100 using 1 million times less energy, able to be produced with old, well-known technology (no need for expensive lithography) and much cheaper to make.
@CPUGalaxy 6 days ago
Very interesting video!
@CyScorpion 6 days ago
Anastasi, when I listen to you explain this tech, not only is it astounding to learn but it inspires the imagination as to what kinds of tech evolves from us overcoming new limitations.
@abc5228 6 days ago
Gambling can work, just as it can NOT...
@ScientificZoom 7 days ago
👍👌👏🎉
@wisefool3619 7 days ago
Does photonic computing generate less heat?
@Gazzat73091 7 days ago
Cut the bullshit. Who needs computing power like that and doesn't seek to abuse it? It's high time you told us the truth about AI: exactly what it is, as well as exactly why you're so desperate to scale it as vast as possible when you have no idea how deadly the consequences will be for your fellow humans.
@teachnaduinn3134 7 days ago
Give us a new vid, you're great!
@ImagesOfCountries 8 days ago
Awesome presentation! ... 👍
@user-mp3fj6xt9w 8 days ago
Aren't there still problems with how to mass-produce graphene, though?
@odin823 9 days ago
With great risk comes great reward. At the same time, I wouldn't buy any brand-new tech. You want to wait until they work out all the bugs before you invest in a machine with untested technology. With tech changing at astronomical rates, you need to watch and see what the new standards are going to be with AI. By the time they work the bugs out of something, there is something new that makes it obsolete. So you really have to watch what you buy. No way I'm buying another Blu-ray disc player.
@deanfrazier7553 9 days ago
It would be interesting to see the node-by-year chart on a log scale.
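For what it's worth, a log scale is the natural choice here: a roughly constant shrink factor per node generation is exponential decay, which becomes a straight line in log space. A quick numpy sketch (the node values below are the familiar marketing labels with approximate years, not precise data):

```python
import numpy as np

# Approximate node labels (nm) by rough introduction year — illustrative only.
years = np.array([2001, 2005, 2009, 2012, 2015, 2018, 2021, 2024])
nodes = np.array([130.0, 65.0, 32.0, 22.0, 14.0, 10.0, 5.0, 3.0])

# Fit a line to log(node) vs. year: exponential decay is linear in log space.
slope, intercept = np.polyfit(years, np.log(nodes), 1)
print(f"average shrink per year: {np.exp(slope):.2f}x")  # → about 0.85x
# To plot: matplotlib's plt.semilogy(years, nodes, "o-") draws this as a
# near-straight line, whereas a linear y-axis squashes everything after ~2010.
```

The good linear fit in log space is exactly why a semilog chart is more readable than a linear one for node trends.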
@weakbit633 9 days ago
Thank you for the informative video! Thumbs up nr. 11780. What I don't understand is the size of a RibbonFET: when they say a chip is 5 nm, does it have a different physical size in reality? Can anyone explain why that is? Best regards, weakbit
@aniketbisht2823 9 days ago
You speak so softly. I had to crank my volume to 90%. Nevertheless, interesting video.
@TriPham-yo7we 10 days ago
Using the digital-camera idea of a CCD could be very efficient; an array of, say, 64 cameras might perform more data processing than a supercomputer while running cooler and smaller.
@Jayf1981 10 days ago
We miss you! Where R U?
@gator1984atcomcast 10 days ago
I was in the Air Force at Edwards AFB in California in 1963 when a 23-year-old soldier predicted that light would be used for computers. Photons aren't faster than electrons, but communication with fiber optics suggests computation at the speed of information transfer.
@gator1984atcomcast 10 days ago
Photonic computing should be used for AI, resulting in much less power consumption.
@benben-dn1ck 11 days ago
If I'm not mistaken, Whole Brain Emulation (WBE) is uploading everything, including feelings and emotions (I mean, feelings and emotions should remain intact for the uploaded brain to believe it fully exists). If this understanding of mine is correct, then how is this "person" going to satisfy desires such as sex and hunger? I mean, it has now become a robot. The other point is, let's assume I just got my brain uploaded: how would friends and family see me?
@equityomnia8388 11 days ago
Congratulations on the ASM sponsorship; you truly deserve it for the quality of your knowledge and the educational value of how you share it. Also, I feel that all your viewers would be interested in a video about hyper-NA EUV machines...
@acmaysnetworker 11 days ago
Thank you, this was an awesome video and it was NOT too long. The knowledge and the breakdown of the structures of the next-gen chips is helping my information gathering as we go into the future at blazing rates. YES, please complete the CFET video, as I will be waiting to consume the information. Understanding the chips and new memory and coding is imperative for any of us in tech or STEM. There are not enough hours in a day to stay up on the massive amounts of new data on new methods and tech. Again, thank you for your work and channel.
@giaccommander7474 12 days ago
@AnastasiInTech ...Are you okay? You sound unwell... if so, get well soon!
@4puf 12 days ago
I would like to hear you talk about Groq chips. Can you make a video about it?
@AnastasiInTech 12 days ago
kzread.info/dash/bejne/gK6N0tGJmMbRc84.html
@benandcullensoldchannel2087 12 days ago
So her choices were chip design engineer or supermodel?
Planned for 2027 but might see it in 2037