Intel's Loihi 2 Accelerates Efforts To Make Chips Like Human Brains

Started by kafa88, October 01, 2021, 10:40:57 AM

AI chips are mainly used for specialized mathematical calculations. Neuromorphic computing works very differently.

Intel revealed Thursday that Loihi 2, its second-generation processor, combines conventional electronics with an architecture modeled on the human brain, in an effort to inject new advances into the computing industry. Loihi 2, an example of a technology called neuromorphic computing, is about 10 times faster than its predecessor, according to Intel. The speed improvement is the result of an eight-fold increase in the number of digital neurons.

Each digital neuron is the rough equivalent of a human brain cell and mimics the way the brain handles information. The chip can also be programmed more flexibly, helping researchers apply it to a wider range of computing tasks. Loihi 2 is built with a pre-production version of the Intel 4 manufacturing process, the advanced node Intel plans to use for mainstream chips arriving in 2023. The Intel 4 process can etch electronics onto a chip more densely.

That density is a big advantage: it lets Intel pack 1 million digital neurons onto a chip that is 30 square millimeters in size. The Loihi chip is particularly good at detecting sensory inputs like gestures, sounds, and even smells, said Mike Davies, head of the Intel Labs group that developed Loihi. "We can detect slippage if the robotic hand picks up the cup," Davies said.
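
The article does not describe how Loihi's digital neurons work internally, but a common model in neuromorphic designs is the leaky integrate-and-fire neuron: it accumulates weighted input spikes, leaks charge over time, and emits a spike of its own once a threshold is crossed. The Python sketch below is only a generic illustration of that idea, with made-up threshold and decay values; it is not Intel's actual neuron model or programming interface.

# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only: Loihi 2's real neuron models are programmable and
# more sophisticated; the threshold and decay values here are arbitrary.

class LIFNeuron:
    def __init__(self, threshold=1.0, decay=0.9):
        self.potential = 0.0        # membrane potential (state lives inside the neuron)
        self.threshold = threshold
        self.decay = decay          # leak applied at each step

    def step(self, weighted_input):
        """Integrate input, apply leak, and fire if the threshold is crossed."""
        self.potential = self.potential * self.decay + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # emit a spike
        return 0                    # stay silent


neuron = LIFNeuron()
inputs = [0.3, 0.4, 0.5, 0.0, 0.2]          # incoming weighted spike activity
spikes = [neuron.step(x) for x in inputs]
print(spikes)                                # [0, 0, 1, 0, 0] -- fires once the potential crosses 1.0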

Neuromorphic computing is also different from artificial intelligence, a revolutionary computing technology that is only loosely based on the way the brain learns and responds; neuromorphic designs focus more on the physical nature of the brain's gray matter. Loihi 2 is likewise profoundly different from conventional chips. For example, it stores data in small amounts spread across its network of neurons rather than in one large central memory, and there is no central clock signal to synchronize the processing steps on the chip.
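
To make the "no central clock" point concrete, here is a rough event-driven sketch in the same spirit: each neuron keeps its own small piece of state, and work happens only when a spike event reaches a neuron, rather than on a global clock tick. The tiny network, weights, and threshold below are invented for illustration; this is not the Loihi 2 programming model.

# Event-driven spike propagation sketch: no global clock, each neuron
# holds its own state and only updates when a spike event reaches it.
# Assumption-level illustration, not Intel's Loihi 2 API.

from collections import deque

# Tiny hypothetical network: neuron id -> list of (target id, synapse weight)
synapses = {0: [(1, 1.0), (2, 0.6)], 1: [(2, 0.5)], 2: []}
potential = {nid: 0.0 for nid in synapses}   # state is distributed, one value per neuron
THRESHOLD = 1.0

events = deque([(0, 1.2)])       # initial stimulus: 1.2 units of input delivered to neuron 0
while events:                    # spikes are processed as they arrive, with no clock ticks
    target, charge = events.popleft()
    potential[target] += charge
    if potential[target] >= THRESHOLD:
        potential[target] = 0.0                  # reset the neuron that fired
        for nxt, weight in synapses[target]:
            events.append((nxt, weight))         # the spike fans out to downstream neurons
        print(f"neuron {target} fired")

Running this toy example, the initial stimulus cascades: neuron 0 fires, which pushes neurons 1 and 2 over their thresholds in turn, and the computation stops on its own once no spike events remain.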