This New, All-Optical Computer Has Blistering 100 GHz Clock Speeds

The computing world is on the cusp of a transformative leap forward, as researchers at the California Institute of Technology (Caltech) have unveiled an all-optical computer capable of achieving clock speeds exceeding 100 GHz. This new technology has the potential to revolutionize industries requiring real-time data processing and could be the start of a new era in ultrafast computing.

A computer's clock speed determines how fast it can execute instructions, making it a critical factor for performance. For decades, clock speeds rose steadily alongside Moore's Law, but they plateaued around 5 GHz in the early 2000s, when two significant roadblocks halted further progress.

The first was the breakdown of Dennard scaling, the observation that shrinking transistors would keep power density constant. At small enough sizes, transistors leak current, driving power consumption up. The second is the von Neumann bottleneck, which limits data-transfer speeds between memory and processors.

These challenges have held back applications demanding ultra-high-speed processing. But that could all change soon, according to a preprint posted on arXiv. Enter the first 100 GHz all-optical computer, a design that sidesteps these limitations by computing with light instead of electricity.


At the heart of this new computer is an optical implementation of a recurrent neural network. The device operates entirely within the optical domain, utilizing laser pulses to process data. One key component is the optical cavity, which acts as a memory and a computational layer. Here, light signals are recirculated and manipulated at astonishing speeds determined by the frequency of the laser pulses.

This architecture allows the all-optical computer to perform tasks like signal classification, time-series prediction, and image generation with unparalleled speed and efficiency. Unlike traditional designs, the optical approach eliminates bottlenecks associated with data transfer and power density.
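The recurrent-network idea can be illustrated numerically. Below is a minimal, purely illustrative sketch of a reservoir-style recurrent update, in which a stored state is recirculated, attenuated, and mixed with each incoming pulse, loosely analogous to light looping in an optical cavity. This is not the paper's actual model; all names, parameters, and values here are assumptions for illustration.

```python
import math
import random

# Sketch of a reservoir-style recurrent update: the "cavity state" is
# mixed on each round trip and combined with the next input pulse.
# The decay factor stands in for optical loss per round trip.
random.seed(0)
N = 16                        # number of modes in the recirculating state
decay = 0.9                   # round-trip attenuation
W = [[random.uniform(-0.5, 0.5) for _ in range(N)] for _ in range(N)]
w_in = [random.uniform(-1, 1) for _ in range(N)]

def step(state, pulse):
    """One round trip: mix the stored state, then inject the new pulse."""
    mixed = [sum(W[i][j] * state[j] for j in range(N)) for i in range(N)]
    return [math.tanh(decay * mixed[i] + w_in[i] * pulse) for i in range(N)]

state = [0.0] * N
for pulse in [0.2, -0.7, 1.0, 0.1]:   # a short input pulse train
    state = step(state, pulse)

print(max(abs(s) for s in state) < 1.0)  # prints True: tanh keeps the state bounded
```

Because the state after each step depends on the whole pulse history, a simple linear readout trained on such states can perform the signal-classification and time-series tasks the article mentions.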

A computer of this nature could revolutionize high-speed telecoms, ultrafast imaging, and generative AI, especially if combined with the enhanced optical logic gates some researchers have created. Autonomous vehicles could also rely on this technology for split-second decision-making, increasing their reliability.

Looking ahead, researchers aim to integrate this technology into compact, scalable systems using advanced materials like thin-film lithium niobate. Of course, scaling a 100 GHz all-optical computer down to a consumer-friendly level is another challenge entirely, and researchers will have their work cut out for them.

But if they pull it off, we'll see blazing-fast computers unlike anything we've ever dreamed of.


New Computer Memory Tech Could Power The AI Of The Future

A research team led by the University of Cambridge has developed a novel computer memory design that promises to significantly improve performance while reducing the energy demands of internet and communications technologies.

According to the university, AI, algorithms, internet usage, and other data-driven technologies are estimated to require over 30% of global electricity consumption within the next decade.

"To a large extent, this explosion in energy demands is due to shortcomings of current computer memory technologies," said first author Dr Markus Hellenbrand, from Cambridge's Department of Materials Science and Metallurgy. "In conventional computing, there's memory on one side and processing on the other, and data is shuffled back and forth between the two, which takes both energy and time."

The researchers experimented with a new type of technology known as resistive switching memory. Unlike conventional memory devices that can encode data in two states (one or zero), this novel type of memory can enable a continuous range of states.

This is done by applying an electrical current to certain materials, causing their electrical resistance to increase or decrease. The resulting range of resistance values provides the distinct states in which data is stored.

"A typical USB stick based on continuous range would be able to hold between ten and 100 times more information, for example," explained Hellenbrand.
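Hellenbrand's estimate can be sanity-checked with basic information theory: a cell's capacity grows with the base-2 logarithm of the number of resistance levels it can reliably distinguish. The level counts below are illustrative assumptions, not figures from the study; the full 100x figure would presumably also draw on other density improvements.

```python
import math

# Bits stored per memory cell scale as log2 of the number of
# distinguishable resistance levels (level counts chosen for illustration).
def bits_per_cell(levels: int) -> float:
    return math.log2(levels)

print(bits_per_cell(2))     # conventional binary cell: 1.0 bit
print(bits_per_cell(1024))  # 1024 resolvable levels: 10.0 bits, a 10x gain per cell
```

The practical limit is how many levels can be written and read back reliably despite noise and drift, which is exactly what the structured barium bridges described below are meant to improve.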

The team developed a prototype device based on hafnium oxide, a material that had previously proven challenging for resistive switching memory applications because it is amorphous, with no ordered structure at the atomic level. Hellenbrand and his colleagues, however, found a solution: throwing barium into the mix.


When barium was added, it formed highly structured barium "bridges" within the thin films of hafnium oxide. At the point where these bridges meet the device contacts, an energy barrier is created that electrons must cross. This barrier can be raised or lowered, which changes the resistance of the hafnium oxide composite and in turn allows multiple states to exist in the material.

"What's really exciting about these materials is they can work like a synapse in the brain: they can store and process information in the same place, like our brains can," Hellenbrand said.

The researchers believe that this could lead to the development of computer memory devices with far greater density and performance but lower energy consumption, making the technology especially promising in the field of AI and machine learning.

A patent for the technology has been filed by Cambridge Enterprise, the university's commercialisation arm, and the scientists are now working with industry to run larger feasibility studies. They say that integrating hafnium oxide into existing manufacturing processes won't prove challenging, as the material is already used in semiconductor production.


MIT's New Computer Chip Is Interchangeable Like LEGOs - Popular Science


Chips are in everything: smartphones, supercomputers, remote-sensing robots. Now, MIT engineers have created an electronics chip design that allows sensors and processors to be easily swapped out or added on, like LEGO bricks. A reconfigurable, modular chip like this could be useful for upgrading smartphones, computers, or other devices without producing as much waste. It could also be useful for artificial intelligence applications. The paper describing the technology was published this week in the journal Nature Electronics.

Here's how the chip is configured. It has alternating layers for sensing and processing. Instead of copper wires, the chip's layers communicate internally through optical signals, specifically via light-emitting diodes (LEDs). These two features allow elements on individual layers to be easily interchanged with others.

"As we enter the era of the internet of things based on sensor networks, demand for multifunctioning edge-computing devices will expand dramatically," Jeehwan Kim, associate professor of mechanical engineering at MIT, said in a press release. "Our proposed hardware architecture will provide high versatility of edge computing in the future." (Edge computing refers to electronics that can process data independently without having to connect to a central server). 

To test how the chip performs on simple tasks, the team made a prototype with image sensors, LEDs, and a processor containing "artificial brain synapses": components made of silicon, silver, and copper that mimic how the brain transmits information (the team also calls these memristors). Instead of transmitting information in binary (as 0 or 1), a memristor's output current varies with the strength of the incoming current, so each device can take on a range of values. And because a device consistently produces the same output for the same input strength, calculations stay consistent. A connected circuit, or array, of these devices could directly process and classify signals on-chip.
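That graded, analog behavior is what lets a memristor array compute where data is stored: each device passes a current proportional to its stored conductance times the input, and the currents summed down a column yield a weighted sum by Kirchhoff's current law. The sketch below is a hypothetical numerical illustration of that idea, not MIT's implementation; all values are made up.

```python
# Analog in-memory multiply-accumulate, the operation a memristor
# crossbar performs physically: each device contributes a current of
# (stored conductance) x (input voltage), and currents along a column
# add up to one weighted sum. Values are illustrative only.
G = [  # stored conductances (the "weights"), one row per input line
    [0.2, 0.8],
    [0.5, 0.1],
    [0.9, 0.4],
]
v_in = [1.0, 0.0, 0.5]  # input voltages, one per row

def column_currents(G, v):
    """Sum each column's device currents (Kirchhoff's current law)."""
    cols = len(G[0])
    return [sum(G[r][c] * v[r] for r in range(len(G))) for c in range(cols)]

print(column_currents(G, v_in))  # two weighted sums, approximately [0.65, 1.0]
```

A classifier then only needs to compare the column sums: the whole matrix-vector product happens in the memory array itself, with no data shuffled to a separate processor.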

[Related: The trick to a more powerful computer chip? Going vertical.]

Researchers trained a version of the stacked chip to recognize the letters M, I, and T (for MIT). The chip had photodetectors for receiving the visual signal, which was passed down to other layers that encoded the image as a sequence of LED pixels and classified it based on the strength of the incoming light. The researchers used laser light to shine different letters onto the chip, and it was usually able to recognize which letter it was given, though it did better with clearer and brighter images. Adding a "denoising" processor later helped the chip interpret more of the blurry images.

The team imagines that this modular capability will allow them to add features like image recognition to smartphone cameras, or health monitoring sensors to electronic skins. 

"We can make a general chip platform, and each layer could be sold separately like a video game," Jeehwan Kim said. "We could make different types of neural networks, like for image or voice recognition, and let the customer choose what they want, and add to an existing chip like a LEGO."


Charlotte is the assistant technology editor at Popular Science. She's interested in understanding how our relationship with technology is changing, and how we live online.





