
Light-Speed AI: MIT's Photonic Processor Set to Revolutionize 6G Wireless Signal Processing

The explosion of connected devices and our daily dependence on smooth, uninterrupted wireless connections have put wireless bandwidth in the spotlight. Every day, everything from smart cities to remote work and cloud computing leans on these invisible networks. But there’s a catch: the wireless spectrum, that essential backbone, is limited. Managing it efficiently has never been more complicated—or more important.

AI Takes Center Stage

To keep up with the rush, engineers have turned to artificial intelligence. AI is already making waves by interpreting and classifying wireless signals on the fly, trimming latency and squeezing out more performance. But there’s a snag—most current AI models that process wireless signals are greedy when it comes to computing power and energy. That makes them difficult to use in real time, especially in small edge devices like your phone or an IoT sensor.

Recently, a team from MIT offered a promising new fix: a custom-built optical hardware accelerator for wireless signal processing. This isn’t your average processor. It uses light (photons!) to handle machine learning computations at speeds that leave digital chips in the dust. The result? Wireless signals get classified almost instantly.

Meet the Photonic AI Accelerator

What’s truly remarkable about this photonic chip is its leap in speed. It’s not just a little bit faster—it’s reportedly up to 100 times faster than current digital versions. And it’s sharp, too, correctly classifying about 95 percent of signals it sees. Plus, because it’s compact, energy-efficient, flexible, and scalable, it could slip into devices everywhere—from massive data centers to devices you carry in your pocket.

The potential uses are vast. In future 6G networks, for example, this chip could adjust data speeds and reliability in real time, selecting the ideal wireless settings on the fly. But that’s just the start: imagine health devices like smart pacemakers that respond to a patient’s changing needs, or autonomous vehicles that must interpret their environment and make near-instant decisions to keep us safe. Real-time learning at the edge could be a literal lifesaver.

How It All Works

Digging into its design, the MIT group built a new kind of optical neural network they call the Multiplicative Analog Frequency Transform Optical Neural Network, or MAFT-ONN. The techy name hides a simple idea: it handles wireless signals directly in the frequency domain, before turning them into digital data. This allows for crazy-fast, super-efficient computations. And unlike other optical approaches that need a separate chunk of hardware for every neural “unit,” MAFT-ONN can host up to 10,000 neurons in a single device, thanks to an approach called photoelectric multiplication. That means it gets more power—and more brains—with less bloat.
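The core trick here is that multiplying two signals and integrating the result, which is what a photodetector naturally does with optical fields, is equivalent to a neuron's weighted sum. Below is a purely conceptual sketch of that idea in NumPy, not a model of MIT's optical hardware: it shows that a dot product (the basic "neuron" operation) can be computed entirely in the frequency domain by multiplying spectra and summing, courtesy of Parseval's theorem. The function name and signal sizes are illustrative.

```python
import numpy as np

# Conceptual sketch (not the MIT design): a frequency-domain "neuron"
# multiplies the signal's spectrum by a weight spectrum and integrates,
# loosely analogous to photoelectric multiplication, where detection of
# mixed optical fields inherently multiplies and sums them.
rng = np.random.default_rng(0)

def frequency_domain_neuron(signal, weights):
    """Dot product computed in the frequency domain.

    By Parseval's theorem, the inner product of two real signals equals
    the inner product of their spectra divided by N, so multiplying
    spectra bin-by-bin and summing reproduces a time-domain dot product.
    """
    n = len(signal)
    S = np.fft.fft(signal)
    W = np.fft.fft(weights)
    # Multiply spectra and integrate (sum) -- the "detection" step.
    return np.sum(S * np.conj(W)).real / n

signal = rng.standard_normal(256)
weights = rng.standard_normal(256)

freq_result = frequency_domain_neuron(signal, weights)
time_result = float(np.dot(signal, weights))
print(np.isclose(freq_result, time_result))
```

The appeal of doing this optically is that the multiply-and-integrate happens at the speed of light in analog hardware, before any digitization, which is why one device can stand in for thousands of neurons.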

How well does it work? In early simulations, MAFT-ONN nailed wireless signal classification with about 85 percent accuracy to start and improved to over 99 percent with more measurements—all in the blink of an eye (a mere 120 nanoseconds per classification). As one researcher put it, “The longer you measure, the higher accuracy you’ll get. Because MAFT-ONN computes inferences in nanoseconds, you don’t lose much speed to gain more accuracy.”
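The article does not spell out how repeated measurements are combined, but one simple mechanism that produces exactly this behavior is a majority vote over independent classifications. The sketch below is a hedged illustration under that assumption: if each 120-nanosecond measurement is correct with probability 0.85 (the article's single-shot figure), voting across a handful of measurements pushes overall accuracy toward 99 percent while total latency stays well under a microsecond.

```python
from math import comb

# Assumption for illustration: independent measurements combined by
# majority vote. The 0.85 single-shot accuracy and 120 ns latency come
# from the article; the voting mechanism is ours.
def majority_vote_accuracy(p_single, n_measurements):
    """Probability that a strict majority of n independent measurements
    is correct, when each is correct with probability p_single."""
    threshold = n_measurements // 2 + 1
    return sum(
        comb(n_measurements, k)
        * p_single**k
        * (1 - p_single) ** (n_measurements - k)
        for k in range(threshold, n_measurements + 1)
    )

for n in (1, 3, 5, 9):
    acc = majority_vote_accuracy(0.85, n)
    print(f"{n} measurement(s), {120 * n} ns total: accuracy {acc:.3f}")
```

Because each inference takes only nanoseconds, trading a few extra measurements for accuracy costs almost nothing in wall-clock time, which is the researcher's point in the quote above.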

Where does it go from here? The MIT team wants to expand the chip’s capabilities, tackling even more sophisticated AI models and bigger challenges. It’s been a huge collaborative effort, supported by organizations like the U.S. Army Research Lab, MIT Lincoln Laboratory, and others.

Curious for more? You can dive into the original story at MIT News.
