
An international team of scientists developed augmented reality glasses that receive images beamed from a projector, a design that addresses some of the existing limitations of such glasses, such as their weight and bulk. The team’s research is being presented at the IEEE VR conference in Saint-Malo, France, in March 2025.

Augmented reality (AR) technology, which overlays virtual objects on an image of the real world viewed through a device’s viewfinder or display, has gained traction in recent years with popular gaming apps like Pokémon Go, and real-world applications in areas including education, manufacturing, retail and health care. But the adoption of wearable AR devices has lagged due to the heft of their batteries and electronic components.

AR glasses, in particular, have the potential to transform a user’s physical environment by integrating virtual elements. Despite many advances in hardware technology over the years, AR glasses remain heavy and awkward and still lack adequate computational power, battery life and brightness for optimal user experience.

Meta has unveiled the next iteration of its sensor-packed research eyewear, the Aria Gen 2. This latest model follows the initial version introduced in 2020. The original glasses came equipped with a variety of sensors but lacked a display, and were not designed as either a prototype or a consumer product. Instead, they were exclusively meant for research to explore the types of data that future augmented reality (AR) glasses would need to gather from their surroundings to provide valuable functionality.

In their Project Aria initiative, Meta explored collecting egocentric data—information from the viewpoint of the user—to help train artificial intelligence systems. These systems could eventually comprehend the user’s environment and offer contextually appropriate support in daily activities. Notably, like its predecessor, the newly announced Aria Gen 2 does not feature a display.

Meta has highlighted several advancements in Aria Gen 2 compared to the first generation.

A research team led by Professor Takayuki Hoshino of Nagoya University’s Graduate School of Engineering in Japan has demonstrated the world’s smallest shooting game by manipulating nanoparticles in real time, resulting in a game that is played with particles approximately 1 billionth of a meter in size.

This research is a significant step toward developing a computer interface system that seamlessly integrates virtual objects with real nanomaterials. They published their study in the Japanese Journal of Applied Physics.

The game demonstrates what the researchers call “nano-mixed reality (MR),” which integrates digital technology with the physical nanoworld in real time using high-speed electron beams. These beams generate dynamic patterns of electric fields on a display surface, allowing researchers to control the force field acting on the nanoparticles in real time to move and manipulate them.
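As rough intuition for this kind of field-driven control (this is not the authors' code, and the units, charge, and field pattern are made up), a charged particle in an electric field feels a force F = qE, so redrawing the field pattern redirects the particle:

```python
# Illustrative sketch: a charged nanoparticle nudged by a dynamically
# updated electric field, standing in for the electron-beam-drawn
# "force field" described in the article. All values are hypothetical.

def step(position, charge, field_at, dt, mobility=1.0):
    """Advance a particle one time step under the electric force F = q*E.
    In the overdamped regime typical at the nanoscale, velocity is
    proportional to force: v = mobility * F.
    """
    ex, ey = field_at(position)
    x, y = position
    return (x + mobility * charge * ex * dt,
            y + mobility * charge * ey * dt)

def field_toward(target):
    """A toy field pattern that pulls the particle toward a target point,
    as the beam pattern might when 'dragging' a particle in the game."""
    def field_at(pos):
        return (target[0] - pos[0], target[1] - pos[1])
    return field_at

pos = (0.0, 0.0)
for _ in range(100):
    pos = step(pos, charge=1.0, field_at=field_toward((5.0, 3.0)), dt=0.05)
# After many steps the particle has converged near the target (5.0, 3.0).
```

Moving the target each frame, as the game loop would, yields real-time steering of the particle.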

February 2025 is set to mesmerize stargazers and tech enthusiasts alike, as the cosmos aligns with cutting-edge advancements in astronomical observation. The month features Comet CK-25, observed with AI-driven telescopic networks for real-time imaging and analysis. A spectacular planetary alignment of Mercury, Venus, and Mars will be enhanced by augmented reality devices for interactive viewing, and a partial lunar eclipse on February 27th–28th will offer an immersive experience via the Virtual Lunar Observation Platform (VLOP). This month isn’t just about celestial spectacles; it’s about witnessing how new technology is redefining our view of space, bridging Earth and the cosmos.

Antennas receive and transmit electromagnetic waves, delivering information to our radios, televisions, cellphones and more. Researchers in the McKelvey School of Engineering at Washington University in St. Louis imagine a future where antennas reshape even more applications.

Their new metasurfaces, ultra-thin materials made of tiny nanoantennas that can both amplify and control light in very precise ways, could replace conventional refractive surfaces from eyeglasses to smartphone lenses and improve dynamic applications such as augmented reality and LiDAR (light detection and ranging).
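As a quick aside on the “ranging” part of LiDAR: distance comes from the round-trip time of a light pulse. A minimal sketch with illustrative numbers:

```python
# Time-of-flight ranging, the principle behind the "ranging" in LiDAR.
# The measured round-trip time is halved because the pulse travels out
# to the target and back. Numbers below are illustrative.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds):
    """Distance to a target from the round-trip time of a light pulse."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 200 ns corresponds to a target roughly 30 m away.
d = tof_distance(200e-9)
```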

While metasurfaces can manipulate light very precisely and efficiently, enabling powerful optical devices, they often suffer from a major limitation: they are highly sensitive to the polarization of light, meaning they can only interact with light that is oriented and traveling in a certain direction. While this is useful in polarized sunglasses that block glare and in other communications and imaging technologies, requiring a specific polarization dramatically reduces the flexibility and applicability of metasurfaces.
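The cost of polarization sensitivity can be quantified with Malus's law, a generic optics result (not specific to these metasurfaces): a polarization-dependent element transmits only a cos²θ fraction of light polarized at angle θ to its axis.

```python
import math

# Malus's law: transmitted intensity fraction through a polarization-
# dependent element, as a function of the angle between the light's
# polarization and the element's axis.

def transmitted_fraction(angle_deg):
    return math.cos(math.radians(angle_deg)) ** 2

aligned = transmitted_fraction(0)    # fully transmitted
diagonal = transmitted_fraction(45)  # half the intensity is lost
crossed = transmitted_fraction(90)   # essentially blocked, as in
                                     # polarized sunglasses cutting glare
```

This is why a polarization-locked optic wastes half of unpolarized incident light on average, and why a polarization-insensitive metasurface is more broadly applicable.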

Augmented reality (AR) has become a hot topic in the entertainment, fashion, and makeup industries. Though a few different technologies exist in these fields, dynamic facial projection mapping (DFPM) is among the most sophisticated and visually stunning ones. Briefly put, DFPM consists of projecting dynamic visuals onto a person’s face in real-time, using advanced facial tracking to ensure projections adapt seamlessly to movements and expressions.

While imagination should ideally be the only limit on what’s possible with DFPM in AR, the approach is held back by technical challenges. Projecting visuals onto a moving face requires that the DFPM system detect the user’s facial features, such as the eyes, nose, and mouth, in less than a millisecond.

Even slight delays in processing or minuscule misalignments between the camera’s and projector’s image coordinates can result in projection errors—or “misalignment artifacts”—that viewers can notice, ruining the immersion.
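To make the alignment problem concrete, here is a hypothetical sketch: landmarks detected in camera coordinates must be mapped into projector coordinates (a toy scale-and-offset mapping stands in for a real calibrated camera–projector homography), and any processing latency lets the face move before the projection lands:

```python
# Hypothetical sketch of the two DFPM error sources described above:
# (1) mapping between camera and projector image coordinates, and
# (2) misalignment caused by pipeline latency. All numbers are made up.

def camera_to_projector(pt, scale=1.5, offset=(100.0, 50.0)):
    """Map a camera-space landmark (pixels) into projector space.
    A real system would use a calibrated homography; this toy version
    uses a uniform scale and offset."""
    x, y = pt
    return (scale * x + offset[0], scale * y + offset[1])

def latency_error_px(face_speed_px_per_s, latency_s):
    """On-image projection error: how far the face moves during one
    latency interval before the projector output catches up."""
    return face_speed_px_per_s * latency_s

# A face moving at 500 px/s through a 10 ms pipeline drifts 5 px,
# a visible misalignment artifact; at 0.5 ms the drift is 0.25 px.
slow = latency_error_px(500.0, 0.010)
fast = latency_error_px(500.0, 0.0005)
proj_pt = camera_to_projector((320.0, 240.0))
```

The sub-millisecond detection budget quoted above follows directly: keeping the drift below a fraction of a pixel for a fast-moving face forces end-to-end latency into that range.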

Meta Platforms is assembling a specialized team within its Reality Labs division, led by Marc Whitten, to develop the AI, sensors, and software that could power the next wave of humanoid robots.

“We believe expanding our portfolio to invest in this field will only accrue value to Meta AI and our mixed and augmented reality programs,” Bosworth said.

How is Meta planning to advance its robotics work?

The team reports to CTO Andrew Bosworth; Bloomberg News first reported the hiring. Meta has also appointed John Koryl as vice president of retail. Koryl, the former CEO of second-hand e-commerce platform The RealReal, will focus on boosting direct sales of Meta’s Quest mixed reality headsets and AI wearables, including Ray-Ban Meta smart glasses, developed in partnership with EssilorLuxottica.

Meta’s initial play is to become the backbone of the industry. The company has already started talks with robotics firms like Unitree Robotics and Figure AI. With plans to hire 100 engineers this year and billions committed to AI and AR/VR, Meta is placing a major bet on humanoid robots as the next leap in smart home technology.


In today’s AI news, OpenAI will ship GPT-5 in a matter of months and streamline its AI models into more unified products, said CEO Sam Altman in an update. Specifically, Altman says the company plans to launch GPT-4.5 as its last non-chain-of-thought model and integrate its latest o3 reasoning model into GPT-5.

In other advancements, Harvey, a San Francisco AI startup focused on the legal industry, has raised $300 million in a funding round led by Sequoia that values the startup at $3 billion — double the amount investors valued it at in July. The Series D funding round builds on the momentum and reflects investors’ enthusiasm for AI tools …

Meanwhile, Meta is in talks to acquire South Korean AI chip startup FuriosaAI, according to people familiar with the matter, a deal that could boost the social media giant’s custom chip efforts amid a shortage of Nvidia chips and a growing demand for alternatives. The deal could be completed as early as this month.

Then, AI took another step into Hollywood today with the launch of a new filmmaking tool from showbiz startup Flawless. The product — named DeepEditor — promises cinematic wizardry for the digital age. For movie makers, the tool offers photorealistic edits without a costly return to set.

In videos, join IBM’s Boris Sobolev as he explains how model customization can enhance reliability and decision-making of agentic systems. Discover practical tips for data collection, tool use, and pushing the boundaries of what your AI can achieve. Supercharge your AI agents for peak performance!

Researchers at the University of Liège (Belgium) have uncovered a previously unknown mechanism that regulates the immune response against parasites. During a parasitic infection, specific immune cells, known as virtual memory T cells, become activated and express a surface molecule called CD22, which prevents an excessive immune reaction. This discovery could help in better-controlling inflammation and improving immune responses to infections.

The findings are published in the journal Science Immunology.

Nearly a quarter of the world’s population is infected by helminths, parasitic worms that establish themselves in the intestine for extended periods. In response to these invaders, the immune system deploys complex defense strategies. In their recent study, the researchers revealed a previously unsuspected mechanism that regulates the activation of certain immune cells: CD8+ virtual memory T cells (TVM).