-
Sony tests AI-powered PlayStation characters
https://www.independent.co.uk/tech/ai-playstation-characters-sony-ps5-chatgpt-b2712813.html
A demo video, first reported by The Verge, showed an AI version of the character Aloy from the PlayStation game Horizon Forbidden West conversing through voice prompts during gameplay on the PS5 console.
The character’s facial expressions are also powered by Sony’s advanced AI software Mockingbird, while the speech artificially replicates the voice of the actor Ashly Burch.
-
BEAR – BE-A-Rigger – Maya Rigging Tool
https://github.com/Grackable/bear_core
BEAR claims to be the most intuitive and easy-to-use rigging tool available, offering production-proven features that streamline the rigging workflow for maximum efficiency and consistency.
-
Jellyfish Pictures suspends operations
https://www.broadcastnow.co.uk/post-and-vfx/jellyfish-pictures-suspends-operations/5202847.article
According to a report from Indian news outlet Animation Xpress, Jellyfish is facing financial struggles and has temporarily suspended its global operations.
-
Judge allows authors' AI copyright lawsuit against Meta to move forward
The lawsuit has already provided a few glimpses into how Meta approaches copyright: court filings from the plaintiffs claim that Mark Zuckerberg gave the Llama team permission to train its models on copyrighted works, and that other Meta team members discussed using legally questionable content for AI training.
-
Lumotive Light Control Metasurface – This Tiny Chip Replaces Bulky Optics & Mechanical Mirrors
Programmable Optics for LiDAR and 3D Sensing: How Lumotive’s LCM is Changing the Game
For decades, LiDAR and 3D sensing systems have relied on mechanical mirrors and bulky optics to direct light and measure distance. But at CES 2025, Lumotive unveiled a breakthrough—a semiconductor-based programmable optic that removes the need for moving parts altogether.
The Problem with Traditional LiDAR and Optical Systems
LiDAR and 3D sensing systems work by sending out light and measuring when it returns, creating a precise depth map of the environment. However, traditional systems have relied on physically moving mirrors and lenses, which introduce several limitations:
- Size and weight – Bulky components make integration difficult.
- Complexity – Mechanical parts are prone to failure and expensive to produce.
- Speed limitations – Physical movement slows down scanning and responsiveness.
To bring high-resolution depth sensing to wearables, smart devices, and autonomous systems, a new approach is needed.
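The round-trip principle described above can be sketched numerically. This is a minimal illustration (not tied to any vendor's hardware), assuming the standard time-of-flight model in which distance is half the round-trip path of a light pulse:

```python
# Minimal time-of-flight sketch: a LiDAR emits a pulse and times its return.
# Distance = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to a target given the round-trip time of a light pulse."""
    return C * round_trip_s / 2.0

# A pulse returning after ~66.7 ns corresponds to a target roughly 10 m away.
print(round(tof_distance(66.7e-9), 2))
```

Repeating this measurement across thousands of steering directions per frame is what produces the depth map; the steering mechanism is exactly what Lumotive replaces.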
Enter the Light Control Metasurface (LCM)
Lumotive’s Light Control Metasurface (LCM) replaces mechanical mirrors with a semiconductor-based optical chip. This allows LiDAR and 3D sensing systems to steer light electronically, just like a processor manages data. The advantages are game-changing:
- No moving parts – Increased durability and reliability
- Ultra-compact form factor – Fits into small devices and wearables
- Real-time reconfigurability – Optics can adapt instantly to changing environments
- Energy-efficient scanning – Focuses on relevant areas, saving power
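The last point, scanning only relevant areas, can be sketched in code. The classes and function names below are hypothetical illustrations of software-defined scanning, not Lumotive's actual API: an electronically steered sensor can dwell only on regions of interest (ROIs) instead of sweeping the full field of view, trading coverage for frame rate and power.

```python
# Hypothetical sketch of software-defined scanning with regions of interest.
from dataclasses import dataclass

@dataclass
class ROI:
    az_deg: tuple[float, float]  # azimuth range to scan, degrees
    el_deg: tuple[float, float]  # elevation range to scan, degrees
    step_deg: float              # angular resolution within the ROI

def scan_points(rois: list[ROI]) -> list[tuple[float, float]]:
    """Enumerate the (azimuth, elevation) steering angles for one frame."""
    points = []
    for r in rois:
        az = r.az_deg[0]
        while az <= r.az_deg[1]:
            el = r.el_deg[0]
            while el <= r.el_deg[1]:
                points.append((az, el))
                el += r.step_deg
            az += r.step_deg
    return points

# Full 120x30 degree sweep at 1-degree steps vs. a single 10x10 degree ROI:
full = scan_points([ROI((-60, 60), (-15, 15), 1.0)])
roi = scan_points([ROI((0, 10), (0, 10), 1.0)])
print(len(full), len(roi))  # the ROI frame needs far fewer steering events
```

With a mechanical mirror the full sweep is unavoidable each frame; with electronic steering, the ROI list can be rewritten between frames as targets move.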
How Does It Work?
LCM technology works by controlling how light is directed using programmable metasurfaces. Unlike traditional optics that require physical movement, Lumotive’s approach enables light to be redirected with software-controlled precision.
This means:
- No mechanical delays – Everything happens at electronic speeds.
- AI-enhanced tracking – The sensor can focus only on relevant objects.
- Scalability – The same technology can be adapted for industrial, automotive, AR/VR, and smart city applications.
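To make "software-controlled precision" concrete: programmable metasurfaces in general steer light by writing a phase ramp across their cells, and the first-order steering angle then follows the standard grating equation sin(θ) = λ/Λ, where Λ is the period over which the programmed phase wraps through 2π. The numbers below are illustrative assumptions, not Lumotive specifications:

```python
# Illustrative grating-equation sketch for phase-ramp beam steering.
import math

def steering_angle_deg(wavelength_m: float, ramp_period_m: float) -> float:
    """First-order diffraction angle for a programmed linear phase ramp."""
    return math.degrees(math.asin(wavelength_m / ramp_period_m))

# Example: a 905 nm LiDAR wavelength with a 5 um phase-ramp period
# steers the beam roughly 10 degrees off axis.
print(round(steering_angle_deg(905e-9, 5e-6), 1))
```

Changing the steering direction is then just rewriting the phase pattern electronically, which is why there are no mechanical delays.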
Live Demo: Real-Time 3D Sensing
At CES 2025, Lumotive showcased how its LCM-enabled sensor can scan a room in real time, creating an instant 3D point cloud. Unlike traditional LiDAR, which has a fixed scan pattern, this system can dynamically adjust to track people, objects, and even gestures on the fly.
This is a huge leap forward for AI-powered perception systems, allowing cameras and sensors to interpret their environment more intelligently than ever before.
Who Needs This Technology?
Lumotive’s programmable optics have the potential to disrupt multiple industries, including:
- Automotive – Advanced LiDAR for autonomous vehicles
- Industrial automation – Precision 3D scanning for robotics and smart factories
- Smart cities – Real-time monitoring of public spaces
- AR/VR/XR – Depth-aware tracking for immersive experiences
The Future of 3D Sensing Starts Here
Lumotive’s Light Control Metasurface represents a fundamental shift in how we think about optics and 3D sensing. By bringing programmability to light steering, it opens up new possibilities for faster, smarter, and more efficient depth-sensing technologies.
With traditional LiDAR now facing a serious challenge, the question is: Who will be the first to integrate programmable optics into their designs?