https://www.awn.com/news/dreamworks-animation-release-moonray-open-source
MoonRay is DreamWorks’ open-source, award-winning, state-of-the-art production MCRT renderer, which has been used on feature films such as How to Train Your Dragon: The Hidden World, Trolls World Tour, The Bad Guys, and the upcoming Puss In Boots: The Last Wish, as well as future titles. MoonRay was developed at DreamWorks and is in continuous active development. It includes an extensive library of production-tested, physically based materials, a USD Hydra render delegate, and multi-machine and cloud rendering via the Arras distributed computation framework.
Note: it does not support OSL, and its USD handling is limited. Cycles may still be a fair alternative.
EDIT
MoonRay review: DreamWorks Animation’s superb rendering software is free for all
A high-performance Monte Carlo ray tracer that’s capable of both DreamWorks’ trademark stylised look and photorealism.
It has all the features required for a production compositing pipeline, including Arbitrary Output Variables (AOVs), which allow data from a shader or the renderer to be output during rendering to aid compositing. Deep output and Cryptomatte are also supported.
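In practice, those AOVs (and Cryptomatte ranks) usually land in the rendered EXR as extra channels or image parts that the compositor then picks up. A minimal, illustrative sketch using OpenImageIO's Python bindings to inspect what a render wrote; the filename and the layer names mentioned in the comments are assumptions, not MoonRay-specific:

```python
import OpenImageIO as oiio  # pip install openimageio, or use a VFX-platform build

# Open a multi-layer EXR written by the renderer ("render.exr" is a placeholder).
inp = oiio.ImageInput.open("render.exr")
if inp is None:
    raise RuntimeError(oiio.geterror())

# Walk every image part and list its channels. AOVs typically show up as
# named channel groups such as 'diffuse.R', 'specular.G', 'N.Z', or the
# Cryptomatte rank channels, depending on how the render was configured.
subimage = 0
while inp.seek_subimage(subimage, 0):
    spec = inp.spec()
    print(f"part {subimage}: {spec.width}x{spec.height}")
    for name in spec.channelnames:
        print("  channel:", name)
    subimage += 1

inp.close()
```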
With support for OptiX 7.6 and GPU render denoising via Open Image Denoise 2, MoonRay delivers impressive results, especially when working interactively.
MoonRay’s default mode is now a hybrid CPU-and-GPU renderer called XPU, which in many ways combines the best of both rendering workflows.
Support for VFX Reference Platform 2023 is probably the biggest addition, as it enables the use of MoonRay directly in Nuke 15.
MoonRay has already achieved great success with an array of feature films. Now that the renderer is open source, the CG world can expect to see a whole new swathe of MoonRay-powered animations.
With an intuitive, user-friendly interface and a powerful AI engine, Flair AI can generate high-quality product photoshoots in seconds.
DocRes is a new model that simplifies document image restoration by handling five tasks (dewarping, deshadowing, appearance enhancement, deblurring, and binarization) within a single system.
https://github.com/zzzhang-jx/docres
https://www.theverge.com/2024/5/7/24151109/apple-final-cut-camera-app-support-multicam-ipad
Apple has released Final Cut Camera for iPhone and iPad, allowing filmmakers to shoot video and stream it live back to an iPad for a multicam shoot. The updated Final Cut Pro 2 app lets users control each connected device running Final Cut Camera from a multiscreen view. Users can switch between production and editing at any time to live-cut their projects in the new version.
It’s becoming clear that deterministic physics cannot easily explain all aspects of nature, at both the astronomical and the biological level.
Is this a limitation of modern mathematics and tools, or an actual barrier?
https://www.instagram.com/gerdegotit/reel/C6s-2r2RgSu/
After spending a lot of time recently with SDXL, I’ve made my way back to SD 1.5.
While the 1.5 models overall have less fidelity, there is just no comparing to the motion models currently available for AnimateDiff with 1.5 models.
To date, this is one of my favorite pieces: not because I think it’s the best it can be, but because the workflow adjustments unlocked some very important ideas I can’t wait to try out.
Performance by @silkenkelly and @itxtheballerina on IG
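For anyone wanting to try the SD 1.5 + AnimateDiff pairing the post refers to outside a node-based tool, here is a minimal sketch using Hugging Face diffusers. The model IDs and prompt are illustrative defaults from the public hubs, not the artist's actual workflow:

```python
import torch
from diffusers import AnimateDiffPipeline, MotionAdapter, DDIMScheduler
from diffusers.utils import export_to_gif

# AnimateDiff motion module trained against SD 1.5 (public checkpoint; swap in
# any 1.5-based checkpoint you prefer).
adapter = MotionAdapter.from_pretrained(
    "guoyww/animatediff-motion-adapter-v1-5-2", torch_dtype=torch.float16
)
pipe = AnimateDiffPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    motion_adapter=adapter,
    torch_dtype=torch.float16,
)
pipe.scheduler = DDIMScheduler.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    subfolder="scheduler",
    clip_sample=False,
    timestep_spacing="linspace",
    beta_schedule="linear",
)
pipe.enable_vae_slicing()
pipe.to("cuda")

# Generate a short 16-frame clip and save it as a GIF.
frames = pipe(
    prompt="a dancer in a dark studio, film grain, dramatic lighting",
    num_frames=16,
    guidance_scale=7.5,
    num_inference_steps=25,
).frames[0]
export_to_gif(frames, "animation.gif")
```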