
Anders Langlands – Render Color Spaces

https://www.colour-science.org/anders-langlands/

 

This page compares images rendered in Arnold using spectral rendering and different sets of colourspace primaries: Rec.709, Rec.2020, ACES and DCI-P3. The SPD data for the GretagMacbeth ColorChecker are the measurements of Noboru Ohta, taken from Mansencal, Mauderer and Parsons (2014), colour-science.org.
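
As a quick, hedged illustration of how that SPD data can be used, here is a sketch with the colour-science Python package the page credits; the dataset key and function names below match recent versions of the library, but may differ in older ones:

```python
# pip install colour-science
import colour

# Noboru Ohta's spectral measurements of the 24 ColorChecker patches,
# bundled with colour-science (the dataset the comparison above uses).
sds = colour.SDS_COLOURCHECKERS["ColorChecker N Ohta"]

for name, sd in sds.items():
    # Integrate each SPD against the CIE 1931 2-degree observer, then
    # convert to display-referred sRGB for a quick visual check.
    XYZ = colour.sd_to_XYZ(sd) / 100.0
    rgb = colour.XYZ_to_sRGB(XYZ)
    print(f"{name:24s} {rgb.round(3)}")
```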

 

Björn Ottosson – How software gets color wrong

https://bottosson.github.io/posts/colorwrong/

 

Most software around us today is decent at accurately displaying colors. Processing colors, unfortunately, is another story, and it is often done badly.

 

To understand what the problem is, let’s start with an example of three ways of blending green and magenta:

  • Perceptual blend – A smooth transition using a model designed to mimic human perception of color. The blending is done so that the perceived brightness and color vary smoothly and evenly.
  • Linear blend – A model for blending color based on how light behaves physically. This type of blending can occur in many ways naturally, for example when colors are blended together by focus blur in a camera or when viewing a pattern of two colors at a distance.
  • sRGB blend – This is how colors would normally be blended in computer software, using sRGB to represent the colors. 
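
To make the difference concrete, here is a minimal Python sketch of the last two blends, assuming the standard sRGB transfer functions; the perceptual blend additionally needs a perceptual color space such as Ottosson's Oklab, sketched further below.

```python
import numpy as np

def srgb_to_linear(s):
    """Decode sRGB-encoded values in [0, 1] to linear light."""
    s = np.asarray(s, dtype=float)
    return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(l):
    """Encode linear-light values in [0, 1] back to sRGB."""
    l = np.asarray(l, dtype=float)
    return np.where(l <= 0.0031308, l * 12.92, 1.055 * l ** (1 / 2.4) - 0.055)

green   = np.array([0.0, 1.0, 0.0])   # sRGB-encoded
magenta = np.array([1.0, 0.0, 1.0])

# sRGB blend: lerp the encoded values directly (what most software does).
srgb_blend = 0.5 * green + 0.5 * magenta

# Linear blend: decode, lerp in linear light, re-encode for display.
linear_blend = linear_to_srgb(0.5 * srgb_to_linear(green) + 0.5 * srgb_to_linear(magenta))

print("sRGB blend:  ", srgb_blend)    # [0.5 0.5 0.5] - noticeably too dark
print("linear blend:", linear_blend)  # ~[0.735 0.735 0.735]
```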

 

Let’s look at some more examples of blending colors, to see how these problems surface in practice. The examples use strong colors, since the differences are more pronounced there, and the same three ways of blending colors as the first example.

 

Instead of making it as easy as possible to work with color, most software makes it unnecessarily hard by doing image processing with representations not designed for it. Approximating the physical behavior of light with linear RGB is an easy first step, but more work is needed to create image representations tailored for image processing and human perception.
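
And for the perceptual leg of the earlier comparison, a hedged sketch of blending in Oklab, Ottosson's own perceptual color space. The matrices below are the ones published in his Oklab post; the inverse transforms are computed numerically rather than hard-coded:

```python
import numpy as np

# Matrices from Ottosson's published Oklab conversion:
# linear sRGB -> LMS, then nonlinear (cube-rooted) LMS -> Lab.
M1 = np.array([[0.4122214708, 0.5363325363, 0.0514459929],
               [0.2119034982, 0.6806995451, 0.1073969566],
               [0.0883024619, 0.2817188376, 0.6299787005]])
M2 = np.array([[0.2104542553,  0.7936177850, -0.0040720468],
               [1.9779984951, -2.4285922050,  0.4505937099],
               [0.0259040371,  0.7827717662, -0.8086757660]])

def srgb_to_linear(s):
    return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(l):
    return np.where(l <= 0.0031308, l * 12.92, 1.055 * l ** (1 / 2.4) - 0.055)

def linear_srgb_to_oklab(c):
    return M2 @ np.cbrt(M1 @ c)

def oklab_to_linear_srgb(lab):
    # Invert numerically to avoid hard-coding more constants.
    return np.linalg.inv(M1) @ (np.linalg.inv(M2) @ lab) ** 3

green   = srgb_to_linear(np.array([0.0, 1.0, 0.0]))
magenta = srgb_to_linear(np.array([1.0, 0.0, 1.0]))

# Perceptual midpoint: average the Oklab coordinates, then return to sRGB.
mid = 0.5 * (linear_srgb_to_oklab(green) + linear_srgb_to_oklab(magenta))
print(linear_to_srgb(oklab_to_linear_srgb(mid)))
```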

 

Also see:

https://www.pixelsham.com/2022/04/05/bjorn-ottosson-okhsv-and-okhsl-two-new-color-spaces-for-color-picking/

EVER (Exact Volumetric Ellipsoid Rendering) – Gaussian splatting alternative

https://radiancefields.com/how-ever-(exact-volumetric-ellipsoid-rendering)-does-this-work

 

https://half-potato.gitlab.io/posts/ever/

 

Unlike previous methods such as Gaussian Splatting, EVER uses ellipsoids instead of Gaussians and ray tracing instead of rasterization. This shift eliminates artifacts such as popping and blending inconsistencies, producing sharper and more accurate renderings.
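
A toy sketch of the core idea, not the authors' implementation (all names below are our own): with constant-density ellipsoids, each ray yields a sorted list of entry/exit events, and between consecutive events the volume-rendering integral has a closed form, so the accumulated color is exact rather than approximated by depth-sorted alpha blending.

```python
import numpy as np

def ray_ellipsoid_span(o, d, center, radii):
    """Entry/exit distances of ray o + t*d through an axis-aligned ellipsoid."""
    oo, dd = (o - center) / radii, d / radii   # map the ellipsoid to the unit sphere
    a, b, c = dd @ dd, 2.0 * (oo @ dd), oo @ oo - 1.0
    disc = b * b - 4.0 * a * c
    if disc <= 0.0:
        return None
    s = np.sqrt(disc)
    t0, t1 = (-b - s) / (2.0 * a), (-b + s) / (2.0 * a)
    return (max(t0, 0.0), t1) if t1 > 0.0 else None

def render_ray(o, d, prims):
    """Exact volume rendering for constant-density primitives: build the sorted
    entry/exit event list, then integrate emission analytically per segment."""
    events = []  # (t, change in density, change in density-weighted color)
    for p in prims:
        span = ray_ellipsoid_span(o, d, p["center"], p["radii"])
        if span:
            t0, t1 = span
            events.append((t0, +p["sigma"], +p["sigma"] * p["color"]))
            events.append((t1, -p["sigma"], -p["sigma"] * p["color"]))
    events.sort(key=lambda e: e[0])

    color, T = np.zeros(3), 1.0          # accumulated color, transmittance
    sigma, weighted = 0.0, np.zeros(3)   # running density and sigma * color
    t_prev = events[0][0] if events else 0.0
    for t, dsigma, dweighted in events:
        if sigma > 1e-12 and t > t_prev:
            alpha = 1.0 - np.exp(-sigma * (t - t_prev))  # exact segment opacity
            color += T * alpha * (weighted / sigma)      # density-weighted emission
            T *= 1.0 - alpha
        sigma += dsigma
        weighted = weighted + dweighted
        t_prev = t
    return color, T

# Two overlapping ellipsoids along +z: a red one and a blue one.
prims = [
    {"center": np.array([0.0, 0.0, 3.0]), "radii": np.array([1.0, 1.0, 1.0]),
     "sigma": 2.0, "color": np.array([1.0, 0.0, 0.0])},
    {"center": np.array([0.0, 0.0, 4.0]), "radii": np.array([1.5, 1.0, 1.0]),
     "sigma": 1.0, "color": np.array([0.0, 0.0, 1.0])},
]
print(render_ray(np.zeros(3), np.array([0.0, 0.0, 1.0]), prims))
```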

 

 

Microsoft is discontinuing its HoloLens headsets

https://www.theverge.com/2024/10/1/24259369/microsoft-hololens-2-discontinuation-support

 

Software support for the original HoloLens headset will end on December 10th.

Microsoft’s struggles with HoloLens have been apparent over the past two years.

15 Years of Art Experience in One DRAGON

 

Bonus clip in the post: Character Design Concept Art Process – Professional Workflow


Meta Horizon Hyperscape

Meta Hyperscape in a nutshell
Hyperscape technology allows us to scan spaces with just a phone and create photorealistic, high-fidelity replicas of the physical world. You can experience these digital replicas on the Quest 3 or on the just-announced Quest 3S.

 

https://www.youtube.com/clip/UgkxGlXM3v93kLg1D9qjJIKmvIYW-vHvdbd0

 

High Fidelity Enables a New Sense of Presence
This level of photorealism will enable a new way to be together, where spaces look, sound, and feel like you are physically there.

 

 

Simple Capture process with your mobile phone
Capture is not yet available, but in the future it will offer a new way to create worlds in Horizon and will be the easiest way to bring physical spaces into the digital world. Creators will be able to capture physical environments on their mobile device and invite friends, fans, or customers to visit and engage with the digital replicas.

 

Cloud-based Processing and Rendering
Using Gaussian splatting, a 3D modeling technique that renders fine details with high accuracy and efficiency, we process the captured input data in the cloud and deliver the resulting model through cloud rendering and streaming on Quest 3 and the just-announced Quest 3S.

 

Try it out yourself
If you are in the US and have a Meta Quest 3 or 3S, you can try it out here:

https://www.meta.com/experiences/meta-horizon-hyperscape-demo/7972066712871980/
