Managers’ Guide to Effective Annual Feedback
https://peterszasz.com/engineering-managers-guide-to-effective-annual-feedback
The main goals of a regular, written feedback cycle are:
- Recognition, support for self-reflection and personal growth
- Alignment with team and company needs
- Documentation
In practice, this means:
- Recognize Achievements: Use the feedback process to boost morale and support self-reflection.
- Align Goals: Ensure individual contributions match company objectives.
- Document Progress: Keep a clear record of performance for future decisions.
- Prepare Feedback: Gather 360-degree feedback, focus on examples, and anticipate reactions.
- Strength-Based Approach: Focus on enhancing strengths over fixing weaknesses.
- Deliver Feedback Live: Engage in discussion before providing written feedback.
- Follow-Up: Use feedback to guide future goals and performance improvement.
-
GIL To Become Optional in Python 3.13
The GIL (Global Interpreter Lock) can be disabled in Python 3.13. This free-threaded mode is currently experimental.
What is the GIL? It is a mechanism used by the CPython interpreter to ensure that only one thread executes Python bytecode at a time.
https://medium.com/@r_bilan/python-3-13-without-the-gil-a-game-changer-for-concurrency-5e035500f0da
Advantages of the GIL
- Simplicity of Implementation: The GIL simplifies memory management in CPython by preventing concurrent access to Python objects, which can help avoid race conditions and other threading issues.
- Ease of Use for Single-Threaded Programs: For applications that are single-threaded, the GIL eliminates the overhead associated with managing thread safety, allowing for straightforward and efficient code execution.
- Compatibility with C Extensions: The GIL allows C extensions to operate without needing to implement complex threading models, which simplifies the development of Python extensions that interface with C libraries.
- Performance for I/O-Bound Tasks: In I/O-bound applications, the GIL does not significantly hinder performance since threads can be switched out during I/O operations, allowing other threads to run.
Disadvantages of the GIL
- Limited Multithreading Performance: The GIL can severely restrict the performance of CPU-bound multithreaded applications, as it only allows one thread to execute Python bytecode at a time, leading to underutilization of multicore processors.
- Thread Management Complexity: Although the GIL simplifies memory management, it can complicate the design of concurrent applications, forcing developers to carefully manage threading issues or use multiprocessing instead.
- Hindrance to Parallel Processing: With the GIL enabled, achieving true parallelism in Python applications is challenging, making it difficult for developers to leverage multicore architectures effectively.
- Inefficiency in Context Switching: Frequent context switching due to the GIL can introduce overhead, especially in applications with many threads, leading to performance degradation.
https://geekpython.in/gil-become-optional-in-python
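A minimal sketch of how to tell whether an interpreter supports free-threading, assuming a CPython 3.13 free-threaded build (the `python3.13t` executable) and the switches documented for it; exact names may vary by distribution:

```python
import sys
import sysconfig

# Reports 1 when the interpreter was compiled with free-threading support
# (PEP 703), i.e. the GIL can be turned off.
print("Free-threaded build:", sysconfig.get_config_var("Py_GIL_DISABLED"))

# On 3.13+ free-threaded builds, sys._is_gil_enabled() reports whether the
# GIL is actually active in this process.
if hasattr(sys, "_is_gil_enabled"):
    print("GIL enabled at runtime:", sys._is_gil_enabled())
```

On a free-threaded build, running `PYTHON_GIL=0 python3.13t script.py` (or `python3.13t -X gil=0 script.py`) disables the GIL, so CPU-bound threads can actually run on multiple cores.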
-
Ben Gunsberger – AI-generated podcast about AI using Google NotebookLM
Listen to the podcast in the post
“I just created an AI-generated podcast by feeding an article I wrote into Google’s NotebookLM. If I hadn’t made it myself, I would have been 100% fooled into thinking it was real people talking.”
-
Apple releases Depth Pro – An open source AI model that rewrites the rules of 3D vision
The model is fast, producing a 2.25-megapixel depth map in 0.3 seconds on a standard GPU.
https://github.com/apple/ml-depth-pro
https://arxiv.org/pdf/2410.02073
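A short sketch of the intended usage, following the pattern shown in the repository README (function names such as `depth_pro.create_model_and_transforms` are taken from that README; treat them as assumptions if the API has since changed):

```python
import depth_pro

# Load the pretrained model and its preprocessing transform.
# Assumes the checkpoint has been downloaded as described in the repo.
model, transform = depth_pro.create_model_and_transforms()
model.eval()

# Load an RGB image; f_px is the focal length in pixels, if present in EXIF.
image, _, f_px = depth_pro.load_rgb("example.jpg")
image = transform(image)

# Run metric depth inference.
prediction = model.infer(image, f_px=f_px)
depth = prediction["depth"]                    # depth map in meters
focallength_px = prediction["focallength_px"]  # estimated focal length in pixels
```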
-
Anders Langlands – Render Color Spaces
https://www.colour-science.org/anders-langlands/
This page compares images rendered in Arnold using spectral rendering and different sets of colourspace primaries: Rec.709, Rec.2020, ACES and DCI-P3. The SPD data for the GretagMacbeth Color Checker are the measurements of Noboru Ohta, taken from Mansencal, Mauderer and Parsons (2014), colour-science.org.
-
Björn Ottosson – How software gets color wrong
https://bottosson.github.io/posts/colorwrong/
Most software around us today is decent at accurately displaying colors. Processing of colors is another story, unfortunately, and is often done badly.
To understand the problem, let’s start with an example of three ways of blending green and magenta:
- Perceptual blend – A smooth transition using a model designed to mimic human perception of color. The blending is done so that the perceived brightness and color vary smoothly and evenly.
- Linear blend – A model for blending colors based on how light behaves physically. This type of blending can occur naturally in many ways, for example when colors are blended by focus blur in a camera or when viewing a pattern of two colors at a distance.
- sRGB blend – This is how colors would normally be blended in computer software, using sRGB to represent the colors.
Let’s look at some more examples of blending colors to see how these problems surface in practice. The examples use strong colors, since the differences are then more pronounced, and the same three blending methods as the first example.
Instead of making it as easy as possible to work with color, most software makes it unnecessarily hard by doing image processing with representations not designed for it. Approximating the physical behavior of light with linear RGB models is one easy improvement, but more work is needed to create image representations tailored for image processing and human perception.
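As a minimal sketch of why the sRGB blend misbehaves, here is the naive sRGB interpolation next to a linear-light blend (the transfer functions below are the standard sRGB ones; a perceptual blend would additionally need a perceptual space such as Oklab, which is not shown):

```python
def srgb_to_linear(c):
    # Decode an sRGB-encoded channel value in [0, 1] to linear light.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Encode a linear-light channel value in [0, 1] back to sRGB.
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend(a, b, t, linear=True):
    if not linear:
        # Naive sRGB blend: interpolate the encoded values directly.
        return tuple((1 - t) * x + t * y for x, y in zip(a, b))
    # Linear blend: decode, interpolate in linear light, re-encode.
    return tuple(
        linear_to_srgb((1 - t) * srgb_to_linear(x) + t * srgb_to_linear(y))
        for x, y in zip(a, b)
    )

green = (0.0, 1.0, 0.0)
magenta = (1.0, 0.0, 1.0)
print(blend(green, magenta, 0.5, linear=False))  # darker, muddier midpoint
print(blend(green, magenta, 0.5, linear=True))   # brighter, closer to how light mixes
```

At t = 0.5 the naive sRGB blend lands on a noticeably darker grey than the linear blend, which is exactly the artifact the post illustrates.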
-
EVER (Exact Volumetric Ellipsoid Rendering) – Gaussian splatting alternative
https://radiancefields.com/how-ever-(exact-volumetric-ellipsoid-rendering)-does-this-work
https://half-potato.gitlab.io/posts/ever/
Unlike previous methods like Gaussian Splatting, EVER leverages ellipsoids instead of Gaussians and uses Ray Tracing instead of Rasterization. This shift eliminates artifacts like popping and blending inconsistencies, offering sharper and more accurate renderings.
-
The Rise and Fall of Adobe – A list of better software alternatives to a criminal company
Best alternatives to Adobe:
https://github.com/KenneyNL/Adobe-Alternatives
- Affinity (Photo and illustration editing) https://affinity.serif.com/
- DaVinci Resolve (video editing): https://www.blackmagicdesign.com/au/products/davinciresolve/
- Clip Studio Paint (illustration): https://www.clipstudio.net/en/
- Toon Boom (animation): https://www.toonboom.com/
-
Microsoft is discontinuing its HoloLens headsets
https://www.theverge.com/2024/10/1/24259369/microsoft-hololens-2-discontinuation-support
Software support for the original HoloLens headset will end on December 10th.
Microsoftโs struggles with HoloLens have been apparent over the past two years.
-
Meta Horizon Hyperscape
Meta Hyperscape in a nutshell
Hyperscape technology allows us to scan spaces with just a phone and create photorealistic replicas of the physical world with high fidelity. You can experience these digital replicas on the Quest 3 or on the just announced Quest 3S.
https://www.youtube.com/clip/UgkxGlXM3v93kLg1D9qjJIKmvIYW-vHvdbd0
High fidelity enables a new sense of presence
This level of photorealism will enable a new way to be together, where spaces look, sound, and feel like you are physically there.
Simple capture process with your mobile phone
Currently not available, but in the future, it will offer a new way to create worlds in Horizon and will be the easiest way to bring physical spaces to the digital world. Creators can capture physical environments on their mobile device and invite friends, fans, or customers to visit and engage in the digital replicas.
Cloud-based processing and rendering
Using Gaussian Splatting, a 3D modeling technique that renders fine details with high accuracy and efficiency, we process the model input data in the cloud and render the created model through cloud rendering and streaming on Quest 3 and the just announced Quest 3S.
Try it out yourself
If you are in the US and you have a Meta Quest 3 or 3S you can try it out here:
https://www.meta.com/experiences/meta-horizon-hyperscape-demo/7972066712871980/