Wednesday 30 June 2021

Is There More To a Digital Image Than Meets the Eye?

NAB

We might think we know what makes up our perception of digital images, such as resolution, color, and pixels, but what if something else about the image gave us clues as to its characteristics? It’s a question of perception.

https://amplify.nabshow.com/articles/is-there-more-to-a-digital-image-than-meets-the-eye/

“I think there is more in an image than can be resolved by your eyes,” writes David Shapton at RedShark News. “I think there is a macro effect, or an epiphenomenon, in addition to the original visual or auditory phenomenon.”

Shapton suggests that there are macro effects in images that we don’t see directly, which nevertheless inform our perception and give us extra information about the image.

He calls it “natural intrinsic metadata.” He says, “It’s there in every reproduced image. The nature and quantity of it does depend on the resolution.”

If that seems a bit woolly, then that may be because the science of sight isn’t hard and fast. In fact, think too hard and you stray into the grey area of philosophy.

For example, do we see the same colors? We all perceive colors through the eye’s roughly one hundred million photoreceptor cells — the rods and cones. But it’s our brain that interprets light, color and form, so there may be no way to know whether we all see the same colors.

“We see with our brain, not with our eyes,” Shapton acknowledges. “Whatever we see is always being interpreted, so there are bound to be ‘epiphenomena.’ The content of an image has a big effect on how the image is perceived. As resolution, dynamic range and color depth (and frame rate) increase, it requires creators to re-evaluate our work. A bit like you have to rethink your painting process when you switch from oil paint to watercolors.”

Pixels are clearly not the only criterion for seeing an image. Higher contrast increases perceived brightness; an object’s edges appear sharper.

Plus, we’re talking pixels — light that has been recorded onto a sensor and replicated digitally on a screen. That has to impact our visual sense.

“When we are looking at an object in the real world, we are placing it in focus, while not concentrating deeply on the rest of the surroundings,” says YungKyung Park, associate professor in Color Design at Ewha Womans University, Seoul, who specializes in color science.

“However, when we are looking at the same object using the display, we can focus on every point of the screen, allowing us to perceive substantially more information about this object.”

The Science of Hyperrealism

She argues that 8K displays are able to provide a smoother gradient and improve sharpness to the point where objects seem even more realistic than in the real world.

This phenomenon is referred to as hyperrealism.

“Hyperrealism is achieved when we are able to capture and comprehend even the subtlest lighting and shading effects and the display is able to transmit extreme glossy and shadow expressions, giving us abundant information about the object,” she says. “Ultra-high resolution displays are evolving to approach the capacity of our vision.”

She attributes the appearance of “hyperrealism” to the formation of “Mach bands” — a phenomenon in which the bands of a gradient appear lighter or darker near their edges than they actually are.

“Mach bands affect our perception of color and brightness values — and essentially result in an optical illusion. That lateral inhibition in the retinal visual system enhances the perceived sharpness of edges.”
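To see what that lateral-inhibition story predicts, here is a minimal sketch in Python. It filters a flat luminance staircase with a difference-of-Gaussians kernel, a standard crude model of center-surround inhibition; the band values and kernel widths are arbitrary choices for illustration, not anything from Park’s work.

```python
import numpy as np

# A 1D luminance "staircase": five flat bands of increasing brightness,
# the classic stimulus for demonstrating Mach bands.
signal = np.repeat(np.linspace(0.2, 0.8, 5), 40)

# Difference-of-Gaussians kernel: a crude model of lateral inhibition,
# with an excitatory center and an inhibitory surround.
x = np.arange(-10, 11)
center = np.exp(-x**2 / (2 * 1.5**2))
surround = np.exp(-x**2 / (2 * 5.0**2))
kernel = center / center.sum() - surround / surround.sum()

# "Perceived" brightness = stimulus plus the inhibition response.
response = signal + np.convolve(signal, kernel, mode="same")

# At each band edge, the response overshoots on the bright side and
# undershoots on the dark side, even though every band is perfectly flat.
for i in (40, 80, 120, 160):
    print(f"edge {i}: dark side {signal[i-1]:.2f} -> {response[i-1]:.3f}, "
          f"bright side {signal[i]:.2f} -> {response[i]:.3f}")
```

The bands themselves are uniform, yet the filtered response bends at every edge: the Mach band illusion, and a little extra edge sharpness that the pixels never contained.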

Park does provide some science to back this up. She led an experiment at Ewha Womans University to find out whether increasing screen resolution from 4K to 8K made a significant difference to the viewing experience.

In the study, 120 students rated perceived image quality as increasing by 30% and depth perception by 60% when moving from 4K to 8K.

What fascinated Park is that rather than pointing out the increased sharpness or contrast associated with higher resolution, participants highlighted differences related to sensory perception.

“They described images depicted on the 8K screen as evoking higher sense and perception — for example, noting that objects look cooler, warmer, more delicious, heavier,” she says.

Further, the study found that perceptual qualities (parameters closely tied to the display technology itself, like contrast, color, and resolution) are positively related to cognitive attributes (like temperature, sense of reality, space, depth, and perceived image quality).

Because of all this additional information about the object, she concludes, “on the cognitive level, it appears like it has an even higher resolution — which is the perceptual rather than measured quality — which makes us feel like the image is more real and has higher senses.”

Park’s experiment was sponsored by Samsung in support of marketing its 8K flat panels, so perhaps we should take all of this with a pinch of salt.

The need for an 8K display, let alone anything of higher pixel count, is typically dismissed as a waste of time on the grounds that we humans can’t physically resolve the extra detail.

Visual Acuity and Hyperacuity

This is being countered by arguments that the most common method for determining whether resolution is visible to humans — termed simple acuity — only tells part of the story.

What we need is the view of an independent visual imaging expert. How about Chris Chinnock, owner of analysis firm Insight Media?

“Generally, simple acuity is measured via the Snellen eye chart which determines the ability to see distinct black-on-white horizontal and vertical high-contrast elemental blocks,” Chinnock explains.
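That Snellen measure is easy to turn into a back-of-envelope check, since 20/20 vision corresponds to resolving detail of roughly one arcminute. Here is a minimal sketch; the 65-inch panel size and 2 m viewing distance are assumptions for illustration, not figures from Chinnock.

```python
import math

def pixel_angle_arcmin(diagonal_in, h_pixels, distance_m):
    """Angle subtended by one pixel of a 16:9 panel, in arcminutes."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    return math.degrees(math.atan2(width_m / h_pixels, distance_m)) * 60

# Simple (Snellen) acuity resolves detail down to roughly 1 arcminute.
for name, h_pixels in [("HD", 1920), ("4K", 3840), ("8K", 7680)]:
    a = pixel_angle_arcmin(65, h_pixels, 2.0)
    status = "resolvable" if a >= 1.0 else "below the 1 arcmin limit"
    print(f"{name}: {a:.2f} arcmin per pixel at 2 m ({status})")
```

On those assumed numbers, even 4K pixels already fall below the simple-acuity threshold at a normal living-room distance, which is exactly why 8K gets dismissed.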

“However, human vision is far more complex than a simple acuity measurement. Research suggests that 8K images are engaging other senses that can be difficult, if not impossible, to measure but are real nonetheless.”

For example, take a look at the night sky. Some stars are far too small to be seen according to simple acuity theory — but we can see them all the same.

Shapton and Chinnock suggest that Vernier acuity, combined with the brain’s ability to “fill in the gaps,” comes into play. Vernier acuity, or hyperacuity, refers to the ability to discern slight misalignments between lines — an ability far finer than simple-acuity descriptions of human vision would predict.
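Typical textbook values make the gap concrete: simple acuity bottoms out around 60 arcseconds, while Vernier thresholds are commonly quoted at around 5 arcseconds. A rough sketch, reusing the same assumed 65-inch panel and 2 m distance as above (none of these numbers come from the article):

```python
import math

SIMPLE_ACUITY_ARCSEC = 60.0  # ~1 arcminute: resolving separate details
VERNIER_ACUITY_ARCSEC = 5.0  # ~5 arcseconds: detecting misaligned lines

# One-pixel jog in a line on a 65-inch 8K (7680-wide) panel seen from 2 m.
width_m = 65 * 0.0254 * 16 / math.hypot(16, 9)  # width of a 16:9 panel
jog_arcsec = math.degrees(math.atan2(width_m / 7680, 2.0)) * 3600

print(f"one-pixel misalignment: {jog_arcsec:.0f} arcsec")
print(f"too fine to resolve as detail (< {SIMPLE_ACUITY_ARCSEC:.0f} arcsec), "
      f"yet ~{jog_arcsec / VERNIER_ACUITY_ARCSEC:.1f}x the Vernier threshold")
```

In other words, on these assumptions the eye cannot resolve an individual 8K pixel, but a one-pixel jog in an edge is still several times larger than what hyperacuity can detect.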

“You’ve got simple and hyperacuity coming together in the brain to create an image that dimensionalizes the image in ways that we’re not fully aware of,” Chinnock contends.

So, factor in ultra-resolution with HDR, immersive audio (also suggested by Shapton as enhancing perception), wider color gamut and higher frame rates to reduce motion blur, and we begin to see things — differently.

“Whether you have an 8K image or a 100K image, the higher fidelity and lack of artefacts reinforces what the brain is able to do,” Chinnock says. “The brain is not having to work as hard to recreate the image.”

Truth be told, we don’t actually know. As Shapton notes, “the field of digital video, especially extremely high resolution, is very young.”

We are not in Kansas any more.
