Exposure Value – What does it mean?

Exposure Value (Ev) – what does Ev mean?

I get asked this question every now and again, because I frequently use Ev in the shot-data annotations that accompany images here on the blog.

And I have to say from the outset that Exposure Value comes in two flavours – relative and absolute – and here I’m mainly talking about the former.

So, let’s start with basic exposure.

Exposure can be thought of as Intensity x Time.

Intensity is controlled by our aperture, and time is controlled by our shutter speed.

This image was shot at 0.5sec (time), f11 (intensity) and ISO 100.

exposure value

We can think of the f11 intensity of light striking the sensor for 0.5sec as a ‘DOSAGE’ – and if that dosage results in the desired scene exposure then that dosage can be classed as the exposure value.

Let’s consider two exposure settings – 0.5sec at f11 ISO100 and 1sec at f16 ISO 100.

Technically speaking they are two different exposures, but BOTH result in the same light dosage at the sensor.  The second exposure is TWICE the length of time but HALF the intensity.

So both exposures have the same Exposure Value or Ev.
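If you like to see actual numbers, intensity is proportional to 1/(f-number)², so the ‘dosage’ is simply time divided by the f-number squared. Here’s a quick back-of-the-envelope sketch in Python – nothing camera-specific, and any tiny fraction-of-a-stop discrepancy is only there because ‘f/11’ and ‘f/16’ are rounded nominal numbers:

```python
from math import log2

def dose(shutter, aperture):
    """Relative light 'dosage' at the sensor: time x intensity, where intensity ~ 1/N^2."""
    return shutter / aperture ** 2

ref = dose(0.5, 11)                  # the desired 0.5sec f11 exposure
print(log2(dose(1.0, 16) / ref))     # ~0   -> same dosage, so the same Ev
print(log2(dose(1.0, 11) / ref))     # +1.0 -> double the dose = +1Ev
print(log2(dose(0.25, 11) / ref))    # -1.0 -> half the dose = -1Ev
```

The log2 of the dosage ratio is just the difference in stops – which is all relative Ev really is.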

The following exposure of the same scene is 1sec at f11 ISO 100:

exposure value

The image was shot at the same intensity (f11) but the shutter speed (time) was twice as long, and so the dosage was doubled.  Double the dose = +1Ev!

And in this version the exposure was 0.25sec at f11 ISO 100:

exposure value

Here the light dosage at the sensor is HALF that of the correct/desired exposure because the time factor was halved while using the same intensity.

So half the dose = -1Ev!

Now some of you will be thinking that -1Ev is 1 stop under exposure – and you’d be right!

But Ev, or exposure value, is just a cleaner way of thinking about exposure because it doesn’t tie you to any specific camera setting – and it’s more easily transferable between cameras.

What Do I Mean by that?

Example – If I use say a 50mm prime lens on my Nikon D800E with the metering in matrix mode, ISO 100 and f14 I might get a metered exposure shutter speed of 1/10th of a second.

But if I replace the D800E with a D4 set at 100 ISO, matrix and f14 I’ll guarantee the metered shutter speed requirement will be either 1/13 or 1/15th of a second.

The D4 meters between -1/3Ev and -2/3Ev (in other words 1/2 stop) faster/brighter than the D800E when fitted with the same lens and set to the same aperture and ISO, and shooting exactly the same framing/composition.

Yet the ‘as metered’ shots from both cameras look pretty much the same with respect to light dosage – exposure value.

Exposure settings don’t transfer between camera models very well, because the meter in a camera is calibrated to the response curve of its sensor.

A Canon 1DX Mk2 will usually generate an evaluative metered shutter speed 1/3rd of a stop faster than a matrix metered Nikon D4S for the same given focal length, aperture and ISO setting.

Both setups’ ‘as metered’ shots will look pretty much the same, but transposing the Canon settings to the Nikon will result in -1/3 stop under exposure – which on a digital camera is definitely NOT the way to go!

‘As Metered’ can be regarded as +/-0Ev for any camera (Note: this does NOT mean Ev=0!)

Any exposure compensation you use in order to achieve the ‘desired’ exposure on the other hand can be thought of as ‘metered + or – xEv’.

exposure compensation

Shot with the D4 plus 70-200 f2.8@70mm in manual exposure mode, 1/2000th sec, f8 and ISO 400 using +2/3Ev compensation.

The matrix metered exposure indicated by the camera before the exposure value compensation was 1/3200th – this would have made the Parasitic Jaeger (posh name for an Arctic Skua!) too dark.
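For the arithmetic-minded, +2/3Ev of compensation simply means multiplying the metered time by 2^(2/3) ≈ 1.59, so 1/3200th becomes roughly 1/2016th – which the third-stop shutter scale rounds to the 1/2000th the camera was actually set to.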

A 1DXMk2 using the corresponding lens and focal length, f8, ISO 400 and evaluative metering would have wanted to generate a shutter speed of at least 1/4000th sec without any exposure compensation, and 1/2500th with +2/3Ev exposure compensation.

And if shot at those settings the Canon image would look pretty much like the above.

But if the Nikon D4 settings had been fully replicated on the Canon then the shot would be between 1/3 and 1/2 stop over exposed, risking ‘blowing’ of some of the under-wing and tail highlights.

So the simple lesson here is don’t use other photographers’ settings – they never work unless you’re on identical gear!

But if you are out with me and I tell you “matrix/evaluative plus 1Ev” then your exposure will have pretty much the same ‘light dosage’ as mine, irrespective of whether your shutter speed, aperture or ISO choices are the right ones for the job!

I was brought up to think in terms of exposure value and Ev units, and to use light meters that had Ev scales on them – hell, the good ones still have ’em!

If you look up the ‘tech-specs’ for your camera you’ll find that metering sensitivity is normally quoted as an Ev range.  And that’s not all – your auto focus may well have a low light Ev limit quoted too!

To all intents and purposes Ev units and your more familiar ‘f-stops’ amount to one and the same thing.

As we’ve seen before, different exposures in terms of intensity and time can have the same exposure value, and all Ev is concerned with is the cumulative outcome of our shutter speed, aperture and ISO choices.

Most of you will take exposures at ‘what the camera meter says’ settings, or you will use the meter indicated exposure as a baseline and modify the exposure settings with either positive or negative ‘weighting’ via your exposure compensation dial.

That’s Ev compensation relative to your meter’s baseline.

But have you ever asked yourself just how accurate your camera meter is?

So I’ve just stepped outside my front door and taken these two frames:

exposure value

EV=15/Sunny 16 Rule 1/100th sec, f16, 100 ISO – click to view large.

exposure value

Matrix Metering, no exposure compensation 1/200th sec, f16, ISO 100 – click to view large

These two raw files have been brought into Lightroom and THE ONLY adjustment has been to change the profile from Adobe Color to Camera Neutral.

Members of my subscription site can download the raw files and see for themselves.

Look at the histogram in both images!

The exposure for xxx164.NEF (the top image) is perfection personified while xxx162.NEF is under exposed by ONE WHOLE STOP – why?

Because the bottom image has been shot at the camera-specified matrix metered exposure, while the top image has been shot using the good old ‘Sunny 16 Rule’ that’s been around since God knows when!

“Yeah, but I could just use the shadow recovery slider on the bottom shot Andy….”  Yes, you could, if you wanted to be an idle tit, and even then the top image would still be better because there’s no ‘recovery’ being used on it in the first place.  Remember, more work at the camera means less work in processing!

Recovery of either shadows or highlights is ‘poor form’ and no substitute for correct exposure in the first place. Digital photography is just like shooting colour transparency film – you need to ‘peg the highlights’ as highlights BUT without over exposing them and causing them to ‘blow’.

In other words – ETTR, expose to the right!

And seeing as your camera meter wants to turn everything into midtone grey shite it’s the very last thing you should ever allow to dictate your final exposure settings – as the two images above prove beyond argument.

And herein lies the problem.

Even if you use the spot metering function the meter will read the brightness of what is covered by the ‘spot’ and then calculate the exposure required to expose that tonal brightness AS A MID TONE GREY.

That’s all fine ‘n dandy – if the metered area is actually an exact mid tone.  But what if you were metering a highlight?

Then the metered exposure would want to expose said highlight as a midtone and the overall highlight exposure would be far too dark.  And you can guess what would happen if you trusted your meter to spot-read a shadow.
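If you do spot-meter a highlight, the fix is to tell the camera how far above midtone you want that tone placed. A rough sketch with made-up numbers – the +2 stop placement is just an example, not a universal rule:

```python
metered_shutter = 1 / 500    # what the spot reading asks for - it assumes the tone is a midtone
placement_ev = 2             # we want this white rendered about 2 stops above mid grey
shutter = metered_shutter * 2 ** placement_ev
print(1 / shutter)           # 125.0 -> shoot at 1/125th to keep the white looking like a white
```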

A proper hand-held spot meter has an angle of view or AoV of 1 degree.

Your camera spot meter angle of view is dictated by the focal length of the lens you have fitted.

On my D800E for example, I need a lens of around 130mm focal length for the spot to cover just 1 degree, because the ‘spot’ is a fixed 4mm in diameter – total stupidity.

But it does function fairly well with wider angle lenses and exposure calculations when used in conjunction with the live view histogram.  And that will be the subject of my next blog post – or perhaps I’ll do a video for YouTube!

So I doubt this blog post about relative exposure compensation is going to light your world on fire – it began as an explanation to a recurring question about my exif annotation habits and snowballed somewhat from there!

But I’ll leave you with this little guide to the aforementioned Sunny 16 Rule, which has been around since Noah took up boat-building:

To use this table just set your ISO to 100.

Your shutter speed needs to be the reciprocal of your ISO – in other words 1/100 sec for use with the stated aperture values:

Aperture | Lighting conditions | Shadow properties
f/22* | Snow/sand | Dark with sharp edges
f/16 | Sunny | Distinct
f/11 | Slight overcast | Soft around edges
f/8 | Overcast | Barely visible
f/5.6** | Heavy overcast | No shadows
f/4 | Open shade/sunset | No shadows

* – I would not shoot at f22 because of diffraction – try 1/200th f16

** – let’s try some cumulative Ev thinking here and go for more depth of field using f11 and sticking with 100 ISO. -2Ev intensity (f5.6 to f11) requires +2Ev on time, so 1/100th sec becomes 1/25th sec.
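And if you’d rather let a computer do the cumulative Ev thinking from that footnote, this little sketch does the same sums – nominal f-numbers again, so the answers land within a whisker of the proper third-stop values:

```python
from math import log2

# Sunny 16 baseline: ISO 100, shutter = 1/ISO, aperture taken from the table above
sunny16 = {"snow/sand": 22, "sunny": 16, "slight overcast": 11,
           "overcast": 8, "heavy overcast": 5.6, "open shade/sunset": 4}

def equivalent_shutter(conditions, aperture, iso=100):
    """Keep the Sunny 16 dosage but trade intensity for time - same Ev, different settings."""
    stops = 2 * log2(aperture / sunny16[conditions])   # -xEv of intensity...
    return (1 / iso) * 2 ** stops                      # ...needs +xEv of time

print(equivalent_shutter("heavy overcast", 11))        # ~0.04sec, i.e. the 1/25th from footnote **
```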

Over the years I’ve taken many people out on photo training days, and a lot of them seem to think I’m some sort of magician when I turn their camera on, switch it to manual, dial in a couple of settings and produce a half decent image without ever looking at the meter on their camera.

It ain’t magic – I just had this table burnt into the back of my eyeballs years ago.

Works a charm – if you can do the calculations in your head, and that’s easy with practice.  The skill is in evaluating your shooting conditions and relating them to the lighting and shadow descriptions.

And here’s a question for you; we know our camera meter wants to ‘peg’ what it’s measuring as a midtone irrespective of whether it’s measuring a midtone or not.  But what do you think the Sunny 16 Rule is ‘pegging’ and where is it pegging it on the exposure curve?

If you can answer that question correctly then the other flavour of exposure value – absolute – might well be of distinct interest to you!

Give it a try, and if you use it correctly you’ll never be more than 1/3rd of a stop out, if that.  Then you can go and unsubscribe from all those twats on YouTube who told you it was out-dated and defunct or never told you about it in the first place!

I hope you’ve found the information in this post useful.

I don’t monetize my YouTube videos or fill my blog posts with masses of affiliate links, and I rely solely on my patrons to help cover my time and server costs. If you would like to help me to produce more content please visit my Patreon page on the button above.

Many thanks and best light to you all.

Astro Landscape Photography

Astro Landscape Photography


One of my patrons, Paul Smith, and I ventured down to Shropshire and the spectacular quartzite ridge of The Stiperstones to get this image of the Milky Way and Mars (the large bright ‘star’ above the rocks on the left).

I always work the same way for astro landscape photography, beginning with getting into position just before sunset.

Using the PhotoPills app on my phone I can see where the milky way will be positioned in my field of view at the time of peak sky darkness.  This enables me to position the camera exactly where I want it for the best composition.

The biggest killer in astro landscape photography is excessive noise in the foreground.

The other problem is that foregrounds in most images of this genre are not sharp due to a lack of depth of field at the wide apertures you need to shoot the night sky at – f2.8 for example.

To get around this problem we need to shoot a separate foreground image at a lower ISO, a narrower aperture and focused closer to the camera.

Some photographers change focus, engage long exposure noise reduction and then shoot a very long exposure.  But that’s an eminently risky thing to do in my opinion, both from a technical standpoint and one of time – a 60 minute exposure will take 120 minutes to complete.

The length of exposure is chosen to allow the very low photon-count from the foreground to ‘build-up’ on the sensor and produce a usable level of exposure from what little natural light is around.

From a visual perspective, when it works, the method produces images that can be spectacular because the light in the foreground matches the light in the sky in terms of directionality.

Light Painting

To get around the inconvenience of time and super-long exposures a lot of folk employ the technique of light painting their foregrounds.

Light painting – in my opinion – destroys the integrity of the finished image because it’s so bloody obvious!  The direction of light that’s ‘painted’ on the foreground bears no resemblance to that of the sky.

The other problem with light painting is this – those that employ the technique hardly ever CHECK to see if they are in the field of view of another photographer – think about that one for a second or two!

My Method

As I mentioned before, I set up just before sunset.  In the shot above I knew the milky way and Mars were not going to be where I wanted them until just after 1am, but I was set up by 9.20pm – yep, a long wait ahead, but always worth the effort.

Astro Landscape Photography

As we move towards the latter half of civil twilight I start shooting my foreground exposure, and I’ll shoot a few of these at regular intervals between then and mid nautical twilight.

Because I shoot raw the white balance set in camera is irrelevant, and can be balanced with that of the sky in Photoshop during post processing.

The key things here are that I get a shadowless, even illumination of my foreground, shot at a low ISO, in perfect focus, and at say f8 for great depth of field.

Once deep into blue hour and astronomical twilight the brighter stars are visible, and so I now use full magnification in live view and focus on a bright star in the camera’s field of view.

Then it’s a waiting game – waiting for the sky to darken to its maximum and the Milky Way to come into my desired position for my chosen composition.

Shooting the Sky

Astro landscape photography is all about showing the sky in context with the foreground – I have absolutely ZERO time for those popular YouTube photographers who composite a shot of the night sky into a landscape image shot in a different place or a different angle.

Good astro landscape photography HAS TO BE A COMPOSITE though – there is no way around that.

And by GOOD I mean producing a full resolution image that will sell through the agencies and print BIG if needed.

The key things that contribute to an image being classed good in my book are simple:

  • Pin-point stars with no trailing
  • Low noise
  • Sharp from ‘back’ to ‘front’.

Pin-point stars are solely down to the correct shutter speed for your sensor size and megapixel count.
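One rough-and-ready way of putting numbers on that: the sky drifts past at about 15 arc-seconds per second, so you just work out how long it takes a star to smear across more than a pixel or so. This is only a back-of-the-envelope sketch, and the 14mm lens and ~4.9µm pixel pitch are example figures rather than what this particular frame was shot with:

```python
SIDEREAL_RATE = 15.04            # arc-seconds of apparent sky rotation per second (celestial equator)

def max_shutter(focal_mm, pixel_pitch_um, max_trail_px=1.0):
    """Longest exposure before a star trails across more than max_trail_px pixels."""
    pixel_angle_arcsec = (pixel_pitch_um / 1000.0) / focal_mm * 206265
    return max_trail_px * pixel_angle_arcsec / SIDEREAL_RATE

print(max_shutter(14, 4.9))      # ~4.8 seconds - the same ballpark as the exposures quoted below
```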

Low noise is covered by shooting a low ISO foreground and a sequence of high ISO sky images, and using Starry Landscape Stacker on Mac (Sequator on PC appears to be very similar) in conjunction with a mean or median stacking mode.

Further noise cancelling is achieved by the shooting of Dark Frames, and the typical wide-aperture vignetting is cancelled out by the creation of a flat field frame.
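For the curious, here’s a much-simplified numpy sketch of what that calibration and stacking boils down to. The real programs also align the star field between frames before stacking, which this deliberately ignores, and the arrays are synthetic stand-ins purely so it runs:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic stand-ins for the real data: 25 high-ISO sky frames, 25 dark frames, one flat
sky_frames  = [rng.normal(100, 10, (400, 600)) for _ in range(25)]
dark_frames = [rng.normal(5, 2, (400, 600)) for _ in range(25)]
flat        = np.ones((400, 600))                              # flat-field frame, normalised to 1.0

master_dark = np.median(dark_frames, axis=0)                   # fixed-pattern/thermal noise estimate
calibrated  = [(f - master_dark) / flat for f in sky_frames]   # dark-subtract and de-vignette
stacked     = np.median(calibrated, axis=0)                    # random noise drops as frames pile up

print(sky_frames[0].std(), stacked.std())                      # the stack is far 'quieter' than a single frame
```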

And ‘back to front’ image sharpness should be obvious to you from what I’ve already written!

So, I’ll typically shoot a sequence of 20 to 30 exposures – all one after the other with no breaks or pauses – and then a sequence of 20 to 30 dark frames.

Shutter speeds usually range from 4 to 6 seconds.

Watch this video on my YouTube Channel about shutter speed:

Best viewed on the channel itself, and click the little cog icon to choose 1080pHD as the resolution.

Putting it all Together

Shooting all the frames for astro landscape photography is really quite simple.

Putting it all together is fairly simple and straight forward too – but it’s TEDIOUS and time-consuming if you want to do it properly.

The shot above took me a little over 4 hours!

And 80% of it is retouching in Photoshop.

I produce a very extensive training title – Complete Milky Way Photography Workflow – which teaches you EVERYTHING you need to know about the shooting and processing of astro landscape photography images – you can purchase it here – and if you use the offer code MWAY15 at the checkout you’ll get £15 off the purchase price.

But I wanted to try Raw Therapee for this Stiperstones image, and another of my patrons – Frank – wanted a video of processing methodology in Raw Therapee.

Easier said than done, cramming 4 hours into a typical YouTube video!  But after about six attempts I think I’ve managed it, and you can see it here, but I warn you now that it’s 40 minutes long:

Best viewed on the channel itself, and click the little cog icon to choose 1080pHD as the resolution.

I hope you’ve found the information in this post useful, together with the YouTube videos.

I don’t monetize my YouTube videos or fill my blog posts with masses of affiliate links, and I rely solely on my patrons to help cover my time and server costs.  If you would like to help me to produce more content please visit my Patreon page on the button above.

Many thanks and best light to you all.

ETTR Processing in Lightroom

ETTR Processing in Lightroom

When we shoot ETTR (expose to the right) in bright, harsh light, Lightroom can sometimes get the wrong idea and make a real ‘hash’ of rendering the raw file.

Sometimes it can be so bad that the less experienced photographer can get the wrong impression of their raw file exposure – and in some extreme cases they may even ‘bin’ the image thinking it irretrievably over exposed.

I’ve just uploaded a video to my YouTube channel which shows you exactly what I’m talking about:

The image was shot by my client and patron Paul Smith when he visited the Mara back in October last year,  and it’s a superb demo image of just how badly Lightroom can demosaic a straight forward +1.6 Ev ETTR shot.

Importing the raw file directly into Lightroom gives us this:

ETTR

But importing the raw file directly into RawTherapee with no adjustments gives us this:

ETTR

Just look at the two histogram versions – Lightroom is doing some crazy stuff to the image ‘in the background’ as there are ZERO develop settings applied.

But if you watch the video you’ll see that it’s quite straight forward to regain all that apparent ‘blown detail’.

And here’s the important bit – we do so WITHOUT the use of the shadow or highlight recovery sliders.  Anyone who has purchased my sharpening videos HERE knows that those two sliders can VERY EASILY cause undesirable ‘pseudo-sharpening’ halos, and they should only be used with caution.

ETTR

The way I process this +1.6 stop ETTR exposure inside Lightroom has revealed all the superb mid tone detail and given us a really good image that we could take into Photoshop and improve with some precision localized adjustments.

So don’t let Lightroom control you – you need to control IT!

Thanks for reading and watching.

You can also view this post on the free section of my Patreon pages HERE

If you feel this article and video has been beneficial to you and would like to see more per week, then supporting my Patreon page for as little as $1 per month would be a massive help.  Thanks everyone!

 

Lightroom v7.3 – Some Thoughts

Lightroom v7.3 – Some Thoughts

Well, I’d like to say the past week has been a blast, but Adobe screwed any chance of that happening by releasing version 7.3 of Lightroom on the 3rd/4th.

The week actually started off quite well with me uploading a Raw Therapee basic ‘get you started’ video:

I created that video primarily to help out anyone who has purchased my latest video training ‘Professional Grade Image Sharpening’ – click this link and get it bought if you haven’t already!

I’d planned to get out and do some photography, and do some serious SEO work on my YouTube channel.

But when I turned my machines on at 6.15am on the 4th I was greeted with some queries from clients and blog/channel viewers about some new fangled update for Lightroom.

Then the CC update panel told me I had application updates for Photoshop and Lightroom, so we clicked update on both.

I’m a bit of a Photoshop junkie, and I always look forward to any update if I’m honest, just so I can go and have a play with it!

But I’ve been a bit ‘meh..’ over Lightroom for quite a while now, for a few reasons.

Firstly, it’s trying to become some sort of pathetic one-stop-shop image processor, catering to the ‘instant gratification brigade’ INSTEAD OF what it’s meant to be – a superb digital asset management program and a raw processor designed to work in conjunction with the KING of image processors – PHOTOSHOP.

Secondly, its unique demosaicing algorithm is ludicrously outdated in comparison to C1, Iridient and RT, and its capture/input sharpening controls leave a lot to be desired.  Anyone who has been sensible and bought my massive sharpening training knows exactly what I’m talking about here, as I demonstrate these facts more than a few times!

In point of fact, on the demosaicing front, it’s not as clever as that found in either Canon DPP or Nikon Capture.

But, with a bit of patience and effort, you can strip all the crap background adjustments away, and get back to a relatively neutral starting point; as I’ve discussed many times previously on this blog.

So, once the updates were done, and I’d had a quick look at Photoshop, I fired up the new Lightroom v7.3 – and immediately wished I hadn’t!

Heading over to the Adobe Lightroom Forum I see A LOT of very upset users.

Strangely enough though, heading over to YouTube I see the exact opposite!

But, positive or negative, all the buzz is about the new profiles.

Lightroom v7.3 Profiles

There are tens of thousands of Lightroom v7.3 fan boys out there, plus even more users with a low level knowledge base, who do NOT understand what a ‘profile’ is – and Adobe are using this as a massive marketing tool.

Lightroom v7.3 profiles are simply Lightroom v7.2 PRESETS, re-bundled into something called a profile, and shoved into a different location in the Lightroom GUI.

The subtle difference is this – if you have a preset that gives a ‘certain look’ to an image, when you apply it, the relevant sliders in the dev module move.

But if you have a ‘profile’ that gives the same visual appearance, when you apply it the relevant sliders DON’T move.

A PRESET is a visible, front GUI adjustment, and a PROFILE is a buried, background adjustment.

You’ll see this corroborated by an Adobe Forum Moderator a little later on..

A preset shows up in the control sliders, and you can easily tweak these after applying the preset.

Application of a PROFILE however, gives you no control indication of what it’s done, so you can’t tweak its adjustments because you can’t see them.

Profiles just pander to people who basically want Adobe to process their images for them – harsh, but true.

Presets – for me, the few that I make are simply to save time in applying settings to remove Adobes processing of my images.

But for years there has been a third party after-market revenue stream in preset bundles from certain photography trainers – buy these and your images will look like mine!  So presets too steered their purchasers away from actually processing their own images, but at least those presets were designed by photographers!

Anyway, for those that haven’t seen the two videos I uploaded to YouTube about Lightroom v7.3, they are embedded below:

 

I was expecting a mixed response to those videos, from the sane and sensible:

lightroom v7.3

Click me – good old Franky!

lightroom v7.3

Click me

to the plain stupid:

lightroom v7.3

Click me – I’m worth a read!

but I wasn’t expecting the raft of these – this is the tamest:

lightroom v7.3

You have to have a thick skin if you stick videos on YouTube, but what the f**k does  a comment like that achieve?

Anyway, F**K all that. At the end of the first video I do say that if I find anything out about the new default sharpening amount in Lightroom v7.3 I would let you know in a blog article.

So I headed over to the Adobe Lightroom Forum to ask the question – it only took 10 minutes and an Adobe moderator addressed it, and a bit more besides.

I’ve screen-grabbed it so please click the image below to read it:

lightroom v7.3

I’m interesting so CLICK ME!

So, the important take-aways are:

nothing has changed

and

part of an effort by Adobe to offer a more pleasing “out-of-the-box” rendering

and

At the ‘base’ level nothing has changed. The demosaicing algorithm is unchanged, MelissaRGB is still the default colour space within the UI, and the Adobe Standard profile (DCP) for each supported camera is also unchanged. Likewise, the Camera Matching profiles are unchanged.

and

All of the new Adobe Raw and Creative profiles are built on top of Adobe Standard (i.e. Adobe Standard remains the base profile for all supported cameras). As such, these XMP based profiles apply settings under-the-hood.

Conclusion.

So basically the whole version update is geared SOLELY towards people with a camera who want instant gratification by allowing Adobe techs to process their images for them.

As someone who’s understood the photography process, and watched it evolve over the last 40 years, I count myself as something slightly more than just a fat bloke with a camera.

Forget about all this “I care about my images” garbage – I KNOW what constitutes a technically sound image, and ever since the inception of PV2012, Lightroom has been on a slippery slope towards losing its full professional image maker credibility.

Like many others, I still use Lightroom, and I always will.  As I said before, it excels in Digital Asset Management, and its Soft Proofing and Print facilities are really without equal.

Have they improved any of those features? In a nutshell, NO.

My monthly subscription has gone up by £25 a year, and for my money I’ve now got even more work to do inside the dev module to make sense of my raw files.  If you’ve lost the understanding of what I mean, go and watch the 2nd video again!

Am I even remotely thinking about dropping my subs and using another application?

I might look like a cabbage, but I’m not one!  My £120+ buys me access to a constantly updated installation of the finest image processor on the face of God’s Earth – the mighty Photoshop.

And for those without the required level of prior knowledge, that privilege used to cost in excess of £800, plus serious upgrade fees every couple of years.  That’s why there was such a ripping ‘trade’ in torrenting and cracked copies!

So overall, I’m quids-in, and I can think of Lightroom as something of a freebie, which makes even Lightroom v7.3 good VFM.

Added to that, I can always open a raw file in RT and get a 16bit ProPhotoRGB tiff file into Photoshop that’ll kick Lightroom’s version into the last millennium.

But I can’t help it, I do resent deeply the road down which the Adobe bosses are taking Lightroom.

What they should have done is make CC into an idiots version, and re-worked the Classic CC into a proper raw editor with multiple choices for demosaicing, a totally re-worked input sharpening module, and interface the result with the existing Print, Soft-Proof and DAM.

But of course, that would cost them money and reduce their profit margin – so there’s no chance of my idea ever coming to fruition.

I take my hat off to the guys in the C1 dev team, but C1 is far too hostile an environment for any of those thousands of idiots who love the new Lightroom profiles – because that would mean they’d need to do some actual processing work!

And if C1 is hostile, then RT is total Armageddon – hell, it even sends me into a cold sweat!

But photography has always been hard work that demanded knowledge before you started, and a lot of hard learning to acquire said knowledge.

Hard work never hurt anyone, and when does the path of least resistance EVER result in the best possible outcome?

Never – the result is always an average compromise.

And good image processing is all about the BEST IMAGE POSSIBLE from a raw file.

Which brings me nicely back to my sharpening training – get it bought you freebie-hunting misers! GO ON – DO IT NOW – BEFORE YOU FORGET and before I die of starvation!

sharpening

DO IT!


Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Professional Grade Image Sharpening

Professional Grade Image Sharpening for Archive, Print & Web – my latest training video collection.

image sharpening

View the overview page on my download store HERE

Over 11 hours of video training, spread across 58 videos…well, I told you it was going to be big!

And believe me, I could have made it even bigger, because there is FAR MORE to image sharpening than 99% of photographers think.

And you don’t need ANY stupid sharpener plugins – or noise reduction ones, come to that.  Because Photoshop does it ALL anyway, and is far more customizable and controllable than any plugin could hope to be.

So don’t waste your money any more – spend it instead on some decent training to show you how to do the job properly in the first place!

You won’t find a lot of these methods anywhere else on the internet – free or paid for – because ‘teachers cannot teach what they don’t know’ – and I know more than most!

image sharpening

As you can see from the list of lessons above, I cover more than just ‘plain old sharpening’.

Traditionally, image sharpening produces artifacts – usually white and black halos – if it’s overdone. And image sharpening emphasizes ‘noise’ in areas of shadow and other low frequency detail when it’s applied to an image in the ‘traditional’, often taught, blanket manner.

Why sharpen what isn’t in focus – to do so is madness, because all you do is sharpen the noise, and cause more artifacts!

Maximum sharpening should only be applied to detail in the image that is ‘fully in focus’.

So, as ‘focus sharpness’ falls off, so too should the level of applied sharpening.  That way, noise and other artifacts CANNOT build up in an image.

And the same can be said for noise reduction, but ‘in reverse’.

So image sharpening needs to be applied in a differential manner – and that’s what this training is all about.

Using a brush in Lightroom etc to ‘brush in’ some sort of differential sharpening is NOT a good idea, because it’s imprecise, and something of a fools task.

Why do I say that? Simple……. Because the ‘differential factor bit’ is contained within the image itself – and it’s just sitting there on your computer screen WAITING for you to get stuck in and use it.

But, like everything else in modern digital photography, the knowledge and skill to do so has somehow been lost in the last 12 to 15 years, and the internet is full of ‘teachers’ who have never had these skills in the first place – hence they can’t teach ’em!

However, everyone who buys this training of mine WILL have those skills by the end of the course.

It’s been a real hard slog to produce these videos.  Recording the lessons is easy – it’s the editing and video call-outs that take a lot of time.  And I’ve edited all the audio in Audacity to remove breath sounds and background noise – many thanks to Curtis Judd for putting those great lessons on YouTube!

The price is £59.99. So right now, that’s over 11 hours of training for less than £5.50 per hour – that’s way cheaper than a 1to1, or even a workshop day with a crowd of other people!

So head off over to my download store and buy it, because what you’ll learn will improve your image processing, whether it’s for big prints or just jpegs on the web – guaranteed – just click here!

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

 

Photoshop View Magnification

View Magnification in Photoshop (Patreon Only).

A few days ago I uploaded a video to my YouTube channel explaining PPI and DPI – you can see that HERE .

But there is way more to pixel per inch (PPI) resolution values than just the general coverage I gave it in that video.

And this post is about a major impact of PPI resolution that seems to have evaded the understanding and comprehension of perhaps 95% of Photoshop users – and Lightroom users too for that matter.

I am talking about image view magnification, and the connection this has to your monitor.

Let’s make a new document in Photoshop:

View Magnification

We’ll make the new document 5 inches by 4 inches, 300ppi:

View Magnification

I want you to do this yourself, then get a plastic ruler – not a steel tape like I’ve used…..

Make sure you are viewing the new image at 100% magnification, and that you can see your Photoshop rulers along the top and down the left side of the workspace – and right click on one of the rulers and make sure the units are INCHES.

Take your plastic ruler and place it along the upper edge of your lower monitor bezel – not quite like I’ve done in the crappy GoPro still below:

View Magnification

Yes, my 5″ long image is in reality 13.5 inches long on the display!

The minute you do this, you may well get very confused!

Now then, the length of your 5×4 image, in “plastic ruler inches” will vary depending on the size and pixel pitch of your monitor.

Doing this on a 13″ MacBook Pro Retina the 5″ edge is actually 6.875″ giving us a magnification factor of 1.375:1

On a 24″ 1920×1200 HP monitor the 5″ edge is pretty much 16″ long giving us a magnification factor of 3.2:1

And on a 27″ Eizo ColorEdge the 5″ side is 13.75″ or thereabouts, giving a magnification factor of 2.75:1

The 24″ HP monitor has a long edge of not quite 20.5 inches containing 1920 pixels, giving it a pixel pitch of around 94ppi.

The 27″ Eizo has a long edge of 23.49 inches containing 2560 pixels, giving it a pixel pitch of 109ppi – this is why its magnification factor is less than the 24″ HP.

And the 13″ MacBook Pro Retina has a pixel pitch of 227ppi – hence the magnification factor is so low.
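If you want to check your own monitor, the arithmetic is trivial – here’s a rough sketch (your display’s diagonal and native resolution go in, the rest falls out):

```python
from math import hypot

def pixel_pitch(diagonal_in, px_wide, px_high):
    """Approximate ppi of a display from its diagonal size and native resolution."""
    return hypot(px_wide, px_high) / diagonal_in

def on_screen_inches(image_inches, image_ppi, monitor_ppi):
    """Physical length of an image edge at 100% view - 1 image pixel per screen pixel."""
    return image_inches * image_ppi / monitor_ppi

hp24 = pixel_pitch(24, 1920, 1200)         # ~94 ppi
print(on_screen_inches(5, 300, hp24))      # ~16 inches - the 3.2:1 factor quoted above
```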

So WTF Gives with 1:1 or 100% View Magnification Andy?

Well, it’s simple.

The greatest majority of Ps users ‘think’ that a view magnification of 100% or 1:1 gives them a view of the image at full physical size, and some think it’s a full ppi resolution view, and they are looking at the image at 300ppi.

WRONG – on BOTH counts !!

A 100% or 1:1 view magnification gives you a view of your image using ONE MONITOR or display PIXEL to RENDER ONE IMAGE PIXEL.  In other words the image to display pixel ratio is now 1:1.

So at a 100% or 1:1 view magnification you are viewing your image at exactly the same resolution as your monitor/display – which for the majority of desktop users means sub-100ppi.

Why do I say that?  Because the majority of desktop machine users run a 24″, sub 100ppi monitor – Hell, this time last year even I did!

When I view a 300ppi image at 100% view magnification on my 27″ Eizo, I’m looking at it in a lowly resolution of 109ppi.  With regard to its properties such as sharpness and inter-tonal detail, in essence, it looks only 1/3rd as good as it is in reality.

Hands up those who think this is a BAD THING.

Did you put your hand up?  If you did, then see me after school….

It’s a good thing, because if I can process it to look good at 109ppi, then it will look even better at 300ppi.

This also means that if I deliberately sharpen certain areas (not the whole image!) of high frequency detail until they are visually right on the ragged edge of being over-sharp, then the minuscule halos I might have generated will actually be 3 times less obvious in reality.

Then when I print the image at 1440, 2880 or even 5760 DOTS per inch (that’s Epson stuff), that print is going to look so sharp it’ll make your eyeballs fall to bits.

And that dpi print resolution, coupled with sensible noise control at monitor ppi and 100% view magnification, is why noise doesn’t print to anywhere near the degree folk imagine it will.

This brings me to a point where I’d like to draw your attention to my latest YouTube video:

Did you like that – cheeky little trick isn’t it!

Anyway, back to the topic at hand.

If I process on a Retina display at over 200ppi resolution, I have a two-fold problem:

  • 1. I don’t have as big a margin or ‘fudge factor’ to play with when it comes to things like sharpening.
  • 2. Images actually look sharper than they are in reality – my 13″ MacBook Pro is horrible to process on, because of its excessive ppi and its small dimensions.

Seriously, if you are a stills photographer with a hankering for the latest 4 or 5k monitor, then grow up and learn to understand things for goodness sake!

Ultra-high resolution monitors are valid tools for video editors and, to a degree, stills photographers using large capacity medium format cameras.  But for us mere mortals on 35mm format cameras, they can actually ‘get in the way’ when it comes to image evaluation and processing.

Working on a monitor with a ppi resolution between the mid 90s and low 100s, at 100% view magnification, will always give you the most flexible and easy processing workflow.

Just remember, Photoshop linear physical dimensions always ‘appear’ to be larger than ‘real inches’ !

And remember, at 100% view magnification, 1 IMAGE pixel is displayed by 1 SCREEN pixel.  At 50% view magnification 1 SCREEN pixel is actually displaying the dithered average of a 2×2 block of IMAGE pixels, and at 25% magnification each monitor pixel is displaying the average of a 4×4 block.

Anyway, that’s about it from me until the New Year folks, though I am the world’s biggest Grinch, so I might well do another video or two on YouTube over the ‘festive period’ so don’t forget to subscribe over there.

Thanks for reading, thanks for watching my videos, and Have a Good One!

 

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

YouTube Channel Latest Video Training

My YouTube Channel Latest Photography Video Training.

I’ve been busy this week adding more content to the old YouTube channel.

Adding content is really time-consuming, with recording times taking around twice the length of the final video.

Then there’s the editing, which usually takes around the same time, or a bit longer.  Then encoding and compression and uploading takes around the same again.

So yes, a 25 minute video takes A LOT more than 25 minutes to make and make live for the world to view.

This weeks video training uploads are:

This video deals with the badly overlooked topic of raw file demosaicing.

Next up is:

This video is a refreshed version of getting contrast under control in Lightroom – particularly Lightroom Classic CC.

Then we have:

This video is something of a follow-up to the previous one, where I explain the essential differences between contrast and clarity.

And finally, one from yesterday – which is me, restraining myself from embarking on a full blown ‘rant’, all about the differences between DPI (dots per inch) and PPI (pixels per inch):

Important Note

Viewing these videos is essential for the betterment of your understanding – yes it is!  And all I ask for in terms of repayment from yourselves is that you:

  1. Click the main channel subscribe button HERE https://www.youtube.com/c/AndyAstbury
  2. Give the video a ‘like’ by clicking the thumbs up!

YouTube is a funny old thing, but a substantial subscriber base and liked videos will bring me closer to laying my hands on the latest gear to review for you!

If all my blog subscribers would subscribe to my YouTube channel then my subs would more than treble – so go on, what are you waiting for?

I do like creating YouTube free content, but I do have to put food on the table, so I have to do ‘money making stuff’ as well, so I can’t afford to become a full-time YouTuber yet!  But wow, would I like to be in that position.

So that’s that – appeal over.

Watch the videos, and if you have any particular topic you would like me to do a video on, then please just let me know.  Either email me, or you can post in the comments below – no comment goes live here unless I approve it, so if you have a request but don’t want anyone else to see it, then just say.

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Adobe Lightroom Classic and Photoshop CC 2018 tips

Adobe Lightroom Classic and Photoshop CC 2018 tips – part 1

So, you’ve either upgraded to Lightroom Classic CC and Photoshop CC 2018, or you are thinking of doing so.

Well, here are a couple of things I’ve found – I’ve called this part 1, because I’m sure there will be other problems/irritations!

Lightroom Classic CC GPU Acceleration problem

If you are having problems with shadow areas appearing too dark and somewhat ‘choked’ in the develop module – but things look fine in the Library module – then just follow the simple steps in the video above and TURN OFF GPU Acceleration in the Lightroom preferences panel under the performance tab.

Adobe Lightroom Classic and Photoshop CC 2018 tips

Turn OFF GPU Acceleration

UPDATE: I have subsequently done another video on this topic that illustrates the fact that the problem did not exist in Lr CC 2015 v.12/Camera Raw v.9.12

In the new Photoshop CC 2018 there is an irritation/annoyance with the brush tool, and something called the ‘brush leash’.

Now why on earth you need your brush on a leash God ONLY KNOWS!

But the brush leash manifests itself as a purple/magenta line that follows your brush tool everywhere.

You have a smoothness slider for your brush – its default setting is 10%.  If we increase that value then the leash line gets even longer, and even more bloody irritating.

And why we would need an indicator (which is what the leash is) of smoothness amount and direction for our brush strokes is a bit beyond me – because we can see it anyway.

So, if you want to change the leash length, use the smoothing slider.

If you want to change the leash colour just go to Photoshop>Preferences>Cursors

Adobe Lightroom Classic and Photoshop CC 2018 tips

Here, you can change the colour, or better still, get rid of it completely by unticking the “show brush leash while smoothing” option.

So there are a couple of tips from my first 24 hours with the latest 2018 ransomware versions from Adobe!

But I’m sure there will be more, so stay tuned, and consider heading over to my YouTube channel and hitting the subscribe button, and hit the ‘notifications bell’ while you’re at it!

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

 

Monitor Calibration Update

Monitor Calibration Update

Okay, so I no longer NEED a new monitor, because I’ve got one – and my wallet is in Leighton Hospital Intensive Care Unit on the critical list..

What have you gone for Andy?  Well if you remember, in my last post I was undecided between 24″ and 27″, Eizo or BenQ.  But I was favoring the Eizo CS2420, on the grounds of cost, both in terms of monitor and calibration tool options.

But I got offered a sweet deal on a factory-fresh Eizo CS270 by John Willis at Calumet – so I got my desire for more screen real-estate fulfilled, while keeping the costs down by not having to buy a new calibrator.

monitor calibration update

But it still hurt to pay for it!

Monitor Calibration

There are a few things to consider when it comes to monitor calibration, and they are mainly due to the physical attributes of the monitor itself.

In my previous post I did mention one of them – the most important one – the back light type.

CCFL and WCCFL – cold cathode fluorescent lamps, or LED.

CCFL & WCCFL (wide CCFL) used to be the common type of back light, but they are now less common, being replaced by LED for added colour reproduction, improved signal response time and reduced power consumption.  Wide CCFL gave a noticeably greater colour reproduction range and slightly warmer colour temperature than CCFL – and my old monitor was fitted with WCCFL back lighting, hence I used to be able to do my monitor calibration to near 98% of AdobeRGB.

CCFL back lights have one major property – that of being ‘cool’ in colour, and LEDs commonly exhibit a slightly ‘warmer’ colour temperature.

But there’s LEDs – and there’s LEDs, and some are cooler than others, some are of fixed output and others are of a variable output.

The colour temperature of the backlighting gives the monitor a ‘native white point’.

The ‘brightness’ of the backlight is really the only true variable on a standard type of LCD display, and the inter-relationship between backlight brightness and colour temperature, and the size of the monitor’s CLUT (colour look-up table), can have a massive effect on the total number of colours that the monitor can display.

Industry-standard documentation by folk a lot cleverer than me has for years recommended the same calibration target settings as I have alluded to in previous blog posts:

White Point: D65 or 6500K

Brightness: 120 cdm² or candelas per square meter

Gamma: 2.2

monitor calibration update

The ubiquitous ColorMunki Photo ‘standard monitor calibration’ method setup screen.

This setup for ‘standard monitor calibration’ works extremely well, and has stood me in good stead for more years than I care to add up.

As I mentioned in my previous post, standard monitor calibration refers to a standard method of calibration, which can be thought of as ‘software calibration’, and I have done many print workshops where I have used this method to calibrate Eizo ColorEdge and NEC Spectraviews with great effect.

However, these more specialised colour management monitors have the added bonus of giving you a ‘hardware monitor calibration’ option.

To carry out a hardware monitor calibration on my new CS270 ColorEdge – or indeed any ColorEdge – we need to employ the Eizo ColorNavigator.

The start screen for ColorNavigator shows us some interesting items:

monitor calibration update

The recommended brightness value is 100 cdm² – not 120.

The recommended white point is D55 not D65.

Thank God the gamma value is the same!

Once the monitor calibration profile has been done we get a result screen of the physical profile:

monitor calibration update

Now before anyone gets their knickers in a knot over the brightness value discrepancy there’s a couple of things to bear in mind:

  1. This value is always slightly arbitrary and very much dependent on working/viewing conditions.  The working environment should be somewhere between 32 and 64 lux or cdm² ambient – think Bat Cave!  The ratio of ambient to monitor output should always remain at between 32:75/80 and 64:120/140 (ish) – in other words between 1:2 and 1:3 – see earlier post here.
  2. The difference between 100 and 120 cdm² is only about 1/4 stop in camera Ev terms – so not a lot.

What struck me as odd though was the white point setting of D55 or 5500K – that’s 1000K warmer than I’m used to. (yes- warmer – don’t let that temp slider in Lightroom cloud your thinking!).

monitor calibration update

After all, 1000K is a noticeable variation – unlike the 20cdm² brightness shift.

Here’s the funny thing though; if I ‘software calibrate’ the CS270 using the ColorMunki software with the spectro plugged into the Mac instead of the monitor, I visually get the same result using D65/120cdm² as I do ‘hardware calibrating’ at D55 and 100cdm².

The same that is, until I look at the colour spaces of the two generated ICC profiles:

monitor calibration update

The coloured section is the ‘software calibration’ colour space, and the wire frame the ‘hardware calibrated’ Eizo custom space – click the image to view larger in a separate window.

The hardware calibration profile is somewhat larger and has a slightly better black point performance – this will allow the viewer to SEE just that little bit more tonality in the deepest of shadows, and those perennially awkward colours that sit in the Blue, Cyan, Green region.

It’s therefore quite obvious that monitor calibration via the hardware/ColorNavigator method on Eizo monitors does buy you that extra bit of visual acuity, so if you own an Eizo ColorEdge then it is the way to go for sure.

Having said that, the differences are small-ish so it’s not really worth getting terrifically evangelical over it.

But if you have the monitor then you should have the calibrator, and if said calibrator is ‘on the list’ of those supported by ColorNavigator then it’s a bit of a JDI – just do it.

You can find the list of supported calibrators here.

Eizo and their ColorNavigator are basically making a very effective ‘mash up’ of the two ISO standards 3664 and 12646 which call for D65 and D50 white points respectively.

Why did I go CHEAP ?

Well, cheaper…..

Apart from the fact that I don’t like spending money – the stuff is so bloody hard to come by – I didn’t want the top end Eizo in either 27″ or 24″.

With the ‘top end’ ColorEdge monitors you are paying for some things that I at least, have little or no use for:

  • 3D CLUT – I’m a general sort of image maker who gets a bit ‘creative’ with my processing and printing.  If I was into graphics and accurate repro of Pantone and the like, or I specialised in archival work for the V & A say, then super-accurate colour reproduction would be critical.  The advantage of the 3D CLUT is that it allows a greater variety of SUBTLY different tones and hues to be SEEN and therefore it’s easier to VISUALLY check that they are maintained when shifting an image from one colour space to another – eg softproofing for print.  I’m a wildlife and landscape photographer – I don’t NEED that facility because I don’t work in a world that requires a stringent 100% colour accuracy.
  • Built-in Calibrator – I don’t need one ‘cos I’ve already got one!
  • Built-in Self-Correction Sensor – I don’t need one of those either!

So if your photography work is like mine, then it’s worth hunting out a ‘zero hours’ CS270 if you fancy the extra screen real-estate, and you want to spend less than if buying its replacement – the CS2730.  You won’t notice the extra 5 milliseconds slower response time, and the new CS2730 eats more power – but you do get a built-in carrying handle!

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Raw File Compression

Raw File Compression.

Today I’m going to give you my point of view over that most vexatious question – is LOSSLESS raw file compression TRULY lossless?

I’m going to upset one heck of a lot of people here, and my chances of Canon letting me have any new kit to test are going to disappear over the horizon at a great rate of knots, but I feel compelled to post!

What prompts me to commit this act of potential suicide?

It’s this shot from my recent trip to Norway:

FW1Q1351-2

Direct from Camera

FW1Q1351

Processed in Lightroom

I had originally intended to shoot Nikon on this trip using a hire 400mm f2.8, but right at the last minute there was a problem with the lens that couldn’t be sorted out in time, so Calumet supplied me with a 1DX and a 200-400 f4 to basically get me out of a sticky situation.

As you should all know by now, the only problems I have with Canon cameras are their short Dynamic Range, and Canon’s steadfast refusal to allow for uncompressed raw recording.

The less experienced shooter/processor might look at the shot “ex camera” and be disappointed – it looks like crap, with far too much contrast, overly dark shadows and near-blown highlights.

Shot on Nikon the same image would look more in keeping with the processed version IF SHOT using the uncompressed raw option, which is something I always do without fail; and the extra 3/4 stop dynamic range of the D4 would make a world of difference too.

Would the AF have done as good a job – who knows!

The lighting in the shot is epic from a visual PoV, but bad from a camera exposure one. A wider dynamic range and zero raw compression on my Nikon D4 would allow me to have a little more ‘cavalier attitude’ to lighting scenarios like this – usually I’d shoot with +2/3Ev permanently dialled into the camera.  Overall the extra dynamic range would give me less contrast, and I’d have more highlight detail and less need to bump up the shadow areas in post.

In other words processing would be easier, faster and a lot less convoluted.

But I can’t stress enough just how much detrimental difference LOSSLESS raw file compression CAN SOMETIMES make to a shot.

Now there is a lot – and I mean A LOT – of opinionated garbage written all over the internet on various forums etc about lossless raw file compression, and it drives me nuts.  Some say it’s bad, most say it makes no difference – and both camps are WRONG!

Sometimes there is NO visual difference between UNCOMPRESSED and LOSSLESS, and sometimes there IS.  It all depends on the lighting and the nature of the scene/subject colours and how they interact with said lighting.

The main problem with the ‘it makes no difference’ camp is that they never substantiate their claims; and if they are Canon shooters they can’t – because they can’t produce an image with zero raw file compression to compare their standard lossless CR2 files to!

So I’ve come up with a way of illustrating visually the differences between various levels of raw file compression on Nikon using the D800E and Photoshop.

But before we ‘get to it’ let’s firstly refresh your understanding. A camera raw file is basically a gamma 1.0, or LINEAR gamma file:


Linear (top) vs Encoded Gamma

The right hand 50% of the linear gamma gradient represents the brightest whole stop of exposure – that’s one heck of a lot of potential for recording subtle highlight detail in a raw file.

It also represents the area of tonal range that is frequently most affected by any form of raw file compression.
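To put a number on just how big that brightest stop is: the data is linear, so each stop down the scale gets half the levels of the stop above it. A quick sketch for a 14 bit file:

```python
levels = 2 ** 14                        # 16384 discrete values in a 14 bit linear raw file
for stop in range(1, 7):
    in_this_stop = levels // 2 ** stop  # each successive stop down gets half of what's left
    print(f"stop {stop} from the top: {in_this_stop} levels")
# stop 1: 8192 levels, stop 2: 4096, stop 3: 2048 ... the deep shadows are left with scraps
```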

Neither Nikon nor Canon will reveal to the world the algorithm-based methods they use for lossless or lossy raw file compression, but it usually works by a process of ‘Bayer Binning’.

Bayer_Pattern

If we take a 2×2 block, it contains 2 green, 1 red and 1 blue photosite value – if we average the green values and then interpolate new values for red and blue output we will successfully compress the raw file.  But the data will be ‘faux’ data, not real data.
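Nobody outside Nikon or Canon knows the real algorithms, so purely as an illustration of the sort of 2×2 ‘binning’ described above, here’s a toy sketch – the point being that averaging the two greens replaces two real measurements with one:

```python
import numpy as np

rng = np.random.default_rng(1)
raw = rng.integers(0, 2 ** 14, (4, 4))       # toy 14 bit Bayer mosaic, RGGB pattern

def bin_rggb(block):
    """Crude 'binning' of one 2x2 RGGB block: average the greens, keep R and B."""
    r, g1 = block[0, 0], block[0, 1]
    g2, b = block[1, 0], block[1, 1]
    return int(r), int((g1 + g2) // 2), int(b)   # three values out instead of four real ones

for y in range(0, 4, 2):
    for x in range(0, 4, 2):
        print(bin_rggb(raw[y:y + 2, x:x + 2]))
```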

The other method we could use is to compress the tonal values in that brightest stop of recorded highlight tone – which is massive don’t forget – but this will result in a ’rounding up or down’ of certain bright tonal values thus potentially reducing some of the more subtle highlight details.

We could also use some variant of the same type of algorithm to ‘rationalise’ shadow detail as well – with pretty much the same result.

In the face of Nikon and Canon’s refusal to divulge their methodologies behind raw file compression, especially lossless, we can only guess what is actually happening.

I read somewhere that with lossless raw file compression the compression algorithms leave a trace instruction about what they have done and where they’ve done it in order that a raw handler programme such as Lightroom can actually ‘undo’ the compression effects – that sounds like a recipe for disaster if you ask me!

Personally I neither know nor do I care – I know that lossless raw file compression CAN be detrimental to images shot under certain conditions, and here’s the proof – of a fashion:

Let’s look at the following files:

raw file compression

Image 1: 14 bit UNCOMPRESSED

raw file compression

Image 2: 14 bit UNCOMPRESSED

raw file compression

Image 3: 14 bit LOSSLESS compression

raw file compression

Image 4: 14 bit LOSSY compression

raw file compression

Image 5: 12 bit UNCOMPRESSED

Yes, there are 2 files which are identical, that is 14 bit uncompressed – and there’s a reason for that which will become apparent in a minute.

First, some basic Photoshop ‘stuff’.  If I open TWO images in Photoshop as separate layers in the same document, and change the blend mode of the top layer to DIFFERENCE I can then see the differences between the two ‘images’.  It’s not a perfect way of proving my point because of the phenomenon of photon flux.

Photon Flux Andy??? WTF is that?

Well, here’s where shooting two identical 14 bit uncompressed files comes in – they themselves are NOT identical!:


The result of overlaying the two identical uncompressed raw files (above left) – it looks almost black all over indicating that the two shots are indeed pretty much the same in every pixel.  But if I amplify the image with a levels layer (above right) you can see the differences more clearly.

So there you have it – Photon Flux! The difference between two 14 bit UNCOMPRESSED raw files shot at the same time, same ISO, shutter speed AND with a FULLY MANUAL APERTURE.  The only difference between the two shots is the ratio and number of photons striking the subject and being reflected into the lens.
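If you want to repeat the comparison outside Photoshop, the ‘difference plus levels’ trick boils down to a per-pixel absolute difference followed by a brightness boost. A rough numpy equivalent – the arrays here are synthetic stand-ins purely to show the operation, in practice you’d load 16 bit TIFFs exported from the two NEFs:

```python
import numpy as np

a = np.random.default_rng(2).integers(0, 2 ** 16, (400, 600)).astype(np.int32)  # stand-in for frame 1
b = np.random.default_rng(3).integers(0, 2 ** 16, (400, 600)).astype(np.int32)  # stand-in for frame 2

diff = np.abs(a - b)                              # Photoshop's 'Difference' blend mode
amplified = np.clip(diff * 16, 0, 2 ** 16 - 1)    # rough stand-in for the Levels amplification
print(diff.max(), (diff > 0).mean())              # how large and how widespread the differences are
```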

Firstly 14 Bit UNCOMPRESSED compared to 14 bit LOSSLESS (the important one!):

raw file compression

14 bit UNCOMPRESSED vs 14 bit LOSSLESS

Please remember, the above ‘difference’ image contains photon flux variations too, but if you look carefully you will see greater differences than in the ‘flux only’ image above.

raw file compression raw file compression

The two images above illustrate the differences between 14 bit uncompressed and 14 bit LOSSY compression (left) and 14 bit UNCOMPRESSED and 12 bit UNCOMPRESSED (right) just for good measure!

In Conclusion

As I indicated earlier in the post, this is not a definitive testing method, sequential shots will always contain a photon flux variation that ‘pollutes’ the ‘difference’ image.

I purposefully chose this white subject with textured aluminium fittings and a blackish LED screen because the majority of sensor response will lie in that brightest gamma 1.0 stop.

The exposure was a constant +1EV, 1/30th @ f 18 and 100 ISO – nearly maximum dynamic range for the D800E, and f18 was set manually to avoid any aperture flicker caused by auto stop down.

You can see from all the ‘difference’ images that the part of the subject that seems to suffer the most is the aluminium part, not the white areas.  The aluminium has a stippled texture causing a myriad of small specular highlights – brighter than the white parts of the subject.

What would 14 bit uncompressed minus 14 bit lossless minus photon flux look like?  In a perfect world I’d be able to show you accurately, but we don’t live in one of those so I can’t!

We can try it using the flux shot from earlier:

raw file compression

But this is wildly inaccurate as the flux component is not pertinent to the photons at the actual time the lossless compression shot was taken.  But the fact that you CAN see an image does HINT that there is a real difference between UNCOMPRESSED and LOSSLESS compression – in certain circumstances at least.

If you have never used a camera that offers the zero raw file compression option then basically what you’ve never had you never miss.  But as a Nikon shooter I shoot uncompressed all the time – 90% of the time I don’t need to, but it just saves me having to remember something when I do need the option.

raw file compression

Would this 1DX shot be served any better through UNCOMPRESSED raw recording?  Most likely NO – why?  Low Dynamic Range caused in the main by flat low contrast lighting means no deep dark shadows and nothing approaching a highlight.

I don’t see it as a costly option in terms of buffer capacity or on-board storage, and when it comes to processing I would much rather have a surfeit of sensor data rather than a lack of it – no matter how small that deficit might be.

Lossless raw file compression has NO positive effect on your images, and its sole purpose in life is to allow you to fit more shots on the storage media – that’s it pure and simple.  If you have the option to shoot uncompressed then do so, and buy a bigger card!

What pisses me off about Canon is that it would only take, I’m sure, a firmware upgrade to give the 1DX et al the ability to record with zero raw file compression – and, whether needed or not, it would stop miserable grumpy gits like me banging on about it!

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.