Image Data Storage and Backup

It’s a sad fact of life but here’s the news:

Your computer's system disc is 'doomed to failure' at some point, simply because it has to work VERY HARD.

Desktop machines can be fitted with more than one internal drive, but laptops and 'all-in-ones' like the iMac can usually only take a single internal drive.

Non-system and external drives work A LOT LESS, and so make a far more secure home for your precious original camera files.

So when it comes to Image Data Storage and Backup, the first thing I would advise is that these external drives are where our image files MUST live.

Applications such as Lightroom ‘live’ on your system drive – they have to.

With regard to Lightroom, your next most valuable commodity is your CATALOGUE, because of all the processing & IPTC data it contains.

By default this is stored on your SYSTEM drive – usually in ‘PICTURES or My Pictures’ – and so from an Image Data Storage and Backup perspective it is actually in the worst place possible!

Your Lightroom catalogue comprises 3 files:

  1. The actual .lrcat file itself
  2. The catalogue previews.lrdata file
  3. The catalogue smart previews.lrdata file

Note: Lightroom will NOT function unless all 3 of these files are in the same place!

Personally I never use number 3 – smart previews – do I, Mr. Stott (private wise-crack)!  But the file still needs to be there for Lightroom to function.

The .lrcat file, considering the information it carries, isn’t really all that large.  But the accompanying previews.lrdata file can get MASSIVE!

You will see all sorts of crap talked about catalogue location on the internet – mainly about the speed benefits of locating it on your fastest drive.  The fastest drive on most machines is commonly the system drive – especially when computers have SSDs fitted.

But not only is this the most vulnerable location for the main .lrcat file, the system drive is now also open to being swamped by the size of the previews.lrdata file – and when applications don't have enough free disk space to function correctly, your overall system speed drops.

So catalogue location is always a trade-off in some way or other, but if you take my advice keep your catalogue (or catalogues) well away from your system drive.

Image Data Storage and Backup – Simple Image Data Storage and Backup Using Lightroom Import.

[Diagram: drives A (working storage) and B (backup) attached to the MacBook Pro]

In the simple setup above, I have two identical hard drives plugged into the MacBook Pro and I’m using drive A for working storage and drive B as a backup.  My Lightroom catalogue is on drive A.

This is my standard setup for workshops, overseas trips etc.

I use the Import Dialogue in Lightroom to import the images to the A drive, and thus into my catalogue.

I COULD also select the option (see below) to make simultaneous backups of the image files onto drive B:

[Screenshot: the Lightroom Import dialogue option for making a second copy to another drive]

However, this means that my A drive contains my catalogue, image previews, raw files, .xmp files and image adjustments, and any tiffs or jpegs I make from the raw files.  But the B drive just contains copies of the raw files and nothing else.

There is nothing wrong with working like this – after all the most important thing is to get those raw files backed up, and this method of working achieves that goal very well.

I could elect to back up the catalogue to the B drive too, but any tiff or jpeg files I made – though indexed in the backup catalogue – would only exist on the A drive unless I copied them over to the B drive manually.

Image Data Storage and Backup – “A Slightly Better Mouse Trap”?

The way I work is a bit different, and instead of duplicating the raw files to the B drive on import, I CLONE drive A to drive B.

Cloning the A drive to the B drive ensures that ALL the contents of the A drive – and I mean everything – are safely duplicated onto the backup B drive.

Because I work on the Mac system I use Carbon Copy Cloner, and most Windows users I know use Acronis True Image to achieve the same goal.

So, working with this simple system let’s go through the process:

1. Import images from our camera media to the Lightroom catalogue by putting the images on the A drive.

2. Work inside the catalogue – deleting non-keepers, adding metadata, processing etc.

3. When finished working, drive A contains all the files we need/want to keep so the two drives look like this:

[Screenshot: drive A containing all the data, drive B still empty]

As you can see, drive B is empty, and drive A contains all my data.

4. Start our cloning application – in this case Carbon Copy Cloner:

[Screenshot: Carbon Copy Cloner]

5. Set the source drive (the disk you want to clone) and the destination (where you want the clone to live) in the task pane and start the operation by clicking the ‘clone’ button.

[Screenshot: source and destination set in the Carbon Copy Cloner task pane]

6. While the clone task is running, notice the wording – 'Comparing files on source and destination' – this is interesting.  If we add more images and other data to the A drive later and run the task again to 'update the clone', CCC only copies files that are new or have changed.

This means that new images are added, as are the changes to the Lightroom catalogue and previews, but it does NOT re-write unchanged files.

So the ‘cloning task’ runs much faster when you re-clone!

7. When the clone task is complete you can see that the file structure of the two disks is identical:

[Screenshot: drives A and B showing identical file structures]

CCC creates (unless you tell it otherwise) a ‘Safety Net’ folder.

Say I’m working in the Lightroom catalogue and find a crappy image that I missed on my original ‘cull’.  I can tell Lightroom to delete it from the disk and thus from the catalogue.  But it will still be on the clone image/B drive, and registered within the clone of the catalogue.

So when I run the clone task again the catalogue clone is modified to register the removal of the image, but the actual image file is not deleted from the B drive – instead it gets moved to the ‘Safety Net’ folder (just in case I’ve been an idiot!).

Put simply, I can delete files (or even folders) from the A drive, but they will not be deleted from the B drive but moved to the ‘Safety Net’ folder on the next clone operation.
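
If it helps to see the logic spelled out, here is a minimal Python sketch of an 'incremental clone with a safety net'.  It is NOT how Carbon Copy Cloner or Acronis actually work internally (neither publishes that), and the volume paths are hypothetical – it's purely there to illustrate the idea of "copy what's new or changed, and park deletions rather than destroy them".

```python
import shutil
from pathlib import Path

def clone_with_safety_net(source: Path, dest: Path, safety_net: Path) -> None:
    """Toy incremental clone: copy new/changed files from source to dest,
    and move files that have vanished from source into a safety-net folder
    on dest instead of deleting them."""
    # 1. Copy anything that is new or has changed (crude size + mtime check).
    for src_file in source.rglob("*"):
        if src_file.is_dir():
            continue
        rel = src_file.relative_to(source)
        dst_file = dest / rel
        if (not dst_file.exists()
                or src_file.stat().st_size != dst_file.stat().st_size
                or src_file.stat().st_mtime > dst_file.stat().st_mtime):
            dst_file.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, dst_file)    # unchanged files are skipped

    # 2. Anything on the destination that no longer exists on the source is
    #    moved into the safety-net folder rather than being deleted.
    for dst_file in dest.rglob("*"):
        if dst_file.is_dir() or safety_net in dst_file.parents:
            continue
        rel = dst_file.relative_to(dest)
        if not (source / rel).exists():
            parked = safety_net / rel
            parked.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(dst_file), str(parked))

# Hypothetical usage:
# clone_with_safety_net(Path("/Volumes/DriveA"), Path("/Volumes/DriveB"),
#                       Path("/Volumes/DriveB/_Safety Net"))
```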

Good eh?

Image Data Storage and Backup – Hard Drives and Connections

A vexatious subject for sure, and it all comes down to money!

With the laptop setup at the start of this article, the two hard disks I'm using are 500GB G-Drives in Thunderbolt carts.

These are fast drives on very fast connections, and I do not notice any speed reduction in catalogue functionality.

I can also swap the drives from Thunderbolt to USB 3.0 carts and plug them directly into the USB 3.0 ports on my Mac Pro (which is basically a modified 2009 machine which won’t accept Thunderbolt bus connections).

For me, USB 3.0 bus speeds are plenty fast enough too, so plugging that A drive into my Mac Pro and opening the Lightroom catalogue I created in Norway on a MacBook Pro gives me pretty much the same performance in my office as I had on location.

I can now import the catalogue and files onto my main office machine, and either keep it as a stand-alone catalogue, or merge it with an existing one:

[Screenshot: importing the travel catalogue into an existing catalogue in Lightroom]

Image Data Storage and Backup – To Raid or not to Raid?

Desktop computers have space in them for the addition of more hard drives – I have 6 in mine:

  1. My system disc – a 500GB SSD
  2. 3TB storage
  3. 3TB storage
  4. 1TB partitioned into two 500GB volumes – one for Photoshop scratch purposes, and one as a bootable system disc backup made with CCC.

Discs 5 & 6 are a RAID 0 pair of 4TB WD Blacks, so they basically behave like an 8TB disc.

So what is RAID, and why use it?

Redundant Array of Independent Disks – RAID – in its simplest everyday form comes in two flavours – RAID 0 and RAID 1.

If we take two identical drives we can use our operating system drive management and built-in RAID controllers to configure the two drives to work simultaneously in either a 0 or 1 configuration.

RAID 1, from an Image Data Storage and Backup perspective, is a Godsend because as you write a file to one drive it is mirrored onto the other drive in the RAID pair – in other words INSTANT backup without you having to lift so much as a finger!

RAID 0, from an Image Data Storage and Backup perspective is quite the opposite.  As you write a file to a RAID 0 pair that file is broken into blocks, and half the blocks are written to one drive, the other half to the other drive.

Advantage of RAID 0 – both drives are written at the same time, so the write speed is in effect twice as fast as RAID 1.

Disadvantage of RAID 0 – lose 1 of the two drives and ALL data is lost – ooops!
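
If the difference is easier to see written down than described, here's a toy Python sketch of how a write is handled in each configuration.  Real RAID controllers of course do this at block level in hardware or driver code – this is purely an illustration of the principle.

```python
def raid1_write(block: bytes, drive_a: list, drive_b: list) -> None:
    # RAID 1 (mirror): the same block is written to BOTH drives.
    # Instant backup, but you only get the capacity of one drive.
    drive_a.append(block)
    drive_b.append(block)

def raid0_write(blocks: list, drive_a: list, drive_b: list) -> None:
    # RAID 0 (stripe): alternate blocks go to alternate drives.
    # Roughly double the speed and double the capacity, but lose either
    # drive and the file can never be reassembled.
    for i, block in enumerate(blocks):
        (drive_a if i % 2 == 0 else drive_b).append(block)
```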

The speed gain is why I use this 8TB RAID 0 two drive array, but only because I have it  mirrored onto an external 8TB G-Drive 2-bay unit configured in RAID 0 as a permanent backup.

See – I told you money was involved!  Don’t waste your cash on new cameras – spend it all on quality glass and hard drives.  You can never have enough of the latter, and the former is what improves the quality of your images.

And that’s not all – I have two 2TB Seagate expansion drives sitting on top of my desktop machine – both on USB 3.0, one operates as my Mac Time Machine backup drive, and the other is the capture, edit and project storage drive for Camtasia which I use for all my training and YouTube videos.

Now obviously you won’t need anything like the volume of storage I have, and I would certainly not recommend you use RAID 0 pairs of drives either.

But for effective image data storage and backup it is both safer and kinder to your overall system speed to keep both your images and your Lightroom catalogue on a drive or drives that are separate from the drive your applications run on.

Footnote: You’ll come across people on the internet/YouTube who will lead you to believe that they back their images up to ‘the Cloud’.

Consider someone who comes back from Norway after a week on the eagles with 'yours truly'.  They have been shooting, culling and editing over there, and they might still come back with 4000 raw files.  How do you back those up to the cloud?

Let's say we cull 90% of those raws and end up with 400.  Now take the best 25% of those and create archival tiff files for print – that's over 6GB of files from your average FX camera.  Let's also take 50% of those 400 shots and make some stock submission full resolution jpegs that may average 5MB each.

So from just that 5 days of shooting we end up with 14GB of raw files, 6.5GB of archive images and over 1GB of jpegs – 21.5GB in all for 5 days shooting.

How long is it going to take to upload 20GB+ over your average internet connection?
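
To put a rough number on that, here's a quick back-of-an-envelope sum in Python.  The 2 Mbit/s upload figure is just my assumption for a typical domestic connection – plug in your own.

```python
def upload_hours(gigabytes: float, upload_mbit_per_s: float) -> float:
    """Rough upload time, ignoring overheads, throttling and drop-outs."""
    megabits = gigabytes * 1000 * 8          # GB -> megabits (decimal units)
    return megabits / upload_mbit_per_s / 3600

print(upload_hours(21.5, 2))    # ~24 hours on a 2 Mbit/s uplink
print(upload_hours(21.5, 10))   # still ~4.8 hours at 10 Mbit/s
```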

If you are a shooter of any kind of volume then Cloud backup is not a viable option in my opinion, because it’s your raw and archive images that are critical, not the jpegs!

Latest YouTube Video:

In case you missed it I uploaded a video to my YouTube channel the other day, processing a low key Barn Owl image in Lightroom and Photoshop and using the Lumenzia plugin to do some very simple but effective localised adjustments.

Overall, the image would be 100% IMPOSSIBLE to create in just Lightroom.

If you found this free content useful and would like to see more plus extras not viewable to the general public then please consider joining my existing patrons over on my Patreon page by clicking the link below.

Many thanks!

Image Sharpening

Image Sharpening and Raw Conversion.

A lot of people imagine that there is some sort of ‘magic bullet’ method for sharpening images.

Well, here’s the bad news – there isn’t !

Even if you shoot the same camera and lens combo at the same settings all the time, your images will exhibit an array of varying properties.

And those properties – and the ratio/mix thereof – can, and will, affect the efficacy of various sharpening methods and techniques.

And those properties will rarely be the same from shoot to shoot.

Add interchangeable lenses, varied lighting conditions, and assorted scene brightness and contrast ranges to the mix – now the range of image properties has increased exponentially.

What are the properties of an image that can determine your approach to sharpening?

I’m not even going to attempt to list them all here, because that would be truly frightening for you.

But sharpening is all about pixels, edges and contrast.  And our first ‘port of call’ with regard to all three of those items is ‘demosaicing’ and raw file conversion.

"But Andy, surely the first item should be the lens" I hear you say.

No, it isn’t.

And if that were the case, then we would go one step further than that, and say that it's the operator's ability to focus the lens!

So we will take it as a given, that the lens is sharp, and the operator isn’t quite so daft as they look!

Now we have a raw file, taken with a sharp lens and focused to perfection.

Let’s hand that file to two raw converters, Lightroom and Raw Therapee:

I am Lightroom – Click me!

I am Raw Therapee – Click me!

In both raw converters there is ZERO SHARPENING being applied. (and yes, I know the horizon is ‘wonky’!).

Now check out the 800% magnification shots:

Lightroom at 800% – Click me!

Raw Therapee at 800% – Click me!

What do we see on the Lightroom shot at 800%?

A sharpening halo, but hang on, there is NO sharpening being applied.

But in Raw Therapee there is NO halo.

The halo in Lightroom is not a sharpening halo, but a demosaicing artifact that LOOKS like a sharpening halo.

It is a direct result of the demosaicing algorithm that Lightroom uses.

Raw Therapee, on the other hand, has a selection of demosaicing algorithms to choose from.  In this instance it's using its default AMaZE (Aliasing Minimization and Zipper Elimination) algorithm.  All told, there are 10 different demosaic options in RT, though some of them are a bit 'old hat' now.

There is no way of altering the base demosaic in Lightroom – it is something of a fixed quantity.  And while it works in an acceptable manner for the majority of shots from an ever burgeoning mass of digital camera sensors, there will ALWAYS be exceptions.

Let's call a spade a bloody shovel and be honest – Lightroom's demosaicing algorithm is in need of an overhaul.  And why something we have to pay for uses a methodology worse than something we get for free, God only knows.

It’s a common problem in Lightroom, and it’s the single biggest reason why, for example, landscape exposure blends using luminosity masks fail to work quite as smoothly as you see demonstrated on the old Tube of You.

If truth be told – and this is only my opinion – Lightroom is by no means the best raw file processor in existence today.

I say that with a degree of reservation though, because:

  1. It’s very user friendly
  2. It’s an excellent DAM (digital asset management) tool, possibly the best.
  3. On the surface, it only shows its problems with very high contrast edges.

As a side note, my Top 4 raw converters/processors are:

  1. Iridient Developer
  2. Raw Therapee
  3. Capture One Pro
  4. Lightroom

Iridient is expensive and complex – but if you shoot Fuji X-Trans you are crazy if you don’t use it.

Raw Therapee is very complex (and slightly ‘clunky’ on Mac OSX) but it is very good once you know your way around it. And it’s FREEEEEEEEE!!!!!!!

Iridient and RT have zero DAM capability that’s worth talking about.

Capture One Pro is a better raw converter on the whole than Lightroom, but it’s more complex, and its DAM structure looks like it was created by crack-smoking monkeys when you compare it to the effective simplicity of Lightroom.

If we look at Lightroom as a raw processor (as opposed to raw converter) it encourages the user to employ ‘recovery’ in shadow and highlight areas.

Using BOTH can cause halos along high contrast edges, and edges where high frequency detail sits next to very low frequency detail of a contrasting colour – birds in flight against a blue sky spring to mind.

Why do I keep ‘banging on’ about edges?

Because edges are critical – and most of you guys ‘n gals hardly ever look at them close up.

All images contain areas of high and low frequency detail, and these areas require different process treatments, if you want to obtain the very best results AND want to preserve the ability to print.

Cleanly defined edges between these areas allow us to use layer masks to separate them in an image, and so obtain that selective control.

Clean inter-tonal boundaries also allow us to separate shadows, various mid tone ranges, and highlights for yet more finite control.

Working on 16 bit images (well, 15 bit plus 1 level if truth be told) means we can control our adjustments in Photoshop within a range of 32,768 tones.  And there is no way in hell that localised adjustments in Lightroom can be carried out to that degree of accuracy – fact.

I'll let you in on a secret here!  You all watch the wrong stuff on YouTube!  You sit and watch a video by God knows what idiot, and then wonder why what you've just seen them do does NOT work for you.

That's because you've not noticed one small detail – 95% of the time they are working on jpegs!  And jpegs only have a tonal range of 256 levels.  It's really easy to make luminosity selections and the like work flawlessly on such a small tonal range.  Try the same settings on a 16 bit image and they don't work.
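
To see why the numbers don't transfer, here's a minimal sketch – the 200 threshold is just an arbitrary example value, not a recommendation:

```python
# Photoshop's '16 bit' mode is really 15 bits + 1 level: values 0..32768.
LEVELS_8BIT = 2 ** 8             # 256 tonal levels in a jpeg
LEVELS_16BIT = 2 ** 15 + 1       # 32769 levels in a Photoshop 16-bit file

# A luminosity threshold that looks right on a jpeg...
threshold_8bit = 200
# ...sits at a completely different number on the 16-bit scale:
threshold_16bit = round(threshold_8bit / 255 * 32768)    # ~25700

print(LEVELS_8BIT, LEVELS_16BIT, threshold_16bit)
```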

So you end up thinking it’s your fault – your image isn’t as ‘perfect’ as theirs – wrong!

It’s a tale I hear hundreds of times every year when I have folk on workshops and 1to1 tuition days.  And without fail, they all wish they’d paid for the training instead of trying to follow the free stuff.

You NEVER see me on a video working with anything but raw files and full resolution 16 bit images.

My only problem is that I don’t ‘fit into’ today’s modern ‘cult of personality’!

Most adjustments in Lightroom have a global effect.  Yes, we have range masks and eraser brushes.  But they are very poor relations of the pixel-precise control you can have in Photoshop.

Lightroom is – in my opinion of course – becoming polluted by the ‘one stop shop, instant gratification ideology’ that seems to pervade photography today.

Someone said to me the other day that I had not done a YouTube video on the new range masking option in Lightroom.  And they are quite correct.

Why?

Because it's a gimmick – and a real crappy one at that, when compared to what you can do in Photoshop.

Photoshop is the KING of image manipulation and processing.  And that is a hard core, irrefutable fact.  It has NO equal.

But Photoshop is a raster image editor, which means it needs to be fed a diet of real pixels.  Raw converters like Lightroom use ‘virtual pixels’ – in a manner of speaking.

And of course, Lightroom and the Camera Raw plug-in for Photoshop amount to the same thing.  So folk who use either Lightroom or Photoshop EXCLUSIVELY are both suffering from the same problems – if they can be bothered to look for them.

It Depends on the Shot

[Images: three example shots – a landscape, a portrait, and a Red Squirrel]

The landscape image is, by its very nature, a low ISO, high resolution shot with huge depth of field, and bags of high frequency inter-tonal detail that needs sharpening correctly to its very maximum.  We don't want to sharpen the sky, as it's sharp enough through depth of field, as is the water; and we require ZERO sharpening artifacts and no noise amplification.

If we utilise the same sharpening workflow on the center image, then we’ll all get our heads kicked in!  No woman likes to see their skin texture sharpened – in point of fact we have to make it even more unsharp, smooth and diffuse in order to avoid a trip to our local A&E department.

The cheeky Red Squirrel requires a different approach again.  For starters, it’s been taken on a conventional ‘wildlife camera’ – a Nikon D4.  This camera sensor has a much lower resolution than either of the camera sensors used for the previous two shots.

It is also shot from a greater distance than the foreground subjects in either of the preceding images.  And most importantly, it’s at a far higher ISO value, so it has more noise in it.

All three images require SELECTIVE sharpening.  But most photographers think that global sharpening is a good idea, or at least something they can ‘get away with’.

If you are a photographer who wants to do nothing else but post to Facebook and Flickr then you might as well stop reading this post.  Good luck to you and enjoy your photography,  but everything you read in this post, or anywhere on this blog, is not for you.

But if you want to maximize the potential of your thousands of pounds worth of camera gear, and print or sell your images, then I hate to tell you, but you are going to have to LEARN STUFF.

Photoshop is where the magic happens.

As I said earlier, Photoshop is a raster image processor.  As such, it needs to be fed an original image that is of THE UTMOST QUALITY.  By this I mean a starting raw file that has been demosaiced and normalized to:

  1. Contain ZERO demosaic artifacts of any kind.
  2. Have the correct white and black points – ZERO blown highlights or blocked shadows; in other words, get contrast under control.
  3. Maximize the midtones to tease out the highest amount of those inter-tonal details, because this is where your sharpening is going to take place.
  4. Contain no more sharpening than you can get away with, and certainly NOT the amount of sharpening you require in the finished image.

With points 1 thru 3 the benefits should be fairly obvious to you, but if you think about it for a second, the image described is rather 'flattish-looking'.

But point 4 is somewhat ambiguous.  What Adobe-philes like to call capture or input sharpening is very dependent on three variables:

  1. Sensor megapixels
  2. Demosaic efficiency
  3. Sharpening method – namely Unsharp Mask or Deconvolution

The three are inextricably intertwined – so basically it’s a balancing act.
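
As a concrete reference point, this is the bare-bones essence of an unsharp mask in Python/numpy – the radius and amount values are purely illustrative, and deconvolution is a different, iterative animal altogether that I won't try to sketch here:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, radius: float = 1.0, amount: float = 0.6) -> np.ndarray:
    """Classic USM on a single-channel float image in the 0..1 range:
    add back a scaled copy of the detail that a gaussian blur removed."""
    blurred = gaussian_filter(img, sigma=radius)
    detail = img - blurred                    # the high-frequency 'edge' component
    return np.clip(img + amount * detail, 0.0, 1.0)
```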

To learn this requires practice!

And to that end I’m embarking on the production of a set of videos that will help you get to grips with the variety of sharpening techniques that I use, and why I use them.

I’ll give you fair warning now – when finished it will be neither CHEAP nor SHORT, but it will be very instructive!

I want to get it to you as soon as possible, but you wouldn’t believe how long tuition videos take to produce.  So right now I’m going to say it should be ready at the end of February or early March.

UPDATE:  The new course is ready and on sale now, over on my digital download site.

The link to the course page is HERE.

Hopefully I’ve given you a few things to think about in this post.

Don’t forget, I provide 1to1 and group tuition days in this and all things photography related.

And just in case you’ve missed it, here’s a demo of how useful Photoshop Smart Sharpen can be:

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Color Temperature

Lightroom Color Temperature (or Colour Temperature if you spell correctly!)

"Andy – why the heck is Lightroom's temperature slider the wrong way around?"

That’s a question that I used to get asked quite a lot, and it’s started again since I mentioned it in passing a couple of posts ago.

The short answer is "IT ISN'T… it's just you who doesn't understand what it is and how it functions".

But in order to give the definitive answer I feel the need to get back to basics – so here goes.

The Spectrum Locus

Let’s get one thing straight from the start – LOCUS is just a posh word for PATH!

Visible light is just part of the electro-magnetic energy spectrum typically between 380nm (nanometers) and 700nm:

[Image: the visible spectrum from 380nm to 700nm]

The first image below shows what's known as the Spectrum Locus – as defined by the CIE (Commission Internationale de l'Éclairage, or International Commission on Illumination).

In a nutshell the locus represents the range of colors visible to the human eye – or I should say chromaticities:

[Image: the CIE 1931 Spectrum Locus]

The blue numbers around the locus are simply the nanometer values from that same horizontal scale above.  The reasoning behind the unit values of the x and y axes is complex and irrelevant to us in this post, otherwise it'll go on for ages.

The human eye is a fickle thing.

It will always perceive, say, 255 green as being lighter than 255 red or 255 blue, and 255 blue as being the darkest of the three.  And the same applies to any value of the three primaries, as long as all three are the same.

This stems from the fact that the human eye has around twice the response to green light as it does red or blue – crazy but true.  And that’s why your camera sensor – if it’s a Bayer type – has twice the number of green photosites on it as red or blue.

In rather over-simplified terms the CIE set a standard by which all colors in the visible spectrum could be expressed in terms of ‘chromaticity’ and ‘brightness’.

Brightness can be thought of as a grey ramp from black to white.

Any color space is a 3 dimensional shape with 3 axes x, y and z.

Z is the grey ramp from black to white, and the shape is then defined by the colour positions in terms of their chromaticity on the x and y axes, and their brightness on the z axis:

[Image: a 3D colour space – chromaticity on the x and y axes, brightness on the z axis]

But if we just take the chromaticity values of all the colours visible to the human eye we end up with the CIE1931 spectrum locus – a two dimensional plot if you like, of the ‘perceived’ color space of human vision.

Now here’s where the confusion begins for the majority of ‘uneducated photographers’ – and I mean that in the nicest possible way, it’s not a dig!

Below is the same spectrum locus with an addition:

[Image: the Spectrum Locus with the Planckian (TcK) Locus added]

This additional TcK curve is called the Planckian Locus, or black body locus.  Now please don't give up here folks – after all, you've got this far – but it'll get worse before it gets better!

The Planckian Locus simply represents the color temperature, in kelvin, of the colour emitted by a 'black body' – think lump of pure carbon – as it is heated.  Its color temperature begins to visibly rise as its thermal temperature rises.

Up to a certain thermal temperature it'll stay visibly black, then it will begin to glow a deep red.  Warm it up some more and that red turns to orange, then yellow, and finally it will be what we can call 'white hot'.

So the Planckian Locus is the 2D chromaticity plot of the colours emitted by a black body as it is heated.

Here’s point of confusion number 1: do NOT jump to the conclusion that this is in any way a greyscale. “Well it starts off BLACK and ends up WHITE” – I’ve come across dozens of folk who think that – as they say, a little knowledge is a dangerous thing indeed!

What the Planckian Locus IS indicative of though is WHITE POINT.

Our commonly used colour management white points of D65, D55 and D50 all lie along the Planckian Locus, as do all the other CIE standard illuminant types, of which there are more than a few.

The standard monitor calibration white point of D65 is actually 6500 Kelvin – it's a standardized classification for 'mean Noon Daylight', and can be found on the Spectrum Locus/Planckian Locus at 0.31271x, 0.32902y.

D55 or 5500 Kelvin is classed as Mid Morning/Mid Afternoon Daylight and can be found at 0.33242x, 0.34743y.

D50 or 5000 Kelvin is classed as Horizon Light, with co-ordinates of 0.34567x, 0.35850y.

But we can also equate Planckian Locus values to our ‘picture taking’ in the form of white balance.

FACT: The HIGHER the color temperature the BLUER the light, and lower color temperatures shift from blue to yellow, then orange (studio type L photofloods, 3200K), then more red (standard incandescent bulb, 2400K), down to candle flame at around 1850K.  Sunset and sunrise are typically standardized at 1850K, and LPS Sodium street lights can be as low as 1700K.

And a clear polar sky can be upwards of 27,000K – now there’s blue for you!

And here’s where we find confusion point number 2!

Take a look at this shot taken through a Lee Big Stopper:

[Image: a shot taken through a Lee Big Stopper, with the Lightroom Temp and Tint sliders reading 'As Shot']

I’m an idle git and always have my camera set to a white balance of Cloudy B1, and here I’m shooting through a filter that notoriously adds a pretty severe bluish cast to an image anyway.

If you look at the TEMP and TINT sliders you will see Cloudy B1 is interpreted by Lightroom as 5550 Kelvin and a tint of +5 – that’s why the notation is ‘AS SHOT’.

Officially a Cloudy white balance is anywhere between 6000 Kelvin and 10,000 kelvin depending on your definition, and I’ve stuck extra blue in there with the Cloudy B1 setting, which will make the effective temperature go up even higher.

So either way, you can see that Lightroom's idea of 5550 Kelvin is somewhat 'OFF' to say the least, but it's irrelevant at this juncture.

Where the real confusion sets in is shown in the image below:

[Image: the same shot 'de-blued' – the Temp slider now reads 8387 Kelvin]

"Andy, now you've de-blued the shot, why is the TEMP slider value saying 8387 Kelvin?  Surely it should be showing a value LOWER than 5550K – after all, tungsten is warm and 3200K…"

How right you are… and wrong at the same time!

What Lightroom is saying is that I've added YELLOW to the tune of 8387 minus 5550, or 2837 Kelvin.

FACT – the color temperature controls in Lightroom DO NOT work by adjusting the Planckian or black body temperature of light in our image.  They are used to COMPENSATE for the recorded Planckian/black body temperature.
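
If it helps, think of the slider value as 'the colour temperature Lightroom now assumes the light was', and read the compensation as the difference from the as-shot value.  A trivial sketch of that arithmetic, using the numbers from the screenshots above:

```python
def wb_compensation(slider_kelvin: int, as_shot_kelvin: int) -> str:
    """Positive delta = yellow/warming added, negative = blue/cooling added."""
    delta = slider_kelvin - as_shot_kelvin
    direction = "yellow (warming)" if delta >= 0 else "blue (cooling)"
    return f"{abs(delta)}K of {direction} compensation"

print(wb_compensation(8387, 5550))   # 2837K of yellow (warming) compensation
```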

If you load an image into the Develop module of Lightroom and use any of the preset values, the value itself is ballpark correct(ish).

The Daylight preset loads values of 5500K and +10. The Shade preset will jump to 7500K and +10, and Tungsten will drop to 2850K and +/-0.

But the Tungsten preset puts the TEMP slider in the BLUE part of the slider Blue/Yellow graduated scale, and the Shade preset puts the slider in the YELLOW side of the scale, thus leading millions of people into mistakenly thinking that 7500K is warmer/yellower than 2850K when it most definitely is NOT!

This kind of self-induced bad learning leaves people wide open to all sorts of misunderstandings when it comes to other aspects of color theory and color management.

My advice has always been the same, just ignore the numbers in Lightroom and do your adjustments subjectively – do what looks right!

But for heaven's sake don't try to build an understanding of color temperature based on the color balance control values in Lightroom – otherwise you'll get in one heck of a mess.

 


Monitor Calibration Update

Okay, so I no longer NEED a new monitor, because I've got one – and my wallet is in Leighton Hospital Intensive Care Unit on the critical list.

What have you gone for Andy?  Well if you remember, in my last post I was undecided between 24″ and 27″, Eizo or BenQ.  But I was favoring the Eizo CS2420, on the grounds of cost, both in terms of monitor and calibration tool options.

But I got offered a sweet deal on a factory-fresh Eizo CS270 by John Willis at Calumet – so I got my desire for more screen real-estate fulfilled, while keeping the costs down by not having to buy a new calibrator.

[Image: the Eizo CS270]

But it still hurt to pay for it!

Monitor Calibration

There are a few things to consider when it comes to monitor calibration, and they are mainly due to the physical attributes of the monitor itself.

In my previous post I did mention one of them – the most important one – the back light type.

CCFL and WCCFL – cold cathode fluorescent lamps, or LED.

CCFL & WCCFL (wide CCFL) used to be the common type of back light, but they are now less common, being replaced by LED for added colour reproduction, improved signal response time and reduced power consumption.  Wide CCFL gave a noticeably greater colour reproduction range and slightly warmer colour temperature than CCFL – and my old monitor was fitted with WCCFL back lighting, hence I used to be able to do my monitor calibration to near 98% of AdobeRGB.

CCFL back lights have one major property – that of being ‘cool’ in colour, and LEDs commonly exhibit a slightly ‘warmer’ colour temperature.

But there’s LEDs – and there’s LEDs, and some are cooler than others, some are of fixed output and others are of a variable output.

The colour temperature of the backlighting gives the monitor a ‘native white point’.

The 'brightness' of the backlight is really the only true variable on a standard type of LCD display, and the inter-relationship between backlight brightness and colour temperature, and the size of the monitor's CLUT (colour look-up table), can have a massive effect on the total number of colours that the monitor can display.

Industry-standard documentation by folk a lot cleverer than me has for years recommended the same calibration target settings as I have alluded to in previous blog posts:

White Point: D65 or 6500K

Brightness: 120 cd/m² (candelas per square metre)

Gamma: 2.2

The ubiquitous ColorMunki Photo ‘standard monitor calibration’ method setup screen.

This setup for ‘standard monitor calibration’ works extremely well, and has stood me in good stead for more years than I care to add up.

As I mentioned in my previous post, standard monitor calibration refers to a standard method of calibration, which can be thought of as ‘software calibration’, and I have done many print workshops where I have used this method to calibrate Eizo ColorEdge and NEC Spectraviews with great effect.

However, these more specialised colour management monitors have the added bonus of giving you a 'hardware monitor calibration' option.

To carry out a hardware monitor calibration on my new CS270 ColorEdge – or indeed any ColorEdge – we need to employ the Eizo ColorNavigator.

The start screen for ColorNavigator shows us some interesting items:

[Screenshot: the ColorNavigator start screen]

The recommended brightness value is 100 cd/m² – not 120.

The recommended white point is D55 not D65.

Thank God the gamma value is the same!

Once the monitor calibration profile has been done we get a result screen of the physical profile:

[Screenshot: the ColorNavigator calibration result]

Now before anyone gets their knickers in a knot over the brightness value discrepancy, there's a couple of things to bear in mind:

  1. This value is always slightly arbitrary and very much dependent on working/viewing conditions.  The working environment should be somewhere between 32 and 64 lux (or cd/m²) ambient – think Bat Cave!  The ratio of ambient to monitor output should always remain at between 32:75/80 and 64:120/140 (ish) – in other words between 1:2 and 1:3 – see earlier post here.
  2. The difference between 100 and 120 cd/m² is only about a quarter of a stop in camera Ev terms – so not a lot (the quick sum below shows it).
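
For anyone who wants to check that claim, the stop (Ev) difference between two luminance values is just a base-2 log of their ratio – a one-liner in Python:

```python
import math

def stops(cd_a: float, cd_b: float) -> float:
    """Difference between two luminance values expressed in photographic stops."""
    return math.log2(cd_b / cd_a)

print(round(stops(100, 120), 2))   # ~0.26 of a stop - about a quarter of a stop
```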

What struck me as odd though was the white point setting of D55 or 5500K – that's 1000K warmer than I'm used to (yes – warmer – don't let that temp slider in Lightroom cloud your thinking!).

After all, 1000K is a noticeable variation – unlike the 20 cd/m² brightness shift.

Here's the funny thing though; if I 'software calibrate' the CS270 using the ColorMunki software with the spectro plugged into the Mac instead of the monitor, I visually get the same result using D65/120 cd/m² as I do 'hardware calibrating' at D55 and 100 cd/m².

The same that is, until I look at the colour spaces of the two generated ICC profiles:

The coloured section is the ‘software calibration’ colour space, and the wire frame the ‘hardware calibrated’ Eizo custom space – click the image to view larger in a separate window.

The hardware calibration profile is somewhat larger and has a slightly better black point performance – this will allow the viewer to SEE just that little bit more tonality in the deepest of shadows, and those perennially awkward colours that sit in the Blue, Cyan, Green region.

It’s therefore quite obvious that monitor calibration via the hardware/ColorNavigator method on Eizo monitors does buy you that extra bit of visual acuity, so if you own an Eizo ColorEdge then it is the way to go for sure.

Having said that, the differences are small-ish so it’s not really worth getting terrifically evangelical over it.

But if you have the monitor then you should have the calibrator, and if said calibrator is ‘on the list’ of those supported by ColorNavigator then it’s a bit of a JDI – just do it.

You can find the list of supported calibrators here.

Eizo and their ColorNavigator are basically making a very effective ‘mash up’ of the two ISO standards 3664 and 12646 which call for D65 and D50 white points respectively.

Why did I go CHEAP?

Well, cheaper…

Apart from the fact that I don’t like spending money – the stuff is so bloody hard to come by – I didn’t want the top end Eizo in either 27″ or 24″.

With the ‘top end’ ColorEdge monitors you are paying for some things that I at least, have little or no use for:

  • 3D CLUT – I’m a general sort of image maker who gets a bit ‘creative’ with my processing and printing.  If I was into graphics and accurate repro of Pantone and the like, or I specialised in archival work for the V & A say, then super-accurate colour reproduction would be critical.  The advantage of the 3D CLUT is that it allows a greater variety of SUBTLY different tones and hues to be SEEN and therefore it’s easier to VISUALLY check that they are maintained when shifting an image from one colour space to another – eg softproofing for print.  I’m a wildlife and landscape photographer – I don’t NEED that facility because I don’t work in a world that requires a stringent 100% colour accuracy.
  • Built-in Calibrator – I don’t need one ‘cos I’ve already got one!
  • Built-in Self-Correction Sensor – I don’t need one of those either!

So if your photography work is like mine, then it’s worth hunting out a ‘zero hours’ CS270 if you fancy the extra screen real-estate, and you want to spend less than if buying its replacement – the CS2730.  You won’t notice the extra 5 milliseconds slower response time, and the new CS2730 eats more power – but you do get a built-in carrying handle!


Lightroom Folders Panel

I find a lot of people are either confused or just plain unsure of what functions you can carry out in the Lightroom folders panel.

All Lightroom users should be familiar with the perennial problem of missing files, folders or drives from their Lightroom Catalogue, as indicated by the “!” exclamation mark in the top right corner of their image thumbnails.

Adobe really hack me off – for ages Lightroom indicated "missing" files with a question mark on the thumbnail.  But in their infinite wisdom they changed that to an exclamation mark (of course, still keeping the question mark in the Folders Panel!) just to confuse the bejesus out of everyone.  So now we have TWO TYPES of exclamation mark, each with a different meaning – nice one chaps…

With the exception of disconnected drives, missing files and folders are usually the result of moving files and folders via Windows Explorer on PC or Finder on the Mac.

And I find that in the majority of cases folk are just simply unaware that the same operations can be carried out within Lightroom via the Lightroom Folders Panel.

  • We can move files between folders.
  • We can move folders between drives.
  • And we can create new hierarchical folder structures on any attached drive.

All with the added bonuses of:

  1. Not leaving the Lightroom GUI.
  2. Lightroom does NOT lose the file/folder locations, so we avoid the dreaded "!" problem!

So I have created a couple of video lessons on YouTube:

If you are viewing this post via subscription email then please view the physical blog post – sometimes the video links do not show up in the emails.

Hopefully these two short lessons will enable you to understand the folder structure and placement options available to you via the Lightroom Folders Panel.


Lightroom – Neutralise Hidden Exposure Compensation

We all know how good Lightroom is – but it’s also a total pain in the arse!

Ages ago, I did a post about Lightroom 5 and accurate colour HERE and, according to this blog's page-view stats, that post still gets a large global viewing figure every month – so it's something of an on-going problem for a lot of users.

But things have moved on a bit since then, and we are now working with the v5 release of Lightroom 6/CC 2015 – and things haven’t got any better, sadly, from the perspective of actually “seeing what you captured”.

The problem lies in the fact that Lightroom long ago ceased to be a "neutral" RAW handler.  It uses a variety of 'behind the scenes' algorithms to add what it thinks are good adjustments in terms of exposure brightness and contrast.  In other words Lightroom adds hidden adjustments which we cannot see, because they are not registered on the adjustment sliders under process version 2012.

Why does it do this – God only knows!

But when I take into account the support Lightroom currently offers for mobile phone cameras, cloud synch etc, I can’t help thinking that Adobe are trying to give Lightroom some sort of mass-market appeal by adding what the designers and coders think is some sort of WOW-factor to image previews – though I might be wrong!

But whatever Adobe's reasoning, the fact remains that SOME OF US want to see our raw files for what they are – straight gamma 2.2 encoded versions of what the sensor recorded.  Only by learning how to Neutralise Hidden Exposure Compensation can we actually arrive at a suitable starting point for the development process.

The Case To Answer

Firstly, a lot of you might be wondering WTF I’m ranting on about – your RAW image previews look great before you start doing anything to them – mmmmm….

If that's the case then NEWS FLASH – RAW files should look as flat as dish-water pre-process, and you have to do some work to make them look good.  So believe me, if your raws look "nice 'n punchy" from the get-go then something is wrong somewhere!

Out there in photography land there are two RAW file handlers that are notorious for being “neutral” in their initial raw render – Raw Digger, and Iridient Developer.

Let me demonstrate the “case to answer” by using the same image I used the other day when giving Canon an indirect slagging off over lossless compression:

Raw file opened in Lightroom with no user adjustments BUT WITH Lightroom ‘hidden exposure compensation’.

Now let’s open the same file in Raw Digger:

Raw file opened in Raw Digger with no user adjustments.

And now in Iridient Developer:

Raw file opened in Iridient Developer with no user adjustments.

And now, just for good measure, my Lightroom-processed version of the image:

Raw file processed in Lightroom WITH user adjustments.

Both RAW Digger and Iridient Developer give the user a much better processing start point simply because they are neutral and don't go about making contrast-loaded 'background adjustments'.  And I'm sure you can see that the final Lightroom processed version of the image bears more resemblance to the RAW Digger and Iridient screen grabs than the Lightroom 'as is' preview.

Now if you are a total maniac then you can go and download either of the two aforementioned raw developers and get yourself super-confused or you can learn how to ‘neutralise’ the Lightroom background adjustment ‘crap’ – which is far easier!

How to Neutralise Hidden Exposure Compensation in Lightroom.

Step 1.  Scroll down to the Camera Calibration Panel in the Develop module and switch the Process Version from PV2012 to PV 2010:

Step 1 in Neutralising Lightroom Hidden Exposure Compensation.

Step 2.  Scroll up to the Basics panel (a very different looking one if you never used Lightroom 3!) and make the following changes:

  1. Blacks from 5 to 0
  2. Brightness from +50 to 0
  3. Contrast from +25 to 0

Step 2 in Neutralising Lightroom Hidden Exposure Compensation.

Step 3.  Move to the Tone Curve and change the Medium Contrast tone curve to Linear:

Step 3 in Neutralising Lightroom Hidden Exposure Compensation.

DO NOT concern yourself with the fact that your image has gone dark and flat, it’s to be expected!

Step 4.  Scroll back down to Camera Calibration and switch the process version BACK to PV2012, then scroll back up to the Basics Panel:

Step 4 in Neutralising Lightroom Hidden Exposure Compensation.

Step 5.  Yes I know it still looks awful, but if you now change that -1EV to 0 on the exposure slider you’ll get a great process start image:

Step 5 in Neutralising Lightroom Hidden Exposure Compensation.

Looking at the before and after images you can see that we have got contrast under control – in other words we have removed the excess contrast added to the image with the  Lightroom hidden background shenanigans.

Indeed, we can see exactly how much contrast has been removed with this ‘by the numbers’ process by looking at the -33 Contrast value – DO NOT RESET THIS BACK TO 0!!!!

The process has decreased contrast still further by lifting the Blacks value to +25.  You need to check the shadow areas on the image in this respect.  If they are looking a bit noisy (Hello Canon!) you might want to drop the blacks value to maybe +5 to +10 and open the shadows a bit more with a small positive adjustment to the Shadows slider in the basics panel.

And so processing is just a matter of a few subjective tweaks until I’m happy with the image:

Click to view larger image.

In the Tone Curve panel you can see the multi-point Custom Curve the process has added.  If you click the up/down arrows to the right of the word Custom you will see a menu giving you the option to save the curve:

Saving the custom curve.

I save the curve with the name 2010to2012 – by default it saves as an .xmp file, and to the user/Library/Application Support/Adobe/CameraRaw/Curves file path (Mac).

Saving the curve is useful as it makes for a very quick adjustment of further images.

However, there is a caveat (isn’t there always!) and it’s this:

The majority of adjustments in Lightroom are specific to camera sensor and ISO.  In simple terms, the same numeric value of adjustment to any control slider can have differing effects depending on the sensor that made the image and the ISO at which it was shot.  It's very important that you wrap your head around this fact.

The curve I’ve produced here is correct for a Canon 1DX at the shot ISO which was 1000 or 800 if my memory serves correctly.  I could apply this curve to a 100 ISO image shot with a Nikon D800E, and it would do a good job, but I might get a slightly better result if I go through the whole process again to produce a custom curve for the D800E using a 100 ISO shot to begin with.  But even if that new curve visually gives a different result it will still have the same numeric values in the basics panel!

If I save the curve and then apply it to another image via the Tone Curve panel the contrast and blacks Basic Panel values do NOT change – but you will get a better distribution of contrast.

You may want to generate and save at least a low and high ISO variant of the curve for each of your camera bodies; or you could be a smart-arse like me by just using one curve and eye-balling the finer tweaks.

You can also create the curve and then save the settings as a User Develop Preset and then apply it to future imports via the import module.

So there you have it – how to Neutralise Hidden Exposure Compensation in Lightroom and see your images properly – have fun folks!


Raw File Compression

Today I’m going to give you my point of view over that most vexatious question – is LOSSLESS raw file compression TRULY lossless?

I’m going to upset one heck of a lot of people here, and my chances of Canon letting me have any new kit to test are going to disappear over the horizon at a great rate of knots, but I feel compelled to post!

What prompts me to commit this act of potential suicide?

It’s this shot from my recent trip to Norway:

Direct from Camera

Processed in Lightroom

I had originally intended to shoot Nikon on this trip using a hire 400mm f2.8, but right at the last minute there was a problem with the lens that couldn’t be sorted out in time, so Calumet supplied me with a 1DX and a 200-400 f4 to basically get me out of a sticky situation.

As you should all know by now, the only problems I have with Canon cameras are their short Dynamic Range, and Canon's steadfast refusal to allow for uncompressed raw recording.

The less experienced shooter/processor might look at the shot “ex camera” and be disappointed – it looks like crap, with far too much contrast, overly dark shadows and near-blown highlights.

Shot on Nikon the same image would look more in keeping with the processed version IF SHOT using the uncompressed raw option, which is something I always do without fail; and the extra 3/4 stop dynamic range of the D4 would make a world of difference too.

Would the AF have done as good a job – who knows!

The lighting in the shot is epic from a visual PoV, but bad from a camera exposure one. A wider dynamic range and zero raw compression on my Nikon D4 would allow me to have a little more ‘cavalier attitude’ to lighting scenarios like this – usually I’d shoot with +2/3Ev permanently dialled into the camera.  Overall the extra dynamic range would give me less contrast, and I’d have more highlight detail and less need to bump up the shadow areas in post.

In other words processing would be easier, faster and a lot less convoluted.

But I can’t stress enough just how much detrimental difference LOSSLESS raw file compression CAN SOMETIMES make to a shot.

Now there is a lot – and I mean A LOT – of opinionated garbage written all over the internet on various forums etc about lossless raw file compression, and it drives me nuts.  Some say it’s bad, most say it makes no difference – and both camps are WRONG!

Sometimes there is NO visual difference between UNCOMPRESSED and LOSSLESS, and sometimes there IS.  It all depends on the lighting and the nature of the scene/subject colours and how they interact with said lighting.

The main problem with the ‘it makes no difference’ camp is that they never substantiate their claims; and if they are Canon shooters they can’t – because they can’t produce an image with zero raw file compression to compare their standard lossless CR2 files to!

So I’ve come up with a way of illustrating visually the differences between various levels of raw file compression on Nikon using the D800E and Photoshop.

But before we ‘get to it’ let’s firstly refresh your understanding. A camera raw file is basically a gamma 1.0, or LINEAR gamma file:

Linear (top) vs Encoded Gamma

The right hand 50% of the linear gamma gradient represents the brightest whole stop of exposure – that’s one heck of a lot of potential for recording subtle highlight detail in a raw file.

It also represents the area of tonal range that is frequently the most affected by any form of raw file compression.
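
That 'half of everything lives in the top stop' point is easy to verify, because in linear data every stop down from clipping has half the levels of the stop above it.  A quick sketch for a 14 bit raw file:

```python
BIT_DEPTH = 14
total_levels = 2 ** BIT_DEPTH            # 16384 raw levels in a 14 bit file

# In linear (gamma 1.0) data each stop down from clipping
# gets half the levels of the stop above it.
levels = total_levels
for stop in range(1, 6):
    levels //= 2
    print(f"Stop {stop} below clipping: {levels} levels")

# Stop 1 alone holds 8192 levels - i.e. the brightest whole stop
# contains half of all the tonal values the sensor can record.
```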

Neither Nikon nor Canon will reveal to the world the algorithm-based methods they use for lossless or lossy raw file compression, but it usually works by a process of 'Bayer Binning'.

[Diagram: a Bayer RGGB pattern]

If we take a 2×2 block, it contains 2 green, 1 red and 1 blue photosite photon value – if we average the green value and then interpolate new values for red and blue output we will successfully compress the raw file.  But the data will be ‘faux’ data, not real data.
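
Purely as an illustration of that 2×2 'binning' idea – and since the real Nikon/Canon algorithms are unpublished, treat this as a guess at the principle rather than anybody's actual method – here's a toy version in Python:

```python
import numpy as np

def bin_bayer_block(block: np.ndarray) -> dict:
    """Toy 'binning' of one 2x2 RGGB Bayer block:
           [[R,  G1],
            [G2, B ]]
    The two green photosites are averaged; R and B are simply passed
    through here, whereas a real scheme would interpolate new values."""
    r, g1 = block[0]
    g2, b = block[1]
    return {"R": float(r), "G": (float(g1) + float(g2)) / 2, "B": float(b)}

print(bin_bayer_block(np.array([[4012, 3500], [3420, 2988]])))
```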

The other method we could use is to compress the tonal values in that brightest stop of recorded highlight tone – which is massive don’t forget – but this will result in a ’rounding up or down’ of certain bright tonal values thus potentially reducing some of the more subtle highlight details.

We could also use some variant of the same type of algorithm to ‘rationalise’ shadow detail as well – with pretty much the same result.

In the face of Nikon and Canon's refusal to divulge their methodologies behind raw file compression, especially lossless, we can only guess what is actually happening.

I read somewhere that with lossless raw file compression the compression algorithms leave a trace instruction about what they have done and where they’ve done it in order that a raw handler programme such as Lightroom can actually ‘undo’ the compression effects – that sounds like a recipe for disaster if you ask me!

Personally I neither know nor care – I do know that lossless raw file compression CAN be detrimental to images shot under certain conditions, and here's the proof – after a fashion:

Let’s look at the following files:

Image 1: 14 bit UNCOMPRESSED

Image 2: 14 bit UNCOMPRESSED

Image 3: 14 bit LOSSLESS compression

Image 4: 14 bit LOSSY compression

Image 5: 12 bit UNCOMPRESSED

Yes, there are 2 files which are identical, that is 14 bit uncompressed – and there’s a reason for that which will become apparent in a minute.

First, some basic Photoshop ‘stuff’.  If I open TWO images in Photoshop as separate layers in the same document, and change the blend mode of the top layer to DIFFERENCE I can then see the differences between the two ‘images’.  It’s not a perfect way of proving my point because of the phenomenon of photon flux.

Photon Flux Andy??? WTF is that?

Well, here’s where shooting two identical 14 bit uncompressed files comes in – they themselves are NOT identical!:

[Images: the 'difference' of the two uncompressed frames – unamplified (left) and amplified with a levels layer (right)]

The result of overlaying the two identical uncompressed raw files (above left) – it looks almost black all over indicating that the two shots are indeed pretty much the same in every pixel.  But if I amplify the image with a levels layer (above right) you can see the differences more clearly.

So there you have it – Photon Flux! The difference between two 14 bit UNCOMPRESSED raw files shot at the same time, same ISO, shutter speed AND with a FULLY MANUAL APERTURE.  The only difference between the two shots is the ratio and number of photons striking the subject and being reflected into the lens.
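
If you'd rather check this sort of thing numerically than by eyeballing a levels layer, the 'difference plus amplification' trick is easy to reproduce outside Photoshop.  A minimal sketch, assuming the two frames have already been exported as identically sized 16 bit TIFFs (the file names are hypothetical):

```python
import numpy as np
import imageio.v3 as iio

# Load both frames and widen to int32 so the subtraction can't wrap around.
a = iio.imread("shot_A_16bit.tif").astype(np.int32)
b = iio.imread("shot_B_16bit.tif").astype(np.int32)

diff = np.abs(a - b)                       # Photoshop's 'Difference' blend mode
amplified = np.clip(diff * 20, 0, 65535)   # crude 'levels' boost to make it visible

print("mean difference:", diff.mean())     # near zero = frames are near-identical
iio.imwrite("difference_amplified.tif", amplified.astype(np.uint16))
```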

Firstly 14 Bit UNCOMPRESSED compared to 14 bit LOSSLESS (the important one!):

14 bit UNCOMPRESSED vs 14 bit LOSSLESS

Please remember, the above ‘difference’ image contains photon flux variations too, but if you look carefully you will see greater differences than in the ‘flux only’ image above.

[Images: 14 bit UNCOMPRESSED vs 14 bit LOSSY (left), and 14 bit vs 12 bit UNCOMPRESSED (right)]

The two images above illustrate the differences between 14 bit uncompressed and 14 bit LOSSY compression (left) and 14 bit UNCOMPRESSED and 12 bit UNCOMPRESSED (right) just for good measure!

In Conclusion

As I indicated earlier in the post, this is not a definitive testing method, sequential shots will always contain a photon flux variation that ‘pollutes’ the ‘difference’ image.

I purposefully chose this white subject with textured aluminium fittings and a blackish LED screen because the majority of sensor response will lie in that brightest gamma 1.0 stop.

The exposure was a constant +1Ev: 1/30th @ f/18 and ISO 100 – nearly maximum dynamic range for the D800E – and f/18 was set manually to avoid any aperture flicker caused by auto stop-down.

You can see from all the ‘difference’ images that the part of the subject that seems to suffer the most is the aluminium part, not the white areas.  The aluminium has a stippled texture causing a myriad of small specular highlights – brighter than the white parts of the subject.

What would 14 bit uncompressed minus 14 bit lossless minus photon flux look like?  In a perfect world I’d be able to show you accurately, but we don’t live in one of those so I can’t!

We can try it using the flux shot from earlier:

14 bit UNCOMPRESSED minus 14 bit LOSSLESS, minus the 'flux' shot

But this is wildly inaccurate, because the flux component was not measured at the moment the lossless compression shot was actually taken.  Still, the fact that you CAN see an image does HINT that there is a real difference between UNCOMPRESSED and LOSSLESS compression – in certain circumstances at least.

If you have never used a camera that offers the zero raw file compression option then basically what you’ve never had you never miss.  But as a Nikon shooter I shoot uncompressed all the time – 90% of the time I don’t need to, but it just saves me having to remember something when I do need the option.

A Canon 1DX shot taken in flat, low-contrast light

Would this 1DX shot be served any better by UNCOMPRESSED raw recording?  Most likely NO – why?  Low dynamic range, caused mainly by flat, low-contrast lighting, means no deep dark shadows and nothing approaching a highlight.

I don't see uncompressed as a costly option in terms of buffer capacity or on-board storage, and when it comes to processing I would much rather have a surfeit of sensor data than a deficit – no matter how small that deficit might be.

Lossless raw file compression has NO positive effect on your images; its sole purpose in life is to allow you to fit more shots on the storage media – that's it, pure and simple.  If you have the option to shoot uncompressed then do so, and buy a bigger card!

What pisses me off about Canon is that it would only take a firmware update, I'm sure, to give the 1DX et al the ability to record with zero raw file compression – and, whether it's needed or not, it would stop miserable grumpy gits like me banging on about it!


Simple Masking in Photoshop

Simple Masking in Photoshop – The Liquid Chocolate Shots

Masking in Photoshop is what the software was built for, and yet so many Photoshop users are unfamiliar with, or just downright confused by, the concept that they never use the technique.

Mask mastery will transform the way you work with Photoshop!

Take these shots for instance:

Liquid milk and white chocolate splash together in an abstract, isolated on a white background

Wanting a shot to look like liquid chocolate and cream on a black or white background is all well and good, but producing it can be as simple or as hard as you care to make it.

Trying to get a pure white background 'in camera' is problematic to say the least, and chucking hot melted chocolate around is fraught with its own set of problems!

Shooting on a dark or black background is easier because it demands LESS lighting.

Masking in Photoshop will allow us to isolate the subject and switch out the background.

Now for the ‘chocolate bit’ – we could substitute it with brown emulsion paint – but have you seen the bloody price of it?!

Cheap trade white emulsion comes by the gallon for less than the price of a litre of the right-coloured paint; and masking in Photoshop, plus a flat colour layer with a clipping mask set to the right blend mode, will turn white paint into liquid chocolate every time!
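
The exact blend mode is left unnamed above, but a multiply-type blend of a flat chocolate colour through a mask shows the general idea – a minimal Python sketch, with the filenames, mask and colour value all invented for illustration, not the precise Photoshop recipe:

```python
# Minimal sketch of the flat-colour + clipping-mask idea, assuming a MULTIPLY type blend.
# The filenames, the mask and the 'chocolate' colour value are all made up for illustration.
import numpy as np
from PIL import Image

base = np.asarray(Image.open("white_splash.png").convert("RGB"), dtype=np.float32) / 255.0
mask = np.asarray(Image.open("splash_mask.png").convert("L"), dtype=np.float32) / 255.0

chocolate = np.array([0.36, 0.20, 0.09], dtype=np.float32)  # the flat colour layer

blended = base * chocolate                     # multiply blend: the white paint takes on the
                                               # colour while its highlights and shadows survive
m = mask[..., None]                            # the mask plays the part of the clipping mask
result = base * (1.0 - m) + blended * m        # colour only where the splash is selected

Image.fromarray((result * 255).astype(np.uint8)).save("chocolate_splash.png")
```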

A tweak with the Greg Benz Lumenzia plugin will finish the shot in Photoshop:

A final tweak in Lightroom and the whole process takes you from the RAW shot on the left to the finished image on the right.

The key to a good mask in Photoshop is ALWAYS good, accurate pixel selection, and you’d be surprised just how simple it is.

Watch the video on my YouTube channel: I use the Colour Range tool to make a simple selection of the background, then a quick adjustment of the mask edge's Smart Radius and Edge Contrast to obtain the perfect Photoshop mask for the job.
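
For the curious, here's a rough numerical stand-in for that kind of background selection – it is emphatically NOT Adobe's Colour Range algorithm, just a simple colour-distance threshold with a softened edge, and every filename and value in it is an assumption:

```python
# NOT Adobe's Colour Range algorithm - just a rough numerical stand-in for the idea:
# grab every pixel within a 'fuzziness' distance of a sampled background colour, then
# blur the result slightly to soften the mask edge.
import numpy as np
from PIL import Image, ImageFilter

img = np.asarray(Image.open("splash_on_black.png").convert("RGB"), dtype=np.float32)

sample = np.array([8.0, 8.0, 8.0])                  # sampled background colour (assumed)
fuzziness = 40.0                                    # tolerance, like the Fuzziness slider

distance = np.linalg.norm(img - sample, axis=-1)    # per-pixel colour distance from the sample
background = (distance < fuzziness).astype(np.uint8) * 255

mask = Image.fromarray(background).filter(ImageFilter.GaussianBlur(1.5))
mask.save("background_selection.png")
```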

Like everything else in digital photography, when you know what you can do in post processing, it changes the way you shoot – hence I know I can make the shot with white paint on a black background!

Useful Links:

Greg B’s Lumenzia Plugin for Photoshop – get it HERE – you can’t afford NOT to have it in your arsenal of Photoshop tools.

UPDATE June 2018: Greg Benz (the plugin author) has launched a comprehensive Lumenzia training course – see my post here for more information.

Masking in Photoshop – you mustn’t let the concept frighten or intimidate you!  It’s critical that you understand it if you want to get the very best from your images; and it’s a vast subject simply because there are many types of mask, and even more ways by which to go about producing them.

It’s a topic that no one ever stops learning about – nope, not even yours truly! But in order to explore it to the full you need to understand all the basic concepts AND how to cut through all the bullshit that pervades the internet about it – stick with me on this folks and hang on for the ride!


Monitor Brightness.

Monitor Brightness & Room Lighting Levels.

I had promised myself I was going to do a video review of my latest purchase – the Lee SW150 Mk2 system and the Big and Little Stopper filters I've just spent a King's ransom on for my Nikon 14-24mm and D800E:


PURE SEX – and I’ve bloody well paid for this! My new Lee SW150 MkII filter system for the Nikon 14-24. Just look at those flashy red anodised parts – bound to make me a better photographer!

But I think that’ll have to wait while I address a question that keeps cropping up lately.  What’s the question?

Well, that’s the tricky bit because it comes in many guises. But they all boil down to “what monitor brightness or luminance level should I calibrate to?”

Monitor brightness is as critical as monitor colour when it comes to calibration.  If you look at previous articles on this blog you’ll see that I always quote the same calibration values, those being:

White Point: D65 – that figure takes care of colour.

Gamma: 2.2 – that value covers monitor contrast.

Luminance: 120 cdm2 (candelas per square meter) – that takes care of brightness.

Simple in’it….?!

However, when you’ve been around all this photography nonsense as long as I have you can overlook the possibility that people might not see things as being quite so blindingly obvious as you do.

And one of those ‘omissions on my part’ has been to do with monitor brightness settings COMBINED with working lighting levels in ‘the digital darkroom’.  So I suppose I’d better correct that failing on my part now.

What does a Monitor Profile Do for your image processing?

A correctly calibrated monitor and its .icc profile do a really simple but very mission-critical job.

If we open a new document in Photoshop and fill it with flat 255 white we need to see that it’s white.  If we hold an ND filter in front of our eye then the image won’t look white, it’ll look grey.

If we hold a blue filter in front of our eye the image will not look white – it’ll look blue.

That white image doesn’t exist ‘inside the monitor’ – it’s on our computer!  It only gets displayed on the monitor because of the graphics output device in our machine.

So, if you like, we’re on the outside looking in; and we are looking through a window on to our white image.  The colour and brightness level in our white image are correct on the inside of the system – our computer – but the viewing window or monitor might be too bright or too dark, and/or might be exhibiting a colour tint or cast.

Unless our monitor is a totally ‘clean window’ in terms of colour neutrality, then our image colour will not be displayed correctly.

And if the monitor is not running at the correct brightness then the colours and tones in our images will appear to be either too dark or too bright.  Please note the word ‘appear’…

Let’s get a bit fancy and make a greyscale in Photoshop:


The dots represent Lab 50 to Lab 95 – the most valuable tonal range between midtone and highlight detail.

Look at the distance between Lab 50 & Lab 95 on the three greyscales above – the biggest 'span' is on the correctly calibrated monitor.  In both the 'too bright & contrasty' and the 'too dark, low contrast' calibrations, that valuable tonal range is compressed.

In reality the colours and tones in, say an unprocessed RAW file on one of our hard drives, are what they are.  But if our monitor isn’t calibrated correctly, what we ‘see’ on our monitor IS NOT REALITY.

Reality is what we need – the colours and tones in our images need to be faithfully reproduced on our monitor.

And so basically a monitor profile ensures that we see our images correctly in terms of colour and brightness; it ensures that we look at our images through a clean window that displays 100% of the luminance being sent to it – not 95% and not 120% – and that all our primary colours are being displayed with 100% fidelity.

In a nutshell, on an uncalibrated monitor, an image might look like crap, when in reality it isn’t.  The shit really starts to fly when you start making adjustments in an uncalibrated workspace – what you see becomes even further removed from reality.

“My prints come out too dark Andy – why?”

Because your monitor is too bright – CALIBRATE it!

“My pics look great on my screen, but everyone on Nature Photographers Network keeps telling me they’ve got too much contrast and they need a levels adjustment.  One guy even reprocessed one – everyone thought his version was better, but frankly it looked like crap to me – why is this happening Andy?”

Because your monitor brightness is too low but your gamma is too high – CALIBRATE it!  If you want your images to look like mine then you’ve got to do ALL the things I do, not just some of ’em – do you think I do all this shit for fun??????????……………grrrrrrr….

But there's a potential problem: just because your monitor is calibrated to perfection does NOT mean that everything will be golden from this point on.

Monitor Viewing Conditions

So we’re outside taking a picture on a bright sunny day, but we can’t see the image on the back of the camera because there’s too much daylight, and we have to dive under a coat with our camera to see what’s going on.

But if we review that same image on the camera in the dark then it looks epic.

Now you have all experienced that…….

The monitor on the back of your camera has a set brightness level – if we view the screen in a high level of ambient light the image looks pale, washed out and in a general state of ultra low contrast.  Turn the ambient light down and the image on the camera screen becomes more vivid and the contrast increases.

But the image hasn’t changed, and neither has the camera monitor.

What HAS changed is your PERCEPTION of the colour and luminance values contained within the image itself.

Now come on kids – join the dots will you!

It does not matter how well your monitor is calibrated, if your monitor viewing conditions are not within specification.

Just like with your camera monitor, if there is too much ambient light in your working environment then your precisely calibrated monitor brightness and gamma will fail to give you a correct visualization or ‘perception’ of your image.

And the problems don’t end there either; coloured walls and ceilings reflect that colour onto the surface of your monitor, as does that stupid luminous green shirt you’re wearing – yes, I can see you!  And if you are processing on an iMac then THAT problem just got 10 times worse because of the glossy screen!

Nope – bead-blasting your 27 inches of Apple goodness is not the answer!

Right, now comes the serious stuff, so READ, INGEST and ACT.

ISO Standard 3664:2009 is the puppy we need to work to (sort of) – you can actually go and purchase this publication HERE should you feel inclined to dump 138 CHF on 34 pages of light bedtime reading.

There are actually two ISO standards that are relevant to us as image makers; ISO 12646:2015(draft) being the other.

12646 pertains to digital image processing where screens are to be compared to prints side by side (that does not necessarily refer to ‘desktop printer prints from your Epson 3000’).

3664:2009 applies to digital image processing where screen output is INDEPENDENT of print output.

We work to this standard (for the most part) because we want to process for the web as well as for print.

If we employ a print work flow involving modern soft-proofing and otherwise keep within the bounds of 3664 then we’re pretty much on the dance-floor.

ISO 3664 sets out one or two interesting and highly critical working parameters:

Ambient Light White Point: D50 – that means the colour temperature of the light in your editing/working environment should be 5000 Kelvin (not your monitor) – and in particular this means the light FALLING ON TO YOUR MONITOR from within your room.  So room décor has to be colour neutral as well as the light source.

Ambient Light Value in your Editing Area: 32 to 64 Lux or lower.  Now this is what shocks so many of you guys – lower than 32 lux is basically processing in the dark!

Ambient Light Glare Permissible: 0 – this means NO REFLECTIONS on your monitor and NO light from windows or other light sources falling directly on the monitor.

Monitor White Point – D65 (under 3664) and D50 (under 12646) – we go with D65.

Monitor Luminance – 75 to 100 cdm2 (under 3664) and 80 to 120 cdm2 (under 12646) – here we begin to deviate from 3664.

We appear to be dealing with mixed units here, and strictly speaking they measure different things: lux is illuminance – the light falling ON a surface – while cdm2 (candelas per square metre) is luminance – the light coming OFF a surface such as your screen.  They are related, but they are not interchangeable.
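
For the technically minded, an ideal matte (Lambertian) surface links the two quantities with a simple formula: luminance = illuminance × reflectance ÷ π.  Here's a small worked example – the 18% reflectance and 40 lux figures are just assumptions for illustration:

```python
# Lux measures light falling ON a surface (illuminance); cd/m2 measures light coming
# OFF a surface (luminance). For a perfectly matte (Lambertian) surface the two are
# linked by: luminance = illuminance * reflectance / pi.
import math

def luminance_cd_m2(illuminance_lux: float, reflectance: float) -> float:
    """Luminance of an ideal diffuse surface under a given illuminance."""
    return illuminance_lux * reflectance / math.pi

# A mid-grey wall (~18% reflectance) in a 40 lux editing room:
print(round(luminance_cd_m2(40, 0.18), 2))   # ~2.29 cd/m2 - far dimmer than a 120 cdm2 monitor
```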

The way Monitor Brightness or Luminance relates to ambient light levels is perhaps a little counter-intuitive for some folk.  Basically the LOWER your editing area Lux value the LOWER your Monitor Brightness or luminance needs to be.

Now comes the point in the story where common sense gets mixed with experience, and the outcome can be proved by looking at displayed images and prints – aesthetics as opposed to numbers.

Like all serious photographers I process my own images on a wide-gamut monitor, and I print on a wide-gamut printer.

Wide gamut monitors display pretty much 90% to 100% of the AdobeRGB1998 colour space.

What we might refer to as Standard Gamut monitors display something a little larger than the sRGB colour space, which as we know is considerably smaller than AdobeRGB1998.


Left is a standard gamut/sRGB monitor and right is a typical wide gamut/AdobeRGB1998 monitor – if you can call any NEC ‘typical’!

Find all the gory details about monitors on this great resource site – TFT Central.

At workshops I process on a 27 inch non-Retina iMac – this is to all intents and purposes a ‘standard gamut’ monitor.

I calibrate my monitors with a ColorMunki Photo – which is a spectrophotometer.  Spectros have a tendency to be slow, can be slightly problematic in the very darkest tones, and exhibit something of a low-contrast response to 'blacks' below around Lab 6.3 (RGB 20,20,20).

If you own a ColorMunki Display or an i1 Display you do NOT own a spectro – you own a colorimeter!  A very different beast in the way it works, but from a colour point of view they give the same results as a spectro of the same standard – plus, for the most part, they work faster.

However, from a monitor brightness standpoint, they differ from spectros in their slightly better response to those ultra-dark tones.

So, from a spectrophotometer standpoint, I prefer to calibrate to the ISO 12646 upper limit of 120 cdm2 and control my room lighting to around 35-40 lux.

Just so that you understand how 'nit-picking' these standards are, the difference between 80 cdm2 and 120 cdm2 is only just over half a stop of Ev in camera exposure terms!

However, to put this monitor brightness standard into context, my 27 inch iMac came from Apple running at 290 cdm2 – and cranked up fully it’ll thump out 340 cdm2.
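
If you want to check those figures yourself, the stop arithmetic is just a base-2 logarithm of the luminance ratio:

```python
# Quick check of those two figures in photographic stops (Ev):
import math

print(round(math.log2(120 / 80), 2))    # 0.58 - the 80 vs 120 cdm2 gap is just over half a stop
print(round(math.log2(290 / 120), 2))   # 1.27 - a factory-fresh iMac at 290 cdm2 is over a stop too bright
```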

Most stand-alone monitors you buy, especially those that fall under the 'standard gamut' banner, will be running at massively high brightness levels out of the box and will need turning down quite severely during the calibration process.

You will find that most monitor tests and reviews are done with calibration to the same figures that I have quoted – D65, 120cdm2 and Gamma 2.2 – in fact this non-standard set up has become so damn common it is now ‘standard’ – despite what the ISO chaps may think.

Using these values, printing out of Lightroom (for example) becomes a breeze when you use printer profiles created to the ICC v2 standard – as long as you 'soft proof' the image in a fit and proper manner, which means CAREFULLY; take your time.  The one slight shortcoming of this setup is that side-by-side print/monitor comparisons may look ever so slightly out of kilter because of the D65 monitor white point – a 6,500K transmitted white point as opposed to a 5,000K reflective white point.  But a shielded print viewer should bring all that back into balance, if such a thing floats your boat.

But the BIG THING you need to take away from this rather long article is the LOW LUX VALUE of your editing/working area's ambient illumination.

Both the ColorMunki Photo and i1Pro2 spectrophotometers will measure your ambient light, as will the ColorMunki Display and i1 Display colorimeters, to name but a few.

But if you measure your ambient light and find the device gives you a reading of more than 50-60 lux then DO NOT ask the device to profile for your ambient light; in fact I would not recommend doing that AT ALL – here's why.

I have a main office light that is colour corrected to 5000K and it chucks out 127 Lux at the monitor.  If I select the ‘measure and calibrate to ambient’ option on the ColorMunki Photo it eventually tells me I need a monitor brightness or luminance of 80 cdm2 – the only problem is that it gives me the same figure if I drop the ambient lux value to 100.

Now that smells a tad fishy to me……..

So my advice to anyone is to remove the variables, calibrate to 120 cdm2 and work in a very subdued ambient condition of 35 to 40 Lux. I find it easier to control my low lux working ambient light levels than bugger about with over-complex calibration.

To put a final perspective on this figure, there is an interesting page on the Apollo Energytech website which quotes the lux levels that workplaces must legally provide – so don't go to B&Q or Walmart to do a spot of processing, and by those figures we dim-room editors are all going to end up doing hard time at Her Madge's Pleasure – law breakers that we are!

Please consider supporting this blog.

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.