Astro Landscape Photography

One of my patrons, Paul Smith, and I ventured down to Shropshire and the spectacular quartzite ridge of The Stiperstones to get this image of the Milky Way and Mars (the large bright ‘star’ above the rocks on the left).

I always work the same way for astro landscape photography, beginning with getting into position just before sunset.

Using the PhotoPills app on my phone I can see where the Milky Way will be positioned in my field of view at the time of peak sky darkness.  This enables me to position the camera exactly where I want it for the best composition.

The biggest killer in astro landscape photography is excessive noise in the foreground.

The other problem is that foregrounds in most images of this genre are not sharp due to a lack of depth of field at the wide apertures you need to shoot the night sky at – f2.8 for example.

To get around this problem we need to shoot a separate foreground image at a lower ISO, a narrower aperture and focused closer to the camera.

Some photographers change focus, engage long exposure noise reduction and then shoot a very long exposure.  But that’s an eminently risky thing to do in my opinion, both from a technical standpoint and one of time – a 60 minute exposure will take 120 minutes to complete.

The length of exposure is chosen to allow the very low photon count from the foreground to ‘build up’ on the sensor and produce a usable level of exposure from what little natural light is around.

From a visual perspective, when it works, the method produces images that can be spectacular because the light in the foreground matches the light in the sky in terms of directionality.

Light Painting

To get around the inconvenience of time and super-long exposures a lot of folk employ the technique of light painting their foregrounds.

Light painting – in my opinion – destroys the integrity of the finished image because it’s so bloody obvious!  The direction of light that’s ‘painted’ on the foreground bears no resemblance to that of the sky.

The other problem with light painting is this – those that employ the technique hardly ever CHECK to see if they are in the field of view of another photographer – think about that one for a second or two!

My Method

As I mentioned before, I set up just before sunset.  In the shot above I knew the Milky Way and Mars were not going to be where I wanted them until just after 1am, but I was set up by 9.20pm – yep, a long wait ahead, but always worth the effort.

As we move towards the latter half of civil twilight I start shooting my foreground exposure, and I’ll shoot a few of these at regular intervals between then and mid nautical twilight.

Because I shoot raw, the white balance set in camera is irrelevant – it can be matched to that of the sky in Photoshop during post processing.

The key things here are that I get a shadowless, even illumination of my foreground, shot at a low ISO, in perfect focus, and at say f8 for great depth of field.

Once deep into blue hour and astronomical twilight the brighter stars are visible, so I now use full magnification in live view and focus on a bright star in the camera’s field of view.

Then it’s a waiting game – waiting for the sky to darken to its maximum and the Milky Way to come into my desired position for my chosen composition.

Shooting the Sky

Astro landscape photography is all about showing the sky in context with the foreground – I have absolutely ZERO time for those popular YouTube photographers who composite a shot of the night sky into a landscape image shot in a different place or a different angle.

Good astro landscape photography HAS TO BE A COMPOSITE though – there is no way around that.

And by GOOD I mean producing a full resolution image that will sell through the agencies and print BIG if needed.

The key things that contribute to an image being classed good in my book are simple:

  • Pin-point stars with no trailing
  • Low noise
  • Sharp from ‘back’ to ‘front’.

Pin-point stars are solely down to the correct shutter speed for your sensor size and megapixel count.

Low noise is covered by shooting a low ISO foreground and a sequence of high ISO sky images, and using Starry Landscape Stacker on Mac (Sequator on PC appears to be very similar) in conjunction with a mean or median stacking mode.

Further noise cancelling is achieved by the shooting of Dark Frames, and the typical wide-aperture vignetting is cancelled out by the creation of a flat field frame.

And ‘back to front’ image sharpness should be obvious to you from what I’ve already written!

So, I’ll typically shoot a sequence of 20 to 30 exposures – all one after the other with no breaks or pauses – and then a sequence of 20 to 30 dark frames.
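The stacking and calibration arithmetic behind those sequences can be sketched in a few lines of Python – a toy illustration of the principle only (real tools such as Starry Landscape Stacker also align the stars between frames, which this deliberately ignores):

```python
from statistics import mean, median

def stack(frames, mode="median"):
    """Combine equally-sized 'frames' (lists of pixel values)
    pixel-by-pixel. Median rejects outliers such as hot pixels or
    satellite trails; mean averages random noise down."""
    combine = median if mode == "median" else mean
    return [combine(pixels) for pixels in zip(*frames)]

def calibrate(light, master_dark, master_flat):
    """Subtract dark current, then divide by the normalised flat
    field to cancel vignetting."""
    flat_mean = mean(master_flat)
    return [(l - d) / (f / flat_mean)
            for l, d, f in zip(light, master_dark, master_flat)]

# Toy 4-pixel frames: three sky exposures, one with a spurious hot value
lights = [[10, 12, 11, 50], [10, 12, 11, 12], [10, 12, 11, 12]]
master_dark = stack([[1, 1, 1, 1]] * 3)       # stacked dark frames
master_flat = [2.0, 2.0, 2.0, 2.0]            # flat field (flat here, no vignetting)

sky = stack(lights, mode="median")            # median kills the 50 outlier
print(calibrate(sky, master_dark, master_flat))
```

Mean stacking gives the best noise averaging when there are no outliers to reject; median is the safer default when planes and satellites are about.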

Shutter speeds usually range from 4 to 6 seconds.
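For the shutter speed itself, the widely quoted ‘500 rule’ gives a rough ceiling – divide 500 by the effective focal length – but dense, high-megapixel sensors show trailing well before that ceiling, which is why figures of only a few seconds are the sensible choice. A quick sketch of the rule (illustrative only – test on your own sensor):

```python
def max_shutter_500(focal_length_mm, crop_factor=1.0):
    """Classic '500 rule' ceiling for pin-point stars:
    500 / (effective focal length). Treat it as an upper bound --
    high-megapixel sensors resolve trailing much sooner."""
    return 500.0 / (focal_length_mm * crop_factor)

print(round(max_shutter_500(24), 1))       # 24mm, full frame  -> 20.8 s
print(round(max_shutter_500(24, 1.5), 1))  # 24mm on APS-C     -> 13.9 s
```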

Watch this video on my YouTube Channel about shutter speed:

Best viewed on the channel itself, and click the little cog icon to choose 1080pHD as the resolution.

Putting it all Together

Shooting all the frames for astro landscape photography is really quite simple.

Putting it all together is fairly simple and straightforward too – but it’s TEDIOUS and time-consuming if you want to do it properly.

The shot above took me a little over 4 hours!

And 80% of it is retouching in Photoshop.

I produce a very extensive training title – Complete Milky Way Photography Workflow – which teaches you EVERYTHING you need to know about the shooting and processing of astro landscape photography images – you can purchase it here – and if you use the offer code MWAY15 at the checkout you’ll get £15 off the purchase price.

But I wanted to try Raw Therapee for this Stiperstones image, and another of my patrons – Frank – wanted a video of processing methodology in Raw Therapee.

Easier said than done, cramming 4 hours into a typical YouTube video!  But after about six attempts I think I’ve managed it, and you can see it here, but I warn you now that it’s 40 minutes long:

Best viewed on the channel itself, and click the little cog icon to choose 1080pHD as the resolution.

I hope you’ve found the information in this post useful, together with the YouTube videos.

I don’t monetize my YouTube videos or fill my blog posts with masses of affiliate links, and I rely solely on my patrons to help cover my time and server costs.  If you would like to help me to produce more content please visit my Patreon page on the button above.

Many thanks and best light to you all.

Adobe Lightroom Classic and Photoshop CC 2018 tips

Adobe Lightroom Classic and Photoshop CC 2018 tips – part 1

So, you’ve either upgraded to Lightroom Classic CC and Photoshop CC 2018, or you are thinking of doing so.

Well, here are a couple of things I’ve found – I’ve called this part 1 because I’m sure there will be other problems/irritations!

Lightroom Classic CC GPU Acceleration problem

If you are having problems with shadow areas appearing too dark and somewhat ‘choked’ in the Develop module – but things look fine in the Library module – then just follow the simple steps in the video above and TURN OFF GPU Acceleration in the Lightroom preferences panel under the Performance tab.

Turn OFF GPU Acceleration

UPDATE: I have subsequently done another video on this topic that illustrates the fact that the problem did not exist in Lr CC 2015 v.12/Camera Raw v.9.12

In the new Photoshop CC 2018 there is an irritation/annoyance with the brush tool, and something called the ‘brush leash’.

Now why on earth you need your brush on a leash God ONLY KNOWS!

But the brush leash manifests itself as a purple/magenta line that follows your brush tool everywhere.

You have a smoothness slider for your brush – its default setting is 10%.  If we increase that value then the leash line gets even longer, and even more bloody irritating.

And why we would need an indicator (which is what the leash is) of smoothness amount and direction for our brush strokes is a bit beyond me – because we can see it anyway.

So, if you want to change the leash length, use the smoothing slider.

If you want to change the leash colour just go to Photoshop>Preferences>Cursors

Here, you can change the colour, or better still, get rid of it completely by unticking the “show brush leash while smoothing” option.

So there are a couple of tips from my first 24 hours with the latest 2018 ransomware versions from Adobe!

But I’m sure there will be more, so stay tuned, and consider heading over to my YouTube channel and hitting the subscribe button, and hit the ‘notifications bell’ while you’re at it!

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Lumenzia – Not Just for Landscapes

Luminosity Masking is NOT just for landscape photographs – far from it.

But most folk miss the point of luminosity masking because they think it’s difficult and tedious.

The point, as I always see it, is that luminosity masking allows you to make dramatic but subtle changes and enhancements to your image with what are actually VERY fast and crude “adjustments”.

This in reality means that luminosity masking is FAST – and way faster than trying to do “localised” adjustments.  But the creation of the masks and choosing which one to use is what crippled the “ease factor” for most.

But this new Lumenzia extension is so snappy and quick at showing you the different masks that, if you know what area of the image you want to adjust, the whole process takes SECONDS.

Let’s look at a White-tailed Eagle taken just 15 days ago:

Straight off the 1Dx it looks like this:

RAW unprocessed .CR2 file (CLICK to view in new window)

Inside the Develop Module of Lightroom 5 it looks like:

RAW unprocessed – (CLICK to view in new window)

A few tweaks later and it looks like:

Tweaks are what you can see in the Basics Panel + CamCal set to Neutral, and Chroma Noise removal in the Lens Corrections Panel is turned ON – (CLICK to view in new window)

Sending THIS adjusted image to Photoshop:

(CLICK to view in new window)

All I want to do is give a “lift” to the darker tones in the bird; under the wings, and around the side of head, legs and tail.

Using a BRUSH to do the job is all fine ‘n dandy BUT you would be creating a localised adjustment that’s all-encompassing from a tonal perspective; all tones that fall under the brush get adjusted by the same amount.

A luminosity mask, or indeed ANY pixel-based mask is exactly what it says it is – a mask full of pixels. And those pixels are DERIVED from the real pixels in your image.  But the real beauty is that those pixels will be anywhere from 1% to 100% selected, or not selected at all.

Where they are 100% selected they are BLACK, and any adjustment you make BEHIND that mask will NOT be visible.

Pixels that are NOT selected will be WHITE, and your adjustment will show fully.

But where the pixels are between 1% and 99% selected they will appear as 1% to 99% grey, and so will show or hide the adjustment by the same proportion…got it?
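That grey-scale show/hide arithmetic is easy to demonstrate in code. Here’s a toy sketch of my own (NOT Lumenzia’s actual recipe) that builds a simple darks-biased mask from luminance values and blends a crude global ‘lift’ through it:

```python
def darks_mask(luminance):
    """Toy darks-biased luminosity mask: mask values are derived from
    the image's own pixels -- dark pixels get a value near 1.0 (the
    adjustment shows fully), bright pixels near 0.0 (it's hidden)."""
    return [1.0 - l for l in luminance]

def apply_adjustment(pixels, adjusted, mask):
    """Blend an adjustment through the mask: 1.0 shows the adjustment
    fully, 0.0 hides it, intermediate greys show it partially."""
    return [p + (a - p) * m for p, a, m in zip(pixels, adjusted, mask)]

lum = [0.1, 0.5, 0.9]             # a shadow, a midtone and a highlight pixel
lifted = [v + 0.2 for v in lum]   # crude global "lift" -- far too strong on its own
mask = darks_mask(lum)

# The mask tames the crude lift: the shadow moves most, the highlight barely
print([round(v, 2) for v in apply_adjustment(lum, lifted, mask)])
```

The point mirrors the eagle example: the adjustment itself is rough, but the per-pixel mask grades it so the result is subtle.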

The Lumenzia D4 mask looks like it’ll do the job I want:

Lumenzia D4 mask (CLICK to view in new window)

Click the image to view larger – look at the subtle selections under those wings – try making that selection any other way in under 2 seconds – you’ve got no chance!

The “lift” I want to make in those WHITER areas of the mask is best done with a Curves Adjustment layer:

Select “Curve” in the Lumenzia GUI – (CLICK to view in new window)

So hit the Curve button and voilà:

The Lumenzia D4 mask is now applied to Curves Adjustment Layer – (CLICK to view in new window)

You can see in the image above that I’ve made a very rough upwards deflection of the curve to obtain an effective but subtle improvement to those under-wing areas etc. that I was looking to adjust.

The total time frame from opening the image in Photoshop to now is about 20 seconds!  Less time than the Lightroom 5 adjustments took…

And to illustrate the power of that Lumenzia D4 Luminosity mask, and the crudity of the adjustment I made, here’s the image WITHOUT THE MASK:

The effect of the luminosity mask is best illustrated by “hiding” it – bloody hell, turn it back on ! – (CLICK to view in new window).

And at full resolution you can see the subtleties of the adjustment on the side of the head:

With Lumenzia (left) and just the Lightroom 5 processing (right) – (CLICK to view in new window).

If you want to get the best from your images AND you don’t want to spend hours trying to do so, then Lumenzia will seriously help you.

Clicking this link HERE to buy Lumenzia doesn’t mean it costs you any more than if you buy it direct from the developer.  But it does mean that I get a small remuneration from the developer as a commission which in turn supports my blog.  Buying Lumenzia is a total no-brainer so please help support this blog by buying it via these links – many thanks folks.

UPDATE June 2018: Greg Benz (the plugin author) has launched a comprehensive Lumenzia training course – see my post here for more information.


Camera Calibration

Custom Camera Calibration

The other day I had an email fall into my inbox from a leading UK online retailer…whose name escapes me but is very short…that made my blood pressure spike.  It was basically offering me 20% off the cost of something that will revolutionise my photography – ColorChecker Passport Camera Calibration Profiling software.

I got annoyed for two reasons:

  1. Who the “f***” do they think they’re talking to sending ME this – I’ve forgotten more about this colour management malarkey than they’ll ever know….do some customer research you idle bastards and save yourselves a mauling!
  2. Much more importantly – tens of thousands of you guys ‘n gals will get the same email and some will believe the crap and buy it – and you will get yourselves into the biggest world of hurt imaginable!

Don’t misunderstand me, a ColorChecker Passport makes for a very sound purchase indeed and I would not like life very much if I didn’t own one.  What made me seethe is the way it’s being marketed, and to whom.

Profile all your cameras for accurate colour reproduction…..blah,blah,blah……..

If you do NOT fully understand the implications of custom camera calibration you’ll be in so much trouble when it comes to processing you’ll feel like giving up the art of photography.

The problems lie in a few areas:

First, a camera profile is a SENSOR/ASIC OUTPUT profile – think about that a minute.

Two things influence sensor/ASIC output – ISO and lens colour shift – yep, that’s right, no lens is colour-neutral, and all lenses produce colour shifts either by tint or spectral absorption. And higher ISO settings usually produce a cooler, bluer image.

Let’s take a look at ISO and its influence on custom camera calibration profiling – I’m using a far better bit of software for doing the job – “IN MY OPINION” – the Adobe DNG Profile Editor, a free download for both Mac and Windows – but you do need the ColorChecker Passport itself!

I prefer the Adobe product because I found the camera calibration profiles the ColorChecker software produced were, well, pretty vile – especially in terms of increased contrast; not my cup of tea at all.

5 images shot at 1 stop increments of ISO on the same camera/lens combination.

Now this is NOT a demo of software – a video tutorial of camera profiling will be on my next photography training video coming sometime soon-ish, doubtless with a somewhat verbose narrative explaining why you should or should not do it!

Above, we have 5 images shot on a D4 with a 24-70 f2.8 at 70mm under a consistent overcast daylight at 1 stop increments of ISO between 200 and 3200.

Below, we can see the resultant profile and distribution of known colour reference points on the colour wheel.

Here’s the 200 ISO custom camera calibration profile – the portion of interest to us is the colour wheel on the left and the points of known colour distribution (the black squares and circled dot).

Next, we see the result of the image shot at 3200 ISO:

Here’s the result of the custom camera profile based on the shot taken at 3200 ISO.

Now let’s super-impose one over t’other – if ISO doesn’t matter to a camera calibration profile then we should see NO DIFFERENCE………….

The 3200 ISO profile colour distribution overlaid onto the 200 ISO profile colour distribution – it’s different and they do not match up.

……..well would you bloody believe it!  Embark on custom camera calibration profiling of your camera, then apply that profile to an image shot with the same lens under the same lighting conditions but at a different ISO, and your colours will not be right.

So now my assertions about ISO have been vindicated, let’s take a look at skinning the cat another way, by keeping ISO the same but switching lenses.

Below is the result of a 500mm f4 at 1000 ISO:

Profile result of a 500mm f4 at 1000 ISO

And below we have the 24-70mm f2.8 @ 70mm and 1000 ISO:

Profile result of a 24-70mm f2.8 @ 70mm at 1000 ISO

Let’s overlay those two and see if there’s any difference:

Profile results of a 500mm f4 at 1000 ISO and the 24-70 f2.8 at 1000 ISO – as massively different as day and night.

Whoops….it’s all turned to crap!

Just take a moment to look at the info here.  There is movement in the orange/red/red magentas, but even bigger movements in the yellows/greens and the blues and blue/magentas.

Because these comparisons are done simply in Photoshop layers with the top layer at 50% opacity you can even see there’s an overall difference in the Hue and Saturation slider values for the two profiles – the 500mm profile is 2 and -10 respectively and the 24-70mm is actually 1 and -9.

The basic upshot of this information is that the two lenses apply a different colour cast to your image AND that cast is not always uniformly applied to all areas of the colour spectrum.

And if you really want to “screw the pooch” then here’s the above comparison side by side with the 500mm f4 at 1000 ISO against the 24-70mm f2.8 at 200 ISO view:

500mm f4/24-70mm f2.8 1000 ISO comparison versus 500mm f4 1000 ISO and 24-70mm f2.8 200 ISO.

A totally different spectral distribution of colour reference points again.

And I’m not even going to bother showing you that the same camera/lens/ISO combo will give different results under different lighting conditions – you should by now be able to envisage that little nugget yourselves.

So, Custom Camera Calibration – if you do it right then you’ll be profiling every body/lens combo you have, at every conceivable ISO value and lighting condition – it’s one of those things that if you don’t do it all then you’d be best off not doing it at all in most cases.

I can think of a few instances where I would do it as a matter of course, such as scientific work, photo-microscopy, and artwork photography/copystand work etc, but these would be well outside the remit of more normal photographic practice.

As I said earlier, the Passport device itself is worth far more than its weight in gold – set up and light your shot and include the Passport device in a prominent place. Take a second shot without it and use shot 1 to custom white balance shot 2 – a dead easy process that makes the device invaluable for portrait and studio work etc.

But I hope by now you can begin to see the futility of trying to use a custom camera calibration profile on a “one size fits all” basis – it just won’t work correctly; and yet for the most part this is how it’s marketed – especially by third party retailers.


The ND Filter

Long Exposure & ND Filters

A view of the stunning rock formations at Porth Y Post on the Welsh island of Anglesey. The image is a long exposure of very rough sea, giving the impression of smoke and fog.  30 seconds @f13 ISO 100. B&W 10stop ND – unfiltered exposure would have been 1/30th.

The reason for this particular post began last week when I was “cruising” a forum on a PoD site I’m a member of, and I came across a thread started by someone about heavy ND filters and very long exposures.

Then, a couple of days later a Facebook conversation cropped up where someone I know rather well seemed to be losing the plot over things totally by purchasing a 16 stop ND.

The poor bugger got a right mauling from “yours truly” for the simple reason that he doesn’t understand the SCIENCE behind the art of photography.  This is what pisses me off about digital photography – it readily provides “instant gratification” to folk who know bugger all about what they are doing with their equipment.  They then spend money on “pushing the envelope” only to find their ivory tower comes tumbling down around them because they THOUGHT they knew what they were doing………..stop ranting Andy before you have a coronary!

OK, I’ll stop “ranting”, but seriously folks, it doesn’t matter if you are on a 5DMkIII or a D800E, a D4 or a 1Dx – you have to realise that your camera works within a certain set of fixed parameters; and if you wander outside these boundaries for reasons of either stupidity or ignorance, then you’ll soon be up to your ass in Alligators!

Avid readers of this blog of mine (seemingly there are a few) will know that I’ve gone to great lengths in the past to explain how sensors are limited in different ways by things such as diffraction, and that certain lens/sensor combinations are said to be “diffraction limited”; well here’s something new to run up your flag pole – sensors can be thought of as being “photon limited” too!

I’ll explain what I mean in a minute…..

SENSOR TYPE

Most folk who own a camera of modern design by Nikon or Canon FAIL at the first hurdle by not understanding their sensor type.

Sensors generally fall into two basic types – CCD and CMOS.

Most of us use cameras fitted with CMOS sensors, because we demand accurate fast phase detection AF AND we demand high levels of ADC/BUFFER speed.  In VERY simplistic terms, CCD sensors cannot operate at the levels of speed and efficiency demanded by the general camera-buying public.

So, it’s CMOS to the rescue.  But CMOS sensors are generally noisier than CCDs.

When I say “noise” I’m NOT referring to the normal under exposure luminance noise that some of you might be thinking of. I’m talking about the “background noise” of the sensor itself – see post HERE.

Now I’m going to over simplify things for you here – I need to because there are a lot of variables to take into account.

  • A sensor is an ARRAY of PHOTOSITES or PHOTODIODES.
  • A photodiode exists to do one thing – react to being struck by PHOTONS of light by producing electrons.
  • It should produce electrons PROPORTIONAL to the number of photons that strike it.

Now in theory, a photodiode that sees ZERO photons during the exposure should release NO ELECTRONS.

At the end of the exposure the ADC comes along and counts the electrons for each photodiode – an ANALOGUE VALUE – and converts it to a DIGITAL VALUE and stores that digital value as a point of information in the RAW file.

A RAW converter such as Lightroom then reads all these individual points of information and using its own in-built algorithms it normalises and demosaics them into an RGB image that we can see on our monitor.

Sounds simple doesn’t it, and theoretically it is.  But in practice there’s a lot of places in the process where things can go sideways rapidly……..!
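As a toy model of that photodiode-to-raw-value chain (the full-well capacity and bit depth here are invented purely for illustration, not any particular camera’s figures):

```python
def adc(electron_count, full_well=60000, bits=14):
    """Toy ADC step: quantise an analogue electron count into a
    digital raw value on a made-up 14-bit scale, clipping at the
    sensor's (invented) full-well capacity."""
    levels = 2 ** bits - 1
    return min(levels, round(electron_count / full_well * levels))

print(adc(0))        # a perfect photodiode with zero photon strikes -> 0
print(adc(30000))    # half-full well -> roughly half the 14-bit range
print(adc(60000))    # full well -> 16383, a clipped highlight
```

In theory it really is that clean: zero photons in, zero out. The rest of this post is about the diodes that refuse to play by those rules.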

We make a lot of assumptions about our pride and joy – our newly purchased DSLR – and most of these assumptions are just plain wrong.  One that most folk get wrong is presuming ALL the photodiodes on their shiny new sensor BEHAVE IN THE SAME WAY and are 100% identical in response.  WRONG – even though, in theory, it should be true.

Some sensors are built to a budget, some to a standard of quality and bugger the budget.

Think of the above statement as a scale running left to right with crap sensors like a 7D or D5000 on the left, and the staggering Phase IQ260 on the right.  There isn’t, despite what sales bumph says, any 35mm format sensor that can come even close to residing on the right hand end of the scale, but perhaps a D800E might sit somewhere between 65 and 70%.

The thing I’m trying to get at here is that “quality control” and “budget” are opposites in the manufacturing process, and that linearity and uniformity of photodiode performance costs MONEY – and lots of it.

All our 35mm format sensors suffer from a lack of that expensive quality control in some form or other, but what manufacturers try to do is place the resulting poor performance “outside the envelope of normal expected operation” as a Nikon technician once told me.

In other words, during normal exposures and camera usage (is there such a thing?) the errors don’t show themselves – so you are oblivious to them. But move outside of that “envelope of normal expected operation” and as I said before, the Alligators are soon chomping on your butt cheeks.

REALITY

Long exposures in low light levels – those longer than 30 to 90 seconds – present us with one of those “outside the envelope” situations that can highlight some major discrepancies in individual photodiode performance and sensor uniformity.

Earlier, I said that a photodiode, in a perfect world, would always react proportionally to the number of photons striking it, and that if it had no photon strikes during the exposure then it would have ZERO output in terms of electrons produced.

Think of the “perfect” photodiode/photosite as being a child brought up by nuns, well mannered and perfectly behaved.

Then think of a child brought up in the Gallagher household a la “Shameless” – zero patience, no sense of right or wrong, rebellious and downright misbehaved.  We can compare this kid with some of the photodiodes on our sensor.

These odd photodiodes usually show a random distribution across the sensor surface, but you only ever see evidence of their existence when you shoot in the dark, or when executing very long exposures from behind a heavy ND filter.

These “naughty” photodiodes behave badly in numerous ways:

  • They can release a larger number of electrons than is proportional to their photon count.
  • They can go to the extreme of releasing electrons when they have a ZERO photon count.
  • They can mimic the output of their nearest neighbours.
  • They can be clustered together and produce random spurious specks of colour.

And the list goes on!

It’s a Question of Time

These errant little buggers basically misbehave because the combination of low photon count and overly long exposure time allow them to, if you like, run out of patience and start misbehaving.

It is quite common for a single photodiode or cluster of them to behave in a perfect manner for any shutter speed up to between 30 seconds and 2 minutes. But if we expose that same photodiode or cluster for 3 minutes it can show abnormal behaviour in its electron output.  Expose it for 5 minutes and its output could be the same, or amplified, or even totally different.
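One crude way to picture that time dependence is a leaky-diode model – my own toy illustration, and simpler than reality, where a diode can behave perfectly up to some threshold and only then go wrong:

```python
def diode_output(photons, seconds, dark_rate=0):
    """Toy photodiode: an ideal diode releases one electron per
    photon. A 'naughty' diode also leaks dark_rate spurious
    electrons per second, so its error grows with exposure time
    even when it sees no light at all."""
    return photons + dark_rate * seconds

# Both diodes sitting in deep shadow (zero photons):
print(diode_output(0, 30))                # well-behaved, 30 s      -> 0
print(diode_output(0, 30, dark_rate=2))   # naughty diode, 30 s     -> 60
print(diode_output(0, 240, dark_rate=2))  # naughty diode, 4 mins   -> 480
```

At 30 seconds the spurious signal may hide below the noise floor; at 4 minutes it has built into a visible bright speck.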

IMPORTANT – do not confuse these with so-called “hot pixels” which show up in all exposures irrespective of shutter duration.

Putting an ND filter in front of your lens is the same as shooting under less light.  Its effect is even-handed across all exposure values in the scene’s brightness range, and therein lies the problem.  Cutting 10 stops worth of photons from the highlights in the scene will still leave plenty to make the sensor work effectively in those areas of the image.

But cutting 10 stops worth of photons from the shadow areas – where there was perhaps 12 stops less to begin with – might well leave an insufficient number of photons in the very darkest areas to make those particular photodiodes function correctly.

Exposure is basically a function of Intensity and Time; back in my college days we used to say that Ex = I x T!

Our ND filter CUTS intensity across the board, so Time has to increase to avoid under exposure in general.  But because we are working with far fewer photons as a whole, we have to curb the length of the Time component BECAUSE OF the level of intensity reduction – we become caught in a “Catch 22” situation, trying to avoid the “time triggered” malfunction of those errant diodes.
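The arithmetic of that trade-off is simple to sketch – note that exact doubling per stop gives slightly longer times than the convenient round figures we all quote in the field:

```python
def nd_exposure(metered_seconds, nd_stops):
    """Each stop of ND halves the light reaching the sensor,
    so the required exposure time doubles per stop."""
    return metered_seconds * 2 ** nd_stops

metered = 1 / 30                                # unfiltered metered exposure
print(round(nd_exposure(metered, 6), 1))        # 6 stop ND  -> ~2.1 s
print(round(nd_exposure(metered, 10), 1))       # 10 stop ND -> ~34.1 s
print(round(nd_exposure(metered, 16) / 60, 1))  # 16 stop ND -> ~36.4 MINUTES
```

The jump from a 10 stop to a 16 stop filter is a factor of 64 in time – which is exactly where the errant diodes start to bite.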

Below is a 4 minute exposure from behind a Lee Big Stopper on a 1Dx – click on both images to open at full resolution in a new window.

Canon 1Dx
4 minutes @ f13
ISO 200 Lee 10stop

The beastly Nikon D800E fares a lot better under similar exposure parameters, but there are still a lot of repairs to be done:

A 4 minute exposure on a D800, f11 at 200 ISO

Most people use heavy ND filters for the same reason I do – smoothing out water.

The texture of the water in the top shot clutters the image and adds nothing – so get rid of it! D4, ISO 50, 30 secs, f11, Lee Big Stopper

Then we change the camera orientation and get a commercial shot:

Cemlyn Bay on the northwest coast of Anglesey, North Wales, Approximately 2.5 km to the east is Wylfa nuclear power station. Same exposure as above.

In this next shot all I’m interested in is the jetty; neither water surface texture nor horizon land adds anything – the land is easy to dump in PShop but the water would be impossible:

I see the bottom image in my head when I look at the scene top left. Again, the 10 stop ND fixes the water, which adds precisely nothing to the image. D4 ISO 50, 60 secs, f14 B&W 10 stop

The mistake folk make is this: 30 seconds is usually enough time to get the effect on the water you want, and 90 to 120 seconds is truly the maximum you should ever really need.  Any longer and you’ll get at best no more effect, and at worst the effect will not look as visually appealing – that’s my opinion anyway.

This time requirement dovetails nicely with the “operating inside the design envelope” physics of the average 35mm format sensor.

So, as I said before, we could go out on a bit of a limb and say that our sensors are all “photon limited”; all diodes on the sensor must be struck by x number of photons.

And we can regard them as being exposure length limited; all diodes on the sensor must be struck by x photons in y seconds in order to avoid the pitfalls mentioned.

So next time you have the idea of obtaining something really daft, such as the 16 stop ND filter my friend ordered, try engaging your brain.  An unfiltered exposure that meters out at 1/30th sec will be 30 seconds behind a 10 stop ND filter, and a whopping 32 minutes behind a 16 stop ND filter.  Now at that sort of exposure time the sensor noise in the image will be astonishing in both presence and variety!
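The doubling arithmetic behind those figures can be sketched in a few lines – a minimal illustration, not anyone's actual exposure calculator, and note that the 30 second and 32 minute figures quoted above are the usual rounded field approximations:

```python
# A minimal sketch of the ND arithmetic above: each stop of ND halves the
# light reaching the sensor, so the required exposure time doubles per stop.
def nd_exposure_seconds(metered_seconds, nd_stops):
    """Exposure time needed behind an ND filter of the given stop strength."""
    return metered_seconds * (2 ** nd_stops)

metered = 1 / 30  # unfiltered metered exposure of 1/30th sec

print(round(nd_exposure_seconds(metered, 10)))       # ~34 secs behind a 10 stop ND
print(round(nd_exposure_seconds(metered, 16) / 60))  # ~36 mins behind a 16 stop ND
```

Either way, the 16 stop figure lands you deep in the exposure territory where sensor noise piles up.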

As I posted on my Book of Face page the other day, just for kicks I shot this last Wednesday night:

Penmon Lighthouse in North Wales at twilight.
Sky is 90 secs, foreground is 4 minutes, D4, f16, ISO 50, B&W 10 stop ND filter

The image truly gives the wrong impression of reality – the wind was cold and gusting to 30mph, and the sea looked very lumpy and just plain ugly.

I spent at least 45 minutes just taking the bloody speckled colour read noise out of the 4 minute foreground exposure – I have to wonder if the image was truly worth the effort in processing.

When you take into account everything I’ve mentioned so far plus the following:

  • Long exposures are prone to ground vibration and the effects of wind on the tripod etc
  • Hanging around in places like the last shot above is plain dangerous, especially when it’s dark.

you must now see that keeping the exposures as short as possible is the sensible course of action, and that for doing this sort of work a 6 stop ND filter is a more sensible addition to your armoury than a 16 stop ND filter!

Just keep away from exposures above 2 minutes.

And before anyone asks, NO – you don’t shoot star trails in one frame over 4 hours unless you’re a complete numpty!  And for anyone who thinks you can cancel noise by shooting a black frame, think on this – the black frame has to be shot immediately after the image, and has to be the same exposure duration as the main image.  That means a 4 hour single frame star trail plus black frame to go with it will take at least 8 hours – will your camera battery last that long?  If it dies before the black frame is finished then you lose BOTH frames…
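The black frame technique is dark-frame subtraction: the camera (or you, in software) subtracts the noise-only frame from the real frame, pixel by pixel. A toy sketch with made-up numbers, just to show the principle – the function name and values are purely illustrative:

```python
# Sketch of "black frame" (dark-frame) subtraction with toy numbers:
# the dark frame records the sensor's fixed noise pattern over the same
# exposure duration, and is subtracted from the light frame pixel by pixel.
def subtract_dark_frame(light, dark):
    """Subtract a dark frame from a light frame, clamping at zero."""
    return [[max(l - d, 0) for l, d in zip(lrow, drow)]
            for lrow, drow in zip(light, dark)]

noise = [[5, 12], [8, 3]]                 # hypothetical hot-pixel values
light = [[105, 112], [108, 103]]          # scene signal (100) + that noise
print(subtract_dark_frame(light, noise))  # → [[100, 100], [100, 100]]
```

Which is exactly why the dark frame must match the main exposure's duration – a different duration records a different noise pattern and the subtraction no longer cancels.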

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Exposure Compensation

Exposure Compensation – that’s something else that cropped up once or twice for the chaps on my recent Norwegian Eagle workshop!

We had something like 420 or more dives from eagles during the trip, and very few if any were shot with flat metering, or 0Ev compensation.

What is Exposure Compensation, and why do we need to use it?

It all begins with this little button:

D3 Exposure Compensation button – Nikon, Canon and most others use the same symbol.

Pushing this button and rotating your main command dial will select a certain exposure compensation value.

Why do we need to use Exposure Compensation though?

Cameras, for all their complexity and “intelligent whatsits”, are basically STUPID!  They don’t know WHAT you are trying to photograph, or HOW you are trying to photograph it.

They make a lot of very basic assumptions about what you are trying to do – 99.99% of which are WRONG!

The camera does NOT know if you are trying to photograph:

  • A white cat in a coal shed
  • A black cat in a snow storm
  • A white cat in a snow storm
  • A black cat in a coal shed

All it sees is a frame full of various amounts of light and shade, and depending on your metering mode (which should always be Matrix/Evaluative – see post here) it gives you an “average mean exposure value”.

Take a general scene of fairly low contrast under flat overcast light:

A scene as WE see it.

The same scene as the camera METER sees it.

Lighter tones within the scene.

Some darker area tones within the scene.

The exposure is governed by the PREDOMINANT tone.

As discussed in the previous metering article mentioned earlier, only MATRIX/EVALUATIVE takes the entire frame area into account.

Okay, so that scene was fairly bland on the old tonal front, so let’s have a look at something a little more relevant:

Straight off the camera with no processing. 1/2000th @ f4 1600ISO +1.3Ev

As the camera metered the scene WITHOUT compensation.

Why would the image be so dark and under exposed?

Well here’s an approximation of the camera’s average tone “thought process”:

The approximate average value of the scene.

But if we look at some averages WITHIN the overall image:

Random tonal averages within the image.

We can see that the tonal values for the subject are generally darker than the average scene value, therefore the camera records those values as “under exposed”.

This is further compounded by the camera’s brain making the decision that the commonest tonal value MUST represent “mid grey” – which it DOESN’T; it’s lighter than that – and so the image is under exposed even further!

Now I’m not going to get into the argument about “what is mid grey” and do Nikon et al calibrate to 12%, 18%, 20% or whatever – to be honest it’s “neither here nor there” from our standpoint.

What is CRITICAL though is that we understand the old adage:

“Light Subject Dark Background = Under, or negative exposure compensation. And that Dark Subject Light Background = Over, or positive exposure compensation”.

Okay, but what are we actually doing?

In any exposure mode other than Manual mode, we are allowing the camera to meter the scene AND make the decision over which shutter speed or aperture to use depending on whether we have the camera in Av or Tv mode – that’s Canon-speak for A or S on Nikon.

If we are in shutter priority/S/Tv mode then the camera sets the aperture to give us its metered exposure – that thing that’s usually WRONG! – at the shutter speed we’ve selected.

If, as in the case above, we ADD +1.3Ev – one and one third stops of POSITIVE exposure compensation – the camera uses the shutter speed we’ve selected but then opens up the aperture WIDER than its “brain” wants it to.

How wide? 1.3 stops wider, thus allowing 1.3 stops more light onto the sensor during the exposure time.

If we were in Av/A or aperture priority mode then it’s the shutter speed that would take up the slack and become 1.3 stops SLOWER than the camera’s “brain” wanted it to be.
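The arithmetic in those two cases can be sketched as follows – a rough illustration with hypothetical function names, not camera firmware; aperture area doubles per stop, so the f-number scales by the square root of the stop factor, while shutter time simply doubles per stop:

```python
# Sketch of what +Ev compensation does to the "slack" variable:
# f-number scales by 2**(ev/2) (light-gathering area doubles per stop),
# shutter duration scales by 2**ev.
def compensated_aperture(f_number, ev):
    """New f-number after +ev stops of compensation in Tv/S mode."""
    return f_number / (2 ** (ev / 2))

def compensated_shutter(seconds, ev):
    """New shutter duration after +ev stops of compensation in Av/A mode."""
    return seconds * (2 ** ev)

print(round(compensated_aperture(4.0, 1.3), 1))     # f4 opens up to about f2.5
print(round(compensated_shutter(1 / 2000, 1.3), 5)) # 1/2000th slows to roughly 1/810th
```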

Here’s an example of negative exposure compensation:

1/3200th @ f4.5 1000ISO -1.3Ev exposure compensation.

In this particular shot we’re pointing towards the sun – a “dark subject, light background” positive exposure compensation scenario, or so you’d think.

But I want to “protect” those orange highlights in the water and the brightest tones in the eagle, so if I “peg those highlights” just over a stop below the top end of the camera’s tonal response curve then there is no way on earth they are going to “blow” in the final RAW file.

Manual Exposure mode can still furnish us with exposure compensation based on metering if we engage AUTO-ISO.  If we decide we want to shoot continuously with a high shutter speed and a set aperture at a fixed ISO then our exposures are going to be all over the place.  But if we engage AUTO-ISO and let the camera choose the ISO speed via the meter reading, we can use the exposure compensation adjustments just the same as we do in Av or Tv modes.

This gets us away from the problem of fixed ISO Tv mode running out of aperture in low light or when very high shutter speeds are needed; or conversely, stopping the aperture down too far when the sun comes out!  I’ll do a breakdown on this method of shooting later in the year – it’s not without its problems.
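The Auto-ISO route boils down to one more doubling rule – with shutter AND aperture pinned in Manual, the camera takes up exposure compensation by scaling ISO. A toy sketch, with an illustrative function name:

```python
# Sketch of the Auto-ISO route: with shutter and aperture both fixed in
# Manual mode, compensation is absorbed by scaling the ISO by 2**ev.
def compensated_iso(metered_iso, ev):
    """ISO the camera picks after applying ev stops of compensation."""
    return metered_iso * (2 ** ev)

print(compensated_iso(800, 1))   # +1Ev: ISO 800 becomes 1600
print(compensated_iso(800, -2))  # -2Ev: ISO 800 drops to 200
```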

Next time you get the chance to stand by a large lake or other body of water, just take a moment to notice that the water is dark in some places and light in others.  Ambient light falling on a moving subject can easily be very uniform, so the subject basically has the same exposure value all the time.  But it’s the changing brightness of the background as the subject moves across it that causes us to need exposure compensation.

People seem to think there’s some sort of “magic” at play when they come out with me and I’m throwing exposure compensation values at them.  But there’s no magic here folks, just an ability to see beyond “the subject, framing etc” and to actually “see the light” and understand it.

After all, when we click our shutters we are imaging light – the subject is, for the most part, purely incidental!

And there’s only one way you can learn to see light and grasp its implications for camera exposure, and that’s to practice.


Flash Duration – How Fast Can We Go

Flash duration – how long the burst of photons from the flash actually lasts – does seem to get a lot of people confused.

Earlier this year I posted an article on using flash HERE where the prime function of the flash was as a fill light. As a fill, flash should not be obvious in the images, as the main lighting is still the ambient light from the sun, and we’re just using the flash to “tickle” the foreground with a little extra light.

Flash as “fill” where the main lighting is still ambient daylight, and a moderate shutter speed is all that’s required. 1/800th sec @ f8 is plenty good enough for this shot.

Taking pictures is NEVER a case of just “rocking up”, seeing a shot and pressing the shutter; for me it’s a far more complex process whereby there’s a possible bucket-load of decisions to be made in between the “seeing the shot” bit and the “pressing the shutter” bit.

My biggest influencers are always the same – shutter speed and aperture, and the driving force behind these two things is light, and a possible lack thereof.

Once I make the decision to “add light” I then have to decide what role that additional light is going to take – fill, or primary source.

Obviously, in the shot above the decision was fill, and everything was pretty straightforward from there on; aperture/shutter speed selection is still dictated by the ambient lighting – I use the flash as a “light modifier”.

The duration of the flash is controlled by the TTL metering system, and that duration is fairly irrelevant here.

Let’s take a look at a different scenario.

The lovely Jo doing her 1930’s screen icon “pouty thing”. Flash is the ONLY light source in this image. 1/250th @ f9 ISO 100.

In this shot the lighting source is pure flash.  There’s very little in the way of ambient light present in this dark set, and what bit there is was completely over-powered by the flash output – so the lighting from the Elinchrom BX 500 monoblocks being used here is THE SOLE light source.

Considerations over the lighting itself are not the purpose of this post – what we are concerned with here are the implications for shutter speed due to flash synchronization.

The flash units were the standard type of studio flash unit offering no TTL interface with the camera being used, so it’s manual everything!

But the exposure in terms of shutter speed is capped at 1/250th of a second due to the CAMERA – that is its highest synch speed.

The focal length of the lens is 50mm so I need to shoot at around f8 or f9 to obtain workable depth of field, so basic exposure settings are dictated.  This particular shot was achieved by balancing the light-to-subject distance along the lines of the inverse square law for each light.
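That inverse square balancing can be sketched with toy numbers – a hypothetical helper, purely to illustrate why small changes in light-to-subject distance matter so much:

```python
# The inverse square law mentioned above: illumination on the subject
# falls off as 1 / distance squared.
def relative_illumination(distance, reference_distance=1.0):
    """Light intensity relative to what the subject receives at the reference distance."""
    return (reference_distance / distance) ** 2

print(relative_illumination(2.0))  # 0.25 – double the distance, a quarter of the light (-2 stops)
print(round(2 ** 0.5, 2))          # 1.41 – move a light ~1.41x further away to lose 1 stop
```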

But from the point of view of this post the big consideration is this – can I afford to have movement in the subject?

At 1/250th sec you’d think not.  Then you’d think “hang on, flash durations are a lot faster than that” – so perhaps I can… or can I?

Flash Duration & Subject Movement

Flash duration, in terms of action-stopping power, is not as simple or straightforward as you might think.

Consider the diagram below:

Flash Power Output curve plotted against Output duration (time).

The grey shaded area in the diagram is the “power output curve” of the flash.

Most folk think that a flash is an “instant on, instant off” kind of thing – how VERY wrong they are!

When we set the power output on either the back panel of our SB800/580EX etc, or on the power pack of a studio flash unit, or indeed any other flash unit, we are setting a peak output limit.

We might set a Nikon SB800 to 1/4 power, or we might set channel B output on a Quadra Ranger to 132 watt-seconds, but either way we are dictating the maximum flash output power – the peak output limit.  The “t5 time” – or to be more correct the “t0.5 time” – is the total duration for which the flash output is at or above 50% of the selected peak output limit we set.

Just to clarify: we set, say, 1/4 power output on the back of a Canon 580EX – this is the selected peak output limit. The t5 time for this is the total duration for which the light output is at or above 50% of that selected 1/4 power – NOT 50% of the flash unit’s full power output – do not get confused over this!

So when it comes to total “light emission duration” we’ve got 3 different ways of looking at things:

  1. Total – and I mean TOTAL – duration; the full span of the output curve.
  2. t0.5 – the duration for which the flash output is at or above 50% of the level set by the user – the peak output limit.
  3. t0.1 – the duration for which the flash output is at or above 10% of that level.

Anyone looking at the diagram above can see that the total output emission time/flash duration is A LOT LONGER than the t5 time.  Usually you find that t5 times are somewhere around 1/3rd of the total emission time, or flash duration.

Getting back to our shot of Jo above, if my memory serves me correctly the BX heads I used for the shot had a t5 time of around 1/1500th sec.  So the TOTAL duration of the flash output would be around 1/500th sec.

So I can’t afford to have any movement in the subject that isn’t going to be arrested by 1/500th sec flash duration, let alone the 1/250th shutter speed.

Why? Well, that 1/250th sec the shutter is open will comprise 1/500th sec of flash photons entering the lens, and 1/500th sec of NOTHING entering the lens but AMBIENT LIGHT photons.

Let us break flash output down a bit more:

In the previous article I mentioned, I quoted a table of Nikon SB800 duration times.  At the top of the table was the SB800 1/1 or full output power flash duration.  All times quoted in that table were t5 times.

The one I want to concentrate on is that 1/1 full power t5 time of 1/1050th sec.

Even though Nikon try to tempt you into believing that the flash only emits light for 1/1050th sec, it does in fact light the scene for a full 1/350th sec – most flash manufacturers’ units are quoted as t5 times.

Now in most cases when you might employ flash – which, let’s face it, is as some sort of fill light in a general ambient/flash mixed exposure – this isn’t, in reality, a big problem.  Reduced power multiple pulse AutoFP/HSS also makes it not a problem.

But if you are trying to stop high speed action – in other words “freeze time”, then it can become a major headache; especially when you need all the flash power you can get hold of.

Why? Let’s break the diagram above down to basics.

The darker shaded area represents the “tail” of the flash output – the area that can cause many problems when trying to stop high speed action.

  • The first 50% of the total light output is over and finished in the first 1/1050th sec of the flash duration.
  • The other 50% of the total light output takes place over a further 1/525th sec, and is represented by the dark grey area – let’s call this area the flash “output tail”.  Some publications & websites refer to this tail as after-glow.  I always thought that ‘after glow” was something ladies did after a certain type of energetic activity!
  • The light will continue to decay for a full 1/525th sec after t5, until the output of light has died down to 0% and the full “burn time” of 1/350th sec has been reached.

That’s right – 1/1050th + 1/525th = 1/350th.
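You can check that arithmetic exactly with Python's `fractions` module – a quick sanity check of the figures above, nothing more:

```python
from fractions import Fraction

# Checking the SB800 full-power burn-time arithmetic exactly, using the
# rule of thumb that total emission is roughly 3x the quoted t0.5 time.
t_half = Fraction(1, 1050)  # quoted t0.5 ("t5") duration at 1/1 power
tail = Fraction(1, 525)     # the output "tail" after the 50% threshold
total = t_half + tail       # full "burn time"

print(total)           # 1/350
print(total / t_half)  # 3 – total duration is 3x the t0.5 time
```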

So, if our shutter speed is 1/350th sec or longer we are going to see some ghosting in our image caused by the movement of the subject during that extra 1/525th sec post t5 time.

I need to point out that most speedlight-type flash units are “insulated-gate bipolar transistor” devices – that’s IGBT to you and me. Einstein studio flash units are also IGBT units – I’ll cover the implications of this in a later post, but for now you just need to know that the IGBT circuitry works to eliminate the output tail below the t5 threshold, BUT it doesn’t work if your speedlight is set to output at maximum power.  And if you need access to full 1/1 power with your speedlights for any reason then IGBT won’t help you.

Let’s see the problem in action as it were:

A bouncing golf ball shot at 1/250th sec using full power output on an SB800.
The ball is moving UPWARDS.
The blur between points A & B is caused by the “tail” or “after-glow” of the flash.

And the problem will be further exacerbated if there is ANY ambient light in scene from a window for instance, as this will boost the general scene illumination during that “tail end” 1/525th sec.

We might be well advised, if using any form of non-TTL flash mode, to use a shutter speed equal to, or shorter in duration than, the t5 time, as in the shot below:

A bouncing golf ball shot at 1/2000th sec using full power output on an SB800.

All I’ve done in this second shot is go -3Ev on the shutter speed, +1Ev on the aperture and +2Ev on ISO speed.

Don’t forget, the flash is in MANUAL mode with a full power output.

With the D4 in front-curtain synch the full power, 1/350th sec flash pulse begins as the front shutter curtain starts to move, and it “burns” continuously while the 1/2000th sec “letter-box” shutter-slot travels across the sensor.

In both shots you may be wondering how I triggered the exposure. Sitting on the desk you can see a small black box with a jack plug sticking out the back – this is the audio sensor of a TriggerSmart audio/light/Infra Red combined trigger system.  As the golf ball strikes the desk the audio sensor picks up the noise and the control box triggers the camera shutter and hence the flash.

Hardy, down at the distributors, Flaghead, has been kind enough to send me one of these systems for incorporation into some long-term photography projects, and in a series of high speed flash workshops and training tutorials.  And I have to say that I’m mighty impressed with the system, and at the retail price point ownership of this product is a no-brainer.  The unit is going to feature in quite a few blog posts in the near future, but click HERE to email Hardy for more details.

Even though I constantly extol the virtues of the Nikon CLS system, there comes a time when its automatic calculations fight AGAINST you – and easy high speed photography becomes something of a chore.

Any form of flash exposure automation makes assumptions about what you are trying to do.  In certain circumstances these assumptions are pretty much correct.  But in others they can be so far wide of the mark that if you don’t turn the automation OFF you’ll never get the shot you want.

Wresting full control over speedlights from the likes of Nikon’s CLS gives you access to super-high-speed flash durations AND high shutter speeds without a lot of the synching problems incurred with studio monoblocks.

Liquid in Motion – arrested at 1/8000th sec shutter speed using SB800’s at full 1/1 power.

Liquid in Motion – arrested at 1/8000th sec shutter speed using SB800’s at full 1/1 power. A 100% crop from the shot above.

“Scotch & Rocks All Over The Place”
Simple capture with manual speed lights at full power and 1/8000th shutter speed.

The shots above are all taken with 2x SB800s lighting the white background and 1 heavily diffused SB800 acting as a top light.

One background light is set at 1/1 manual FP, the other to manual 1/1 SU-4 remote.  The top light is set to 1/8 power SU-4 remote.

The majority of the light in the shot in fact comes from that white background – it’s punching light back through the glass and liquid splash – the subject is backlit.

So, that background is being lit for a full 1/350th of a second.

But shooting in front curtain synch I’m using 1/8000th sec as a shutter speed, an exposure duration 3 stops shorter than the flash unit t5 time for full power. So in effect I’m using the combined background flash units as a very short-term continuous light source which lasts for 1/350th of a second, but the camera is only recording the very first 1/8000th sec – in other words, photons are still leaving the flash AFTER the rear shutter curtain has closed and the exposure is finished.

Finally, the shutter and flash are triggered by dropping the faux crushed ice through the IR sensor beam of the TriggerSmart unit.

This is very much along the lines of what’s termed HYPERSYNCH – a technique you can use with conventional slow-burn studio flash units and certain types of 3rd party trigger units such as Pocket Wizards – but that’s yet another story, and is fraught with synch problems that you have to program out of the system using the Pocket Wizard utility.

So, there’s more to come from me about flash in future posts, but for now just remember – there’s not a lot you can’t do with speed lights – as long as you’ve got enough of the little darlings!


What Shutter Speed?

Shutter speed, and the choices we make over it, can have a profound effect on the outcome of the final image.

Now everyone has a grasp of shutter speed and how it relates to subject movement – at least I hope they do!

We can either use a fast shutter speed to freeze constant action, or we can use a slow shutter speed to:

  • Allow us to capture movement of the subject for creative purposes
  • Allow us to use a lower ISO/smaller aperture when shooting a subject with little or no movement.

 

Fast Shutter Speed – I need MORE LIGHT Barry!

1/8000th sec @ f8, Nikon D4 and 500mm f4

Good strongish sunlight directly behind the camera floods this Red Kite with light when it rolls over into a dive.  I’m daft enough to be doing this session with a 500mm f4 that has very little in the way of natural depth-of-field so I opt to shoot at f8.  Normally I’d expect to be shooting the D4 at 2000iso for action like this but my top end shutter speed is 1/8000th and this shutter speed at f8 was slightly too hot on the exposure front, so I knocked the ISO down to 1600 just to protect the highlights a little more.

Creative Slow Shutter Speed – getting rid of light.

1/5th sec @ f22

I wanted to capture the movement in a flock of seagulls taking off from the water, so now I have to think the opposite way to the Kite shot above.

Firstly I need to think carefully about the length of shutter speed I choose: too short and I won’t capture enough movement; and too long will bring a vertical movement component into the image from me not being able to hold the camera still – so I opt for 1/5th sec.

Next to consider is aperture.  Diffraction on a deliberate motion blur has little impact, but believe it or not focus and depth of field DO – go figure!

So I can run the lens at f16/20/22 without much of a worry, and 100 ISO gets me the 1/5th sec shutter speed I need at f22.

 

Slow Shutter + Rear Curtain Synch Flash

We can use a combination of both techniques in one SINGLE exposure with the employment of flash, rear curtain synch and a relatively slow shutter speed:

6/10th sec @ f3.5 -1Ev rear curtain synch flash

A technique the “Man Cub” uses to great effect in his nightclub photography, here he’s rotated the camera whilst the shutter is open, thus capturing the glowing LEDs and other highlights as circular trails.  As the shutter begins to close, the scene is lit by the 1/10,000th sec burst of light from the reduced power, rear curtain synched SB800 flash unit.

But things are not always quite so cut-and-dried – are they ever?

Assuming the lens you use is tack sharp and the subject is perfectly focused there are two factors that have a direct influence upon how sharp the shot will be:

  • System Vibration – caused by internal vibrations, most notably from the mirror being activated.
  • Camera Shake – caused by external forces like wind, ground vibration or you not holding the camera properly.

Shutter Speed and System Vibration

There was a time when we operated on the old adage that the slowest shutter speed you needed for general hand held shooting was equal to 1/focal length.

So if you were using a 200mm lens you shot with a minimum shutter speed of 1/200th sec, and, for the most part, that rule served us all rather well with 35mm film; assuming of course that 1/200th sec was sufficient to freeze the action!

Now this is a somewhat optimistic rule and assumes that you are hand holding the camera using a good average technique.  But put the camera on a tripod and trigger it with a cable or remote release, and it’s a whole new story.

Why?  Because sticking the camera on a tripod and not touching it during the exposure means that we have taken away the “grounding effect” of our mass from the camera and lens; thus leaving the door open for system vibration to ruin our image.

 

How Does System Vibration Affect an Image?

Nowadays we live in a digital world with very high resolution sensors instead of film, and the very nature of a sensor – its pixel structure (to use common parlance) – has a direct influence on minimum shutter speed.

So many camera owners today have the misguided notion that using a tripod is the answer to all their prayers in terms of getting sharp images – sadly this ain’t necessarily so.

They also have the other misguided notion that “more megapixels” makes life easier – well, that definitely isn’t true!

The smallest detail that can be recorded by a sensor is a point of light in the projected image that has the same dimensions as one photosite/pixel on that sensor. So, even if a point is SMALLER than the photosite it strikes, its intensity or luminance will affect the whole photosite.

A point of light smaller than 1 photosite (left) has an effect on the whole photosite (right).

If the lens is capable of resolving this tiny detail, our sensor – in this case (right) – isn’t, and so the lens out-resolves the sensor.

But let’s now consider this tiny point detail and how it affects a sensor of higher resolution; in other words, a sensor with smaller photosites:

The same detail projected onto a higher resolution sensor (right). Though not shown, the entire photosite will be affected, but its surface area represents a much smaller percentage of the whole sensor area – the sensor now matches the lens resolution.

Now this might seem like a good thing; after all, we can resolve smaller details.  But, there’s a catch when it comes to vibration:

A certain level of vibration causes the small point of light to vibrate. The extremes of this vibration are represented by the outline circles.

The degree of movement/vibration/oscillation is identical on both sensors; but the resulting effect on the exposure is totally different:

The same level of vibration has more effect on the higher resolution sensor.

If you read the earlier post on sensor resolution and diffraction HERE you’ll soon identify the same concept.

The upshot of it all is that “X” level of internal system vibration has a greater effect on a higher resolution sensor than it does on a lower resolution sensor.

Now what’s all this got to do with shutter speed, I hear you ask.  Well, whereas 1/focal length used to work pretty well back in the day, we need to advance the theory a little.

Let’s look at four shots from a Nikon D3, shot with a 300mm f2.8, mounted on a tripod and activated by a remote (so no finger-jabbing on the shutter button to affect the images).

Also please note that the lens is MANUALLY FOCUSED just once, so focus is set sharply on the same place for all 4 shots.

These images are full resolution crops; I strongly recommend that you click on all four images to open them in new tabs and view them sequentially.

Shutter = 1/1x (1/320th) Focal Length. No VR, No MLU (Mirror Lock Up). Camera on Tripod+remote release.

Shutter = 1/2x (1/640th) Focal length. No VR. No MLU. Camera on Tripod+remote release.

Shutter = 1/2x Focal length + VR. No MLU. Camera on Tripod+remote release.

Shutter = 1/2x Focal length. Camera on Tripod+remote release + MLU – NO VR + Sandbag.

Now the thing is, the first shot at 1/320th looks crap because it’s riddled with system vibration – mainly a result of what’s termed ‘mirror slap’.  These vibrations travel up the lens barrel and are then reflected back by the front of the lens.  You basically end up with a packet of vibrations running up and down the lens barrel until they eventually die out.

These vibrations in effect make the sensor and the image being projected onto it ‘buzz, shimmy and shake’ – thus we get a fuzzy image; and all the fuzziness is down to internal system vibration.

We would actually have got a sharper shot hand holding the lens – the act of hand holding kills the vibrations!

As you can see in shot 2 we get a big jump in vibration reduction just by cranking the shutter speed up to 2x focal length (actually 1/640th).

The shot would be even sharper at 3x or 4x, because the vibrations are of a set frequency and thus speed of travel, and the faster the shutter speed we use the sooner we can get the exposure over and done with before the vibrations have any effect on the image.
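That relationship between exposure length and vibration can be sketched numerically: the shorter the exposure window, the less of the decaying vibration gets recorded.  The vibration frequency and decay constant below are purely illustrative guesses, not measured figures for any particular camera.

```python
import math

def rms_blur_during_exposure(shutter_speed, vib_freq=120.0, decay_tau=0.05, samples=10000):
    """Estimate relative blur as the RMS displacement of a decaying
    'mirror-slap' vibration integrated over the exposure window.
    vib_freq (Hz) and decay_tau (s) are illustrative values only."""
    t_exp = 1.0 / shutter_speed
    total = 0.0
    for i in range(samples):
        t = t_exp * i / samples
        # damped sine: the vibration dies away while the shutter is open
        total += (math.exp(-t / decay_tau) * math.sin(2 * math.pi * vib_freq * t)) ** 2
    return math.sqrt(total / samples)

for s in (320, 640, 1280):
    print(f"1/{s}s  relative blur = {rms_blur_during_exposure(s):.3f}")
```

Whatever numbers you plug in, the trend is the same: double the shutter speed and the exposure finishes before the vibration packet has done its worst.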

We can employ ‘mirror up shooting’ as a technique to combat these vibrations; by lifting the mirror and then pausing to give the vibrations time to decay; and we could engage the lens VR too, as with the 3rd shot.  Collectively there has been another significant jump in overall sharpness of shot 3; though frankly the VR contribution is minimal.

I’m not a very big fan of VR !

In shot 4 you might get some idea why I’m no fan of VR.  Everything is the same as shot 3 except that the VR is OFF, and we’ve added a 3lb sandbag on top of the lens.  This does the same job as hand holding the lens – it kills the vibrations stone dead.

When you are shooting landscapes with much longer exposures/shutter speeds THE ONLY way to work is tripod plus mirror up shooting AND if you can stand to carry the weight, a good heavy sand bag!

Shot 4 would have been just as sharp if the shutter had been open for 20 seconds, just as long as there was no movement at all in the subject AND there was no ground vibration from a passing heavy goods train (there’s a rail track between the camera and the subject!).

For general tripod shooting of fairly static subjects I was always confident of sharp shots on the D3 (12Mp) at 2x focal length.

But since moving to a 16Mp D4 I’ve now found that sometimes this lets me down, and that 2.5x focal length is a safer minimum to use.

But that’s nothing compared to what some medium format shooters have told me; where they can still detect the effects of vibration on super high resolution backs such as the IQ180 etc at as much as 5x focal length – and that’s with wide angle landscape style lenses!
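Those rule-of-thumb multipliers can be wrapped in a trivial helper.  This is just a sketch of the guideline above, not an official formula:

```python
def min_shutter_speed(focal_length_mm, multiplier=2.0):
    """Minimum tripod shutter speed as 1/(multiplier x focal length).
    Rough multipliers from experience: 2.0 for a ~12Mp body like the
    D3, 2.5 for the 16Mp D4, reportedly up to 5.0 on super high
    resolution medium format backs."""
    return 1.0 / (multiplier * focal_length_mm)

# 300mm on a D3: 1/600th sec or faster
print(min_shutter_speed(300, 2.0))   # 1/600 s
# Same lens on a D4: 1/750th sec or faster
print(min_shutter_speed(300, 2.5))   # 1/750 s
```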

So, overall my advice is to ALWAYS push for the highest shutter speed you can possibly obtain from the lighting conditions available.

Where this isn’t possible you really do need to perfect the skill of hand holding – once mastered you’ll be amazed at just how slow a shutter speed you can use WITHOUT employing the VR system (VR/IS often causes far more problems than it would apparently solve).

For long lens shooters the technique of killing vibration at low shutter speeds when the gear is mounted on a tripod is CRITICAL, because without it, the images will suffer just because of the tripod!

The remedy is simple – it’s what your left arm is for.

So, to recap:

  • If you shoot without a tripod, the physical act of hand holding – properly – has a tendency to negate internal system vibrations caused by mirror slap etc just because your physical mass is in direct contact with the camera and lens, and so “damps” the vibrations.
  • If you shoot without a tripod you need to ensure that you are using a shutter speed fast enough to negate camera shake.
  • If you shoot without a tripod you need to ensure that you are using a shutter speed fast enough to FREEZE the action/movement of your subject.

 

Camera Shake and STUPID VR!

Now I’m going to have to say at the outset that this is only my opinion, and that this is pointed at Nikon’s VR system, and I don’t strictly know if Canon’s IS system works on the same math.

And this is not relevant to sensor-based stabilization, only the ‘in the lens’ type of VR.

The mechanics of how it works are somewhat irrelevant, but what is important is its working methodology.

Nikon VR works at a frequency of 1000Hz.

What is a “hertz”?  Well 1Hz = 1 full frequency cycle per second.  So 1000Hz = 1000 cycles per second, and each cycle is 1/1000th sec in duration.


Full-cycle sine wave showing 1, 0.5 & 0.25 cycles.

Now then, here’s the thing.  The VR unit is measuring the angular momentum of the lens movement at a rate of 1000 times per second. So in other words it is “sampling” movement every 1/1000th of a second and attempting to compensate for that movement.

But Nyquist-Shannon sampling theory – if you’re up for some mind-warping click HERE – says that effective sampling can only be achieved at half the working frequency – 500 cycles per second.

What is the time duration of one cycle at a frequency of 500Hz?  That’s right – 1/500th sec.

So basically, for normal photography, VR ceases to be of any real use at any shutter speed faster than 1/500th.
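The arithmetic of that argument fits in a few lines.  Note this is purely a restatement of the Nyquist reasoning above, not any published Nikon specification:

```python
def vr_useful_threshold(vr_sample_rate_hz=1000.0):
    """Per the Nyquist argument above: a VR unit sampling lens
    movement at 1000 Hz can only effectively sample motion at half
    that rate, so the shortest exposure it can usefully work across
    is one effective cycle long."""
    effective_hz = vr_sample_rate_hz / 2.0
    return 1.0 / effective_hz   # exposures shorter than this gain little from VR

print(vr_useful_threshold())   # 0.002 s, i.e. 1/500th
```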

Remember shot 3 with the 300mm f2.8 earlier – I said the VR contribution at 1/640th was minimal?  Now you know why I said it!

Looking again at the frequency diagram above, we may get a fairly useful sample at 1/4 working frequency – 1/250th sec; but other than that my personal feeling about VR is that it’s junk – under normal circumstances it should be turned OFF.

What circumstances do I class as abnormal? Sitting on the floor of a heli doing aerial shots out of the open door springs to mind.

If you are working in an environment where something is vibrating YOU while you hand hold the camera then VR comes into its own.

But if it’s YOU doing the vibrating/shaking then it’s not going to help you very much in reality.

Yes, it looks good when you try it in the shop, and the sales twat tells you it’ll buy you three extra stops in shutter speed so now you can get shake-free shots at 1/10th of a second.

But unless you are photographing an anaesthetized sloth or a statue, that 1/10th sec shutter speed is about as much use to you as a hole in the head. VR/IS only stabilizes the lens image – it doesn’t freeze time and stop a bird from flapping its wings, or indeed a bride’s veil from billowing in the breeze.

Don’t get me wrong; I’m not saying VR/IS is a total waste of time in ALL circumstances.  But I am saying that it’s a tool that should only be deployed when you need it, and YOU need to understand WHEN that time is; AND you need to be aware that it can cause major image problems if you use it in the wrong situation.

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

In Conclusion


1/2000th sec is sufficient to pretty much freeze the forward motion of this eagle, but not the downward motion of the primary feathers.

This rather crappy shot of a White-tailed eagle might give you food for thought, especially if compared with the Red Kite at the start of the post.

The primary feathers are soft because we’ve run out of depth of field.  But, notice the motion blur on them too?  Even though 1/2000th sec in conjunction with a good panning technique is ample to freeze the forward motion of the bird, that same 1/2000th sec is NOT fast enough to freeze the speed of the descending primary feathers on the end of that 4 foot lever called a wing.

Even though your subject as a whole might be still for 1/60th sec or longer, unless it’s dead, some small part of it will move.  The larger the subject is in the frame, the more apparent that movement will be.

Getting good sharp shots without motion blur in part of the subject, or camera shake and system vibration screwing up the entire image is easy; as long as you understand the basics – and your best tool to help you on your way is SHUTTER SPEED.

A tack sharp shot without blur but full of high ISO noise is vastly superior to a noiseless shot full of blur and vibration artefacting.

Unless it’s done deliberately of course – “H-arty Farty” as my mate Ole Martin Dahle calls it!


Metering Modes Explained

Camera Metering Modes


I always get asked about which camera metering mode I use,  and to be honest, I think sometimes the folk doing the asking just can’t get their heads around my simplistic, and sometimes quite brutal answers!

“Andy, it’s got to be more complicated than that surely….otherwise why does the camera give me so many options…?”

Well, I always like to keep things really simple, mainly because I’m not the brightest diamond in the jewellery shop, and because I’m getting old and most often times my memory keeps buggering off on holiday without telling me!

But before I espouse on “metering the Uncle Andy way” let’s take a quick look at exactly how the usual metering options work and their effects on exposure.

The Metering Modes

  • Average (a setting usually buried in the center-weighted menu)
  • Spot
  • Center-weighted
  • 3D Matrix (Nikon) or Evaluative (Canon)
Metering Mode Icons

You can continue reading this article FREE over on my public Patreon posts pages.  Just CLICK HERE

Nikon D4S

The new Nikon D4S announced today

 

Nikon D4S left & D4 right

Well, that’s about right, my sexy Nikon D4 is officially out of date, and thanks to the Nikon D4S I’ve just lost a grand off the resale value of my camera – cheers chaps…..

Is Uncle Andy stressed at all about being kitted out with yesterday’s gear?

Nope, not really.

So what’s new on the Nikon D4S ?

  • Well there’s been a few ergonomic tweaks which basically mean nothing for starters.
  • Seemingly dispelled are the rumours that it would have a higher Mp count – apparently this stays the same at 16.2Mp.
  • I was expecting some major change in AF but no, they’ve kept the venerable Multi-Cam 3500FX system.
  • New sensor design.
  • BUT – they’ve changed the image processor to Expeed 4 from Expeed 3.
  • AND – they’ve changed the battery from EN-EL18 to an EN-EL18a.

Bear in mind all I’m going on is the web – perish the thought that Nikon would ever think my opinion worthy of note and ACTUALLY SEND ME ONE.

Other changes:

  • A new Group Area AF mode – which from my own photography PoV is fairly meaningless, seeing as we already have 9 point dynamic AF – I can’t see it’ll make much difference. Plus, the Group AF mode always focusses on the nearest point – something you rarely want the camera to do!
  • 6 possible white balance presets as opposed to 3 on the D4 – I jam all my cameras into Cloudy B1 custom WB and leave them there – so this improvement isn’t worth jumping up and down about either.
  • Fairly gimmicky S Raw
  • Spot White Balance

On the storage front most reports say that the D4S carries over the D4’s crazy arrangement of 1x CF plus 1x XQD.

My Basic Thoughts:

New Sensor – well the benefits can’t be seen by yours truly until I see a few RAW files from it – preferably taken by myself.

I’m glad they’ve kept it to 16.2Mp – if you crunch the numbers this is the optimum Mp count for an FX sensor – as Canon worked out aeons ago with the 1DsMk2; but then joined the stupid Mp race.

Image Processor changes – well, it’s reportedly 30% faster than the Expeed 3, which basically means that the D4S fires off images to storage 30% faster.

Now I can go out with the D4 and shoot getting on for 100 uncompressed 14bit RAW files in one continuous burst at 8 or 9 fps – do I want to chew through my storage any faster?  NO!
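A back-of-envelope throughput figure shows why faster writing isn’t much of a draw.  This ignores compression, file headers and sensor masking, so treat it strictly as a ballpark:

```python
def raw_data_rate_mb_s(mp=16.2, bit_depth=14, fps=9):
    """Rough uncompressed RAW data rate: pixels x bit depth,
    converted to bytes, multiplied by frame rate.  Ballpark only."""
    bytes_per_frame = mp * 1e6 * bit_depth / 8
    return bytes_per_frame * fps / 1e6

print(f"{raw_data_rate_mb_s():.0f} MB/s")   # ~255 MB/s
```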

The Expeed 4 gives better high ISO performance?

Well perhaps it does, but I look at it this way.  If light is so damn low that you need to shoot at crackpot ISO numbers then you can say one thing – the light is crap.

If the light is crap then the image will look like crap – it’s just that with the Expeed 4 it’ll be slightly less noisy crap.

If I can pull 1/8000th sec at f7 or f8 at 3200ISO in half-decent looking light using a D4 – which I do regularly – then why do I need a higher ISO capability?

The Red Squidger images you’ve seen in the previous blog articles are all 2000ISO and there is ZERO noise degradation – so again, why do I need more ISO capability?

Now if I was a ‘jobbing’ photo-journalist, or I was embedded with the troops in Afghanistan or something of that ilk then I’d perhaps have a much different attitude.

But I’m not, and from my own perspective of wildlife & natural history photography these changes are of little interest to me – especially when they have a £5k price tag.

Battery Changes

There was always a persistent gripe about the battery life of the D4 EN-EL18 power cell – well, I’ve got two of them and have had no problems AT ALL with batteries running low.

I was REALLY annoyed that they switched from EN-EL4A D2/D3 style batteries – I’d got a handful of those already, and now when I go to Norway in June I’ve got to take 2 bloody chargers with me: yes the venerable D3 will be getting a summer holiday this year as second camera.

So, for me at least, the increased battery life of the new Nikon D4S 18a batteries is somewhat inconsequential – why do I want a battery that lasts longer than ‘for ever’ ??

Other Changes/Additions

I can’t see anything that excites me:  spot white balance?  Go and buy a Colour Checker Passport and do the job right – and that doesn’t cost £5k either (though they are a bit pricey).

Group Area AF – do me a favour (see above).

6 White Balance presets – what’s the point?

All of the above could be given away by Nikon as a firmware update for the D4 if they fancied being generous!

What I Would Have Got Excited About.

Twin UDMA 7 CF card slots and an XQD slot for dedicated video recording.

An improved AF module.

The ability to select ‘matched pairs’ of sensors – Canon offered this years ago and it was brilliant.

Internally recorded FX video of EXACTLY the same quality as that of a Canon 5D3, or at least the same quality as internal 1080p CROP.

AF mode selector back WHERE IT SHOULD BE!

Me being put in charge at Nikon!

In Conclusion

Do I want to buy one (even if I had the dough) – NO!

Do I wish I could afford one – NO!

Would I swap my D4 for a D4S – well of course I would.

Seriously though, I can just see an awful lot of people getting “hot under the collar” and stressing over this latest incarnation of this pro body from Nikon; but seriously, if you are then you need to just take a quiet step back and think about things calmly.

There is nothing – IMHO of course – on the D4S that warrants upgrading from the D4 – unless you have a penchant for spending your money that is.

But if you are still on a D3 or something older, and were thinking about buying a D4 – then hold off a while until the D4S is available; it makes better fiscal sense.
