Weird Taranaki Cloud

Addendum added at the end of this post on 20 Dec 2010

Lesley of New Plymouth recently sent in some cloud images taken by cellphone. They were taken at around 7:50am on Thursday 18 November 2010, and look a bit like someone or something had been slicing the clouds. The first, below, was taken from the Waimea Street/Brois Street roundabout, looking roughly east-south-east (to the right of the morning sun).

The second, above, was taken on Huatoki Street outside Vogeltown School facing in the same direction.

The bearing of sunrise at New Plymouth on 18 November was 116 degrees true, and by 7:50 am the bearing was around 99 degrees, at an elevation of 20 degrees above the horizon.
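Those bearings and elevations can be reproduced with standard low-precision solar-position formulas. Here is a rough sketch (the New Plymouth coordinates, and the simple declination and equation-of-time approximations used, are assumptions; expect accuracy of a degree or so):

```python
import math

def solar_position(lat_deg, lon_deg, day_of_year, clock_hours, utc_offset_hours):
    """Low-precision solar elevation and azimuth (degrees), from standard
    approximations for declination and the equation of time."""
    rad, deg = math.radians, math.degrees
    # Solar declination, Cooper-style approximation (degrees)
    decl = -23.44 * math.cos(rad(360.0 / 365.0 * (day_of_year + 10)))
    # Equation of time (minutes)
    b = rad(360.0 / 365.0 * (day_of_year - 81))
    eot_min = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)
    # Apparent solar time (hours): clock time corrected for longitude and equation of time
    solar_time = clock_hours + (lon_deg / 15.0 - utc_offset_hours) + eot_min / 60.0
    hour_angle = rad((solar_time - 12.0) * 15.0)  # negative before solar noon
    phi, d = rad(lat_deg), rad(decl)
    sin_elev = (math.sin(phi) * math.sin(d)
                + math.cos(phi) * math.cos(d) * math.cos(hour_angle))
    elev = math.asin(sin_elev)
    # Azimuth measured from true north; before solar noon the sun is east of north
    cos_az = (math.sin(d) - sin_elev * math.sin(phi)) / (math.cos(elev) * math.cos(phi))
    az = deg(math.acos(max(-1.0, min(1.0, cos_az))))
    return deg(elev), az

# New Plymouth (about 39.06 S, 174.07 E), 7:50am NZDT (UTC+13), 18 Nov = day 322
elev, az = solar_position(-39.06, 174.07, 322, 7 + 50 / 60.0, 13)
# elev comes out near 20 degrees, az near 100 degrees true, matching the post
```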

Rough map of details in photo.

Explanation:

Let’s divide the image into three interesting parts.

And look at the weather map and satellite image.

On 18 November the weather map shows that an anticyclone (or high) was departing to the east of New Zealand and a warm-frontal zone was approaching from the north.  The barometer was steady at around 1020 hPa, but the clouds above Taranaki were in a state of flux.  There was still some sinking air aloft caused by the high, but also some layers of rising moist air caused by the approaching front.

One of these layers was indicated by a layer of cloud called Altocumulus (alto = middle, cumulus = in heaps). This is sometimes called a mackerel sky because it looks similar to the scale patterns seen on a fish. This cloud layer had a base that was measured by MetService at New Plymouth airport to vary between 17,000 and 21,000 feet (in aviation meteorology altitude is measured in feet and horizontal visibility is measured in metres), well above the top of Mount Taranaki (8,260 feet).

The droplets in this cloud would have formed from water vapour condensing onto small aerosols (dust or salt particles) to make tiny liquid spheres as the layer of air rose and cooled. Even though these cloud droplets had floated higher in the sky than the freezing level, they were still liquid; such cloud droplets are called supercooled.

The zoom (above), taken from the 8am image from the MTSAT geostationary satellite, shows a cloud gash oriented north/south and located east of Stratford. It also seems to show some possible steps up and down within the cloud deck, oriented east-north-east to west-south-west.

1. Cloud step or gap

This deck of cloud has a sort of step feature or crack in it running from lower right to upper left in these photos.  These features are sometimes seen in other cloud decks at lower and higher levels – possibly due to some sort of disruption in the rising layer as the cloud forms, but there is insufficient information in this case for any useful conjectures.

The zoomed image above shows that the air is clear along this gap.   A hint of shadow seems to indicate that there may be a jump in height from left to right across this gap.

2. Long gash

The striking gash line from left to lower right is an example of a process colloquially called a “hole punch”.

To form a “hole punch” basically requires a disturbance, such as the addition of some frozen nuclei: ice, soot, or some other solid with a sub-zero temperature, possibly falling from a cirrus cloud higher up. This causes the supercooled water droplets that it touches to freeze. The frozen ice particles then grow by absorbing water vapour from the air around them. This lowers the relative humidity of the surrounding air and increases the evaporation rate of the remaining liquid water droplets, so that they dissipate through diffusion. The growing ice starts to fall in streaks and a hole opens up.

Eventually the water vapour cannot diffuse fast enough from the edges of the hole and the process stops.  It’s a race between diffusion and gravity, and gravity wins.
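The reason the ice wins this competition for water vapour is that the saturation vapour pressure over ice is lower than over supercooled liquid water at the same sub-zero temperature, so air that is merely saturated for the droplets is supersaturated for the ice crystals (the process often called the Wegener–Bergeron–Findeisen process). A quick sketch using Magnus-type approximation formulas (the coefficients are one commonly used choice, not from the original post):

```python
import math

def sat_vapour_pressure_hpa(temp_c, over="water"):
    """Saturation vapour pressure (hPa) from Magnus-type approximations."""
    if over == "water":
        return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    return 6.112 * math.exp(22.46 * temp_c / (272.62 + temp_c))  # over ice

t = -15.0  # an assumed temperature inside the supercooled altocumulus layer, deg C
e_water = sat_vapour_pressure_hpa(t, "water")  # about 1.92 hPa
e_ice = sat_vapour_pressure_hpa(t, "ice")      # about 1.65 hPa
# e_ice < e_water: vapour diffuses towards the ice, and the droplets evaporate
```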

In this case the fall streaks encountered drier air below the original cloud and mostly evaporated, but sometimes they can form a long wispy cloud just beneath the hole. These falling and evaporating clouds are called virga.

We can only guess at the cause of the disturbance that led to this hole punch. Since it seems to have a straight track, it was most likely a passing aircraft. The exhaust of a jet engine contains water vapour, carbon dioxide, and small amounts of soot and other combustion products. Soot may have descended into the cloud and started the freezing process. Jet engines leave a trail of frozen nuclei in their wake that, given the right conditions, can seed cloud growth and make a contrail (condensation trail), or punch a hole in a cloud and make a distrail (dissipation trail).

This cloud gash matches the flight path for aircraft between Auckland and Wellington.

There may have been other sources for the original freezing nuclei.

Here are some local examples of clouds showing a hole punch:

Colin Langley, Hauraki Gulf, Auckland

Sheryl Logan, Maungaturoto, Northland

Ron Ovenden, Whatawhata

Dave Swarbrick, Auckland

And here are some links from Wired Science and NASA explaining the processes involved

3.  A spot of colour

The above zoomed image has been highlighted to show a coloured spot in the image taken from Huatoki Street.

This is likely to be iridescence (sometimes called irisation), with sun side-lighting a thin cloud and producing a coloured pattern similar to that seen in soap bubbles.

The likely process here is called diffraction.

The simple way of explaining diffraction is to say that light is bent as it passes around the thin sharp edge of an object, and different wavelengths (colours) are bent by different amounts, so that sunlight is spread into a rainbow of colour. The cloud needs to be thin, and the droplets need to be of similar size and stay coherent. This occurs in newly formed clouds, especially in mountain wave clouds (altocumulus lenticularis), cumulus clouds, and in contrails. Iridescence is best seen at only a small angle from the sun, with the sun low in the sky and covered.

Mathematically, diffraction is best described as being an interference pattern, where different streams of light get superimposed and combine like waves, adding here and cancelling there.
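As a rough illustration of how droplet size sets the colours: for a droplet of diameter d, the first dark diffraction ring around the sun sits at an angle of about arcsin(1.22λ/d), so each wavelength peaks at a slightly different angle. A sketch assuming a 10 micrometre droplet (a plausible size for a freshly formed cloud, not a measured value):

```python
import math

def first_ring_angle_deg(wavelength_um, droplet_diameter_um):
    """Angular radius (degrees) of the first diffraction minimum
    for a circular droplet: arcsin(1.22 * lambda / d)."""
    return math.degrees(math.asin(1.22 * wavelength_um / droplet_diameter_um))

d = 10.0  # assumed droplet diameter in micrometres
theta_blue = first_ring_angle_deg(0.45, d)  # about 3.1 degrees
theta_red = first_ring_angle_deg(0.65, d)   # about 4.5 degrees
# Red rings sit further from the sun than blue ones, separating the colours
```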

Conclusion

Our thanks to Lesley for sharing this weird and wonderful cloud image with us.  It helps to highlight some of the natural (and man-made) effects that occur around us.  It also gives us an opportunity to share our current understanding of the physical processes at work. Taranaki, with its conical mountain, is a great place for looking for weird clouds. So keep that camera ready!

Addendum

Malcolm Potts has sent in some more images of this weird cloud,  capturing more of the striking iridescence. He has permitted these to be added to this blog.

Thanks, Malcolm:

Malcolm Potts

Malcolm Potts

Myth-busting ultraviolet radiation

Written by Wayde Beckman from the Health Sponsorship Council, on 11 November 2010

Who would have thought that in November, when temperatures struggle to get over 20 degrees in some parts of New Zealand, ultraviolet (UV) radiation levels are already high enough to cause sunburn and put Kiwis at risk of developing melanoma skin cancer?

You may have noticed there are already some red faces, not to mention shoulders and arms. But how can this be when summer hasn’t even arrived yet? Lots of people think there is a connection between air temperature and UV radiation from the sun – and that sun protection for our precious skin is only needed on hot days.

The truth is that temperature and UV radiation are not related. New Zealand is good evidence of this: by global standards our climate is relatively cool, but we have high levels of UV radiation during the daylight saving months, and sadly we have the highest rate of melanoma in the world.

So how do we know what the UV radiation levels are? The ultraviolet radiation index (UVI) is a standard way of measuring the level of UV radiation in our environment. The higher the UVI number, the stronger the UV radiation from the sun. When the UVI is at 3 or higher (which happens in most parts of New Zealand from September to April) we need to protect ourselves. And at this time of the year (November) the UVI levels are always above 3 between 11am and 4pm.
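The “protect from UVI 3” rule of thumb above follows the standard Global Solar UV Index exposure categories. A minimal sketch, assuming the WHO category boundaries:

```python
def uvi_category(uvi):
    """WHO Global Solar UV Index exposure categories."""
    if uvi < 3:
        return "Low"
    if uvi < 6:
        return "Moderate"
    if uvi < 8:
        return "High"
    if uvi < 11:
        return "Very high"
    return "Extreme"

def protection_needed(uvi):
    """Sun protection is advised once the UVI reaches 3."""
    return uvi >= 3
```

For example, `uvi_category(7)` returns "High", and `protection_needed(2)` returns False.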

So avoid becoming red in the face – remember to slip, slop, slap and wrap before heading out to enjoy the outdoors. If you want to know more about how to be SunSmart, visit www.sunsmart.org.nz/ or check out the UVI at www.sunsmart.org.nz/being-sunsmart/uv-index

The graph below shows the average midday UVI values over the year. Even in the cooler parts of New Zealand, the UVI gets high enough for people to need to use sun protection during daylight saving months.


Graph based on data from a National Institute for Water & Atmospheric Research (NIWA) report on Climatology of UVI for NZ by Richard McKenzie (May 2008).


About the author:
Wayde Beckman – Marketing and Communications Adviser
Wayde began his career in health promotion when he joined HSC (the Health Sponsorship Council) in 2000.

He has an extensive background in performing arts, events management and media communications design. He has worked across several HSC programmes, including Tobacco Control and Walking and Cycling. Wayde has worked in skin cancer prevention and sun safety since 2008, as a project manager and adviser on the SunSmart public communications campaign. Among other things, he is currently focussing on the redevelopment of the New Zealand UV Index.

Wayde used to run about the rugby league field but is now mostly seen running around after his kids.

Cold Windy Friday

MetService News Release issued at 12:10pm Thu 4 Nov 2010:

MetService has issued a Special Weather Advisory for the south and east of the South Island.  A cold front crossing the South Island on Friday is likely to be followed by chilly southerly winds, with a period of cold rain and snow to 600-700 metres from Southland through Otago and Canterbury to Marlborough.

MSL midday Thu-04-Nov 2010

Above: Mean sea level analysis and infra-red satellite image for midday Thursday 04 November 2010 New Zealand daylight time.

“After a few mild weeks, this cold change will be very noticeable,” commented MetService Weather Ambassador, Bob McDavitt.  “Farmers should note that the combination of wind and cold rain or snow is likely to put stress on vulnerable stock, and travellers are advised that snow may affect higher roads and mountain passes.”

Prognosis for midday Fri-05-Nov 2010

Above: Mean sea level prognosis for midday Friday 05 November 2010 New Zealand daylight time

“The weather is also expected to be unsettled over the North Island and the north and west of the South Island on Friday, as other fronts cross these areas,” said Mr. McDavitt.  “Check the weather forecast before heading out to that fireworks show.”

Latest weather forecast and warning updates are available on metservice.com and, for mobiles, on m.metservice.com.

Verification of Severe Weather Warnings

This blog post is the third in a three-part series discussing verification of MetService forecasts. Here, we present the method used for verifying Severe Weather Warnings, along with some recent examples.

Verification Scheme

The statistical method of verifying Severe Weather Warnings is similar to that used for verifying rainfall in city forecasts. That is, a categorical approach is taken, with Probability of Detection (POD), False Alarm Ratio (FAR) and Critical Success Index (CSI) being produced on a monthly basis. In the case of Severe Weather Warnings, POD, FAR and CSI are expressed as percentages.

There is, however, much more subjectivity in determining whether a Severe Weather Warning is a success, a false alarm, or whether an event has been missed. This is because Severe Weather Warnings apply to broad areas (minimum of 1000 square kilometres, and generally much larger than that) and an assessment has to be made of how widespread the occurrence (or non-occurrence) of heavy rainfall / heavy snowfall / strong winds was. The initial assessment is made by MetService’s Severe Weather Forecasters themselves, using all available weather observations (including those from many voluntary observers MetService can call), media reports, comments on web forums about New Zealand’s weather, radar rainfall accumulations, and occasionally through discussion with territorial authorities and utility companies. At the end of each month, every assessment of every event is reviewed by two senior meteorologists who work at MetService and who are outside of forecasting operations, and I sign off on them as Chief Forecaster.

Rainfall, snowfall and wind are usually highly variable over any broad area in New Zealand – because of the complexity of the New Zealand landscape and because of the variability of weather systems themselves. For example, if the Severe Weather Warning is for 150mm rain in 24 hours, is it successful if:

  • The rain fell in 30 hours? 18 hours?
  • 200mm fell? 100mm? …
  • 250mm fell over half the area and 50mm fell over the other half of the area?

Performance over the year to September 2010

Graph: heavy rain warning statistics

Graph: heavy snow warning statistics

Graph: severe gale warning statistics

Longer-Term Trends

Graphs like those above but covering a much longer period – say, a decade – show a gradual improvement in forecast accuracy. Given the changes in observing technologies, numerical weather prediction and forecasting techniques over the last several decades, this is not surprising. What these verifications don’t show well is the much increased precision – location, amount, intensity, timing – that Severe Weather Warnings have contained in recent years.

Performance Target

MetService’s performance targets for Severe Weather Warnings are:

                 Probability of Detection     False Alarm Ratio
                 2010-11       2011-12        2010-11    2011-12
Heavy Rain       >83%          >87%           <32%       <28%
Heavy Snow       >79%          >81%           <36%       <34%
Severe Gales     >79%          >81%           <36%       <34%


Limitations

It is possible to have a “perfect” POD or FAR – but not simultaneously:

  • If heavy rain (or heavy snow, or strong winds) was forecast every day, the POD would be 100% – but the FAR would be close to 100% too.
  • If heavy rain (or heavy snow, or strong winds) was never forecast, the FAR would be 0% – but the POD would be close to 0% too.

In both of these cases, the forecast would be of no value because users would have no idea on which days heavy rain (or heavy snow, or strong winds) was expected to occur.
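A quick numerical sketch of the first of these extremes, assuming a hypothetical 30-day month with rain observed on 10 days:

```python
def pod(a, c):
    """Probability of Detection: hits / (hits + misses)."""
    return a / (a + c)

def far(a, b):
    """False Alarm Ratio: false alarms / (hits + false alarms)."""
    return b / (a + b)

# Forecasting rain every day: 10 hits (a), 20 false alarms (b), no misses (c)
always_pod = pod(10, 0)   # 1.0 -- a perfect POD
always_far = far(10, 20)  # about 0.67 -- two of every three forecasts were false alarms
```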

Very few forecasts or warnings are issued in isolation. Therefore, verifying them in isolation does not provide a complete picture of their value. Not infrequently, MetService will issue a Severe Weather Watch if it is considered that an event will not quite meet the Severe Weather Warning criteria but is nevertheless notable. If, subsequently, the Severe Weather Warning criteria are met, the verification scheme will record a missed event – despite MetService having perhaps come very close to forecasting it perfectly.

Verification of Maximum and Minimum Temperature in City Forecasts

This blog post is the second in a three-part series discussing verification of MetService forecasts. Here, we present the method used for verifying maximum and minimum temperature in city forecasts, along with some recent examples.

Verification Scheme

Four times daily (around 11:15am, 4:15pm, 10:50pm and 3:45am) MetService issues city forecasts for more than 40 locations. Currently, for 32 of these (and soon for most of the remainder), the forecasts of tomorrow’s maximum temperature and minimum temperature from today’s late morning issue are verified against observations from a nearby automatic weather station.
Over a given period (typically, a month), for each location, the forecasts of tomorrow’s maximum and minimum temperatures from the late morning issue of the city forecasts are compared with observed maximum and minimum temperatures. As with the verification of precipitation in city forecasts, “tomorrow” is the 24-hour period between midnight tonight and midnight tomorrow and the scheme operates automatically – that is, there is no input by MetService staff.

Some Results

Results dating from the implementation of this particular scheme in March 2009 through to 28 October 2010 are below.

The graphs clearly demonstrate what every experienced forecaster knows:

  • Temperature forecasting is generally most difficult over the east of the South Island
  • It is much harder to forecast the minimum temperature than it is to forecast the maximum temperature.

Performance Target

MetService’s performance target for the forecast of maximum and minimum temperature in city forecasts is: maximum temperature within 2 °C and minimum temperature within 4 °C, 77% of the time for the 2010/11 financial year and 80% of the time for the 2011/12 financial year.
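One way of reading that target in code (treating a day as a hit only when both the maximum and the minimum verify within tolerance is an assumption, and the temperatures below are made-up data):

```python
def target_met(forecasts, observations, max_tol=2.0, min_tol=4.0, required=0.77):
    """Fraction of days on which the forecast maximum was within max_tol deg C
    and the forecast minimum within min_tol deg C of what was observed."""
    hits = sum(
        1
        for (fmax, fmin), (omax, omin) in zip(forecasts, observations)
        if abs(fmax - omax) <= max_tol and abs(fmin - omin) <= min_tol
    )
    fraction = hits / len(forecasts)
    return fraction, fraction >= required

# (forecast max, forecast min) vs (observed max, observed min), deg C -- made-up values
fcst = [(18.0, 9.0), (20.0, 11.0), (15.0, 7.0), (22.0, 12.0)]
obs = [(17.2, 10.5), (23.5, 11.8), (14.6, 12.5), (21.0, 13.0)]
frac, met = target_met(fcst, obs)  # 2 of the 4 days verify, short of the 77% target
```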

Limitations

Even across small distances, there can be significant variations in temperature during the course of a day – just ask any horticulturalist. Thus, the observation point used to verify the forecast for a given place may or may not fairly represent the temperature there – or may be a good indication in some weather situations but not in others. MetService observation sites are commonly at airports, which in general are at least some distance from the city or town they serve.

Verification of Rainfall in City Forecasts

This blog post is the first in a three-part series discussing verification of MetService forecasts. Here, we present the method used for verifying rainfall in city forecasts, along with some recent examples.

Verification Scheme

Four times daily (around 11:15am, 4:15pm, 10:50pm and 3:45am) MetService issues city forecasts for more than 40 locations. Currently, for 32 of these (and soon for most of the remainder), the forecasts of tomorrow’s precipitation from today’s late morning issue are verified against observations from a nearby automatic weather station.
Counts of precipitation forecasts and corresponding observations are made using a categorical forecast verification system. The principle is described in the contingency table below. Over a given period (typically, a month), for each location, the number of times an event was

  • A: Forecast and observed
  • B: Forecast but not observed
  • C: Not forecast but observed
  • D: Not forecast and not observed

… are recorded.

2 x 2 Contingency Table

                          Event Observed
                          Yes        No
Event Forecast   Yes      A          B
                 No       C          D



From these, indications of accuracy are readily obtained:

  • Probability of Detection (POD): ratio of the number of times rainfall was successfully forecast to the total number of times it was observed
  • False Alarm Ratio (FAR): ratio of the number of times rainfall was forecast but not observed to the total number of times it was forecast
  • Success Ratio (SR): ratio of the number of times rainfall was successfully forecast to the total number of times it was forecast
  • Critical Success Index (CSI): ratio of the number of times rainfall was successfully forecast to the number of times it was either forecast (successfully or not) or observed
  • Bias: ratio of the total number of times rainfall was forecast to the total number of times it was observed


For good forecasts, POD, SR, Bias and CSI approach a value of 1, while FAR approaches 0.
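These scores come straight from the A, B, C, D counts. A short sketch, with hypothetical counts for one location over a 30-day month:

```python
def contingency_scores(a, b, c, d=0):
    """Categorical verification scores from a 2 x 2 contingency table:
    a = forecast and observed, b = forecast but not observed,
    c = observed but not forecast, d = neither."""
    return {
        "POD": a / (a + c),         # Probability of Detection
        "SR": a / (a + b),          # Success Ratio
        "FAR": b / (a + b),         # False Alarm Ratio = 1 - SR
        "CSI": a / (a + b + c),     # Critical Success Index
        "Bias": (a + b) / (a + c),  # forecast frequency vs observed frequency
    }

scores = contingency_scores(a=18, b=6, c=3, d=3)  # hypothetical 30-day month
```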

In the verification scheme:

  • “Tomorrow” is the 24-hour period between midnight tonight and midnight tomorrow
  • The forecast is considered to be of precipitation when the accompanying icon (as it appears, for example, in the city forecast for Auckland) is any one of “Rain”, “Showers”, “Hail”, “Thunder”, “Drizzle”, “Snow”, or “Few Showers”
  • Precipitation is considered to have fallen if the automatic weather station records any precipitation amount greater than zero.

The scheme operates automatically – that is, there is no input by MetService staff.
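The automated daily decision described in the bullet points above can be sketched like this (the icon list is from the text; the function itself is illustrative, not MetService code):

```python
PRECIP_ICONS = {"Rain", "Showers", "Hail", "Thunder", "Drizzle", "Snow", "Few Showers"}

def classify_day(icon, observed_mm):
    """Assign one day's forecast icon and observed 24-hour rainfall
    to a cell of the 2 x 2 contingency table."""
    forecast = icon in PRECIP_ICONS
    observed = observed_mm > 0.0
    if forecast and observed:
        return "A"  # forecast and observed
    if forecast:
        return "B"  # forecast, not observed
    if observed:
        return "C"  # observed, not forecast
    return "D"      # neither forecast nor observed
```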

Some Results

POD, SR, Bias and CSI can be geometrically represented in a single diagram, and therefore simultaneously visualised.
The diagram below shows the accuracy of the forecasts issued late morning, for tomorrow, of precipitation, for the 32 cities verified, during September 2010.
Urban Rainfall September 2010


Key to the four geographical groups: north and west of North Island (NWNI), east and south of North Island (SENI), north and east of South Island (NESI), west and south of South Island (WSSI). Station codes:
WR Whangarei
AA Auckland
WT Whitianga
HN Hamilton
RO Rotorua
TG Tauranga
WK Whakatane
NP New Plymouth
WU Wanganui
AP Taupo
GS Gisborne
HS Hastings
NR Napier
TM Taumarunui
MS Masterton
PM Palmerston North
LV Levin
WG Wellington
NS Nelson
WB Blenheim
KI Kaikoura
CH Christchurch
AS Ashburton
OU Oamaru
TU Timaru
WS Westport
HK Hokitika
QN Queenstown
WF Wanaka
DN Dunedin
GC Gore
NV Invercargill


From the above diagram, for example:

  • The forecast for Auckland scores well, with Probability of Detection, Success Ratio and Bias all close to 1
  • The forecast for Timaru does not score as well: a Probability of Detection of around 0.75 is compromised by a Success Ratio just under 0.4 and a Bias of 2. In other words, during September 2010, the precipitation icon accompanying the late morning issue of the city forecast for Timaru for tomorrow was too often one of “Rain”, “Showers”, “Hail”, “Thunder”, “Drizzle”, “Snow”, or “Few Showers”.
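As a quick cross-check, the three Timaru figures hang together, since Bias = (A + B)/(A + C) = POD / SR:

```python
pod, sr = 0.75, 0.4  # values read off the performance diagram
bias = pod / sr      # Bias = (A + B) / (A + C) = POD / SR
# bias comes out at about 1.9, consistent with the Bias of 2 quoted above
```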

It’s also useful to see performance when the cities are grouped into the geographical areas described in the key above, and overall. The diagram below shows the accuracy of the same late morning forecasts of tomorrow’s precipitation during September 2010, aggregated by geographical group and overall.


In this particular case, forecasts for the group of cities comprising the north and east of South Island (“NESI”) didn’t verify as well as for other places, with a Probability of Detection of about 0.75 and a False Alarm Ratio of about 0.3.

Further, it’s interesting to look at results dating from the implementation of this particular scheme in March 2009 through to 28 October 2010. Probability of Detection hovers at around 0.8 ± 0.15 (80% ± 15% in the graph immediately below). The False Alarm Ratio has shown a steady decline – in other words, predictions of rainfall in the city forecasts have improved in accuracy over the last year.
POD time series

FAR time series


Performance Target

MetService’s performance target for the forecast of precipitation in city forecasts is a combined POD greater than 0.77 for the 2010/11 financial year, increasing to 0.80 for the 2011/12 financial year.

Limitations

Even across small distances, there can be significant variations in rainfall during the course of a day. Thus, the observation point used to verify the forecast for a given place may or may not fairly represent the rainfall there – or may be a good indication in some weather situations but not in others. MetService observation sites are commonly at airports, which in general are at least some distance from the city or town they serve.

References

Finally, there is much literature about forecast verification. If you’d like to know more about it, try these:
Doswell, Charles A., 2004: Weather forecasting by humans—Heuristics and decision making. Weather and Forecasting, 19, 1115–1126.
Doswell, Charles A., Robert Davies-Jones, David L. Keller, 1990: On Summary Measures of Skill in Rare Event Forecasting Based on Contingency Tables. Weather and Forecasting, 5, 576–585.
Hammond, K. R., 1996: Human Judgment and Social Policy. Oxford University Press, 436 pp.
Roebber, Paul J., 2009: Visualizing Multiple Measures of Forecast Quality. Weather and Forecasting, 24, 601–608.
Stephenson, David B., 2000: Use of the “odds ratio” for diagnosing forecast skill. Weather and Forecasting, 15, 221–232.