Activity for World Meteorological Day 2014: Make Your Own Observations

On 23 March each year, national weather services around the world celebrate World Meteorological Day to commemorate the establishment of the World Meteorological Organization on this day in 1950.

In 2014, the theme for World Meteorological Day is ‘Weather and Climate: Engaging Youth’. We thought this was an ideal opportunity to provide a few do-it-yourself weather projects for the keen weather kids out there.

Why do we need weather observations?

Before weather forecasters can tell us what the weather is going to be doing tomorrow, they need to know what it is doing right now.

Every hour, hundreds of weather stations across New Zealand, from Cape Reinga to Stewart Island, send in reports of the weather where they are. As well as telling us how windy it is, how warm it is and how much rain has fallen, the weather stations can also tell us whether it is raining or snowing, how much cloud there is, and the pressure of the air.

These weather reports are really important to weather forecasters, who use them along with information from radars and satellites to work out what the weather is doing across the country. These observations are also shared around the world to help build a global picture of current conditions.

Not all of us have an official weather station in our back garden or can afford to build our very own weather radar – but that doesn’t mean we can’t take our own measurements using things we find around the house.

The weather station in Wellington’s Botanic Garden

This is the weather station in Wellington’s Botanic Garden. A is the MetService headquarters; B is a receiver on the roof that lets us collect information from satellites as they pass overhead; C is an anemometer, used to measure wind speed; D is a device that measures sunshine; E is the Stevenson screen, used to keep our thermometers in the shade; and F is a collection of different rain gauges.

Wind

Although we cannot see the wind, we can see the effects it has on other things like trees, flags and smoke. The Beaufort scale compares wind speeds with the effects they have over land and sea. For example, if we notice that only the leaves of a tree are moving in the wind, we would describe it as a “light breeze”, but if the whole tree is shaking we would call it a “gale”.

You can find out more about the Beaufort scale and download your own copy here:
http://about.metservice.com/assets/downloads/learning/winds_poster_web.pdf
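
If you can measure the wind speed, you can turn it into a Beaufort force with a simple lookup. Here is a rough sketch in Python – the speed thresholds and land effects below are the commonly published ones, slightly simplified, so treat it as a guide rather than the official scale:

```python
# Approximate Beaufort scale: (upper limit in knots, force, name, effect on land).
# Thresholds and descriptions are the commonly published ones, simplified.
BEAUFORT = [
    (1,  0,  "Calm",            "Smoke rises vertically"),
    (4,  1,  "Light air",       "Smoke drifts in the wind"),
    (7,  2,  "Light breeze",    "Leaves rustle"),
    (11, 3,  "Gentle breeze",   "Leaves and twigs in motion"),
    (17, 4,  "Moderate breeze", "Small branches move, dust raised"),
    (22, 5,  "Fresh breeze",    "Small trees sway"),
    (28, 6,  "Strong breeze",   "Large branches move"),
    (34, 7,  "Near gale",       "Whole trees in motion"),
    (41, 8,  "Gale",            "Twigs break off trees"),
    (48, 9,  "Strong gale",     "Minor structural damage"),
    (56, 10, "Storm",           "Trees uprooted"),
    (64, 11, "Violent storm",   "Widespread damage"),
]

def beaufort(speed_knots: float):
    """Return (force, name, typical land effect) for a given wind speed."""
    for upper, force, name, effect in BEAUFORT:
        if speed_knots < upper:
            return force, name, effect
    return 12, "Hurricane", "Severe widespread damage"

print(beaufort(11))  # (4, 'Moderate breeze', 'Small branches move, dust raised')
```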

Rain

To measure how much rain has fallen, you need a rain gauge. We can make a simple rain gauge using a straight-sided bottle, some stones and a ruler:

How to make a rain gauge – steps 1, 2 and 3

The rain gauges that MetService uses are called tipping bucket rain gauges. As rain falls into the gauge, it fills a little bucket at the base of a tube; once the bucket is full, it tips over and empties out the rain it has collected. A counter keeps track of how many times the bucket tips over, and because we know how much rain fits in the bucket, we can work out how much rain has fallen. Because the rain gauge empties itself, it never overflows.
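
The arithmetic behind a tipping bucket gauge is simple enough to sketch in a few lines of Python. The bucket size assumed here (0.2 mm of rain per tip) is a common one chosen for illustration; the gauges MetService uses may be calibrated differently:

```python
# Each tip of the bucket represents a fixed depth of rain.
# 0.2 mm per tip is a common bucket size, assumed here for illustration.
MM_PER_TIP = 0.2

def rainfall_mm(tips: int) -> float:
    """Rainfall implied by the number of times the bucket has tipped."""
    return tips * MM_PER_TIP

# If the counter advanced by 57 tips during one hour, that hour saw
# 57 * 0.2 = 11.4 mm of rain.
print(rainfall_mm(57))  # 11.4
```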

Temperature

The temperature of the air can vary a lot over a short distance. To measure it, meteorologists (scientists who study the atmosphere) use thermometers. You might already have a thermometer in your house that will let you measure the temperature.

The thermometers that are used for the official observations are kept in special boxes called Stevenson screens. These white boxes keep the thermometers shaded while the gaps on either side of the box let the air move freely over the thermometer.

Because we share this information around the world, it is important that we measure the temperature in the same way as meteorologists everywhere else. This is also why we have a special calibration lab: to ensure all our measuring equipment performs exactly the same way as the equipment used by other national weather services.

Weather

There are lots of different types of weather, so as well as recording whether it is raining or snowing, it is useful to note down observations such as whether you can hear thunder or whether it is foggy. Why not keep track of the different types of weather you observe over a week? You can write them down or draw a picture to show the weather. These are the symbols we use on our website to show the different types of weather – can you name them all?

Weather icons used on metservice.com

Clouds

As well as different types of weather, there are different types of cloud. Some are high in the atmosphere and made of ice; others are low down, close to the ground, and made of large water droplets that give the underside of the cloud a menacing grey colour. We have some great clouds in New Zealand, and this poster gives examples of the ones you might see:
http://about.metservice.com/assets/downloads/learning/clouds_poster_web.pdf

While you are looking at the different clouds outside, try to figure out how much of the sky is covered in cloud. Is it completely covered? Perhaps only half the sky has cloud covering it, or perhaps you can’t see any clouds at all.

Weather observers measure cloud cover in oktas, or eighths. A sky that is completely covered in cloud is called overcast and has 8 oktas of cloud.
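
Turning an estimated fraction of sky cover into oktas is a one-line calculation. Here is a small sketch – the plain-language descriptions are an informal set added for illustration, not official terms:

```python
def oktas(fraction_of_sky_covered: float) -> int:
    """Convert a fraction of sky covered by cloud (0.0 to 1.0) into oktas."""
    return round(8 * fraction_of_sky_covered)

# An informal description for each okta value (for illustration only).
DESCRIPTION = {
    0: "clear sky", 1: "almost clear", 2: "a few clouds",
    3: "partly cloudy", 4: "half covered", 5: "mostly cloudy",
    6: "cloudy", 7: "almost overcast", 8: "overcast",
}

cover = oktas(0.5)                 # half the sky covered
print(cover, DESCRIPTION[cover])   # 4 half covered
```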

How much of the sky do you think is covered in these pictures? Imagine squashing the clouds together and counting the grey-coloured squares.

Measuring clouds in oktas

How much cloud do you think there is in the picture of the weather station at the top of the page?

Weather codes

As you can imagine, sending all the weather information back to the weather office and sharing it around the world adds up to a lot of data. To keep the messages short, all the information is transmitted as a series of codes.

Here’s an example of an observation that gets sent from a weather station:
93110 17698 /2311 10206 20137 40238 57006 60004 70300=

At first it looks like a random string of numbers, but each group gives information about a different aspect of the weather. The first block gives the station number for Auckland, so we know where the observation comes from. The other blocks give information about wind speed, temperature, rainfall and weather. From this observation we know that there are 11 knots of wind coming from the southwest; the temperature is 20.6°C; the surface pressure is 1023.8 hPa; there hasn’t been any rainfall in the past 24 hours; and there is some cloud developing.
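
As a much-simplified illustration, here is a Python sketch that pulls a few of these groups out of the message above. Real SYNOP reports contain many optional groups and special values, so a proper decoder is considerably more involved; this sketch assumes the fixed layout of the example:

```python
def decode_synop(message: str) -> dict:
    """Pull a few basic groups out of a (simplified) SYNOP report."""
    groups = message.rstrip("=").split()
    station = groups[0]  # block + station number

    # Wind group Nddff: dd = direction in tens of degrees, ff = speed in knots.
    wind = groups[2]
    direction_deg = int(wind[1:3]) * 10   # "23" -> 230 degrees (southwest)
    speed_knots = int(wind[3:5])          # "11" -> 11 knots

    # Temperature group 1sTTT: s is the sign (0 = +, 1 = -), TTT is tenths of a degree.
    temp = groups[3]
    sign = -1 if temp[1] == "1" else 1
    temperature_c = sign * int(temp[2:5]) / 10

    # Pressure group 4PPPP: tenths of hPa, with the leading "10" or "9" implied.
    pppp = int(groups[5][1:5])
    pressure_hpa = pppp / 10 + (1000 if pppp < 5000 else 0)

    return {
        "station": station,
        "wind": f"{speed_knots} knots from {direction_deg} degrees",
        "temperature_c": temperature_c,
        "pressure_hpa": pressure_hpa,
    }

print(decode_synop("93110 17698 /2311 10206 20137 40238 57006 60004 70300="))
# {'station': '93110', 'wind': '11 knots from 230 degrees',
#  'temperature_c': 20.6, 'pressure_hpa': 1023.8}
```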

You can find out more about decoding the observations here: http://weather.unisys.com/wxp/Appendices/Formats/SYNOP.html

We hope you enjoy doing some or all of these weather projects!

Bugs in the weather radar

If you were looking at radar imagery overnight Thursday 21 February 2013 or this morning (Friday 22 February 2013), you could be forgiven for thinking that there was quite a lot of light precipitation over the northern half of the North Island and west of Auckland.

MetService radar imagery from 7:00pm Thu-21-Feb 2013 to 11:20am Fri-22-Feb 2013

Except for parts of Gisborne and Hawke’s Bay, we know that there was almost no precipitation over the North Island in the period covered by the radar imagery above.

So what was all that yellow radar echo over parts of the central North Island and around Waikato and Auckland?

We’re not completely sure, but we strongly suspect the echo is swarms of insects. To show up in radar imagery like this, they must share some of the characteristics of precipitation particles: similar size, large numbers, and enough water content (or perhaps a water coating).

The overnight period of Thursday 21 / Friday 22 February 2013 isn’t the only time lately that radar echoes like this have been observed over the North Island. They’ve shown up, to a greater or lesser extent, most evenings in February. The weather over the North Island has been settled – and very dry – during this time.

Interestingly, there is quite a lot of literature on the subject of radar entomology.

Verification of Severe Weather Warnings

This blog post is the third in a three-part series discussing verification of MetService forecasts. Here, we present the method used for verifying Severe Weather Warnings, along with some recent examples.

Verification Scheme

The statistical method of verifying Severe Weather Warnings is similar to that used for verifying rainfall in city forecasts. That is, a categorical approach is taken, with Probability of Detection (POD), False Alarm Ratio (FAR) and Critical Success Index (CSI) being produced on a monthly basis. In the case of Severe Weather Warnings, POD, FAR and CSI are expressed as percentages.

There is, however, much more subjectivity in determining whether a Severe Weather Warning is a success, a false alarm, or whether an event has been missed. This is because Severe Weather Warnings apply to broad areas (minimum of 1000 square kilometres, and generally much larger than that) and an assessment has to be made of how widespread the occurrence (or non-occurrence) of heavy rainfall / heavy snowfall / strong winds was. The initial assessment is made by MetService’s Severe Weather Forecasters themselves, using all available weather observations (including those from many voluntary observers MetService can call), media reports, comments on web forums about New Zealand’s weather, radar rainfall accumulations, and occasionally through discussion with territorial authorities and utility companies. At the end of each month, every assessment of every event is reviewed by two senior meteorologists who work at MetService and who are outside of forecasting operations, and I sign off on them as Chief Forecaster.

Rainfall, snowfall and wind are usually highly variable over any broad area in New Zealand, because of the complexity of the landscape and the variability of the weather systems themselves. For example, if the Severe Weather Warning is for 150 mm of rain in 24 hours, is it successful if:

  • The rain fell in 30 hours? 18 hours?
  • 200mm fell? 100mm? …
  • 250mm fell over half the area and 50mm fell over the other half of the area?

Performance over the year to September 2010

Verification statistics for heavy rain

Verification statistics for heavy snow

Verification statistics for strong winds

Longer-Term Trends

Graphs like those above but covering a much longer period – say, a decade – show a gradual improvement in forecast accuracy. Given the changes in observing technologies, numerical weather prediction and forecasting techniques over the last several decades, this is not surprising. What these verifications don’t show well is the much increased precision – location, amount, intensity, timing – that Severe Weather Warnings have contained in recent years.

Performance Target

MetService’s performance targets for Severe Weather Warnings are:

                Probability of Detection        False Alarm Ratio
                2010-11        2011-12          2010-11      2011-12
Heavy Rain      >83%           >87%             <32%         <28%
Heavy Snow      >79%           >81%             <36%         <34%
Severe Gales    >79%           >81%             <36%         <34%


Limitations

It is possible to have a “perfect” POD or FAR – but not simultaneously:

  • If heavy rain (or heavy snow, or strong winds) was forecast every day, the POD would be 100% – but the FAR would be close to 100% too.
  • If heavy rain (or heavy snow, or strong winds) was never forecast, the FAR would be 0% – but the POD would be close to 0% too.

In both of these cases, the forecast would be of no value because users would have no idea on which days heavy rain (or heavy snow, or strong winds) was expected to occur.
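
A toy example makes the trade-off concrete. Suppose a 30-day month with heavy rain on 5 of the days (numbers invented for illustration):

```python
def pod(hits: int, misses: int) -> float:
    """Probability of Detection: events caught / events that occurred."""
    return hits / (hits + misses)

def far(hits: int, false_alarms: int) -> float:
    """False Alarm Ratio: false alarms / warnings issued."""
    return false_alarms / (hits + false_alarms)

# Warn every day: all 5 events are caught, but 25 warnings verify false.
print(pod(hits=5, misses=0))         # 1.0  -> POD = 100%
print(far(hits=5, false_alarms=25))  # 0.83 -> FAR = 83%

# Never warn: no warnings means no false alarms (FAR = 0% by convention),
# but every event is missed.
print(pod(hits=0, misses=5))         # 0.0  -> POD = 0%
```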

Very few forecasts or warnings are issued in isolation. Therefore, verifying them in isolation does not provide a complete picture of their value. Not infrequently, MetService will issue a Severe Weather Watch if it is considered that an event will not quite meet the Severe Weather Warning criteria but is nevertheless notable. If, subsequently, the Severe Weather Warning criteria are met, the verification scheme will record a missed event – despite MetService having perhaps come very close to forecasting it perfectly.

Verification of Rainfall in City Forecasts

This blog post is the first in a three-part series discussing verification of MetService forecasts. Here, we present the method used for verifying rainfall in city forecasts, along with some recent examples.

Verification Scheme

Four times daily (around 11:15am, 4:15pm, 10:50pm and 3:45am) MetService issues city forecasts for more than 40 locations. Currently, for 32 of these (and soon for most of the remainder), the forecasts of tomorrow’s precipitation from today’s late morning issue are verified against observations from a nearby automatic weather station.
Counts of precipitation forecasts and corresponding observations are made using a categorical forecast verification system. The principle is described in the contingency table below. Over a given period (typically, a month), for each location, the number of times an event was

  • A: Forecast and observed
  • B: Forecast and not observed
  • C: Not forecast but observed
  • D: Not forecast and not observed

…are recorded.

2 x 2 Contingency Table              Event Observed
                                     Yes          No
Event Forecast        Yes            A            B
                      No             C            D



From these, indications of accuracy are readily obtained:

  • Probability of Detection (POD): the number of times rainfall was successfully forecast, divided by the total number of times it was observed – POD = A / (A + C)
  • False Alarm Ratio (FAR): the number of times rainfall was forecast but not observed, divided by the total number of times it was forecast – FAR = B / (A + B)
  • Success Ratio (SR): the number of times rainfall was successfully forecast, divided by the total number of times it was forecast – SR = A / (A + B)
  • Critical Success Index (CSI): the number of times rainfall was successfully forecast, divided by the number of times it was either forecast (successfully or not) or observed – CSI = A / (A + B + C)
  • Bias: the total number of times rainfall was forecast, divided by the total number of times it was observed – Bias = (A + B) / (A + C)


For good forecasts, POD, SR, Bias and CSI approach a value of 1.
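
All five scores are easy to compute once the A, B, C and D counts are in hand. A minimal sketch, with invented example counts:

```python
def scores(a: int, b: int, c: int, d: int) -> dict:
    """Verification scores from the 2 x 2 contingency table:
    a = forecast and observed,     b = forecast but not observed,
    c = observed but not forecast, d = neither (unused by these scores).
    """
    return {
        "POD":  a / (a + c),        # Probability of Detection
        "FAR":  b / (a + b),        # False Alarm Ratio
        "SR":   a / (a + b),        # Success Ratio (= 1 - FAR)
        "CSI":  a / (a + b + c),    # Critical Success Index
        "Bias": (a + b) / (a + c),  # forecast frequency vs observed frequency
    }

# An invented month for one city: 10 hits, 4 false alarms, 3 misses,
# and 13 correctly forecast dry days.
print(scores(a=10, b=4, c=3, d=13))
# {'POD': 0.77, 'FAR': 0.29, 'SR': 0.71, 'CSI': 0.59, 'Bias': 1.08}  (rounded)
```

A Bias greater than 1 simply means precipitation was forecast more often than it was observed.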

In the verification scheme:

  • “Tomorrow” is the 24-hour period between midnight tonight and midnight tomorrow
  • The forecast is considered to be of precipitation when the accompanying icon (as it appears, for example, in the city forecast for Auckland) is any one of “Rain”, “Showers”, “Hail”, “Thunder”, “Drizzle”, “Snow”, or “Few Showers”
  • Precipitation is considered to have fallen if the automatic weather station records any precipitation amount greater than zero.

The scheme operates automatically – that is, there is no input by MetService staff.
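
The automated part of the scheme amounts to classifying each day at each location into one of the four cells of the contingency table. Here is a sketch of that logic using the icon list and the greater-than-zero rule described above; the function name and the non-precipitation icon in the example are invented:

```python
# Icons counted as a forecast of precipitation (from the list above).
PRECIP_ICONS = {"Rain", "Showers", "Hail", "Thunder",
                "Drizzle", "Snow", "Few Showers"}

def classify_day(forecast_icon: str, observed_mm: float) -> str:
    """Assign one day at one location to a cell of the contingency table."""
    forecast_wet = forecast_icon in PRECIP_ICONS
    observed_wet = observed_mm > 0.0
    if forecast_wet and observed_wet:
        return "A"  # forecast and observed
    if forecast_wet:
        return "B"  # forecast but not observed
    if observed_wet:
        return "C"  # observed but not forecast
    return "D"      # neither

print(classify_day("Few Showers", 0.2))  # A
print(classify_day("Fine", 1.4))         # C
```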

Some Results

POD, SR, Bias and CSI can be geometrically represented in a single diagram, and therefore simultaneously visualised.
The diagram below shows the accuracy of the forecasts issued late morning, for tomorrow, of precipitation, for the 32 cities verified, during September 2010.
Urban Rainfall September 2010


Key to the geographical areas: North and west of North Island (NWNI); East and south of North Island (SENI); North and east of South Island (NESI); West and south of South Island (WSSI)
WR Whangarei
AA Auckland
WT Whitianga
HN Hamilton
RO Rotorua
TG Tauranga
WK Whakatane
NP New Plymouth
WU Wanganui
AP Taupo
GS Gisborne
HS Hastings
NR Napier
TM Taumarunui
MS Masterton
PM Palmerston North
LV Levin
WG Wellington
NS Nelson
WB Blenheim
KI Kaikoura
CH Christchurch
AS Ashburton
OU Oamaru
TU Timaru
WS Westport
HK Hokitika
QN Queenstown
WF Wanaka
DN Dunedin
GC Gore
NV Invercargill


From the above diagram, for example:

  • The forecast for Auckland scores well, with Probability of Detection, Success Ratio and Bias all close to 1
  • The forecast for Timaru does not score as well: a Probability of Detection of around 0.75 is compromised by a Success Ratio just under 0.4 and a Bias of 2. In other words, during September 2010 the precipitation icon accompanying the late morning issue of the city forecast for Timaru for tomorrow was too often one of “Rain”, “Showers”, “Hail”, “Thunder”, “Drizzle”, “Snow”, or “Few Showers”.

It’s also useful to see performance when the cities are grouped into the geographical areas described in the key above, and overall. The diagram below shows the same verification – of the late morning forecasts of tomorrow’s precipitation during September 2010 – with the 32 cities grouped in this way.


In this particular case, forecasts for the group of cities comprising the north and east of South Island (“NESI”) didn’t verify as well as for other places, with a Probability of Detection of about 0.75 and a False Alarm Ratio of about 0.3.

Further, it’s interesting to look at results dating from the implementation of this particular scheme in March 2009 through to 28 October 2010. Probability of Detection hovers at around 0.8 ± 0.15 (80% ± 15% in the graph immediately below). The False Alarm Ratio has shown a steady decline – in other words, predictions of rainfall in the city forecasts have improved in accuracy over the last year.
POD time series

FAR time series


Performance Target

MetService’s performance target for the forecast of precipitation in city forecasts is a combined POD greater than 0.77 for the 2010/11 financial year, increasing to 0.80 for the 2011/12 financial year.

Limitations

Even across small distances, there can be significant variations in rainfall during the course of a day. Thus, the observation point used to verify the forecast for a given place may or may not fairly represent the rainfall there – or may be a good indication in some weather situations but not in others. MetService observation sites are commonly at airports, which are generally some distance from the city or town they serve.

References

Finally, there is much literature about forecast verification. If you’d like to know more about it, try these:
Doswell, Charles A., 2004: Weather forecasting by humans – heuristics and decision making. Weather and Forecasting, 19, 1115–1126.
Doswell, Charles A., Robert Davies-Jones and David L. Keller, 1990: On Summary Measures of Skill in Rare Event Forecasting Based on Contingency Tables. Weather and Forecasting, 5, 576–585.
Hammond, K. R., 1996: Human Judgment and Social Policy. Oxford University Press, 436 pp.
Roebber, Paul J., 2009: Visualizing Multiple Measures of Forecast Quality. Weather and Forecasting, 24, 601–608.
Stephenson, David B., 2000: Use of the “odds ratio” for diagnosing forecast skill. Weather and Forecasting, 15, 221–232.

Rain or Showers

We had an enquiry recently from an astute member of the public asking about the comings and goings of rain.  

They had noticed that in southerly weather the rain has a tendency to “come in bands (e.g., 20 minutes rain, 20 mins dry, 20 mins rain etc.) rather than as a more constant rain that comes with northerlies”. They were wondering why this was. This is a good question and I will try to answer it here.  

Radar examples

First, let’s have a look at examples of northerly and southerly precipitation. Below are radar images that show rainfall-sized drops as explained in an earlier post on the storm of late May. As before, light falls are yellow and heavier falls blue.  

1.  Rain approaching Taranaki from the north on 5 June 2010. In this animation the precipitation is relatively uniform, suggesting more continuous rain:  

New Plymouth radar imagery, 6pm to 10:30pm NZ Standard Time, 5 June 2010

As an aside, the semi-circular area of yellow over the North Taranaki Bight (towards the end of the animation) is clutter caused by reflection off the sea.

2.  Showers moving onto the Canterbury coast and plains from the south on 8 June 2010. Here the precipitation is speckled, indicating showers with breaks in between:

Christchurch radar imagery, 6pm to midnight NZ Standard Time, 8 June 2010

The key to unlocking the cause of the difference between rain and showers lies in the nature of vertical motion. In the post on The Structure of Highs I explained the importance of the up and down motion of air. Even though vertical motion is usually much weaker than the horizontal motion of air (the wind), it really dictates the state of the sky:

  • upward motion of moist air favours the formation and maintenance of cloud, and possibly precipitation too,
  • downward motion of air inhibits cloud, favouring clear skies.

Relating this to our original question: northerly flows over New Zealand occur ahead of approaching depressions or troughs of low pressure. These flows are characterised by gently rising warm, moist air covering a large area. In these situations you’re more likely to get continuous rain.

In contrast, southerly flows occur immediately behind departing depressions or troughs, where there tend to be pockets of more vigorously rising air surrounded by generally sinking, clear air. In addition, if the air flows over the sea, the cold, dry air picks up extra moisture and becomes unstable, so the rising air gains extra buoyancy. In these situations the precipitation is likely to be punctuated with breaks – that is, you’re more likely to get showers.

Combinations and exceptions

There are some complications to this. Occasionally, within the gently rising northerly airstream there are pockets or bands of vigorously rising air that can even generate embedded thunderstorms. These are caused by subtle instability mechanisms, and they are of great interest to pilots because the surrounding rainy areas can make the embedded, hazardous thunderstorms difficult to detect.

It’s also possible to have an approaching front generating a broad area of rain as it flows over the top of an older shallow flow generating light showers. I can show you an example if you’re interested - just leave a comment below. 

Topographic effects can lead to preferred places for showers in southerly flows. For example, have another look at the radar animation of showers above and note the band of showers over the Canterbury Plains at the western edge of the radar echoes. This is caused by a low-altitude convergence effect that favours upward motion there; there are no showers farther west because of sheltering from the eastern Otago ranges. If you are beneath a band of showers like that, it will seem more like continuous rain than passing showers.

Another common situation where showers behave more like continuous rain is when an unstable airmass flows onto the West Coast – in these conditions the cloud is often continuous, and the precipitation is too, with alternating periods of heavier and lighter rain.

If you look very closely at the second animation you may detect a slight change in the orientation of the band of showers – it turns a few degrees clockwise and crosses Christchurch. This turning is caused by a small clockwise shift in the direction of the wind flow. Subtle changes like this are a real challenge for our forecasters, but our growing network of weather radars is improving our ability to forecast them.

A Lot of Rain

There has been a lot of rain in many parts of New Zealand over the past two weeks. One of the few places to escape this was the sunny West Coast of South Island – the days are clear and stunning there when the flow is southeasterly.

If you’ve been following the surface weather maps during May, you will have noticed a lot of low pressure over the Tasman Sea and northern New Zealand, and periods of relatively high pressure farther south. The same pattern has held aloft, where upper-level processes have been driving what has happened at the surface. Take a look at the two charts below – they’re similar to the coloured charts from an earlier post, showing the variation of upper-level pressure heights. You can think of them as showing where the flow was cyclonic and anticyclonic in the upper atmosphere.

Upper flows from 1 to 14 May 2010: blue/purple = cyclonic, yellow/red = anticyclonic

As above, but for 15 to 28 May 2010. Images courtesy of the U.S. National Oceanic & Atmospheric Administration Earth System Research Laboratory

During the first half of May (top chart) there was a strong tendency for anticyclonic conditions southeast of NZ. For the second half of May (bottom chart), this anticyclonic anomaly shifted westwards while a marked cyclonic tendency formed over the Tasman Sea, North Island and much of the South Island. This combination generated not only a lot of rain, but also a powerful easterly flow that amplified the rain in many eastern parts of the country.

Northern districts have had a lot of rain too, and the intense falls reliably recorded in Whakatane on Tuesday evening (1 June) were phenomenal: 89.8 mm of rain fell in just one hour, with heavy rain on either side of that hour as well. Falls of that intensity are fortunately rare in New Zealand, and it’s hard to imagine what they’re like unless you’ve experienced them. I did once, when visiting Nadi, Fiji many years ago…

It was the heaviest rain I’d ever seen, and it set a local record. I recall having to leave a building and get into a car, needing to go through the rain for just 2 or 3 metres – but that was enough to soak my clothes. Driving was virtually impossible until the rain eased off.

Returning to the weather in May: the storm I wrote about in my previous post was unusual in many respects, but the way the depression formed off the east coast of Australia was actually quite common. The sea off Australia’s eastern seaboard is a favoured area for depressions to form, as shown here:

The reason lies in the temperature and moisture contrasts between the Australian continent and the sea – these contrasts are the food of developing lows, and are perhaps the main way that Australia influences our weather and climate. Once the lows have formed, they preferentially move towards the east-southeast – you guessed it, straight towards New Zealand.

Keep an eye on the weather maps this Queen’s Birthday weekend, if you can. Another low is forming – in the favoured area – and looks like it will bring more rain to the north, starting Saturday night. Our forecasters will be monitoring it closely to give you the best possible forewarning.