Readability tests were originally designed to help teachers understand how well children were reading and to help assess the suitability of a book for a particular age group. Typically, they look at things like word and sentence length, and the frequency of longer words.
More than 100 different readability formulas exist. The most commonly used today is the Flesch-Kincaid, largely because it is built into Microsoft Word. Its scores correspond to US school grades - that is, the number of years of education completed - so a Flesch-Kincaid Grade Level of 9 means a text is suitable for people who have already completed 9 years of education.
Word will also give you a Flesch Reading Ease result. Higher scores indicate that material is easier to read; lower scores, more difficult. Reader's Digest magazine has a readability index of about 65, while the Harvard Law Review is in the low 30s.
The Gunning Fog Index counts the difficult words - those with 3 or more syllables - in a 100-word passage, combines that with the average sentence length, and calculates a reading age. The Bible and Shakespeare have a Fog Index of about 6.
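All three tests boil down to simple arithmetic over word, sentence and syllable counts. A minimal sketch in Python, using a naive vowel-group syllable counter (real tools use pronunciation dictionaries, so treat these numbers as rough approximations):

```python
import re

def count_syllables(word):
    # Naive approximation: count groups of consecutive vowels.
    # Real readability tools use pronunciation dictionaries.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def text_stats(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Gunning Fog's "difficult words": 3 or more syllables.
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)
    return len(sentences), len(words), syllables, complex_words

def flesch_kincaid_grade(text):
    s, w, syl, _ = text_stats(text)
    return 0.39 * (w / s) + 11.8 * (syl / w) - 15.59

def flesch_reading_ease(text):
    s, w, syl, _ = text_stats(text)
    return 206.835 - 1.015 * (w / s) - 84.6 * (syl / w)

def gunning_fog(text):
    s, w, _, cx = text_stats(text)
    # Average sentence length plus the percentage of complex words.
    return 0.4 * ((w / s) + 100 * (cx / w))
```

Short, plain sentences should score a low grade and a high reading ease; long sentences stuffed with polysyllables do the opposite.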
Usability experts suggest that if you are targeting a consumer audience you should aim for a 7th or 8th grade reading level. If your copy comes in above that, it's too complicated for a mainstream audience, though it could still work for an audience of professionals.
However, readability tests have problems. They are a blunt instrument. In some contexts a longer word is more helpful than several shorter ones. And because the tests are based purely on a mathematical formula, you could write sentences of short words that make no sense and still come up with a good readability level.
Readability v comprehension
So readability isn't always the same as comprehension. You also may be able to read a word without understanding the full implications of what it means.
A paper in the International Journal of Mobile and Human Computer Interaction by Singh et al (cited by web usability guru Jakob Nielsen) gives a good example of this difference. Compare these two sentences:
1. He waved his hands.
2. He waived his rights.
Both are short sentences that are fairly easy to read. Everyone knows exactly what the first sentence means. You might need a law degree to understand the full implications of the second.
So while someone may be able to read the words, does this necessarily mean that they understand them?
One method often used to measure comprehension is a Cloze test. Users are provided with a sample of text with every 5th or 6th word blanked out. To make the test slightly easier you can spread the blanks further apart.
You then ask the users to fill in the missing words as best they can. The score is the number they get right. Because you are testing comprehension, not spelling, you can allow misspellings or synonyms.
The theory is that if users score 60% or more on average (eg 6 out of 10 blanks), the text is being understood by most readers.
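The mechanical part of the procedure - blanking every nth word and totting up the score - is easy to automate. A minimal sketch, assuming exact case-insensitive matching; in a real test you would credit synonyms and misspellings by hand:

```python
def make_cloze(text, gap=5):
    """Blank out every `gap`-th word. Returns the test text and
    the list of answers (the removed words, in order)."""
    words = text.split()
    answers, out = [], []
    for i, word in enumerate(words, start=1):
        if i % gap == 0:
            answers.append(word)
            out.append("_" * len(word))  # blank sized to the word
        else:
            out.append(word)
    return " ".join(out), answers

def score_cloze(responses, answers):
    """Percentage of blanks filled with the original word
    (case-insensitive). Synonym credit needs human judgement."""
    right = sum(1 for r, a in zip(responses, answers)
                if r.strip().lower() == a.strip().lower())
    return 100 * right / len(answers)
```

A score of 60 or above from `score_cloze` would meet the threshold described above.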
But experts disagree about which words to remove: some argue for a purely mechanical count (every nth word), others that the test should concentrate on deleting nouns, or deleting verbs. Some researchers think that nouns carry the most meaning and that deleting them can make the test too hard to be useful.
So how useful are Cloze tests for judging how well your site is being understood? It can be argued that they show how readers draw on context, as well as vocabulary, for meaning.
3.2.2 If you do not 1)____ a valid television licence you may not watch 2) __________ programmes using BBC Online 3)________ on any device (including 4) ______ phones, "smart" phones or 5) _______, laptops, tablets and personal 6) _________) at the same time ( 7) __ virtually the same time) 8) __ the programmes are being 9) _________, simulcast or otherwise made 10) _________ (by the BBC on 11) __________, unless you have a 12) _____ television licence. For more 13) ___________ on this requirement please 14) ___ the Frequently Asked Questions 15) __ you can contact TV 16) _________ by calling 0870 241 5590 or by visiting www.tvlicensing.co.uk.
Now test your site
It's a good idea to check the readability levels of the content on your site. Check new content before you upload it.
Try a few Cloze tests on colleagues or willing customers or users and see how well they fare. Having at least 50 blanks increases the reliability of the test. You can also give different groups the same text with different sets of missing words - five variations are possible if you delete every fifth word.
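Those five variants are produced by shifting where the deletion cycle starts. A sketch of the idea (the `offset` parameter is my own naming for illustration, not a standard term):

```python
def cloze_variants(text, gap=5):
    """Produce `gap` versions of the same text, each blanking a
    different fifth of the words: offset 0 blanks the 1st, 6th,
    11th... words; offset 1 the 2nd, 7th, 12th...; and so on.
    Across all variants, every word is blanked exactly once."""
    words = text.split()
    variants = []
    for offset in range(gap):
        out = ["_" * len(w) if i % gap == offset else w
               for i, w in enumerate(words)]
        variants.append(" ".join(out))
    return variants
```

Giving each group a different variant means the whole text gets tested while each individual still sees four words of context around every blank.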
Readability v Cloze: which works best?
First off, remember:
- Readability formulas measure the text - how easy it is to read
- Cloze tests measure the reader - how much the individual understands
Copywriters should use both tools. Run the spelling and grammar check in Word, see what reading age you come up with, and simplify where you can if the numbers come out too high. But don't try too hard to hit the ideal targets - they are best seen as useful benchmarks. Most of us have to use some specialist or complex language that will push the scores in the wrong direction.
Cloze tests are particularly useful if you want to test how well your users understand pages like instructions on how to use a product, your terms and conditions, or technical information. They will give you a clear guide to how much your users actually understand.
But you can't just rely on these tests. As a writer, it is up to you to make sure that the content is written in an appropriate tone, well-structured, coherent and aligned with users’ needs - all elements which add to reader understanding.
Top tips for improving readability
- Use short, simple words
- Write short sentences
- Avoid jargon and explain any complicated terms
- Use the active voice
- Write instructions in the imperative