
Do Not Ignore Visuals When Testing Materials

28 May 2025

Mark Gibson, UK

Health Communication Specialist

In health communication, whether written or digital, a lot of effort is spent testing content for clarity, comprehension, readability, legibility and accuracy. We test interfaces such as eCOA for usability. We test language for readability, proof of understanding and conceptual equivalence. Too often, however, visual elements such as icons, diagrams and pictograms are ignored in the testing process, glossed over, or not probed with the right questions. They are treated as if their meaning is self-evident.

It is not.

Without testing, we have no clue whether they are self-evident or not. Assumptions carry the risk of misinterpretation, confusion, inaccessibility and user disengagement.

To me, it is stunning that this article’s central tenet even needs arguing: when testing anything destined for real users, visuals need to be on an equal footing with text and functionality. Not giving visuals their due attention does not just leave the user/usability testing job incomplete; it can also have other consequences.

Visuals Are Language

Visuals are not decoration. They signify compressed meaning. They are shorthand for actions, ideas and objects. This makes them a kind of language. And just like any language, visuals require interpretation.

We have written about the use of the floppy disk as shorthand for ‘save’, the paper clip for ‘attach’ and the inbox tray for incoming messages. Floppy disks have not been used since the mid-2000s. In the era of cloud-based sharing, there is reduced need for attaching documents and, in the paperless era, work tasks are no longer physically ‘in-trayed’. Younger users simply would not know what an in-tray is.

If the user has not learned the metaphor, they cannot decode the icon. Testing helps identify who is left behind by a visual. Without testing, designers are just assuming (or hoping).

Visuals Can Mislead

Designers know what an icon is supposed to mean. Users do not necessarily share that understanding. Visuals can mislead users and designers will never know that if they do not ask users.

Here are some real-world examples:

·       The paper-airplane icon commonly used to mean “submit” is sometimes interpreted as “share”.

·       A heart used to indicate “health” is often interpreted as “like” or “favourite”, due to influence from social media platforms.

·       A bin (a trash can really, but I am British, so I say bin) used for “delete” is, in some cultures, interpreted literally as physical waste rather than digital removal.

These misunderstandings would never be uncovered if the visuals themselves were not specifically tested with users. It is important to test the visuals, not just the words around them.

Untested Visuals Can Compromise Accessibility

In public health, patient communication and digital apps, visuals that are not tested as thoroughly as other aspects of the system can directly compromise equity. Users with low literacy may rely entirely on icons, but only if the icons are legible, familiar and intuitive. People with neurodivergence or cognitive impairments often benefit from clean, literal visuals, not abstract or metaphorical ones. In multilingual contexts, visuals are often the only shared mode of understanding, but only if they work well across cultures.

If you only test for comprehension or conceptual equivalence of the text and pay less attention to the visuals, you will only get half the picture.

UX Testing Often Focuses on Flow, Not Visual Friction

User experience testing is great at mapping journeys: observing how long it takes users to complete a task, where they get stuck and what they were expecting.

Unless the test explicitly probes visual elements, you will miss critical micro-confusions, such as:

·       Was the user unsure what the icon did?

·       Did they skip a step because they misread a visual?

·       Did they hesitate, squint or guess?

Users may not verbalise confusion about visuals. They may simply move on or even abandon the task. Without targeted observation or prompting, this friction goes unrecorded and unaddressed.

Visuals Carry Emotional Weight

As well as function, visuals carry tone, and tone matters in a system. For example, a skull icon is technically correct for danger – correct, in terms of convention – but would this tone be appropriate for use in patient communication? The icon could threaten or infantilise users. Alternatively, a red exclamation mark might be ignored if used too often or people might think it is signalling a false alarm. Cartoonish visuals might clash with the seriousness of the subject matter.

Testing of visuals also needs to focus on their emotional impact, not just the text.

Visuals Change Over Time

An icon that worked just 5 years ago may now be obsolete. A symbol once universally understood may now belong to a shrinking demographic.

Without testing, you won’t notice when:

·       Metaphors go stale and decay.

·       Meaning shifts across cultures or generations.

·       The icon becomes more of a curiosity than a cue.

Testing helps track the vitality of visual elements, just as we test for proof of understanding or conceptual equivalence.

Give Visuals Their Due

The testing of visuals needs to be intentional. Interviews need to ask users explicitly what they think an icon means. In UX testing, the focus should not only be on task completion, but also on how quickly and confidently users interpret visuals. A/B tests could compare icon-only buttons with labelled buttons.

Earlier, though, at the design phase, visuals should ideally be co-designed with the target users in an iterative development approach.

Test Visuals Like You Would Test Words

Visuals should be treated like language. This is because they are. They speak. They shape meaning. And they evolve.

Ignoring, skipping or glossing over visuals in testing is like assuming everyone will understand a piece of content just because the writers understand it. Visuals can be misunderstood, outdated and misaligned with user expectations.

A good visual does not need to be explained. But you will not know whether yours is one until it has been tested systematically, in a way that provides proof of comprehension and usability.

Thank you for reading.


Mark Gibson

Leeds, United Kingdom, April 2025

Originally written in English