Project 3: Perception & Cognition

Mental frames of situations can bias our perception so that we see the objects and events we expect to see in that situation; they are a mental shortcut. By eliminating the need to constantly scrutinise every detail of our environment, these frames help us get around our world with more ease. When browsing a website, we expect to see a site name, a logo, a navigation bar, some other links and maybe even a search box (Johnson, J. (2010). Designing with the Mind in Mind: Simple Guide to Understanding User Interface Design Rules. San Francisco, CA: Morgan Kaufmann).

If we are trying to find something, but it is in a different place or looks different from usual, we might miss it even though it is in plain view because experience tunes us to look for expected features in expected locations.

This is why consistent placement of controls is a common user-interface guideline: it ensures that reality matches the user's frame for that situation. An example of this in my app design is the consistent placement of the navigation controls at the top of the screen [see figure 1: navigation at the top of the screen].

[Figure 1: Navigation at the top of the screen]
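That consistent placement can also be enforced structurally, so that no individual screen decides where the navigation goes. A minimal sketch of the idea, with illustrative names (`View`, `navBar`, `screen`) that are not taken from the actual app code:

```typescript
// Illustrative sketch: every screen is composed through one scaffold,
// so the navigation bar always occupies the same position at the top.
interface View {
  kind: string;
  children?: View[];
}

const navBar: View = { kind: "nav-bar" };

// Compose a screen: the nav bar is always the first (topmost) child,
// regardless of what content the individual screen supplies.
function screen(content: View): View {
  return { kind: "screen", children: [navBar, content] };
}

const feed = screen({ kind: "feed" });
```

Because every page is built through the same scaffold, the control the user expects is always where their mental frame predicts it will be.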

The relative ease with which we recognise things, rather than recall them, is the basis of the graphical user interface (GUI) (Johnson, J. (2007). GUI Bloopers 2.0: Common User Interface Design Don'ts and Dos. San Francisco, CA: Morgan Kaufmann).

People recognise pictures very quickly, and recognising a picture also stimulates the recall of associated information. Pictures that people recognise from the physical world are useful because they can be recognised without needing to be taught. This recognition works as long as the familiar meaning matches the intended meaning in the computer system (Johnson, 2007) [see figure 2: camera and pencil icons].

[Figure 2: Camera and pencil icons]

Our perception is also filtered by our goals. When people navigate through a site or app seeking information or a specific function, they don't look at every detail on the screen but rather scan it quickly, seeking out the items that seem related to their task. By giving the primary interactive elements within the app a consistent blue colour, the user is able to quickly distinguish the interactive navigational elements.

When we are looking for something, our brain can prime our perception to be especially sensitive to features of what we are looking for (Ware, C. (2008). Visual Thinking for Design. San Francisco, CA: Morgan Kaufmann). In an interactive system it is important to make sure colour coding is consistent and the contrast between colours is high enough that the user can recognise an element's function quickly and unambiguously.

The colours that people can distinguish most easily are those that cause a strong signal (positive or negative) on one of the three colour-perception channels, and neutral signals on the other two channels. Those colours are red, green, yellow, blue, black and white. All other colours cause signals on more than one colour channel, and so our visual system cannot distinguish them from other colours as quickly and easily as it can distinguish those six (Ware, 2008).
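The "high enough contrast" requirement can be made concrete with the WCAG contrast-ratio formula. A minimal sketch follows; the token name and hex value are illustrative, not taken from the actual Smartplan palette:

```typescript
// A single design token keeps the interactive blue consistent app-wide.
// (Hex value is illustrative only.)
const INTERACTIVE_BLUE = "#1565c0";

// WCAG 2.x relative luminance of an sRGB hex colour like "#rrggbb".
function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearise the gamma-encoded sRGB channel.
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio between two colours, from 1:1 up to 21:1.
// WCAG AA asks for at least 4.5:1 for normal-size text.
function contrastRatio(a: string, b: string): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}
```

A check like `contrastRatio(INTERACTIVE_BLUE, "#ffffff") >= 4.5` can then run in tests, so the colour coding stays both consistent and legible as the palette evolves.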

Another important aspect of how we perceive things is that we unconsciously impose structure on visual input. We are hardwired to perceive our surroundings in terms of whole objects. The Gestalt principles of visual perception provide a useful basis for guidelines for graphic and user-interface design (Soegaard, M. (2007). Gestalt Principles of Form Perception. Web article).

The principle of Proximity is that the relative distance between objects in a display affects our perception of whether and how the objects are organised into subgroups. Objects that are near each other (relative to other objects) appear grouped, while those that are farther apart do not.

The hierarchy of on-screen controls can be grouped, and simultaneously separated, by simply placing a dividing line between them [see figure 3: navigation bar].

[Figure 3: Navigation bar with dividing line]

The dividing line separates the Global Nav from the Local Nav.

The Local Nav is further grouped across hierarchies by placing the Messages and Settings icons on the same side as the Profile icon, as they too relate to the personal profile; the Notebook is a separate local function of importance.

The Gestalt principle of figure/ground describes how our visual system structures the data it receives: our mind separates the visual field into the figure (the foreground) and the ground (the background). This is a useful tool to draw attention to a new object while keeping the user within the same context.

The figure/ground principle was adopted in the app through the use of pop-up information over the main content. The content that was formerly the figure (the focus of the users' attention) temporarily becomes the background for new information, which appears briefly as the new figure [see figure 4: pop-up info].

[Figure 4: Pop-up information over the main content]

The design of the app was also based upon a grid layout of four vertical columns, which provided a unifying structure to each page. The grid allows for the structured presentation of information across the varying pages of the app and helps the user quickly comprehend what they are looking at [see figure 5: grid].

[Figure 5: Four-column grid layout]
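The geometry of such a grid reduces to a small calculation. A minimal sketch, where the margin and gutter widths are illustrative values rather than the actual Smartplan measurements:

```typescript
// Illustrative grid definition: column count, outer margin, inner gutter.
interface Grid {
  columns: number;
  margin: number; // space at left and right screen edges
  gutter: number; // space between adjacent columns
}

// x-offset and width of each column for a given screen width,
// so every page can align its content to the same vertical lines.
function columnFrames(screenWidth: number, g: Grid): { x: number; width: number }[] {
  const colWidth =
    (screenWidth - 2 * g.margin - (g.columns - 1) * g.gutter) / g.columns;
  return Array.from({ length: g.columns }, (_, i) => ({
    x: g.margin + i * (colWidth + g.gutter),
    width: colWidth,
  }));
}

// e.g. a 360pt-wide screen with 16pt margins and 8pt gutters
const frames = columnFrames(360, { columns: 4, margin: 16, gutter: 8 });
```

Because every page derives its element positions from the same four column frames, related items land on the same vertical lines from screen to screen, which is what gives the layout its unifying structure.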


Most people in industrialised nations grow up in households and school districts that promote education and reading. They learn to read and write as young children and become good or excellent readers by adolescence. However, while speaking and understanding spoken language is a natural human ability, reading is not. Over hundreds of thousands of years, the human brain evolved the neural structures necessary to support spoken language; as a result, humans are born with an innate ability to learn as toddlers, with no systematic training, whatever spoken language they are exposed to.

In contrast, writing and reading did not exist until a few thousand years BC and did not become common until only four or five centuries ago, long after the human brain had evolved into its modern state. At no time during childhood do our brains show any special innate ability to learn to read. Instead, reading is an artificial skill that we must learn by systematic instruction and practice (Sousa, D. A. (2005). How the Brain Learns to Read. Thousand Oaks, CA: Corwin Press).

Careless writing or presentation of text can reduce skilled readers' automatic, context-free reading to conscious, context-based reading, burdening working memory and thereby decreasing speed and comprehension. In unskilled readers, poor text presentation can block reading altogether (Johnson, 2010).

Even when the vocabulary is familiar, reading can be disrupted by hard-to-read scripts and typefaces. Bottom-up, context-free, automatic reading is based on recognition of letters and words from their visual features. Therefore, a typeface with difficult-to-recognise features and shapes will be hard to read.


The design of the app employed two distinct typefaces. The Smartplan logo and practically all the icons used throughout the app are set in the same typeface, Byom, as is the top hierarchical heading on each page [see figure 6: Byom]. This consistency gave a uniformity to the wayfinding system used throughout the app's user interface.

[Figure 6: Byom typeface in the logo, icons and headings]

The second typeface used in the design is Gotham. This typeface has a large x-height, allowing for clear legibility even at small type sizes, which was important for the readability of any large patches of text, as used in the "information pieces", but most importantly as used on the navigational icons. In addition, I chose to keep the descriptive icon text in all caps to signal its importance and to improve legibility further [see figure 7: Gotham].

It was important to keep in mind that too much text in a user interface loses poor readers and can even alienate good readers. In the information patches I was careful to use the least amount of text necessary to get users to their intended goals.

[Figure 7: Gotham typeface used for icon labels and information text]

