Towards Improved Constructive Thinking and Greater Holistic Objectivity and Clarity in a Complex World. This Blog is a Resource of Articles on the Thinking Process from Education, Information Science, Philosophy, Science, Linguistics, Psychology, Artificial Intelligence, Sociology, Media Studies, Statistics, Behavioural Sciences, and Other Sources. The Development of the Precision Universal Debating Project acts as the Basic Backdrop to the Whole Subject.
"Outside the box" redirects here. For several proper names, see Outside the Box.
The "nine dots" puzzle. The goal of the puzzle is to link all 9 dots using four straight lines or fewer, without lifting the pen and without tracing the same line more than once. One solution appears below.
Thinking outside the box (also thinking out of the box[1][2] or thinking beyond the box) is a metaphor that means to think differently, unconventionally, or from a new perspective. This phrase often refers to novel or creative thinking. The term is thought to derive from management consultants in the 1970s and 1980s challenging their clients to solve the "nine dots" puzzle, whose solution requires some lateral thinking.[3]
The catchphrase, or cliché, has become widely used in business environments, especially by management consultants and executive coaches, and has been referenced in a number of advertising slogans. To think outside the box is to look further: to try not to think of the obvious things, but of the things beyond them.
A simplified definition for paradigm is a habit of reasoning or a conceptual framework.
A simplified analogy is "the box" in the commonly used phrase "thinking outside the box". What is encompassed by the words "inside the box" is analogous with the current, and often unnoticed, assumptions about a situation. Creative thinking acknowledges and rejects the accepted paradigm to come up with new ideas.
The notion of something outside a perceived "box" is related to a traditional topographical puzzle called the nine dots puzzle.[3]
The origins of the phrase "thinking outside the box" are obscure, but it was popularized in part because of the nine-dot puzzle, which John Adair claims to have introduced in 1969.[4] Management consultant Mike Vance has claimed that the use of the nine-dot puzzle in consultancy circles stems from the corporate culture of the Walt Disney Company, where the puzzle was used in-house.[5]
Christopher Columbus's Egg Puzzle as it appeared in Sam Loyd's Cyclopedia of Puzzles.
The nine dots puzzle is much older than the slogan. It appears in Sam Loyd's 1914 Cyclopedia of Puzzles.[6] In the 1951 compilation The Puzzle-Mine: Puzzles Collected from the Works of the Late Henry Ernest Dudeney, the puzzle is attributed to Dudeney himself.[7] Sam Loyd's original formulation of the puzzle[8] titled it "Christopher Columbus's egg puzzle", an allusion to the story of the Egg of Columbus.
One of many solutions to the puzzle at the beginning of this article is to go beyond the boundaries to link all the dots with four straight lines.
The puzzle proposed an intellectual challenge—to connect the dots by drawing four straight, continuous lines that pass through each of the nine dots, and never lifting the pencil from the paper. The conundrum is easily resolved, but only by drawing the lines outside the confines of the square area defined by the nine dots themselves. The phrase "thinking outside the box" is a restatement of the solution strategy. The puzzle only seems difficult because people commonly imagine a boundary around the edge of the dot array.[9] The heart of the matter is the unspecified barrier that people typically perceive.
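The geometry described above can be checked mechanically. The sketch below (my own illustration, not from the article) encodes one classic solution as a four-segment polyline whose turning points lie outside the 3×3 square of dots, and verifies that every dot lies on some segment:

```python
# Verify a candidate solution to the nine-dot puzzle: four connected straight
# lines, pen never lifted, passing through all nine dots. Two vertices of the
# path lie outside the 3x3 "box" of dots, which is the whole trick.

DOTS = [(x, y) for x in range(3) for y in range(3)]  # the nine dots

# One classic solution: up the left column past the box, diagonally down-right,
# back along the bottom row, then diagonally up through the centre.
PATH = [(0, 0), (0, 3), (3, 0), (0, 0), (2, 2)]  # 5 vertices = 4 segments

def on_segment(p, a, b):
    """True if point p lies on the closed segment a-b (integer arithmetic)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    if cross != 0:
        return False  # not collinear with the segment
    return min(ax, bx) <= px <= max(ax, bx) and min(ay, by) <= py <= max(ay, by)

segments = list(zip(PATH, PATH[1:]))  # consecutive vertex pairs
covered = {d for d in DOTS if any(on_segment(d, a, b) for a, b in segments)}
print(f"dots covered: {len(covered)} of {len(DOTS)}")  # all nine
```

Restricting `PATH`'s vertices to the square spanned by the dots makes the check fail, which is the computational restatement of the "unspecified barrier" people perceive.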
Ironically, telling people to "think outside the box" does not help them think outside the box, at least not with the 9-dot problem.[10] This is due to the distinction between procedural knowledge (implicit or tacit knowledge) and declarative knowledge (book knowledge). A non-verbal cue, such as drawing a square outside the 9 dots, does allow people to solve the 9-dot problem better than average.[11] One very particular kind of verbalization also helped: speaking in a non-judgmental, free-association style. These were the instructions in a study that showed facilitation in solving the 9-dot problem:
While solving the problems you will be encouraged to think aloud. When thinking aloud you should do the following: Say whatever’s on your mind. Don’t hold back hunches, guesses, wild ideas, images, plans or goals. Speak as continuously as possible. Try to say something at least once every five seconds. Speak audibly. Watch for your voice dropping as you become involved. Don’t worry about complete sentences or eloquence. Don’t over explain or justify. Analyze no more than you would normally. Don’t elaborate on past events. Get into the pattern of saying what you’re thinking about now, not of thinking for a while and then describing your thoughts. Though the experimenter is present you are not talking to the experimenter. Instead, you are to perform this task as if you are talking aloud to yourself.[12]
The nine-dot problem is a well-defined problem. It has a clearly stated goal, and all necessary information to solve the problem is included (connect all of the dots using four straight lines). Furthermore, well-defined problems have a clear ending (you know when you have reached the solution). Although the solution is "outside the box" and not easy to see at first, once it has been found, it seems obvious. Other examples of well-defined problems are the Tower of Hanoi and the Rubik's Cube.
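The Tower of Hanoi mentioned above is the textbook well-defined problem: the goal, the legal moves, and the stopping condition are all explicit. As a small illustration (mine, not the article's), the classic recursive solution makes this explicitness visible, producing exactly 2^n − 1 moves for n disks:

```python
# Tower of Hanoi: move n disks from a source peg to a target peg, one disk at
# a time, never placing a larger disk on a smaller one. The recursion mirrors
# the problem's well-defined structure: park n-1 disks, move the largest,
# bring the n-1 disks back on top.

def hanoi(n, source="A", spare="B", target="C", moves=None):
    """Return the list of (from_peg, to_peg) moves solving n disks."""
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, source, target, spare, moves)  # park n-1 disks on the spare
    moves.append((source, target))              # move the largest disk
    hanoi(n - 1, spare, source, target, moves)  # restack n-1 disks on target
    return moves

solution = hanoi(3)
print(len(solution))  # 2**3 - 1 = 7 moves
```

The contrast with an ill-defined problem is sharp: nothing like a `hanoi()` function could be written for "what is the essence of happiness?", because neither the moves nor the stopping condition can be stated.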
In contrast, characteristics of ill-defined problems are:
not clear what the question really is
not clear how to arrive at a solution
no idea what the solution looks like
An example of an ill-defined problem is "what is the essence of happiness?" The skills needed to solve this type of problem are the ability to reason and draw inferences, metacognition, and epistemic monitoring.
This flexible English phrase is a rhetorical trope with a range of variant applications.
The metaphorical "box" in the phrase "outside the box" may be married with something real and measurable, for example, perceived budgetary[13] or organizational[14] constraints in a Hollywood development project. Speculation beyond the box's restrictive confines can be both:
(a) positive: fostering creative leaps, as in generating wild ideas (the conventional use of the term);[13] and
(b) negative: penetrating through to the "bottom of the box". James Bandrowski states that this could result in a frank and insightful re-appraisal of a situation, oneself, the organization, etc.
On the other hand, Bandrowski argues that the process of thinking "inside the box" need not be construed in a pejorative sense. It is crucial for accurately parsing and executing a variety of tasks — making decisions, analyzing data, and managing the progress of standard operating procedures, etc.
Hollywood screenwriter Ira Steven Behr appropriated this concept to inform plot and character in the context of a television series. Behr imagined a core character:
He is going to be "thinking outside the box," you know, and usually when we use that cliche, we think outside the box means a new thought. So we can situate ourselves back in the box, but in a somewhat better position.[14]
The phrase can be used as a shorthand way to describe speculation about what happens next in a multi-stage design thinking process.[14]
Adair, John (2007). The Art of Creative Thinking: How to Be Innovative and Develop Great Ideas. London; Philadelphia: Kogan Page. p. 127. ISBN 9780749452186.
Maier, Norman R. F.; Casselman, Gertrude G. (1970). "Locating the Difficulty in Insight Problems: Individual and Sex Differences". Psychological Reports 26 (1): 103–117. doi:10.2466/pr0.1970.26.1.103.
Lung, Ching-tung; Dominowski, Roger L. (1985). "Effects of Strategy Instructions and Practice on Nine-Dot Problem Solving". Journal of Experimental Psychology: Learning, Memory, and Cognition 11 (4): 804–811. doi:10.1037/0278-7393.11.1-4.804.
Fleck, Jessica I.; Weisberg, Robert W. (2004). "The Use of Verbal Protocols as Data: An Analysis of Insight in the Candle Problem". Memory & Cognition 32 (6): 990–1006. doi:10.3758/BF03196876.
Brainstorming is de rigueur in almost every professional environment.
Managers think that to get to the heart of an issue and get the best out of their creative personnel they should put them all in a room and the ideas will fly.
But here’s the thing: brainstorming doesn’t work.
If you were charged with coming up with the worst way to get people to produce plentiful, unique, imaginative ideas, brainstorming would probably fit the bill.
I have talked briefly before about research into the subject and why more and more companies and creatives are moving away from brainstorming, at least in its classical 'let's all sit in a room and shout at each other' format. Here I am going to go through all the ways that brainstorming is ruining your creative process, and the ways you can change these sessions from creative busts to creative boons.
Why brainstorming doesn’t work
Imagine your average brainstorming session. The group files into a conference room, replete with a plate of cookies and an empty whiteboard. They are then given a task, designing a widget or building an app, and told to ‘shoot for the moon’ with their ideas. What happens next?
01. The group goes for low-hanging fruit
First off, someone says something obvious. Because brainstorming requires every idea to be given its due, the group runs with this first idea, discussing the idea even when everybody knows that it is bunkum. Before you know it half the session is gone and nothing has really been accomplished.
Welcome to anchoring. Anchoring is one of the literally hundreds of cognitive biases we all have. A cognitive bias is like a little rule in your brain that makes the world easier to understand and helps you make decisions quickly. Sometimes these rules work for us, and sometimes against us. With anchoring, we tend to put undue weight on the first idea or piece of information we are presented with. Everything afterwards is judged on its relative merit compared to that piece of information.
In a brainstorming session, the group gets ‘anchored’ to this initial idea and simply can’t let it go, spending too much time on it, and using it as the bar that all other ideas are measured against.
This then receives disproportionate attention during the session, and only at the end does everyone start to move on to more interesting ideas, and then rush through them.
These low-hanging fruit are also a way for some members of the group to look good. Let’s face it, we’ve all done it – chime in quickly with the thing everyone is thinking of, and then you have done your bit. You can sit back and eat the cookies while everyone else comes up with the smart ideas.
As we will see later, starting simple and taking baby steps from there is a good idea. But too often people never get beyond this simple stage.
02. ‘Groupthink’ sets in
Groupthink goes along with the anchoring effect. Generally people want to avoid conflict and we simply can’t help but want to agree with an idea someone presents, even if we have better ideas of our own.
This is a particular problem in a group situation like brainstorming. In any group there are going to be some people who are loudmouths and some who are wallflowers. Because criticism is generally frowned upon in brainstorming (see below), the format is supposed to help those of us who are quieter (me included, and probably a lot of creatives out there) find our voice and speak up. But anyone who has ever been in such a meeting knows this is not what happens.
The loud people talk first and dominate the conversation, and then everyone goes along with what they say. This is the natural order of such meetings. People want to conform and certainly don't want to face off against another member of the group, particularly if they themselves are shy. This means that whatever the loudest members say goes, even if it isn't the best of ideas. This is exactly what brainstorming was supposed to stop, but the group environment actually makes it worse.
03. You are told to be ‘uncritical’
One of the main selling points of brainstorming is that the sessions are uncritical environments where anyone can voice an idea without the worry of feeling stupid. This is supposed to foster originality and let the shyer members of the group (which creatives can often be) speak without fear of being shouted down.
That is great in theory, but terrible in practice. It leads to exactly the problems above, where easy or stupid ideas aren’t dismissed. The group has to run with them, discussing ideas that lack merit just so people do not get discouraged.
I am not advocating piling in on the first person to say something idiotic, but this paradox is a significant issue with brainstorming – we want people to come up with creative ideas, so don’t want to criticize them, but criticism is a necessary part of creativity. Therefore, you can’t have true creativity within a brainstorming session as they are currently run.
04. You are under pressure
In research published in the Harvard Business Review, Teresa Amabile, a Professor of Business Administration at Harvard Business School, showed that pressure is almost always terrible for creative thinking. She studied 177 people working for different companies in the US, asking them to keep diaries of when and how they felt most creative.
One of the worst correlates for creativity was pressure. When people felt under significant time pressure they felt more distracted, unable to focus on the task at hand, and as if they were on a treadmill and not really contributing to the firm's success. This is what can happen in a brainstorming session. People are under a tight time constraint and start to worry about that instead of focusing on the task at hand. Most people will have experienced this: glancing at the clock, worried both that you don't have enough time to solve the issue and that the session still isn't over.
This is the opposite of how a lot of creatives intuitively feel; many see the looming deadline and the pressure as a spur, something that makes the eureka moment all the more likely (we will come to the fallacy of eureka shortly). Time pressure can work, but rarely in the setting of a brainstorming session. Because the environment itself isn't conducive to creativity, with uncritical thinking, overbearing characters, and groupthink, the pressure only adds to the wrong decisions being made.
How to make brainstorming work
So what if you want to make your brainstorming sessions work? There have been some interesting studies into what does work in this type of environment and how to get the most creative ideas out of people. The short answers: allow people to find their own way, and, somewhat contradictorily, don't have a brainstorming session (yet).
01. Take baby steps
Unless you are Archimedes, there is no Eureka! When we looked at what makes creative people different, it was always that they worked hard and developed particular strategies that worked well for them.
There were no sudden moments of genius, even for the geniuses. Instead, great ideas grow from good ideas, which in turn grow from small ones.
Joel Chan and Christian Schunn, both from the University of Pittsburgh, looked at exactly this – how do you get from a small idea to a big idea to a great idea? Chan and Schunn looked at transcripts from different engineering teams to see how these designers came up with their ideas. They found that instead of some massive leap of knowledge from a member, the progress was gradual as the teams leapfrogged from one idea to the next, always growing their thoughts into bigger and bigger ideas, and getting closer to the solution.
“Creativity is a stepwise process in which idea A spurs a new but closely related thought, which prompts another incremental step, and the chain of little mental advances sometimes eventually ends with an innovative idea in a group setting,” said the researchers.
Real creativity comes from simple collaboration and the bouncing of simple ideas to something bigger. These designers may start with the ‘low-hanging fruit’ of other brainstorming sessions, but unlike the ones that go wrong, these groups keep making progress and do not anchor to a simple idea too early.
This shows how important the group dynamic is for brainstorming. If these engineering teams had had too many members, or some that were too overpowering, they might have stumbled too early. Instead, the groups worked well together and collectively came up with the ideas they needed.
Chan and Schunn also said that setting up a brainstorming session as a forum for ‘great ideas’ can backfire. The members can then become fixated on trying to come up with that one great idea and forget the small ideas that it will take to get there. Companies looking to set up these brainstorming sessions should do away with that kind of expectation setting, instead allowing the creatives to explore and come up with ideas organically.
02. Use analogies
Chan and Schunn also noticed that analogies were used often by these design teams. For instance, one team was tasked with working on the design of a printer, and was thinking about how it might open and close. In discussing the mechanism, they used the analogy of a VCR, then a garage door, and then a roller door, all in the space of a few minutes as they developed their ideas.
Analogies tie in with the idea of taking baby steps. It might be difficult to explain the big idea that you have, but you can probably break it down into smaller ideas that might already exist and that you can readily describe.
Using analogies also allows others an easy window into your head. People are generally not very good at describing ideas straight from their thoughts. Instead we prefer to use metaphors, as these draw on experiences common to everyone. By using analogy and metaphor we can bypass direct description, which can be time-consuming, and use a shared idea instead.
There is a downside to this that goes along with the anchoring from earlier. If you give people an analogy of what you want them to design, then that can get stuck in their head, and they will not be able to move away from that idea. Instead, you have to give the creatives and designers space to explore the ideas themselves and come up with their own analogies.
03. Allow people time
Pressure is a killer for creativity, whereas giving people time can help foster ideas and creative thinking. Stuffing people in a room and telling them 'Go!' is really the worst thing for original thought. From her study of employees at top companies, Amabile found that the best ideas came not from brainstorming sessions run in big groups, but from a few employees getting together naturally to discuss a problem.
In these smaller, impromptu meetings, ideas were generated more rapidly and were more creative. If you think about it, this makes sense. The pressure is off in these meetings, and everyone feels more comfortable talking to just one or two others. Conversation can flow more naturally, it is unlikely that one person will take control of the conversation, and criticism can be given in a more discreet way. All of which makes these smaller brainstorming sessions much better than their bigger cousins.
These small groups can then bring their ideas to a larger meeting, where they can talk as one (meaning that individuals do not feel singled out if an idea is bad) and where more critical feedback on the different groups' ideas can be given.
If you want to run a brainstorming session, giving people as much warning and time to prepare as possible is ideal. They can initially think through ideas in their heads, dismissing the most obvious and most wanting, then move to smaller groups to discuss the initial ideas and grow their small ideas. Then you should take the ideas to the bigger group, where all developed ideas can be discussed and critiqued.
04. Brainwrite instead of Brainstorm
This seems to be the secret to successful brainstorming: essentially, don't do it. Sitting in a big room trying to come up with ideas is not going to work. Discussing and critiquing those ideas in a group is great, but not the actual thinking. Divorcing the idea-generation aspect of brainstorming from the critique is the only way to get truly great creative ideas.
To do that, Professor Leigh Thompson thinks we should brainwrite instead of brainstorm. Thompson, Professor of Management & Organizations at the Kellogg School of Management, says that if everyone writes down their ideas before the meeting and then comes together to discuss them, it gets rid of all the difficulty of discussing simple ideas, as well as of people trying to game the system by talking up an easy solution and then staying silent for the rest of the meeting.
By having an initial ideas period before the meeting, people get time to think without the pressure of the brainstorming session, and can come up with a wide array of ideas before the meeting starts.
How to run a great brainstorming session
So what should you do if you want designers, creatives, and everyone else within your organization to come up with great ideas in a brainstorming session?
Well, give them plenty of warning. The prevailing wisdom is that you have to give people time to work on their ideas before they get anywhere near that big conference room and bare whiteboard. They can start sprouting ideas individually, before moving to smaller groups to discuss them further. Hopefully in these small meetings they will take those 'baby steps', developing ideas into something bigger that they will then collectively bring to the group.
In the brainstorming sessions themselves, constructive criticism should be given so that the group as a whole can move on to the best and most creative solutions to the problems at hand. OK, so this is all a lot harder than stuffing people in a room with some coffee and cookies and locking the door. But brainstorming sessions were originally envisaged as a way to get the best, most creative ideas out of a group of people. As they stand, they do not work. But they can. By following a few simple ideas to foster creativity, you can grow small ideas into great ones and have brainstorming sessions that are true whirlwinds.
Andrew Tate is a freelance writer and neuroscientist who has worked on understanding the brain and how it learns in the UK, Switzerland, and the US. His interest in design stems from a passion for proper presentation, especially of data, his love of doodling, and his inability to draw anything more sophisticated than a stick figure (and his awe at anyone that can).
Naïve realism argues we perceive the world directly
Naïve realism, also known as direct realism or common sense realism, is a philosophy of mind rooted in a theory of perception that claims that the senses provide us with direct awareness of the external world. In contrast, some forms of idealism assert that no world exists apart from mind-dependent ideas and some forms of skepticism say we cannot trust our senses.
The realist view is that we perceive objects as they really are. They are composed of matter, occupy space and have properties, such as size, shape, texture, smell, taste and colour, that are usually perceived correctly. Objects obey the laws of physics and retain all their properties whether or not there is anyone to observe them.[1]
Naïve realism is called direct realism, as against indirect or representative realism, when its arguments are developed to counter the latter position, also known as epistemological dualism:[2] the view that our conscious experience is not of the real world but of an internal representation of the world.
Some statements about these objects can be known to be true through sense-experience.
These objects exist not only when they are being perceived but also when they are not perceived. The objects of perception are largely perception-independent.
These objects are also able to retain properties of the types we perceive them as having, even when they are not being perceived. Their properties are perception-independent.
By means of our senses, we perceive the world directly, and pretty much as it is. In the main, our claims to have knowledge of it are justified.[3]
In the area of visual perception in psychology, the leading direct realist theorist was J. J. Gibson. Other psychologists were heavily influenced by this approach, including William Mace, Claire Michaels,[4] Edward Reed,[5] Robert Shaw, and Michael Turvey. More recently, Carol Fowler has promoted a direct realist approach to speech perception.
Naïve realism is distinct from scientific realism, which states that the universe contains just those properties that feature in a scientific description of it, not properties like colour per se but merely objects that reflect certain wavelengths owing to their microscopic surface texture. Naïve and direct realism propose no physical theory of experience and do not identify experience with the experience of quantum phenomena or the twin retinal images. This lack of supervenience of experience on the physical world means that naïve realism is not a physical theory.[6]
An example of a scientific realist is John Locke, who held that the world contains only the primary qualities that feature in a corpuscularian scientific account of it (see corpuscular theory), and that other properties are entirely subjective, depending for their existence upon some perceiver who can observe the objects.[1]
Realism in physics refers to the view that any physical system must have definite properties whether measured/observed or not. Physics up to the 19th century was always implicitly, and sometimes explicitly, taken to be based on philosophical realism.
Scientific realism in classical physics has remained compatible with the naïve realism of everyday thinking on the whole, but there is no known, consistent way to visualize the world underlying quantum theory in terms of ideas of the everyday world. "The general conclusion is that in quantum theory naïve realism, although necessary at the level of observations, fails at the microscopic level."[7] Experiments such as the Stern–Gerlach experiment and quantum phenomena such as complementarity lead quantum physicists to conclude that "[w]e have no satisfactory reason for ascribing objective existence to physical quantities as distinguished from the numbers obtained when we make the measurements which we correlate with them. There is no real reason for supposing that a particle has at every moment a definite, but unknown, position which may be revealed by a measurement of the right kind... On the contrary, we get into a maze of contradiction as soon as we inject into quantum mechanics such concepts as carried over from the language and philosophy of our ancestors... It would be more exact if we spoke of 'making measurements' of this, that, or the other type instead of saying that we measure this, that, or the other 'physical quantity'."[8] It is no longer possible to adhere to both the principle of locality (that distant objects cannot affect local objects) and counterfactual definiteness, a form of ontological realism implicit in classical physics. Some interpretations of quantum mechanics hold that a system lacks an actualized property until it is measured, which implies that quantum systems exhibit non-local behaviour. Bell's theorem proved that every quantum theory must violate either locality or counterfactual definiteness. This has given rise to a contentious debate over the interpretation of quantum mechanics. Although locality and 'realism', in the sense of counterfactual definiteness, cannot jointly be true, it is possible to retain one of them.
The majority of working physicists discard counterfactual definiteness in favor of locality, since non-locality is held to be contrary to relativity. The implications of this stance are rarely discussed outside of the microscopic domain but the thought experiment of Schrödinger's cat illustrates the difficulties presented. As quantum mechanics is applied to larger and larger objects even a one-ton bar, proposed to detect gravity waves, must be analysed quantum mechanically, while in cosmology a wavefunction for the whole universe is written to study the Big Bang. It is difficult to accept the quantum world as somehow not physically real, so "Quantum mechanics forces us to abandon naïve realism",[9] though it can also be argued that the counterfactual definiteness 'realism' of physics is a much more specific notion than general philosophical realism.[10]
" '[W]e have to give up the idea of realism to a far greater extent than most physicists believe today.' (Anton Zeilinger)... By realism, he means the idea that objects have specific features and properties — that a ball is red, that a book contains the works of Shakespeare, or that an electron has a particular spin... for objects governed by the laws of quantum mechanics, like photons and electrons, it may make no sense to think of them as having well defined characteristics. Instead, what we see may depend on how we look."[11]
"Virtual realism"[12] is closely related to the above theories.
In the research paper The reality of virtual reality it is proposed that, "virtuality is itself a bonafide mode of reality, and that 'virtual reality' must be understood as 'things, agents and events that exist in cyberspace'. These proposals resolve the incoherences found in the ordinary uses of these terms... 'virtual reality', though based on recent information technology, does not refer to mere technological equipment or purely mental entities, or to some fake environment as opposed to the real world, but that it is an ontological mode of existence which leads to an expansion of our ordinary world."[13]
"The emergence of teleoperation and virtual environments has greatly increased interest in "synthetic experience", a mode of experience made possible by both these newer technologies and earlier ones, such as telecommunication and sensory prosthetics... understanding synthetic experience must begin by recognizing the fallacy of naïve realism and with the recognition that the phenomenology of synthetic experience is continuous with that of ordinary experience."[14]
Jump up ^"We examine the prevalent use of the phrase “local realism” in the context of Bell’s Theorem and associated experiments, with a focus on the question: what exactly is the ‘realism’ in ‘local realism’ supposed to mean?". Norsen, T.Against 'Realism'
S. A. Grave, "Common Sense", in The Encyclopedia of Philosophy, ed. Paul Edwards (Collier Macmillan, 1967).
Peter J. King, One Hundred Philosophers (2004: New York, Barron's Educational Books), ISBN 0-7641-2791-8.
Selections from the Scottish Philosophy of Common Sense, ed. by G.A. Johnston (1915) online, essays by Thomas Reid, Adam Ferguson, James Beattie, and Dugald Stewart
Edward S. Reed. Encountering the World. Oxford University Press, 2003. ISBN 0-19-507301-0
Sophia Rosenfeld. Common Sense: A Political History (Harvard University Press; 2011) 346 pages; traces the paradoxical history of common sense as a political ideal since 1688
Shaw, R. E./Turvey, M. T./Mace, W. M. (1982): Ecological psychology. The consequence of a commitment to realism. In: W. Weimer & D. Palermo (Eds.), Cognition and the symbolic processes. Vol. 2, Hillsdale, NJ: Lawrence Erlbaum Associates, Inc., pp. 159–226.
Turvey, M. T., & Carello, C. (1986). "The ecological approach to perceiving-acting: a pictorial essay". Acta Psychologica 63 (1–3): 133–155. doi:10.1016/0001-6918(86)90060-0. PMID 3591430.
Nicholas Wolterstorff. Thomas Reid and the Story of Epistemology. Cambridge University Press, 2006. ISBN 0-521-53930-7
Nelson, Quee (2007). The Slightest Philosophy. Dog's Ear Publishing. ISBN 978-1-59858-378-6.