Education and the Internet – Part 1

October 19, 2014

Introduction

In an earlier post, Critical Literacy and Critical Pedagogy (2010), I observed that social critic Ivan Illich had anticipated at least one aspect of the Internet when he referred to ‘Learning Webs’ (the title of Chapter 6 of his radical 1971 book, Deschooling Society). As I wrote at the time: ‘Illich died in 2002, and so survived long enough to witness the development of the Internet, but a quick search fails to find any specific comments by him on it.’ Therefore, we can only speculate about any opinion he might have had. I will return to Illich in Part 3.

That the Internet is transforming society in general, and education in particular, is beyond dispute, but how it is transforming them is another matter. As an educator I have mixed feelings about the phenomenon. My own formal education, apart from recent postgraduate studies, was largely pre-Web, so I have been able to compare my early learning journey with the effects of this technology on my adult experience, as well as to observe its effects on those who have grown up (or are growing up) with it.

Beyond schooling in all its forms, there is also the question of the influence of the Internet on education in the broad social sense. Some philosophers and media theorists provide useful interpretive frameworks for such social developments, and I will draw on them later. Theory aside, however, it is clear that the reality of some Web innovations falls far short of the ideal promoted by enthusiasts, and I will elucidate this point from personal experience.

In this and subsequent posts, then, I am tracing the development of my own views on ‘Education and the Internet’, as these have emerged in the course of my education and employment. The journey will involve a lengthy detour through the details of my postgraduate research, since that research is relevant to the topic under consideration.

From school to university

My own primary and secondary schooling in the 1970s and 80s was unexceptional. I grew up on a working-class Corporation housing estate in Dublin, with separate ‘national’ (i.e. state) schools for boys and girls. Schooling was taken for granted in that community, but education for its own sake was not especially valued. Huge class sizes in primary school meant that teachers often focussed on ‘classroom management’, although I think that in general, some extreme cases aside, we were far more respectful of authority than children in that age group today. Fortunately I came from a good home, where learning was valued, although neither of my parents had been to university, and my father hadn’t even been to secondary school.

After primary school I followed in the footsteps of my older siblings, and continued my education with the Christian Brothers. Although by that time most of the teachers were secular, there was an ethos of learning rooted in the founding example of Edmund Ignatius Rice (1762–1844). Even so, the methods were traditional, and emphasized learning of facts for examination purposes.

Towards the end of my secondary schooling (1983–85) computers were making an appearance in classrooms, but purely for the purpose of teaching programming. There was no user-friendly operating environment like Microsoft Windows, and certainly no networking. It was also at this time that the first home computers appeared, and I became the owner of a Sinclair ZX81, with just 1 KB of memory! It had to be plugged into a TV and was useful for learning BASIC. The display was black-and-white and there were no peripherals such as printers. All of that would change over the following decade.

The Sinclair ZX81

My undergraduate years (1985–88) were entirely computer free, and it was during this time that I developed a method for the memorization of facts for exams. This, however, is only part of the story. Certainly I attended lectures and took notes in my chosen subject of Philosophy, and likewise completed essays and sat exams. To use Education parlance, I demonstrated the acquisition of ‘content’. At the same time, however, I entered into the ‘discourse’ of philosophy. This happened both formally, through a combination of text reading, small-group tutorial discussion, and essay writing; and also informally, through argument (in the philosophical sense) with classmates. It was a slow process, an apprenticeship of sorts, and it extended into the postgraduate years.

Seen against this background, memorization was the tip of the iceberg. What I memorized for exams were not simply facts, but arguments that I had already formed, usually through essay writing. In order to write those essays, I had to read a selection of texts (articles, chapters of books, and whole books), take notes summarizing the chains of reasoning, and finally argue for a position on the basis of the foregoing. The entire process was one of analysis and synthesis, comprising what I would later discover were known as ‘deep learning’ and ‘critical thinking’.

That I was studying Philosophy only made the process more explicit, since it is the business of philosophy to study reasoning, and argument is the modus operandi of the discipline. It was through this apprenticeship that I learned both the power of reasoning, as well as its limits, for even the best reasoning ultimately proceeds from a starting point that we assume. All arguments, if pushed far enough, will take us back to underlying assumptions, and uncovering such assumptions is itself a useful process. Another lesson that I imbibed was that any position could be subject to logical dissection, and my lecturers even encouraged such scrutiny of their own philosophical positions.

I have fond memories of that time. Several of my lecturers made a deep impression on me. One of them lectured on Plato’s Republic, demonstrating the relevance of the issues for our own time. This elderly Dominican priest, with a wealth of life experience, was not overly concerned with formal education. He would tell us that we could achieve a respectable, if mediocre, exam result if we accurately regurgitated the content of his lectures, but we could attain first-class honours if we told him something he didn’t know. At that time I didn’t appreciate that he was encouraging us to be more critical and creative in our learning. Nor did I understand the tediousness for someone in his position of marking dozens of identical exam papers. That experience was soon to come to me.

Postgraduate studies

In 1988 I graduated with a BA, having specialized in Philosophy, and I immediately enrolled as a postgraduate student. Initially I intended to write a minor dissertation on the topic of myth in the work of Eric Voegelin, having become familiar with the latter through one of my undergraduate lecturers. After some preliminary research, however, which included a study of Mircea Eliade and the neo-Kantian Ernst Cassirer, my interest in the ability of myth to ‘organize’ experience developed in the direction of ideology and language. I will go into this topic in some detail, since it played such a formative role in my intellectual development and it relates to the subject of this post.

BA graduation at University College Dublin, 1988

The history of ideology

Today the concept of ‘ideology’ has negative connotations. It is used to refer to the body of beliefs, doctrines, etc., that guide an individual, group or institution, and is often associated with political programs. It was, however, not always understood in this way, as I will demonstrate below.

In his Novum Organum (1620), Francis Bacon (1561–1626) referred to idola (‘idols’), the false notions that obstruct the mind’s accurate comprehension of reality. Bacon categorized the different types of idols, some being innate and others the result of socially determined distortion. Among the latter is the tendency to accept uncritically propositions that have become established with time. Language itself is a distorting medium through which we experience the world. In a move that was as significant for the development of modern science as for philosophy, Bacon proposed that the deductive logic of ancient and medieval thought be replaced by the method of induction.

Following Bacon, Thomas Hobbes (1588–1679), Claude Adrien Helvétius (1715–71) and Paul Henri Thiry d’Holbach (1723–89) developed the notion of the social determination of ideas, significantly linking it with power, including the power of religion. Helvétius recognized that domination is buttressed by the production and dissemination of certain kinds of prejudices: ‘experience shows … that almost all moral and political questions are decided by the powerful, not by the reasonable. If opinion rules the world, in the long run it is the powerful who rule opinion’ (De l’Homme). Significantly, it was also recognized that the powerful members of society do not need to impose their prejudices on the populace; rather, the latter adopt the prevailing opinion and, for some reason, prefer to live in ignorance of their true situation. For the Enlightenment thinkers, education represented the escape route from prejudice. They believed that behind the socially distorted understanding is a rational essence that can be liberated by the power of reason.

The term ‘ideology’ emerged in post-revolutionary France, where the imprisoned aristocrat Destutt de Tracy (1754–1836) conceived an empirical science of thinking, designed to overcome false ideas. This was later developed in his Éléments d’idéologie, which defined ideology positively as the antithesis of prejudice. Napoleon initially supported de Tracy and his colleagues in the Moral and Political Sciences division of the Institut National, but later turned against them, branding them pejoratively as ‘ideologists’, impractical intellectuals who did not understand the real workings of government. In the final volume of his Éléments, de Tracy was forced to admit that economic interests were more powerful determinants of social life than ideas. The full implications of this idea were later drawn out by Marx.

Following de Tracy, the concept of ideology was developed in two major streams of modern thought, French positivism and German idealism, although neither used the term itself. In the former, Auguste Comte (1798–1857) conceived of knowledge passing through progressive stages, culminating in empirical science. In the latter, Georg Wilhelm Friedrich Hegel (1770–1831) spoke in terms of history as the working out of Absolute Spirit (Geist), coming to know itself through its objectification in the world. Ludwig Feuerbach (1804–72) inverted Hegel’s process by describing the Absolute (God) as a projection of human qualities, with religion being a stage to be overcome.

There are several interrelated senses of ‘ideology’ in the writings of Karl Marx (1818–83). In his early writings, culminating in The German Ideology (1846), which he co-authored with Friedrich Engels (1820–95), he demonstrated his intellectual debt to Hegel and Feuerbach. Like Hegel, Marx saw history as a law-governed process; and like Feuerbach, he wanted to reclaim essential human qualities, which had been projected outside the human being and become ‘alien’ powers. According to Paul Ricoeur, this early work represented a progressive characterization of ‘the real’ and its opposite, ‘the unreal’. The former was identified with praxis, the creative activity whereby human beings produce the material conditions of their existence. This activity carries within itself the possibility that the products of labour, including social institutions, assume an existence independent of the conditions that give rise to them. This is ‘alienation’.

The ‘German Ideology’ that Marx and Engels criticized was, nevertheless, the philosophy of the Young Hegelians, including Feuerbach. They found in this ‘idealistic’ philosophy precisely the sort of distortion that occurs when ideas become separated from their basis in real life. This leads to the illusion that society can be changed by replacing ‘false’ ideas with ‘true’ ones, as the Enlightenment thinkers believed, rather than by altering the material conditions of life. As Marx wrote in his eleventh Thesis on Feuerbach: ‘The philosophers have only interpreted the world, in various ways; the point is to change it’.

In opposition to German idealism, Marx and Engels proposed a materialistic philosophy that they believed would re-establish the true relationship between life and thought:

Morality, religion, metaphysics, all the rest of ideology and their corresponding forms of consciousness, thus no longer retain the semblance of independence. They have no history, no development; but men developing their material production and their material intercourse alter, along with this their real existence, their thinking and the products of their thinking. Life is not determined by consciousness, but consciousness by life.

This systematic or ‘epistemological’ conception of ideology, however, in which the distorting nature of ideology is internal to knowledge itself, simply inverted the problem of idealism. After all, how could the so-called material conditions of life have any meaning for us, if they were not already imbued with ‘ideas’?

Later in The German Ideology Marx and Engels provided a more political conception of ideology:

The ideas of the ruling class are in every epoch the ruling ideas, i.e. the class which is the ruling material force of society, is at the same time its ruling intellectual force. The class which has the means of material production at its disposal, has control at the same time over the means of mental production, so that thereby, generally speaking, the ideas of those who lack the means of mental production are subject to it. The ruling ideas are nothing more than the ideal expression of the dominant material relationships, the dominant material relationships grasped as ideas …

In this sense, ideology serves the interests of a particular group, the ruling class. According to Marx and Engels’ description, the division of mental and material labour allows a section of the ruling class to become its professional thinkers, ‘its active, conceptive ideologists, who make the perfecting of the illusion of the class about itself their chief source of livelihood’. We might describe this group as the intelligentsia.

Marx’s later formulations of ‘ideology’ did not escape from the paradox inherent in his earlier writings. For instance, in A Contribution to the Critique of Political Economy (1859), he described ideology in terms of a ‘superstructure’ that depends on an ‘economic foundation’ (what he later called the ‘base’):

The mode of production of material life conditions the social, political and intellectual life process in general. It is not the consciousness of men that determines their being, but, on the contrary, their social being that determines their consciousness.

The first sentence reflects the political definition of ideology, while the second reflects the epistemological one. But is the ‘mode of production of material life’ not already ‘informed’ by ideas? And what happens to ‘the social, political and intellectual life process’ once the communist society has been achieved? What forms of intellectual life would exist then? And what did Marx think was the status of his own theory in this schema? Were his ideas exempt from the very causal process that they described? In short, the base–superstructure model presents the relationship between activity and ideas in almost mechanical (economic) terms, and this oversimplification cannot do justice to a theory of ideology. Consciousness is not a passive reflection of an independent world, and ideas have a more positive role to play in constituting our subjectivity.

The tension between the epistemological and political senses of ideology was bequeathed by Marx to the theorists who followed him, and resulted in various attempts to overcome it. Vladimir Lenin (1870–1924), for instance, declared that one simply had to choose between bourgeois and socialist ideology. For Georg Lukács (1885–1971), all thought is ideological, but that doesn’t make all thought (or ideology) equal. Antonio Gramsci (1891–1937) conceived of ideology as an element in the phenomenon he described as ‘hegemony’, that is, domination exercised through the institutions of civil society (family, school, media) as opposed to the economy and the state. Although coercion remains a possibility, through hegemony a dominant power secures its authority without recourse to it. In the words of Terry Eagleton, hegemony is ‘the “common sense” of a whole social order’. Although this would be true of any social formation, capitalism appears to represent a decisive shift in the ratio of consent to coercion; for the use of force, the naked manifestation of power, is only likely to reduce ideological credibility and destabilize the political status quo. As Machiavelli had recognized four centuries earlier, ‘deceit’ is more efficient than pure force. Hegemony represents the internalization of power – by its means, the individual lives under the illusion of self-government.

The epistemological circle inherent in Marx was not overcome by any of these theorists, since each one had to recognize his own historical situatedness, and this undermined any claim to objectivity in his theory. Perhaps the most sophisticated attempt was made by the Frankfurt School neo-Marxist Jürgen Habermas (b. 1929). In Knowledge and Human Interests (1968), Habermas distinguished three groups of sciences, each with its own distinctive ‘interest’. For the natural sciences this interest is one of technical control and manipulation. For the historical and interpretive sciences it is communication. Finally, the critical social sciences have emancipation as their objective.

Habermas credited Marx with having elaborated a theory of human nature and society in terms of practical interests. For him, Marx was actually engaged in forms of historical-interpretive science and critical social science. Under the influence of Enlightenment thinking, however, and in particular the celebration of the natural sciences, Marx conceived of his work in terms of natural science. For Habermas, then, Marx was himself subject to a form of ideological thinking – the ideology of Enlightenment attitudes towards science. Therefore, although Marx’s critique of capitalist society was still relevant, his categories had become redundant. Habermas wanted to reinstate the historical and interpretive sciences, in recognition of the fact that social activity is inherently meaningful, and cannot be reduced to causal explanation. Human beings do not simply interact, but exchange symbols in a constant process of communication, a process that can only be understood through interpretation.

In ideology, however, interpretation is not straightforward, since ideological communication is distorted, bearing as it does the marks of the power relations that pervade society. Therefore, Habermas insisted that the historical-interpretive sciences must give way to the critical social sciences, since only they have the ‘distanciation’ required by ideology critique. His ‘theory of communicative action’ represents his attempt to overcome ideology by positing an ideal of unimpeded communication towards which all utterances tend.

This notion brought Habermas into conflict with another tradition within European thought: the hermeneutical tradition in the Philosophy of Language. In order to understand this, it is necessary for me to retrace my steps.

The philosophy of language

As an undergraduate I had been attracted to the theory of the ‘language game’ developed by Ludwig Wittgenstein (1889–1951), which supported the notion that words only have meaning in a context. My postgraduate research introduced me to the more radical idea that humans are interpreting beings per se. Historically this development represented the movement from interpretation as a regional discipline (e.g. Biblical interpretation) to interpretation as fundamental to all human activity. This shift had been initiated by Edmund Husserl (1859–1938), particularly with his notion of the Lebenswelt or ‘life world’, was taken up by his student Martin Heidegger (1889–1976), and was carried on by the latter’s student Hans-Georg Gadamer (1900–2002), among others.

Thus philosophical ‘hermeneutics’ (from a Greek word meaning ‘interpretation’) was born. Gadamer emphasized the situatedness of human understanding within a ‘horizon’ of meaning. As we grow up we adopt the interpretative framework of our culture. Tradition has a determining influence over us. It was this consideration that led to the famous (and inconclusive) debate between Gadamer and Habermas, with the latter arguing that it is possible to transcend and criticise tradition, and the former responding that criticism may take place from within a tradition but can never entirely transcend it.

Paul Ricoeur (1913–2005) took the notion of ‘situatedness’ further. He proposed that the philosophical search for meaning could no longer assume direct access to the truth. Instead it was necessary to take a ‘hermeneutic detour’. In particular, Ricoeur claimed that Marx, Nietzsche and Freud (whom he termed ‘masters of suspicion’) had shown that human beings are unconsciously determined by forces that are greater than themselves, whether the force be relations of production, the will to power, or the libido. Any search for truth would have to negotiate such factors.

Ricoeur later developed an interest in ‘narrative identity’. According to this theory, our life experiences are not disjointed episodes but rather integrated by the individual into a coherent story or narrative. In this way, the meaning of our lives is constructed through a process of interpretation, even if such a narrative is never overtly expressed.

Regarding ideology, Ricoeur provided a useful framework with which we may be able to accommodate the heterogeneous features of the phenomenon, defining it in terms of three concepts, each being successively dependent on the one preceding it. These reveal ideology’s ‘integrating’, ‘legitimating’, and ‘distorting’ functions respectively. The first concept is the most neutral, as it describes the power of ideology to integrate a society through self-image, justification, etc. From this basis, the second concept describes ideology’s role in legitimating power. Finally, based on the ability of ideology to integrate and legitimate, the third concept describes its negative role as distorting.

Semiotics

As part of my research into language and ideology, I also undertook an investigation of the area known as ‘semiotics’ (from the Greek word for ‘sign’, semeion). I was particularly interested in the work of Umberto Eco (1932–), who belongs to a tradition reaching back through Charles Sanders Peirce (1839–1914) to Roger Bacon (c. 1214–c. 1293), Augustine (354–430), and the Stoic philosophers. According to this tradition, a sign is a relation of three entities (or perhaps more accurately a process involving three entities). Peirce referred to these entities as the ‘sign’, its ‘object’, and an ‘interpretant’.

This triadic relation distinguishes semiotics from the ‘semiology’ associated with Ferdinand de Saussure (1857–1913), which conceives of the sign in terms of a more ‘static’ dyadic relationship between a ‘signifier’ and a ‘signified’. In the semiotic schema, anything can become a sign of anything else, on the basis of the mediating function provided by an interpretant. A sign, therefore, is something that stands for something else, in some respect or capacity. Smoke can be a sign of fire, but it can also be a sign of human habitation. The implication is that signs require interpretation.

According to Eco, signs are interpreted according to a ‘code’, which is the sum of the cultural rules governing sign-functions. There are many interconnected subcodes, and any sign-function can be interpreted according to multiple subcodes, sometimes producing contradictory interpretations. Since the process of ‘semiosis’ is in principle unlimited, Eco invokes ‘context’ and ‘circumstance’ to explain how one interpretation becomes more plausible than another. For example, in the context of politics, ‘red’ denotes ‘communist’; and with the circumstantial marker ‘police’, it connotes ‘subversive’, etc. In the context of economics, however, ‘red’ denotes ‘debt’ (to be ‘in the red’); while with the circumstantial marker ‘employment’, it connotes ‘unemployment’, ‘eviction’, etc.
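
Eco’s machinery can be pictured in programmatic terms. The toy sketch below (my own illustration in Python, with invented names and data; it is not Eco’s formal apparatus) shows how a context, refined by a circumstantial marker, selects the subcode against which a sign such as ‘red’ is read:

# A toy model of Eco-style interpretation: subcodes are selected by
# context and circumstance. All names and data here are illustrative.
SUBCODES = {
    ("politics", None): {"red": "communist"},
    ("politics", "police"): {"red": "subversive"},
    ("economics", None): {"red": "debt"},
    ("economics", "employment"): {"red": "unemployment, eviction"},
}

def interpret(sign, context, circumstance=None):
    # A circumstantial marker, if present, overrides the bare contextual reading.
    specific = SUBCODES.get((context, circumstance), {})
    general = SUBCODES.get((context, None), {})
    return specific.get(sign) or general.get(sign)

print(interpret("red", "politics"))                 # communist
print(interpret("red", "politics", "police"))       # subversive
print(interpret("red", "economics", "employment"))  # unemployment, eviction

The point of the sketch is simply that interpretation is rule-governed yet context-sensitive: change the context or the circumstance, and the same sign yields a different reading.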

Through association, particular contexts and circumstances become part of the compositional makeup of signs. A sign can accordingly be defined as a ‘set of instructions’ for its possible employment and interpretation (note the similarity of this to Wittgenstein’s notion that the meaning of a word is its use). These instructions will vary from individual to individual, age to age, and culture to culture. They also allow for the creative attribution of meaning (e.g. metaphor), an aesthetic process with the potential to enrich the code.

In semiotic terms, ideological communication represents the attempt to constrain meaning to a single interpretation, i.e. the desired interpretation that a group wants to promote. This runs counter to the unlimited nature of semiosis. Eco calls it ‘code-switching’: the privileging of one subcode while concealing others. He points out, however, that interpretation occurs at the destination of a message rather than at the source. The sender of a message does not have complete control over its interpretation by an addressee.

Semiotics, therefore, provides both a framework for the understanding of ideology and the possibility of ideology critique, together with a pragmatic method of undermining it.

In conclusion, we see that the history of ideology and the history of the philosophy of language intersect at the point of tension between our historical (and linguistic) situatedness, on the one hand, and, on the other, our attempt to overcome the distortions that arise from it.

First experience of teaching and the move to Cambridge

At the time that I commenced postgraduate studies I also became a tutor to philosophy undergraduates. This was my first experience of being at the front of a classroom, and I grew to like it, even if it didn’t come naturally to an introvert. One of the challenges of the job was the marking of a hundred or more essays on the same topic. This was my practical introduction to the ‘normal distribution’, although I was not then familiar with the term. It was clear that there was a range of ability in any group. I wouldn’t tutor university students again for another seventeen years, and there would be a new challenge by that time.

Before that, however, I moved to Cambridge. With the help of a letter from the Professor of Philosophy at UCD, I gained a reader’s ticket for the University Library and conducted some private research there. I got a job at Heffers, the main academic bookshop in Cambridge and an institution in the town (since bought out by its Oxford rival, Blackwell). I worked there for four years, dealing with students, academics and the general public, and I encountered many interesting people, including some celebrities, and my first wife. I met the writer John Cornwell, and became his assistant in the Science and Human Dimension Project, a ‘public understanding of science’ body based at Jesus College. Part of this role involved conference organization. Eventually I moved from bookselling to publishing, first as an editor and project manager, and then with a small company that specialized in digital encoding (XML).

Interior of Heffers main store, Trinity Street, Cambridge

At this time I was also a member of the Scientific and Medical Network (SMN), an association of people with a broad interest in the sciences. Its members ranged from well-known academics to university graduates and other intellectuals who maintained an interest in what might be described as ‘progressive’ ideas. As I later discovered, that spectrum of ideas extended, at its far end, into what is sometimes referred to as the ‘lunatic fringe’. I attended meetings of the local SMN group, and wrote a couple of book reviews for Network, the SMN journal.

In 2003 I moved to Australia and became the primary carer for my two children, my wife having secured a senior academic position. When, in 2006, my youngest child started at a Montessori school, I grasped the opportunity to tutor undergraduates taking compulsory units in Philosophy and Ethics at a local university. Admittedly these students were not taking a degree in Philosophy, and there was the same range of ability that anyone might expect, but one factor came as a complete surprise to me.

In the intervening decade and a half, society had witnessed the advent of the Internet. Initially this had seemed like a useful tool for communication and the exchange of information. Email, for example, was fast and convenient. Then came websites conveying basic data. Subsequent developments saw the Web becoming more interactive (so-called Web 2.0): YouTube, Wikipedia, social networking, and so on. Illich’s notion of a ‘learning web’ was not simply realised – his expectations were exceeded, at least potentially.

My first intimation of the effect of this social change on education came from reading student essays in 2006. I found myself reading text that could not possibly have been written by the students I knew in the classroom. It was simply not their authorial voice. It was not necessarily the case that I was being given an essay that had been found online, although I became aware of this possibility. Rather, I was seeing whole swathes of text that had been copied and pasted. At their worst, some essays were a patchwork of chunks of text cobbled together into the semblance of a coherent essay, with the occasional substitution of terms with synonyms. The problem was that they were usually not coherent, with the added disadvantage that such submissions took me more time to mark, because I had to trawl the Internet to find the source of the various pieces in order to demonstrate their origin. Later I learned that websites had been developed for automatic checking of this kind, at the essay submission stage, precisely in order to combat fraud and reduce the time wasted.
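
Such checking services are proprietary, but the core idea is easy to convey. The sketch below (my own simplification in Python, not any vendor’s actual algorithm) compares the word n-grams, or ‘shingles’, of a submission against a candidate source; a high overlap score flags a passage for human review:

# Copy detection via shared word n-grams ("shingles"). Real services
# index vast document collections; this toy compares just two texts.
def shingles(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission, source, n=5):
    # Fraction of the submission's n-grams that also occur in the source.
    sub = shingles(submission, n)
    return len(sub & shingles(source, n)) / len(sub) if sub else 0.0

essay = ("the philosophers have only interpreted the world in various "
         "ways the point however is to change it")
source = ("the philosophers have only interpreted the world in various "
          "ways the point is to change it")
print(round(overlap(essay, source), 2))  # higher scores indicate heavier copying

Even an automated match, of course, is evidence of copying rather than proof of it; the judgement of origin remains a human one.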

Naturally these essays did not score highly, and in the worst cases they were an instant ‘Fail’. More surprising to me was that the offending students seemed to think that their inclusion of a web address in the references prevented them from falling foul of the university’s plagiarism policy. Furthermore, since these were compulsory units, many students were only concerned with obtaining the ‘Pass’ grade needed for graduation. If they had postponed the unit(s) until their final semester, sometimes in a four-year course, a ‘Fail’ meant that they had to repeat the unit and postpone their graduation.

These were not minor considerations for the students concerned, since they had both financial and social consequences. What bothered me, however, was that such practices were inimical to the sort of deep learning that I had acquired as an undergraduate. The offending students were simply not acquiring the hard-earned skills of analysis and synthesis that had been necessary to good research and writing in the pre-Web era.

Retraining

After a year of tutoring I got a job in the Extension department at the University of Western Australia, the part of the institution with a particular vocation for community outreach, located in the beautiful building and grounds of the former Claremont Teacher Training College. My areas of responsibility were Intellectual Adventures (which included the philosophical courses), Languages, and Writing and Communication. My time in this role was very rewarding: for me it represented education in a very positive sense – people studying topics of interest to them, motivated only by the love of learning, without testing or the awarding of qualifications (at most, there were certificates of participation). Courses were offered both by academics and by external providers. Many of our ‘clients’ were of mature age, with enough disposable income and spare time to participate in such activities.

UWA Extension in Claremont

After a year and a half I was offered a position in the University Vice-Chancellery, and I experienced a very different side to the university, one more concerned with governance. In my role as a senior research officer, I became aware of various trends in third-level education, including institutional reorganization and the development of novel architectural spaces. It was during my twelve months in this position that I heard that the Faculty of Education was going to offer a Master of Primary Teaching degree. I enrolled in this course and spent the next two years studying the various ‘content’ areas of the curriculum, as well as principles of pedagogy, ‘special’ education, and ‘classroom management’. I graduated in 2011.

A large proportion of the course was devoted to practical and administrative aspects of teaching, as well as the content of the curriculum, but I was particularly attracted to the philosophical aspects of pedagogy, in particular the ‘social constructivist’ ideas of Lev Vygotsky (1896–1934). These I could relate to my background in hermeneutics. For example, according to social constructivism, education is a process of induction into the norms of a society, whereby meaning is actively constructed by the child. New information is understood (or interpreted) by being assimilated to existing knowledge frameworks. It is easy to see how such ideas are congruent with Gadamer’s ‘horizon of meaning’ and Ricoeur’s ‘narrative identity’.

Another unit that I particularly enjoyed was ‘Teaching and Learning with New Technologies’, which increased my understanding of the potential role of blogs, wikis, and other new technologies in education. The present blog began as a requirement for that unit. The lecturer created a useful wiki (E-language) containing resources for those with an interest in this topic. Of particular relevance here is the Myths of e-learning page.

What was not included in this course was the sort of philosophical questioning that had been a natural part of my earlier degree. It was in Dublin that I had first read Ivan Illich’s Deschooling Society; this book and others like it were unfamiliar to the majority of my classmates, most of whom were about two decades younger than me, with degrees in a range of disciplines that did not include philosophy. It was at this time that I discovered other critics of schooling, such as John Taylor Gatto, although these ‘alternative’ authors were not on my official reading list.

I will return to a consideration of these writers in Part 3, but first I need to go into some detail about another aspect of the Web that I personally encountered during this period.

To be continued in Part 2

Ivan Illich

May 6, 2014

The Myth of Measurement of Values

The institutionalized values school instils are quantified ones. School initiates young people into a world where everything can be measured, including their imaginations, and, indeed, man himself.

But personal growth is not a measurable entity. It is growth in disciplined dissidence, which cannot be measured against any rod, or any curriculum, nor compared to someone else’s achievement. In such learning one can emulate others only in imaginative endeavour, and follow in their footsteps rather than mimic their gait. The learning I prize is immeasurable re-creation.

School pretends to break learning up into subject “matters,” to build into the pupil a curriculum made of these prefabricated blocks, and to gauge the result on an international scale. People who submit to the standard of others for the measure of their own personal growth soon apply the same ruler to themselves. They no longer have to be put in their place, but put themselves into their assigned slots, squeeze themselves into the niche which they have been taught to seek, and, in the very process, put their fellows into their places, too, until everybody and everything fits.

People who have been schooled down to size let unmeasured experience slip out of their hands. To them, what cannot be measured becomes secondary, threatening. They do not have to be robbed of their creativity. Under instruction, they have unlearned to “do” their thing or “be” themselves, and value only what has been made or could be made.

Once people have the idea schooled into them that values can be produced and measured, they tend to accept all kinds of rankings. There is a scale for the development of nations, another for the intelligence of babies, and even progress toward peace can be calculated according to body count. In a schooled world the road to happiness is paved with a consumer’s index.

Ivan Illich, Deschooling Society

The Blue School

April 9, 2012

The Blue Man Group has taken its work on creativity to another level by opening The Blue School. The following video contains interviews with the school founders and Sir Ken Robinson, among others.

Opting Out and Staying at Home

January 29, 2012

There was a very interesting discussion of the various types of homeschooling on Radio National’s Background Briefing this morning: Opting Out and Staying at Home. The subject goes to the heart of the issue of individual freedom versus state control, and there are some very tricky questions. For example, should parents be allowed to homeschool their children because they have religious objections to parts of the curriculum? Some of the comments on the program web page are also quite enlightening, such as the following one from Gordon:

As teachers in the public school system my wife and myself toyed with the idea of home schooling for our single child. Instead we opted to send her to a Catholic school although we are both atheists.

We figured it was more important for our daughter to have a social background but still within a disciplined environment hence the Catholic school which was very successful.

Our disgust with the public school system, from our working experiences up and down the east coast, prompted us to move outside.

The system is a mess, no discipline, no proper curricula, leadership granted according to old union loyalties rather than talent, and a heavy influence from politically correct elements amongst the teaching staff.

I’m not surprised to see the rise of home schooling. Good luck to them.

Peter Benson and Sparks

October 12, 2011

This TEDx talk by the recently deceased Peter L. Benson, former president and CEO of the Minneapolis-based Search Institute, is very reminiscent of Ken Robinson’s notion of The Element. Like Robinson, Benson indicates the significance to our learning of finding out what truly inspires us. He similarly emphasises the role of creativity in learning. In a final point of comparison, he relates anecdotes (such as the one about Amy Irving) that give an unexpected insight into the background of some well-known person. Here’s a quote from the final minutes of the talk:

I would make knowing kids’ sparks the very centre of school life. In fact, I’d put it right at the front. I don’t know how you can engage and connect and bond kids to the institution called school without knowing their spark. I would teach families the process of the spark dialogue, and how to name, affirm and champion it. I would make the first parent-teacher conference of the year be about the spark of a kid: let’s talk that through, and we’ll get to the rest of the stuff.

The Scholar and the Philosopher

May 1, 2011

A scholar went one day to see a practical philosopher, to determine the origins of his system. As soon as the question was asked, the master handed the academic a delicious peach. When it had been eaten, the master asked whether he would like another. The scholar ate the second peach. Then the philosopher said: ‘Are you interested in where this peach was grown?’ ‘No,’ said the scholar. ‘That is your answer about my system,’ said the master.

Counting What Can’t Be Counted

March 9, 2011

Thanks to Zoe Weil over at Cooperative Catalyst for posting this.

Arnold Greenberg, founder of Miquon Upper School in Philadelphia, Deep Run School of Homesteading, and Liberty School – A Democratic Learning Community, lives in an off-the-grid cabin in East Blue Hill, Maine. He wrote this essay, “Towards a Different Standard: Counting What Can’t Be Counted,” which I wanted to share with readers of Cooperative Catalyst. Enjoy.

Here we go again with yet another set of academic standards under the title Race to the Top—an attempt to replace the great aspirations of No Child Left Behind. Now, we have brand new recommendations for what all students should master in English and Math as they move from elementary through high school and graduate ready, it is hoped, to succeed in college and flourish in their futures.

English and math experts consulted last year by the National Governors Association and the Council of Chief State School Officers went to a great deal of trouble producing the new standards. The English section, for instance, is six hundred pages long and attempts to define what all students are expected to know and be able to do. The Obama administration is taking a “tough love” approach, firing principals and teachers in schools that do not meet the standards and also encouraging states to compete for a piece of the four-billion-dollar federal pie if they adopt the new standards. The goal is to end up with national rather than widely different state standards, and ultimately to enable our young people to compete with other countries, most of which have national standards and outscore the U.S. on international tests.

Unfortunately, there is little substantial difference between Race to the Top and NCLB. It’s more of the same dressed up with a fresh coat of paint and reminds me of Einstein’s famous definition of insanity: doing the same thing repeatedly and expecting different results. Einstein also said, “Not everything that can be counted counts, and not everything that counts can be counted.” The purpose of this essay is to explore what “counts” in education but can’t be counted, as well as possible ways to measure those aspects of becoming educated that I believe are more significant than what we now measure—especially as we experience the world of the 21st Century.

Our current approach to education hasn’t changed in over two hundred years. It was designed to meet the needs of the Industrial Age and was based largely on techniques developed in Prussia when its work and military forces required a compliant citizenry. Known as “psycho-physics,” the Prussian model involved breaking knowledge into segments, each interrupted by a horn or bell before the students moved on to another subject, thereby making them dependent on the teacher. It was an effective way to stamp out factory workers and to sort young people into different levels of employment—executives, managers, and common laborers—but now it is woefully obsolete.

While the emphasis in our schools has been on preparing young people to be productive members of society, there is evidence that many people learned the necessary skills without going to school. The list of self-educated people who went on to be successful is extensive—Lincoln and Edison to name only two. What qualities and characteristics enabled them not only to learn the essential skills, but also to be creative, determined people who lived significant, productive lives?

My concern here is the emphasis our schools place on measuring what is easily measured at the expense of developing those qualities that many self-educated people learn outside of school. And since measuring everything that schools do seems to be so important, is there a way to measure the qualities that I will call a “different standard?” Can we learn to count what can’t be counted?

Before looking more closely at those questions, it is important to have a deeper awareness of the unique qualities of each child because they are ignored and smothered by our approach to learning. We are missing a major component in understanding individuality and why our schools are thwarting the true potential of so many young people unless we consider the following statement by Ralph Waldo Emerson: “The secret of education lies in respecting the pupil. It is not for you to choose what he shall know. It is chosen and foreordained and he only holds the key to its secret.” Unfortunately, the utilitarian nature of our schools ignores that “secret” aspect of individuality and instead the goal is homogenization.

Another statement of Emerson’s that resonates with me is, “The purpose of education is to teach how to live, not how to make a living.” Clearly, this is the antithesis of our current approach to education, with its overarching emphasis on what all students should know in order to be prepared for college or the workplace.

To achieve schools able to meet the utilitarian goals of society, a systematic approach was created by a team of university presidents who, beginning in 1892, devised the Carnegie Unit—a system of breaking down knowledge into lessons that, if dispensed for a certain number of minutes each day, five days a week, could, by the end of the year, produce the desired results. All subjects could be presented in this way, and after twelve years students would be ready to graduate. On paper this “scientific” approach was neat, clean, and measurable. However, it ignored many variables.

Two of the variables are the teacher and the individuality of the students, both of which are impossible to control. Lip service is given to respecting individuality but in reality, the student is also a “unit” whose uniqueness does not count. Some students are successful under this practice and learn what is expected, possibly at the expense of their talent, intelligence, and creativity. Others refuse to learn and either become discipline problems or passively go through the motions of learning enough to get by. Others learn by pursuing their interests and passions outside of school. Today, according to the Gates Foundation, an estimated 3500 students drop out every day—a figure that does not include those who drop out mentally but are still enrolled. The fact is, only a small percentage of students graduate from high school prepared to do college work, and fewer than half of students who go to college complete their education—some for financial reasons but most because they are not prepared.

It is important to see our approach to educating our children in the context of our times. As anyone who has read Tom Friedman’s The World is Flat or seen Al Gore’s “An Inconvenient Truth” knows, things are radically different today than they were even ten years ago. Our children and the “yet to be born” are inheriting a world and way of living that is becoming unrecognizable. The awesome power and potential of the Internet is transforming how we communicate and collaborate, while at the same time we are on a collision course with destructive environmental issues, the results of which are impossible to calculate. If our schools are expected to prepare young people for the world of the twenty-first century, how do they meet that challenge?

In order to prepare our young people for the coming decades, we must consider the research on how the brain works. Children are naturally powerful learners and acquire a great deal of knowledge and skills through playing, observing, asking questions, and experiencing the world around them. They learn by doing and solving problems, figuring out what works and what doesn’t, and pursuing what is relevant to them in the moment. It’s amazing to watch children learning so spontaneously and proficiently while mostly having fun.

Our schools, however, take an approach opposite to the way children learn prior to going to school. Suddenly learning becomes equated with following instructions, and too often the natural joy of learning is replaced by a prescribed curriculum whereby the teacher dispenses information to be reproduced on a test. This approach isn’t questioned by parents because that’s the way they were taught too. Only now, barraged by the media, the Internet, and increasing numbers of adult-structured extracurricular activities, young people today have very little time to call their own.

It’s interesting that the word school derives from the Greek scholē, meaning “leisure”—the leisure for discourse, pursuing interests, and play. Everyone acknowledges that our schools are not working, yet they are resistant to change. Bailing out our banks and Wall Street without really changing how they do business and expecting different results is a form of Einstein’s insanity. Pouring more money into our schools and coming up with a new revision of standards is another. It hasn’t worked in the past and it will not work in the future.

Why are our schools so stuck? The reasons are many, but a major one, according to Seymour Sarason in The Culture of Schools and the Problem of Change, is the hierarchical structure whereby curriculum mandates and policies are created by corporations, universities, and government and passed down to Departments of Education, then to superintendents and principals, and finally to teachers, who have little or no autonomy. No Child Left Behind was the most recent example. It has stifled creative change, destroyed morale, and proven to be largely ineffective, and there is no reason to believe Race to the Top will be any different, with its added threat of principals and teachers losing their jobs if their students do not meet the new standards.

So what is the alternative? I believe there needs to be a paradigm shift in education before we can create schools based on how children actually learn and that address 21st-century realities. The shift I am proposing centers on a problem-based curriculum in which the goal is to develop the ability to articulate important questions about issues of concern and to learn how to find solutions. “Let the questions be the curriculum,” Socrates once advocated. He “taught” by asking questions to which he did not know the answers, and he said he owed his wisdom to his willingness to let his questions guide him. Here I think it is illuminating to note the relationship between the words “quest” and “question.” For Socrates, it is the quest for knowledge that is important. A good question is a quest and can be the beginning of important journeys into the unknown.

A problem-based approach to learning is as natural as breathing. It could dramatically change how schools are structured and how teachers teach, and ultimately enable students to develop the abilities that really “count.” Problem-based learning is built on the assumption that the most effective learning takes place when students are using their knowledge to solve real life problems that concern them. It encourages them to work either individually or collaboratively on problems that are relevant to their lives in order to create and propose solutions as opposed to the traditional approach of reproducing information. Through analysis, strategizing, and the gathering of data and information, student learning is deepened because it is being used to solve real problems. Imagine students exploring the causes for global warming and proposing solutions or analyzing our current food distribution system that has a billion people hungry and suggesting how these problems can be remedied.

In a problem-based curriculum, the three Rs are replaced by the four Cs: critical thinking, creativity, collaboration, and communication. The emphasis is on how, not what, to learn, and the structure of the school day is no longer divided into units of time and separate subject matter disciplines. The classroom is no longer rows of desks with the teacher at the front “teaching.” And the children are no longer passive recipients of information, but are active problem solvers. They are learning how to look at the root causes of a problem, gather data through research, and collaborate on a possible solution. When they are finished, they present the results of their quest to their learning community, prepared to defend their solutions as part of a critical dialogue. Getting feedback and evaluating themselves is an important part of the learning process.

The role of the teacher changes from dispenser of information to model, guide, facilitator, and more experienced learner. I like to think of the teacher as a consenting partner in the learning process and of the relationship between teacher and student as a loving, collegial friendship, as opposed to the authoritarian style that is now the norm.

What are the different standards that can be achieved with a problem-based curriculum? Here are a few that I believe are most valuable: the ability to determine and articulate a significant question, to collaborate and communicate clearly orally and in writing, to become an independent, self-directed learner able to sustain motivation, to use time wisely, and to be a joyful, spirited citizen of his or her community and the world. I am convinced that the students who learn in a problem-based curriculum will do as well or better on the new Race to the Top standardized tests of academic performance without “teaching to the test.”

All of this brings us back to the question, is it possible to “count” what can’t be counted? Schools currently depend on multiple choice tests to measure performance, but I believe a different method is necessary, one that is based on observation and students’ self-perceptions. This approach to “measuring” would attempt to evaluate growth in certain areas over a period of time. Comparing a student’s self-evaluation with the observations of the teacher would be one way to measure what formerly was not measured.

Significant progress has been made in attempting to measure the qualities that are developed in a problem-based curriculum by Mark Van Ryzin, a doctoral candidate in Educational Psychology at the University of Minnesota. In what he calls the “Hope Study,” he surveyed students on issues such as their relationships with peers and teachers, their perception of the impact of the learning environment on them, and how they feel about their progress and their futures. He placed their responses in the categories of autonomy, belongingness, and hope, and he discovered it is possible over a period of time to see how a student’s self-perceptions have evolved. By focusing on students’ self-perceptions, perhaps we will be able to determine how successful a problem-based approach is in improving students’ performance as well as their attitude toward their futures—that is, are they happier and more hopeful?

In Mary E. Clark’s seminal book, In Search of Human Nature, a vast study of various cultures, she determines that there are three “propensities” essential to human happiness—autonomy, bonding, and meaning. This is similar to what the “Hope Study” attempts to measure. Autonomy is a sense of self, a feeling that one’s individuality is respected and that, in Emerson’s words, one’s “foreordained” uniqueness is allowed to flourish. Bonding is the sense of belonging to a family and community. Meaning refers to having a sense of purpose, a feeling that one’s life is of value to one’s community.

Comparing the growth in these areas as students transition from a traditional to a problem-based approach with the results of standardized tests of academic achievement would provide significant information that could encourage more schools to adopt a problem-based approach and radically change how schools look and operate. It is likely that students from problem-based schools will do as well or better on the “Race to the Top” tests; however, we would also be measuring what counts but formerly was not counted.

A paradigm shift in how we structure our schools, and in how we engage young people intellectually, emotionally, and imaginatively in ways that develop their ability to be collaborators and creative problem solvers, can achieve different standards that truly make a difference. The shift to a different standard will develop those all-important qualities that previously could not be counted, skills and attitudes that will go a long way toward creating a better world.

Schooling the World

December 16, 2010

The film Schooling the World puts forward a provocative thesis. Is it necessarily the case that schooling (in the modern Western sense) improves life for the majority? The film reminds me of John Taylor Gatto’s point about compulsory schooling being damaging for families and communities. Like Gatto, it quotes Ellwood P. Cubberley, Dean of Stanford’s influential School of Education in the early twentieth century:

Our schools are, in a sense, factories, in which the raw products (children) are to be shaped and fashioned into products to meet the various demands of life. The specifications for manufacturing come from the demands of twentieth-century civilization, and it is the business of the school to build its pupils according to the specifications laid down.

But the film has more of an anthropological angle, in which education is understood as enculturation – ‘the process by which a person learns the requirements of the culture by which he or she is surrounded, and acquires values and behaviours that are appropriate or necessary in that culture’ (Wikipedia). Culture itself is conceived in terms of an ‘ecosystem’, in which people sustain ways of life within their physical environment. This relatively harmonious balance is disturbed when any one element is changed. When we introduce our own system of education (schooling) into such a culture, we change it irrevocably. The film undermines our notion that other cultures are ‘developing’, a notion that entails an assumption of superiority, with our own culture always more advanced along the developmental path, or perhaps even at the summit of attainment.

There are also echoes of Ivan Illich. Readers may want to visit the film’s blog, and the FAQ page is also worth a read.

Changing Education Paradigms

October 19, 2010

Thanks to Steve Miranda for posting this.

Lisa Fitzhugh on why Maslow matters

September 15, 2010

Re-posting this interesting item from Steve Miranda on his Re-educate blog.

