In an earlier post, Critical Literacy and Critical Pedagogy (2010), I observed that social critic Ivan Illich had anticipated at least one aspect of the Internet, when he referred to ‘Learning Webs’ (the title of Chapter 6 of his radical 1971 book, Deschooling Society). As I wrote at the time: ‘Illich died in 2002, and so survived long enough to witness the development of the Internet, but a quick search fails to find any specific comments by him on it.’ Therefore, we can only speculate about any opinion he might have had. I will return to Illich in Part 3.
That the Internet is transforming society in general, and education in particular, is beyond dispute, but how it is transforming it is another matter. As an educator I have mixed feelings about the phenomenon. My own formal education, apart from recent postgraduate studies, was largely pre-Web, so I have been able to compare my early learning journey with the effects of this technology on my adult experience, as well as to observe its effects on those who have grown up (or are growing up) with it.
Beyond schooling in all its forms, there is also the question of the influence of the Internet on education in the broad social sense. Some philosophers and media theorists provide useful interpretive frameworks for such social developments, and I will draw on them later. Theory aside, however, it is clear that the reality of some Web innovations falls far short of the ideal promoted by enthusiasts, and I will elucidate this point from personal experience.
In this and subsequent posts, then, I am tracing the development of my own views on ‘Education and the Internet’, as they have emerged in the course of my own education and employment.
From school to university
My own primary and secondary schooling in the 1970s and 80s was unexceptional. I grew up on a working-class Corporation housing estate in Dublin, with separate ‘national’ (i.e. state) schools for boys and girls. Schooling was taken for granted in that community, but education for its own sake was not especially valued. Huge class sizes in primary school meant that teachers often focussed on ‘classroom management’, although I think that in general, some extreme cases aside, we were far more respectful of authority than children in that age group today. Fortunately I came from a good home, where learning was valued, although neither of my parents had been to university, and my father hadn’t even been to secondary school.
After primary school I followed in the footsteps of my older siblings, and continued my education with the Christian Brothers. Although by that time most of the teachers were secular, there was an ethos of learning rooted in the founding example of Edmund Ignatius Rice (1762–1844). Even so, the methods were traditional, and emphasized learning of facts for examination purposes.
Towards the end of my secondary schooling (1983–85) computers were making an appearance in classrooms, but purely for the purpose of teaching programming. There was no user-friendly operating environment like Microsoft Windows, and certainly no networking. It was also at this time that the first home computers appeared, and I became the owner of a Sinclair ZX81, with 1K storage capacity! It had to be plugged into a TV and was useful for learning BASIC. The display was black-and-white and there were no peripherals such as printers. All of that would change over the following decade.
My undergraduate years (1985–88) were entirely computer free, and it was during this time that I developed a method for the memorization of facts for exams. This, however, is only part of the story. Certainly I attended lectures and took notes in my chosen subject of Philosophy, and likewise completed essays and sat exams. To use Education parlance, I demonstrated the acquisition of ‘content’. At the same time, however, I entered into the ‘discourse’ of philosophy. This happened both formally, through a combination of text reading, small-group tutorial discussion, and essay writing; and also informally, through argument (in the philosophical sense) with classmates. It was a slow process, an apprenticeship of sorts, and it extended into the postgraduate years.
Seen against this background, memorization was the tip of the iceberg. What I memorized for exams were not simply facts, but arguments that I had already formed, usually through essay writing. In order to write those essays, I had to read a selection of texts (articles, chapters of books, and whole books), take notes summarizing the chains of reasoning, and finally argue for a position on the basis of the foregoing. The entire process was one of analysis and synthesis, comprising what I would later discover were known as ‘deep learning’ and ‘critical thinking’.
That I was studying Philosophy only made the process more explicit, since it is the business of philosophy to study reasoning, and argument is the modus operandi of the discipline. It was through this apprenticeship that I learned both the power of reasoning, as well as its limits, for even the best reasoning ultimately proceeds from a starting point that we assume. All arguments, if pushed far enough, will take us back to underlying assumptions, and uncovering such assumptions is itself a useful process. Another lesson that I imbibed was that any position could be subject to logical dissection, and my lecturers even encouraged such scrutiny of their own philosophical positions.
I have fond memories of that time. Several of my lecturers made a deep impression on me. One of them lectured on Plato’s Republic, demonstrating the relevance of the issues for our own time. This elderly Dominican priest, with a wealth of life experience, was not overly concerned with formal education. He would tell us that we could achieve a respectable, if mediocre, exam result if we accurately regurgitated the content of his lectures, but we could attain first-class honours if we told him something he didn’t know. At that time I didn’t appreciate that he was encouraging us to be more critical and creative in our learning. Nor did I understand the tediousness for someone in his position of marking dozens of identical exam papers. That experience was soon to come to me.
In 1988 I graduated with a BA, having specialized in Philosophy, and I immediately enrolled as a postgraduate student. Initially I intended to write a minor dissertation on the topic of Myth in the work of Eric Voegelin, having become familiar with the latter through one of my undergraduate lecturers. After some preliminary research, however, which included a study of Mircea Eliade and the neo-Kantian Ernst Cassirer, my interest in the ability of myth to ‘organize’ experience developed in the direction of ideology and language. I will go into this topic in some detail, since it played such a formative role in my intellectual development and it relates to the subject of this post.
Today the concept of ‘ideology’ has negative connotations. It is used to refer to the body of beliefs, doctrines, etc., that guide an individual, group or institution, and is often associated with political programs. It was, however, not always understood in this way, as I will demonstrate below.
In his Novum Organum (1620), Francis Bacon (1561–1626) referred to idola (‘idols’), the false notions that obstruct the mind’s accurate comprehension of reality. Bacon categorized the different types of idols, with some being innate and others the result of socially determined distortion. Among the latter is the tendency to accept uncritically propositions that have become established with time. Language itself is a distorting medium through which we experience the world. In a move that was as significant for the development of modern science as for philosophy, Bacon proposed that the deductive logic of ancient and medieval thought be replaced by the method of induction.
Following Bacon, Thomas Hobbes (1588–1679), Claude Adrien Helvetius (1715–71) and Paul Henri Thiry d’Holbach (1723–89), developed the notion of the social determination of ideas, significantly linking it with power, including the power of religion. Helvetius recognized that domination is buttressed by the production and dissemination of certain kinds of prejudices: ‘experience shows … that almost all moral and political questions are decided by the powerful, not by the reasonable. If opinion rules the world, in the long run it is the powerful who rule opinion’ (De l’Homme). Significantly, it was also recognized that the powerful members of society do not need to impose their prejudices on the populace; rather, the latter adopt the prevailing opinion and, for some reason, prefer to live in ignorance of their true situation. For the Enlightenment thinkers, education represented the escape route from prejudice. They believed that behind the socially distorted understanding is a rational essence that can be liberated by the power of reason.
The term ‘ideology’ emerged in post-revolutionary France, where imprisoned aristocrat Destutt de Tracy (1754–1836) conceived an empirical science of thinking, designed to overcome false ideas. This was later developed in his Eléments d’idéologie, which defined ideology positively as the antithesis of prejudice. Napoleon initially supported de Tracy and his colleagues in the Moral and Political Sciences division of the Institut National, but later turned against them, branding them pejoratively as ‘ideologists’, impractical intellectuals who did not understand the real workings of government. In the final volume of his Eléments, de Tracy was forced to admit that economic interests were more powerful determinants of social life. The full implications of this idea were later drawn out by Marx.
Following de Tracy, the concept of ideology was developed in two major streams of modern thought, French positivism and German idealism, although neither used the term itself. In the former, Auguste Comte (1798–1857) conceived of knowledge passing through progressive stages, culminating in empirical science. In the latter, Georg Wilhelm Friedrich Hegel (1770–1831) spoke in terms of history as the working out of Absolute Spirit (Geist), coming to know itself through its objectification in the world. Ludwig Feuerbach (1804–72) inverted Hegel’s process by describing the Absolute (God) as a projection of human qualities, with religion being a stage to be overcome.
There are several interrelated senses of ‘ideology’ in the writings of Karl Marx (1818–83). In his early writings, culminating in The German Ideology (1846), which he co-authored with Friedrich Engels (1820–95), he demonstrated his intellectual debt to Hegel and Feuerbach. Like Hegel, Marx saw history as a law-governed process; and like Feuerbach, he wanted to reclaim essential human qualities, which had been projected outside the human being and become ‘alien’ powers. According to Paul Ricoeur (to whom I will return below), this early work represented a progressive characterization of ‘the real’ and its opposite, ‘the unreal’. The former was identified with praxis, the creative activity whereby human beings produce the material conditions of their existence. This activity carries within itself the possibility that the products of labour, including social institutions, assume an existence independent of the conditions that give rise to them. This is ‘alienation’.
The ‘German Ideology’ that Marx and Engels criticized was, nevertheless, the philosophy of the Young Hegelians, including Feuerbach. They found in this ‘idealistic’ philosophy precisely the sort of distortion that occurs when ideas become separated from their basis in real life. This leads to the illusion that society can be changed by replacing ‘false’ ideas with ‘true’ ones, as the Enlightenment thinkers believed, rather than by altering the material conditions of life. As Marx wrote in his eleventh Thesis on Feuerbach: ‘The philosophers have only interpreted the world, in various ways; the point is to change it’.
In opposition to German idealism, Marx and Engels proposed a materialistic philosophy that they believed would re-establish the true relationship between life and thought:
Morality, religion, metaphysics, all the rest of ideology and their corresponding forms of consciousness, thus no longer retain the semblance of independence. They have no history, no development; but men developing their material production and their material intercourse alter, along with this their real existence, their thinking and the products of their thinking. Life is not determined by consciousness, but consciousness by life.
This systematic or ‘epistemological’ conception of ideology, however, in which the distorting nature of ideology is internal to knowledge itself, simply inverted the problem of idealism. After all, how could the so-called material conditions of life have any meaning for us, if they were not already imbued with ‘ideas’?
Later in The German Ideology Marx and Engels provided a more political conception of ideology:
The ideas of the ruling class are in every epoch the ruling ideas, i.e. the class which is the ruling material force of society, is at the same time its ruling intellectual force. The class which has the means of material production at its disposal, has control at the same time over the means of mental production, so that thereby, generally speaking, the ideas of those who lack the means of mental production are subject to it. The ruling ideas are nothing more than the ideal expression of the dominant material relationships, the dominant material relationships grasped as ideas …
In this sense, ideology serves the interests of a particular group, the ruling class. According to Marx and Engels’ description, the division of mental and material labour allows a section of the ruling class to become its professional thinkers, ‘its active, conceptive ideologists, who make the perfecting of the illusion of the class about itself their chief source of livelihood’. We might describe this group as the intelligentsia.
Marx’s later formulations of ‘ideology’ did not escape from the paradox inherent in his earlier writings. For instance, in his Contribution to a Critique of Political Economy (1859), he described ideology in terms of a ‘superstructure’ that depends on an ‘economic foundation’ (what he later called the ‘base’):
The mode of production of material life conditions the social, political and intellectual life process in general. It is not the consciousness of men that determines their being, but, on the contrary, their social being that determines their consciousness.
The first sentence reflects the political definition of ideology, while the second reflects the epistemological one. But is the ‘mode of production of material life’ not already ‘informed’ by ideas? And what happens to ‘the social, political and intellectual life process’ once the communist society has been achieved? What forms of intellectual life would exist then? And what did Marx think was the status of his own theory in this schema? Were his ideas exempt from the very causal process that they described? In short, the base–superstructure model presents the relationship between activity and ideas in almost mechanical (economic) terms, and this oversimplification cannot do justice to a theory of ideology. Consciousness is not a passive reflection of an independent world, and ideas have a more positive role to play in constituting our subjectivity.
The tension between the epistemological and political senses of ideology was bequeathed by Marx to the theorists who followed him, and resulted in various attempts to overcome it. Vladimir Lenin (1870–1924), for instance, declared that one simply had to choose between bourgeois and socialist ideology. For Georg Lukács (1885–1971) all thought is ideological, but that doesn’t make all thought (or ideology) equal. Antonio Gramsci (1891–1937) conceived of ideology as an element in the phenomenon he described as ‘hegemony’, that is, rule exercised through the institutions of civil society (family, school, media) as opposed to economy and state. Although coercion remains a possibility, through hegemony a dominant power secures its authority without recourse to it. In the words of Terry Eagleton, hegemony is ‘the “common sense” of a whole social order’. Although this would be true of any social formation, capitalism appears to represent a decisive shift in the ratio of consent to coercion; for the use of force, the naked manifestation of power, is only likely to reduce ideological credibility and destabilize the political status quo. As Machiavelli had recognized four centuries earlier, ‘deceit’ is more efficient than pure force. Hegemony represents the internalization of power – by its means, the individual lives under the illusion of self-government.
The epistemological circle inherent in Marx was not overcome by any of these theorists, since each one had to recognize his own historical situatedness, and this undermined any claim to objectivity in his theory. Perhaps the most sophisticated attempt was made by the Frankfurt School neo-Marxist Jürgen Habermas (b. 1929). In Knowledge and Human Interests (1968), Habermas distinguished three groups of sciences, each with its own distinctive ‘interest’. For the natural sciences this interest is one of technical control and manipulation. For the historical and interpretive sciences it is communication. Finally, the critical social sciences have emancipation as their objective.
Habermas credited Marx with having elaborated a theory of human nature and society in terms of practical interests. For him, Marx was actually engaged in forms of historical-interpretive science and critical social science. Under the influence of Enlightenment thinking, however, and in particular the celebration of the natural sciences, Marx conceived of his work in terms of natural science. For Habermas, then, Marx was himself subject to a form of ideological thinking – the ideology of Enlightenment attitudes towards science. Therefore, although Marx’s critique of capitalist society was still relevant, his categories had become redundant. Habermas wanted to reinstate the historical and interpretive sciences, in recognition of the fact that social activity is inherently meaningful, and cannot be reduced to causal explanation. Human beings do not simply interact, but exchange symbols in a constant process of communication, a process that can only be understood through interpretation.
In ideology, however, interpretation is not straightforward, since ideological communication is distorted, bearing as it does the marks of the power relations that pervade society. Therefore, Habermas insisted that the historical-interpretive sciences must give way to the critical social sciences, since only they have the ‘distanciation’ required by ideology critique. His ‘theory of communicative action’ represents his attempt to overcome ideology by positing an ideal of unimpeded communication towards which all utterances tend.
This notion brought Habermas into conflict with another tradition within European thought: the hermeneutical tradition in the Philosophy of Language. In order to understand this, it is necessary for me to retrace my steps.
As an undergraduate I had been attracted to the theory of the ‘language game’ developed by Ludwig Wittgenstein (1889–1951), which supported the notion that words only have meaning in a context. My postgraduate research introduced me to the more radical idea that humans are interpreting beings per se. Historically this development represented the movement from interpretation as a regional discipline (e.g. Biblical interpretation) to interpretation as fundamental to all human activity. This shift had been initiated by Edmund Husserl (1859–1938), particularly with his notion of the Lebenswelt or ‘life world’, which was taken up by his student, Martin Heidegger (1889–1976) and carried on by the latter’s student, Hans-Georg Gadamer (1900–2002), in addition to others.
Thus philosophical ‘hermeneutics’ (from a Greek word meaning ‘interpretation’) was born. Gadamer emphasized the situatedness of human understanding within a ‘horizon’ of meaning. As we grow up we adopt the interpretative framework of our culture. Tradition has a determining influence over us. It was this consideration that led to the famous (and inconclusive) debate between Gadamer and Habermas, with the latter arguing that it is possible to transcend and criticise tradition, and the former responding that criticism may take place from within a tradition but can never entirely transcend it.
Paul Ricoeur (1913–2005) took the notion of ‘situatedness’ further. He proposed that the philosophical search for meaning could no longer assume direct access to the truth. Instead it was necessary to take a ‘hermeneutic detour’. In particular, Ricoeur claimed that Marx, Nietzsche and Freud (whom he termed ‘masters of suspicion’) had shown that human beings are unconsciously determined by forces that are greater than themselves, whether the force be relations of production, the will to power, or the libido. Any search for truth would have to negotiate such factors.
Ricoeur later developed an interest in ‘narrative identity’. According to this theory, our life experiences are not disjointed episodes but rather integrated by the individual into a coherent story or narrative. In this way, the meaning of our lives is constructed through a process of interpretation, even if such a narrative is never overtly expressed.
In conclusion, we see that the history of ideology and the history of the philosophy of language intersect at the point of tension between our historical (and linguistic) situatedness, on the one hand, and our attempt to overcome the distortions that arise from it, on the other.
First experience of teaching and the move to Cambridge
At the time that I commenced postgraduate studies I also became a tutor to philosophy undergraduates. This was my first experience of being at the front of a classroom, and I grew to like it, even if it didn’t come naturally to an introvert. One of the challenges of the job was the marking of a hundred or more essays on the same topic. This was my practical introduction to the ‘normal distribution’, although I was not then familiar with the term. It was clear that there was a range of ability in any group. I wouldn’t tutor university students again for another seventeen years, and there would be a new challenge by that time.
Before that, however, I moved to Cambridge. With the help of a letter from the Professor of Philosophy at UCD, I gained a reader’s ticket for the University Library and conducted some private research there. I got a job in Heffers, the main academic bookshop in Cambridge and an institution in the town (since bought out by its Oxford rival, Blackwell). I worked there for four years, dealing with students, academics and the general public, and I encountered many interesting people, including some celebrities, and my first wife. I met the writer, John Cornwell, and became his assistant in the Science and Human Dimension Project, a ‘public understanding of science’ body based in Jesus College. Part of this role involved conference organization. Eventually I moved from bookselling to publishing, first as an editor and project manager, and then for a small company that specialized in digital encoding (XML).
At this time I was also a member of the Scientific and Medical Network (SMN), an association of people with a broad interest in the sciences. They ranged from well-known academics, on the one hand, to university graduates and other intellectuals who maintained an interest in what might be described as ‘progressive’ ideas. As I later discovered, at the other end of the spectrum were ideas that are sometimes referred to as ‘lunatic fringe’. I attended meetings of the local SMN group, and wrote a couple of book reviews for Network, the SMN journal.
In 2003 I moved to Australia and became the primary carer for my two children, my wife having secured a senior academic position. When, in 2006, my youngest started at a Montessori school, I grasped the opportunity to tutor undergraduates taking compulsory units in Philosophy and Ethics at a local university. Admittedly these students were not taking a degree in Philosophy, and there was the same range of ability that anyone might expect, but one factor came as a complete surprise to me.
In the intervening decade and a half, society had witnessed the advent of the Internet. Initially this had seemed like a useful tool for communication and the exchange of information. Email, for example, was fast and convenient. Then came websites conveying basic data. Subsequent developments saw the Web becoming more interactive (so-called Web 2.0): YouTube, Wikipedia, social networking, and so on. Illich’s notion of a ‘learning web’ was not simply realised – his expectations were exceeded, at least potentially.
My first intimation of the effect of this social change on education came from reading student essays in 2006. I found myself reading text that could not possibly have been written by the students I knew in the classroom. It was simply not their authorial voice. It was not necessarily the case that I was being given an essay that had been found online, although I became aware of this possibility. Rather I was seeing whole swathes of text that were being copied and pasted. At their worst, some essays were a patchwork of chunks of text cobbled together into the semblance of a coherent essay, with the occasional substitution of terms with synonyms. The problem was that they were usually not coherent, and such submissions took me more time to mark, because I had to trawl the Internet to find the source of the various pieces in order to demonstrate their origin. Later I learned that websites had been developed for automatic checking of this kind, at the essay submission stage, precisely in order to combat fraud and reduce the time wasted.
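For readers curious how such automated checking works in principle: text-matching services typically compare overlapping word sequences (‘shingles’) between a submission and indexed sources, flagging passages with high overlap. The following is only a minimal illustrative sketch of that core similarity measure, with function names of my own invention; commercial systems index millions of documents and are far more sophisticated.

```python
# Minimal sketch of n-gram ("shingle") overlap, the core idea behind
# automated text-matching. This is an illustration, not any commercial
# service's actual algorithm.

def shingles(text, n=3):
    """Return the set of n-word sequences appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard overlap of the two texts' n-gram sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

source = "the philosophers have only interpreted the world in various ways"
submission = "the philosophers have only interpreted the world differently"
print(round(jaccard_similarity(source, submission), 2))  # prints 0.56
```

Even with a synonym substituted at the end, the two sentences share most of their three-word sequences, so the overlap score remains high; this is why the synonym-swapping described above rarely defeats such checks.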
Naturally these essays did not score highly, and in the worst cases they were an instant ‘Fail’. More surprising to me was that the offending students seemed to think that their inclusion of a web address in the references prevented them from falling foul of the university’s plagiarism policy. Furthermore, since these were compulsory units, many students were only concerned with obtaining the required ‘Pass’ grade needed for graduation. If they had postponed the unit(s) until their final semester, sometimes in a four-year course, a ‘Fail’ meant that they had to repeat the unit and postpone their graduation.
These were not minor considerations for the students concerned, since they had both financial and social consequences. What bothered me, however, was that such practices were inimical to the sort of deep learning that I had acquired as an undergraduate. The offending students were simply not acquiring the hard-earned skills of analysis and synthesis that had been necessary to good research and writing in the pre-Web era.
After a year of tutoring I got a job in the Extension department at the University of Western Australia, that part of the institution with a particular vocation for community outreach, and located in the beautiful building and grounds of the former Claremont Teacher Training College. My areas of responsibility were Intellectual Adventures (which included the philosophical courses), Languages, and Writing and Communication. My time in this role was very rewarding: for me it represented education in a very positive sense – people studying topics of interest to them, motivated only by the love of learning, without testing or the awarding of qualifications (at most, there were certificates of participation). Courses were offered both by academics and external providers. Many of our ‘clients’ were of mature age, with enough disposable income and spare time to participate in such activities.
After a year and a half I was offered a position in the University Vice-Chancellery, and I experienced a very different side to the university, one more concerned with governance. In my role as a senior research officer, I became aware of various trends in third-level education, including institutional reorganization and the development of novel architectural spaces. It was during my twelve months in this position that I heard that the Faculty of Education was going to offer a Master of Primary Teaching degree. I enrolled in this course and spent the next two years studying the various ‘content’ areas of the curriculum, as well as principles of pedagogy, ‘special’ education, and ‘classroom management’. I graduated in 2011.
A large proportion of the course was devoted to practical and administrative aspects of teaching, as well as the content of the curriculum, but I was particularly attracted to the philosophical aspects of pedagogy, in particular the ‘social constructivist’ ideas of Lev Vygotsky (1896–1934). These I could relate to my background in hermeneutics. For example, according to social constructivism, education is a process of induction into the norms of a society, whereby meaning is actively constructed by the child. New information is understood (or interpreted) by being assimilated to existing knowledge frameworks. It is easy to see how such ideas are congruent with Gadamer’s ‘horizon of meaning’ and Ricoeur’s ‘narrative identity’.
Another unit that I particularly enjoyed was ‘Teaching and Learning with New Technologies’, which increased my understanding of the potential role of blogs, wikis, and other new technologies in education. The present blog began as a requirement for that unit. The lecturer created a useful wiki (E-language) containing resources for those with an interest in this topic. Of particular relevance here is the Myths of e-learning page.
What was not included in this course was the sort of philosophical questioning that was a natural part of my earlier degree. It was in Dublin that I had first read Ivan Illich’s Deschooling Society, whereas this book and others like it were not familiar to the majority of my classmates, most of whom were about two decades younger than me, and graduates in a range of degrees that did not include philosophy. It was at this time that I discovered other critics of schooling, such as John Taylor Gatto, although these ‘alternative’ authors were not on my official reading list.
I will return to a consideration of these writers in Part 3, but first I need to go into some detail about another aspect of the Web that I personally encountered during this period.
Continued in Part 2