Monthly Archives: May 2020

Thoughts on the Diarmaid Ferriter article

Historian Diarmaid Ferriter’s recent Irish Times article will have resonated with a lot of people working in the higher education sector, including me. I had recently been chatting with a colleague about rumours that our online materials would have to come suitably “branded”. I’ve no idea if that is the case, or what “branding” means in this context, but it’s not a word one would have heard in a university even ten years ago. I’d also been involved, peripherally, in a project in which new programmes would be developed, but part of the ‘deal’ was that certain ‘innovative’ pedagogies would be required in these new courses, i.e., imposed. Again, I’ve no idea to what extent these pedagogies were non-negotiable, but I was taken aback. The development of new programmes has always been a bottom-up process, especially in DCU where there is a genuine culture of innovation. It dates back to NIHE days. Being required to adopt certain approaches to teaching, approaches that I fundamentally disagree with, was a massive change of ethos, at least in my head. Nonetheless, many people seemed happy to go ahead with this approach.

And this is the fundamental problem. Often we read about the tyranny of “administrators” in our universities but the reality is that many of these administrators are actually academics who have gone into ‘management’.

We tend to assume that all people who become academics do so out of a love of knowledge, of research and, in some cases, teaching. While many probably start out that way, quite a large number seem to run out of steam, often for very legitimate reasons, both personal and systemic, and move into administrative roles. And many of them, especially those with a systemising mindset, take to their roles with gusto. Sometimes, administering is a lot easier than the constant pressure to fund and create new research, all the while teaching and administering large undergraduate classes.

A relatively common feature of such academics is that they tend to distrust their colleagues and have an awful tendency to introduce layers of bureaucracy to keep us all on track. Often this lack of trust is based on the poor practices of a small number of academics. Instead of those academics being held to account, we all have to fill in more forms.

I joined the “dark side” in 2014 and served for three and a half years as associate dean for teaching and learning in my faculty. I enjoyed the job immensely, especially my interactions with people, but it was a balancing act. I strongly believe in giving academics freedom to teach in whatever way they see fit, to allow them scope for some spontaneity and not be bound by syllabuses they wrote a year previously. I’m also suspicious of the whole concept of Learning Outcomes. So my time in the job required me to walk a fine line between not doing the job I had signed up to do and becoming a sort of control freak, as can happen, and does happen.

But it is interesting to consider how we got here. To some extent, I have sympathy for senior management because the government gave us an impossible task: expand student numbers while receiving less state funding. The institutions had to raise more money, and how they did so trapped them in never-ending cycles of growth. New programmes, more international students, more staff and more research grants provided the funds to introduce new programmes, recruit more international students, recruit more staff, earn more grants… You see where this is going. When UL launched their strategic plan a few months ago, it could have been written by any other institution in the country, or even the UK. The key theme was growth: more students, especially international ones, better facilities (to entice more students), stronger connections with industry. The growth model for higher education has become so embedded in our culture that it is very difficult to see how changes can be made. Indeed, observing a Twitter conversation between two very well-known business/finance academics, I was struck by how their thinking was exactly the same as the establishment’s thinking in 2008: it was all more of the same.

Despite the understandable bind that senior managers have been in, we have been our own worst enemies. Irish institutions have worked themselves into a prestige race, just as the American universities have. As a result, we have expanded our reach way beyond teaching and research: engagement and impact are now what it is all about. Under some circumstances this would be fine in my view, but when there is so little money about, one would have thought that it was time to circle the wagons and focus on our core activities. But there has been something unstoppable about the corporatisation of our universities and it’ll take enormous courage and no little skill for individual institutions to chart a new way forward.

One of my biggest problems with the current model of higher education is that while we (as in the sector) talk a good game when it comes to access programmes, the emphasis on growth and prestige has closed many doors to school-leavers from disadvantaged backgrounds.

I think the perfect example of this (and this won’t make me many friends in DCU!) is the teaching of Actuarial Maths. Becoming an actuary used to be well within the reach of a young person from any background once they had an A in honours maths. But all institutions saw actuarial maths as an opportunity to grow student numbers: not just any old students, but students with well in excess of 500 points. So we now offer degree programmes all over Ireland that are effectively a taxpayer gift to the finance industries. We’ll train your recruits in such a way that they won’t have to take so much study leave! And I imagine you could say the same about many other professions. Meanwhile, if you’re living in the wrong postcode area, your chances of going to university are minimal.

More thoughts on lower and higher-order thinking

I’ve been too busy being a parent over the last few days to write much, but ideas have been swirling around in my head. So I hope to write a couple of posts this morning.

My first post today is a follow-up to this one on the concept of higher and lower order learning. In that post I was trying to make two key points:

  1. The idea that remembering is somehow a lower form of cognitive processing than, say, evaluating or analysing, is not a concept that is based on any science. It has a certain air of plausibility about it, but plausibility is not good enough. Like many comparisons made in the world of education, the idea that one form of cognitive activity is better or more complex than another is based on comparing one thing at its worst with another thing at its best. An example of this would be education researchers extolling the virtues of the perfectly designed (and functioning) flipped classroom over a dull lecture like the one from Ferris Bueller’s Day Off. No one ever compares a flipped classroom where half the students haven’t bothered to watch the previous night’s recorded lecture with an inspirational lecture like the ones I experienced in my graduate school days in the US.

Likewise, when people describe remembering facts as being low order, I suspect they have in mind things like knowing the capital of Uzbekistan or the population of Indonesia. Furthermore, when talking about “higher order” activities like “analysing”, the image is one of researchers or even students coming to grips with the complexity of a horrid problem like climate change. The remembering is always deemed to be trivial while the analysing (or problem solving) is always deemed to be demanding.

So let’s talk a little bit about what is involved with remembering. (Speaking as a non-expert of course but someone who has taught for a very long time.)

Committing knowledge to memory involves the storage of new knowledge in our brains. But how that information is stored is complex. First, if we are to remember new information, we must adopt proven strategies like self-testing and spaced practice. It doesn’t just happen. Secondly, when we remember new information we don’t store it in our brains like we would a book on a shelf or a bit of information in a computer. No, most of the time, we will connect that new information to memories we have already stored away. If that connection is not made, then we are unlikely to remember it. Therefore, we should think of our memories as being more like a web, or even a jigsaw, than a filing cabinet. Although I’m never quite clear what is meant by the term “constructivism”, it seems to me that one of its basic tenets is that new information is employed to construct new schema, or webs of knowledge, in our brain. On that, I would agree.

So when students are being taught new knowledge, they are, or at least should be, connecting it to previously acquired knowledge. In History, for example, new events are nearly always related to past events (as the current riots in the US are), so if you are listening to a lesson on, say, the civil rights movement in the US, you’re more than likely to be taught how the events of the 1960s ultimately stemmed from slavery and the American Civil War. So remembering is a complex process, at least if memories are to endure and if they’re to have any meaning. When traditionalists like me talk about the importance of knowledge, these are the kinds of things we’re talking about. But we also stress that having knowledge stored (remembered) in long term memory frees up short term memory so that the thinker is not overwhelmed by cognitive load when tackling cognitively challenging tasks.

In my view, and in the view of many prominent educators, the tendency to shift the focus from knowledge to so-called skills is a huge mistake because knowledge is the currency of thinking. Anyone who doubts that knowledge is being devalued can simply look at the curriculum specs for the Junior Cert and see how little knowledge is actually specified. In the case of the science curriculum, the relevant document is 29 pages of uninspiring waffle filled with education buzzwords.

Likewise, there is a growing trend in education departments in universities to promote the teaching of “problem solving skills” through the use of what used to be considered toys, like Lego and the like. For many, this is a fundamental error: playing with Lego teaches a child to be good at solving Lego problems, not the “real world” problems that educators are so keen to talk about. Cognitive psychologist Daniel Willingham has pointed this out on many occasions.

So, to conclude, remembering is not a simple process, requiring, as it does, good study practices and the connection of new knowledge with previously remembered knowledge.

 

  2. The second point is related to the first and I have hinted at it in the above. Let’s imagine we are asked to analyse a certain set of data, Covid-19 data, for example. Analysis is supposedly a higher form of thinking than remembering, at least according to Bloom. So what happens? Answer: we start remembering. We remember that Excel is a good tool for analysing data; we remember how to plot data and the pros and cons of different approaches to plotting data. We remember, for example, about linear scales and log scales and how interpolating on log scales is tricky. We might remember something about differential and cumulative analysis, about means and modes, about rolling averages, about curves and slopes (calculus!), maybe even something about error bars and delays.

 

The whole process of analysing the data is not some mysterious cognitive skill distinct from remembering. In fact, analysing and problem-solving, and all those so-called 21st century skills are no more than memory in action. As educators, especially engineers like myself, we recognise that to be an efficient problem solver, or similar, lots of practice is crucial. Practice helps to recall approaches to problem-solving in particular circumstances. It’s a kind of meta-remembering, a fancy word for gaining experience. Indeed, experience is nothing more than having a large stock of memories of problems and situations that one has encountered before so that rather than attacking every problem anew, you simply draw on your memories of how you solved similar problems in the past.
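To make that concrete, here is a minimal sketch of that remembered toolkit in action: smoothing daily case counts with a rolling average and plotting them on a log scale. It’s in Python rather than Excel, and the file name and column names (covid_daily_cases.csv, date, cases) are made up for illustration.

```python
# A sketch only: assumes a CSV with "date" and "cases" columns.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("covid_daily_cases.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

# Remembered technique 1: a rolling average smooths day-to-day reporting noise.
df["rolling_7d"] = df["cases"].rolling(window=7).mean()

fig, ax = plt.subplots()
ax.plot(df.index, df["cases"], alpha=0.4, label="daily cases")
ax.plot(df.index, df["rolling_7d"], label="7-day rolling average")

# Remembered technique 2: a log scale turns exponential growth into a
# straight line, making changes in the growth rate easier to see.
ax.set_yscale("log")
ax.set_ylabel("cases (log scale)")
ax.legend()
plt.show()
```

Every line of that is recalled knowledge: which library, which transformation, which scale. None of it is a mysterious “skill” floating free of memory.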

Finally, a word about “creating”, a “skill” that tops Bloom’s pyramid. While personality traits (like risk aversion, for example) play a role in creativity, it would be a mistake to think that the ability to create is some sort of ill-defined cognitive ability. In fact, being able to create depends enormously on having prior, relevant knowledge in one’s long term memory. A number of years ago, over a thousand entrepreneurs on the West Coast of the US were surveyed (I’ll try to find the link when I have time). The best predictors of entrepreneurial success were being in your forties, being an engineer, working for a large company, and being a second generation American. It’s hard not to think that being knowledgeable and experienced is a key factor in becoming an entrepreneur, a creator.

Some thoughts on cocooning

Today I was going to expand on yesterday’s post but I had a Zoom meet-up last night with some old pals and my brain is somewhat sluggish this morning. If I say that Jameson was involved you’ll understand.

So, instead, I thought I’d share the following which I wrote the other day for CFIreland. One of my aims over the coming years is to challenge the narrative, perpetuated by self-appointed “advocates”, that people with chronic illnesses are consumed by their condition. We’re not.

__________

I’ve been in lock-down for nearly three months now and while there is a feeling of Groundhog Day about my current existence, I’m doing fine. It helps that I’m an introvert at heart and while I enjoy being with people, I’m also happy enough to spend long periods on my own. It does help, of course, that we can all stay in contact with our friends and loved ones through WhatsApp and social media. Being locked down in the 1970s would have been a much different proposition.

I’ll be 57 in June having somehow survived pretty much everything CF can throw at a person, including multiple infections, collapsed lungs, blocked bowels, years on home oxygen, and ultimately a double lung transplant (in 2002) and a kidney transplant (in 2011). Except for about six months in 2003, I have worked without any major interruption since I was 23. I’ve been incredibly lucky and can’t claim any credit for my survival.

I’ve been a lecturer in DCU since 1986 and while the job has provided me with a lot of flexibility, it has been demanding. In a curious way, work, not CF, has dominated my life and if anything defines me, it’s my work. It has consumed me.

Anyway, by the middle of 2019, I was feeling very tired, both emotionally and physically. I think I was suffering from burnout and this was not helped by the fact that my transplanted kidney is not working very well. I had developed a sort of depression/anxiety for which I decided to take medication, medication that seemed to put me back on an even keel. I think we all need to be prepared to admit that we are suffering mentally and not try to battle through. We take bucketloads of medication for our malfunctioning bodies, so why not for our malfunctioning ‘souls’?

Nonetheless, the lock down period has been a welcome break for me. I’ve had to work, of course, because we had to move all our teaching online and that was a bit of a learning curve for us all. But I enjoyed the whole process – it was something new.

But now as things are quieter for me, I can pause and reflect, take it easy, read more, write a bit, and generally do the things that my work schedule didn’t allow. I go for a walk every day, usually before 9am, and I always wear a mask. I don’t miss the early starts, the sitting in traffic and the useless meetings that seem to be a feature of all workplaces. But I miss my friends and colleagues, and even my students. Students keep me young!

My only real worry in all of this is that my wife, Julie, who lives separately with my son, Leo, might catch Covid-19 and I’ll be left to bring up Leo on my own. Given my health and limited life expectancy, that is a terrifying prospect. But as I’ve done throughout my CF life, I try to park those fears and just focus on the here and now. That tactic has worked for me so far.

I’m not sure when, or if, things will go back to normal for me. My employer is understanding and knows that I may have to teach online for a lot longer than my colleagues, but one thing I know for sure is that I’m not going to risk Leo’s future by being reckless and trying to be a hero.

So for now, I’m happy to live a day-by-day existence but I have to admit that I miss going down to Wicklow and walking in Glendalough and generally being close to nature. In the meantime I’m just glad to be able to hear the dawn chorus, louder than ever because of the reduced traffic levels. Life could be a lot worse.

 

The myth of lower and higher order thinking

One of the main arguments against the Leaving Cert is that it only tests “lower order thinking”. This conclusion is based on the undeniable fact that memory plays an important role in preparing for that exam. The idea that remembering is a lower-order form of learning has its origins in Bloom’s taxonomy (see Fig. 1), but most of Bloom’s ideas are the equivalent of stamp collecting or bird watching, or even trainspotting. It’s a way of categorising different forms of learning but has no basis in cognitive science, or indeed any kind of science. Most people realise this, and Bloom’s ideas, though interesting from a historical point of view, are rarely taken seriously anymore. To quote the Bee Gees, Bloom’s pyramid is only words.


Figure 1: Bloom’s Taxonomy

Discussions around the role of memory in the education system intrigue me. Many people confuse memorisation (a key part of learning and becoming fluent at algebra, for example) with “rote learning”. Technically, rote learning involves learning without “understanding”. But the idea that learning without understanding is endemic in our education system is a myth. As Daniel Willingham pointed out in his book, “Why Don’t Students Like School?”, remembering information that has no meaning to you is extremely difficult. It’s not a good learning strategy to “rote learn”. We all know this, and hence we all use meaningful words and phrases as our computer passwords. It would be far wiser from a security point of view to use gibberish as your password, but nobody does that because everyone knows you’ll forget it and get locked out of your account. The key point is that our ability to recall is inextricably linked with meaning. There is a wonderful example of this phenomenon that I came across many years ago, in which trying to learn off a paragraph of cricket commentary was used to illustrate the difficulty of learning without meaning. If you understand cricket, it might be easy enough to recall a paragraph or two, but if you don’t, you’ll be bamboozled by words and phrases like “fine leg”, “silly mid-off”, “gulley” and “cover point”. You’ll be lost and you’ll be unlikely to remember anything.

But what about the trend of students learning off essays and pre-preparing answers to questions about, say, Hamlet’s character flaws? My response to that is this: when was the last time you wrote a coherent, grammatically correct essay in forty minutes? The tactic of learning off essays is, in fact, an intelligent, tactical response to a system that places unrealistic demands on you. Being “smart” is a key skill to have in life, and adopting a tactical approach to exam preparation is actually a sign of intelligence. I’ve done it myself, and my students, even final years, continue to do so. I know this because they tell me.

But to get back to the whole question of memory and understanding, let’s ask one key question: what is the difference between memory (or stored knowledge) and understanding? My answer would be that there is no difference: understanding is just more knowledge, more memory.

Suppose I’m teaching a mathematical technique or concept and the student just doesn’t get it. What do I do as a teacher? Well, I must recognise that I need to adopt a new approach and so I draw on my long term memory and approach the technique or concept from a different angle. I might switch from an abstract approach to a geometrical one, for example. What appears to be “understanding” is really a reflection of the fact that I have more knowledge of the subject and, crucially, have more knowledge of how different ideas and perspectives are connected. This web of connected knowledge is termed a schema and manifests as “understanding”. In effect, if we want students to become smarter, we need them to remember more because if we don’t have knowledge stored in our long term memory, we have nothing to think with.

But to get back to my original point, the idea of creating a learning hierarchy with “creating” at the top is completely arbitrary and we should stop using such unscientific language when talking about education.

Finally, I think it’s a good idea to keep the LHFE in mind. This is the “long haul flight effect”. I like to imagine that going on a long flight might be a nice experience if the passenger next to me knows a lot and can hold a discussion about any number of interesting topics. Who wants to sit beside someone who remembers very little? Acquiring knowledge and remembering is not just important in our formal education but also in our personal and social lives.

Metathesiophobia* in education

Yesterday I was reading one of those seemingly ubiquitous articles on how the world is changing so rapidly (“exponentially!”) that the education system needs to be revolutionised to make it “fit for purpose” in the 21st century. I don’t know why I read articles like this because they drive me nuts. These articles usually contain claims about the “4 C’s” or, in one memorable talk I went to in DCU (not by a DCU colleague), the 9 C’s! What are the odds that so-called “21st century skills” would all begin with C? Mind you, there is some opposition to the dominance of the C’s and there is a growing campaign to advocate for the 4 P’s. One of the 4 P’s is “passion”, the favourite word of Masterchef contestants who are invariably passionate about celeriac or turmeric or locally-sourced ingredients.

But despite my irritation, I am genuinely intrigued by academics and consultants who seem fixated on the idea that the world is changing. The world has always been changing. Every single decade of the 20th century saw enormous change. In the 1990s, the world of work was changed utterly and for the better. Whereas once we would print out memos and put them in our colleagues’ pigeon holes, in the 1990s we began to communicate by email. The world wide web revolutionised the way we sought and consumed information. Every year or two Intel would release a new Pentium chip. Microsoft Windows and the mouse completely changed how we worked on our desk-top computers. Laptops (very heavy ones!) came on the market beginning the mobile computing age. And of course, the smartphone was developed and was to become a genuine game changer.

Most of us who worked through the 1990s had come through what many educationalists refer to as the “factory model of education”. Some of us had plenty of experience of programming (e.g. with FORTRAN) on mainframe computers but many had effectively no experience of using any kind of computer.

And a funny thing happened: we coped. In fact, we quickly saw that much of the new technology that we had at our disposal was going to make our jobs a lot easier.

So if we could cope then, why not now? Since the turn of the millennium the constant change of the 20th century has continued, but I would argue that change in the 21st century has been more incremental than transformational. There are exceptions of course, especially in molecular biology where gene editing techniques are now widely used. And if you say “what about AI?”, it’s worth noting that the first artificial neural networks began to appear in the 1960s, if not earlier.

One of my big issues with the whole “the world is changing exponentially” meme is that it leads down a sort of nihilistic blind alley where acquiring knowledge is devalued to the point where the very purpose of education is couched purely in terms of “skills”. And so, senior academics are often to be heard suggesting that the purpose of education is not to acquire knowledge and wisdom but to “learn how to learn”. The irony of this particular cliché is that it presupposes that at some point we will have to learn something, which kind of contradicts the idea that the purpose of education is merely to learn how to learn.

I have a theory that the people who promote the idea that the world is changing so fast that we need to transform education are the very people who are most fearful of change and showing signs of panic.

 

*Fear of change, apparently. God bless Google.

A question of loyalty

Let’s suppose your institution is promoting ideas that you believe are potentially damaging. What should you do? It’s a tricky question, especially if you have a certain degree of attachment to your institution, as I have. As someone who’s been around since NIHE days and who’s probably one of its longest serving employees, I’m very proud of what DCU has become, and I’m particularly proud of the small part I’ve played in DCU’s transition from a small, oddly-named college, to a university with a solid international reputation. But that doesn’t mean that I believe in everything we do.

So when fellow academics promote approaches to, say, teaching and learning, with which I profoundly disagree, I find myself in a mental tug-o-war. What should come first: loyalty to the institution or loyalty to my principles? Of course, the simplest way to avoid a dilemma such as this is to just keep the head down and carry on in my own little bubble. But then I think of my son, who is only twelve, and I worry about the future of our education system. I don’t want it to be engulfed in what I consider to be plausible nonsense.

I think the way out of all of this is to do the work, read the literature and counteract what I consider to be nonsense via the academic literature.

Equity and the Leaving Cert

The Covid pandemic seems to have encouraged many people to think about equity in education – for obvious reasons. And, as one would expect, the Leaving Cert and the industry that has been built up around it are blamed for the inequities that undoubtedly exist. Whether it’s grind schools or the notion that “students learn in different ways” or the claim that “there are multiple forms of intelligence”, or even that “the Leaving Cert was designed for the industrial age”, everyone who has ever been to school seems to have some reason for calling for a revolution.

The grind school argument is valid but many of the other reasons quoted are spurious and based on pseudoscience. Howard Gardner’s ideas on multiple intelligences, for example, are largely discredited and the idea that the school system is “not fit for purpose in the 21st century” is little more than a cliché, a favourite of education consultants and corporations like Microsoft and Lego.

So, what would an equitable assessment system look like? Most people advocate for continuous assessment, but it’s never clear how this would reduce inequity. In fact, common sense would dictate that continuous assessment, particularly if it were to take the form of project work, would likely amplify inequities depending on the home circumstances of each student.

More fundamentally, when people advocate for the Leaving Cert to be replaced, what they really seem to have in mind is that it should lean less heavily on memory and more on “skills” like problem-solving, critical thinking and creativity. Aside from the fact that these “skills” are hugely dependent on having relevant domain knowledge (in long term memory), the current emphasis on them is arbitrary, not to mention instrumentalist. Yes, we would like students to be able to think for themselves, but why the emphasis on problem-solving and creativity? Why not conscientiousness, attention to detail or simply being knowledgeable, what we used to call being “educated”?

Finally, we need to ask what the key attributes of our education system should be. Should the system be inherently equitable even if it is flawed from a pedagogical point of view? Or should we design the best possible education system while providing a comprehensive set of supports for students from disadvantaged backgrounds? I know which approach I’d take.

Thoughts on evidence based education

Back in 2013, Ben Goldacre (he of Bad Science and Bad Pharma) started a campaign to make education more evidence-informed. His original talk can be found here.

Many educators (teachers, lecturers etc), most of whom could be described as traditionalists, found Goldacre’s intervention compelling and the ResearchEd movement, now a global phenomenon, was born.

Others were less than impressed and deeply resented a medical doctor trespassing on their field and having the temerity to tell them how they should be advancing the craft of teaching.

More than that, however, many educators who are best described as being “progressive” were much more strident in their criticism of Goldacre in particular, and the research-based approach in general. The problem was, and still is, that education is driven by ideology and gut-feeling. A person’s views on education are deeply connected with their thoughts around equity, diversity, child-rearing, and frequently, politics. Progressive educators have claimed the moral high ground of “child-centredness” and tend to characterise traditionalists as being right-wing, or even alt-right, and, bizarrely, racist. In their eyes, they’re the compassionate ones. They stress that every child is unique and that attempting to research your way into teaching excellence is fundamentally flawed, doomed to failure. Some even say the research-based approach is oppressive.

Anyway, when you have a large group of educators who, at the time of Goldacre’s talk, controlled the narrative, and then suddenly found themselves in a position where their beliefs were about to be put to the test, it’s somewhat understandable that they responded with a certain amount of aggression. When observing education debates on Twitter, especially those occurring in England, I am often struck by the extent to which progressive educators personalise the debate and engage in ad hominem attacks on fellow professionals simply because they do not think the same way. So, frequently, you will see Greg Ashman, a maths teacher based in Australia, and a prolific blogger, being accused of being “alt-right”, a ludicrous accusation.

This reluctance to adopt research-informed approaches is a key feature of education policy-making, especially in Ireland. The Junior Cycle is a case in point. It’s a bizarre thing of a yoke, as the fella said. There seems to be no curriculum and the “Curriculum Specifications” are dull, uninspiring and riddled with education buzzwords and clichés. Here’s the Science document. The constant emphasis on “process” rather than actual content is typical of current ideology (not evidence), an ideology in which specified knowledge is deemed to be unimportant compared to “skills”. One gets the sense that the designers believe that it’s more important to understand the scientific method than it is to know any science. I don’t know any science colleague who believes this. But somebody in the NCCA clearly does. This is not evidence-informed policy-making. It’s policy-making by opinion.

Which brings me to one of the biggest, most profound changes that occurred in higher education in the last fifty years. I’m talking about Modularisation and Semesterisation (M&S).

M&S was essentially a political project. It was designed to facilitate free movement of students throughout the EU and, in that sense, it worked.

But M&S was also an attempt to quantify learning and when combined with the learning outcomes (LOs) philosophy it helped to promote the concept of education as a transaction. Do this many credits and you will have achieved this many outcomes. To use a physics analogy, education became quantised. If taken seriously, M&S and LOs meant that the number of education “destinations” available was finite and well defined. But everyone knew that was nonsense and, in truth, the number of education destinations is infinite. There are many ways of getting 60% in an exam.

But the key aspect of M&S was that it was never subjected to any rigorous analysis from a pedagogy or cognitive science perspective. Yes, there was some scepticism expressed, and many academics made the point that it often takes a long time, a lot longer than twelve weeks, for challenging ideas to sink in. But this was a runaway train and nobody was going to stop it. Sometimes there’s a tide in the affairs of men and to oppose it is a waste of time.

I was broadly in favour of M&S at the time. Like many engineers and scientists, I had a systemising mindset (as Simon Baron-Cohen might say) and the tidiness of the new system appealed to me. The systemising mindset is common in academia, particularly among academics who move into administrative roles and then proceed to impose additional “processes” on academics.

But years have passed now. I’m more comfortable with untidiness, lack of uniformity and academics having the freedom to improvise and go their own way.

I also know a little bit more about education. And I think we created a problem for ourselves and it’s this: under the M&S system, students’ recall of material they have covered seems to have declined. Despite what many educators might say in this Google age, retaining knowledge is a crucial part of learning. I won’t go off on a tangent about this; instead, I recommend that anyone who doubts it read “Seven Myths about Education” by Daisy Christodoulou.

Let’s think about modern higher education and compare it to the Leaving Cert. For secondary school students, the Leaving Cert is a two-year immersion in their chosen subjects. For university students, a module is a twelve-week immersion in the relevant subject. That seems anomalous to me.

But within those twelve weeks, students tend to be busy with continuous assessment (another intervention that is believed to be a “good thing” without much evidence), and anecdotally it seems that students really only revise their lecture notes once. In cognitive science, there is a thing called the “forgetting curve”, and what is clear is that unless you intervene, with regular revision, acquired knowledge is lost very rapidly. We all know this instinctively. If someone asks you to recall the plot of a novel you read some time ago, you are more than likely to have forgotten whole swathes of it.
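For anyone who likes to see the shape of the idea, here is a toy sketch of the forgetting curve, assuming the classic exponential idealisation R = exp(−t/s) often used to summarise Ebbinghaus’s findings, where t is days since study and s is a “stability” that spaced revision is thought to increase. The numbers are illustrative, not fitted to any real data:

```python
import math

def retention(days_since_study, stability=5.0):
    """Fraction of material retained, on the exponential-decay idealisation."""
    return math.exp(-days_since_study / stability)

# Revise once and never again: retention collapses over a 12-week module.
print(f"After 1 day:   {retention(1):.0%}")
print(f"After 7 days:  {retention(7):.0%}")
print(f"After 84 days: {retention(84):.0%}")

# With spaced revision, each review resets the clock and (in spacing-effect
# models) increases stability, so forgetting slows. The doubling below is an
# illustrative assumption, not a measured effect.
last_review, stability = 0, 5.0
for review_day in (1, 7, 21, 56):
    last_review = review_day
    stability *= 2
print(f"After 84 days with spaced reviews: {retention(84 - last_review, stability):.0%}")
```

The exact numbers don’t matter; the point is the shape: one pass through the notes leaves almost nothing by exam time, while spaced revision flattens the curve.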

Sometimes, though, if material is very “memorable”, we might remember it for a little longer. This has been the dominant approach in education for a number of years. The focus, therefore, has been on making material more “engaging” and more “relevant to students’ lives”.

But this approach has obvious limitations. It’s hard to see how you can make thermodynamics engaging.

If you want to solve any problem, you have to admit that it exists. This is why I think the Irish Survey of Student Engagement is a missed opportunity. It tells us literally nothing about how students go about their studies.

M&S is here to stay, but we have to come up with some way of incentivising students to engage in continuous study (not CA).

The unsurprising performance of our work placement students

In my school in DCU we spend an awful lot of time angsting about our students. We worry about how much they study. We worry about their writing skills, their attention to detail and their numerical skills.

Today, I made two INTRA (work placement scheme) “visits” via Zoom. As expected, I got glowing reports from the employers. They thought our students were hard-working, independent, excellent communicators and, to my delight, very attentive to detail.

This happens every year – I’ve rarely received a bad report on my travels to companies. Occasionally you find that the employer is worried about the student because they are very introverted and are finding the need to connect with other employees very difficult. But most of the time, employers are very complimentary about our students’ overall ability and their work ethic.

What this tells me is that we don’t see the best of our students. It seems that when put to the test in the “real world”, they up their game and reveal a side of themselves that we don’t always see. It also suggests that despite everything, budget cuts and all, the quality of our graduates is being maintained.

Sometimes these INTRA visits are great therapy.

PS We need to do more of these remote visits in the future – for the sake of the climate if nothing else.

Why I’m drawn to the humanities

When I was a student I devoured books on astrophysics and cosmology, but also philosophy. In fact, I ploughed my way through Bertrand Russell’s History of Western Philosophy and at one stage fancied myself as a disciple of Spinoza. These days, I remember nothing of Spinoza’s thoughts but I remember that Einstein was a fan.

That was the early 1980s, a time when everyone was anxious that a nuclear conflict was about to break out between the US and the Soviet Union. At the same time, religion was beginning its long, slow decline, and many young people like myself were searching for meaning.

I didn’t stick with philosophy for too long although I did manage to complete a couple of adult education courses in UCD.

For many years, though, I stuck with reading books about physics. But, at some stage, I began to realise that no matter how accurate the equations became, we still talked about the physical world using imagery and analogies. You can talk all you like about the “space-time continuum” or “quarks” but you’ll never really know what these things actually are. It’s all too much for our brains to comprehend. But that’s not surprising because our brains have evolved to understand the macroscopic world not the quantum world.

So I found myself drifting away from the hard sciences and began to be more interested in the culture of science rather than the science itself. People were so much more interesting than equations.

These days I find myself increasingly drawn to the humanities. I think it all started when chatting to my late brother, Tony, about religion. Tony read a lot of theology, especially the work of Cardinal Ratzinger (Pope Benedict) so when we chatted, I was way out of my depth. What I really respected about Tony was that, starting from some basic axioms (e.g. that Jesus was the son of God), he had a great ability to construct a self-consistent edifice within which all of the church’s teachings followed logically from those initial axioms. The way he argued about religion was not unlike how a scientist might argue: he just had a different starting point. Of course, his arguments were, fundamentally, based on belief while a scientist’s thoughts are based on evidence (or should be), but the two cultures seemed to me to have a lot more in common than you might think.

We need to be careful about placing too much emphasis on STEM.