Self-Direction in Graduate Education and Improving the Academic System
How can we support self-directed education at the graduate level?
When I was a student at the MIT Media Lab, I began thinking about the purpose and goals of graduate education. How could we improve the ways we currently try to achieve those ends? What changes could we make to better align graduate education with what we know about how people actually learn?
By "how people actually learn," I mean learner-led, interest-led approaches. People learn when they're curious and motivated, and when the material is meaningful to them. To learn more about this approach to learning, read these posts:
- Schools Zap Kids' Motivation and Mental Health
- Changing Our Minds: How Children Take Control of Their Own Learning
- Interest-led Learning in North Idaho
- Watch my TEDx talk: Kids Can't Be Taught But They Love to Learn
- How I Built a Career From My Strengths and Interests—and How Your Kids Can, Too!
Brief note up front: This is brainstorming, the early part of a conversation, not a completely thought-out suggestion for how graduate school might change. I'd love to hear your thoughts!
What is the goal of graduate education?
The first step in determining how we could improve graduate education is to consider the current goals. What's the purpose of attending grad school? What are students supposed to get out of the experience? What do schools and faculty gain?
The purpose varies by graduate program. Graduate programs can be roughly divided into two types: (1) research-focused programs, and (2) programs focused on practice, skills, qualifications, and professional advancement. In the former case, the goal is for students to become able to perform independent research in their discipline. In the latter case, the goal is to acquire deep knowledge of a particular discipline. The degree conferred on the student signals to the rest of the world that the student has achieved those goals.
How is that goal currently served?
Graduate programs generally have committees of faculty who determine the sets of skills and knowledge students ought to have before they graduate: the core classes, the course schedule, the material covered on oral or general exams, the expectations for a thesis or dissertation, and so on. Students may be required to take classes on research methodologies or lab skills; to work as a research assistant or teaching assistant; or to complete an independent project or paper. In research programs, students learn the background knowledge, the underlying assumptions, the methodologies, the ways people in the field typically pursue questions, and so on. In practical or knowledge programs, students learn the necessary information or skills for that field.
Problems with that graduate education model
As in earlier conventional schooling, other people (not the learner) decide what needs to be learned and tell the learner, top-down. The learner expects that what they are told to learn will be sufficient; generally, however, it is not.
Faculty decide what knowledge and skills are necessary to succeed, and focus on teaching those things. Thus, classes, labs, and field work usually focus on technical, factual, and some methodological knowledge within a discipline. But as methodologies evolve, new theories arise, and so forth, faculty may not know or teach everything students will need to know to thrive.
Generally not taught are many of the practical aspects of doing research as a student, or as a future researcher or faculty member—such as project management, grant writing, technical writing, teaching, mentoring, and advising. Students may gain experience in some or all of these areas, if they happen to have an advisor who shows them the ropes, or they happen to take an elective, or attend a workshop on the topic, etc. They are not generally part and parcel of a graduate education.
There is an implicit assumption in many grad schools that students will go beyond required coursework to learn what they need to know to succeed. However, since it is implicit, some students feel cheated by the system. They expected that if they attended all the classes and checked all the boxes, they'd be fine—and they're not. At the MIT Media Lab, this implicit assumption is made explicit, so students start out knowing that they're going to have to take (at least some) charge of learning what they need to know. And that helps, I think.
Example: What did my MIT Media Lab education look like?
One reason I chose to attend the Media Lab for grad school was the greater amount of academic freedom compared to other schools (read more about why I went to grad school). Instead of a long list of required courses and specific exams, the Media Lab gave a few guidelines, and left much of the curriculum up to the student and their advisor. That suited me just fine.
Nearly all students started in the two-year Master's program. They were required to take 2 classes a semester (adding up to a certain number of total class hours), take the thesis prep class the first semester of their second year, and work on their thesis after that. They were all funded by research assistantships. PhD students were all required to take a seminar class that focused on how to do multidisciplinary, Media Lab-esque research, one or two more classes (again, for some total number of class hours), then do their general exams, proposal, and dissertation.
The key reason for the relative freedom and lack of constraints was the multidisciplinarity of the research labs housed within the Media Lab. Very few required the same background knowledge or methodologies. They were, generally, each a combination of two or more other disciplines (biotech, tech and music, robots and psychology, etc.). As such, there wasn't a common set of core classes that could feasibly be assigned to all students. The only two shared requirements were that thesis prep class and the PhD seminar, both of which focused on how to do research generally rather than on any specific field.
The Media Lab's approach to education was heavily influenced by the constructivist and constructionist theories of Jean Piaget and Seymour Papert. Basically—lots of learning by doing, through experiences and action. Media Lab classes tended to be project-heavy. Students, as I said above, were explicitly told that they would be expected to learn what they needed to learn and it wouldn't always be through a formal class. The classes that students took were chosen with guidance and approval from their advisor, who was the only one above the student in the academic hierarchy who really knew what the student needed to know in their field. (And, as I said, even that wasn't perfect.) Students were also allowed (with advisor and/or committee approval) to deviate from the common structure of a thesis, exam, or dissertation—sometimes pursuing projects, performances, exhibitions, or other formats that better aligned with their learning goals.
So, on the whole, the Media Lab approach to graduate education as I experienced it definitely leaned in the learner-led, self-directed direction. It depended greatly on one's advisor, however; some allowed more freedom and self-direction than others.
How can the goals of graduate education be served in a more learner-led way?
If the goal of grad school is to acquire and then demonstrate the capability to perform independent research and/or deep knowledge of a discipline, what are other ways to do that? I included the example of the Media Lab because it's different from a lot of other grad programs, and shows one way that a program can move toward individualized graduate education. How else?
One way to be more flexible and learner-centered would be to take a set of guidelines for what a master or researcher in the discipline needs to know (perhaps based on the current course schedule or exam material), and develop (perhaps with a mentor or advisor and committee) an individualized plan for learning everything and demonstrating mastery. The degree would then certify that the learner had completed their plan.
Critical to success are mentors and advisors. Learners often need people who know more than they do about topics to help them plan a path, to suggest ways to learn about the topic, to suggest related topics or areas to explore, and to answer questions. For areas involving a lot of technical skills, observing or shadowing a mentor, or working closely with them in an internship-type format, can be invaluable in discovering how the field actually works in practice.
For example, I probably learned the most about experimental design not from classes or reading about best practices, but from being present in meetings during which my professors hashed out experimental designs and discussed why they might do things one way versus another.
Formal classes could still be offered—perhaps more project-based, discussion-based, and interactive—as well as labs or sessions for learning hands-on techniques and the practical aspects of the field. They're not incompatible with self-directed education (SDE). The core of SDE is learner autonomy and choice: deciding what, how, and when to learn; what to pursue; when to stop; what topics to cover. It doesn't mean that learners eschew all formal instruction. Some people like to learn that way. Many different education formats can be self-directed, because SDE isn't about the specific format of the information being acquired or the method of acquisition.
So, grad schools could still offer classes. But to serve SDE, they might:
- be offered on an interest-based timeline,
- use a flipped classroom format to take some burden of lecturing off the instructors,
- not stick to semesters as a length of activity for classes or projects, or
- allow students to delve deeper into what interests them about their field.
However, since the goal of a graduate degree (not the education part, but the actual title and letters) is to signal competence or knowledge to the rest of the world, learning has to meet some criteria to qualify as graduate-level, degree-granting work. For a degree to be conferred, someone has to vet the learner and verify that they have learned sufficient information, skills, etc. Anyone can be a self-directed learner, and anyone could decide to pursue topics that might be considered the subjects of a graduate education. But to be a graduate student, and then a graduate with a PhD or Master's or whatever, requires adherence to some set of standards, some examination, or some oversight. The rest of the world wants outside confirmation that a learner has learned something, not just the learner's word for it.
Above, I suggested coming up with a learning plan, with a mentor. That's not far off the current model: students generally need the guidance and approval of their advisor and committee. Grad school can function like an apprenticeship, especially in research-focused programs. How do we evaluate if a learner has achieved their plan? How do they demonstrate mastery?
I think it's easier to come up with learner-led approaches for research degrees, since you can essentially keep the current format: learn stuff for a few years, then do a big project (i.e., a dissertation) demonstrating that you know what you're doing. Require learners to do independent research in a way appropriate to their research questions, with a sound methodology, and to communicate the results in a way that shows their mastery of the subject. Leave the exact format of the project up to the learner, with advice from one or more mentors. Since writing about one's work is common and expected in academia, some written component could be required.
A project-based approach could work for practical and knowledge programs, too—some Media Lab classes worked that way. Students proposed a final project for the class that would demonstrate mastery of the knowledge covered in the class.
The Media Lab program that I described above leans in this direction, especially at the doctoral level. The requirements were somewhat flexible: first, you demonstrate through your general exams (which might be written tests, essays, papers, or projects) that you're familiar with work in your field, and you know how to engage with it and synthesize it. Then you present your research plan: a proposal sharing your research questions and how you plan to evaluate or study them—a chance for detailed feedback. Then there's the dissertation: do the proposed work and write it up. Everything is evaluated by a committee of experts (usually three). If they sign off, you're golden.
One problem is interpersonal dynamics. Some students have professors or committees who don't have their best interests at heart (e.g., I knew a student whose graduation was delayed because his professor wanted him to continue doing good work for her). Some of these conflicts can be partially bypassed or resolved in the current system by having an external set of school rules determining who qualifies to graduate (though in most cases, you still need your advisor and/or committee to sign off). Regardless, students currently switch advisors when interpersonal conflict is too great; learners could likewise switch mentors.
Another problem is that going full SDE makes it hard to compare students. Conventional schooling loves ranking and comparing students. Colleges and universities love their rankings and status. But individualized programs don't lend themselves to comparison—learners are following their own paths, acquiring different knowledge, or the same knowledge in a different order, and focusing on different things. It's not easy to line up two people like that and say who comes out ahead. I personally don't think this is a problem—it's more of a benefit—but the current academic system will see it as an obstacle.
Why bother suggesting changes to academia?
Many academics and former academics (especially the latter) acknowledge that the academic system is hugely flawed. They are happy to point at the problems. But instead of trying to fix anything, they tell current students, "You didn't create the problem, so it doesn't have to be your problem to solve. Get your degree and get out."
To an extent, this is true. Students shouldn't have to be the primary drivers of change in academia. In many cases, however, they are.
The same advice is given to current academics who are thinking of jumping ship. Not your problem anymore; don't bother.
But if so many of the people who see the problems leave, who's going to fix anything for anyone else? It feels short-sighted and a bit selfish.
We have a positive obligation to leave the world better than we found it, to improve the system for those after us, even if only in our own small way. For example: writing a book to help grad students find balance, joining organizations that are rethinking academic structures (à la the Ronin Institute), or brainstorming ways we can better align graduate education with how people learn best.

Header image credit: Madcoverboy at English Wikipedia, CC BY-SA 3.0