Why I have been insisting that my CAQDAS workshop is a food festival, not a cook-along show (image via @danielagduca).
I am involved in another MOOC that is currently in the pipeline. This course — my brainchild, as it was described at a working group meeting yesterday — is for prospective doctoral students, especially those coming from underrepresented backgrounds. It is exciting to see it taking shape, but the whole process is giving me a lot to reflect on.
- Marie-Alix Thouaille (2017). The ideal PhD researcher has no baggage. LSE Impact Blog, 26 September.
- London Higher (2022). Diversifying the pipeline to doctoral study [member discussion], 8 June.
- RT @farahbakaari I think the problem with graduate school is that a 27-year-old mortal is expected to possess the physical stamina of a 19-year-old athlete and the intellectual output of a middle-aged tenured professor. (21 September 2022)
- Neelam Wright (2022). Can we decolonise our doctoral training? Wonkhe, 5 October.
At the REDS conference in October 2019 (which, come to think of it, might have been the last in-person conference I attended before the world became what it is), one of the speakers, Sarah Blackford from Leeds, mentioned that there are two types of researcher developers: those who believe in guiding students to the training that is good for them (Team Broccoli 🥦) and those who believe in letting students choose the training they want (Team Ice Cream 🍦). She shared this lovely picture from a 2016 event to illustrate the point — also available in the conference slide deck, linked above (p.127).
Obviously you do both. The constant challenge, I find, is how you reconcile the two. This is something I regularly ponder, but I have been thinking a lot more about it recently, as I have become more substantially involved in our ESRC DTP this year. The funder articulates the expectation that each student’s self-identified development needs should be at the centre of the training programme and, simultaneously, that the institution should do something proactive about the fact that most students are not necessarily aware of the breadth of post-PhD career paths, especially beyond academia, and hence do not always know what skills they need to develop during the PhD. Don’t these two statements contradict each other? More to come on my recipe for delicious broccoli ice cream. Watch this space!
A colleague once described my work as “interweaving warm-blooded humans and the screens”, and I was delighted with that description. At that confluence of the social and the technical, I am becoming more and more mindful of the unequal power dynamics between disciplines in the face of new methodological developments such as computational social science, biosocial research, and digital humanities. This post is intended to be a little space for my own continuing reflection.
Recently, at a workshop on digital tools for the humanities, a Stanford graduate student rather poignantly noted that oftentimes collaboration with computer scientists felt more like colonization by computer scientists. This statement, even if not true, is far too sharp to ignore. Frankly, I think it is true. Not long after that workshop, I attended a THATCamp, where I spent my time teaching folks how to use Gephi, and tried to impress on them that the network they create is the result of an interpretive act. I don’t think they cared; I think they just wanted to know how to make node sizes change dynamically in tandem with partition filters. This is an issue that has concerned me for some time: the way wholesale importation of digital tools, techniques and objects into humanities scholarship tends to foster a situation where rich, sophisticated problems are contracted to fit conveniently into software.
If we are creating a mess by generating so many haystacks of big data that we are losing all the needles, then we need to figure out a different way of doing things, as we cannot sew new cloth without any needles. Whatever else we make of the ‘big data’ hype, it cannot and must not be the path we take to answer all our big global problems. On the contrary, it is great for small questions, but may not be so good for big social questions. Social scientists need to find a way not to be complicit in the new wave of struggle over the politics of method that is intrinsic to what big data brings.
When I first started working in computational social science, I kept overhearing conversations between computer scientists and social scientists that involved sentences like, “I don’t get it. How is that even research?” And I could not understand why. But then I found this quote by Gary King and Dan Hopkins, two political scientists, that, I think, really captures the heart of this disconnect: “[C]omputer scientists may be interested in finding the needle in the haystack — such as […] the right Web page to display from a search — but social scientists are more commonly interested in characterizing the haystack.”
But as Hanna Wallach writes in this great article in Communications of the ACM, the methodology is often instead: “Why not use these large-scale, social datasets in combination with the powerful predictive models developed by computer scientists”… and see what we get? (I guess you can replace “powerful predictive models” with any method that is non-standard for social scientists.) So “computational social science” has come to mean something slightly different from what it sounds like.
This article explores minimalist digital humanities pedagogy: strategies for teaching DH at institutions that don’t have many resources for doing so. Minimalist digital humanities pedagogy aims to maximize learning while minimizing stress, barriers of access, and time (for both instructors and students). This article considers how we can take a minimalist approach to course design, course websites, and DH project assignments. Throughout, it highlights how free, low-cost, and open-source tools can be used to help students increase their digital literacy, including their awareness of the ways technologies reproduce and challenge conditions of inequality. Such methods, I contend, can help students at a range of institutions develop digital skills both to navigate the world and to change it.
Within computing we have generally only focused on the wondrous and have ignored the terrifying or delegated its reporting to other disciplines. Now, with algorithmic governance replacing legal codes, with Web platform enabled surveillance capitalism transforming economics, with machine learning automating more of the labor market, and with unexplainable, non-transparent algorithms challenging the very possibility of human agency, computing has never been more deinon. The consequences of these changes will not be fully faced by us but will be by our children and our students in the decades to come. We must be willing to face the realities of the future and embrace our responsibility as computing professionals and academics to change and renew our computing curricula (and the worldview it propagates). This is the task we have been given by history and for which the future will judge us.
We reject the vague conceptualization of the discipline of ML as value-neutral. Instead, we investigate the ways that the discipline of ML is inherently value-laden. Our analysis of highly influential papers in the discipline finds that they not only favor the needs of research communities and large firms over broader social needs, but also that they take this favoritism for granted. The favoritism manifests in the choice of projects, the lack of consideration of potential negative impacts, and the prioritization and operationalization of values such as performance, generalization, efficiency, and novelty. These values are operationalized in ways that disfavor societal needs, usually without discussion or acknowledgment. Moreover, we uncover an overwhelming and increasing presence of big tech and elite universities in highly cited papers, which is consistent with a system of power-centralizing value-commitments. The upshot is that the discipline of ML is not value-neutral. We find that it is socially and politically loaded, frequently neglecting societal needs and harms, while prioritizing and promoting the concentration of power in the hands of already powerful actors.
On top of the strong hoarding instinct that I was apparently born with, I am a firm believer that inspiration comes from everywhere. This means that research, in my dictionary, is synonymous with trying not to drown in files and notes. Here is a playlist I am compiling for my kind of people.
What matters isn’t your writing software, it’s your file structures (sorry!) (Katherine Firth, Research Degree Insiders, 16 July 2020)
The morality of writing ‘well’ (Katherine Firth, Research Degree Insiders, 8 July 2021)
File not found: A generation that grew up with Google is forcing professors to rethink their lesson plans (Monica Chin, The Verge, 22 September 2021)
Why computing belongs within the social sciences (Randy Connolly, Communications of the ACM 63(8): 54–59, August 2020)
Report examines emerging field of computational social science (Ed Grover, NCRM, 27 October 2021)
A very helpful thread. Resonates with why I like using the metaphor of a “perpetual stew” in thesis writing workshops. 🍲
Ethics of studying illegal behaviour
- One hundred dollars and a dead man: Ethical decision making in ethnographic fieldwork (Vanderstaay, 2005)
- All in the name of research (Matthews, 2014)
- The gendered affordances of Craigslist “new-in-town girls wanted” ads (Schwartz & Neff, 2019)
- Consider also: How to avoid writing up the research in a way that would serve as a how-to manual for copycats
- See also: What’s in a (pseudo)name? Ethical conundrums for the principles of anonymisation in social media research (Gerrard, 2020)
- A guide to being an ethical online investigator (Basu, 2021) — The Capitol riot has inspired a new army of amateur sleuths who want to help identify protesters. How can you, an average person, be an ethical digital activist?
Ethics of researching on leaked data
- The OKCupid dataset: A very large public dataset of dating site users (crossposted 11 May 2016)
- Media discourses surrounding ‘non-ideal’ victims: The case of the Ashley Madison data breach (Cross, Parker & Sansom, 2018)
- Every deleted Parler post, many with users’ location data, has been archived (Cameron, 2021)
- See also: Using a fitness app taught me the scary truth about why privacy settings are a feminist issue (Spinks, 2017)
- See also: Fitness tracking app Strava gives away location of secret US army bases (Hern, 2018)
9 remote interviewing tips for journalists (Damian Radcliffe, 17 August 2020)
How to transcribe interviews like a pro (Nicholas Yarmey, 18 August 2020)
RT @noor_halabi Hello! I have done so much research and arrived at two different software. One is Microsoft streams (available through your institution’s Office 365). You can upload the video and wait for about 2 hours while it generates CC. You can then copy-paste the text, or download. Otter.ai also works, and so does Dragon I hear. (17 August 2020)
What is Qualitative Data Analysis Software? (Daniel Turner, 20 August 2020)
Beginner’s guide to coding qualitative data (Daniel Turner, 19 November 2019)
What is actually Grounded Theory? (Daniel Turner, 8 July 2016)
Writing up qualitative research (Daniel Turner, 25 August 2020)
Saw the following thread keep coming up in my timeline yesterday, and found it resonating, so this post is to archive it for my own re-reading. Why don’t people blog any more, instead of leaving these beautiful remarks in the ephemeral streams of microposts? But that is my problem, not theirs.
RT @DrSudaPerera My contract finally arrived. It took a threat to quit unless it happened, and 10 years of being undervalued and underpaid at 5 different institutions. This moment is not one of joy. It’s one of relief and finally having the security to actually rage against precarity. THREAD (5 June 2020)
Long-term precarity is exhausting & humiliating. It’s a constant conveyor belt of toxic productivity. It’s not being able to put roots in one place. It’s never being able [to] plan for the long-term because of fluctuating salaries and never having savings.
It’s never getting a promotion or pay rise and spending your salary on house moves and train fare. It’s constantly having to adapt to new systems, rules and procedures. Of living that “hellish first year” writing new courses again and again.
It’s not saying “no”, or calling out exploitation, in case you burn bridges or get ‘a name’. Being unable to say “I don’t know how to do that” and having to learn. It’s always showing goodwill in the hope that you’ll be in good stead when a permanent job comes up.
It’s the kick in the teeth when the job you’re basically already doing gets advertised as a permanent role and you’re not even shortlisted. It’s having to take that kick with good grace because you still need a job and there might be a next time (but ‘next time’ never comes).
Trying to get out of precarity means spending your free time applying for things. It’s the Kafkaesque feeling of being excluded from funding calls because your contract doesn’t last long enough, then having that held against you when you apply for permanent roles.
In precarity you document all the work you do as lines in your CV and displays of competence for your next job application, to keep being paid, so you can make your next rent. Your permanent colleagues put their work on their promotion applications for higher pay.
Precarity is having to hide your precarity from students and networks because it might undermine your expertise. It’s realising that your knowledge is judged on your position rather than what you know and you’ve got [to] fake it till you make it (knowing you might never ‘make it’).
I got out of precarity because my colleagues @SussexDev fought for me. Because my Head of School had the savvy to use the Uni trying to cut staff that weren’t “business critical” to frame me as such. I’m good value for money. The Uni accept my worth as a number not as a human.
I’m still cheap labour, but at least now I can make a fuss. For all those precarious staff still dealing with this shit, I will fight for you. I can’t promise I have the power for change, but now I at least have the power to try. And I’ll keep banging on about your disadvantage.
I’ll speak up for the hardships you face. I’ll explain that we don’t have diverse faculty because we don’t value diversity of experience. I’ll speak up about how valuable you are because you’ve taught widely, sat on committees, run social media accounts and written blogs.
I’ll promote the emotional labour of caring for students, of the pressure to be collegiate and never say ‘no’. I’ll argue that your experience shows versatility and adaptability not a lack of expertise. That your string of shitty contracts shows resilience not mediocrity.
I’ll call out when permanent staff don’t value that and get seduced by research superstars who never contribute anything that doesn’t advance their careers. I only got to this point because some really great permanent staff did it for me, and my HoS & HoD used their power for me.
So today, as I sit with my new permanent contract, I’m not going to celebrate my ‘success’. I’m going to reflect on how much it took out of me to get here. I’m going to remain angry I had to go through it. I’m going to acknowledge my new privilege, and use it as best as I can. ❤
Since we are on the topic of a “constant conveyor belt of toxic productivity” in academia, here are a few more threads that have weighed on me.
- @ryancordell on the vicious circle of academic overwork (13 July 2018)
- @sophiephilpott1: “Y’know how on the London Underground you’re never more than 10ft from a rat? In Higher Education you’re never more than 2 years from a restructure.” (4 July 2019)
- @zeyneparsel on academia as third shift (15 January 2020)
- @wishcrys on microaggressions against junior, women, and PoC scholars (22 May 2020)
- @LizWFab: “Academia is feeling guilty for doing 4 hours of work on the weekend instead of taking time off and also feeling guilty for *not* doing 16 hours of work on the weekend instead of taking time off.” (27 September 2020)