New student, new project!
Cross-posted from the Helical project blog.
I want to extend a belated welcome to Zixuan (Jason) Yu, a Northeastern University undergraduate student who is working with me on a research co-op through December 2025. Jason's project focuses on identifying elements of the Mastodon code base where we might want to intervene (in order to answer a research question) or where there may be associated privacy considerations.
Jason's project combines goals from the Privacy Narratives project and the Helical project. He will be posting here regularly, but before then, let's discuss the connection between privacy and experimentation.
As Donald Campbell wrote in Methods for the Experimenting Society,
[Social p]rograms are ... continued or discontinued on inadequate ... grounds[,] ... due in part to the inertia of social organizations, in part to political predicaments which oppose evaluation, and in part to the fact that the methodology of evaluation is still inadequate.
Mastodon, as both a software platform and a collection of communities, has less of this inertia. We can think of each Mastodon instance as a little society. This multiplicity and diversity could present incredible opportunities for empowering citizen social scientists. Insofar as computing systems can be made to reduce friction around experimentation, Mastodon's position as a FOSS platform seems ripe with opportunity. Unfortunately, in this context, Campbell's vision for an experimenting society may sound a bit like moving fast and breaking things: an ethos that many Fediverse communities reject.
This is NOT what we want! At first blush, it would seem that a notion of trust is missing from Campbell's essay. On closer inspection, however, we find trust's cousin, participant/citizen privacy. Campbell's mentions of privacy focus on scenarios where participants might be unwilling to disclose information to the researcher or research organization; today there are many more parties that may violate trust. We would argue that a violation of trust is the primary harm --- or at least the primary perceived harm --- that experimentation in social networks can cause, and that privacy therefore cannot be treated as a concern separate from trust.
It is with this context in mind that Jason will be identifying possible intervention points and scenarios that could cause privacy vulnerabilities in the Mastodon code base.