Considering User Permissions in the Context of a Conference
Developing rules along the Axis of Amplification
I’ve been thinking a lot about how amplification seems to be core to many of the woes we have online and what we could do about it, especially as social platforms with federation or distribution could become more versatile.
It has become increasingly common to hear variants on the phrase ‘you have the right to freedom of speech but not freedom of amplification.’ As danah boyd put it in 2018: “Choosing what to amplify is not the same as curtailing someone’s ability to speak.”
It isn’t so much the content of fake news that causes harm as how it is distributed. Social platforms have operated on the rule that while all amplification may not be equal, all amplification is good and should propagate with speed.
So what happens when we take, as the starting point, a limited-use social network? I have been fascinated by the idea of being able to spin up a single-use, timeboxed network for a particular event, allow people to interact with it over the course of that event in a setup tied to in-person attendance, and then move an archived version to the web as a sort of conference-proceedings-style tool after the event. I want to re-imagine user permissions in the context of an actual physical event, and to consider amplification not just within the system but after the proceedings are taken public. Let’s think differently about how social media can be used.
Like most of my side projects, I bit off more than I can chew, so while I’m still working through the code I figured it might make sense to write up my thoughts and ask for feedback.
So, for the purpose of this we’re considering:
How do we emulate the social mechanics of a well-designed, supportive, and safe conference space where people trust each other? As a baseline, I’m building on SRCCON, which is very good at this and has in the past provided users with in-person permissioning. The badges had colored lanyards, and each color means something: green - my image in this space can be freely recorded and shared; yellow - you must ask permission to record and share my image; red - do not record or share my image.
How do we give users the capability to operate on the system in confidence that their requested permissions will be respected while still leaving the space open enough for users to freely share?
How do we retain the data so people can learn from automatically generated conference proceedings without violating users’ preferences, especially as interaction with the open web increases the chance their work might be seen by actors who do not respect the rules of the system? We also want to attribute credit as accurately as possible.
All participants will, like in a well-structured conference, be presented with these rules as part of a code of conduct and consent to it. Because we’re dealing with a limited group invited into the space, I think as long as the rules are clear we can make this assumption.
Our goals in this experiment are to allow users to contribute and converse freely and openly within the bounds of the conference while giving them control over who gets to see and amplify what they have to say on this platform. Users can expect their work to be handled one way during the conference and another way after it. The goal is to preserve useful information and educational content without forcing users to have their names forever associated with it in a way that might make them less open to talking or sharing during the conference. (In this consideration, the conference has strong conduct rules around what is on and off the record.)
Let’s start with the type of settings a user can control, both for their profile as defaults but also on individual posts or pieces of content (like images) where applicable:
A: Possible Amplification
All Posts and Comments may be Re-shared into other contexts or embedded from the Archive. Content is exportable by others for reuse/remix. All content marked CC-BY by default.
No Posts or Comments may be Re-shared Except for Trusted Users and on Request. The Archived versions will not be embedded unless a request is confirmed by the user. Content is not exportable. Access to the capability to re-share may be revoked on an individual basis.
No Posts or Comments may be re-shared or embedded. Content is not exportable.
B: Content Visibility
All Posts, Images and Comments visible.
All Posts and Images visible; Comments and Images within them visible only to the Top Poster, Admins, Trusted Users, and on Request.
All Posts public, but the user’s comments, images, and comment threads are visible only to Trusted Users, Admins, and anyone the user actively grants permission. Other users may not request access from the user.
Posts are prompted to be made public but by default are visible only to Trusted Users, Admins, and approved requesters. Comments are visible only to Trusted Users, Admins, and approved requesters. Other users may not request access from the user.
C: Content Permanence
All content retained in Archive with user’s name.
All content retained in Archive, but the user’s name will be removed and replaced with an anonymous ID unless the user specifies otherwise on individual Posts and Comments. (Other users will see a signal for this on posts throughout if set.)
No content retained in Archive. (Other users will see a signal for this on posts throughout if set.)
D: Profile Visibility
User Profile publicly visible and can be archived.
User Profile visible to Trusted Users and on Request.
User Profile never visible to any but Admins.
E: Profile Permanence
User Profile will be in Archive and can be exported by others and is publicly visible.
User Profile will be in Archive but not exportable by others.
User Profile will be in Archive but not publicly visible, instead assigned an anonymized ID code that only the user will retain.
User Profile will not be in Archive.
F: Personal Moderation Level (Admin Moderators may intervene at any level)
All Posts and threads you create (unless the thread is on a post with a different permission level) are open to real time comments without pre-moderation.
All Posts and threads you create are open to real-time comments by Trusted Users without pre-moderation; other users must have their first Post moderated in a thread you control, and then will be allowed to Post in real time by default.
All Posts and threads you create will be fully moderated. You must approve any Posts on those threads regardless of whether the user is trusted.
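The six axes above can be captured in a small data model. This is only a sketch; the field names, numeric levels, and the `allows_embed` helper are my own assumptions layered on the lists above, not part of any existing implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PermissionProfile:
    """One user's settings across the six axes (hypothetical model)."""
    amplification: int       # A: 1 = open re-share, 2 = on request/trusted, 3 = never
    visibility: int          # B: 1..4, increasingly restricted
    permanence: int          # C: 1 = named archive, 2 = anonymized, 3 = purged
    profile_visibility: int  # D: 1 = public, 2 = on request, 3 = admins only
    profile_permanence: int  # E: 1..4, public archive through not archived
    moderation: int          # F: 1 = open, 2 = first-post gate, 3 = fully moderated

    def allows_embed(self) -> bool:
        # Only amplification level 1 permits open embedding from the Archive.
        return self.amplification == 1

    def is_anonymized_in_archive(self) -> bool:
        # Permanence level 2 replaces the user's name with an anonymous ID.
        return self.permanence == 2
```

Encoding each axis as a small integer keeps the packaged user levels later in this piece trivially expressible as tuples.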
Ok. That is a lot. A system goal is to let users easily set these preference levers in a way that matches how they are likely to interact with the system.
To do this, we’re going to package the permission settings into clear user levels that people can select on joining the system.
Green User - I Am In Public
Your content may be freely re-shared in the system.
Your posts are viewable to all.
Your threaded comment responses are viewable to all.
All posts and threads you create are open to real-time commenting.
Your user profile is public to all.
All uploaded assets are free to reuse by others under CC-BY.
All your posted content is exportable by others.
Your user profile is exportable by others.
Your posts may leave the intranet for the archive to live on the open web at conference end.
(A:1, B:1, C:1, D:1, E:1, F:1)
Yellow User - Please Ask Me For Permission
Your content can only be re-shared within the system with your active consent for each re-share or by users you consider trusted.
Your base posts are viewable to all but do not have real-time comments active except for trusted users.
Your threaded responses are only available to your trusted users and the owner of the parent thread.
Your profile is private but other users may request access.
Your uploaded assets are available on approved request. Approved users may embed your content, but each embed may be revoked on a per-case basis or overall. Availability is subject to a custom CC license type.
Your posted content is only available for export on approved request. You may freely export your own content.
Your user profile must be requested for export.
Your personal data will be purged at conference end, but any content you chose to make public will remain visible, stripped of identifying information.
(A:2, B:2, C:2, D:2, E:3, F:2)
Red User - I Am Private
Users may not re-share your content across the intranet.
Your base posts are viewable by all, but you will be prompted to double-check whether you wish to make them private. All posts and comments will be marked as intended to purge at the end of the event.
Your threaded responses are only visible to the top level user in that thread and your Trusted Users. Access must be actively given and cannot be requested. Your response posts will be marked as intended to purge at the end of the event.
Your profile is private to all but system administrators and may not be requested.
Your assets will be purged at the end of the event and can be exported only by you.
Your profile is only exportable by you.
All content, posts and profiles will be deleted from the system at the end of the event.
(A:3, B:3, C:3, D:3, E:4, F:2)
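The three packaged levels map directly onto the axis tuples given in parentheses above. A minimal lookup might look like this; the dictionary and function names are illustrative, not an existing API.

```python
# Preset tuples are (A, B, C, D, E, F), copied from the level summaries above.
PRESETS = {
    "green":  (1, 1, 1, 1, 1, 1),  # I Am In Public
    "yellow": (2, 2, 2, 2, 3, 2),  # Please Ask Me For Permission
    "red":    (3, 3, 3, 3, 4, 2),  # I Am Private
}

def preset_for(lanyard: str) -> tuple:
    """Return the packaged permission tuple for a lanyard color."""
    return PRESETS[lanyard.lower()]
```

A user picks one level on joining, and the system then applies the corresponding defaults, which they can still override per post where the axes allow it.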
Ok, hopefully these user levels make sense!
I envision these rules applied to two connected systems: a Twitter-style system adapted from a technical model similar to Mastodon’s, where Posts are basically Tweets/Toots, and a forum layer where users (or those with permission to amplify) can escalate those Posts into full threaded conversations with support for unlimited blocks of text.
We’d have User Profiles and at the end of the conference the system would be moved from an intranet in the physical conference space to an internet website that would be locked in a static state.
Any individual user would be able to export all their content in a usable package at any time and tell the system which location they would like to consider canonical if they intend to host it themselves. These individual exports would include any threads the user started, and user IDs would be treated the same way as in the archive.
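A user export could be as simple as a JSON bundle of the user’s own posts plus threads they started, with the canonical location recorded. This is a sketch under assumed field names (`author`, `thread_starter`), not a defined format.

```python
import json

def export_user_package(user_id: str, posts: list, canonical_url: str = None) -> str:
    """Bundle a user's own posts, and posts in threads they started,
    into a portable JSON package (hypothetical schema)."""
    owned = [
        p for p in posts
        if p.get("author") == user_id or p.get("thread_starter") == user_id
    ]
    package = {
        "user": user_id,           # would be the anonymous ID if the user chose C:2
        "canonical": canonical_url,  # where the user intends to self-host, if anywhere
        "posts": owned,
    }
    return json.dumps(package, indent=2)
```

Recording the canonical URL in the package would let the static archive later point outward to a self-hosted copy instead of (or alongside) its own.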
To enforce the privacy levels above, users would have their profile names cleared and replaced with an anonymous ID, and they would receive an encryption key. They may also choose to leave their profile in place and keep their name on some posts but not others; the anonymous ID is applied only to the selected posts. The user’s real name would be purged from any association with the anonymous ID. Users who wish to limit embedding rights would be able to grant embeds to others on request by using their key to sign their approval of such requests. (This part, admittedly, is the most in need of additional technical thinking.) The other thing this key would enable is proof of authorship: a user could show that their key produces the anonymous ID on a particular post, if they ever wish to do so.
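One way to make the anonymous-ID-plus-key idea concrete is a keyed hash: the ID is derived from a secret only the user keeps, so the archive stores the ID but can never recover the name, while the user can later re-derive the ID to prove authorship. This is only one possible construction, offered as a sketch.

```python
import hmac, hashlib, secrets

def issue_anonymous_id(real_name: str):
    """Generate a secret key for the user and derive an anonymous ID
    from it. The system stores only the ID; the key goes to the user."""
    key = secrets.token_bytes(32)
    anon_id = hmac.new(key, real_name.encode(), hashlib.sha256).hexdigest()[:16]
    return anon_id, key

def prove_authorship(real_name: str, key: bytes, anon_id: str) -> bool:
    """Later, the user re-derives the ID from their key; a match shows
    the anonymized posts were theirs."""
    expected = hmac.new(key, real_name.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(expected, anon_id)
```

The signed embed approvals mentioned above would need an actual signature scheme (e.g. a public/private keypair) rather than this symmetric construction; that is exactly the part that needs more technical thinking.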
Only those posts that have the proper permissions set would have embed buttons in the archive and all others would have prominent notices making it clear that they should not be re-shared without permission. To aid in this we’d have clear CC-style rights notices.
My hope here isn’t that this system is the be-all and end-all of social media. As I said, I’m thinking of it specifically for time-limited uses, to whatever extent that can be accomplished. Perhaps this isn’t the right format even for that. I’m hoping people will be interested and, if they see problems, object! Perhaps this slow side project will not be completed before someone builds a better version of the idea.
Whatever happens, I think it is interesting to think through how to design a social network with substantially different goals for users than current systems: one whose design lets users set their exposure to the public in different contexts and control who can amplify them. Users should have a clear option to leave up content they think might have value without having to tie themselves to that content forever. I hope the system still values and helps with sharing and conversation.
We are often trapped by particular assumptions about how social media should work. We don’t need to be. If this inspires you to write another totally different approach I’d like to hear it!
Further Reference:
5 Lessons for Reporting in an Age of Disinformation by Claire Wardle
Here’s what we are doing with Galley, our discussion forum app by Mathew Ingram
How to run a small social network site for your friends by Darius Kazemi
Reading ActivityPub by Darius Kazemi
The Coral Project’s extensive public research
Heather Gold’s presentations on Tummeling
Interested in talking more about this newsletter? Join the Keybase.io team to chat about it: https://keybase.io/team/gneist.newsletter
Photo Credits:
Street Speaker (Used under CC BY-SA)