This is a nice overview of this absurd situation, but Tim Bray’s conclusions are a little surprising to me.
Yes, Mastodon traffic either is already or soon will be captured and filed permanently as in forever in certain government offices with addresses near Washington DC and Beijing, and quite likely one or two sketchy Peter-Thiel-financed “data aggregation” companies. That’s extremely hard to prevent but isn’t really the problem: The problem would be a public search engine that Gamergaters and Kiwifarmers use to hunt down vulnerable targets.
Here Bray appears to be missing the fact that those people will often end up with access to those Thiel-financed private intelligence services that will have the full-text search, while the rest of us won’t. Making things public and pretending they’re private by shunning search effectively lobotomizes everyone who abides by this custom, while still allowing the worst people to have the capability (and not only the ones working in state intelligence agencies).
What success looks like: I’d like it if nobody were ever deterred from conversing with people they know for fear that people they don’t know will use their words to attack them. I’d like it to be legally difficult to put everyone’s everyday conversations to work in service to the advertising industry. I’d like to reduce the discomfort people in marginalized groups feel venturing forth into public conversation. (emphasis mine)
This is a conflation of almost entirely unrelated issues. The first half of the first sentence is about non-public conversations. The solution there is obviously end-to-end encryption, so that even the servers involved can’t read the messages, plus protocols and applications that don’t make it easy for users to accidentally make private things public (ActivityPub was designed for publishing, not for private communication, so it is unlikely to ever be good at this). The second sentence is about regulating the ad industry… ok, cool, an agreeable non-sequitur. But the last sentence is about public conversation, and in the context of the second half of the first sentence it carries the strong implication that Bray entertains the fantasy that conversation can somehow be public and yet uninhibited by “fear that people they don’t know will use their words to attack them”.
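To make the contrast concrete, here is a minimal sketch of the property being argued for, using PyNaCl (libsodium) purely as an illustration rather than anything ActivityPub or Mastodon actually specifies: with end-to-end encryption the server only ever relays ciphertext, so no instance admin, crawler, or search index can read the message.

```python
# Minimal sketch, assuming PyNaCl (libsodium). Illustrative only; not a
# description of any existing ActivityPub or Mastodon mechanism.
from nacl.public import PrivateKey, SealedBox

# The recipient generates a keypair; only the public half is ever shared.
recipient_key = PrivateKey.generate()

# The sender encrypts against the recipient's public key before anything
# leaves their device. This ciphertext is all the server ever sees.
ciphertext = SealedBox(recipient_key.public_key).encrypt(
    b"meet at the usual place, 7pm"
)

# Only the holder of the matching private key can recover the plaintext.
plaintext = SealedBox(recipient_key).decrypt(ciphertext)
assert plaintext == b"meet at the usual place, 7pm"
```

That is what “private” means in a technically enforceable sense; everything short of it, including shunning search, is a social convention that only binds people who choose to honor it.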