Synopsis: The article discusses the FBI’s seizure of a Mastodon server’s database and emphasizes the need for privacy protection on decentralized platforms like the Fediverse. It calls on hosts to implement basic security measures, adopt policies that protect users, and notify them of law enforcement actions. Users are encouraged to evaluate a server’s precautions and voice concerns. Developers should prioritize end-to-end encryption for direct messages. Overall, the Fediverse community must prioritize user privacy and security to create a safer environment for all.

Summary:

Introduction

  • We are in an exciting time for users wanting to regain control from major platforms like Twitter and Facebook.
  • However, decentralized platforms like the Fediverse and Bluesky must be mindful of user privacy challenges and risks.
  • Last May, the Mastodon server Kolektiva.social was compromised when the FBI seized all electronics, including a backup of the instance database, during an unrelated raid on one of the server’s admins.
  • This incident serves as a reminder to protect user privacy on decentralized platforms.

A Fediverse Wake-up Call

  • The story of equipment seizure echoes past digital rights cases like Steve Jackson Games v. Secret Service, emphasizing the need for more focused seizures.
  • Law enforcement must improve its approach to seizing equipment and should only do so when relevant to an investigation.
  • Decentralized web hosts need to have their users’ backs and protect their privacy.

Why Protecting the Fediverse Matters

  • The Fediverse serves marginalized communities targeted by law enforcement, making user privacy protection crucial.
  • The FBI’s seizure of Kolektiva’s database compromised personal information, posts, and interactions from thousands of users, affecting other instances as well.
  • Users’ data collected by the government can be used for unrelated investigations, highlighting the importance of strong privacy measures.

What is a decentralized server host to do?

  • Basic security practices, such as firewalls and limited user access, should be implemented for servers exposed to the internet.
  • Limit data collection and storage to what is necessary, and stay informed about security threats in the platform’s code (a minimal retention-pruning sketch follows this list).
  • Adopt policies and practices to protect users, including transparency reports about law enforcement attempts and notification to users about any access to their information.
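
  One concrete way to apply the “keep only what you need” advice is a scheduled job that prunes old IP and session logs. The sketch below is hypothetical: the session_logs table and column names are invented for illustration, and real instances typically run PostgreSQL rather than SQLite, but the idea carries over.

      import sqlite3
      from datetime import datetime, timedelta, timezone

      # Hypothetical schema: a "session_logs" table holding IP addresses and
      # user agents. The table and column names here are made up.
      RETENTION_DAYS = 30

      conn = sqlite3.connect("instance.db")
      cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()

      # Anything older than the retention window simply ceases to exist,
      # so it can never be handed over or seized later.
      conn.execute("DELETE FROM session_logs WHERE created_at < ?", (cutoff,))
      conn.commit()
      conn.close()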

What can users do?

  • Evaluate a server’s precautions before joining the Fediverse and raise privacy concerns with admins and users on the instance.
  • Encourage servers to include privacy commitments in their terms of service to resist law enforcement demands.
  • Users have the freedom to move to another instance if they are dissatisfied with the privacy measures.

What can developers do?

  • Implement end-to-end encryption of direct messages to protect sensitive content (see the sketch after this list).
  • The Kolektiva raid highlights the need for all decentralized content hosts to prioritize privacy and follow EFF’s recommendations.
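
  The essential property of end-to-end encrypted DMs is that the server only ever stores ciphertext it cannot read. Below is a minimal sketch of that idea using PyNaCl’s Box construction; it is illustrative only and is not how Mastodon or Lemmy handle direct messages today.

      from nacl.public import PrivateKey, Box

      # Each user holds their own keypair; the server never sees private keys.
      alice_key = PrivateKey.generate()
      bob_key = PrivateKey.generate()

      # Alice encrypts a DM to Bob's public key.
      sending_box = Box(alice_key, bob_key.public_key)
      ciphertext = sending_box.encrypt(b"meet at the usual place")

      # The server stores and relays only the ciphertext; a seized database
      # would contain nothing readable without the recipients' private keys.
      receiving_box = Box(bob_key, alice_key.public_key)
      assert receiving_box.decrypt(ciphertext) == b"meet at the usual place"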

Conclusion

  • Decentralized platforms offer opportunities for user control, but user privacy protection is vital.
  • Hosts, users, and developers must work together to build a more secure and privacy-focused Fediverse.
Comments

  • EatMyDick@lemmy.world · 11 months ago

    I have been laughed at and down voted every single fucking time I point out how woefully unprepared every fucking instance is.

    The free model is flawed and will fail every fucking time a server shows any sign of getting popular. And users aren’t going to tolerate moving fucking servers every month.

    You think Cloudflare is going to keep protecting lemmy.world each week on their free/professional tier? Enterprise starts at $20k a year before traffic; good luck raising that kind of yearly money on a hobby server.

    And then there are GDPR and CCPA, both of which are being ignored and clearly not enforced, just waiting for a lawsuit.

    Oh, and do I need to explain to you people the child porn reporting mechanisms that need to be in place?

    The only way this bullshit succeeds is if someone starts a nonprofit, e.g. a Mozilla-style foundation, and acts like a functioning adult running a business rather than a 16-year-old tinkering with Linux.

    Bring on the downvotes and copium.

    • Kayn@dormi.zone · 11 months ago

      You bring up valid points, but you are being very antagonistic towards server admins in the process. I get that you’re frustrated by being dismissed all the time.

    • deafboy@lemmy.world · 11 months ago

      If we want the ecosystem to be resilient, we need to migrate to a model where:

      1. The data is redundant in a way that matters. Yes, I know the posts are currently replicated, but if the primary replica is gone, the usefulness of the copies is limited.
      2. The identities are not tied to a provider

      NOSTR does this (see the keypair sketch after this comment), AND provides an incentive for keeping the content online: you simply pay one, or even multiple, relay operators for keeping your data online. However:

      1. NOSTR client UX currently sucks even more than lemmy/mastodon
      2. There is no useful content whatsoever. They’re in the “only political extremists use this” phase at the moment.
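
      To make the “identity not tied to a provider” idea concrete, here is a rough sketch of the NOSTR-style model: the identity is just a keypair, and any relay or reader can verify a signed note without trusting the host that stored it. Ed25519 from the cryptography package is used purely for illustration; NOSTR itself uses secp256k1 Schnorr signatures and a specific event format.

          from cryptography.hazmat.primitives import serialization
          from cryptography.hazmat.primitives.asymmetric import ed25519

          # The keypair *is* the identity; no server account is involved.
          signing_key = ed25519.Ed25519PrivateKey.generate()
          public_key = signing_key.public_key()

          note = b'{"content": "hello fediverse", "created_at": 1700000000}'
          signature = signing_key.sign(note)

          # Any relay (or reader) can verify the note came from this identity;
          # verify() raises InvalidSignature if the content was tampered with.
          public_key.verify(signature, note)

          portable_id = public_key.public_bytes(
              encoding=serialization.Encoding.Raw,
              format=serialization.PublicFormat.Raw,
          )
          print("portable identity:", portable_id.hex())
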
    • GONADS125@lemmy.world · 11 months ago

      You have great points, I agree, and it’s why I donate to support lemmy.world. I’m hoping that enough people will donate small amounts that it will cumulatively enable the server admins to better protect the instance. Basically like Wikipedia’s funding model.

      Maybe it’s not realistic, but given that we all gave enough of a shit to start anew on Lemmy, I’m hoping a decent percentage of the userbase will be more likely to donate than is typically the case on online platforms.

      I guess time will tell the future of lemmy and the main instances.

      Edit: Here are the donation pages:

      • EatMyDick@lemmy.world · 11 months ago

        Wikipedia is run by a central NGO, which is something I’ve advocated for. What we have now, and what folks are discussing, isn’t a sane model like the one you propose. People really believe this place isn’t going to have serious child porn, disinformation, and censorship issues without someone competent taking over the main policing and privacy concerns.

    • nomadjoanne@lemmy.world · 11 months ago

      I think part of the problem is that laws in the developed world make it extremely expensive to run one of these services once you have a lot of users per month.

      The heart of the issue is that at some point it becomes more useful for mega-corporations to have a cozy relationship with the government than with you. It used to be that if a service found child porn on their service, the law simply required them to remove it and report it to the police. Very reasonable.

      The thing is though, if that is all the compliance one needs to follow, then the creation of new firms and services is quite easy. Mega-corporations don’t like this. They want to slow the creation of new services and firms because this slows the appearance of new competition. Hence they become pro-regulation, and, I’d argue, attempt to shift the entire culture towards paranoia and a demand for more regulation.

      Perhaps the only defense is to stay small. Obviously don’t allow any abusive or illegal content. But stay small so that you can skirt by without having to deal with compliance with the big-boy regulations.

      • EatMyDick@lemmy.world · 11 months ago

        Laws + costs of a server. Cloudflare is 100% in talks warning lemmy.world they aren’t going to support them for free/$20/month.

        I love how you dismiss the compliance as all you need. As if it isn’t a crazy topic that requires a lawyer every other day plus hiring a team and creating a process to deal with child porn shit.

        None of you know half the reality of running successful digital services.

        • nomadjoanne@lemmy.world · 11 months ago

          I love how you’re an asshole for no apparent reason. We both like this place and are on the same team, even if we disagree about some things.

          But, in all seriousness, I really have the feeling that you are approaching this from the standpoint of a lawyer or someone on the marketing team of a large corporation. Of course a service like lemmy.world, or any of the larger instances, should consult with a lawyer at some point if they haven’t already. But this is not a mega-corporation, and I don’t think many people in Lemmy apart from you have any intention of running it like one.

          Of course these services cost money to run and protect. No one is saying it’s free. To give a similar example, some of the largest Invidious instances blow through several terabytes a day, so they are very much dependent on donations. We should all try and chip in if we are able.

          • archomrade [he/him]@midwest.social · 11 months ago

            This person honestly just sounds frustrated with the idealism of a not-for-profit social media alternative. Their concerns have some validity, but to suggest that it can’t work without following a paid or ad-supported model is a little dogmatic in my view.

  • BrikoX@lemmy.zip · 1 year ago

    Original post: https://kolektiva.social/@admin/110637031574056150

    Important context missing from the EFF article is that, according to the admins, the Mastodon instance wasn’t the target of the raid.

    In mid-May 2023, the home of one of Kolektiva.social’s admins was raided, and all their electronics were seized by the FBI. The raid was part of an investigation into a local protest. Kolektiva was neither a subject nor target of this investigation. Today, that admin was charged in relation to their alleged participation in this protest.

  • phx@lemmy.ca · 11 months ago

    Interesting that there’s no mention of encryption at rest (disk encryption), which is something I’d recommend for servers in general.

    • LedgeDrop@lemm.ee · 11 months ago

      I’m curious: how would you do this in a way that doesn’t come at the expense of your high availability?

      Whether the server is on-prem or in the cloud, if the system crashed or rebooted, how would you decrypt (or supply the passphrase for) the encrypted drive? The likelihood of a kernel crash or a reboot after an update is higher than an FBI raid, and it would get tiresome to have the site down while we wait for Bob to wake up, log in, and type the passphrase to mount the encrypted HDD.

      You could use something like HashiCorp Vault, but it isn’t perfect either (a rough sketch follows this comment). If the server were rebooted, it could talk to Vault and request the passphrase automatically, but this also means the FBI could “plug in” the seized server at their leisure and have it re-request the passphrase. And if Vault itself were restarted, there’s quite a process to unseal (unlock) a vault, so it would be as cumbersome as typing in the passphrase on every reboot.

      My point/question is: yes, encryption is conceptually easy, but if you look at the whole lifecycle and workflow it’s much more complicated, and you (as an administrator) might ask yourself, “does this complexity improve anything or actually protect my users?”
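
      For what it’s worth, the automated-unlock flow described above might look roughly like the sketch below using the hvac client. The Vault URL, token handling, and secret path are all hypothetical, and the closing comment is the whole tradeoff.

          import hvac

          # Hypothetical setup: the disk passphrase lives in Vault's KV v2 engine
          # at secret/kolektiva/disk, and this host holds a token allowed to read it.
          client = hvac.Client(url="https://vault.example.net:8200", token="s.xxxxxxxx")

          secret = client.secrets.kv.v2.read_secret_version(path="kolektiva/disk")
          passphrase = secret["data"]["data"]["passphrase"]

          # The passphrase could now be fed to something like
          #   cryptsetup open /dev/sdb1 data --key-file -
          # to bring the encrypted volume up unattended after a reboot. The tradeoff:
          # anyone who powers the seized box on somewhere it can still reach Vault
          # gets the same passphrase, so revoking this host's token after a seizure
          # matters as much as the encryption itself.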

      • zmej420blazeit@lemmy.world · 11 months ago

        Encrypting user data is pretty standard in the industry, and even required by law in the case of servers hosting medical information in the US. Consumer disk encryption software like you mentioned is substantially different from the encryption solutions usually employed by data centers; whole-disk encryption is commonly done at a firmware or hardware level. For example, iPhone embedded storage is fully encrypted and tied to the rest of the phone’s hardware, with no user input required.

        It wouldn’t have mattered if the guy had encryption anyway because, as the article mentioned:

        To make matters worse, it appears that the admin targeted in the raid was in the middle of maintenance work which left would-be-encrypted material on the server available in unencrypted form at the time of seizure.

        • Auli@lemmy.ca · 11 months ago

          Where does HIPAA state that medical data must be encrypted on the machine? I’m not an expert on HIPAA, but I don’t remember seeing that when I looked at it before.

  • iarigby@lemm.ee · 1 year ago

    I actually have a question about this: can’t anyone already see the posts and users’ data? Even a simple user account or script can query most things, like posts and comments, and you can indirectly get at less easily available things, like upvotes, by compromising any connected server.
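
    For the public part of that data, no compromise is even needed. The sketch below reads a Mastodon instance’s public timeline with no credentials at all, using the standard /api/v1/timelines/public endpoint (the instance URL is just an example). The seizure concern is the non-public data, such as account emails, IP logs, and DMs, which this API does not return.

        import requests

        # Public posts are readable by anyone, no account or token required.
        resp = requests.get(
            "https://kolektiva.social/api/v1/timelines/public",
            params={"limit": 5},
            timeout=10,
        )
        resp.raise_for_status()

        for status in resp.json():
            print(status["account"]["acct"], "-", status["created_at"])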

    • radix@lemmy.world · 1 year ago

      Disclaimer: I’ve never run a Mastodon or similar server, so the software may have more privacy built in, but potentially the issue would be account setup information that could be associated with public posts. Email addresses, IP address logs, etc. Those would be critical in matching public “anonymous” speech with real-world identifiable information.