A mess of a girl, free on the internet. A spicy meatball indeed :3

  • 0 Posts
  • 10 Comments
Joined 8 days ago
Cake day: March 16th, 2026


  • However, this is not entirely true either, for two reasons.

    1. Philosophical: FOSS relies on the “many eyes” approach to security. Adding any API, even an internal one, adds another layer of risk. This is exactly why some projects refuse to offer API access to application data, even from a privileged forked service, using locked sockets or other methods instead.

    Any open port is an attack vector, and no matter how secure it is today, tomorrow is not a promise. More so given how this overlaps with laws like Australia’s, which require encryption to provide a backdoor for government access. (This means the Five Eyes nations get access to this API’s data by definition while it’s in transit, as soon as it leaves the host system…)

    But that’s not the only issue. The xz (liblzma) backdoor being a case of nation-state sabotage proved that it’s possible to put backdoors into applications despite “many eyes” on the code. (That case was only caught because one obsessive person noticed a slowdown while /testing/… 90% of such attempts in most projects would go unnoticed, simply because there are not enough maintainers.)

    2. Licensed software: not all applications are completely open, even if the underlying OS is. This is an API that’s exposed to all userland applications. Nothing stops Firefox, for example, from using binary blobs in their source to “sign” this data for supporting websites, then sending it to places you don’t consent to.

    Firefox is just an example; many applications use permissive licenses that don’t require all of the source code to be human-readable or even accessible.

    The big thing is that nothing stops driver vendors from stealing this data too, no different from what Microsoft does whether or not you are signed into a Microsoft account on Windows. Telemetry is already a growing issue, and the scope of telemetry data in closed-source blobs doesn’t have to be defined…


    So by definition it’s not any more secure…

    Even if it were, the bigger question is why. Why does the application or web service need to know?

    If a child walks into a liquor store and steals alcohol, they get arrested. The burden of proof was never on the liquor store. Why is the burden of proof on the OS and not the parent or child?

    We don’t need nanny software that teaches kids to be better liars. We need stronger punishments for criminal actions regardless of age, and more importantly, punishments for the parents for allowing it to occur. Babygating the entire OS for someone else’s children, who would never legally touch it, is an example of creating solutions for a problem YOU (parents/government) created.

    All of these age laws came from the social media bans, which only came into existence as a means of data collection… Non-compliance is actually compliance with how they are written, as they all place the burden of proof on you. No evidence == no crime. It’s still a crime to lie about your age to access age-restricted content.
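    The “locked sockets” alternative to open ports mentioned above can be sketched in Python. This is a minimal, hypothetical illustration (the socket path and permissions are mine, not any particular project’s design): a Unix domain socket lives on the filesystem, so ordinary file permissions decide who may connect at all, unlike a TCP port that any local process can poke.

    ```python
    import os
    import socket
    import stat
    import tempfile

    # Hypothetical socket path; a real service would likely use /run/<name>.sock.
    sock_path = os.path.join(tempfile.mkdtemp(), "service.sock")

    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(sock_path)
    # Lock the socket down to the owning user only. The kernel enforces
    # this at connect() time, before any application code runs.
    os.chmod(sock_path, 0o600)
    server.listen(1)

    # A same-user client can still connect and exchange data as usual.
    client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    client.connect(sock_path)
    conn, _ = server.accept()
    client.sendall(b"ping")
    data = conn.recv(4)
    print(data)  # prints b'ping'
    ```

    No network port is ever opened, so there is nothing for a remote scanner to find; the attack surface is reduced to local processes that already pass the filesystem check.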


  • can’t expose what doesn’t exist. if a site asks for age verification, stop using it and find an alternative that respects privacy.

    if your OS starts asking for your identity information to share with everyone that pokes an open API, it’s time to jump OSs…

    privacy is something you make a best effort at: avoid creating obvious methods of exploiting it. you don’t share your credit card number in every email just because the sender says they can’t prove it was your card on file, do you?

    identity is personal online; no one is entitled to it unless you choose to share it. otherwise it’s an invasion of privacy.

    want to protect kids? educate them and keep them off tiktok/social media by fining the parents when they are identified.

    fined twice? treat the kid like you would any other time a theft of service occurs.