Sok Puppette’s Techdirt Profile

sokpuppette

About Sok Puppette




Sok Puppette’s Comments

  • Feb 17th, 2022 @ 5:42pm

    Re: Re: There's more than one kind of knowledge

    I'm not saying the "platform owners" should say that. I'm saying EVERYBODY should say it.

    The problem with asking them to come up with suggestions is that they WILL. And they will claim that their suggestions are workable when they're actually not. And they'll claim that their suggestions don't force disabling security measures when they actually do. And they'll claim that their suggestions don't put people at risk when they actually do.

    They will never come up with any suggestions that don't have those problems, because that is not possible. However, every time you manage to argue away one suggestion, they'll reword things a bit, come up with a slightly modified one, and claim this one is the fix. They can do this forever.

    ... and their message to people who are not closely engaged with the issue will be that they've tried and tried to be reasonable and address the sane people's concerns, but the sane people are unreasonable and hate compromise and won't accept anything at all.

    It is incredibly bad strategy to adopt any message that suggests there could be an acceptable way to do what those people want, because there is not.

  • Feb 15th, 2022 @ 5:47pm

    There's more than one kind of knowledge

    This article and the linked thread seem to be centered on the kind of "knowledge" where you know "this piece of content over here in this file is child porn".

    But there's another kind of knowledge, of the form "I run a social site with 100,000,000 users. It is a practical certainty that there's child porn going through my system." It's not just "I'm ignoring a real possibility". It's "I'm sure there's some here; I just don't know exactly where it is". Especially after the first few unrelated cases in which you find some.

    That kind of thing really isn't captured by the normal lay idea of "recklessness". And if it falls within some legal definition of recklessness, then it's still at least an extremely strong form, way out near the boundary with actual knowledge... which is probably a boundary that can move given the right kind of bad-law-making case.

    I think that the "EARN-IT" people are hoping to be able to go after the second kind of knowledge, and I'm afraid that Smith may not be protection enough.

    A bookseller in 1958 who happened to have one "obscene" book could reasonably argue that they didn't know what was in it and also didn't know, or even have any reason to believe, that there was anything like that in their stock at all.

    A large social site in 2022 knows there's some child porn in the mix somewhere. I suspect that the proponents are hoping that they can use that as enough scienter to get around Smith completely.

    It's true that it's still just as impractical for a site to find every single bit of child porn as it would be for a bookseller to find every "obscene" book... but they can still push for the idea that the First Amendment allows them to require a site to do "everything reasonably possible". Not just because it's supposedly a "best practice". Not just because not doing it would risk not finding child porn. Because the site has actual knowledge that there's a problem on their particular system.

    That means they can still try to demand scanning, whether via state law or via some other path. Scanning, of course, means no effective encryption. They will try to get those in through the back door even if they're not in the bill, and given the subject matter I'd be really worried that they'd win in court.

    The right answer, of course, is "Yeah, I'm sure there's some child porn on every major site. Tough". But nobody seems to have the guts to say that.

  • Feb 15th, 2022 @ 6:27am

    Re: More than one way to look at data

    Of course they're unnecessary.

    Typical interaction on Halloween: kids you don't know, from blocks away, knock on your door and you throw some candy into their bags. The most interaction might be "Who are you? Good job on the costume!". There are dozens of other kids parading by, they often travel in groups, and most of the time these days even the older kids have their parents with them.

    Suppose you were the biggest child molester that ever child molested. How exactly would you turn that situation into a molestin'?

  • Jul 13th, 2021 @ 7:31am

    You know what would make things safer?

    Not raiding people for possessing or dealing in random substances, that's what. No raid, nobody gets shot. Just repeal the fucking drug laws already.

  • Apr 29th, 2021 @ 2:37pm

    (untitled comment)

    Who (the fuck) are these people and why is everybody talking about them all of a sudden?

    From all the stuff that's been plastered all over everything I read, I have gleaned the information that they're about a 60-person company in Chicago, and that they had something to do with inflicting Ruby on Rails on the world.

    Somehow I'm having trouble caring about them or anything they do...

  • Apr 27th, 2021 @ 9:46am

    (untitled comment)

    If the UK government wants support for its anti-encryption efforts, it needs to do better than basically lying to people.

    Why? Lying works in politics.

    First you lie to yourself, and convince yourself that some single thing is The Most Important Thing. Then you come up with a bunch of Things to Do, and obviously they Must Be Done if they even might have any effect at all on The Most Important Thing. Even if none of them might have any effect, you still have to do them because Something Must Be Done.

    And it doesn't matter how much damage you do elsewhere, because no other issue is The Most Important Thing.

    Then you lie to everybody else. You exaggerate, you make wild accusations, whatever. If you want to ban mayonnaise, you say that mayonnaise is radioactive. Which you can justify because after all you're dealing with The Most Important Thing here.

    And, by the way, anybody who says anything that contradicts your lies, or even doesn't promote your view, is scum. It is Not OK to say that mayonnaise is not in fact radioactive. After all, true or not, the idea that mayonnaise is radioactive might actually convince somebody to ban it, and that's The Most Important Thing.

    For these people, protecting children from any exposure to sexuality, especially in relation to adults, is The Most Important Thing. If those same children end up impoverished, oppressed, or dead, well, sorry, that's just not as Important.

  • Aug 26th, 2020 @ 5:35pm

    So....

    While this QI bullshit in the US is clearly based on egregious judicial activism by the Supremes (and after that a lot of apparently intentional inactivism), let's not forget that Congress could eliminate it at any moment, has had over 50 years to do it, and hasn't done so.

    And I'm not a lawyer, but I suspect that individual states could do at least something about it with respect to those officers who operate under their own authority. They haven't done it either.

    It seems like there's plenty of blame to go around for this.

    Basically everybody in any authority in government is terrified that the world will burn down if cops have to follow rules. Or they think their constituents are. So the dereliction of duty is pretty universal.

  • Jun 22nd, 2020 @ 6:17pm

    Sorry, no.

    There are two issues here: integrity and confidentiality (aka privacy). These systems are not the answer for either one.

    Integrity is best solved end-to-end using DNSSEC. It's absolutely stupid to try to do it using hop-by-hop cryptography; you're trusting every hop not to tamper with the data.

    ... and just encrypting DNS traffic doesn't solve confidentiality either. It doesn't even improve confidentiality in the large.

    1. The adversary model is incoherent. If your ISP is spying on your DNS traffic, and you deny that to the ISP, then the ISP can just switch to watching where your actual data go. Yes, that may be slightly more costly for them, since otherwise they probably would have done it in the first place. It doesn't follow that the costs imposed on them are enough to justify the switch. In fact, they probably are not.
    2. All the proposals encourage centralization, which means that when (not if) some resolver that a lot of people are trusting goes bad, the impact is huge. Instead of a relatively large number of relatively survivable events, you create a few massive catastrophes.
    3. What this is fundamentally trying to be is an anonymity system (I guess a PIR system). Anonymity systems are HARD. Much, much harder than point-to-point cryptography. There are a million correlation and fault-induction attacks, and in the case of DNS there are a million players in the protocol as well. There's been absolutely zero analysis of how easy or hard these methods may be to de-anonymize using readily observable data. They seem to have been designed by people who don't even understand the basics, and think they're helping when they charge ahead blindly.

    ... not to mention that it's just psychotic to tunnel a nice simple cacheable protocol like DNS over a horrific tower of hacks like HTTP.
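    To make the end-to-end integrity point above concrete, here is a minimal sketch of client-side DNSSEC validation in Python. It assumes the dnspython library and an arbitrary upstream resolver (8.8.8.8 here, purely as an example), and it only verifies one link of the chain (the zone's A records against that zone's DNSKEY); a real validator would also walk the chain of trust up to the root.

    import dns.name
    import dns.message
    import dns.query
    import dns.rdataclass
    import dns.rdatatype
    import dns.dnssec

    RESOLVER = "8.8.8.8"                       # arbitrary upstream resolver
    zone = dns.name.from_text("example.com.")  # example zone to check

    # Ask for the A records plus their RRSIGs (want_dnssec sets the DO bit);
    # TCP avoids UDP truncation of the larger DNSSEC responses.
    q = dns.message.make_query(zone, dns.rdatatype.A, want_dnssec=True)
    a_resp = dns.query.tcp(q, RESOLVER, timeout=5)

    # Ask for the zone's DNSKEY records.
    q = dns.message.make_query(zone, dns.rdatatype.DNSKEY, want_dnssec=True)
    key_resp = dns.query.tcp(q, RESOLVER, timeout=5)

    a_rrset = a_resp.find_rrset(a_resp.answer, zone,
                                dns.rdataclass.IN, dns.rdatatype.A)
    a_rrsig = a_resp.find_rrset(a_resp.answer, zone,
                                dns.rdataclass.IN, dns.rdatatype.RRSIG,
                                dns.rdatatype.A)
    dnskeys = key_resp.find_rrset(key_resp.answer, zone,
                                  dns.rdataclass.IN, dns.rdatatype.DNSKEY)

    # Raises dns.dnssec.ValidationFailure if the signature does not verify.
    dns.dnssec.validate(a_rrset, a_rrsig, {zone: dnskeys})
    print("A records for", zone, "verified against the zone's DNSKEY")

    The check happens at the endpoint, against signatures made by the zone owner; no hop in the middle has to be trusted, which is exactly what hop-by-hop encryption of the transport cannot give you.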

  • Apr 9th, 2020 @ 3:29pm

    Re:

    ... oh, and even if you weren't a company with any significant infrastructure, they could also come after you for providing software for P2P or other decentralized solutions. "Protocols, not platforms" only works if somebody's allowed to provide the software to speak the protocol...

  • Apr 9th, 2020 @ 3:26pm

    (untitled comment)

    Hmm. It actually may be worse than that, because it appears to apply beyond what you'd think of as "platforms".

    The recklessness and "best practices" requirements are applied to all providers of "interactive computer services". The definition of "interactive computer service" is imported by reference from 230. That definition is:

    The term "interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

    The part about "system... that enables computer access" sweeps in all ISPs and telecommunication carriers, as well as operators of things like Tor nodes. And "access software provider" brings in all software tools and many non-software tools, including open source projects.

    Under 230, those broad definitions are innocuous, because they're only used to provide a safe harbor. An ISP or software provider is immunized if it doesn't actually know about or facilitate specific content. No ISP and almost no software provider has any actual knowledge of what passes through or uses its service, let alone edits the content or facilitates its creation, so they get the full safe harbor, with minimal or no actual cost to them. And anyway nobody has been after them on 230-ish issues, so including them doesn't hurt.

    Under EARN-IT, those same definitions would be used to impose liability, so now those parties actually get burdens from being inside the definition. That's worse than a repeal of 230. It doesn't just remove a safe harbor; it opens an avenue for positive attack.

    This commission could decide that it's a "best practice" for ISPs to block all traffic they can't decrypt. Or it could decide that it's a "best practice" not to provide any non-back-doored encryption software to the public, period.

    Or, since those might generate too much political backlash at the start, it could start boiling the frog on the slippery slope by, say, deciding that it's a "best practice" not to facilitate meaningfully anonymous communication, effectively outlawing Tor, I2P, and many standard VPN practices.

    Then it could start slowly expanding the scope of that, possibly even managing to creep into banning all non-back-doored encryption, without ever making any sudden jump that might cause a sharp public reaction.

    Back on the platform side, over time the rules could easily slide from the expected (and unacceptable) "best practice" of not building any strong encryption into your own product, to the even worse "best practice" of trying to identify and refuse to carry anything that might be encrypted. Start by applying it to messaging, then audio/video conferencing, then file storage... and then you have precedents giving you another avenue to push it all the way to ISPs.

  • Feb 1st, 2020 @ 1:20pm

    "Ban", eh?

    That's a pretty lame excuse for a ban.

    There is no reason that private surveillance camera users should be allowed to have the kind of automated, mass face recognition they're talking about "banning", any more than government users. They're at least as likely to abuse it and even less accountable.

    Nobody should be trying to connect names or any other information to any person who just enters a place where a camera happens to be pointed. Nor should anybody be using the video/images from surveillance to build any kind of face database or any other kind of database.

    Only in the US would people miss the obvious fact that the impact is the same no matter who runs the system.

  • Jan 31st, 2020 @ 12:57pm

    (untitled comment)

    I'm having trouble buying the idea that anybody at all thinks the phrase "child porn" carries any implication, or even suggestion, of legality. It's the most famously illegal thing that exists on the Internet.

    As for moderation, I will bet that almost all references to "child porn" on the Internet are in text that condemns it and/or discusses what to do to stop it. And if the pedos are in fact openly using the phrase "child porn" all over the place, what happens when they start calling it "CSAM"?

  • Jan 31st, 2020 @ 12:28pm

    "CSAM"?

    What's the actual difference between "CSAM" and child porn, and why is it important to make the distinction? Seems like another random pointless acronym being thrown around and another random pointless terminology change.

  • Jan 31st, 2020 @ 11:42am

    Re: Re: creative makeup

    https://www.documentjournal.com/2020/01/anti-surveillance-makeup-could-be-the-future-of-beauty/

    I happened to be playing with the AWS Rekognition demo the other day, and I fed it a bunch of makeup jobs from the CV dazzle site, as well as various other images with "countermeasures" from around the Web.

    Given a nice clear picture, it found every single face and every single feature on every face. It also did a good job of identifying age, sex and mood, right through some pretty extreme makeup. Try it out. It's available to the public.

    The problem with the countermeasures is that you never know whether the other guy has out-evolved you.

    By the way, the good thing about Rekognition is that it seems to be crap at actually identifying faces from large groups.

    They have a celebrity recognition demo, and it did very poorly on pictures of lots of people who are in the headlines... including people who ARE in the database. It spotted Marilyn Monroe in one of her really iconic shots, but not in another perfectly clear shot that it presumably hadn't been trained on. Same thing for Einstein. Turning to the headlines, it misidentified Alexandria Ocasio-Cortez and Greta Thunberg as random minor celebrities I'd never heard of. In turn it identified random minor celebrities, like members of current boy bands, as different random minor celebrities. It does well on heads of state. And both new and very old pictures of Elizabeth II worked. It may also be OK on Really Big Stars of Today (TM). But that's about it.

    So I assume it won't really identify a random picture as belonging to somebody in a collection unless said collection has a lot of good, similar pictures of that same person.
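    For anyone who wants to repeat the experiment outside the web demo, here is a minimal sketch of the two underlying Rekognition API calls using boto3. It assumes AWS credentials are already configured, and the image filename is hypothetical; it's a starting point, not a definitive harness.

    import boto3

    client = boto3.client("rekognition")  # region/credentials come from your AWS config

    with open("test_face.jpg", "rb") as f:  # hypothetical test image, e.g. a CV dazzle shot
        image_bytes = f.read()

    # Face detection with all attributes (age range, gender, emotions, ...).
    faces = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in faces["FaceDetails"]:
        top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(face["AgeRange"], face["Gender"]["Value"], top_emotion["Type"])

    # Celebrity recognition against Amazon's own celebrity index.
    celebs = client.recognize_celebrities(Image={"Bytes": image_bytes})
    for c in celebs["CelebrityFaces"]:
        print(c["Name"], round(c["MatchConfidence"], 1))
    print(len(celebs["UnrecognizedFaces"]), "face(s) not matched to a celebrity")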

  • Jan 31st, 2020 @ 5:35am

    Re: Re: Re: Re: I have great hopes for the repeal of 230...

    In a peer-to-peer system, you bring your own, and you pay for it because you want to participate. Yeah, somebody has to sell it to you, but the equipment and software are general purpose, you can't tell what any individual is using them for, and anybody can make them.

    If necessary, that can be extended to the entire communication infrastructure, but in fact we're not talking about the IP layer of fiber and routers here. We're talking about application layer overlays that can clearly be done peer to peer. Facebook and Google are not infrastructure.

  • Jan 31st, 2020 @ 5:32am

    Re: Re: Re: Re: I have great hopes for the repeal of 230...

    What I'm saying is that trying to make a profit will prevent them from properly providing the service. It has nothing to do with what they "should" or " should not" do. It's simply not possible to make a buck providing an unattackable service.

  • Dec 23rd, 2019 @ 8:21am

    Re: Re: others

    ... ever bought an expensive cell phone that was locked in to a single carrier?

    No. That is, not unless I was absolutely sure I could unlock it without the carrier's help or permission. I've never been wrong about that.

    Neither should anybody else.

    or an expensive android phone where software updates ceased after 1-2 years?

    No, because I've never bought one I couldn't load a custom ROM on.

    I have been fucked in 3 to 5 years because of proprietary binary blobs, though. That shit should be illegal.

    In fact, it should be illegal to distribute any software without source code. That includes firmware and other software bundled with hardware. It should also be illegal to distribute hardware without full register descriptions, and all other information necessary to write a driver supporting all of its features. And if you have any other "internal" documentation, go ahead and throw that in too.

    No exceptions, and fuck your "trade secrets".

    And if locking something down so that it will only load signed software is legal at all, there need to be some extremely heavy, legally binding regulations on the conditions under which it is allowed. That definitely has to include the ability to update software that's gone out of support. In most cases, it should probably also include the ability for the owner of any hardware to take total control of all the software that runs on it.

    People shouldn't be tolerating this kind of abuse any longer. Not only are we suffering from wasteful obsolescence, and not only are enormous resources constantly wasted by intentionally crippled functionality and intentionally hindered interoperability, but there are massive unfixable security problems in all the shit software and abandonware that's being shoveled out.

    Meanwhile, we should be poisoning the market for this crap by mocking anybody who opts in without being absolutely forced. In the specific case of home control, there were perfectly good open alternatives that these idiots could have used instead.

  • Oct 18th, 2019 @ 4:49pm

    (untitled comment)

    Good first step. Now ban all use of it by everybody. There's nothing magically different about state surveillance.

  • Oct 11th, 2019 @ 11:35am

    Re: Re: Re: Re: Re: Re: Re: Re:

    I need to correct that slightly. That news site just turned off the name and I got a message saying "your screen name has been rejected; choose a new one" or something nonspecific like that. I only inferred that they wanted something that looked like a "real name".

  • Oct 11th, 2019 @ 11:31am

    Re: Re: Re: Re: Re: Re: Re:

    I use the name only for commenting on places like this. I have a couple of aliases, although I don't use more than one on the same site. You won't find any of them on my birth certificate. Isn't that technically what a sock puppet is?

    I use the name to make it clear to the reader that I'm not associating the comments with my "real world" identity.

    Amusingly enough, one news site decided it didn't like the name because it looked obviously fake, and made me choose one that looked like a "real name". The one I chose wasn't, of course, my actual "real name". I can't imagine what they think they're accomplishing with that nonsense.

    By the way, although I take strong stances and try to shake up assumptions, I do not write comments that I don't believe, nor do I write comments just to upset people.

    I really don't understand what pissed people off about that one, since I would think pretty much everybody would agree with it if they thought for 15 seconds. But maybe it touched some taboo or another. My first guess would be the part about the US Constitution being poorly written.


