London Metropolitan Police Deploy Facial Recognition Tech Sporting A 100% Failure Rate

from the top.-tech. dept

Facial recognition tech isn't working quite as well as the agencies deploying it have hoped, but failure after failure hasn't stopped them from rolling out the tech just the same. I guess the only way to improve this "product" is to keep testing it on live subjects in the hope that someday it will actually deliver on advertised accuracy.

The DHS is shoving it into airports -- putting both international and domestic travelers at risk of being deemed terrorists by tech that just isn't quite there yet. In the UK -- the Land of Cameras -- facial recognition tech is simply seen as the logical next step in the nation's sprawling web o' surveillance. And Amazon is hoping US law enforcement wants to make facial rec tech as big a market for it as cloud services and online sales.

Thanks to its pervasiveness across the pond, the UK is where we're getting most of our data on the tech's successes. Well... we haven't seen many successes. But we are getting the data. And the data indicates a growing threat -- not to the UK public from terrorists or criminals, but to the UK public from its own government.

London cops have been slammed for using unmarked vans to test controversial and inaccurate automated facial recognition technology on Christmas shoppers.

The Metropolitan Police are deploying the tech today and tomorrow in three of the UK capital's tourist hotspots: Soho, Piccadilly Circus, and Leicester Square.

The tech is basically a police force on steroids -- capable of demanding ID from thousands of people per minute. Big Brother Watch says the Metro tech can scan 300 faces per second, running them against hot lists of criminal suspects. The difference is no one's approaching citizens to demand they identify themselves. The software does all the legwork and citizens have only one way to opt out: stay home.

Given these results, staying home might just be the best bet.

In May, a Freedom of Information request from Big Brother Watch showed the Met's facial recog had a 98 per cent false positive rate.

The group has now said that a subsequent request found that 100 per cent of the so-called matches since May have been incorrect.

A recent report from Cardiff University questioned the technology's abilities in low light and crowds – which doesn't bode well for a trial in some of the busiest streets in London just days before the winter solstice.

The tech isn't cheap, but even if it were, it still wouldn't be providing any return on investment. To be fair, the software isn't misidentifying people hundreds of times a second. In a great majority of scans, nothing is returned at all. The public records response shows the Metro Police racked up five false positives during their June 28th deployment. This led to one stop of a misidentified individual.

But even if the number of failures is small compared to the number of faces scanned, the problem is far from minimal. A number of unknowns make this tech a questionable solution for its stated purpose. We have no idea how many hot list criminals were scanned and not matched. We don't know how many scans the police performed in total. We don't know how many of these scans are retained and what the government does with all this biometric data it's collecting. About all we can tell is the deployment led to zero arrests and one stop instigated by a false positive. That may be OK for a test run (it isn't) but it doesn't bode well for the full-scale deployment the Met Police have planned.
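It's worth making the arithmetic behind the headline explicit: the "100% failure rate" refers to the share of alerts that were wrong, not the share of all faces scanned. A minimal sketch in Python, using only the disclosed June 28th figures (five false positives, zero correct matches):

```python
# The Met's June 28th deployment, per the public records response:
# five alerts, all of them false positives, zero correct matches.
false_alerts = 5
true_alerts = 0

total_alerts = false_alerts + true_alerts

# "Failure rate" as used in the headline: wrong alerts / all alerts.
failure_rate = false_alerts / total_alerts
print(f"{failure_rate:.0%} of alerts were false positives")  # prints "100% ..."

# Note what this number does NOT capture: thousands of faces were
# scanned and produced no alert at all, and we have no way of knowing
# how many hot-list suspects walked past unmatched (false negatives).
```

The denominator here is alerts, not scans, which is why a perfect-sounding percentage can coexist with a system that stays silent for the vast majority of people it scans.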

The public doesn't get to opt out of this pervasive scanning. Worse, it doesn't even get to opt in. There's no public discussion period for cop tech even though, in the case of mass scanning systems, the public is by far the largest stakeholder. Instead, the public is left to fend for itself as law enforcement agencies deploy additional surveillance methods -- not against targeted suspects, but against the populace as a whole. This makes the number of failures unacceptable, even if the number is a very small percentage of the whole.



Filed Under: facial recognition, london, metropolitan police


Reader Comments



  1. Anonymous Coward, 20 Dec 2018 @ 4:22am

    This is great!

    I imagine this means we're all walking around London right now, for all this tech knows.
    Thanks Met Police, for the free London trip. I hear it's quite dystopic this time of year!


  2. Bamboo Harvester (profile), 20 Dec 2018 @ 4:48am

    While I...

    ...abhor street cameras, an obvious 4th violation, this line:

    "We have no idea how many hot list criminals were scanned and not matched."

    is a problem. Headlines of 100% *failure* rate, but if there were NO "hot list criminals" in the "sample group", the software is 100% *effective*.

    Putting ten or twenty test subjects in the crowd deliberately would give a more accurate reading. But then it would be a test rather than a deployment. Can't run tests after the sucker ... er... "client" has already bought the package, after all...


  3. PaulT (profile), 20 Dec 2018 @ 5:52am

    "street cameras, an obvious 4th violation"

    Erm, I'd love to hear your logic on this one. Isn't the fourth the "right of the people to be secure in their persons, houses, papers, and effects"? How does that apply to recording a public street?

    "if there were NO "hot list criminals" in the "sample group", the software is 100% *effective*."

    This, however, is definitely true. But, there's likely to have been some testing with other subjects before these public ones and they probably can't legally target people without their consent (then said consent might lead to accusations of pre-programmed bias if the software is successful).


  4. mcinsand, 20 Dec 2018 @ 6:00am

    who makes balaclavas?

    If there is a key balaclava manufacturer, I want to invest now!


  5. Bamboo Harvester (profile), 20 Dec 2018 @ 6:05am

    Re:

    Because it's a rarity for a street camera to not catch some private property in its view as well.

    I own several rental buildings. I've got cameras on all of them, but was very careful to make sure none of them are catching views of the neighboring properties.

    If a stoplight-mounted camera has apartment or house windows in its field of view, how can it not be a 4th violation? It's effectively surveilling that window.

  6. Bamboo Harvester (profile), 20 Dec 2018 @ 6:12am

    Re:

    100% effective...

    I'd assume the manufacturer did *some* sampling using its own employees.

    But for the cops not to do the same makes it very difficult to claim the software match was sufficient cause for a stop. The days of people believing computers don't make mistakes are long gone.

    Now, if along with their "hot list" of criminals, they scanned EVERY cop into the system, AND the system matched them frequently (and correctly), they'd have grounds to have such stops accepted as evidence (in the US; not up on Brit law).

    And how could any cop decline to be scanned in? After all, if you've got nothing to hide, you've got nothing to worry about, right?

  7. PaulT (profile), 20 Dec 2018 @ 6:22am

    Re: Re:

    I don't know, that seems like a stretch to me. By that logic, you can commit any crime you want in front of a window that faces the street and it's a 4th amendment violation if an officer sees you and decides to act on it. Hell, just set up your crack den across the street from the police station, don't install curtains and do whatever you want!

    I understand privacy concerns and would certainly hope that those setting the cameras up were as mindful as you. But, I think it's really stretching the point to say that any camera surveillance in the public street is a violation.


  8. PaulT (profile), 20 Dec 2018 @ 6:24am

    Re: Re:

    No, adding cops into the system as their main testing pool would guarantee problems getting evidence collected, as it could easily be argued that they faked the test results. Doubly so when they get on the wrong side of their false positives (and there always will be false positives).


  9. Blake A Senftner, 20 Dec 2018 @ 6:27am

    Re: Re:

    Think of Facial recognition as a camera that is blind to anything but faces, plus the faces need to be generally facing the camera and at least some minimum size, such as 160 pixels tall. A traffic camera lens is such that outside of the traffic intersection, human heads are too small for FR to identify, so it is effectively blind past the intersection.


  10. Anonymous Coward, 20 Dec 2018 @ 6:35am

    Re: who makes balaclavas?

    You're looking to unlock balaclavas?


  11. Anonymous Coward, 20 Dec 2018 @ 6:35am

    is there actually ANYTHING that the UK can do right, particularly in the desire to catch someone for doing something, illegal or not? the only thing they've managed to do is criminalise the whole population in efforts to support the entertainments industries! fucking useless!!


  12. Anonymous Coward, 20 Dec 2018 @ 6:48am

    Let's just hope that the LMP's facial recognition software didn't come from Google.

    https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai

  13. Bamboo Harvester (profile), 20 Dec 2018 @ 7:45am

    Re: Re: Re:

    Traffic patterns if a private driveway is in their view. Archiving the comings and goings at the doors.

    I view them like the automated license plate readers.

    Prior to the readers, an officer had to have a REASON to call in a plate. They didn't do it gratuitously, as it would annoy the hell out of the clerk running the search.

    And they got ONLY the registered owner and information on the vehicle itself.

    With the ALPR, they get not just that info, but it's linked in dozens of databases, giving all kinds of information "at the push of a button" that the officer does not need to know for any reason.

    And the ALPR treats everyone as guilty. It scans EVERY plate that goes past it.

    The same for the street cameras, especially where they infringe on private property. Would you put up with your neighbor setting up a camera to watch your teenaged daughters in your pool? Of course not. So why are the cops "special" in this matter?

    I give all my tenants access to the camera footage, with real-time available for the one covering the front door.

    Ever try to get footage from a traffic camera?


  14. nasch (profile), 20 Dec 2018 @ 7:46am

    Re:

    is there actually ANYTHING that the UK can do right

    Aston Martins sure are nice.


  15. Cdaragorn (profile), 20 Dec 2018 @ 7:47am

    Re: Re:

    It's long-established fact that watching private property from somewhere you're legally allowed to be is never a violation of the 4th and is within your legal right to do.

    Intruding upon someone's privacy is not the same thing as someone opening their private areas up for full display to the public. The only possible issue for the government is that it's not allowed to keep those images for very long, except for any it has probable cause to connect to some actual crime.

  16. Bamboo Harvester (profile), 20 Dec 2018 @ 7:52am

    Re: Re: Re:

    Disagree. ALL cops are fingerprinted. It has almost no effect on OTHER prints found at a crime scene.

    I'm not saying to add the cops to a test pool - I'm saying ALL cops should be REQUIRED to be in the database, and matches to them flagged.

    Think what the bodycams were supposed to do.

    If they're doing facial recognition on crowds of innocents, how can they justify NOT having the cops in that database?

    Cops are a necessary evil. Early in the dim mists of time, society as a whole realized it was marginally better to have them on the inside urinating out than outside urinating in.

    But they need watching, and what better tools to watch them than the ones they deploy against every non-cop?


  17. Bamboo Harvester (profile), 20 Dec 2018 @ 7:53am

    Re: Re: who makes balaclavas?

    If everyone started wearing them to defeat recognition cameras, a sonic method would be built to defeat the balaclavas...


  18. Cdaragorn (profile), 20 Dec 2018 @ 7:56am

    Re: Re: Re: Re:

    You're trying to apply moral wrongs as if they were legal wrongs.

    Of course you wouldn't want someone filming your daughters. Too bad there's nothing you could legally do to make them stop. The cops aren't special because what you're implying with this example is untrue.


  19. PaulT (profile), 20 Dec 2018 @ 8:13am

    Re: Re: Re: Re:

    "Traffic patterns if a private driveway is in their view. Archiving the comings and goings at the doors."

    So, the same as anyone in the street can see, or sat in an opposing property. Do you consider a stakeout to be a violation as well, if they can see your house as well as the intended target?

    "And the ALPR treats everyone as guilty. It scans EVERY plate that goes past it."

    Every plate that's out being driven on public roads, sure. You appear to be having a problem with the automation, not the camera, which is a different issue.

    "So why are the cops "special" in this matter?"

    Because they are in theory putting them up in order to monitor and protect the public streets, which is a big part of their job.

    You may disagree with how they operate in reality, and how necessary the tools actually are. But, it's not exactly hard to see why a camera that may see the corner of your property that's visible from the public street while monitoring that street is different from one set up specifically to view your property.

  20. Anonymous Coward, 20 Dec 2018 @ 8:27am

    Re: While I...

    Headlines of 100% failure rate, but if there were NO "hot list criminals" in the "sample group", the software is 100% effective

    The 100% failure rate was the rate at which the software incorrectly identified people as a criminal, not the failure to identify any actual criminals (though that also happened). Thus, we are at 0% effective.


  21. PaulT (profile), 20 Dec 2018 @ 8:32am

    Re: Re: Re: Re:

    You seem to be completely missing the point. This is not about whether or not cops are being monitored. The point is that this is a public beta test. You can't have an effective beta test that will hold up to scrutiny if the only test subjects are employees of the people running the tests.

    "I'm saying ALL cops should be REQUIRED to be in the database, and matches to them flagged."

    They may well be - AFTER the tests are concluded. If you're calling for them to be in the tests themselves, you're asking for potentially biased tests - the results of which will be used to justify full rollout of this technology, both in the UK and US. I'm sure that's not what you mean to be asking for, but you are.

  22. Anon, 20 Dec 2018 @ 8:35am

    Several Points

    The issue with video is as much archiving as visibility - it's one thing to surveil someone or their property because they are an active subject in an ongoing investigation; another thing to compile dossiers, or archive collections, or whatever you want to call it, on citizens not involved in criminal activities. This is not the KGB - we don't build collections of data on anyone who happens to cross paths with police collection processes.

    So the police can accidentally video your living room window - but they should not keep that data (or any data) forever, or even for months.

    The same applies to license plate data - perhaps the police compile lists of license plates. But they should not have a collection going back years, to show every movement your car has performed over that time - i.e. your complete movements for the past year. Maybe with a warrant they can collect ongoing data from these devices for specific individuals; the rest should be deleted in a reasonable time (a week? Two weeks?)

    Building a similar inaccurate database of "facially identified" people with a flawed program is ripe for abuse. "Evidence" will incriminate perfectly innocent people. "Your face was identified walking toward the crime scene. We have video of you in your living room 3 weeks before where you wear the same shirt the perp did. Our license plate reader saw you drive by 4 blocks from the crime scene an hour before. Please come with us."

    Of course, trying to test facial recognition in a location that contains probably one of the largest collections of different faces - major international tourist destinations - is sure to catch the largest possible incidence of doppelgangers. More interesting would be to see how many of these false positive faces were of other ethnic extractions. There are already articles suggesting the tech fails excessively for Chinese and black faces.

  23. Anonymous Coward, 20 Dec 2018 @ 8:40am

    Re: While I...

    I am a bit lost.. Is London on the hook for all amendments of the US Constitution or just the 4th?


  24. Bamboo Harvester (profile), 20 Dec 2018 @ 8:45am

    Re: Re: Re: Re: Re:

    If the police want to watch and film a private property, they're required to get a Warrant.

    If a traffic cam is watching a private property, the police have access to the footage captured without a Warrant.

    THAT is why it's a 4th violation (in the US).

    It's also not unknown for traffic cameras to get "accidentally" moved so they're watching a window or driveway instead of the intersection.


  25. Anonymous Coward, 20 Dec 2018 @ 8:49am

    Re: Re: Re: Re: Re:

    There have been stories in the UK about bobbies using CCTV to watch women undressing in their flats.

    Of course it will be abused, anyone who thinks otherwise is being a bit silly.


  26. Bamboo Harvester (profile), 20 Dec 2018 @ 8:50am

    Re: Several Points

    Well said.

    I know the two local "traffic" cameras to my home are archived... forever.

    And while the local cops only get real-time, the State cops can access the entire archive.


  27. Anonymous Coward, 20 Dec 2018 @ 8:51am

    Re: Re: Re: Re: Re: Re:

    "If the police want to watch and film a private property, they're required to get a Warrant."

    In the UK?


  28. Anonymous Coward, 20 Dec 2018 @ 8:52am

    Re: Re: Re:

    Uh-huh, sure.


  29. Anonymous Coward, 20 Dec 2018 @ 8:55am

    Re: Re: Re:

    One could test this theory in a real life situation.

    For example, one could park their vehicle on the public road in a spot where parking is allowed and point several cameras at one particular private property. Use of binoculars will help in the test scenario.

    Then simply wait for the police to show up ... or worse - the vigilantes.


  30. Anonymous Coward, 20 Dec 2018 @ 8:57am

    Re: Re: Re: who makes balaclavas?

    Actually - I was thinking the makeup industry should be worried.


  31. Anonymous Coward, 20 Dec 2018 @ 9:05am

    I'm waiting for the typical hollywood movies portraying facial recognition as though it actually works.

    Zoom - Enhance - Rotate ... exact match again!


  32. Thad (profile), 20 Dec 2018 @ 9:15am

    Re:

    "Waiting for"? Movies and cop dramas have been depicting perfectly-functional facial recognition technology since at least the 1990s.


  33. James Burkhardt (profile), 20 Dec 2018 @ 9:33am

    Re: While I...

    The 100% failure rate in the headline was, in the article, shown to reference the 100% false positive rate.

    In May, a Freedom of Information request from Big Brother Watch showed the Met's facial recog had a 98 per cent false positive rate.

    The group has now said that a subsequent request found that 100 per cent of the so-called matches since May have been incorrect.

    The line you quote references false negatives.

    If the software had no criminals but flagged multiple people as criminals, it most certainly had a 100% failure rate - every flag was a failure.

    There is in fact, in this data, no way to find anything but 100% failure. Best case, no criminals were present and all that was available to flag were innocents. 100% failure by finding criminals where none existed. Certainly not 100% effective.

    The more criminals actually present and in the database, the worse it looks, because then it identified innocents as criminals, and failed to find the actual criminals.


  34. nasch (profile), 20 Dec 2018 @ 9:40am

    Re: Re: Re: Re: Re: Re:

    If the police want to watch and film a private property, they're required to get a Warrant.

    Not if they're doing it from a public location.


  35. James Burkhardt (profile), 20 Dec 2018 @ 9:45am

    Re: Re:

    I disagree. As a rule, computers don't make mistakes. Except on rare occasions, they do what they are told to do.

    The issues are A) people don't understand what they are telling the computer to do, B) we are bad at telling a computer how to do a bunch of things that we do at an instinctual level, C) software engineers and police don't think the same way, and D) the computers at issue are dealing with maybes but are being programmed with yes or no responses.

    I see it all the time in my office. People THINK the computer is just doing "whatever it wants". But they don't understand what the computer is doing or how it is doing it, so when it acts differently than they expect, it's 'going crazy'.

  36. Get off my cyber-lawn! (profile), 20 Dec 2018 @ 10:12am

    Stretch goals are important!

    Okay, so 98% is good but if we work harder, we can reach that coveted 100% goal of false positives!

    btw - my alltime fav Dilbert cartoon is the one about safety accident goals and working harder to reach that number since there weren't enough safety accidents the previous year.


  37. PaulT (profile), 20 Dec 2018 @ 10:22am

    Re: Re: Re: Re: Re: Re:

    “If the police want to watch and film a private property, they're required to get a Warrant”

    For all nearby properties that might be in shot, or just the one they intend to film? If the latter, they don’t need one just because your property is next to the public road they’re filming.

  38. Annonymouse, 20 Dec 2018 @ 10:26am

    Testing testing testing

    I am perplexed as to why they are doing the beta testing in the field like this.
    I would have chosen a more challenging yet controlled environment: the foyers of a number of the larger police stations. That would have provided a significant number of hits for felons and cops alike.

  39. Anonymous Coward, 20 Dec 2018 @ 12:46pm

    Re: Re: Re: Re: Re: Re: Re:

    There was a story about some guy who objected to the use of his field for placing a camera to watch a neighbor's farm; he was told there was nothing he could do about it. Private property my ass.

  40. Anonymous Coward, 20 Dec 2018 @ 12:48pm

    Re: Re:

    Guess I missed them. I'm sure they were totally realistic - not.


  41. PaulT (profile), 20 Dec 2018 @ 12:54pm

    Re: Testing testing testing

    Because they want to sell it to people who want to watch large numbers of crowds, and the average UK police station doesn’t hold as many people as Leicester bloody Square


  42. That One Guy (profile), 20 Dec 2018 @ 1:04pm

    'You first'

    Ignoring for a moment privacy implications, want to make sure that the tech is tested and accurate before it's aimed at the public? Aim the cameras at the entrance to police stations and government buildings first, with the public given the same access to that data as the police get to the data from cameras aimed at the public.

    I suspect that accuracy (or lack thereof) would suddenly become a very important selling point, practically overnight.

  43. Thad (profile), 20 Dec 2018 @ 1:09pm

    Re: Re: Re:

    Person of Interest was pretty good. I mean, the first season was pretty much just a standard cop show with a magic computer, but after that it turned into a much deeper and more interesting SF show about the ramifications of surveillance and AI.

    It's really kind of a fascinating middle-step between The Dark Knight and Westworld (Jonathan Nolan wrote or co-wrote all three).


  44. Beta (profile), 20 Dec 2018 @ 1:14pm

    Dogberry tech

    "The tech is basically a police force on steroids -- capable of demanding ID from thousands of people per minute... The difference is no one's approaching citizens to demand they identify themselves."

    To be fair, there is another difference: the tech does not detain those it cannot identify -- at least, not yet. Shakespeare himself made fun of watchmen who behave like that. ("Why, then, take no note of him, but let him go; and presently call the rest of the watch together and thank God you are rid of a knave.")


  45. Anonymous Coward, 20 Dec 2018 @ 6:08pm

    Re: Re: Re:

    Sometimes reality is decades behind. I'm still waiting for software that will take a two-dimensional photograph and map it into a 360-degree 3D model by combining all the distorted reflections off glass and other shiny surfaces to reconstruct everything that's out of view of the camera and basically "see around corners". Maybe one day such imaging tools won't just be science fiction anymore.

  46. TripMN, 20 Dec 2018 @ 7:35pm

    Re: Re: Re:

    James, aren't you a lawyer by trade?

    As a software engineer, I commend you on your understanding of computers that many outside of the engineering fields don't get. Computers do what they are told, very quickly and very efficiently. We usually run into trouble with "software" and "data" because the humans involved don't fully understand what they've told the computer to do or don't understand how their code will act/react on data coming in from the real world.

    Works in the lab/on my computer is often taken to mean that things will work the same in the real world ... and they never do.


  47. nasch (profile), 20 Dec 2018 @ 9:31pm

    Re: Re: Re:

    As a rule, computers don't make mistakes.

    That depends on how you define making a mistake. And it gets much fuzzier when you throw in machine learning (I'm not sure if this system uses that). Even without it, if the system is supposed to identify people in a database, and it has false positives, I'd say the system (meaning the software and hardware) made a mistake. Did the machine correctly follow the instruction sets that were fed to the processor? Yes, but we can look at computer systems at a higher level than that, and analyze whether they are fulfilling their function correctly. In this case, this system was not.


  48. Anonymous Coward, 20 Dec 2018 @ 11:17pm

    Re: Re: who makes balaclavas?

    Is that like an apparel item achievement in a game?

  49. Scary Devil Monastery (profile), 21 Dec 2018 @ 5:28am

    Re: Re: Re: Re: Re:

    "Of course you wouldn't want someone filming your daughters. Too bad there's nothing you could legally do to make them stop. "

    What.

    Seriously, look at...most examples of western law on this. If you manage to film someone in such a way that it intrudes upon or violates personal integrity of others then that is, by definition, breaking the law.

    Hence why placing a camera is usually circumscribed with legal concerns - you WILL be held responsible for what it captures.

    And as someone pointed out in this thread already, police appear to be getting away with this since there is literally no overview.


  50. Scary Devil Monastery (profile), 21 Dec 2018 @ 5:33am

    Re:

    "is there actually ANYTHING that the UK can do right"

    Fudge, marmalade, various curds. They used to be very good at dry British humor as well, but they've been slipping in that regard now that the flying circus isn't active any more.

  51. Anonymous Coward, 21 Dec 2018 @ 8:15am

    Re: Re: Re:

    Guess I missed them.

    ...you missed The Dark Knight?

    At a certain point, "deliberately avoided" is the correct term to use.


  52. PaulT (profile), 23 Dec 2018 @ 12:51am

    Re: Re: Re: Re: Re: Re:

    "And as someone pointed out in this thread already, police appear to be getting away with this since there is literally no overview."

    Are they "getting away with it"? Or, is it just that there's different standards applied to someone deliberately targeting your property than applied to someone happening to catch a small part of your property when looking at the public street? If the scenario he fears is even happening at all.


