Find A Vulnerability In Apple Software; Lose Your License As An Apple Developer

from the kill-the-messenger dept

It appears that Apple is the latest company to take a "kill the messenger" approach to security vulnerabilities. Hours after security researcher Charlie Miller found a huge vulnerability in iOS, which would allow malicious software to be installed on iOS devices, Apple responded by taking away his developer's license.

The obvious implication: don't search for security vulnerabilities in Apple products, and if you do find them, keep them to yourself.

First off, here's Miller explaining the security hole:

[embedded video: Miller demonstrating the exploit]
To be fair, Miller did get Apple to approve an app that he was using to demo the security flaw. However, kicking him out of its developer program is exactly the wrong response. Miller, clearly, was not looking to use the code maliciously -- just demoing a problem with their system. In other words, he was helping Apple become more secure, and they punished him for it. The message seems to be that Apple doesn't want you to help make their system more secure. Instead, they'd rather let the malicious hackers run wild. As Miller noted to Andy Greenberg at Forbes (the link above):
“I’m mad,” he says. “I report bugs to them all the time. Being part of the developer program helps me do that. They’re hurting themselves, and making my life harder.”
And, no, this is not a case where he went public first either. He told Apple about this particular bug back on October 14th. Either way, this seems like a really brain-dead move by Apple. It's only going to make Apple's systems less secure when it punishes the folks who tell it about security vulnerabilities.

Filed Under: blame the messenger, charlie miller, ios, security, vulnerabilities
Companies: apple


Reader Comments


  • Gaurav, 8 Nov 2011 @ 8:25am

    Security by obscurity FTW!

    PS: Another reason I am staying away from Apple and the shine.

  • weneedhelp (profile), 8 Nov 2011 @ 8:27am

    I hope the next developer just goes public with it. What a stupid thing to do on crApple's part.

  • Evil Closet Monkey, 8 Nov 2011 @ 8:30am

    False Blame

    > The obvious implication: don't search for security
    > vulnerabilities in Apple products, and if you do find
    > them, keep them to yourself.

    This is not at all the proper implication. Miller, the author, was not removed from the developer program for finding and reporting the bug. He was removed for knowingly creating and uploading an application to the App Store that *exploited* the bug! The application was sitting in the App Store for over a month.

    Although unlikely, someone else could have downloaded the app, found the vulnerability and maliciously exploited it.

    He broke the rules to which he agreed when signing up to the developer program. He was removed from the program as a result of that.

    The true "obvious implication" to draw from this is: If you find a vulnerability report it through the appropriate means, create something that demonstrates the exploitation (if you so desire) but don't upload the app you know breaks the rules to the App Store.

    • molecule (profile), 8 Nov 2011 @ 8:35am

      Re: False Blame

      "The true "obvious implication" to draw from this is:"

      If you find a vulnerability, then sell it to the highest bidder.

      • Mike42 (profile), 8 Nov 2011 @ 8:56am

        Re: Re: False Blame

        Yep, that's exactly what to draw from this.
        Monkey, you need a dose of reality. Just because you can do something doesn't mean it's a good idea to do it. Lots of other companies (Sun, Microsoft, etc.) have already found this out. If you don't pay attention to the bug that's reported, and you don't give any feedback, security guys will force your hand by creating an app that leverages the flaw. They've been doing it for years. This is Apple's first time through the security line, because this is the first time they have had enough users to make it profitable to hackers. We're about to find out how virus-proof Apple isn't.

        • Evil Closet Monkey (profile), 8 Nov 2011 @ 9:32am

          Re: Re: Re: False Blame

          Your timeline of what happened is incorrect, Mike.

          The author did not report the bug and wait. He didn't tell Apple he had found an exploit and had an app waiting to demonstrate it. He didn't upload the app to the App Store after waiting a week, a month... or however long.

          The reality is Miller found an exploit, created an app, and uploaded it to the App Store. After that, he made public that he had found an exploit and demonstrated it using the publicly available app (which had been there for over a month).

          There was no bug reported. There was no opportunity to force the corporate hand. What you don't do is exploit that bug in the working sandbox first.

          Did we not read the last paragraph in my original post? The part that points out that one *SHOULD* report bugs and create something that demonstrates the exploit?

          • Greevar (profile), 8 Nov 2011 @ 10:04am

            Re: Re: Re: Re: False Blame

            "And, no, this is not a case where he went public first either. He told Apple about this particular bug back on October 14th."

            I think this points out how wrong you are. He told Apple first, then he went public with it. It seems reasonable to me. They don't listen otherwise. Tell them first and then make them take action by telling everyone else. Unfortunately, Apple acted foolishly in response. I think Apple doesn't want people to know that their products aren't as secure as they claim they are. Lack of confidence in the brand can be costly.

            • Evil Closet Monkey (profile), 8 Nov 2011 @ 10:18am

              Re: Re: Re: Re: Re: False Blame

              The app was in the App Store since September.

              • Greevar (profile), 8 Nov 2011 @ 2:16pm

                Re: Re: Re: Re: Re: Re: False Blame

                Yes, and he didn't go public with this information until Apple was informed. How was he going to test and demo the vulnerability without replicating it? The only way to prove the vulnerability is to actually inject the malicious app into the app store. You're just trying to invent a way to exonerate Apple and blame Miller.

    • Beta (profile), 8 Nov 2011 @ 8:52am

      Re: False Blame

      "...Someone else could have downloaded the app, found the vulnerability and maliciously exploited it."

      Are you suggesting that a miscreant could have discovered a vulnerability by downloading an app with very simple behavior, decompiling it, then sifting through the code looking for Easter eggs? Anyone with the skill to do that could have found the vulnerability far more easily by studying the code-signing protocol, the way Miller did.

      Apple's actions may be defensible legally, but only legally.

      • Evil Closet Monkey (profile), 8 Nov 2011 @ 9:19am

        Re: Re: False Blame

        Nice use of ellipsis -- it's like a movie quote! Looking beyond the omission of "although unlikely" in the original sentence, your analysis of the level of effort to discover such an item is over-complicated.

        Understanding the nature of what the vulnerability was (in hindsight) illustrates that it could have been discovered by very simple means. As unlikely as an individual picking this app is.

        The discussions over at Gizmodo on this story are at least somewhat insightful from a technical point of view, instead of people just running at it from a "big company keeping down the little guy" point of view.

        Apple has a sandbox called the "App Store". They've made rules for playing in the sandbox, like "don't piss in the sandbox". If you piss in the sandbox, you're not allowed to play in the sandbox for a little while.

        > Apple's actions may be defensible legally, but only
        > legally.

        You're stretching the notion of "legally" a bit there. We're talking about a private company here. If Apple wanted to refuse an app because they didn't like the color scheme, they could. They don't have to have a reason to reject an app from the store -- they just can, because they want to.

        This isn't a moral question. There are rules set up for developers who wish to participate in the App Store. Miller *BROKE* those rules!

        Should Apple be working their butts off to fix this? Yes.

        Should they have fixed it sooner? Dunno. Maybe they've been trying since it was discovered. Maybe they've been busy playing table tennis instead.

        Could Apple have looked the other way? Could they have punished him a little less? Could they still reverse or revise the decision? Yes. Yes. Yes.

        Do they need to? No. A developer knowingly introduced an app into the store that exploited a security flaw, and was punished according to the agreements he signed.

        • Beta (profile), 8 Nov 2011 @ 12:19pm

          Re: Re: Re: False Blame

          "Understanding the nature of what the vulnerability was (in hindsight) illustrates that it could have been discovered by very simple means. As unlikely as the an individual picking this app is."

          If I'm parsing this correctly, you're saying that once the app was in the store, the vulnerability was as likely to be discovered by someone picking the app (from all the apps in the store), decompiling it, analyzing the results and discovering the exploit, as by studying iOS. That is absurd. I have lost count of the times I've heard of an independent researcher discovering a hole in a large, supposedly secure IT system; I have never heard of someone discovering a hole by deconstructing an app or other published software which shows no sign of malicious behavior.

          (As for the rest of your argument, I really can't make any sense of it. You seem to be agreeing with me, then claiming victory.)

          • Evil Closet Monkey (profile), 8 Nov 2011 @ 1:35pm

            Re: Re: Re: Re: False Blame

            Your analysis of the statement is incorrect. I am saying that decompiling the app is not at all a necessary step, because it is not. But the likelihood of someone finding that particular app and discovering the exploit is irrelevant to the argument.

            > (As for the rest of your argument, I really can't make any
            > sense of it. You seem to be agreeing with me, then claiming
            > victory.)

            If you are referring to the "actions may be defensible legally" aspect of the conversation, then yes - I am agreeing with you. I hadn't claimed (or meant to claim) anything to the contrary.

            The simple fact is Miller broke the agreement he signed with Apple when he uploaded a compromised application to the App Store. As a result, his license was suspended per that agreement.

            Others have pointed this out from slightly different angles...

            See John Fenderson's post in this thread, or the "Two things" or "He broke the App store agreement..." threads below.

            • S, 9 Nov 2011 @ 3:03pm

              Re: Re: Re: Re: Re: False Blame

              This is you:

              "You broke the rules and breaking the rules is bad and you need to be punished."

              "So, did I actually do anything wrong?"

              "You broke the rules and breaking the rules is bad and you need to be punished."

    • DannyB (profile), 8 Nov 2011 @ 9:42am

      Re: False Blame

      > Although unlikely, someone else could have downloaded
      > the app, found the vulnerability and maliciously
      > exploited it.

      Considering that (from TFA) the guy had worked for the NSA, I suspect he would have prevented that possibility. The App's purpose was to demonstrate the bug, not exploit it.

      The ultimate demonstration is having a signed, trusted app do something untrusted.


      > He broke the rules to which he agreed when signing
      > up to the developer program.

      Without doing so, it may be impossible to confirm and demonstrate the bug.

      • Evil Closet Monkey (profile), 8 Nov 2011 @ 10:15am

        Re: Re: False Blame

        > Without doing so, it may be impossible to confirm and
        > demonstrate the bug.

        In part, I agree. To expand on something you mentioned earlier in your post...

        > The ultimate demonstration is having a signed, trusted
        > app do something untrusted.

        I would expand on that and add that getting such an app into the App Store (i.e., getting it approved by Apple itself) is the ultimate demonstration. You may have been implying this already by "signed, trusted app".

        To show that a signed app, running in the iOS sandbox, could do something untrusted does not require its presence in the App Store. You can demonstrate this on a development platform.
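
        As a loose illustration (a minimal sketch in Python, assuming a Unix-like development machine -- this is not Miller's actual exploit, which targeted iOS's code-signing internals), the class of check involved can be probed directly: ask the OS for memory that is both writable and executable, which is what staging downloaded, unsigned code ultimately requires.

            import mmap

            def wx_mapping_allowed() -> bool:
                """Request an anonymous page that is writable AND executable."""
                try:
                    page = mmap.mmap(-1, 4096,
                                     prot=(mmap.PROT_READ | mmap.PROT_WRITE |
                                           mmap.PROT_EXEC))
                except (OSError, ValueError):
                    # The OS (or its code-signing policy) refused W+X memory.
                    return False
                page.close()
                return True

            # On a typical desktop this prints True; on a platform that
            # enforces code signing for third-party code it should print
            # False. Miller's demo showed a "trusted" app getting around
            # exactly this kind of restriction.
            print("W+X memory granted:", wx_mapping_allowed())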

        It becomes tricky when you add "trusted" into it -- as you point out. At what point is it "trusted"? When Apple approves it for the App Store?

        I'll say "yes" to the above. The app has passed the final review, so it can't be any more trusted than that.

        The bump in the road here is that the exploitation was not previously disclosed. First step is to reveal the vulnerability and give the appropriate company (Apple, in this case) time to fix it (how much time is a different discussion).

        If the exploitation had been disclosed, it would have provided Apple the opportunity to (1) fix it, or at least (2) watch for it and deny that "trusted" status so it never made it to the App Store.

        But, okay. People don't agree with that sentiment. We'll switch up the argument -- creating the app and getting it onto the App Store to demonstrate the full vulnerability was the right course.

        Miller should have, or could have: (1) pointed out the issue to Apple privately, (2) publicly exposed the issue and removed the app, or (3) discreetly waited until the security conference to expose it. Unfortunately, he made it public on Twitter well in advance.

        I *don't* believe he meant this to be malicious in any way. But he isn't just a "messenger".

      • John Fenderson (profile), 8 Nov 2011 @ 10:36am

        Re: Re: False Blame

        I agree, but I think my response is more nuanced: they're both right.

        Miller did the right thing in his actions. They were for the greater good. But the right thing he did was to fall on his sword, because Apple also did the right thing in terminating his developer agreement. Miller deliberately violated a very important term in the contract, and to ignore that fact would also make Apple look bad by, in a sense, playing favorites.

        • Evil Closet Monkey (profile), 8 Nov 2011 @ 10:53am

          Re: Re: Re: False Blame

          John, you rock.

          I hope my obvious lack of Internet commenting skill is apparent, because your post summarizes exactly what I've obviously not been able to express in umpteen replies.

    • Anonymous Coward, 8 Nov 2011 @ 9:55am

      Re: False Blame

      This is where it's tricky. He was exploiting an issue with the app approval process, which I would assume is just some sort of automated test suite that Apple runs application requests through. The only way to test the exploit is to have an app successfully go through the approval process and post to the app store.

      • Evil Closet Monkey (profile), 8 Nov 2011 @ 10:31am

        Re: Re: False Blame

        True. It is nice when people see the different sides of an argument and analyze it critically... instead of just jumping on me. :)

        Part of what he was testing was the approval process, in that "trusted" apps could do untrusted things. DannyB points out the same notion above, which I also commented on.

        But demonstrating the ability to run untrusted code from within an app did not require it to be in the App Store. Demonstrating that such an app could get approved did, yes.

        But within the development community, demonstrating the process outside the App Store would not have required a significant mental leap for anyone to see the potential.

        Had he demonstrated this on an app outside the App Store first, then waited (how long is a different conversation) before testing the app approval process, this would absolutely be a different story.

      • Anonymous Coward, 8 Nov 2011 @ 10:36am

        Re: Re: False Blame

        just some sort of automated test suite that Apple runs application requests through

        Nope, every application is reviewed by humans. Remarkably stupid humans based on the number and types of rejections.

        I tried to put a simple app in the store and it went through 5 rejections. The best one was "Use of a folder icon." The design guide says you shouldn't make any UI which implies the existence of a directory structure ... because if Apple users ever accidentally learn how computers work they won't be Apple users anymore.

  • Christopher Gizzi, 8 Nov 2011 @ 8:34am

    Disappointed

    I'm a huge Apple fan and have looked past some of the weirder things they've done. But I agree, this is going to make them less secure. Who wants to risk their license to promote a platform with apps if they risk losing their hard work? I'd like to see Apple reinstate this guy's license (or change the license such that there is a documented process for exposing security bugs -- which might exist and I'm just unfamiliar with it).

    Anyway...

    I do want to ask the security community a question, though. How fast does a large corporation have to respond to a security issue before going public? Is one month a realistic timeframe when there are development cycles that can't always respond to the countless number of submissions by the community? Google, Microsoft, Apple and others aren't immune to the volume of requests that come in on any given day.

    I'm not sure one month is enough, and while it's good that the issue was discussed in public without detailing the flaw exactly, do people have to be so trigger-happy with disclosure when, as a security analyst, he's said it's a pretty obscure bug? Dangerous... but very obscure.

    What if a three strikes rule (I almost shudder at saying such a thing in these forums when it's such a hot topic in other circles) were a way to compromise on disclosure? Can security researchers agree that after a third time following up with a vendor on software defects -- with a one-month span between emails -- they go public?

    It just seems that responses by different security researchers to these types of bugs are disjointed. Otherwise, Apple et al. are left putting out fires in their development cycle and disrupting change under the threat that someone doesn't think they move fast enough (which, again, could differ from person to person).

    • Christopher Gizzi, 8 Nov 2011 @ 8:37am

      Re: Disappointed

      BTW... I've just noted that this comment had a lot of grammatical errors (possibly spelling, too). That's what you get for having two cups of coffee, no breakfast, and typing on an iPad.

    • Anonymous Coward, 8 Nov 2011 @ 9:18am

      Re: Disappointed

      Responsible bug disclosure has been a very active discussion for decades. There are *tons* of procedures that have been outlined and proposed and followed by grey hats for a long time on how to responsibly disclose security matters. They all take slightly different approaches to balancing the interest of the maintainer and the community.

      Any procedure generally held in respect by the community usually has explicit protocol to follow if the maintainer in question fails to address the vulnerability. For example, a tiered approach, where in the first month the vulnerability is disclosed to the company, in the second month it is disclosed to security researchers of at least 3 respected security firms, and in the third month is disclosed to the entire internet, including the black hat community.
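
      To make the tiers concrete, here is a minimal sketch (Python; the 30-day offsets and stage names just mirror the example above, not any official protocol) of computing such a schedule from the initial report date:

          from datetime import date, timedelta

          # Illustrative tiered-disclosure stages: vendor first, then a small
          # circle of researchers, then everyone. Offsets are days after the
          # initial report and are examples, not a standard.
          STAGES = [
              ("disclose privately to the vendor", 0),
              ("disclose to ~3 respected security firms", 30),
              ("full public disclosure", 60),
          ]

          def disclosure_schedule(reported):
              """Return (stage, date) pairs counted from the report date."""
              return [(stage, reported + timedelta(days=offset))
                      for stage, offset in STAGES]

          # e.g. counting from the October 14th report date mentioned in the
          # article:
          for stage, when in disclosure_schedule(date(2011, 10, 14)):
              print(when.isoformat(), "->", stage)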

      Apple probably feels it acted in accordance with its policy regarding the app store, but I don't think they're being very smart about the matter. They're drawing publicity to a security vulnerability they haven't fixed yet.

    • Atkray (profile), 8 Nov 2011 @ 9:23am

      Re: Disappointed

      "How fast does a large corporation have to respond to a security issue before going public?"

      Response time should be near zero, implementation is a different story.

      If something is brought to your attention you need to have a verifiable process for dealing with and prioritizing it.

    • Anonymous Coward, 8 Nov 2011 @ 10:41am

      Re: Disappointed

      Most companies, like Microsoft and Google, have a very specific protocol and will work with the hacker on a timeline for releasing a fix. Often the hacker agrees to wait until the fix can be implemented but there are a few exceptions which typically include: when the company decides on a timeline that is far too long or inadequate (i.e. the next version of Windows), when the hacker finds that someone is already exploiting the bug, or when the company refuses to respond.

      I'm guessing Apple shrugged off his concerns and didn't give him any kind of response (let alone a timeline for a fix) which pretty much forces your hand. While announcing these to the public is temporarily dangerous, failure to inform people is far worse.

    • John Fenderson (profile), 8 Nov 2011 @ 10:47am

      Re: Disappointed

      I work for a very major company that produces security software, and I handle software faults that themselves have security implications. I don't know how other companies handle it, but I can tell you how mine does:

      Someone (99% of the time a customer) reports a security vulnerability. This person is responded to immediately, and the problem is handled with them the same way any other software fault is handled: with an ongoing dialog while our internal process continues.

      This does not necessarily mean the fault is solved immediately -- maybe not even within a month. It depends on the fault. Some things (even things that seem easy) require a tremendous effort to resolve.

      If the reporter were to publicly disclose the problem prior to our resolution, it would not force our hand or make things get fixed faster. Security problems are already given maximum attention. All it would do would be to increase the risk to other customers until the problem is resolved.

      Not all companies take such issues so seriously, of course, and the threat of public disclosure may spur them into action. We try to reassure the reporter that things are being taken seriously through dialogue in part to prevent premature disclosure.

    • Anonymous Coward, 9 Nov 2011 @ 8:14am

      Re: Disappointed

      If a company provides a 'faulty' product to their customers, how long should they have to 'fix' the situation and remedy the 'faulty' product before someone else comes along and exploits the flaw causing untold havoc to their customers?

      Are you seriously saying that companies should have 3 months to fix 'zero-day' exploits, and that the customers 'exploited' during those 3 months should have no recourse?

      If you go to the store and buy a product, get home and it doesn't work, would you allow the store 3 months, with 1 month between each e-mail before they provide you with a working product?

  • DannyB (profile), 8 Nov 2011 @ 8:34am

    This is exactly how Microsoft once responded

    And other vendors also.

    The result was that the policy of those finding vulnerabilities became to publicly disclose the vulnerability immediately.

    But then vendors cried that it was irresponsible to publicly disclose these vulnerabilities before they could be patched.

    So then the policy became to secretly disclose the vulnerability only to the vendor so they could fix it.

    Then vendors just sat on those reports and did not fix the vulnerabilities.

    So then a policy became to disclose to the vendor, and then after a reasonable period, regardless of vendor (in)action, disclose to the public.

    And yes, in the last three decades, there have been others who have been punished for trying to help a vendor fix a security problem in their software.

    The moral of the story: disclose anonymously to the vendor. Later, disclose anonymously to the public.

    Some vendors figured out to PAY bounties to those who discover vulnerabilities instead of punishing them.

    • Anonymous Coward, 8 Nov 2011 @ 8:47am

      Re: This is exactly how Microsoft once responded

      The only policy that works is immediate full public disclosure. Microsoft, Apple, et al. have BILLIONS of dollars and should be paying highly qualified senior engineers to be on standby 24x7 for just such events -- and therefore should not whine about being blindsided. (Want fewer disclosures? Fix your damn code.)

      There's no reason at all to give these companies the benefit of free professional consulting. They've conclusively proven that they will (a) ignore (b) deny (c) threaten (d) punish -- so why bother with them? Just publish on the full-disclosure list or elsewhere, and let them reap the whirlwind.

      • Anonymous Coward, 8 Nov 2011 @ 9:18am

        Re: Re: This is exactly how Microsoft once responded

        Public disclosure is a last resort for a reason. I get that you are trying to "stick it to the man", but the fallout from the harm you cause others makes you the villain.

        Can you cite incidents where Apple has ignored, denied, threatened or punished someone for reporting vulnerabilities through the proper channels? Answer: NO, because there are no incidents like that.

        • Anonymous Coward, 8 Nov 2011 @ 9:29am

          Re: Re: Re: This is exactly how Microsoft once responded

          but the fallout from the harm you cause others makes you the villain.

          That's obvious nonsense, of course. Merely reporting facts accurately makes NOBODY the villain. Clearly, those to blame for any adverse consequences are (a) those who take malicious action based on the facts and (b) those whose incompetence, laziness, and stupidity are directly responsible for the situation.

          Can you cite incidents where Apple has ignored, denied, threatened or punished someone for reporting vulnerabilities through the proper channels?

          If you can't, then you're clearly far too uninformed to be worthy of participation in this discussion. Please run along and provide yourself with the appropriate remedial education before continuing.

          • S, 8 Nov 2011 @ 2:05pm

            Re: Re: Re: Re: This is exactly how Microsoft once responded

            Remedial education doesn't fix remedial people.

            • nasch (profile), 8 Nov 2011 @ 6:41pm

              Re: Re: Re: Re: Re: This is exactly how Microsoft once responded

              Remedial education doesn't fix remedial people.

              People aren't remedial. Remedial means "intended to remedy, as in a shortcoming".

      • DannyB (profile), 8 Nov 2011 @ 9:50am

        Re: Re: This is exactly how Microsoft once responded

        > The only policy that works is immediate full public disclosure.

        That is an irresponsible policy.

        A more responsible one is to disclose to the vendor and then disclose publicly after a short window for the vendor to fix it. It is not a matter of free consulting. (In fact, I like that Google has paid bounties to people who find these kinds of things.) It is a matter of social responsibility.

        If you disclose to the public with no prior notice to the vendor, then how is this getting you any monetary gain? It's not. So I don't see your argument about "free consulting". Yet immediate disclosure is clearly less socially responsible because it gives the bad guys an opportunity to exploit it.

        As you say, the vendor has billions of dollars. A short window to fix it should be plenty -- regardless of the severity of the problem -- because the vendor has vast resources to get it fixed quickly. Assuming the vendor feels a responsibility to fix it at all.

        Whether or not the vendor fixes it in a short time, you should publish your findings.

        • Anonymous American, 8 Nov 2011 @ 10:21am

          Re: Re: Re: This is exactly how Microsoft once responded

          ...immediate disclosure is clearly less socially responsible because it gives the bad guys an opportunity to exploit it.

          Doesn't this presume they're not already exploiting it?

          • The Groove Tiger (profile), 8 Nov 2011 @ 11:23am

            Re: Re: Re: Re: This is exactly how Microsoft once responded

            Since it's not public, the only ones exploiting it are those who have found it first-hand.

            If a larger group were exploiting it, it would have become public already.

            Complete public disclosure enables anybody with skill and malicious intent to exploit it.

            So, I have to agree that the company should be given a warning shot "fix this or else". Companies will eventually catch on and try to avoid the "or else".

            • Rich Kulawiec, 8 Nov 2011 @ 12:50pm

              Re: Re: Re: Re: Re: This is exactly how Microsoft once responded

              Since it's not public, the only ones exploiting it are those who have found it first-hand.

              Wrong.

              There are some very healthy marketplaces where vulnerabilities are bought and sold and traded. They're not public -- well, not very public. But they exist, and it's quite routine for security holes and sample exploit code to be sold -- either once, to the highest bidder, or repeatedly, to anyone with sufficient funds.

              And because these known marketplaces exist, we must posit the existence of others that may be entirely unknown except to the very few who frequent them. These no doubt cater to exclusive clients who have demonstrated a willingness to pay handsomely.

              Of course any of the buyers in either kind of marketplace could then re-sell what they have -- or use it themselves. Or both.

        • Rich Kulawiec, 8 Nov 2011 @ 11:19am

          Re: Re: Re: This is exactly how Microsoft once responded

          You are making two very foolish assumptions.

          First, you're presuming that the bad guys don't already have it. In all likelihood, they do. And it's further likely that they've had it for a while. Keep in mind that the bad guys have long since demonstrated superior intelligence, ingenuity and persistence when compared to the software engineers at places like Microsoft and Oracle and Apple. Their livelihood depends on it. Given this, failure to disclose publicly immediately provides considerable aid to the bad guys, who can continue exploiting security holes against victims who don't even know they're victims and thus are unlikely to take steps to defend themselves.

          Second, you're presuming that "disclosing to the vendor" is not the same as "disclosing to the bad guys". But of course in many cases it is, due to pitifully poor vendor security -- or the simple expedient of having a well-paid informant on the inside. (If I were a blackhat, I'd certainly cultivate contacts at major software vendors. I'm sure they have staff that are underpaid and/or disgruntled, and thus readily susceptible to bribes. Or blackmail.) So if they didn't already know...they'll know now. And once again, victims will be the only ones left in the dark.

          The only way to level the playing field is to fully disclose everything immediately and publicly. It's unlikely to be news to the more talented bad guys and it will at least inform victims and potential victims what they're up against. Not doing so is a major win for blackhats.

      • Anonymous Coward, 8 Nov 2011 @ 10:44am

        Re: Re: This is exactly how Microsoft once responded

        Want fewer disclosures? Fix your damn code.

        Spoken like someone who has never written any code in his/her life.

        And let me preempt any "I'm a full time programmer" response by saying, "No, you are not, or you would have never made such a stupid uneducated statement."

        • Anonymous Coward, 8 Nov 2011 @ 11:51am

          Re: Re: Re: This is exactly how Microsoft once responded

          I've written code for decades. If you run any modern BSD or Linux variant, you're running my code. (It might still be in Solaris, as well.)

          No, I'm not a full-time programmer, although I have been. What I am is paranoid, careful, meticulous, exacting, and thorough. And I expect no less of others -- but clearly, the overwhelming majority of people who program are thoroughly incompetent.

          • S, 8 Nov 2011 @ 2:09pm

            Re: Re: Re: Re: This is exactly how Microsoft once responded

            Dude, you're tilting at windmills; the idea of "do it right or don't do it" is so fucking outmoded it's sickening -- but you have my sympathy.

            (By which, I mean that it's sickening that such an idea became outmoded, because in practice it saves enormous amounts of money and immediately and permanently negates the present dilution of the programmer's trade.)

          • Anonymous Coward, 9 Nov 2011 @ 8:25am

            Re: Re: Re: Re: This is exactly how Microsoft once responded

            What I am is paranoid, careful, meticulous, exacting, and thorough.

            Which still doesn't guarantee bug free code.

            Let's not get confused: I'm not advocating that customers should be used as your first form of testing, far from it. The code should be reviewed, unit tested, regression tested, black-box tested, and then go through multiple beta phases.

            But the flippant notion that you can write 100% bug free code the first time every time is asinine and only a person with an astounding lack of intelligence would make such an assertion.

        • jakn, 8 Nov 2011 @ 1:31pm

          Re: Re: Re: This is exactly how Microsoft once responded

          Spoken like someone who has never written any code in his/her life.


          Spoken like someone who doesn't write good code.

          • Anonymous Coward, 9 Nov 2011 @ 8:19am

            Re: Re: Re: Re: This is exactly how Microsoft once responded

            Really? You write 100% bug-free code the first time, every time?

            I'll take that bet: I'll pay you $100,000,000 if you've written code for more than 2 years and managed to produce a product as complicated as an operating system which has zero bugs ... yeah, that's what I thought.

  • Anonymous Coward, 8 Nov 2011 @ 8:49am

    The true lesson to take from this is: disclose it in public anonymously. Companies will punish you either way, but at least the public gets to know what they need to stay away from.

    C'mon, if this were a water poisoning, would anybody accept that the problem be treated secretly and be disclosed only after the whole mess was cleaned up?

    • DannyB (profile), 8 Nov 2011 @ 9:54am

      Re:

      This is not like water poisoning.

      Poison water hurts people immediately whether disclosed or not.

      An un-exploited vulnerability hurts no one as long as it remains un-exploited by the bad guys.

      Giving the vendor a chance to fix it is likely to prevent harm. (IF the vendor fixes it, and IF no bad guys find it in the meantime.)


      Your analogy is wrong. Water poisoning is like the bad guy exploiting the vulnerability. That is the actual poisoning. The vulnerability is a way in which poison can be introduced. The knowledge of the vulnerability should be disclosed to those who can fix the vulnerability and prevent poison from being introduced into the water system.

      • Rich Kulawiec, 8 Nov 2011 @ 12:02pm

        Re: Re:

        An un-exploited vulnerability hurts no one as long as it remains un-exploited by the bad guys.

        This is seriously wrong.

        First, it presumes a nonsensical situation: that a vulnerability will remain unexploited. As everyone with any security clue at all knows, that's not going to happen. Now...it may be exploited poorly, or ineffectively, but it is absolutely inevitable it will be exploited.

        Second, existence of vulnerability X is often indicative of vulnerabilities Y and Z. And vice-versa. In other words, it is foolish to presume that a vulnerability known to the vendor is the ONLY one. It's equally foolish to presume that a vulnerability known to the bad guys is the ONLY one. Bad code usually doesn't have just one bug.

        Third, vulnerabilities are there for anyone with sufficient knowledge and determination to find. But they're also there for anyone with sufficient funds (to purchase) or guile (to steal). (Yes, I know that "steal" isn't quite the right verb, but a better one doesn't suggest itself immediately.) It is one of the massive, arrogant conceits of those who oppose immediate full public disclosure that they actually believe these things can be kept secret.

        They can't.

        They won't.

        The biggest beneficiaries of any attempt to conceal any part of this process are the very people it purports to defend against. Secrecy serves their purposes well.

  • Anonymous Coward, 8 Nov 2011 @ 8:50am

    And some people are against Hacktivism, why?

  • Anonymous Coward, 8 Nov 2011 @ 8:56am

    Regardless of his intentions, his actions violated the terms of the agreement. As a developer on this platform I know the terms of the agreement, and they clearly indicate that you cannot publish code that runs other code. Also, another term of the agreement is that the app cannot hide its intended purpose; in other words, you can't release a trojan horse app. He should have informed Apple that the vulnerability was there and been done with it. Instead he published the app, violating the terms of the agreement. He got what he deserved.

    • bjupton (profile), 8 Nov 2011 @ 9:09am

      Re:

      This blind reading of contract language without any attempt to look at motivation or spirit of the law is so disheartening.

      • Anonymous Coward, 8 Nov 2011 @ 9:13am

        Re: Re:

        The flagrant disregard of mutual agreements has started more than one war. There is an open channel of communication to Apple to report vulnerabilities; he wanted to showboat by creating a video, so he published his backdoor app to their store. Who is to say that he wouldn't do it again? The vulnerability will be corrected, but he should have reported it to Apple without publishing the app.

        • bjupton (profile), 8 Nov 2011 @ 9:24am

          Re: Re: Re:

          The vulnerability now will be corrected, since it was publicly pointed out. Apple won't be able to just ignore it.

          Shoot the messenger.

          • Evil Closet Monkey (profile), 8 Nov 2011 @ 9:43am

            Re: Re: Re: Re:

            > Shoot the messenger.

            Unfortunately, he wasn't a messenger.

            Miller did not inform Apple of the bug beforehand. He did not provide Apple an opportunity to acknowledge and fix the bug.

            He knowingly uploaded an app that violated the terms and introduced a vulnerability. He then, over a month later, publicly announced that the bug existed and that the app to prove it was already in the App Store.

            • DCX2, 8 Nov 2011 @ 9:45am

              Re: Re: Re: Re: Re:

              The article says he did report the vuln to Apple.

              Can you cite any evidence to show when the app was uploaded and when Apple was notified?

              Also, how can you prove that a vuln exists without a proof-of-concept exploit? Is there any way to confirm the vuln without uploading something to the app store?

              • Evil Closet Monkey (profile), 8 Nov 2011 @ 10:47am

                Re: Re: Re: Re: Re: Re:

                > The article says he did report the vuln to Apple.

                True. After he had already created and gotten an app approved into the App Store. So the vulnerability was public.

                > Can you cite any evidence to show when the app was
                > uploaded and when Apple was notified?

                From the original Forbes article:
                "Miller had, admittedly, created a proof-of-concept application to demonstrate his security exploit, and even gotten Apple to approve it for distribution in Apple’s App Store by hiding it inside a fake stock ticker program..."

                > Also, how can you prove that a vuln exists without a
                > proof-of-concept exploit? Is there any way to confirm the
                > vuln without uploading something to the app store?

                It depends on what you claim the ultimate vulnerability is. This is being discussed in greater detail in other threads where people are ripping me apart.

                There are two vulnerabilities here:
                1) The ability to actually run untrusted code
                2) The ability to get an app approved in the App Store that exploits that.

                In answer to your question for those two points:
                1) Yes. This can be shown outside the App Store.
                2) No, obviously not, since getting the app approved is the end-goal.

                Had he made #1 public first, then demonstrated #2 after giving Apple a chance to respond, this would absolutely be all about Apple covering something up.

                • khory (profile), 8 Nov 2011 @ 1:27pm

                  Re: Re: Re: Re: Re: Re: Re:

                  He could have gotten approval, but he did not have to publish the app and make it available to the public. Even if he did publish, he could have pulled it immediately after. Why leave it up? Maybe he thought it would be good PR when he went public.

                  He made malicious code available to the public. This puts it in the hands of others who would use it for evil. And now they know to look for it. Hopefully the exploit is patched before that happens.

                  Anyone know if he tested his proof of concept by running code on someone's device? That would be a serious issue whether the code was malicious or not.

        • bjupton (profile), 8 Nov 2011 @ 10:24am

          Re: Re: Re:

          I realize this is a sidebar.

          You said: "The flagrant disregard of mutual agreements has started more than one war."

          It is true that this is often seen as the flash point. But really, it is often the one sided nature of a 'negotiating process' that sets up the later conflict. A signed contract from the weaker party makes it a lot easier to invade. Even better if the contract that was signed had impossible to meet terms.

      • Anonymous Coward, 8 Nov 2011 @ 11:04am

        Re: Re:

        This blind reading of contract language without any attempt to look at motivation or spirit of the law is so disheartening.

        I've been increasingly disturbed by the courts' desire to hold contract law above all other laws. There are still some rights you cannot contract away, but they seem to be diminishing. I understand why contracts have gotten where they are today, I just don't understand how this is sustainable. Most individuals are party to dozens if not hundreds of contracts at any given time, and it is ridiculous to think that people can truly understand and agree to that many contracts (home (probably 5 - 10 contracts alone), phone, cable, car, software, etc.)

  • Jeff, 8 Nov 2011 @ 9:32am

    I have said this crap before...

    that Apple doesn't care about its security. They like to boast that they are more secure... but they aren't.

    Also, why is the video unlisted? Make it public so Apple can find out why it's a bad idea to punish developers.

  • Steve, 8 Nov 2011 @ 9:57am

    Yet another example of Apple's poor security model

    This just further publicizes what security researchers have known for years. The PR projection that Apple products are inherently more secure is a farce. They are just as insecure as anything else out there, but much more dangerous because their loyal fan base is absolutely convinced that Apple products have superior security.

    • Anonymous Coward, 9 Nov 2011 @ 2:23pm

      Re: Yet another example of Apple's poor security model

      Take that sentence. Replace Apple with Google. You're still 100% right.

      Then do the same exercise with M$. Right again. Fun, no?

  • Troy Rosman, 8 Nov 2011 @ 10:05am

    Are you all really that sure of a reported timeline?

    So, how do any of you actually know what the timeline was? Really? Who do you trust from an informational standpoint? Is one source really more credible than another? Whose server logs do you trust to NOT have been tampered with? Whose email "timestamp" do you really believe?

    Does it really matter regardless?

    The basic fact that Miller was able to get an application approved and in the store that leveraged a very simple, and in hindsight obvious, hack in iOS is what should actually be setting off the alarms for everyone. I see this as the smallest tip of the iceberg starting to show above the water for Apple. From a practical standpoint they'd be better off slapping Miller's wrist publicly, then thanking him with an incentive behind the scenes. Perhaps even adding an associated non-disclosure and prioritizing any bug notices he sends them from now on. Technically what he did was in fact against the ToS, but the point is that ill intentions are NOT going to be blocked by a ToS! Apple really needs to step up their own security and responses or things are going to get ugly for them really fast.

    • bjupton (profile), 8 Nov 2011 @ 10:26am

      Re: Are you all really that sure of a reported timeline?

      "Technically what he did was in fact against the ToS but the point is that ill intentions are NOT going to be blocked by a ToS!"

      Well said, thank you

  • darren, 8 Nov 2011 @ 10:17am

    people still use apple?

  • Aaron, 8 Nov 2011 @ 10:57am

    Two things

    In Apple's defence, they can't take the risk that he may have (though unlikely) been exploiting the security hole he exposed. His point that no one would take the problem seriously unless there was an app in the App Store is valid, but from the company's perspective, the first and fastest way to prevent more users from being infected by his app is to cut off the source. Hence, they killed his developer account so he can't upload any more apps that may flout this security hole.

    If Apple did not kill his account and allowed him to upload more apps that did exploit other security holes to harass iPhone users, Apple could be liable because they knew he was someone who had published a malicious app in the past and took no action to prevent it in the future.

    Perhaps they could have just flagged the account so that all apps submitted by him are rejected until the flaw is fixed, but I doubt their system is that sophisticated.

    If Apple stops there and does nothing else, shame on them. However, I suspect they will patch this in an update soon.

  • MAC, 8 Nov 2011 @ 11:06am

    He broke the App store agreement...

    Why are so many of you brain dead?
    Or is it just that you want someone to jump on, especially if they are a big company with lots of fanboys/girls.

    He broke the agreement he signed with Apple when he uploaded a compromised application to the App Store. Period.

    Apple was well within its rights to retaliate for this action.

    It has nothing to do with whether it's right or wrong, or that it made his life harder.

    He broke a contract that he -- let me emphasize this -- signed. Period. There is no other issue involved.

    • Anonymous Coward, 9 Nov 2011 @ 9:12am

      Re: He broke the App store agreement...

      I know, right? These TD fanboys and anti-Apple fanboys need to learn the definition of contract. Sheesh. Blame Apple all you want, the guy messed up. Badly.

      Sure, their response might not have been appropriate, but it's not like the guy's was... "Ignore me and I'll post hacks disguised as apps, then complain when I get caught". Awesome. And you guys all bit and aren't letting go.

  • Anonymous Coward, 8 Nov 2011 @ 11:19am

    Something similar to this happened a few years ago. Two people found a MAJOR vulnerability in Mac OS, told Apple about it, then two months later went public with it. In response, if I remember correctly, Apple sued them, released a press release that said there were no issues, then 2-3 days later released a mandatory 130 MB security update for their system.

  • Anonymous Coward, 8 Nov 2011 @ 3:40pm

    psssh

    Next time just go public on day 1 with massive security holes, sit back, and laugh as Apple goes through its Sony Phase and gets its systems utterly violated by a wide variety of interested parties......

  • That Anonymous Coward (profile), 8 Nov 2011 @ 4:32pm

    Given Apple's history of handling issues with their systems, this does not surprise me. There are tons of charges being made to people's iTunes accounts that they did not make. Many often point to one specific program, but Apple refuses comment to the people who have had their money taken. Apple refuses to respond to the large number of annoyed customers, and when they do it is condescending, claiming that nothing is wrong with iTunes. Some people get their money back, some get shafted. If it were a few accounts I would blame the user, but when large numbers of your users are being hit with identical issues over and over... there is a flaw in your system.

    Apple is like the Pope if you think about it sideways.
    Apple tells you everything Apple is magical and just works, to admit a flaw is anathema to them.
    The Pope is the word of God, everything he says is true and just so. It is impossible for the Pope to admit a mistake because of Papal infallibility.

    Apple is so focused and concerned about their image that they are alienating consumers. The sad thing is these consumers stick with Apple, even when they are forced to bear the burden of Apple's failure.

    If you shoot the good guy who comes and tells you you're flawed, do not then cry when they stop showing up. Do not complain when your myopic staff cannot see the flaws and less-than-good people then exploit them for profit.

    What really needs to happen is for a system to be put in place where exploits can be submitted, verified, and sent to the affected company with a deadline: fixed or not, this goes public in X days. They can hate on the faceless group then, rather than target the person who found the bug.
    Some flaws should not be suppressed, and it is far cheaper and easier for a corporation to launch lawsuits to crush the finder of the bug than to fix the flaw. Then, when someone else finds and exploits that flaw, they can point at the original reporter to shift the blame: didn't warn us, didn't give us enough time, released it to hurt us...

    So what exactly is the benefit of looking for flaws if you're not out to exploit them? You get treated like crap, sued, and painted as evil. Are we surprised some people just skip to being evil? How many times can you kick them before they kick back?

  • John Cipolletti, 8 Nov 2011 @ 5:38pm

    Apple Negatively Attacks A Developer

    I am a pioneer software maker from the 1980s. One of my biggest problems with developing programs was Apple. They tried to control what I made for their machine. Never had a problem with any other company and their computer. I saw my first cease and desist letter from Apple. I told them that they were not the software police and I would fight them in court. Well, years later, things haven't changed. However, now Apple's huge size may finally make them the software police or at least a threatening bully!

  • Upgrade Software, 9 Nov 2011 @ 12:52am

    Off Course Apple

    Is this another sign that Apple are slowly veering off course without Captain Jobs at the helm?
