Evil Closet Monkey’s Techdirt Profile


About Evil Closet Monkey




Evil Closet Monkey’s Comments

  • Nov 8th, 2011 @ 1:35pm

    Re: Re: Re: Re: False Blame

    Your analysis of the statement is incorrect. I am saying that decompiling the app is not at all a necessary step, because it is not. But the likelihood of someone finding that particular app and discovering the exploit is irrelevant to the argument.

    > (As for the rest of your argument, I really can't make any
    > sense of it. You seem to be agreeing with me, then claiming
    > victory.)

    If you are referring to the "actions may be defensible legally" aspect of the conversation, then yes - I am agreeing with you. I hadn't claimed (or meant to claim) anything to the contrary.

    The simple fact is that Miller broke the agreement he signed with Apple when he uploaded a compromised application to the App Store. As a result, his license was suspended per that agreement.

    Others have pointed this out from slightly different angles...

    See John Fenderson's post in this thread, or the "Two things" or "He broke the App store agreement..." threads below.
  • Nov 8th, 2011 @ 10:53am

    Re: Re: Re: False Blame

    John, you rock.

    My lack of Internet commenting skill is clearly showing, because your post summarizes exactly what I haven't been able to express in umpteen replies.
  • Nov 8th, 2011 @ 10:47am

    Re: Re: Re: Re: Re: Re:

    > The article says he did report the vuln to Apple.

    True. But only after he had already created an app and gotten it approved for the App Store, so the vulnerability was already public.

    > Can you cite any evidence to show when the app was
    > uploaded and when Apple was notified?

    From the original Forbes article:
    "Miller had, admittedly, created a proof-of-concept application to demonstrate his security exploit, and even gotten Apple to approve it for distribution in Apple’s App Store by hiding it inside a fake stock ticker program..."

    > Also, how can you prove that a vuln exists without a
    > proof-of-concept exploit? Is there any way to confirm the
    > vuln without uploading something to the app store?

    It depends on what you claim the ultimate vulnerability is. This is being discussed in greater detail in other threads where people are ripping me apart.

    There are two vulnerabilities here:
    1) The ability to actually run untrusted code.
    2) The ability to get an app that exploits it approved for the App Store.

    In answer to your question for those two points:
    1) Yes. This can be shown outside the App Store (see the rough sketch below).
    2) No, obviously not, since getting the app approved is the end goal.
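
    To make point #1 concrete, here is a rough sketch of what "running untrusted code" means in general. To be clear, this is my own illustration and not Miller's actual code: it is plain C written for an ordinary x86-64 Linux desktop rather than iOS, and the payload bytes, the mmap() flags, and the rest are just there to show the pattern of taking bytes that were never part of the reviewed binary, putting them into an executable page, and jumping to them. On iOS, code signing is supposed to make this kind of writable-and-executable mapping fail for a third-party app; the flaw, as reported at the time, was that an app could get around that restriction and run downloaded, unsigned code anyway.

        /* Illustration only: execute a handful of bytes that were not part
         * of the compiled program.  Build on Linux/x86-64 with:
         *   cc -o untrusted untrusted.c
         */
        #include <stdio.h>
        #include <string.h>
        #include <sys/mman.h>

        int main(void) {
            /* Stand-in "downloaded" payload: x86-64 machine code for
             * "mov eax, 42; ret".  In the real scenario these bytes would
             * arrive over the network after the app was installed. */
            unsigned char payload[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3 };

            /* Ask for a page that is both writable and executable.  This is
             * exactly the kind of mapping iOS code signing is meant to deny
             * to third-party apps. */
            void *page = mmap(NULL, sizeof(payload),
                              PROT_READ | PROT_WRITE | PROT_EXEC,
                              MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
            if (page == MAP_FAILED) {
                perror("mmap");   /* the failure the platform is supposed to guarantee */
                return 1;
            }

            memcpy(page, payload, sizeof(payload));
            int (*fn)(void) = (int (*)(void))page;   /* jump into the copied bytes */
            printf("untrusted code returned %d\n", fn());

            munmap(page, sizeof(payload));
            return 0;
        }

    Nothing in that sketch needs the App Store, which is the point of #1; #2 is about whether Apple's review catches an app that does something like it.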

    Had he made #1 public first, then demonstrated #2 after giving Apple a chance to respond, this would absolutely be all about Apple covering something up.
  • Nov 8th, 2011 @ 10:31am

    Re: Re: False Blame

    True. It is nice when people see the different sides of an argument and analyze it critically... instead of just jumping on me. :)

    Part of what he was testing was the approval process, in that "trusted" apps could do untrusted things. DannyB points out the same notion above, which I also commented on.

    But demonstrating how to run untrusted code from within an app did not require the app to be in the App Store. Demonstrating the ability to get such an app approved did, yes.

    But demonstrating the process outside the App Store would not have required a significant mental leap for anyone in the development community to see the potential.

    Had he demonstrated this on an app outside the App Store first, then waited (how long is a different conversation) before testing the app approval process, this would absolutely be a different story.
  • Nov 8th, 2011 @ 10:18am

    Re: Re: Re: Re: Re: False Blame

    The app had been in the App Store since September.
  • Nov 8th, 2011 @ 10:15am

    Re: Re: False Blame

    > Without doing so, it may be impossible to confirm and
    > demonstrate the bug.

    In part, I agree. To expand on something you mentioned earlier in your post...

    > The ultimate demonstration is having a signed, trusted
    > app do something untrusted.

    I would expand on that and add that getting such an app into the App Store (i.e., getting it approved by Apple itself) is the ultimate demonstration. You may have been implying this already by "signed, trusted app".

    To show that a signed app, running in the iOS sandbox, could do something untrusted does not require its presence in the App Store. You can demonstrate this on a development platform.

    It becomes tricky when you add "trusted" into it -- as you point out. At what point is it "trusted"? When Apple approves it for the App Store?

    I'll say "yes" to the above. The app has passed the final review, so it can't be any more trusted then that.

    The bump in the road here is that the exploit was not previously disclosed. The first step is to reveal the vulnerability and give the appropriate company (Apple, in this case) time to fix it (how much time is a different discussion).

    If the exploit had been disclosed, it would have given Apple the opportunity to (1) fix it, or at least (2) watch for it and deny that "trusted" status so it never made it to the App Store.

    But, okay, people don't agree with that sentiment. We'll switch up the argument -- say that creating the app and getting it onto the App Store to demonstrate the full vulnerability was the right course.

    Miller should have, or could have: (1) pointed out the issue to Apple privately, (2) publicly exposed the issue and removed the app, or (3) discreetly waited until the security conference to expose it. Unfortunately, he made it public on Twitter well in advance.

    I *don't* believe he meant this to be malicious in any way. But he isn't just a "messenger".
  • Nov 8th, 2011 @ 9:43am

    Re: Re: Re: Re:

    > Shoot the messenger.

    Unfortunately, he wasn't a messenger.

    Miller did not inform Apple of the bug beforehand. He did not give Apple an opportunity to acknowledge and fix the bug.

    He knowingly uploaded an app that violated the terms and introduced a vulnerability. Then, over a month later, he made a public announcement that the bug existed and that the app to prove it was already in the App Store.
  • Nov 8th, 2011 @ 9:32am

    Re: Re: Re: False Blame

    Your timeline of what happened is incorrect, Mike.

    The author did not report the bug and wait. He didn't tell Apple he had found an exploit and had an app waiting to demonstrate it. He didn't upload the app to the App Store after waiting a week, a month... or however long.

    The reality is Miller found an exploit, created an app, and uploaded it to the App Store. After that, he made public that he had found an exploit and demonstrated it using the publicly available app (which had been there for over a month).

    There was no bug reported. There was no opportunity to force the corporate hand. What you don't do is exploit that bug in the working sandbox first.

    Did we not read the last paragraph in my original post? The part that points out that one *SHOULD* report bugs and create something that demonstrates the exploit?
  • Nov 8th, 2011 @ 9:19am

    Re: Re: False Blame

    Nice use of the ellipsis -- it's like a movie quote! Looking beyond the omission of "although unlikely" from the original sentence, your analysis of the level of effort needed to discover such an item is overcomplicated.

    Understanding the nature of the vulnerability (in hindsight) illustrates that it could have been discovered by very simple means, however unlikely it is that someone would pick this particular app.

    The discussions over at Gizmodo on this story are at least somewhat insightful from a technical point of view, instead of people just running at it from a "big company keeping down the little guy" point of view.

    Apple has a sandbox called the "App Store". They've made rules for playing in the sandbox, like "don't piss in the sandbox". If you piss in the sandbox, you're not allowed to play in the sandbox for a little while.

    > Apple's actions may be defensible legally, but only
    > legally.

    You're stretching the notion of "legally" a bit there. We're talking about a private company here. If Apple wanted to refuse an app because they didn't like the color scheme, they could. They don't have to have a reason to reject an app from the store -- they just can, because they want to.

    This isn't a moral question. There are rules set up for developers who wish to participate in the App Store. Miller *BROKE* those rules!

    Should Apple be working their butts off to fix this? Yes.

    Should they have fixed it sooner? Dunno. Maybe they've been trying since it was discovered. Maybe they've been busy playing table tennis instead.

    Could Apple have looked the other way? Could they have punished him a little less? Could they still reverse or revise the decision? Yes. Yes. Yes.

    Do they need to? No. A developer knowingly introduced an app into the store that exploited a security flaw, and was punished according to the agreement he signed.
