Telcos Deny Trying To Turn FCC's Open Network Diagnostics Into A Closed, Proprietary Affair

from the well-of-course-they-are dept

The FCC has been working with M-Lab to measure basic network diagnostics using an open source solution, providing public information about internet network performance. This seems like a good thing... though you can see why not everyone would want data about the performance of their networks made public. Over the weekend, a warning went up that the telcos are pushing the FCC to stop using M-Lab and switch to their own ISP-managed diagnostics tools. Vint Cerf is raising the alarm about this:
Recently, the FCC measurement program has backed sharply away from their commitment to transparency, apparently at the bidding of the telcos in the program. The program is now proposing to replace the M-Lab platform with only ISP-managed servers. This effectively replaces transparency with a closed platform in which the ISPs -- whose performance this program purports to measure -- are in control of the measurements. This closed platform would provide the official US statistics on broadband performance. I view this as scientifically unacceptable.

For the health of the Internet, and for the future of credible data-based policy, the research community must push back against this move.
The FCC keeps insisting that it's committed to openness -- but all too frequently seems to give in to telco demands. So this warning is concerning.

For what it's worth, the telcos are claiming that Cerf is overreacting. Responding to his call for action, Verizon's David Young argued that there's nothing to see here, and that M-Lab and the telco efforts have co-existed and can continue to co-exist going forward.
Vint breathlessly suggests that the FCC is now backing away from this openness "at the bidding of the telcos" and claims the program is proposing to replace the M-Lab platform with only ISP-managed servers. THIS IS FALSE. ISPs have made no such request of the FCC nor has the FCC proposed to eliminate use of M-Lab’s servers.

What has been proposed is that, in addition to continuing to use the data collected via the M-Lab servers, the FCC and SamKnows may also rely on the ISP-provided servers that have been in use since the beginning of the project. These ISP-provided servers meet the specifications required by SamKnows, as do the M-Labs servers. In fact, it was only because of the presence of these non-M-Lab, ISP-donated servers that SamKnows was able to identify problems with an M-Lab server that was affecting the results of the tests being conducted. M-Labs did not identify this server problem on their own. It was only fixed when SamKnows brought the issue to their attention. By the way, this problem forced the FCC to abandon a month's worth of test data, extend the formal test period and delay production of their report. Later, another M-Lab server location had transit problems that again affected results. This was the second M-Labs-related server problem in two months and once again, it was SamKnows, using the ISP-provided servers as a reference, who identified the problem and brought it to M-Labs' attention.
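To make the cross-checking idea concrete: with two independent pools of test servers, a server whose results drift away from everyone else's stands out. Here's a toy sketch in Python (an illustration only, not SamKnows' actual method; the server names, numbers, and 30% threshold are all invented):

    import statistics

    # Hypothetical median download speeds (Mbps) observed per test server.
    # Server names and numbers are invented for illustration.
    medians = {
        "isp-east": 18.9,
        "isp-west": 19.2,
        "mlab-nyc": 9.4,   # the misbehaving server in this toy example
        "mlab-lax": 18.7,
    }

    pool = statistics.median(medians.values())
    for server, mbps in medians.items():
        # Flag any server more than 30% below the cross-pool median.
        if mbps < 0.7 * pool:
            print(f"{server}: {mbps} Mbps vs. pool median {pool} Mbps -- check it")

The value of having both pools is simply that neither side's servers are trusted as ground truth; each acts as a reference for the other.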
As with many such disputes, the reality may lie somewhere between the two claims. Cerf's fear seems to be that putting the telcos' servers on equal footing with M-Lab's open setup opens the door to replacing M-Lab's efforts entirely, and then potentially to locking up the data. Young is correct that the openness is mainly a matter of FCC policy at this point, but that policy depends on the FCC's current leadership, which could change. At the very least, it would be nice to see a stated commitment to keeping the information open on an ongoing basis, so that there's no need to worry going forward.


Filed Under: broadband, fcc, network diagnostics, open, proprietary, telcos, vint cerf
Companies: verizon


Reader Comments



  1. weneedhelp (profile), 18 Jul 2012 @ 1:33pm

    commitment to openness

    You mistook this commitment. The openness they meant is that their wallets will be open to the highest bidder.


  2. art guerrilla (profile), 18 Jul 2012 @ 1:39pm

    i am a sam knows participant...

    ...and while i appreciate the service they are doing, i am not 100% certain *their* measurements are either correct, or are not being spoofed by my ISP (who i despise, but I HAVE NO CHOICE)...
    to wit: starting just before xmas 2011, our 3 Mbps DSL was *almost* unusable (and *was* -in fact- unusable for -you know- crazy stuff like watching videos or listening to music online) for almost 6 freaking MONTHS...
    needless to say, calling our ISP resulted in nothing but lies and bullshit (and NOW they say we NEVER called during this 6 month period, the lying bastards!)...
    the monthly report they gave me during this time showed the EXTREME variable speed, but didn't reflect that we were getting 1/10th to 1/20th the speed during our 'normal' usage time : from after-work-o-clock, to midnight...
    sure, i bet if you measured at 3-4 in the morning, the speed was *somewhat* better; but for 90% of the time, IT SUCKED...
    in any event, either they are not measuring 'real' performance, are taking random samples which didn't reflect our crappy service, or the ISP was spoofing the connection, who knows...
    but -you know- putting the foxes in charge of the henhouse is always a good idea...
    art guerrilla
    aka ann archy
    eof


  3. Anonymous Coward, 18 Jul 2012 @ 1:59pm

    Re: commitment to openness

    So I can just take money from the FCC? I'd better get on that.


  4. Anonymous Coward, 18 Jul 2012 @ 2:00pm

    I'm surprised that the ISPs haven't learned anything from SOPA. It can be a great distraction.

    What Verizon's David Young should have said when confronted was "Look over there! They are trying to sneak in SOPA again!" While everyone turns to look, he should drop a smoke bomb and let out an evil chuckle while running all the way to the bank.

    =P


  5. Anonymous Coward, 18 Jul 2012 @ 2:02pm

    Where did Cerf get his information? Does he have any proof to show us? We can't just assume that the telcos are really doing anything.


  6. ECA (profile), 18 Jul 2012 @ 2:15pm

    Comments

    1. sign-in isn't working, not for me anyway..
    2. that funny bar on the bottom is stupid.

    Ok,
    For those that understand a few things about BENCHMARK programs: MANY corps have inserted their OWN code to bypass or MOD the program so it works BEST on their OWN CARDS.

    Then comes the idea of a CORP offering to let you USE a certain SPEED program to test their SITE..
    There are many things to SEE/TEST when you test a site and its connections:
    OS-LAG
    SITE-LAG
    How many JUMPS-LAG
    SYSTEM-LAG
    Even your video card can add LAG..as windows WAITS for your video to DO SOMETHING before it decides to keep connecting. (fun isn't this)
    LAG is a general term, and different programs TEST in different ways too. JUST testing from your NET card to another NET CARD is very quick. TESTING a transfer and render, and then RETURNING it, is more thorough, AND TESTS MORE THAN a ping from one machine to another (see the sketch below).
    I won't even get into TRAFFIC monitoring by certain GROUPS, which can also ADD to your lag times..

    For those of us OLDER than dirt, we remember some of the OLD programs that DID something in a straightforward fashion and gave us DETAILS and information that were TRUTHFUL, and that would tell us WHERE the problems were.
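    To make that last distinction concrete, a minimal Python sketch (a generic illustration against a placeholder host, not any official test): a bare connect probe measures roughly one round trip, while timing a full transfer also picks up server, transit, and local-stack lag.

        import socket
        import time

        HOST = "example.com"   # placeholder host, not a real measurement server
        PORT = 80

        # 1) Bare connect-time probe: roughly one network round trip.
        start = time.monotonic()
        sock = socket.create_connection((HOST, PORT), timeout=5)
        rtt_ms = (time.monotonic() - start) * 1000

        # 2) Full transfer: request a page and time the whole download,
        #    which also folds in server lag, transit lag, and local stack lag.
        sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        start = time.monotonic()
        received = 0
        while True:
            chunk = sock.recv(65536)
            if not chunk:
                break
            received += len(chunk)
        elapsed = time.monotonic() - start
        sock.close()

        print(f"connect RTT: {rtt_ms:.1f} ms")
        print(f"transfer: {received} bytes in {elapsed:.2f} s")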


  7. schulzrinne (profile), 18 Jul 2012 @ 2:16pm

    FCC take on story

    Yesterday, Vint Cerf distributed an open letter regarding concerns about the Measuring Broadband America measurement infrastructure. We share the objectives of the letter writers that “Open data and an independent, transparent measurement framework must be the cornerstones of any scientifically credible broadband Internet access measurement program.” Unfortunately, the letter claims: “Specifically, that the Federal Communications Commission (FCC) is considering a proposal to replace the Measurement Lab server infrastructure with closed infrastructure, run by the participating Internet service providers (ISPs) whose own speeds are being measured.” This is false.

    The FCC is not considering replacing the Measurement Labs infrastructure. As part of a consensus-based discussion in the Measurement Collaborative, a group of public interest, research and ISP representatives, we have discussed how to enhance the existing measurement infrastructure to ensure the validity of the measurement data. Any such enhancements would be implemented solely to provide additional resiliency for the measurement infrastructure, not to replace existing infrastructure. Any data gathered would be subject to the same standards of data access and openness.

    We look forward to continuing to work with all participants in a process that has provided American consumers and the research community with network performance data of unmatched scale and scientific rigor. We appreciate the contributions of all participants, in particular Measurement Labs, to this effort.

    Henning Schulzrinne
    CTO, FCC


  8. Anonymous Coward, 18 Jul 2012 @ 2:41pm

    Re: FCC take on story

    Read the CTO as GTFO and thought "My, what a rude way to end a letter," then learned to actually read what I'd read.


  9. Anonymous Coward, 18 Jul 2012 @ 4:15pm

    Re: Comments

    So what is the point here? That this is okay, or not?


  10. Jacob Blaustein, 18 Jul 2012 @ 4:16pm

    Re: FCC take on story

    So no panic?


  11. art guerrilla (profile), 18 Jul 2012 @ 6:50pm

    Re: Re: Comments

    not to speak for him, but i *think* his point is, cpu, gpu, other hardware and software companies have been gaming 'benchmarks' *for-freaking-ever*...
    it would hardly be surprising if ISPs rigged their benchmarks too...

    gpu manufs went (and prob still go) to EXTREME lengths to try and game the various popular graphics benchmarks...
    ...and it worked! they would beat the other guys by reverse-engineering the benchmark code, and figuring out how they could trick it, anticipate it, or otherwise game the testing software/hardware...
    the point being -made in the concurrent article about leahy's cameo, and the subsequent private showing that wasn't a gift 'cause they gamed it- *whatever* 'laws' (how quaint), 'rules', 'regulations', 'guidelines', 'by-laws', or other strictures we mere 99% *attempt* to emplace upon our betters, are ONLY worth the enforcement we can engender...

    if we can't enforce (even weak-tea laws), then laws are all but meaningless... in fact, *worse* than meaningless, because they offer the *appearance* of lawfulness, when there is none...

    harsh laws for us 99%, with draconian enforcement; and squishy, malleable, hardly-worth-mentioning 'laws' for the 1%, and those unenforced, at that !
    i am certain that is a sure-fire recipe for a stable society...

    art guerrilla
    aka ann archy
    eof


  12. Dave (profile), 19 Jul 2012 @ 5:44pm

    Network Testing

    IMHO, the only effective test for web performance would be a measurement made every five minutes for a period of one week, between two points on the network. This would be repeated for every major node, for each ISP, in every city in the US, on identical off-the-shelf equipment running identical open-source software (assuming multiple tests were run at the same time). Might get a little expensive for the tester, and take years, but we'd at least have valid data.
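    As a minimal sketch of that sampling regime (placeholder hostname; a real deployment would also measure throughput, not just connect time):

        import csv
        import socket
        import time
        from datetime import datetime, timezone

        TARGET = ("peer.example.net", 80)   # placeholder measurement peer
        INTERVAL = 5 * 60                   # five minutes, in seconds
        DURATION = 7 * 24 * 60 * 60         # one week, in seconds

        def probe(target, timeout=5.0):
            """Return TCP connect time in milliseconds, or None on failure."""
            start = time.monotonic()
            try:
                socket.create_connection(target, timeout=timeout).close()
            except OSError:
                return None
            return (time.monotonic() - start) * 1000

        # Probe the same two endpoints every five minutes for a week,
        # logging each sample for later analysis.
        deadline = time.monotonic() + DURATION
        with open("probes.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp_utc", "connect_ms"])
            while time.monotonic() < deadline:
                ms = probe(TARGET)
                writer.writerow([datetime.now(timezone.utc).isoformat(),
                                 "" if ms is None else f"{ms:.1f}"])
                f.flush()
                time.sleep(INTERVAL)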


