Smart vs. Globe: It's about standards, silly

26 Sep 2012

Last week, the National Telecommunications Commission (NTC) released the results of its Quality of Service (QoS) benchmarking tests for the second quarter of this year.

The purpose of the testing was to measure the telcos’ cellular network performance based on “existing NTC prescribed minimum service performance standards.” More importantly, though, this yardstick can be used to inform consumers and assure them that the telcos are providing services at a certain level of quality.

In fact, QoS tests can be used as the basis for Service Level Agreements (SLAs) between the telcos and their customers.

The results, announced through a press release, revealed that Smart prepaid performed better in four out of the five metrics (dropped call rate, average receive signal level, average signal quality, and call set-up time). Globe prepaid had an edge in one metric (blocked call or grade of service).
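Most of these metrics are simple ratios checked against a ceiling. As a rough sketch only (the thresholds and sample figures below are hypothetical, not the NTC's actual minimums or test results), the pass/fail logic looks something like this:

```python
# Illustrative only: the thresholds and sample counts are hypothetical,
# not the NTC's actual prescribed minimums or measured results.

def grade_of_service(blocked_calls: int, call_attempts: int) -> float:
    """Fraction of call attempts that were blocked (never set up)."""
    return blocked_calls / call_attempts

def dropped_call_rate(dropped_calls: int, established_calls: int) -> float:
    """Fraction of successfully set-up calls that were later dropped."""
    return dropped_calls / established_calls

def meets_standard(measured: float, maximum_allowed: float) -> bool:
    """A telco passes a metric when its rate is at or below the ceiling."""
    return measured <= maximum_allowed

# Hypothetical sample: 120 blocked calls out of 1,000 attempts,
# checked against a made-up 4% ceiling.
gos = grade_of_service(blocked_calls=120, call_attempts=1000)
print(meets_standard(gos, maximum_allowed=0.04))  # a 12% blocking rate fails a 4% ceiling
```

The point of framing it this way is that each telco is graded against the fixed ceiling, not against the other telco, which is the distinction the rest of this piece argues for.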

Like grade-conscious students vying for the top post in a class of two, Globe and Smart engaged in a word war. Globe asserted that its legacy network is better than Smart’s upgraded network. The dominant player, meanwhile, accused Globe of selectively reading the test results in its favor. Media reports were quick to frame the benchmarking test as a contest. They can’t be blamed, though. NTC’s own wording sounded like it was pitting the two telcos against each other. Netizens also aired their two cents: some agreed with the results, while others were surprised.

Amid the furor over which telco is better, attention has been deflected from the tester, the NTC. Truth be told, it is rare for the telecom regulator to cause this much public stir. It is rare for the public to pay any attention to it at all. The NTC should grab this gem of an opportunity and, hopefully, use it to advance better telco service and consumer education.

So for the next test results, the regulator might want to consider releasing a lengthier and more robust report instead of just a two-page press release. While the parameters of the QoS tests are pretty straightforward, the way they have been interpreted and reported leaves much to the imagination.

This problem is similar to the weather bureau’s reports back in the day on the amount of rainfall, which were dismissed as incomprehensible, like trigonometry. The numbers only became better understood after Typhoon Ondoy struck in 2009; Ondoy’s rainfall has since been used as the benchmark for flood warnings. As for transparency, perhaps the testing methodology and actual results could be published and made open for scrutiny and feedback.

Under the Public Telecommunications Policy Act, the regulator is tasked “to ensure the quality and reliability of telecommunications facilities and services in conformity with internationally acceptable standards and specifications.” The benchmarking test, however, should only be the first step toward this objective.

The NTC should also be able to require (not merely recommend) the telcos to address their shortcomings, particularly in blocked calls, an area in which both operators failed to meet the minimum standards. These tests, after all, are meant to keep telcos on their toes. The acceptable quality level should not be pegged to each telco’s current performance; what matters is how each telco measures up to the minimum QoS standards, not how the two compare against each other.
