The iPhone 3G has brought Apple and AT&T truckloads of money in the last year. If you remember, after the iPhone 3G launched, AT&T’s network started imploding almost immediately under the “immense” load of all the new customers. People were complaining left and right about poor 3G coverage even when AT&T’s coverage map showed their area clearly within the boundaries. How did Apple (and most likely AT&T) remedy the situation? By rigging the iPhone to show inflated signal strength. One way to find out the REAL signal strength is to enter “field test mode”: open the keypad, type in *3001#12345#*, then press “dial”. The number that appears is the received signal strength in dBm. It’s a negative number, and the closer it is to zero, the stronger the signal. You can easily try it yourself if you don’t believe the pictures.
You’ll notice in the pictures directly below that, when on EDGE, the signal meter is pretty accurate. In an area where you have low coverage (read: one bar of EDGE), dial the field test number and you’ll likely see a dBm reading of -99 or worse.
Now, to truly show the difference: in an area where the meter shows a full five bars of 3G coverage, dial the number and (if you’re actually in a low coverage area) you may see a reading as low as -119 dBm (which is a god-awful signal, by the way), yet amazingly you still have a “perfect” signal on the meter.
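To put those two example readings in perspective, dBm is a logarithmic power unit referenced to one milliwatt, so each reading can be converted back to absolute received power. Here’s a minimal sketch (the -99 and -119 figures are just the example values from this post, not measurements of any particular phone):

```python
def dbm_to_mw(dbm: float) -> float:
    """Convert a dBm reading to absolute received power in milliwatts."""
    return 10 ** (dbm / 10)

edge_reading = -99   # the "one bar of EDGE" example reading above
umts_reading = -119  # the "five bars of 3G" example reading above

# A 20 dB gap corresponds to a 100x difference in received power,
# so -119 dBm is a far weaker signal than -99 dBm.
ratio = dbm_to_mw(edge_reading) / dbm_to_mw(umts_reading)
print(f"-99 dBm carries {ratio:.0f}x the power of -119 dBm")
```

In other words, the phone drawing five bars at -119 dBm is displaying a signal one hundred times weaker than the one it draws a single bar for on EDGE.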
It sickens me that instead of actually addressing the problem by, I don’t know, upgrading the network (AT&T) or putting a little function before design (Apple), they chose to slap a fake band-aid over the problem to appease the general public. If more people raised a stink about this, then maybe we could get a real fix, whether that’s pushing AT&T to upgrade its network or Apple to start designing a little better.
What are your thoughts? Shout out below!
October 7, 2009 at 5:55 am
Finally the truth uncovered. Perhaps the reason for the AT&T 5 bars ad campaign?
August 15, 2009 at 10:25 am
You should be aware of several issues not properly described in your posting. First, AT&T uses two different systems for the iPhone. The AT&T GSM (Global System for Mobile Communications) network provides voice communications, while a separate AT&T UMTS (Universal Mobile Telecommunications System) network provides the data connection. That is why you might be in a ‘5 bar’ area yet not have good data: it depends on how AT&T has deployed its network at particular cell sites (GSM only, or GSM and UMTS). So, when you say the “db number as high as 119 yet amazingly you still have a ‘perfect’ signal,” you have to mention that the 119 reading is for the UMTS system, not the GSM system. Also, signal strength is actually a negative number in units of dBm, so a “119 dB” reading is really -119 dBm, which is a very, very, very low signal; a reading of -79 dBm is a much greater signal strength. Every 3 dB increase in signal (e.g., -119 dBm rising to -116 dBm) doubles the received power. A 10 dB increase is a 10-times increase in signal strength, and a 20 dB increase is a 100-times increase. The math counts, so understanding decibels in the context of RF transmissions is very useful.
Regards,
Jonathan Kramer, Esq.
RF Engineer and Local Government Telecom Attorney
http://www.TelecomLawFirm.com
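[Editor’s note: the decibel arithmetic in the comment above is easy to verify numerically, since a change of x dB corresponds to a power ratio of 10^(x/10). A quick sketch; nothing here comes from the comment beyond the 3/10/20 dB figures:]

```python
def db_to_ratio(db: float) -> float:
    """Power ratio corresponding to a change of `db` decibels."""
    return 10 ** (db / 10)

print(db_to_ratio(3))   # ~1.995, i.e. roughly a doubling of power
print(db_to_ratio(10))  # 10.0, a tenfold increase
print(db_to_ratio(20))  # 100.0, a hundredfold increase
```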
August 16, 2009 at 4:44 pm
While I know the decibel readings are negative, leaving out the “-” was a careless oversight on my part. Thanks for the reminder. I also already understand wireless signal strengths and how to read them. What I’m trying to get at in this article is that Apple and/or AT&T are deliberately misleading customers into believing they have a good or decent signal. A given dBm reading on EDGE and on 3G should correspond to the same quality of signal, which in turn should produce the same graphical representation in the form of “bars” on your cellular display. In this post, I’m merely highlighting how, when using 3G, the bars/visual representation of signal strength is grossly exaggerated.
October 29, 2009 at 11:05 am
Actually no, you cannot directly compare the figures for GSM and UMTS networks. While -95 dBm on GSM is quite low (yet still good enough to maintain a perfect call), -95 dBm on UMTS is a full signal. It’s not Apple-specific, either: all Nokia phones display a full seven bars at -95 dBm on UMTS (that’s actually the threshold; -96 dBm is six bars, and it goes down fast, one bar per 2 dB). Also, a raw signal strength figure is not very useful in a UMTS system (in contrast to GSM), because in a single-frequency spread-spectrum system like UMTS (which is really CDMA), you need to factor in things like the best server’s Ec/Io, pilot pollution, etc.
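[Editor’s note: the Nokia-style UMTS bar mapping this commenter describes (seven bars at -95 dBm or better, then one bar lost per 2 dB) can be sketched as a simple threshold function. This is purely an illustration of the description above, not any vendor’s actual firmware logic:]

```python
import math

def umts_bars(rscp_dbm: float) -> int:
    """Bar count for a UMTS reading, per the thresholds described above:
    -95 dBm or better shows all 7 bars, then one bar is lost for every
    2 dB below that, floored at zero."""
    if rscp_dbm >= -95:
        return 7
    lost = math.ceil((-95 - rscp_dbm) / 2)  # bars lost below the threshold
    return max(0, 7 - lost)

print(umts_bars(-95))   # 7: "full signal" on UMTS
print(umts_bars(-96))   # 6: one bar gone just 1 dB below the threshold
print(umts_bars(-109))  # 0: "it goes down fast"
```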
October 29, 2009 at 11:39 am
Thanks for the added insight! I understand now how it all works.
August 1, 2009 at 8:16 am
I’ve never cursed and clenched my teeth so much until I changed to AT&T and the iPhone.
can u say “most dropped calls in the nation”!!
January 26, 2009 at 12:58 am
Dude that is just unbelievable. Crap like that is what scares me about ATT.