Posts Tagged ‘iPhone’

Just when you think you’ve got all the windows closed and doors locked on your IT security, a new and unexpected hole is revealed to get you started on that next ulcer — or at least that’s how it seems at times. Here are a couple of interesting hacks that take advantage of weaknesses you may never have thought of, but hackers have …

WireLurker: Most iPhone and iPad users never give a second thought to malware on their devices. After all, Apple scrubs all the apps that go into its app store, right? And, if you’ve been good and haven’t jailbroken your device, that “walled garden” of security should protect you since there’s no way to install apps, malicious or otherwise, from other sources, right? Not exactly. What if you download an infected program to your Mac that then passes malware to your iPhone when you connect it via USB? Meet WireLurker. Here’s a description from MacRumors.com:

Once installed, WireLurker can collect information from iOS devices like contacts and iMessages, and it’s able to request updates from attackers. It’s said to be under “active development” with an unclear “ultimate goal.”

Didn’t see that one coming? Try this one on for size…

Gyrophone: I’ve posted here before about the possibility of malware surreptitiously turning on the microphone (or camera, yikes!) on a mobile phone, turning your trusty sidekick into an always-on surveillance device. One of the protections against this sort of attack is that apps, even bad ones, typically need to ask for your permission in order to access the mic (or camera). Of course, if the malware is disguised as a benign program you might be willing to grant access, but it turns out the attacker may not have to ask at all. Researchers at Stanford found that the gyroscopes in modern phones, the sensors that detect how the device is oriented in your hand so that the screen can rotate accordingly, are so sensitive that they can pick up the vibrations of ambient sound. In other words, you talk, your phone vibrates, the built-in gyro registers the movement (slight as it may be), and a program could pick up on this and transmit what you are saying without your knowledge. But wouldn’t the malicious program have to be granted access to the gyroscope? No, because designers apparently never anticipated this sort of use (abuse?) of that feature. Read more about it and watch a video here.
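To see why a “motion” sensor can leak sound, it helps to think about sampling. Smartphone gyroscopes typically report on the order of 200 Hz, and any vibration above the 100 Hz Nyquist limit folds (aliases) back into the band the sensor can see — which is how speech-range frequencies show up in gyro data at all. Here’s a minimal sketch of that aliasing effect; the rates and the test tone are illustrative numbers, not values from the Stanford paper:

```python
import numpy as np

fs = 200.0      # illustrative gyroscope sampling rate (Hz)
f_tone = 130.0  # a tone in the human-speech fundamental range (Hz)
n = 2000        # 10 seconds of samples

# Pretend the gyro's vibration readings are a pure 130 Hz tone.
t = np.arange(n) / fs
samples = np.sin(2 * np.pi * f_tone * t)

# The 130 Hz tone is above the 100 Hz Nyquist limit, so it aliases:
# it shows up in the sampled data at |130 - 200| = 70 Hz.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
peak = freqs[np.argmax(spectrum)]
print(peak)  # 70.0
```

The folded signal is distorted, which is why the researchers could only partially reconstruct speech — but partial reconstruction of something you never granted access to is exactly the problem.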

Hacked Hotel: I’ll leave you with one more bit of grist for the mill from an article in the South China Morning Post:

A San Francisco-based cybersecurity expert claims he has hacked and taken control of hundreds of highly automated rooms at a five-star Shenzhen hotel.

Jesus Molina was staying at the St Regis Shenzhen, which provides guests with an iPad and digital “butler” app to control features of the room including the thermostat, lights, and television.

Realising how vulnerable the system was, Molina wrote a piece of code spoofing the guest iPad so he could control the room from his laptop.

After some investigation, and three room changes, he discovered that the network addresses of each room and the devices within them were sequential, allowing him to write a script to potentially control every one of the hotel’s more than 250 rooms.

“Hotels are particularly bad when it comes to security,” Molina said. “[They’re] using all this new technology, which I think is great, but the problem is that the security architecture and security problems are way different than for residential buildings”.
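The core flaw in the quote above — sequential, predictable addressing — takes only a few lines to exploit in principle. The address pattern and helper below are hypothetical illustrations, not the hotel’s real values; the port is the registered default for KNXnet/IP, the building-automation protocol Molina reported the rooms were using:

```python
# Hypothetical sketch of the enumeration Molina describes: when per-room
# controller addresses are sequential, "guessing" every room is a loop.
KNXNETIP_PORT = 3671  # registered default port for KNXnet/IP

def room_controllers(base="10.0.{room}.1", first=1, last=250):
    """Yield (address, port) for each room controller.

    The address template is an illustrative assumption; the point is
    that sequential numbering makes the whole hotel enumerable.
    """
    for room in range(first, last + 1):
        yield base.format(room=room), KNXNETIP_PORT

targets = list(room_controllers())
print(len(targets))  # 250 rooms enumerated without any credentials
print(targets[0])    # ('10.0.1.1', 3671)
```

The fix is as simple in principle as the attack: authenticate commands per guest and segment or randomize per-room networks so one room’s access cannot address another.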

This sort of Internet of Things technology is great. Unfortunately, so are the opportunities for abuse. Clearly, we in the IT Security industry have some work to do. In the meantime, break out the tin foil hats… 🙂

For some, the mere fact that newer mobile phone models exist is reason enough to desire them. For others, the tried and true, trusty sidekick that has served them well (not to mention that it took them two years to figure out how to use the thing) is more than adequate.

I confess to belonging to the first group because I love new technology. I also lean that direction because of my security roots.

As I mentioned in my previous post, old phones run old software and old software has old security bugs that haven’t been fixed.

In keeping with that theme, Graham Cluley recently wrote a thought-provoking post entitled “If You Care About Security, Throw Away Your iPhone 4 Right Now.” OK, it’s a bit alarmist, but the underlying point is valid.

This isn’t picking on Apple because Android has the same issue. The point is that while newer doesn’t always mean better, when it comes to security you need to remember that …

old phone = unsupported = security bugs that won’t ever get fixed

Of course, the flip side of this is that …

new phone = not fully tested = new security bugs (that will, hopefully, one day get fixed)

Oh, and, by the way, the same holds true for operating systems, apps, etc.

So, what are you supposed to do? Balance is the key. Too old or too new is always going to be riskier than “just right.” I think Goldilocks said that in the sequel …

Apple recently announced their latest iPhone — the 5S — and among its new features, the one that has created a fair amount of buzz is a built-in biometric fingerprint reader, which can be used to unlock the phone or confirm iTunes purchases in place of a PIN or passcode.

Convenience is certainly a selling point, but there is another side to consider. There’s a reason (in fact, there are many) why biometric systems haven’t replaced passwords universally, and one of those is the potential for impersonation. One would think that since fingerprints are unique, this would be a great way to authenticate people, but it turns out that they can also be faked.

This is not news. In May 2002 (that’s over a decade ago for those of you keeping score at home), Tsutomu Matsumoto, a researcher from Yokohama National University, demonstrated how he could fool fingerprint readers about 80% of the time using $10 worth of commonly available materials. Here’s a link to the presentation with some nice graphics:

http://web.mit.edu/6.857/OldStuff/Fall03/ref/gummy-slides.pdf

Fast forward to September 2013 and Apple’s Touch ID comes onto the scene and I begin the countdown clock to when someone will pull off a similar attack. Not surprisingly, it didn’t take long. Within 2 weeks this video from the Chaos Computer Club (CCC) surfaced which shows a successful impersonation attack.

I won’t go into the details here but here’s a quick description from macworld.com. And if you’re wondering just where someone might be able to get the fingerprints from the authorized user in order to duplicate them, take a closer look at the CCC video and pay close attention to what the iPhone’s screen looks like when it’s turned off — fingerprint heaven.

So, should we give up on biometrics and declare Touch ID a failure? Maybe not. Apple says that roughly half of iPhone owners don’t even bother to set up a PIN to protect their devices due to the inconvenience of having to enter it (which is great news for thieves). So, even if Touch ID isn’t perfect (and no biometric system ever will be), the fact that it is so much simpler to use than passcodes means that, hopefully, more people will use it. And if they do, overall security improves, since even a relatively weak biometric is more secure than a stock phone with no PIN at all.
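That last trade-off can be made concrete with back-of-envelope numbers. The 50% no-PIN figure comes from the post; the Touch ID adoption rate is a hypothetical assumption for illustration:

```python
# Back-of-envelope model: what fraction of stolen phones open with zero
# effort for a *casual* thief (one who won't bother faking a fingerprint)?
no_pin_rate = 0.50       # from the post: ~half of owners set no PIN
touchid_adoption = 0.90  # hypothetical: easier unlock -> higher adoption

unlocked_before = no_pin_rate                      # no PIN = wide open
unlocked_after = round(1.0 - touchid_adoption, 2)  # only non-adopters open

print(unlocked_before, unlocked_after)  # 0.5 0.1
```

Under these (assumed) numbers, an imperfect biometric that people actually use cuts the casual thief’s success rate fivefold — the security win comes from adoption, not from the sensor being unspoofable.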

Here’s a not-so-fun fact … apparently now you can’t even trust the charger your phone is plugged into not to attack it. OK, before you break out the tin foil hats, it might not be as bad as all that, but there is a bit of fire amidst all the smoke.

A researcher at Georgia Tech revealed at the latest Black Hat security conference that a modified USB charger could install malicious apps on a connected iPhone. According to a PCWorld article:

Once you plug your iPhone, the Universal Device ID (UDID) can be extracted just as long as the device doesn’t have a passcode unlock. The Mactans then claims your device as a test subject with any validated Apple developer ID and you can’t reject it since it doesn’t ask for their permission or offer any visual evidence that there’s anything going on in the background. 

So far there is no evidence that anyone has actually tried to exploit this vulnerability and the good news is that Apple says they have a fix coming in iOS 7 which will notify you before it’s too late. Also, you can help yourself considerably by adding a passcode to the phone, which is something you should do anyway.

The reason I find this interesting is that it exposes yet another area of “presumed security.” No one thinks that a charger could do harm to your phone (assuming it doesn’t zap the circuitry). In fact, most don’t even consider the fact that the same connection that supplies power is also used for data transfer — a great idea for simplifying the design of mobile devices but not so good from a security perspective, where isolation of functions is preferable.

We are conditioned to think of a power outlet as a relatively passive connection that does nothing more than supply juice to our gadgets. In reality, it can do much more, and since it can, you can bet that just as we leverage that fact to our advantage, the bad guys will try to do the same.

So the lesson here is not so much about iPhone chargers as it is about questioning long-held assumptions, because that’s what the hackers are already doing. The only thing in doubt is which side will figure this stuff out first…