As mobile application developers, we should be familiar with the security risks a mobile application might face. Knowing the risks makes it easier to avoid pitfalls and write more secure applications.
OWASP (the Open Web Application Security Project) is an online community of security specialists that has created freely available learning materials, documentation and tools to help build secure web and mobile applications. Among other things, they have compiled a list of the ten most common threats to mobile applications.
Although the OWASP documentation is excellent, I still had a difficult time understanding how these risks can be exploited in the real world and how vulnerable the applications we use every day really are.
In this article I will give you a short overview of the top 10 mobile risks and provide examples of real-world disclosed vulnerabilities for each of them. It should motivate you to think more about the security of the app you are developing.
Before we look into each risk in more detail, let's talk about statistics. The most popular apps on the App Store and on Google Play should not be vulnerable to these risks, right? Right? Unfortunately, they are.
NowSecure tested apps from the App Store and Google Play and found that 85% of them violated at least one of the top 10 risks.
Of these apps, 50% stored data insecurely, and almost as many used insecure communication.
Keep this in mind when reading about the real-world vulnerabilities below: these mistakes happen a lot more often than we might think.
M1 Improper platform usage
Misuse of a platform feature or failure to use platform security controls.
- Android intents,
- platform permissions,
- misuse of Touch ID,
- misuse of the Keychain,
- misuse of other security controls.
Example: Citrix Worx apps
It was discovered that it was possible to bypass Touch ID for Citrix Worx apps by:
- rebooting the iPhone,
- opening one of the Citrix Worx apps,
- starting authentication, but cancelling Touch ID,
- closing the application and opening it again.
The problem seemed to be that the secret retrieved after passing Touch ID was stored incorrectly, so the app assumed the user was correctly authenticated even though the authentication prompt had been cancelled and the app restarted.
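The flaw can be sketched as a state-handling bug: a secret unlocked by Touch ID survives an app restart, and a cancelled prompt never clears it. Here is a toy Python model of that logic (hypothetical, not Citrix's actual code; `storage` stands in for the Keychain):

```python
class FlawedApp:
    """Toy model of the bug: `storage` persists across app restarts."""

    def __init__(self, storage):
        self.storage = storage

    def authenticate(self, touch_id_passed):
        if touch_id_passed:
            # The secret is written to persistent storage...
            self.storage["secret"] = "unlocked"
        # BUG: ...but a cancelled prompt leaves a previously stored secret in place.

    def is_authenticated(self):
        return self.storage.get("secret") == "unlocked"


class FixedApp(FlawedApp):
    def authenticate(self, touch_id_passed):
        if touch_id_passed:
            self.storage["secret"] = "unlocked"
        else:
            # Invalidate the secret whenever authentication is cancelled or fails.
            self.storage.pop("secret", None)
```

Restarting the flawed app after one successful session and cancelling the prompt still reports the user as authenticated; the fixed variant clears the secret on cancellation.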
M2 Insecure data storage
Covers insecure data storage and unintended data leakage.
- wrong Keychain accessibility option
(e.g. kSecAttrAccessibleAlways instead of kSecAttrAccessibleWhenUnlocked),
- insufficient file data protection
(e.g. NSFileProtectionNone instead of NSFileProtectionComplete),
- accessing privacy-sensitive resources and then handling that data incorrectly.
Source: "Tinder app vulnerability: how sharing location data harms your privacy" (www.abine.com)
Tinder introduced a feature that showed people who were logged in near you. The problem: the exact location of every nearby person was sent to the device.
The first fix was to send only the distance to each person, but it was still possible to spoof one's own location and recover exact positions through triangulation.
The second fix was to reduce the precision of the distance data.
To show how serious the vulnerability was, users built an app that displayed Tinder users with their exact locations.
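The triangulation step is simple geometry: spoof three observer positions, record the three reported distances, and solve for the target. A minimal flat-plane sketch (strictly speaking trilateration; the coordinates below are made up for illustration):

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Recover (x, y) from distances to three known points on a flat plane."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise gives two linear equations in x, y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when the observers are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)


# Target at (3, 4); three spoofed observer positions and their reported distances:
x, y = trilaterate((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5)
```

With exact distances the solution is exact; even with Tinder's rounded distances, averaging many such measurements narrowed the position down considerably.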
M3 Insecure communication
- poor handshaking/weak negotiation
(e.g. lack of certificate pinning),
- outdated SSL/TLS versions,
- cleartext communication of sensitive assets,
- using HTTP instead of HTTPS.
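Much of this is configuration rather than code: the main thing is to keep certificate verification on. Python's `ssl` module makes the difference easy to see (on mobile, the analogues are App Transport Security on iOS and the Network Security Configuration on Android):

```python
import ssl

# Secure default: verifies the server's certificate chain and hostname.
secure_ctx = ssl.create_default_context()

# Insecure pattern often added just "to make the TLS error go away":
# any certificate is accepted, which enables man-in-the-middle attacks.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE
```

Any code review that finds verification being switched off like this should treat it as a bug, not a workaround.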
Example: MiSafes smart watches
The communication between the watches and the backend was neither encrypted nor correctly authenticated. This made it possible to:
- retrieve real-time GPS coordinates of the kids’ watches,
- call the child on their watch,
- create a covert one-way audio call, spying on the child,
- send audio messages to the child on the watch, bypassing the approved caller list,
- retrieve a photo of the child, plus their name, date of birth, gender, weight and height.
M4 Insecure authentication
Problems authenticating the end user or bad session management.
- failing to identify the user at all when that should be required,
- failure to maintain the user’s identity when it is required,
- weaknesses in session management.
Example: Grab Android app
A security researcher was able to bypass 2FA by brute-forcing the 4-digit code: there was no limit on how many times a code could be entered.
Impact: access to the account, including information on rides, payment methods and orders.
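Without rate limiting, a 4-digit code offers only 10,000 possibilities, which an attacker can exhaust in seconds. A simplified simulation of both the flaw and the fix (the server class is hypothetical, not Grab's actual backend):

```python
class OtpServer:
    """Simulated verification endpoint; max_attempts=None models the flaw."""

    def __init__(self, code, max_attempts=None):
        self.code = code
        self.max_attempts = max_attempts
        self.attempts = 0

    def verify(self, guess):
        self.attempts += 1
        if self.max_attempts is not None and self.attempts > self.max_attempts:
            raise PermissionError("too many attempts, account locked")
        return guess == self.code


def brute_force(server):
    # Try every 4-digit code: at most 10,000 requests.
    for i in range(10000):
        guess = f"{i:04d}"
        if server.verify(guess):
            return guess
    return None
```

Against a server with no attempt limit the loop always succeeds; with even a modest limit, the attack is stopped almost immediately.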
M5 Insufficient cryptography
Cryptography was attempted but was insufficient in some way. For example, the developer might have used an outdated cryptographic algorithm or written a custom, vulnerable one.
Example: Ola app
Source: "Major Bug in Ola App can Make you Either Rich or Poor!" (blog.appknox.com)
Appknox scanned the Ola app and discovered major weaknesses in how cryptographic keys were used: the key in use was “PRODKEYPRODKEY12“. The same key was also used to encrypt passwords, which means that users' other accounts, where the same passwords were reused, might have been at risk as well.
The researchers were able to intercept requests between the app and the server, forge a request for money and receive the money.
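A hardcoded key is public knowledge the moment someone reverse engineers the binary. The toy cipher below is deliberately trivial; the point is the key handling, not the algorithm (only the key string comes from the Appknox report):

```python
import secrets

HARDCODED_KEY = b"PRODKEYPRODKEY12"  # shipped inside the app binary

def xor_cipher(data, key):
    """Toy XOR 'encryption' for illustration only; never use this for real."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Anyone who extracts the key from the binary can decrypt every user's traffic:
ciphertext = xor_cipher(b"password123", HARDCODED_KEY)
recovered = xor_cipher(ciphertext, HARDCODED_KEY)

# Safer: a random per-user or per-session key that never ships in the binary.
session_key = secrets.token_bytes(32)
```

Even with a strong algorithm like AES, a single key baked into every installed copy of the app gives the same result: one reverse-engineering session compromises all users.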
M6 Insecure authorization
- failures in authorization
(e.g. authorization decisions made on the client side, forced browsing),
- being able to execute over-privileged functionality.
It is distinct from authentication issues (e.g., device enrolment, user identification, etc.).
Example: Viper smart start
A security researcher discovered that Viper SmartStart failed to correctly authorize users. After logging in to the server, it was possible to change the ID number of the car in a request and gain access to, among other things, the car's location. It was also possible to change data about the car and unlock it remotely.
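The underlying mistake is an authorization check missing on the server: the backend trusted whatever car ID the client sent. A hypothetical server-side sketch of the flaw and the fix (the data and endpoint names are invented):

```python
# Toy database mapping car IDs to owners and data.
CARS = {
    "1001": {"owner": "alice", "location": "59.43,24.75"},
    "1002": {"owner": "bob", "location": "58.38,26.72"},
}

def get_location_flawed(user, car_id):
    # BUG: any authenticated user can read any car simply by changing car_id.
    return CARS[car_id]["location"]

def get_location_fixed(user, car_id):
    car = CARS[car_id]
    if car["owner"] != user:
        # Authorization: being logged in is not enough, the car must be yours.
        raise PermissionError("not your car")
    return car["location"]
```

The fix is a single ownership check, but it has to run on the server for every request that takes an object ID from the client.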
M7 Client code quality
Catch-all for code-level implementation problems in the mobile client.
- buffer overflows,
- format string vulnerabilities,
- various other code-level mistakes where the solution is to rewrite some code that’s running on the mobile device.
WhatsApp engineers found that it was possible to trigger a buffer overflow by sending a specially crafted series of packets to WhatsApp while placing a call. The call did not even need to be answered, and the adversary could then run arbitrary code.
It was discovered that the vulnerability had been used to install spyware on phones, spyware sold by the Israeli company NSO Group.
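The overflow itself requires a memory-unsafe language like C, but the root cause, trusting a length field in attacker-controlled input, looks the same everywhere. A Python sketch of the bug class (the two-byte length-prefixed packet format is made up for illustration):

```python
import struct

def parse_payload_unchecked(packet):
    """Trusts the sender's 2-byte length prefix, as vulnerable parsers do.

    In C, copying `claimed_len` bytes into a fixed-size buffer on this
    basis is exactly how a crafted packet becomes a buffer overflow.
    """
    (claimed_len,) = struct.unpack_from(">H", packet, 0)
    return packet[2:2 + claimed_len]

def parse_payload_checked(packet):
    (claimed_len,) = struct.unpack_from(">H", packet, 0)
    payload = packet[2:]
    if claimed_len != len(payload):
        raise ValueError("length field does not match actual payload size")
    return payload
```

Python merely returns a shorter slice for a lying length field; the checked parser rejects the packet outright, which is the behavior a safe C implementation must reproduce by hand.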
M8 Code tampering
- binary patching,
- local resource modification,
- method hooking and swizzling,
- dynamic memory modification.
Example: Pokemon GO
Source: "How Pokémon Go Fans Hacked 'Em All: And How to Prevent Similar Reverse-Engineering" (nordicapis.com)
Fans reverse engineered the application and fed it fake geolocation and time data to find rare Pokémon and make eggs hatch faster. A website was created that showed the location of every Pokémon on a map, which changed the game dynamics considerably.
This hack might not seem as dangerous as the ones above, but it, and the way Niantic handled it, still cost the company reputation and users.
M9 Reverse engineering
Might include analysis of the final core binary to determine its
- source code,
- algorithms and other assets.
Reverse engineering makes it easier to exploit other vulnerabilities in the application. It can reveal information about backend servers, cryptographic constants and ciphers, and intellectual property.
Example: other examples above
I decided not to include a separate example, since reverse engineering was used in most of the examples listed above.
M10 Extraneous functionality
- hidden backdoor functionality,
- other internal development security controls not intended for a production environment.
Example: Wifi File Transfer
The WiFi File Transfer app opens a port on the Android device to allow connections from a computer.
Intended use: transferring files, photos and anything else stored on the SD card.
Problem: there was no authentication such as a password, so anyone could connect to the device and gain full access.
A group of researchers from the University of Michigan, who discovered this flaw, found that on the Google Play Store:
- 1,632 apps open ports,
- 410 have potentially no or weak protection,
- 57 apps were confirmed as exploitable.
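The fix for this class of flaw is small: require a secret that only the legitimate user knows before serving anything on the open port. A simplified request handler showing both versions (the request format and file listing are hypothetical):

```python
import secrets

ACCESS_TOKEN = secrets.token_hex(16)  # would be displayed in the app's UI

def list_files(path):
    # Placeholder for reading real SD-card contents.
    return [path + "/photo1.jpg"]

def handle_request_flawed(request):
    # BUG: anyone who can reach the open port gets full access.
    return "200 OK", list_files(request["path"])

def handle_request_fixed(request):
    if request.get("token") != ACCESS_TOKEN:
        # Reject connections that do not present the user's token.
        return "401 Unauthorized", None
    return "200 OK", list_files(request["path"])
```

Showing a random token in the app and requiring it on every request is enough to turn "anyone on the network" into "only the person holding the phone".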
Hopefully these examples motivate you to look deeper into mobile application security. Even when you are not the one testing the security of an application, it makes sense to keep these risks in mind when developing a mobile app. Of course, the OWASP Mobile Top 10 is just the tip of the iceberg, but it is a good starting point.