
OWASP Mobile Top 10 security risks explained with real-world examples

By Vitus White
Published in Apps & Software
April 07, 2020
6 min read

As mobile application developers, we should be familiar with the security risks a mobile application might face. Knowing the possible risks makes it easier to avoid pitfalls and write more secure applications.

OWASP (the Open Web Application Security Project) is an online community of security specialists that has created freely available learning materials, documentation and tools to help build secure web and mobile applications. Among other things, they have compiled a list of the 10 most common threats to mobile applications.

Although the OWASP documentation is excellent, I still had a difficult time understanding how these risks can be exploited in the real world and how vulnerable the applications we use every day really are.

In this article I will try to give you a short overview of the top 10 mobile risks and provide examples of real-world disclosed vulnerabilities for each one. It should motivate you to think more about the security of the app you are developing.


Statistics

Before we look into each risk in more detail, let’s talk about statistics. The most popular apps on the App Store and on Google Play should not be vulnerable to these risks, right? Right? Unfortunately, they are.

Results by NowSecure: https://www.nowsecure.com/blog/2018/07/11/a-decade-in-how-safe-are-your-ios-and-android-apps/

NowSecure tested apps on the App Store and the Google Play store and found that 85% of them violate at least one of the top 10 risks.

Of these apps, 50% store data insecurely and almost as many use insecure communication.

Keep this in mind when reading about the real-world vulnerabilities below: these mistakes happen a lot more often than we might think.


M1 Improper platform usage

Misuse of a platform feature or failure to use platform security controls.

Might include:

  • Android intents,
  • platform permissions,
  • misuse of Touch ID,
  • misuse of the Keychain,
  • misuse of other security controls.

Example: Citrix Worx apps

Bypassing Apple’s Touch ID… but from certain apps! (medium.com)

It was discovered that it was possible to bypass Touch ID in Citrix Worx apps by:

  • rebooting the iPhone,
  • opening one of the Citrix Worx apps,
  • starting authentication, but cancelling Touch ID,
  • closing the application and opening it again.

The problem seemed to be that the secret retrieved after passing Touch ID was stored incorrectly. As a result, the app assumed the user was correctly authenticated even when the authentication process had been cancelled and the app restarted.
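For comparison, here is a minimal sketch of how a secret can be bound to Touch ID / Face ID on iOS so that it is only released after a successful biometric check. The service and account names are placeholders and error handling is trimmed; this is an illustration of the platform control, not the Citrix code.

```swift
import Foundation
import Security

// A minimal sketch: store a secret in the Keychain so that it can only be read
// after a successful biometric check. "com.example.app" and "sessionSecret"
// are placeholder identifiers, not taken from the Citrix report.
func storeSecretBehindBiometrics(_ secret: Data) -> OSStatus {
    // Require the currently enrolled biometrics and a device passcode.
    guard let accessControl = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenPasscodeSetThisDeviceOnly,
        .biometryCurrentSet,
        nil) else { return errSecParam }

    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.app",
        kSecAttrAccount as String: "sessionSecret",
        kSecAttrAccessControl as String: accessControl,
        kSecValueData as String: secret
    ]
    return SecItemAdd(query as CFDictionary, nil)
}
```

Reading the item back with SecItemCopyMatching then triggers the biometric prompt; if the user cancels, the read simply fails, so there is no cached "authenticated" state that a restart could expose.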


M2 Insecure data storage

Covers insecure data storage and unintended data leakage.

Might include:

  • wrong Keychain accessibility option
    (e.g. kSecAttrAccessibleAlways instead of kSecAttrAccessibleWhenUnlocked; see the sketch after this list),
  • insufficient file data protection
    (e.g. NSFileProtectionNone instead of NSFileProtectionComplete),
  • accessing privacy-protected resources and then handling that data incorrectly.
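As a rough illustration of the first two points, here is a sketch of the stricter choices on iOS: writing a file with complete file protection and adding a Keychain item that is only available while the device is unlocked. The file URL and account name are placeholders.

```swift
import Foundation
import Security

// Sketch: prefer the stricter data-at-rest options.
func storeSensitiveData(_ data: Data, to url: URL) throws {
    // Equivalent to NSFileProtectionComplete: the file is encrypted and
    // inaccessible while the device is locked.
    try data.write(to: url, options: .completeFileProtection)
}

func addToken(_ token: Data) -> OSStatus {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: "authToken",           // placeholder account name
        // kSecAttrAccessibleWhenUnlocked instead of the far looser
        // (and now deprecated) kSecAttrAccessibleAlways.
        kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlocked,
        kSecValueData as String: token
    ]
    return SecItemAdd(query as CFDictionary, nil)
}
```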

Example: Tinder

Tinder app vulnerability: how sharing location data harms your privacy (www.abine.com)

Tinder introduced a feature that showed other users logged on near you.

Problem: the exact location of every person near you was sent to the device. The first fix was to provide only the distance, but it was still possible to spoof the location and use triangulation. The second fix was to send the distance data with reduced precision.

To show how bad the discovered vulnerability was, users created an app that displayed Tinder users with their exact location.


M3 Insecure communication

Might include:

  • poor handshaking / weak negotiation (e.g. lack of certificate pinning; see the sketch after this list),
  • incorrect SSL versions,
  • cleartext communication of sensitive assets,
  • HTTP instead of HTTPS.
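To make the certificate-pinning point concrete, here is a minimal URLSession-based sketch. It assumes a DER-encoded copy of the expected server certificate ships in the app bundle as pinned.cer; a production implementation would usually pin public keys, evaluate the full trust chain and plan for certificate rotation.

```swift
import Foundation
import Security

// A minimal certificate-pinning sketch. "pinned.cer" is an assumed bundled
// copy of the server certificate, not part of any real app mentioned here.
final class PinnedSessionDelegate: NSObject, URLSessionDelegate {
    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
              let trust = challenge.protectionSpace.serverTrust,
              let serverCert = SecTrustGetCertificateAtIndex(trust, 0),
              let pinnedURL = Bundle.main.url(forResource: "pinned", withExtension: "cer"),
              let pinnedData = try? Data(contentsOf: pinnedURL) else {
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }
        let serverData = SecCertificateCopyData(serverCert) as Data
        if serverData == pinnedData {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            // Pin mismatch: refuse to talk to this server at all.
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}
```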

Example: MiSafes smart watches

Hacking MiSafes’ smartwatches for kids is child’s play

MiSafes, the maker of surveillance devices meant to track kids, is back in the news.

Attackers could:

  • retrieve real-time GPS coordinates of the kids’ watches,
  • call the child on their watch,
  • create a covert one-way audio call, spying on the child,
  • send audio messages to the child on the watch, bypassing the approved caller list,
  • retrieve a photo of the child, plus their name, date of birth, gender, weight and height.

M4 Insecure authentication

Problems authenticating the end user or bad session management.

Might include:

  • failing to identify the user at all when that should be required,
  • failure to maintain the user’s identity when it is required,
  • weaknesses in session management.

Example: Grab Android app

Grabtaxi Holdings Pte Ltd disclosed on HackerOne: Two-factor… (hackerone.com)

A security researcher found a two-factor authentication bypass on an endpoint used by the Grab Android app: the 4-digit code could be brute forced because there was no limit on how many times it could be entered. The impact: access to an account with information on rides, payment methods and orders.
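The missing control here is simple attempt limiting. The sketch below is an illustrative, framework-agnostic version of that idea (the names and limits are my own assumptions, not Grab’s actual fix); in practice it has to run on the server, ideally combined with expiring codes and back-off.

```swift
import Foundation

// Illustrative only: cap how many times a one-time code may be tried per account.
final class OTPAttemptLimiter {
    private var failedAttempts: [String: Int] = [:]   // keyed by account id
    private let maxAttempts = 5                       // assumed limit

    /// Returns true if a submitted code should even be checked.
    func allowAttempt(account: String) -> Bool {
        failedAttempts[account, default: 0] < maxAttempts
    }

    func recordFailure(account: String) {
        failedAttempts[account, default: 0] += 1
    }

    /// Call after a successful verification to clear the counter.
    func reset(account: String) {
        failedAttempts[account] = nil
    }
}
```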


M5 Insufficient cryptography

Cryptography was attempted but was insufficient in some way. For example, the developer might have used an outdated cryptographic algorithm or written a custom, vulnerable one.

Example: Ola app

Major Bug in Ola App can Make you Either Rich or Poor! (blog.appknox.com)

Appknox scanned the Ola app and discovered major weaknesses in how cryptographic keys were used: the key in use was simply PRODKEYPRODKEY12. The same key was also used to encrypt passwords, which means that users’ other accounts where they reused the same password might have been at risk as well. The researchers were able to intercept requests between the app and the server, fake a request for money and receive the money.
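For contrast, here is a minimal sketch of what “sufficient” usually looks like on iOS: a randomly generated 256-bit key and an authenticated cipher from CryptoKit, instead of a guessable hard-coded string. How the key is persisted (e.g. in the Keychain) is assumed and not shown.

```swift
import CryptoKit
import Foundation

// Sketch: random key + authenticated encryption (AES-GCM) instead of a
// fixed key baked into the binary.
func encryptPayload(_ plaintext: Data) throws -> (ciphertext: Data, key: SymmetricKey) {
    let key = SymmetricKey(size: .bits256)              // 256 random bits, generated on device
    let sealedBox = try AES.GCM.seal(plaintext, using: key)
    // `combined` packs nonce + ciphertext + authentication tag; it is non-nil
    // for the default 12-byte nonce used above.
    return (sealedBox.combined!, key)
}
```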


M6 Insecure authorization

Might include:

  • failures in authorization,
    (e.g., authorization decisions made on the client side, forced browsing, etc.)
  • the ability to execute over-privileged functionality.

It is distinct from authentication issues (e.g., device enrolment, user identification, etc.).

Example: Viper SmartStart

Remote smart car hacking with just a phone (medium.com)

A security researcher discovered that Viper SmartStart failed to correctly authorize users. After logging in to the server, it was possible to change the ID number of the car and gain access to, among other things, the car’s location. It was also possible to change data about the car and open the car remotely.


M7 Client code quality

Catch-all for code-level implementation problems in the mobile client.

Might include:

  • buffer overflows,
  • format string vulnerabilities,
  • various other code-level mistakes where the solution is to rewrite some code that’s running on the mobile device.

Example: WhatsApp

Hackers Used WhatsApp 0-Day Flaw to Secretly Install Spyware On Phones (thehackernews.com)

WhatsApp engineers found that it was possible to trigger a buffer overflow by sending a specially crafted series of packets to WhatsApp while making a call. The call does not even need to be answered, and the adversary can run arbitrary code. The vulnerability was used to install spyware on phones; the spyware was sold by the Israeli company NSO Group.


M8 Code tampering

Might include:

  • binary patching,
  • local resource modification,
  • method hooking and swizzling,
  • dynamic memory modification.

Example: Pokémon GO

How Pokémon Go Fans Hacked ‘Em All: And How to Prevent Similar Reverse-Engineering (nordicapis.com)

Fans reverse engineered the application and fed it fake geolocation data and time values to find rare Pokémon and make eggs hatch faster. A website was created that showed the location of every Pokémon on a map, which changed the game dynamics considerably. This hack might not seem as dangerous as the ones above, but it, and the way Niantic handled it, still cost the company reputation and users.
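There is no complete defense against tampering, but apps commonly add runtime checks as one layer. Below is a minimal sketch of one such check on iOS, asking the kernel whether a debugger is attached to the process; determined attackers can bypass it, so treat it as a speed bump rather than a guarantee.

```swift
import Darwin

// Sketch of a common anti-tampering check: query the kernel's process info
// and test the P_TRACED flag, which is set while a debugger is attached.
func isDebuggerAttached() -> Bool {
    var info = kinfo_proc()
    var size = MemoryLayout<kinfo_proc>.stride
    var mib: [Int32] = [CTL_KERN, KERN_PROC, KERN_PROC_PID, getpid()]
    let result = sysctl(&mib, UInt32(mib.count), &info, &size, nil, 0)
    guard result == 0 else { return false }
    return (info.kp_proc.p_flag & P_TRACED) != 0
}
```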


M9 Reverse engineering

Might include analysis of the final core binary to determine its

  • source code,
  • libraries,
  • algorithms and other assets.

Reverse engineering makes it easier to exploit other vulnerabilities in the application. It can reveal information about backend servers, cryptographic constants and ciphers, and intellectual property.

Example: other examples above

I decided not to include a separate example, since reverse engineering was used in most of the examples listed above.


M10 Extraneous functionality

Might include:

  • hidden backdoor functionality,
  • other internal development security controls not intended for a production environment.

Example: Wifi File Transfer

An Obscure App Flaw Creates Backdoors In Millions of Smartphones (www.wired.com)

The Wifi File Transfer app opens a port on the Android device to allow connections from a computer. Intended use: transferring files, photos or anything else stored on the SD card. The problem: there was no authentication, such as a password, so anyone could connect to the device and get full access. The group of researchers from the University of Michigan who discovered this flaw found that on the Google Play store:

  • 1,632 apps open ports,
  • 410 have potentially no or weak protection,
  • 57 apps were confirmed as exploitable.

Conclusion

Hopefully these examples motivate you to look deeper into mobile application security. Even when you are not the one testing the security of an application, it makes sense to keep these risks in mind while developing a mobile app. Of course, the OWASP Mobile Top 10 is just the tip of the iceberg, but it is a good starting point.

source: medium.com

