Mobile Applications: A Cesspool of Security Issues

By Robert Lemos

An analysis of more than a half-million mobile apps finds encryption problems, privacy issues, and known vulnerabilities in third-party code. What can users and developers do?

An analysis of more than half a million mobile applications found that nearly one in five had hardcoded encryption keys, nearly one in six used software components with known vulnerabilities, and nearly two-thirds used broken or weak encryption. 

Overall, the vast majority of mobile applications had a significant security weakness, despite users' tendency to trust the apps on their phones, says Andrew Hoog, co-founder and board member at NowSecure, a mobile-device penetration testing firm. In a presentation next week at the RSA Conference, he will discuss the findings of the company's analysis of hundreds of thousands of applications.

Developers like to develop, and they often do not spend enough time designing their applications securely, he says.

"The good news is these issues are very easily solvable," he says. "People just have to have visibility and know that there are problems — and there are lots of problems."

In 2025, mobile devices represent a massive attack surface. The average smartphone owner uses seven unique applications per day and a total of 26 per month, according to the "State of Mobile 2025" report published by digital-intelligence provider SensorTower. In 2024, worldwide users spent more than $80 billion on in-app purchases in games and $69 billion on nonentertainment applications, the same report estimated.

App Store Scanning: Not What Devs Think

Mobile devices and their app ecosystems are extremely attractive to cybercriminals. Apps frequently open an in-app Web browser to create an online connection and are often built with third-party software development kits, while the devices themselves typically carry 18 or more sensors, including GPS, cameras, and accelerometers.

Third-party software development kits make creating applications easier, so it is no wonder that more than 60% of apps use an SDK. Yet nearly 16% of those software components have known vulnerabilities, Hoog says. Developers usually do not know about these issues, because the over-burdened managers of the Common Vulnerabilities and Exposures (CVE) system do not have the bandwidth to handle the massive influx of reports, he says.

Few developers scan their apps or focus on security during the design phase, Hoog says. Instead, they often rely on the third-party code providers or the app stores to secure their software. That puts their app in danger, he says.
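
Getting that visibility can be as simple as adding a dependency-vulnerability scan to the build itself. The build.gradle.kts sketch below is illustrative only; the plugin version and settings shown are assumptions that should be checked against the OWASP dependency-check plugin's documentation.

    // build.gradle.kts -- wire a CVE scan of all dependencies into the build.
    plugins {
        id("com.android.application")
        id("org.owasp.dependencycheck") version "9.0.9"   // hypothetical version
    }

    dependencyCheck {
        // Fail the build when any dependency carries a CVE scored 7.0 or higher.
        failBuildOnCVSS = 7.0f
    }

Running the plugin's analyze task (./gradlew dependencyCheckAnalyze) then flags SDKs and transitive libraries with published CVEs before the app ever reaches a store.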

"What people don't realize is you ship your entire mobile app and all your code to this public store where any attacker can download it and reverse it," Hoog says. "That's vastly different than how you develop a Web app or an API, which sit behind a WAF and a firewall and servers."

Mobile platforms are difficult for security researchers to analyze, Hoog says. One problem is that developers rely too much on the scanning conducted by Apple and Google on their app stores. When a developer submits an application, either company runs specific scans to detect policy violations and to make it harder to upload malicious code to its store.

Developers often believe that scanning is looking for security issues, but it should not be considered a security control, Hoog says.

"Everybody thinks Apple and Google have tested the apps — they have not," he says. "They're testing apps for compliance with their rules. They're looking for malicious malware and just egregious things. They are not testing your application or the apps that you use in the way that people think."

A Secure Platform, if Done Right

Despite that, mobile platforms tend to be much more secure than their workstation counterparts for two reasons. First, Apple and Google are both assiduous about updates, although Apple's vertically integrated ecosystem is easier to update in a timely manner.
