Motion sensors also can help map someone’s travels, such as on a subway or bus ride. A trip produces motion data that are different from the briefer, jerkier movements of something like a phone being pulled from a pocket.
For a 2017 study, researchers designed an app to extract the data signatures of various subway routes. They used accelerometer readings from Samsung smartphones of people riding the subway in Nanjing, China.
A tracking app picked out which segments of the subway system a user was riding. It did this with an accuracy of 59 to 88 percent. How well it performed depended on how many subway stations the people rode through. (The app improved as the rides lengthened from three stations to seven.) Someone who can trace a user’s subway movements might figure out where the traveler lives and works. They might tell where the user shops or map out someone’s entire daily schedule. It might even — if the app is tracking multiple people — figure out who the user meets at various places.
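The idea behind that kind of tracking can be sketched in a few lines of code. The snippet below is a simplified illustration, not the researchers’ actual system: it reduces an accelerometer trace to a tiny statistical "signature" and matches it to the closest known subway segment. All segment names and numbers are invented.

```python
# Simplified sketch: match an accelerometer trace to known subway
# segments by comparing statistical "signatures". Invented data.

import statistics

def signature(trace):
    """Reduce a list of accelerometer magnitudes to (mean, spread)."""
    return (statistics.mean(trace), statistics.pstdev(trace))

def classify_segment(trace, known_signatures):
    """Return the known segment whose signature is closest."""
    mean, dev = signature(trace)
    def distance(item):
        m, d = item[1]
        return (mean - m) ** 2 + (dev - d) ** 2
    return min(known_signatures.items(), key=distance)[0]

# Signatures learned from labeled rides (invented numbers):
known = {
    "Station A -> B": (0.20, 0.05),
    "Station B -> C": (0.35, 0.10),
}

ride = [0.33, 0.36, 0.34, 0.37, 0.35]  # unlabeled trace
print(classify_segment(ride, known))   # prints "Station B -> C"
```

A real attack would use richer features and a trained machine-learning model, but the principle is the same: each stretch of track shakes the phone in a recognizably different way.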
Accelerometer data also can plot driving routes. And other sensors can be used to track people in more confined spaces.
One team, for instance, synced a smartphone mic and portable speaker. That let them create an on-the-fly sonar system to map movements throughout a house. The team reported the work in a September 2017 study.
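The physics behind that sonar trick is simple: the speaker emits a sound, the mic hears the echo, and the round-trip delay reveals how far away a reflecting surface is. Here is a minimal sketch of that calculation (the 343 m/s figure is the speed of sound in room-temperature air; the timing value is invented):

```python
# Minimal sonar sketch: distance from an echo's round-trip time.
SPEED_OF_SOUND = 343.0  # meters per second, in room-temperature air

def echo_distance(round_trip_seconds):
    """Distance to a reflecting surface (sound travels there and back)."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

print(echo_distance(0.02))  # prints 3.43 (meters)
```

Tracking movement through a house adds many complications, such as filtering out overlapping echoes, but each distance estimate rests on this one equation.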
Selcuk Uluagac is an electrical and computer engineer. He works at Florida International University in Miami. “Fortunately, there is not anything like [these sensor spying techniques] in real life that we’ve seen yet,” he notes. “But this doesn’t mean there isn’t a clear danger out there that we should be protecting ourselves against.”
That’s because the types of algorithms that researchers have used to comb through sensor data are getting more advanced and user-friendly all the time, says Mehrnezhad at Newcastle University. It’s not just people with PhDs who can design these types of privacy invasions, she says. App developers who don’t understand machine-learning algorithms can easily get this kind of code online to build sensor-sniffing programs.
What’s more, smartphone sensors don’t just provide snooping opportunities for cybercrooks who peddle info-stealing software. Legitimate apps often harvest info to compile such things as your search-engine and app-download history. The makers of these apps sell that info to advertising companies and outside parties. They could use the data to learn aspects of a user’s life that this person might want to keep private.
Take a health-insurance company. It may charge more to insure someone who doesn’t get much exercise. So “you may not like them to know if you are a lazy person or you are an active person,” Mehrnezhad says. Yet with your phone’s motion sensors, “which are reporting the amount of activity you’re doing every day, they could easily identify what type of user you are.”
Sensor safeguards
It’s getting ever easier for an untrustworthy party to figure out private details of your life from data they get from your phone’s sensors. So researchers are devising ways to give people more control over what information apps can siphon from their devices.
Some safeguard apps could appear as standalone programs. Others are tools that would be built into future updates of the operating system for your phone’s onboard computer.
Uluagac and his colleagues recently proposed a system called 6thSense. It monitors a phone’s sensor activity. Then it alerts an owner when it detects unusual behaviors. Users train this system to recognize their phone’s normal sensor behavior. This might include tasks like calling, Web browsing or driving. Then, 6thSense continually checks the phone’s sensor activity against these learned behaviors.
That program is on the lookout for something odd. This might be the motion sensors reaping data when a user is just sitting and texting. Then, 6thSense alerts the user. Users can check if a recently downloaded app is responsible for a suspicious activity. If so, they can delete the app from their phones.
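The logic 6thSense uses can be imagined as a simple anomaly check: compare a snapshot of per-sensor activity against baselines learned for known tasks, and raise an alert if the snapshot matches none of them. The sketch below is only an illustration of that idea, not the actual system; the sensor names, baselines and tolerance are invented.

```python
# Illustrative sketch of anomaly detection on sensor activity.
# Invented sensor names, baselines and tolerance.

def matches(snapshot, baseline, tolerance=0.1):
    """True if every reading is within `tolerance` of the baseline."""
    return all(abs(snapshot[s] - baseline[s]) <= tolerance for s in baseline)

def is_suspicious(snapshot, learned_tasks):
    """Flag activity that fits none of the learned task profiles."""
    return not any(matches(snapshot, b) for b in learned_tasks.values())

learned = {
    "texting": {"accelerometer": 0.1, "microphone": 0.0},
    "calling": {"accelerometer": 0.2, "microphone": 0.9},
}

# Motion sensors busy while the user is just sitting and texting:
odd = {"accelerometer": 0.8, "microphone": 0.0}
print(is_suspicious(odd, learned))  # prints True -> alert the user
```

The real system trains a machine-learning model on each owner’s habits, but the core question it asks is the same: does this sensor activity look like anything the user normally does?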
Uluagac’s team recently tested a prototype of 6thSense on Samsung smartphones. The owners of 50 of these phones trained with 6thSense to identify their typical sensor activity. The researchers then fed the 6thSense system examples of benign data from daily activities mixed with bits of malicious sensor operations. 6thSense correctly picked out the problematic bits more than 96 percent of the time.
Supriyo Chakraborty is a privacy and security researcher at IBM in Yorktown Heights, N.Y. His team devised DEEProtect for people who want more active control over their data. It’s a system that blunts the ability of apps to draw conclusions about user activity from a phone’s sensor data. People could use DEEProtect to specify what their apps would be allowed to do with sensor data. For example, someone may want an app to transcribe speech but not identify the speaker.
DEEProtect intercepts whatever raw sensor data an app tries to access. It then strips those data down to only the features needed to make user-approved inferences.
Consider speech-to-text translation. For this, the phone typically needs sound frequencies and the probabilities of particular words following each other in a sentence. But sound frequencies could also help a spying app deduce a speaker’s identity. So DEEProtect distorts the dataset before releasing it to the app. However, it leaves alone data on word orders. Those data have little or no bearing on a speaker’s identity.
Users get to control how much DEEProtect changes the data. More distortion offers more privacy — but at a price: It degrades app functions.
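That trade-off can be pictured with a toy example. In the sketch below (an illustration of the idea, not DEEProtect’s actual code), identity-revealing frequency features get random noise added before release, while word-order data pass through untouched. The noise level is the user’s privacy knob; all the values are invented.

```python
# Toy sketch of privacy-preserving data release: distort the features
# that could reveal a speaker's identity, keep word-order data intact.
# All values are invented for illustration.

import random

def release(frequencies, word_order, noise_level):
    """Perturb frequency features; leave word-order features alone."""
    rng = random.Random(0)  # fixed seed so the sketch is reproducible
    noisy = [f + rng.gauss(0, noise_level) for f in frequencies]
    return noisy, word_order

freqs = [220.0, 440.0, 880.0]   # identity-revealing features
order = ["the", "cat", "sat"]   # needed only for transcription

private, kept = release(freqs, order, noise_level=50.0)
print(kept)  # word order unchanged: ['the', 'cat', 'sat']
```

Turning `noise_level` up scrambles the frequencies more, making the speaker harder to identify but also making transcription less accurate, which is exactly the privacy-versus-function trade-off the user controls.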
Giuseppe Petracca is a computer scientist and engineer at Pennsylvania State University in University Park. He and his colleagues took a different approach. They are trying to protect users from accidentally allowing sensor access to deceitful apps. Their security system is called AWare.
When they are first installed, apps have to get a user’s permission to access certain sensors. These might include the mic and camera. But people can be careless about granting those permissions, Uluagac says. All too often, “people blindly give permission,” he says, to use the phone’s camera or microphone. They may give no thought to why the apps might — or might not — need them.
AWare instead requests permission from a user the first time an app tries to access a certain sensor in response to a certain input. For instance, this might happen when you press a camera button for the first time after downloading an app. On top of that, the AWare system memorizes the state of the phone when the user grants that first permission. It remembers the exact appearance of the screen, the sensors that were requested and other information. That way, AWare can tell users if and when the app later attempts to trick them into granting unintended permissions.
The Penn State researchers imagined a crafty data-stealing app. It would ask for camera access when the user first pushes a camera button. But it would then also try to access the mic when the user later pushes that same button. The AWare system would realize the mic access wasn’t part of the initial deal. It would then ask the user again if he or she would like to grant this additional permission.
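The behavior described above can be sketched as a small lookup table: remember which sensors were approved for a given on-screen input, and re-prompt whenever the same input suddenly triggers a different sensor set. This is only an illustration of AWare’s idea; the app, button and sensor names are invented.

```python
# Illustrative sketch of context-bound permissions, AWare-style.
# Invented app, button and sensor names.

granted = {}  # (app, ui_input) -> frozenset of approved sensors

def request(app, ui_input, sensors):
    """Return 'allow' if this exact context was approved before;
    otherwise record the grant and return 'ask user'."""
    key = (app, ui_input)
    wanted = frozenset(sensors)
    if granted.get(key) == wanted:
        return "allow"
    granted[key] = wanted  # assume the user approves when prompted
    return "ask user"

print(request("photo_app", "camera_button", ["camera"]))         # ask user
print(request("photo_app", "camera_button", ["camera"]))         # allow
print(request("photo_app", "camera_button", ["camera", "mic"]))  # ask user
```

The third call is the crafty-app scenario: the same button now also wants the mic, so the system notices the context changed and asks the user again.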
Petracca and his colleagues tested AWare with people using Nexus smartphones. Those using phones equipped with AWare avoided unwanted authorizations about 93 percent of the time. That’s compared with just 9 percent among people using smartphones with typical first-use or install-time permission policies.
The price of privacy
The security team in Google’s Android division is also trying to mitigate the privacy risks posed by app sensor data collection. Rene Mayrhofer is an Android security engineer in Austria at Johannes Kepler University in Linz. He and his colleagues are keeping tabs on the latest security studies coming out of university labs.
But just because someone has a successful prototype of a new smartphone-security system doesn’t mean it will show up in future phone updates. Android hasn’t incorporated any of these proposed sensor safeguards yet. That’s because its security team is still looking for the right balance. The team wants to restrict access for nefarious apps but not slow or degrade the functions of trustworthy programs, Mayrhofer explains.
“The whole [app] ecosystem is so big,” he notes. “And there are so many different apps out there that have a totally legitimate purpose.” Any kind of new security system that curbs an app’s access to the phone’s sensors, he says, could pose “a real risk of breaking” legitimate apps.
Tech companies may also be reluctant to adopt more security measures. Why? These extra protections can come at the cost of user friendliness. (AWare’s additional permission pop-ups, for instance.)
Mani Srivastava is an engineer at the University of California, Los Angeles. There’s always a trade-off between security and convenience, he says. “You’re never going to have this magical sensor shield [that] gives you this perfect balance of privacy and utility.”
But phones are relying on ever more — and more powerful — sensors. And algorithms for analyzing their data are becoming smarter. Because of this, even smartphone makers may eventually admit that the current sensor protections aren’t cutting it. “It’s like cat and mouse,” Al-Haiqi says. “Attacks will improve. Solutions will improve.” Then more clever attacks will emerge. And security teams will engineer still more clever solutions. And on and on it goes.
The game will continue, Chakraborty agrees. “I don’t think we’ll get to a place where we can declare a winner and go home.”
Which two statements identify the central ideas of the text?
A. As technology continues to develop, smartphone apps will become more advanced and privacy violations will not occur.
B. Software designers have developed tools to help protect users' personal information from improper gathering.
C. The only way to ensure that private information is not secretly collected is to ban smartphones.
D. Smartphone sensors put the user's private information at risk much more than they improve the user's experience.
E. Cybercriminals have not yet developed a way to determine what people are typing on their phone keypads.
F. Smartphones are equipped with a variety of sensors that collect more information than a person may realize or be comfortable with.
The two statements that identify the central ideas of the text are:
B. Software designers have developed tools to help protect users' personal information from improper gathering.
F. Smartphones are equipped with a variety of sensors that collect more information than a person may realize or be comfortable with.
These statements capture the text's focus on the risks smartphones pose to privacy due to sensor data collection, as well as the development of protective measures against such risks.