Apple’s plan to scan your phone raises the stakes on a key question: Can you trust Big Tech?

Medford (US), Sep 15 (The Conversation) Apple’s plan to scan customers’ phones and other devices for images depicting child sexual abuse generated a backlash over privacy concerns, which led the company to announce a delay.

Apple, Facebook, Google and other companies have long scanned customers’ images stored on the companies’ servers for this material. Scanning data on users’ devices is a significant change.

However well-intentioned, and whether or not Apple is willing and able to follow through on its promises to protect customers’ privacy, the company’s plan highlights the fact that people who buy iPhones are not masters of their own devices. In addition, Apple is using a complicated scanning system that is hard to audit. Thus, customers face a stark reality: If you use an iPhone, you have to trust Apple.

Specifically, customers are forced to trust Apple to only use this system as described, run the system securely over time, and put the interests of their users over the interests of other parties, including the most powerful governments on the planet.

Despite Apple’s so-far-unique plan, the problem of trust isn’t specific to Apple. Other large tech companies also have considerable control over customers’ devices and insight into their data.

What is trust?

Trust is “the willingness of a party to be vulnerable to the actions of another party,” according to social scientists. People base the decision to trust on experience, signs and signals. But past behavior, promises, the way someone acts, evidence and even contracts only give you data points. They cannot guarantee future action.

Therefore, trust is a matter of probabilities. You are, in a sense, rolling the dice whenever you trust a person or an organization.

Trustworthiness is a hidden property. People collect information about someone’s likely future behavior, but cannot know for sure whether the person has the ability to stick to their word, is truly benevolent and has the integrity – principles, processes and consistency – to maintain their behavior over time, under pressure or when the unexpected occurs.

Trust in Apple and Big Tech

Apple has stated that its scanning system will only ever be used to detect child sexual abuse material, and that it includes multiple strong privacy protections. The technical details of the system indicate that Apple has taken steps to protect user privacy unless the targeted material is detected by the system. For example, humans will review someone’s suspect material only when the number of times the system detects the targeted material reaches a certain threshold. However, Apple has offered little proof of how this system will work in practice. After analyzing the “NeuralHash” algorithm on which Apple is basing its scanning system, security researchers and civil rights organizations warn that, contrary to Apple’s claims, the system is likely vulnerable to hackers.
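The threshold mechanism described above can be sketched in simplified form. This is an illustrative assumption, not Apple’s actual implementation: the threshold value, the hash database and the use of a truncated cryptographic hash as a stand-in for a perceptual hash are all placeholders chosen for explanation.

```python
# Illustrative sketch of threshold-based flagging. NOT Apple's NeuralHash
# system: the threshold, database and hash function below are placeholder
# assumptions used only to show the shape of the mechanism.
import hashlib

REVIEW_THRESHOLD = 30  # hypothetical number of matches before human review

# Stand-in for a database of hashes of known prohibited images.
KNOWN_HASHES = {"a3f1", "9bd2", "c477"}

def image_hash(image_bytes: bytes) -> str:
    # Placeholder: a real system uses a perceptual hash that tolerates
    # resizing and re-encoding; a cryptographic hash like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()[:4]

def needs_human_review(images: list[bytes]) -> bool:
    """Flag a photo library only when matches reach the review threshold."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```

The point of the threshold is that a single false match, on its own, never triggers review; only an accumulation of matches does.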

Critics also fear that the system will be used to scan for other material, such as indications of political dissent. Apple, along with other Big Tech players, has caved to the demands of authoritarian regimes, notably China, to allow government surveillance of technology users. In practice, the Chinese government has access to all user data. What will be different this time? It should also be noted that Apple is not operating this system on its own. In the US, Apple plans to use data from, and report suspect material to, the nonprofit National Center for Missing and Exploited Children. Thus, trusting Apple is not enough. Users must also trust the company’s partners to act benevolently and with integrity.

Another concern is unintended consequences. Apple might genuinely want to protect children and protect users’ privacy at the same time. Nevertheless, the company has now announced – and staked its trustworthiness on – a technology that is well-suited to spying on large numbers of people. Governments might pass laws to extend scanning to other material deemed illegal. Would Apple, and potentially other tech firms, choose not to follow these laws and pull out of those markets, or would they comply with potentially draconian local laws? There’s no telling what the future holds, but Apple and other tech firms have chosen to acquiesce to oppressive regimes before. Tech companies that choose to operate in China, for example, are forced to submit to censorship.

This case exists within a context of regular Big Tech privacy invasions and moves to further curtail consumer freedoms and control. The companies have positioned themselves as responsible parties, but many privacy experts say there is too little transparency and scant technical or historical evidence for these claims.

Big Tech’s less-than-encouraging track record