Jan 15, 2021

Accenture: Secure data sharing - without the risks

data sharing
Technology
Accenture
AI
Janet Brice
4 min
Privacy Preserving Computation (PPC) techniques can create a safe climate of secure data sharing to maximise collaboration and trust between partners...

Trust is the key stumbling block to data sharing, according to a report from Accenture, which highlights how organisations can maximise collaboration through secure data sharing without fear of losing their competitive advantage.

A new family of Privacy Preserving Computation (PPC) techniques allows data to be jointly analysed between parties without exposing it to undue risk. The report outlines four such techniques and how they can be applied across industries, including healthcare.

“PPC techniques open many new opportunities for enterprise collaborations that were not previously possible due to risk or regulation,” claims the report, Maximize collaboration through secure data sharing.

The need to share data is reflected in a recent Accenture C-suite survey in which 36% of executives said the number of organisations they had partnered with had doubled or more in the last two years. It also revealed that 71% of executives predict the volume of data exchanged with ecosystems will increase in the future.

But there are two main hurdles organisations need to overcome before feeling confident about sharing data: 

  • Trust remains elusive
  • The risk of sharing data is disproportionately higher than the potential value of sharing data – even in the presence of trust

PPC techniques address these two barriers by allowing data to be jointly analysed without all aspects of that data being shared. “By doing so, companies can gain back control of their data and the risks associated with sharing it, even when used beyond their borders,” says Accenture.

What are PPC techniques?

PPC techniques are a family of cybersecurity methods that represent data in a form that can be shared, analysed and operated on without exposing the raw information.

According to Accenture, encryption techniques often form the core of how PPC techniques provide these capabilities, but here they are used in a slightly different way.

“PPC techniques use encryption differently to provide a mechanism to share data with other parties while limiting how or where the other parties can access the data, what parts of the data they can see, or what they can infer from the data,” says the report.

This can be done by one or more of the following:

  • Control the environment within which the data can be operated on
  • Obscure the data to protect its privacy and remove identifying traits
  • Provide a way to allow the data to be operated on while encrypted

“You could think of this as cooking a meal without seeing the ingredients or doing a jigsaw puzzle without seeing the picture of the intended outcome.”

Four of the primary PPC techniques highlighted in the report are:

  1. Trusted execution environment

An environment with special hardware modules that allows data to be processed inside encrypted, hardware-protected private memory areas on the microprocessor chip, accessible only to the running process.

  2. Differential privacy

A data obfuscation mechanism - often used alongside traditional anonymisation - that allows broad statistical information to be gathered from data without exposing the specifics of individual items; a minimal sketch follows after this list.

  3. Homomorphic Encryption

A technology that enables computation on encrypted data without the need to decrypt it first; a toy example follows after this list.

  4. Secure Multi-Party Computation (MPC)

A mechanism that allows a group to share the benefits of combining their data to create useful outputs while keeping their source data private; a short illustration follows after this list.
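
To make the list concrete, here is a minimal sketch of the differential privacy idea in Python: a counting query answered with Laplace noise calibrated to the query’s sensitivity. The dataset, the epsilon value and the function name are illustrative choices, not taken from the Accenture report.

```python
import random

def noisy_count(records, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1, so Laplace noise with scale 1/epsilon
    suffices; the difference of two exponential samples with rate
    epsilon is exactly a Laplace(0, 1/epsilon) draw.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Roughly how many people are over 40, without exposing any single record.
ages = [34, 29, 41, 52, 38, 27, 45]
print(noisy_count(ages, lambda a: a > 40, epsilon=0.5))
```

A smaller epsilon means more noise and stronger privacy: the analyst trades accuracy for protection of individual records.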
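
Homomorphic encryption can likewise be demonstrated with a toy version of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of their plaintexts. The tiny hard-coded primes below are purely illustrative; a real deployment would rely on a vetted library and full-size keys.

```python
import random
from math import gcd

p, q = 293, 433                # toy primes, far too small for real security
n, n2 = p * q, (p * q) ** 2
g = n + 1                      # standard Paillier generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p - 1, q - 1)
mu = pow(lam, -1, n)           # valid shortcut because g = n + 1

def encrypt(m):
    r = random.randrange(2, n)
    while gcd(r, n) != 1:      # blinding factor must be coprime to n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = encrypt(17), encrypt(25)
product = (a * b) % n2          # multiplying ciphertexts adds plaintexts
assert decrypt(product) == 42   # 17 + 25, computed without ever decrypting
```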
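
Secure multi-party computation has an equally compact building block: additive secret sharing. In the sketch below, three hypothetical companies learn the total of their revenues while every individual figure stays hidden; the company names and numbers are invented for the example.

```python
import random

PRIME = 2**61 - 1  # all arithmetic is modulo a public prime

def share(secret, n_parties):
    """Split a secret into random-looking shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

revenues = {"A": 120, "B": 75, "C": 240}
all_shares = {name: share(value, 3) for name, value in revenues.items()}

# Party i's subtotal: the sum of the i-th share from every company.
subtotals = [sum(all_shares[name][i] for name in revenues) % PRIME
             for i in range(3)]

# The published subtotals reveal only the grand total, never an input.
print(sum(subtotals) % PRIME)  # 435
```

Each share on its own is uniformly random, so no single party learns anything about another company’s revenue; only the combined subtotals disclose the agreed output.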

“While these PPC techniques and technologies are still new, they are rapidly maturing and are now at a point where they can be used in real business use cases,” comments Accenture, pointing to organisations from Google to the Danish Sugar Company and Kara that are already maximising the benefits of secure data sharing with their partners and consumers.

According to Accenture, there are also emerging opportunities to disrupt existing markets through the combined effect of PPC techniques and other technologies such as blockchain and the Internet of Things (IoT).

One example cited is MyHealthMyData (MHMD), an EU-funded project that is looking at how to share anonymised data for medical care, research and development, while giving people ownership over their health data. To perform this secure function, the platform combines blockchain, smart contracts, dynamic consent and a suite of data privacy and secure analytics tools, including Homomorphic Encryption and MPC.

“Beyond the traceability and control of data considerations, these technologies enable partners to work in a decentralised way, giving them the opportunity to jointly investigate common or shared business issues. Companies are also able to apply Artificial Intelligence and improved analysis methods to datasets that they had not previously had access to. This means collaborations with external parties - even competitors - are now possible, and in some cases, well underway,” concludes Accenture.

Read more

For more information on business topics in Asia Pacific, Australia and New Zealand, please take a look at the latest edition of Business Chief APAC.

Follow Business Chief on LinkedIn and Twitter. 


Jun 17, 2021

Chinese Firm Taigusys Launches Emotion-Recognition System

Taigusys
China
Huawei
AI
3 min
Critics claim that new AI emotion-recognition platforms like Taigusys could infringe on Chinese citizens’ rights; Taigusys disagrees

In a detailed investigative report, the Guardian reported that Chinese tech company Taigusys can now monitor facial expressions. The company claims that it can track fake smiles, chart genuine emotions, and help police curtail security threats. ‘Ordinary people here in China aren’t happy about this technology, but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it’, said Chen Wei, company founder and chairman. ‘There’s always that demand, and we’re here to fulfil it’. 

Who Will Use the Data? 

As of now, the emotion-recognition market is projected to be worth US$36bn by 2023, which hints at rapid global adoption. Taigusys counts Huawei, China Mobile, China Unicom, and PetroChina among its 36 clients, but none has yet revealed whether it has purchased the new AI. In addition, Taigusys will likely implement the technology in Chinese prisons, schools, and nursing homes.

It’s not likely that emotion-recognition AI will stay within the realm of private enterprise. President Xi Jinping has promoted ‘positive energy’ among citizens and intimated that negative expressions are harmful to a healthy society. If the Chinese central government continues to gain control over private companies’ tech data, national officials could use emotional data for ideological purposes and target ‘unhappy’ or ‘suspicious’ citizens.

How Does It Work? 

Taigusys’s AI tracks facial muscle movements, body motions, and other biometric data to infer how a person is feeling, collecting massive amounts of personal data for machine learning purposes. If an individual displays too much negative emotion, the platform can recommend him or her for what’s termed ‘emotional support’, which may turn out to be something much worse.

Can We Really Detect Human Emotions? 

This is still up for debate, but many critics say no. Psychologists still debate whether human emotions can be separated into basic emotions such as fear, joy, and surprise across cultures or whether something more complex is at stake. Many claim that AI emotion-reading technology is not only unethical but inaccurate since facial expressions don’t necessarily indicate someone’s true emotional state. 

In addition, Taigusys’s facial tracking system could promote racial bias. One of the company’s systems classes faces as ‘yellow, white, or black’; another distinguishes between Uyghur and Han Chinese; and sometimes, the technology picks up certain ethnic features better than others. 

Is China the Only One? 

Not a chance. Other countries have also tried to decode and use emotions. In 2007, the U.S. Transportation Security Administration (TSA) launched a heavily contested training programme (SPOT) that taught airport personnel to monitor passengers for signs of stress, deception, and fear. But China as a nation rarely discusses bias, and as a result, its AI-based discrimination could be more dangerous. 

‘That Chinese conceptions of race are going to be built into technology and exported to other parts of the world is troubling, particularly since there isn’t the kind of critical discourse [about racism and ethnicity in China] that we’re having in the United States’, said Shazeda Ahmed, an AI researcher at New York University (NYU).

Taigusys’s founder counters that its system can help prevent tragic violence, citing a 2020 attack in Guangxi Province in which 41 people were stabbed. Yet top academics remain unconvinced. As Sandra Wachter, associate professor and senior research fellow at the Oxford Internet Institute, said: ‘[If this continues], we will see a clash with fundamental human rights, such as free expression and the right to privacy’.
