Data (R)Evolution: Consumer welfare and growth in the digital economy
Rod Sims
Speech: Consumer Policy Research Centre 2019 Conference (19 November 2019)
ACCC Chair Rod Sims addresses the Consumer Policy Research Centre 2019 conference about consumer welfare and the emerging challenges for policymakers and regulators in the face of the growing digital economy.
Full speech reproduced below
Reproduced in accordance with CC BY 3.0 AU licence
© Commonwealth of Australia
Transcript:
Check against delivery.
Over the first two decades of this century, the digital platforms have been major agents of change in all our communications, and indeed in our economy.
Australians undoubtedly benefit from the many ‘free’ services offered by digital platforms. They connect business to consumers, provide access to information on a scale not seen before, and have the ability to intimately link family and friends across the globe.
While all this is of great value, increasingly Australians want to know: where is all this leading us? What have been heralded as great innovations are now having their value questioned.
As in other sectors, with greater market power comes greater responsibility, and with it, greater scrutiny.
In 2017 the Australian Federal Government implemented broad media reforms, and as part of the reform package directed the ACCC to undertake the Digital Platforms Inquiry (DPI) to look at the impact of digital platforms on media, advertisers and consumers.
We found that for business users and advertisers, Google and Facebook are essentially the gatekeepers to consumers in Australia.
Each month, approximately:
19.2 million Australians use Google Search
17.3 million Australians access Facebook
17.6 million Australians watch YouTube, owned by Google; and
11.2 million Australians access Instagram, owned by Facebook.
The market dominance of Google and Facebook gives them access to unparalleled amounts of private and public data. Not only do they collect data from consumers spending time on their own websites and apps, which accounts for over 39 per cent of Australians’ time spent online, they have the largest network of trackers on third-party websites and apps.
A study of the top 1 million websites found a Google tracker on more than 70 per cent of websites and a Facebook tracker on over 20 per cent. Research of the top 1 million apps on the Google Play store found that 88 per cent of the apps were sending data to Google and 43 per cent of the apps were sending data to Facebook. Virtually every keystroke registers a hit in their data bank and reinforces their market power.
This unmatched access to data on consumers’ online behaviour, together with their advanced data analytics tools, gives Google and Facebook a considerable competitive advantage, reinforcing their market power in Australian media and advertising markets.
But as a Google insider recently tweeted:
‘Data is not the new gold, data is the new uranium. Sometimes you can make money from it, but it can be radioactive, it's dangerous to store, has military uses, you generally don't want to concentrate it too much, and it's regulated.’
While most users now have some understanding that certain types of data and personal information are collected in return for their use of a service, our consumer surveys show there remains little understanding of what is collected and the economic power of that collected data.
Today I want to bring out these points by focusing on:
Our DPI and some of the consumer harms it identified
Data sharing in the changing world, and
Some observations on behaviours post our Digital Platforms Inquiry.
1. Our DPI report data shows consumer harms
In our recent Inquiry the ACCC found that few consumers are fully informed of, or can effectively control, how their data is collected, used and shared by digital platforms. Nor do they fully appreciate the extent to which their actions online are monitored and recorded for different purposes, including targeted advertising; nor do they understand the breadth of permissions they grant to digital platforms regarding the use of their user-uploaded content.
Vague, long and complex data policies contribute to this substantial disconnect between how consumers think their data should be treated and how it is actually treated.
The fact that digital platform users are not well informed about the collection and use of their data affects competition and consumer welfare.
A lack of clear information reduces consumers’ ability to make informed choices based on how their data will be handled, in turn preventing competition on this important element of digital platforms’ service offerings.
The collection of user data is at the heart of the business models of digital platforms like Google and Facebook, as it allows them to offer highly targeted advertising opportunities. Some 84 per cent of Google’s revenue comes from advertising, while 98 per cent of Facebook’s revenue comes from advertising.
The ubiquity of Google and Facebook means many consumers feel they have to join or use these platforms, and agree to their non-negotiable terms of use, in order to receive communications and remain involved in community life. That can mean accepting clickwrap agreements containing take-it-or-leave-it terms with numerous bundled consents for the collection, use and sharing of their user data.
In the ACCC’s review of several large digital platforms’ terms and conditions, we found that each of their terms of service required a user to grant the digital platform a broad licence to store, display, or use any content uploaded by the user.
What’s more, terms and conditions about data collection and use can often be changed unilaterally, and once consumers have agreed to the collection of their data they have minimal control over how that data is kept and shared in the future.
These extensive data collection practices of the digital platforms reflect an imbalance of bargaining power between platforms and their users.
Our Inquiry made 23 recommendations on how to start addressing the challenges of the fast evolving digital world. At their core is the need to have a holistic approach to the interrelated competition, consumer protection, privacy, media and advertising issues.
Our key recommendations included:
the need for continued scrutiny of the competitive implications and potential consumer harms associated with digital platforms
a further inquiry into ad-tech which would lead to a better understanding of how consumer data is used and shared in the ad-tech supply chain
the introduction of an Unfair Practices prohibition and penalties for unfair contract terms which would effectively deter conduct arising in digital markets that may not neatly fit within existing consumer protection provisions
the creation of an ombudsman scheme or the extension of an existing ombudsman scheme to more effectively resolve disputes between digital platforms and consumers and small business
a code of practice for digital platforms to regulate their data practices of particular concern to consumers, and
importantly, both strengthening the Privacy Act and undertaking a broader review of privacy law to ensure it remains fit for purpose in the digital age.
In relation to the last two points, we have been working closely with Angelene Falk and the OAIC. They, and the Attorney-General’s Department, will play the lead role in taking those reforms forward.
2. Data sharing in a changing world
Consumers’ rights over their data can be further weakened where a company holding their consumer data is sold or merged. The ACCC’s review of terms and policies found that each set of terms and policies included a right for the platform to transfer the user’s data to a third party in the event of a merger or acquisition, presumably without the user’s consent, and often under different terms and conditions.
While parties frequently claim that they don’t have the incentive or ability to combine data sets, history tells us that incentives and ability can and do change, often over a relatively short period of time. This can be the result of a change of heart, or of advancements in technology enabling such combinations to occur.
At the time of Google’s acquisition of DoubleClick, DoubleClick reportedly denied that the data it collects through its system for serving ads would be combined with Google’s search data. Eight years later, Google updated its privacy policy and removed a commitment not to combine DoubleClick data with personally identifiable data held by Google.
At the time of Facebook’s acquisition of WhatsApp, Facebook claimed it was unable to establish reliable matching between Facebook users and WhatsApp users’ accounts. Two years later, WhatsApp updated its terms of service and privacy policy, indicating it could link WhatsApp users’ phone numbers with Facebook users’ identities.
These changes in data collection policies create a very uncertain world for consumers who share very personal information and often commit their digital lives to particular platforms.
The issue of platforms expanding and potentially increasing or enhancing market power by combining data sets is already being raised by some commentators in relation to Google’s recently announced proposed acquisition of Fitbit.
A significant attraction for some consumers signing up to Fitbit’s hugely successful wristband products has been its reputation for having robust privacy policies and settings, including controls over data collection and deletion.
Therefore an important issue for consumers will be whether those privacy settings will remain in place after the acquisition. While both Fitbit and Google have indicated that Fitbit users’ health and wellbeing data will not be used for Google Ads, given the prior history I have just mentioned, it is a stretch to believe that commitment will still be in place five years from now.
It is also worth noting that while Fitbit users’ health and wellbeing data may not be used for Google Ads, it may potentially be used by Google for other purposes. It may also be the case that the data collected by Fitbit which is not ‘health and wellbeing’ data could be used for Google Ads.
On top of this, how will we know anyway?
Considering the past, it is no surprise to see reports that Fitbit users are concerned about their data, in spite of the public promises to honour Fitbit’s privacy policies.
Clearly personal health data is an increasingly valuable commodity. I am sure many of you would have seen the recent reports from the US of Google’s ‘Project Nightingale’, which involves the use of health data from a large health care provider. This has been described as ‘the largest data transfer of its kind so far in the healthcare field’ and involves the personal health information of patients across a network of 2,600 hospitals, clinics and other medical outlets.
Understandably, concerns have been expressed that so much sensitive health data, which is likely to be extremely valuable, is being provided to a single global tech company which makes its wealth by monetising data.
Facebook’s recent announcement of its planned cryptocurrency offering, Libra, is also a potential cause for concern. It is easy to see how there could be issues when a social media company that collects users’ personal data on a significant scale will now also provide them with financial services.
We understand that banks share their data but this is different.
Here we have an organisation, whose lifeblood is to monetise data, getting into the financial services industry.
While Facebook has publicly announced that its digital wallet, Calibra, will not share account information or financial data with Facebook or any third party without customer consent; and that the Calibra financial data will not be used to improve ad targeting on Facebook products, our ability to rely on the implementation of this broad commitment remains an issue.
The risks of these data practices are reinforced by the findings of the ACCC Inquiry, which showed around 80 per cent of users considered digital platforms’ tracking of their online behaviour to create profiles, and the sharing of their personal information with an unknown third party, to be a misuse of their information.
Even seemingly benign data collection programs designed to reward consumers can be misleading. Recently we have seen how customer loyalty schemes, such as frequent flyer and supermarket loyalty programs, are used to build data profiles that are sold on to third parties without the customer’s knowledge.
Then there are the instances of sensitive user data, collected by websites and apps, being passed through to third parties for a range of purposes, including marketing.
Research has found some period tracker apps share data with third parties including companies that provide health and fitness services as well as digital platforms such as Facebook. The information shared includes information such as alcohol consumption, symptoms experienced, and text entered as private diary entries.
The proliferation of data and the advancement of data analytics has led to a growth in scams and identity theft, as well as growing risks of consumer harm arising from reduced competition and potential for discrimination and exclusion.
For example, data could be used to target individuals to buy goods at inflated prices; or can provide sellers and advertisers with information that allows discrimination between buyers based on income, geographic location or health issues.
Extensive profiling of individuals also opens our society up to other harms such as fake news and interference in our political processes. Just this last week we have seen the example of mining baron Twiggy Forrest’s attack on Facebook for carrying fraudulent advertising that uses his name and image without permission. He is one of a number of high profile people, including James Packer and former NSW Premier Mike Baird, who are being used by scammers in fake ads showing on Facebook.
That Facebook have been slow to correct this problem is worrying, and a surprise to me. They seem to be able to do quite a lot of remarkable things with their algorithms so I find it interesting that they say they can’t deal with this.
At its heart it is a question of responsibility. To have a position of basically no or little responsibility for what’s on the platform is of great concern and is fundamentally unsustainable.
3. Some observations on behaviour post our DPI
We have not been idle.
We recently commenced a court action against Google alleging misleading conduct and false or misleading representations to consumers about the personal location data Google collects, keeps and uses.
The ACCC has alleged that Google breached the Australian Consumer Law when it made on-screen representations on Android mobile phones and tablets. We allege that these representations, made from at least 2017, misled consumers about the location data Google collected or used when certain Google Account settings were enabled or disabled.
I won’t say any more about the matter as it is currently before the courts.
The ACCC has also instituted proceedings in the Federal Court against online health booking platform HealthEngine Pty Ltd for misleading and deceptive conduct relating to the sharing of consumer information with insurance brokers and the publishing of patient reviews and ratings.
The ACCC claims that between 31 March 2015 and 1 March 2018, HealthEngine manipulated the patient reviews it published, and misrepresented to consumers why HealthEngine did not publish a rating for some health practices.
This conduct is particularly egregious because patients would have visited doctors at their time of need based on, we allege, manipulated reviews that did not accurately reflect the experience of other patients.
We also allege that patients were misled into thinking their information would stay with HealthEngine but, instead, their information was sold off to insurance brokers.
Again this is before the courts so I will not say any more.
We have not been alone in acting against the digital platforms.
If anyone were to doubt the influence and commercial effect of digital platforms on consumer markets, they only have to look at some of the verdicts and penalties coming from investigations around the world.
The European Commission fined Google €2.42 billion in 2017 for abusing its market dominance as a search engine by giving an illegal advantage to its own comparison shopping service, Google Shopping.
The Italian Competition Authority fined Facebook €5 million twice in December 2018. It also ordered Facebook to cease unfair practices such as transmitting user data to, and receiving it from, third party websites and/or apps without users’ prior express consent, for commercial purposes and for user profiling.
In 2019 the US Federal Trade Commission (FTC) fined Facebook US$5 billion for deceiving users about their ability to control the privacy of their personal information.[1] The FTC also fined Google US$22.5 million in relation to charges it misrepresented to consumers the extent of privacy it offered to users of Apple’s Safari internet browser;[2] and a further US$170 million to settle charges YouTube illegally collected personal information from children without their parents’ consent.[3]
All these cases and subsequent judgements provide welcome precedents for regulators worldwide. Some not only set the bar for penalties but invariably provide insight into the often opaque operations of the digital platforms. Certainly these actions and penalties emphasise the size of the issues we are all dealing with in relation to digital platforms.
Since the publication of our Inquiry’s Final Report, the major digital platforms have made a number of announcements that suggest they are responding to the increased scrutiny of their practices by consumers and regulators across the world.
In August 2019, Facebook announced ‘Off-Facebook Activity’, a privacy control that will reportedly allow its users to clear their browsing history from their Facebook Accounts.
On 5 August 2019, Facebook announced a number of new partnerships with Australian news media businesses to fund and feature original news content on its Facebook Watch service.
On 12 September 2019, Google announced it had changed its search algorithms to prioritise original journalism.
In September 2019, Facebook announced the suspension of thousands of apps resulting from a privacy review it launched after the Cambridge Analytica data scandal.
On 31 October 2019, Twitter announced that it will ban all paid political advertising worldwide, starting 22 November.
While political advertising was not a strong focus in our Inquiry, algorithm-based messaging was raised as part of our inquiry into disinformation, misinformation and mal-information on digital platforms.
Most recently, on 25 October, Facebook announced it would introduce the ‘Facebook News Tab’, a content aggregator to prominently feature news content that Facebook is essentially syndicating from publishers, including allowing users free access to select articles on otherwise pay-walled publications.
Facebook CEO Mark Zuckerberg said of the tab that ‘for the first time there will be a place in the Facebook app dedicated solely to high-quality news.’
I see that News Corp chief executive Robert Thomson, a long-standing critic of Facebook, called the launch of their news service ‘a fundamental change to the digital ecosystem’.
As yet there is no indication if it will be offered locally or affect local companies.
One recommendation from our Inquiry was to implement a Bargaining Code, to address the imbalance in bargaining between the digital platforms and news businesses.
These recent announcements are vindication of our recommendations; when there is sufficient pressure on their behaviour, the digital platforms are willing to have genuine negotiations and recognise the role they perform in news distribution.
In conclusion
In an ideal world of efficient, data-driven markets, the digital platforms would innovate and compete for well-informed consumers who could choose between services based on a range of price and quality considerations as well as different levels of privacy protections and data security.
We are, however, far from an ideal world.
We are concerned that protections for consumers need to match the new digital age, as the existing regulatory frameworks for the collection and use of data have not held up well to the challenges of digitalisation.
The scale and complexity of the problems identified by our Inquiry highlight the need for decisive action.
The government’s response to the DPI is coming by year end.
While we are already seeing some fightback by vested interests, we are also seeing some initial change from the digital platforms.
Concerted pressure produces change, and that change also emphasises the need to maintain the pressure.