Data Privacy Under Spotlight Again in US and Europe

Two recent developments have once again brought data privacy under the spotlight. The Federal Trade Commission in the US is exploring rules to crack down on harmful commercial surveillance and lax data security.

In a communiqué, the commission said it was seeking public comment on the harms stemming from commercial surveillance and was looking into whether new rules are needed to protect people’s privacy and information.

“Commercial surveillance is the business of collecting, analyzing, and profiting from information about people. Mass surveillance has heightened the risks and stakes of data breaches, deception, manipulation, and other abuses,” the agency said.

While the commission continued its harsh language toward the industry, one silver lining was its use of the term “commercial surveillance” instead of “data brokers” or “murky data marketplace”, terms it had used earlier, which many feared unfairly placed the burden of the misdeeds of a handful of bad players on the entire industry.

Meanwhile, across the Atlantic, a recent ruling by the European Court of Justice has now put companies dealing with consumer data under increased pressure, especially those profiling users to target them with behavioral ads or personalized content.

“Personal data which are, by their nature, particularly sensitive in relation to fundamental rights and freedoms merit specific protection as the context of their processing could create significant risks to the fundamental rights and freedoms,” the court said while ruling on a case involving public officials in Lithuania whose spouses’ names had been published online. The court said this could indicate their sexual orientation, which is why such data is sensitive in nature.


“Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited,” the EU court ruled.

While the GDPR mandates that information about a person’s personal or political views, religion, health, or sexual orientation be treated as sensitive, prohibiting companies from processing it without safeguards, experts now believe this could extend to other types of potentially sensitive information as well, including a person’s location, travel history, or dating app activity.

This may also involve unstructured data in many cases. Further, as The Washington Post points out, many companies with large data sets may not know they hold details that indirectly relate to sensitive information.

Both moves are significant, and they will ultimately streamline how the tech industry handles user data. As FTC Chair Lina M. Khan underlines: “The growing digitization of our economy — coupled with business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used — means that potentially unlawful practices may be prevalent.”

For instance, Amazon’s recent announcement that it is acquiring home cleaning robotics company iRobot has raised concerns that the deal could hand the tech giant a treasure trove of personal data: interior maps of users’ homes.

This is unnerving, to say the least.

These “unlawful practices” not only mean profiteering, as Big Tech companies such as Facebook and Google have demonstrated many times, but may pose a physical threat to users. This was evident in the US during the data privacy scandal surrounding the abortion debate, when two data players were found selling information on people who had visited abortion clinics, and again in the recent revelation that the Department of Homeland Security had bought, without warrants, the location data of millions of cellphone users from third-party players to track their movements.

Also, as the FTC rightly points out, the algorithms used by data players are prone to errors, bias, and inaccuracy. “As a result, commercial surveillance practices may discriminate against consumers based on legally protected characteristics like race, gender, religion, and age, harming their ability to obtain housing, credit, employment, or other critical needs.”

Streamlining the rules and regulations will go a long way in addressing many of these concerns.

Disclaimer: Views expressed are the author’s own. Geospatial World may or may not endorse them.



Anusuya Datta

A writer based in Canada, Anusuya is the Editor (Technology & Innovation), focused on developments in North America. She earlier worked with Geospatial World as Executive Editor. A published author on several international platforms, she has worked with some of the finest brands in Indian media. A writer by choice, an editor by profession, and a technology commentator by chance, Anusuya is passionate about news and numbers, but it is the intersection of technology, sustainability, and humanitarian issues that excites her most.
