Canada to implement new privacy law soon

This is an interesting situation. Internet regulation is good, but this seems to go too far in my opinion. It refers to implementing the right to be forgotten here. It was recommended by the Privacy Commissioner's office.

I read the entire report; it doesn't say much. It basically says that the regulation should be like the GDPR and the CCPA, that there should be safeguards for small businesses, and that the Privacy Commissioner should have the power to fine businesses that break the regulation. The odd part is that it seems to take a disproportionate aim at AI for some reason.

link to the report: https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/completed-consultations/consultation-ai/pol-ai_202011/

Yes, but our laws are much different from America's. If a new technology has privacy implications, its use must comply with the law, and that requires the federal government to either amend our privacy laws or enact a new bill in Parliament.

As well, Canada is not a fan of using AI or other technologies to collect data on its own citizens, and it often takes a hard stance. Even more so because our privacy laws do not cover or safeguard this type of analytics and data collection on the general population. At best, we can offer protection only with law enforcement involvement, but this has far bigger consequences when we add commercial businesses to the mix.

Much of this is because of recent events, like this one: https://www.cbc.ca/news/politics/cadillac-fairview-5-million-images-1.5781735


The problem is that security is often an afterthought. In the EU, when you plan a new project or product, you have to involve the company's Data Protection Officer at the planning stage, and what they say goes. They also have employment protection: they cannot be fired or penalised for raising security or data protection concerns during their tenure or for four years thereafter. In other words, if they make unpopular decisions that keep the company on the straight and narrow but annoy senior management, they can't be picked on or fired for doing their job.

In the past, I've seen enough projects where the aim is to get the product off the ground and running with little or no thought given to data protection or even basic security. Testing for SQL injection (security 101 since the turn of the century) is beyond most project teams I've dealt with. In one case, I pointed out during testing that their eShop had a SQL injection vulnerability; they weren't interested, even after I sent them a list of their own customer data. In the end, I just did a “DROP DATABASE;” on them. Now that got their attention!
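To illustrate the kind of security-101 check I mean, here is a minimal Python sketch (using an in-memory sqlite3 database and a made-up customers table, purely for illustration) of how string-built SQL falls to a classic injection payload while a parameterized query treats the same payload as harmless data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'alice@example.com')")

user_input = "' OR '1'='1"  # classic injection payload

# VULNERABLE: string concatenation lets the payload rewrite the query,
# so it matches every row instead of none.
query = "SELECT * FROM customers WHERE email = '" + user_input + "'"
print(conn.execute(query).fetchall())  # leaks the whole table

# SAFE: a parameterized query treats the payload as a literal value.
print(conn.execute(
    "SELECT * FROM customers WHERE email = ?", (user_input,)
).fetchall())  # returns nothing
```

Running the concatenated version against real customer data is essentially what I demonstrated to that project team; the parameterized version is a one-line fix they could have made at any point.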

AI has a lot of areas that are hard to test, so we need to be doubly certain that a system is doing what it should and not straying from its mandate; just look at how lousy facial recognition has been, especially on non-Caucasian males. Big IT is its own worst enemy when it comes to actually following the laws around the world, and in many cases it is cheaper to keep breaking the law than to comply. Facebook was caught doing something illegal in Russia a couple of years back (what exactly escapes me), but the maximum fine was around $15,000 US, so it was cheaper to keep paying the fine every few months than to actually set programmers to deal with it and accept the corresponding drop in ad revenue.
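On the testing point, even a crude evaluation harness makes this kind of skew visible. Here's a toy Python sketch (the groups and results are entirely invented) of how a single aggregate accuracy number can hide a badly uneven per-group error rate, which is roughly what the facial recognition audits keep finding:

```python
from collections import defaultdict

# Invented evaluation records for a hypothetical face matcher:
# (demographic_group, predicted_match, actual_match)
results = [
    ("group_a", True, True), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False),
    ("group_b", False, False),
]

per_group = defaultdict(lambda: [0, 0])  # group -> [wrong, total]
for group, predicted, actual in results:
    per_group[group][0] += predicted != actual
    per_group[group][1] += 1

overall_wrong = sum(w for w, _ in per_group.values())
overall_total = sum(t for _, t in per_group.values())
print(f"overall: {overall_wrong / overall_total:.0%} error rate")
for group, (wrong, total) in per_group.items():
    print(f"{group}: {wrong / total:.0%} error rate over {total} trials")
```

The overall figure (40% here) looks like one mediocre system; the breakdown shows one group with a 0% error rate and the other with 67%. If nobody bothers to slice the test data, nobody sees it.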

The same thing seems to be the case in Canada currently: my reading of the article is that the only recourse open to them was to tell the real estate company not to do it again. It's like the old joke about English bobbies (police officers), armed only with truncheons: when a robber runs away, they shout, “Stop! Or I'll shout stop again!”

I understand the difference, but I read the entire report and it seems to do very little from a technical perspective.

This recommendation actually has little to do with the Cadillac Fairview situation; the Privacy Commissioner is conducting a separate investigation into that. Also, I just did a keyword search on the report, and it doesn't say a word about facial recognition.

Yes, but while the report refers to the GDPR and the CCPA many times, it doesn't seem to do much of what they actually do.

Here are all the recommendations:

  1. PIPEDA should take an explicit rights-based approach reflected in a preamble and in an amended purpose clause.
  2. PIPEDA should establish rights and obligations rather than provide recommendations.
  3. The definition of personal information in PIPEDA should be amended to make clear that personal information includes collected personal information and inferences about individuals.
  4. PIPEDA should empower the OPC to issue binding orders and financial penalties with enforcement discretion. It should shift the OPC model from an ombudsperson to a regulator with the ability to issue binding guidelines.
  5. PIPEDA should incorporate private rights of action for breaches of statutory obligations.
  6. PIPEDA should incorporate a consent exception when collection or processing serves the public good.
  7. PIPEDA should add flexibility to the collection, use, and dissemination of de-identified information for research purposes while protecting privacy and human rights.
  8. PIPEDA should allow for greater flexibility for further processing of information not initially identified in original purpose when the new purpose is compatible with the original purpose.
  9. PIPEDA should mandate safeguards when exceptions or flexibility measures to consent are invoked: (i) balancing tests and (ii) privacy impact assessments.
  10. PIPEDA should not define AI and should maintain technological neutrality.
  11. PIPEDA should incorporate a right for individuals to obtain a meaningful explanation when subject to automated decision-making and using their personal information.
  12. PIPEDA should create a right to contest decisions that are made about an individual using his or her personal information based on automated decision-making.
  13. PIPEDA should incorporate a right to demonstrable accountability for individuals, including a right to data traceability.
  14. PIPEDA should require organizations to implement appropriate technical and organizational measures for designing for privacy and human rights prior to and in all phases of collection and processing, including the prohibition of deceptive design.
  15. PIPEDA should implement the privacy impact assessment and third-party audit framework by establishing that companies will be held in breach of PIPEDA and sanctioned if organizations fail to take preventive measures and harm to individuals occurs.
  16. PIPEDA should exempt small and medium enterprises from active recordkeeping and from any third-party audits and privacy impact assessment requirements.

My point is that these recommendations don't lay out actions that should be regulated; they just lay out vague ideas, and about half of them are about enforcement of current regulations.
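To make the vagueness concrete: recommendation 11's "meaningful explanation" could arguably be satisfied by something as thin as the following hypothetical sketch (the scoring model, weights, and features are all invented), which just reports each input's contribution to a linear score, and the report gives no hint of whether that would be enough:

```python
# Hypothetical automated decision with a per-feature explanation.
# Everything here is invented for illustration.
WEIGHTS = {"income": 0.4, "years_at_address": 0.2, "missed_payments": -1.5}
THRESHOLD = 1.0

def decide(applicant):
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    approved = sum(contributions.values()) >= THRESHOLD
    # The per-feature contributions double as the "meaningful explanation"
    # (and arguably the data traceability of recommendation 13).
    explanation = [
        f"{feature}: contributed {value:+.2f} to the score"
        for feature, value in contributions.items()
    ]
    return approved, explanation

approved, explanation = decide(
    {"income": 5.0, "years_at_address": 3.0, "missed_payments": 1.0}
)
print("approved" if approved else "declined")
print("\n".join(explanation))
```

Whether regulators would accept that, or demand something closer to a full audit trail, is exactly the kind of detail the recommendations leave open.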

That looks fairly standard fare, although clauses 6 and 7 seem to be a get-out-of-jail-free card of sorts.

It certainly doesn’t seem as tough as the GDPR in this early form.

It's not even really an early form, just a recommendation. I think it would help if they had someone with experience in tech.