Ann Cavoukian University of Toronto Alumni Talk

28-Feb-2017

Joseph's Notes and Commentary

I attended the talk and made handwritten notes.  This is mainly a dump of those notes.  Ann's talk made me think and I've intermingled a few comments where that happened.  My commentary is preceded with "(JS:)".  Ann's slide deck: Feb 28th - UofT Alum - Dr. Cavoukian.ppt

Title:  "It's Time To Protect Your Privacy Before It Is Taken Away From You"

Privacy and Data Protection

  • Up to you
  • Data is (becoming) the oil of the new economy
  • Privacy is about preserving our freedom.  Privacy is the basis of freedom
  • Pendulum is currently swinging away from privacy.
    • Terrorist incidents lead to greater concern for public safety
    • Personal privacy seen as less important than public safety under these circumstances, and even inhibiting public safety
  • Privacy ≠ Secrecy
  • Privacy = Control
    • The state having access to everything (to enhance public safety, for example) is counter to personal freedom
    • Freedom of choice:  individual chooses what they want to reveal
    • Former totalitarian states now greatly value individual control over personal information.
      • Germany:  "Informational Self Determination"
    • Context is key
      • An example is the Ontario health privacy law – "PHIPA"
      • Doctors and health care providers were against the act because, in their view, sharing of health information is paramount to patient care
        • See privacy as a hindrance
      • But, a patient is (likely) willing to share their health information with their health care providers
        • It's up to the patient to decide who has access to their health information
  • Privacy is essential to personal freedom
    • greater freedom for blue sky thinking
    • breeds creativity
    • surveillance is the opposite

Privacy by Design (PbD)

  • Embed privacy into new technologies and business practices from the outset
  • Essence:  power of "and" vs. "or"
    • The zero-sum assumption is wrong: the belief that privacy must be sacrificed in order to improve something else (public safety or health care)
    • False dichotomy
    • replace "or" with "and"
    • (JS: Also resonates with accessibility:  "if we add accessibility it will detract from ..." (zero sum), and our counter argument is that it will improve things for everyone (positive sum)).
  • When discussing the idea of adding privacy from the outset with engineers:
    • engineers' response: not a problem; can be done.
    • then why was it not done?
    • "they" didn't put it into the requirements; "they" being the CEOs (management)
    • Management usually relegates privacy to the end, bolted on after the fact
    • (JS:  this has been and is a common complaint with respect to accessibility, and we even use the same phrase:  "accessibility is bolted on at the end")

Seven Principles of PbD

  1. Proactive
  2. Default Setting
    • Privacy is the default
    • Privacy is not opt-in (see the sketch below)
  3. Embedded in design, automatic
  4. Full functionality
    • Technology does whatever it is supposed to do + privacy
    • Positive sum, not zero sum
  5. End-to-end security
    • Security includes privacy
  6. Visible and Transparent to individual
    • Your data belongs to you
    • You can check it for accuracy
  7. User centric
    • respect for user privacy is built in

(JS:  Exercise: replace "privacy" with "accessibility" in the above seven principles.  Or "inclusion".)
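
(JS: As an illustration of principle 2, here is a minimal sketch in Python of what "privacy as the default setting" could look like in a product's settings code.  The names and structure are hypothetical, mine rather than anything from the talk: the most privacy-protective values are the defaults, and any data sharing requires an explicit, user-initiated opt-in.)

    # Sketch of PbD principle 2: privacy-protective values are the defaults;
    # sharing happens only after an explicit opt-in by the user.
    # All names here (PrivacySettings, share_usage_data, ...) are hypothetical.
    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class PrivacySettings:
        share_usage_data: bool = False      # opt-in, never opt-out
        personalized_ads: bool = False
        location_tracking: bool = False
        retain_raw_data_days: int = 0       # retain nothing by default

    def with_opt_in(settings: PrivacySettings, **choices) -> PrivacySettings:
        # The user, not the vendor, loosens the defaults (principle 7: user centric).
        return replace(settings, **choices)

    # A new user starts fully private; sharing happens only if they ask for it.
    defaults = PrivacySettings()
    after_consent = with_opt_in(defaults, share_usage_data=True)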

PbD is not just a set of principles and a way of thinking about privacy; it has been operationalized (implemented) in, for example:

  • CCTV
  • biometric devices
  • smart meters, smart grid
  • RFID
  • remote home health care
  • big data

(JS: the above list gives examples of cases where PbD has been implemented.  That is, some smart meter systems used PbD (e.g., Ontario's Hydro One smart electric meters), but not all.  Similarly, some biometric devices are safe and don't leak personal information, but others do.  The point is that there are cases where PbD has been successfully implemented according to its seven principles, and these constitute an existence proof that PbD is not merely pie in the sky but is doable.)

Economics of PbD

  • There is a cost of implementing PbD in a technology.
  • That cost is a reason why management is reluctant to address privacy issues in their products; however...
  • The cost of not implementing it is much greater:
    • Lawsuits – individual or class action suits against the company if personal information is accidentally leaked
    • Damage to company brand
    • Loss of consumer confidence

European Union Law – "General Data Protection Regulation" (GDPR)

  • PbD is embedded in the law
  • "if you implement PbD, then you have mastered the GDPR"
    • This is a quote from the EU, not Ann Cavoukian.

What Can You Do

  • When asked to provide information, ask:
    • what is my information going to be used for?
    • who else will have access to that information?
    • E.g., giving out an email address to receive an e-receipt for a purchase:
      • it's okay if the information is being used for the intended purpose, namely, to send you your receipt
      • it's not okay if your email address will be passed along to someone else to, say, send you advertisements
    • Ask how they will protect your information
  • On social media, check and engage privacy settings.
    • E.g., Facebook has implemented "strong privacy", but it is not the default; individuals must opt in.  (Good that it is available, but it's not PbD since it violates the second principle.)
  • Use only apps that provide end-to-end encryption (a toy illustration follows this list)
    • E.g. Apple iPhone – not even Apple can decrypt your information (JS: is that true?  I suspect that Apple can, if they wanted to.  Also, what about Android?)
  • Use DuckDuckGo for internet searches
  • Where possible, use a VPN or Tor, especially on unprotected coffee shop WiFi networks.
    • (JS: A VPN is a "virtual private network" that allows users to send and receive information over a public network as if they were directly connected to a private network.  Applications running over a VPN may achieve the same security benefits as the private network.)
    • (JS: Tor is software for anonymous communication, concealing your identity and making tracking difficult.  See also the Tor Browser for anonymous web surfing.)
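
(JS: To make "end-to-end encryption" concrete, here is a toy Python sketch using the PyNaCl library (libsodium bindings).  This example is mine, not from the talk; the point is that whatever service relays the message only ever sees ciphertext, and only the recipient's private key can decrypt it.)

    # Toy end-to-end encryption sketch with PyNaCl (pip install pynacl).
    from nacl.public import PrivateKey, Box

    # Each party generates a key pair; only public keys are ever exchanged.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts for Bob with her private key and Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at 6pm")

    # The relaying server sees only ciphertext and cannot read it.
    # Bob decrypts with his private key and Alice's public key.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"meet at 6pm"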

Internet of Things and Privacy

  • The main risk is third party monitoring
  • Latest:  an Ars Technica article about teddy bears (plush toys) with microphones that store everything said in the cloud; the service leaked the information, including email addresses.
  • Examples of "things" hooked up to the internet:
    1. Wearable computing (e.g., fitbit)
    2. Quantified self (JS: ?)
    3. Home automation (e.g., Amazon's Alexa/Echo and Google Home)
  • European Union Article 29 Data Protection Working Party
    • Members are data protection authority representatives of each EU member state.  They agreed to:
    • make privacy settings the default
    • delete all raw data after processing/using it (see the sketch after this list)
    • respect self control
    • politely seek consent from users
  • PbD is a selling point
    • Being able to say your smart device is PbD makes it attractive to consumers
    • (JS: Similar to accessibility, at least circa 2000 - 2009.  Slapping an "accessible" label on your product was a selling point, especially to US government agencies that were looking to procure products that satisfied Section 508 of the US Rehabilitation Act).
  • FTC (US Federal Trade Commission) investigated wearable smart devices
    • found that data was shared with 76 different parties without the knowledge of the user
  • Pew Research Internet Project – "Public Perceptions of Privacy and Security in the Post-Snowden Era"
    • 91% of respondents would refuse a discount offered in exchange for consenting to share their information.
    • 80% of social network users are concerned about third party use of their data
    • 80% of adults are concerned about government surveillance
    • Bottom line:  privacy is more important to consumers at the present time
  • AI assistants (e.g., Alexa)
    • servant or spy?
    • can't speak freely in your own home because you don't know if what is said will remain within the four walls of your house.
    • legal implications:  can law enforcement simply get access to the information acquired by your AI assistant at their will?
      • the worry is not that they could get a warrant and have legal access to the information, but that, since the device is not secured, they may access it without you even knowing
    • Smart Barbie and "Cayla" – smart dolls
      • Cayla was outlawed in Germany
    • What's missing is user awareness and transparency, and hence control by users over their information
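
(JS: A small sketch, again mine and not from the talk, of the "delete all raw data after processing" idea: a wearable derives only the summary it needs from the raw samples and then discards the raw data rather than retaining or uploading it.)

    # Data minimization sketch: keep the derived summary, drop the raw data.
    from statistics import mean

    def nightly_summary(raw_heart_rate_samples: list) -> dict:
        summary = {
            "avg_bpm": round(mean(raw_heart_rate_samples)),
            "sample_count": len(raw_heart_rate_samples),
        }
        raw_heart_rate_samples.clear()  # raw samples are not retained or uploaded
        return summary

    # Example: only the two-number summary survives the night.
    print(nightly_summary([62, 64, 61, 70, 66]))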

Conclusions

  • Privacy risks managed by proactive embedding of PbD
  • Focus on prevention; more cost effective
  • Zero sum thinking is wrong; embrace positive sum; we can have it all
  • Lead with PbD, not Privacy by Disaster
  • Freedom and Privacy are inextricably tied together