<?xml version="1.0" encoding="UTF-8" ?>
<?xml-stylesheet href="https://rss.buzzsprout.com/styles.xsl" type="text/xsl"?>
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:podcast="https://podcastindex.org/namespace/1.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:psc="http://podlove.org/simple-chapters" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
  <atom:link href="https://rss.buzzsprout.com/2059470.rss" rel="self" type="application/rss+xml" />
  <atom:link href="https://pubsubhubbub.appspot.com/" rel="hub" xmlns="http://www.w3.org/2005/Atom" />
  <title>The Shifting Privacy Left Podcast</title>

  <lastBuildDate>Thu, 12 Mar 2026 22:55:32 -0400</lastBuildDate>
  <link>https://shiftingprivacyleft.com/</link>
  <language>en-us</language>
  <copyright>© 2026 Principled LLC. All rights reserved.</copyright>
  <podcast:locked>yes</podcast:locked>
  <podcast:guid>131c75d1-ada6-5450-9a00-50e1f258f78e</podcast:guid>
  <podcast:podroll>
    <podcast:remoteItem feedGuid="aeab66de-6f81-502b-ad70-805856b20036" feedUrl="https://feeds.buzzsprout.com/2186686.rss" />
    <podcast:remoteItem feedGuid="52954d49-3947-5785-8259-820941918dd1" feedUrl="https://redcloveradvisors.libsyn.com/rss" />
    <podcast:remoteItem feedGuid="75f6d39c-b65f-5817-9c37-7dc361e02035" feedUrl="https://partially-redacted.podbean.com/feed.xml" />
    <podcast:remoteItem feedGuid="060fda0b-6ef8-56de-ae58-85320e785f51" feedUrl="https://feeds.buzzsprout.com/2128447.rss" />
    <podcast:remoteItem feedGuid="0297ffa9-f054-54cf-a57f-6c1f952fc9b9" feedUrl="https://www.accountabilitystudio.org/feed/privacyabbreviated/" />
    <podcast:remoteItem feedGuid="cd09c26c-f1c5-5e67-8c46-a5ad41b9b9c6" feedUrl="https://feeds.captivate.fm/data-mesh-radio/" />
    <podcast:remoteItem feedGuid="adb775f8-e5fc-52d0-b911-e9016e92c4cf" feedUrl="https://media.rss.com/theprivacywhisperer/feed.xml" />
  </podcast:podroll>
  <itunes:author>Debra J. Farber (Shifting Privacy Left)</itunes:author>
  <itunes:type>episodic</itunes:type>
  <itunes:explicit>false</itunes:explicit>
  <description><![CDATA[<p>Shifting Privacy Left features lively discussions on the need for organizations to embed privacy by design into the UX/UI, architecture, engineering / DevOps and the overall product development processes BEFORE code or products are ever shipped. Each Tuesday, we publish a new episode that features interviews with privacy engineers, technologists, researchers, ethicists, innovators, market makers, and industry thought leaders. We dive deeply into this subject and unpack the exciting elements of emerging technologies and tech stacks that are driving privacy innovation; strategies and tactics that win trust; privacy pitfalls to avoid; privacy tech issues ripped from the headlines; and other juicy topics of interest.&nbsp;</p>]]></description>
  <generator>Buzzsprout (https://www.buzzsprout.com)</generator>
  <itunes:keywords>privacy, tech, security, engineering, innovation, ethics, developers, devops, design, data science, privacy engineer, architecture</itunes:keywords>
  <itunes:owner>
    <itunes:name>Debra J. Farber (Shifting Privacy Left)</itunes:name>
  </itunes:owner>
  <image>
     <url>https://storage.buzzsprout.com/zelrr1k09viids5vbp9vww0bytjk?.jpg</url>
     <title>The Shifting Privacy Left Podcast</title>
     <link>https://shiftingprivacyleft.com/</link>
  </image>
  <itunes:image href="https://storage.buzzsprout.com/zelrr1k09viids5vbp9vww0bytjk?.jpg" />
  <itunes:category text="Technology" />
  <itunes:category text="Business">
    <itunes:category text="Entrepreneurship" />
  </itunes:category>
  <itunes:category text="News">
    <itunes:category text="Tech News" />
  </itunes:category>
  <podcast:person role="host" href="https://shiftingprivacyleft.com" img="https://storage.buzzsprout.com/uo7g3ql65n9xbsopos2pqcm8x9nb">Debra J. Farber</podcast:person>
  <item>
    <itunes:title>S3E15: &#39;New Certification: Enabling Privacy Engineering in AI Systems&#39; with Amalia Barthel &amp; Eric Lybeck</itunes:title>
    <title>S3E15: &#39;New Certification: Enabling Privacy Engineering in AI Systems&#39; with Amalia Barthel &amp; Eric Lybeck</title>
    <itunes:summary><![CDATA[In this episode, I'm joined by Amalia Barthel, founder of Designing Privacy, a consultancy that helps businesses integrate privacy into business operations; and Eric Lybeck, a seasoned independent privacy engineering consultant with over two decades of experience in cybersecurity and privacy. Eric recently served as Director of Privacy Engineering at Privacy Code. Today, we discuss: the importance of more training for privacy engineers on AI system enablement; why it's not enough for pr...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I&apos;m joined by <a href='https://www.linkedin.com/in/amaliabarthel/'>Amalia Barthel</a>, founder of <a href='https://designingprivacy.ca/'>Designing Privacy</a>, a consultancy that  helps businesses integrate privacy into business operations; and <a href='https://www.linkedin.com/in/ericlybeck/overlay/about-this-profile/'>Eric Lybeck</a>, a seasoned independent privacy engineering consultant with over two decades of experience in cybersecurity and privacy. Eric recently served as Director of Privacy Engineering at Privacy Code. Today, we discuss: the importance of more training for privacy engineers on AI system enablement; why it&apos;s not enough for privacy professionals to solely focus on AI governance; and how their new hands-on course, &quot;Privacy Engineering in AI Systems Certificate program,&quot; can fill this need. <br/><br/>Throughout our conversation, we explore the differences between AI system enablement and AI governance and why Amalia and Eric were inspired to develop this certification program. They share examples of what is covered in the course and outline the key takeaways and practical toolkits that enrollees will get - including case studies, frameworks, and weekly live sessions throughout. 
<br/><br/><b>Topics Covered</b>: </p><ul><li>How AI system enablement differs from AI governance and why we should focus on AI as part of privacy engineering </li><li>Why Eric and Amalia designed an AI systems certificate course that bridges the gaps between privacy engineers and privacy attorneys</li><li>The unique ideas and practices presented in this course and what attendees will take away </li><li>Frameworks, cases, and mental models that Eric and Amalia will cover in their course</li><li>How Eric &amp; Amalia structured the Privacy Engineering in AI Systems Certificate program&apos;s coursework </li><li>The importance of upskilling for privacy engineers and attorneys</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li>Enroll in the <a href='https://designingprivacy.ca/products/privacy-engineering-in-ai-systems-certificate'>&apos;Privacy Engineering in AI Systems Certificate program&apos;</a> (Save $300 with promo code: PODCAST300 - enter this into the Inquiry Form instead of directly purchasing the course)</li><li>Read: <a href='https://link.springer.com/book/10.1007/978-1-4302-6356-2'>&apos;The Privacy Engineer&apos;s Manifesto&apos;</a></li><li>Take the free European Commission&apos;s course, <a href='https://academy.europa.eu/courses/understanding-law-as-code'>&apos;Understanding Law as Code&apos;</a></li></ul><p><br/><b>Guest Info</b>: </p><ul><li>Connect with Amalia on <a href='https://www.linkedin.com/in/amaliabarthel/'>LinkedIn</a></li><li>Connect with Eric on <a href='https://www.linkedin.com/in/ericlybeck/'>LinkedIn</a></li><li>Learn about <a href='https://designingprivacy.ca/'>Designing Privacy</a></li></ul><p><br/></p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a 
target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I&apos;m joined by <a href='https://www.linkedin.com/in/amaliabarthel/'>Amalia Barthel</a>, founder of <a href='https://designingprivacy.ca/'>Designing Privacy</a>, a consultancy that  helps businesses integrate privacy into business operations; and <a href='https://www.linkedin.com/in/ericlybeck/overlay/about-this-profile/'>Eric Lybeck</a>, a seasoned independent privacy engineering consultant with over two decades of experience in cybersecurity and privacy. Eric recently served as Director of Privacy Engineering at Privacy Code. Today, we discuss: the importance of more training for privacy engineers on AI system enablement; why it&apos;s not enough for privacy professionals to solely focus on AI governance; and how their new hands-on course, &quot;Privacy Engineering in AI Systems Certificate program,&quot; can fill this need. <br/><br/>Throughout our conversation, we explore the differences between AI system enablement and AI governance and why Amalia and Eric were inspired to develop this certification program. They share examples of what is covered in the course and outline the key takeaways and practical toolkits that enrollees will get - including case studies, frameworks, and weekly live sessions throughout. 
<br/><br/><b>Topics Covered</b>: </p><ul><li>How AI system enablement differs from AI governance and why we should focus on AI as part of privacy engineering </li><li>Why Eric and Amalia designed an AI systems certificate course that bridges the gaps between privacy engineers and privacy attorneys</li><li>The unique ideas and practices presented in this course and what attendees will take away </li><li>Frameworks, cases, and mental models that Eric and Amalia will cover in their course</li><li>How Eric &amp; Amalia structured the Privacy Engineering in AI Systems Certificate program&apos;s coursework </li><li>The importance of upskilling for privacy engineers and attorneys</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li>Enroll in the <a href='https://designingprivacy.ca/products/privacy-engineering-in-ai-systems-certificate'>&apos;Privacy Engineering in AI Systems Certificate program&apos;</a> (Save $300 with promo code: PODCAST300 - enter this into the Inquiry Form instead of directly purchasing the course)</li><li>Read: <a href='https://link.springer.com/book/10.1007/978-1-4302-6356-2'>&apos;The Privacy Engineer&apos;s Manifesto&apos;</a></li><li>Take the free European Commission&apos;s course, <a href='https://academy.europa.eu/courses/understanding-law-as-code'>&apos;Understanding Law as Code&apos;</a></li></ul><p><br/><b>Guest Info</b>: </p><ul><li>Connect with Amalia on <a href='https://www.linkedin.com/in/amaliabarthel/'>LinkedIn</a></li><li>Connect with Eric on <a href='https://www.linkedin.com/in/ericlybeck/'>LinkedIn</a></li><li>Learn about <a href='https://designingprivacy.ca/'>Designing Privacy</a></li></ul><p><br/></p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a 
target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/15455483-s3e15-new-certification-enabling-privacy-engineering-in-ai-systems-with-amalia-barthel-eric-lybeck.mp3" length="28105261" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/imwirxpuzfq5c992j8f7340iyzie?.jpg" />
    <itunes:author>Debra J. Farber / Amalia Barthel &amp; Eric Lybeck</itunes:author>
    <guid isPermaLink="false">Buzzsprout-15455483</guid>
    <pubDate>Tue, 23 Jul 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15455483/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15455483/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15455483/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15455483/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/15455483/chapters.json" type="application/json" />
    <psc:chapters>
      <psc:chapter start="0:00" title="Privacy Engineering in AI Systems" />
      <psc:chapter start="5:48" title="Bridging Legal and Technical Realm" />
      <psc:chapter start="14:47" title="Navigating AI Risk Frameworks and Policy" />
      <psc:chapter start="26:24" title="Advanced AI and Privacy Course Discussion" />
    </psc:chapters>
    <itunes:duration>2338</itunes:duration>
    <itunes:keywords>AI systems</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>15</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E14: &#39;Why We Need Fairness Enhancing Technologies Rather Than PETs&#39; with Gianclaudio Malgieri (Brussels Privacy Hub)</itunes:title>
    <title>S3E14: &#39;Why We Need Fairness Enhancing Technologies Rather Than PETs&#39; with Gianclaudio Malgieri (Brussels Privacy Hub)</title>
    <itunes:summary><![CDATA[Today, I chat with Gianclaudio Malgieri, an expert in privacy, data protection, AI regulation, EU law, and human rights. Gianclaudio is an Associate Professor of Law at Leiden University, the Co-director of the Brussels Privacy Hub, Associate Editor of the Computer Law &amp; Security Review, and co-author of the paper "The Unfair Side of Privacy Enhancing Technologies: Addressing the Trade-offs Between PETs and Fairness". In our conversation, we explore this paper and why privacy-enhancing te...]]></itunes:summary>
    <description><![CDATA[<p>Today, I chat with <a href='https://www.linkedin.com/in/gianclaudio-malgieri-410718a1/'>Gianclaudio Malgieri</a>, an expert in privacy, data protection, AI regulation, EU law, and human rights. Gianclaudio is an Associate Professor of Law at Leiden University, the Co-director of the <a href='https://brusselsprivacyhub.eu/'>Brussels Privacy Hub</a>, Associate Editor of the <a href='https://www.sciencedirect.com/journal/computer-law-and-security-review'>Computer Law &amp; Security Review</a>, and co-author of the paper <a href='https://dl.acm.org/doi/pdf/10.1145/3630106.3659024'>&quot;The Unfair Side of Privacy Enhancing Technologies: Addressing the Trade-offs Between PETs and Fairness&quot;</a>. In our conversation, we explore this paper and why privacy-enhancing technologies (PETs) are essential but not enough on their own to address digital policy challenges.<br/><br/>Gianclaudio explains why PETs alone are insufficient solutions for data protection and discusses the obstacles to achieving fairness in data processing – including bias, discrimination, social injustice, and market power imbalances. We discuss data alteration techniques such as anonymization, pseudonymization, synthetic data, and differential privacy in relation to GDPR compliance. Plus, Gianclaudio highlights the issues of representation for minorities in differential privacy and stresses the importance of involving these groups in identifying bias and assessing AI technologies. We also touch on the need for ongoing research on PETs to address these challenges and share our perspectives on the future of this research. 
<br/><br/><b>Topics Covered: </b></p><ul><li>What inspired Gianclaudio to research fairness and PETs</li><li>How PETs are about power and control</li><li>The legal / GDPR and computer science perspectives on &apos;fairness&apos;</li><li>How fairness relates to discrimination, social injustices, and market power imbalances </li><li>How data obfuscation techniques relate to AI / ML </li><li>How well the use of anonymization, pseudonymization, and synthetic data techniques address data protection challenges under the GDPR</li><li>How the use of differential privacy techniques may lead to unfairness </li><li>Whether the use of encrypted data processing tools and federated and distributed analytics achieve fairness </li><li>3 main PET shortcomings and how to overcome them: 1) bias discovery; 2) harms to people belonging to protected groups and individuals&apos; autonomy; and 3) market imbalances.</li><li>Areas that warrant more research and investigation </li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Read: <a href='https://dl.acm.org/doi/pdf/10.1145/3630106.3659024'>&quot;The Unfair Side of Privacy Enhancing Technologies: Addressing the Trade-offs Between PETs and Fairness&quot;</a></li></ul><p><br/><b>Guest Info: </b></p><ul><li>Connect with Gianclaudio on <a href='https://www.linkedin.com/in/gianclaudio-malgieri-410718a1/'>LinkedIn</a></li><li>Learn more about <a href='https://brusselsprivacyhub.com/'>Brussels Privacy Hub</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Today, I chat with <a href='https://www.linkedin.com/in/gianclaudio-malgieri-410718a1/'>Gianclaudio Malgieri</a>, an expert in privacy, data protection, AI regulation, EU law, and human rights. Gianclaudio is an Associate Professor of Law at Leiden University, the Co-director of the <a href='https://brusselsprivacyhub.eu/'>Brussels Privacy Hub</a>, Associate Editor of the <a href='https://www.sciencedirect.com/journal/computer-law-and-security-review'>Computer Law &amp; Security Review</a>, and co-author of the paper <a href='https://dl.acm.org/doi/pdf/10.1145/3630106.3659024'>&quot;The Unfair Side of Privacy Enhancing Technologies: Addressing the Trade-offs Between PETs and Fairness&quot;</a>. In our conversation, we explore this paper and why privacy-enhancing technologies (PETs) are essential but not enough on their own to address digital policy challenges.<br/><br/>Gianclaudio explains why PETs alone are insufficient solutions for data protection and discusses the obstacles to achieving fairness in data processing – including bias, discrimination, social injustice, and market power imbalances. We discuss data alteration techniques such as anonymization, pseudonymization, synthetic data, and differential privacy in relation to GDPR compliance. Plus, Gianclaudio highlights the issues of representation for minorities in differential privacy and stresses the importance of involving these groups in identifying bias and assessing AI technologies. We also touch on the need for ongoing research on PETs to address these challenges and share our perspectives on the future of this research. 
<br/><br/><b>Topics Covered: </b></p><ul><li>What inspired Gianclaudio to research fairness and PETs</li><li>How PETs are about power and control</li><li>The legal / GDPR and computer science perspectives on &apos;fairness&apos;</li><li>How fairness relates to discrimination, social injustices, and market power imbalances </li><li>How data obfuscation techniques relate to AI / ML </li><li>How well the use of anonymization, pseudonymization, and synthetic data techniques address data protection challenges under the GDPR</li><li>How the use of differential privacy techniques may lead to unfairness </li><li>Whether the use of encrypted data processing tools and federated and distributed analytics achieve fairness </li><li>3 main PET shortcomings and how to overcome them: 1) bias discovery; 2) harms to people belonging to protected groups and individuals&apos; autonomy; and 3) market imbalances.</li><li>Areas that warrant more research and investigation </li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Read: <a href='https://dl.acm.org/doi/pdf/10.1145/3630106.3659024'>&quot;The Unfair Side of Privacy Enhancing Technologies: Addressing the Trade-offs Between PETs and Fairness&quot;</a></li></ul><p><br/><b>Guest Info: </b></p><ul><li>Connect with Gianclaudio on <a href='https://www.linkedin.com/in/gianclaudio-malgieri-410718a1/'>LinkedIn</a></li><li>Learn more about <a href='https://brusselsprivacyhub.com/'>Brussels Privacy Hub</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/15291009-s3e14-why-we-need-fairness-enhancing-technologies-rather-than-pets-with-gianclaudio-malgieri-brussels-privacy-hub.mp3" length="34340836" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/nyw9vns9pf55k92rqumu52zhf0r1?.jpg" />
    <itunes:author>Debra J. Farber / Gianclaudio Malgieri</itunes:author>
    <guid isPermaLink="false">Buzzsprout-15291009</guid>
    <pubDate>Tue, 25 Jun 2024 11:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15291009/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15291009/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15291009/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15291009/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/15291009/chapters.json" type="application/json" />
    <psc:chapters>
      <psc:chapter start="0:00" title="Shifting Privacy Left Podcast" />
      <psc:chapter start="6:54" title="Unpacking Fairness and Data Protection" />
      <psc:chapter start="25:09" title="Navigating Privacy-Enhancing Technology Shortcomings" />
      <psc:chapter start="36:12" title="Modernizing Privacy With Fairness Technology" />
    </psc:chapters>
    <itunes:duration>2858</itunes:duration>
    <itunes:keywords>PETs, Fairness Enhancing Technologies, fairness, GDPR, privacy enhancing technologies, federated analytics, distributed analytics, anonymization, pseudonymization, synthetic data, differential privacy</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>14</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E13: &#39;Building Safe AR / VR / MR / XR Technology&#39; with Spatial Computing Pioneer Avi Bar-Zeev (XR Guild)</itunes:title>
    <title>S3E13: &#39;Building Safe AR / VR / MR / XR Technology&#39; with Spatial Computing Pioneer Avi Bar-Zeev (XR Guild)</title>
    <itunes:summary><![CDATA[In this episode, I had the pleasure of talking with Avi Bar-Zeev, a true tech pioneer and the Founder and President of The XR Guild. With over three decades of experience, Avi has an impressive resume, including launching Disney's Aladdin VR ride, developing Second Life's 3D worlds, co-founding Keyhole (which became Google Earth), co-inventing Microsoft's HoloLens, and contributing to the Amazon Echo Frames. The XR Guild is a nonprofit organization that promotes ethics in extended reality (XR...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I had the pleasure of talking with Avi Bar-Zeev, a true tech pioneer and the Founder and President of The XR Guild. With over three decades of experience, Avi has an impressive resume, including launching Disney&apos;s Aladdin VR ride, developing Second Life&apos;s 3D worlds, co-founding Keyhole (which became Google Earth), co-inventing Microsoft&apos;s HoloLens, and contributing to the Amazon Echo Frames. The XR Guild is a nonprofit organization that promotes ethics in extended reality (XR) through mentorship, networking, and educational resources. </p><p>Throughout our conversation, we dive into privacy concerns in augmented reality (AR), virtual reality (VR), and the metaverse, highlighting increased data misuse and manipulation risks as technology progresses. Avi shares his insights on how product and development teams can continue to be innovative while still upholding responsible, ethical standards with clear principles and guidelines to protect users&apos; personal data. Plus, he explains the role of eye-tracking technology and why he advocates classifying its data as health data. We also discuss the challenges of anonymizing biometric data, informed consent, and the need for ethics training in all of the tech industry. 
</p><p><b>Topics Covered</b>: </p><ul><li>The top privacy and misinformation issues that Avi has noticed when it comes to AR, VR, and metaverse data</li><li>Why Avi advocates for classifying eye tracking data as health data </li><li>The dangers of unchecked AI manipulation and why we need to be more aware and in control of our online presence </li><li>The ethical considerations for experimentation in highly regulated industries</li><li>Whether it is possible to anonymize VR and AR data</li><li>Ways these product and development teams can be innovative while maintaining ethics and avoiding harm </li><li>AR risks vs VR risks</li><li>Advice and privacy principles to keep in mind for technologists who are building AR and VR systems </li><li>Understanding The XR Guild </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read: <a href='https://www.nitafarahany.com/the-battle-for-your-brain'>The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology</a></li><li>Read:<a href='https://ournextreality.com/'> Our Next Reality</a></li></ul><p><b>Guest Info</b>: </p><ul><li>Connect with Avi on <a href='https://www.linkedin.com/in/avi-bar-zeev/'>LinkedIn</a></li><li>Check out the <a href='https://xrguild.org/'>XR Guild</a></li><li>Learn about <a href='https://www.realityprime.com/'>Avi&apos;s Consulting Services</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I had the pleasure of talking with Avi Bar-Zeev, a true tech pioneer and the Founder and President of The XR Guild. With over three decades of experience, Avi has an impressive resume, including launching Disney&apos;s Aladdin VR ride, developing Second Life&apos;s 3D worlds, co-founding Keyhole (which became Google Earth), co-inventing Microsoft&apos;s HoloLens, and contributing to the Amazon Echo Frames. The XR Guild is a nonprofit organization that promotes ethics in extended reality (XR) through mentorship, networking, and educational resources. </p><p>Throughout our conversation, we dive into privacy concerns in augmented reality (AR), virtual reality (VR), and the metaverse, highlighting increased data misuse and manipulation risks as technology progresses. Avi shares his insights on how product and development teams can continue to be innovative while still upholding responsible, ethical standards with clear principles and guidelines to protect users&apos; personal data. Plus, he explains the role of eye-tracking technology and why he advocates classifying its data as health data. We also discuss the challenges of anonymizing biometric data, informed consent, and the need for ethics training in all of the tech industry. 
</p><p><b>Topics Covered</b>: </p><ul><li>The top privacy and misinformation issues that Avi has noticed when it comes to AR, VR, and metaverse data</li><li>Why Avi advocates for classifying eye tracking data as health data </li><li>The dangers of unchecked AI manipulation and why we need to be more aware and in control of our online presence </li><li>The ethical considerations for experimentation in highly regulated industries</li><li>Whether it is possible to anonymize VR and AR data</li><li>Ways these product and development teams can be innovative while maintaining ethics and avoiding harm </li><li>AR risks vs VR risks</li><li>Advice and privacy principles to keep in mind for technologists who are building AR and VR systems </li><li>Understanding The XR Guild </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read: <a href='https://www.nitafarahany.com/the-battle-for-your-brain'>The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology</a></li><li>Read:<a href='https://ournextreality.com/'> Our Next Reality</a></li></ul><p><b>Guest Info</b>: </p><ul><li>Connect with Avi on <a href='https://www.linkedin.com/in/avi-bar-zeev/'>LinkedIn</a></li><li>Check out the <a href='https://xrguild.org/'>XR Guild</a></li><li>Learn about <a href='https://www.realityprime.com/'>Avi&apos;s Consulting Services</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/15272946-s3e13-building-safe-ar-vr-mr-xr-technology-with-spatial-computing-pioneer-avi-bar-zeev-xr-guild.mp3" length="37184138" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/1o4pwucufkufqnhzxm4yzu1iea3m?.jpg" />
    <itunes:author>Debra J. Farber / Avi Bar-Zeev</itunes:author>
    <guid isPermaLink="false">Buzzsprout-15272946</guid>
    <pubDate>Tue, 18 Jun 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15272946/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15272946/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15272946/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15272946/transcript.vtt" type="text/vtt" />
    <itunes:duration>3095</itunes:duration>
    <itunes:keywords>XR, MR, AR, VR, spatial computing, XR Guild, Generative AI, transparency, fairness, ethics, eye-tracking</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>13</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E12: &#39;How Intentional Experimentation in A/B Testing Supports Privacy&#39; with Matt Gershoff (Conductrics)</itunes:title>
    <title>S3E12: &#39;How Intentional Experimentation in A/B Testing Supports Privacy&#39; with Matt Gershoff (Conductrics)</title>
    <itunes:summary><![CDATA[Today, I'm joined by Matt Gershoff, Co-founder and CEO of Conductrics, a software company specializing in A/B testing, multi-armed bandit techniques, and customer research and survey software. With a strong background in resource economics and artificial intelligence, Matt brings a unique perspective to the conversation, emphasizing simplicity and intentionality in decision-making and data collection.   In this episode, Matt dives into Conductrics' background, the role of A/B testing and expe...]]></itunes:summary>
    <description><![CDATA[<p>Today, I&apos;m joined by <a href='https://www.linkedin.com/in/mattgershoff/'>Matt Gershoff</a>, Co-founder and CEO of <a href='https://www.conductrics.com/'>Conductrics</a>, a software company specializing in A/B testing, multi-armed bandit techniques, and customer research and survey software. With a strong background in resource economics and artificial intelligence, Matt brings a unique perspective to the conversation, emphasizing simplicity and intentionality in decision-making and data collection. <br/><br/>In this episode, Matt dives into Conductrics&apos; background, the role of A/B testing and experimentation in privacy, data collection at a specific and granular level, and the details of Conductrics&apos; processes. He emphasizes the importance of intentionally collecting data with a clear purpose to avoid unnecessary data accumulation and touches on the value of experimentation in conjunction with data minimization strategies. Matt also discusses his upcoming talk at the PEPR Conference and shares his hopes for what privacy engineers will learn from the event. 
<br/><br/><b>Topics Covered: </b></p><ul><li>Matt’s background and how he started A/B testing and experimentation at Conductrics</li><li>The major challenges that arise when companies run experiments and how Conductrics works to solve them </li><li>Breaking down A/B testing</li><li>How being intentional about A/B testing and experimentation supports a high level of privacy</li><li>The process of data collection, testing, and experimentation </li><li>Collecting the data while minimizing privacy risks </li><li>The value of attending the USENIX Conference on Privacy Engineering Practice &amp; Respect (PEPR24) and what to expect from Matt’s talk </li></ul><p><br/><b>Guest Info: </b></p><ul><li>Connect with Matt on <a href='https://www.linkedin.com/in/mattgershoff/'>LinkedIn</a></li><li>Learn more about <a href='https://www.conductrics.com/'>Conductrics</a></li><li>Read about George Box&apos;s quote, <a href='https://en.wikipedia.org/wiki/All_models_are_wrong'>&quot;All models are wrong&quot; </a></li><li>Learn about the <a href='https://www.usenix.org/conference/pepr24'>PEPR Conference</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Today, I&apos;m joined by <a href='https://www.linkedin.com/in/mattgershoff/'>Matt Gershoff</a>, Co-founder and CEO of <a href='https://www.conductrics.com/'>Conductrics</a>, a software company specializing in A/B testing, multi-armed bandit techniques, and customer research and survey software. With a strong background in resource economics and artificial intelligence, Matt brings a unique perspective to the conversation, emphasizing simplicity and intentionality in decision-making and data collection. <br/><br/>In this episode, Matt dives into Conductrics&apos; background, the role of A/B testing and experimentation in privacy, data collection at a specific and granular level, and the details of Conductrics&apos; processes. He emphasizes the importance of intentionally collecting data with a clear purpose to avoid unnecessary data accumulation and touches on the value of experimentation in conjunction with data minimization strategies. Matt also discusses his upcoming talk at the PEPR Conference and shares his hopes for what privacy engineers will learn from the event. 
<br/><br/><b>Topics Covered: </b></p><ul><li>Matt’s background and how he started A/B testing and experimentation at Conductrics</li><li>The major challenges that arise when companies run experiments and how Conductrics works to solve them </li><li>Breaking down A/B testing</li><li>How being intentional about A/B testing and experimentation supports a high level of privacy</li><li>The process of data collection, testing, and experimentation </li><li>Collecting the data while minimizing privacy risks </li><li>The value of attending the USENIX Conference on Privacy Engineering Practice &amp; Respect (PEPR24) and what to expect from Matt’s talk </li></ul><p><br/><b>Guest Info: </b></p><ul><li>Connect with Matt on <a href='https://www.linkedin.com/in/mattgershoff/'>LinkedIn</a></li><li>Learn more about <a href='https://www.conductrics.com/'>Conductrics</a></li><li>Read about George Box&apos;s quote, <a href='https://en.wikipedia.org/wiki/All_models_are_wrong'>&quot;All models are wrong&quot; </a></li><li>Learn about the <a href='https://www.usenix.org/conference/pepr24'>PEPR Conference</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/15199785-s3e12-how-intentional-experimentation-in-a-b-testing-supports-privacy-with-matt-gershoff-conductrics.mp3" length="32708832" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/1ksj4yxz94huo440xkitktu05tqu?.jpg" />
    <itunes:author>Debra J. Farber / Matt Gershoff</itunes:author>
    <guid isPermaLink="false">Buzzsprout-15199785</guid>
    <pubDate>Tue, 04 Jun 2024 09:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15199785/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15199785/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15199785/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/15199785/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/15199785/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="Privacy and Decision Making in Tech" />
  <psc:chapter start="5:43" title="The Value of A/B Testing" />
  <psc:chapter start="16:20" title="The Value of A/B Testing and Experimentation" />
  <psc:chapter start="20:17" title="Privacy by Design and Data Minimization" />
  <psc:chapter start="36:31" title="PEPR 24 Conference Discussion" />
  <psc:chapter start="40:54" title="Benefits of Privacy in Data Management" />
</psc:chapters>
    <itunes:duration>2722</itunes:duration>
    <itunes:keywords>Conductrics, A/B Testing, Experimentation, PEPR, data minimization</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>12</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E11: &#39;Decision-Making Governance &amp; Design: Combating Dark Patterns with Fair Patterns&#39; with Marie Potel-Saville (Amurabi &amp; FairPatterns)</itunes:title>
    <title>S3E11: &#39;Decision-Making Governance &amp; Design: Combating Dark Patterns with Fair Patterns&#39; with Marie Potel-Saville (Amurabi &amp; FairPatterns)</title>
    <itunes:summary><![CDATA[In this episode, Marie Potel-Saville joins me to shed light on the widespread issue of dark patterns in design. With her background in law, Marie founded the 'FairPatterns' project with her award-winning privacy and innovation studio, Amurabi, to detect and fix large-scale dark patterns. Throughout our conversation, we discuss the different types of dark patterns, why it is crucial for businesses to prevent them from being coded into their websites and apps, and how designers can ensure that ...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, <a href='https://www.linkedin.com/in/marie-potel-saville/?originalSubdomain=fr'>Marie Potel-Saville</a> joins me to shed light on the widespread issue of dark patterns in design. With her background in law, Marie founded the <a href='https://fairpatterns.com'>&apos;FairPatterns&apos;</a> project with her award-winning privacy and innovation studio, Amurabi, to detect and fix large-scale dark patterns. Throughout our conversation, we discuss the different types of dark patterns, why it is crucial for businesses to prevent them from being coded into their websites and apps, and how designers can ensure that they are designing fair patterns in their projects.</p><p><br/>Dark patterns are interfaces that deceive or manipulate users into unintended actions by exploiting cognitive biases inherent in decision-making processes. Marie explains how dark patterns are harmful to our economic and democratic models, their negative impact on individual agency, and the ways that FairPatterns provides countermeasures and safeguards against the exploitation of people&apos;s cognitive biases. 
She also shares tips for designers and developers for designing and architecting fair patterns.<br/><br/></p><p><b>Topics Covered</b>: </p><ul><li>Why Marie shifted her career path from practicing law to deploying and lecturing on Legal UX design &amp; combatting Dark Patterns at Amurabi</li><li>The definition of ‘Dark Patterns’ and the difference between them and ‘deceptive patterns’</li><li>What motivated Marie to found FairPatterns.com and her science-based methodology to combat dark patterns</li><li>The importance of decision making governance </li><li>Why execs should care about preventing dark patterns from being coded into their websites, apps, &amp; interfaces</li><li>How dark patterns exploit our cognitive biases to our detriment</li><li>What global laws say about dark patterns</li><li>How dark patterns create structural risks for our economies &amp; democratic models</li><li>How &quot;Fair Patterns&quot; serve as countermeasures to Dark Patterns</li><li>The 7 categories of Dark Patterns in UX design &amp; associated countermeasures </li><li>Advice for designers &amp; developers to ensure that they design &amp; architect Fair Patterns when building products &amp; features</li><li>How companies can boost sales &amp; gain trust with Fair Patterns </li><li>Resources to learn more about Dark Patterns &amp; countermeasures</li></ul><p><b>Guest Info</b>: </p><ul><li>Connect with <a href='https://www.linkedin.com/in/marie-potel-saville/?originalSubdomain=fr'>Marie on LinkedIn</a></li><li>Learn more about <a href='https://amurabi.eu/en/'>Amurabi</a></li><li>Check out <a href='https://fairpatterns.com'>FairPatterns.com</a></li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Learn about the <a href='https://en.wikipedia.org/wiki/Seven_stages_of_action'>7 Stages of Action Model</a></li><li>Take FairPattern&apos;s course: <a href='https://fairpatterns.com/masterclass-30-may-2024/'>Dark Patterns 101</a> </li><li>Read <a href='https://www.deceptive.design/'>Deceptive 
Design Patterns</a></li><li>Listen to FairPatterns&apos; <a href='https://shows.acast.com/fighting-dark-patterns'>Fighting Dark Patterns</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, <a href='https://www.linkedin.com/in/marie-potel-saville/?originalSubdomain=fr'>Marie Potel-Saville</a> joins me to shed light on the widespread issue of dark patterns in design. With her background in law, Marie founded the <a href='https://fairpatterns.com'>&apos;FairPatterns&apos;</a> project with her award-winning privacy and innovation studio, Amurabi, to detect and fix large-scale dark patterns. Throughout our conversation, we discuss the different types of dark patterns, why it is crucial for businesses to prevent them from being coded into their websites and apps, and how designers can ensure that they are designing fair patterns in their projects.</p><p><br/>Dark patterns are interfaces that deceive or manipulate users into unintended actions by exploiting cognitive biases inherent in decision-making processes. Marie explains how dark patterns are harmful to our economic and democratic models, their negative impact on individual agency, and the ways that FairPatterns provides countermeasures and safeguards against the exploitation of people&apos;s cognitive biases. 
She also shares tips for designers and developers for designing and architecting fair patterns.<br/><br/></p><p><b>Topics Covered</b>: </p><ul><li>Why Marie shifted her career path from practicing law to deploying and lecturing on Legal UX design &amp; combatting Dark Patterns at Amurabi</li><li>The definition of ‘Dark Patterns’ and the difference between them and ‘deceptive patterns’</li><li>What motivated Marie to found FairPatterns.com and her science-based methodology to combat dark patterns</li><li>The importance of decision making governance </li><li>Why execs should care about preventing dark patterns from being coded into their websites, apps, &amp; interfaces</li><li>How dark patterns exploit our cognitive biases to our detriment</li><li>What global laws say about dark patterns</li><li>How dark patterns create structural risks for our economies &amp; democratic models</li><li>How &quot;Fair Patterns&quot; serve as countermeasures to Dark Patterns</li><li>The 7 categories of Dark Patterns in UX design &amp; associated countermeasures </li><li>Advice for designers &amp; developers to ensure that they design &amp; architect Fair Patterns when building products &amp; features</li><li>How companies can boost sales &amp; gain trust with Fair Patterns </li><li>Resources to learn more about Dark Patterns &amp; countermeasures</li></ul><p><b>Guest Info</b>: </p><ul><li>Connect with <a href='https://www.linkedin.com/in/marie-potel-saville/?originalSubdomain=fr'>Marie on LinkedIn</a></li><li>Learn more about <a href='https://amurabi.eu/en/'>Amurabi</a></li><li>Check out <a href='https://fairpatterns.com'>FairPatterns.com</a></li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Learn about the <a href='https://en.wikipedia.org/wiki/Seven_stages_of_action'>7 Stages of Action Model</a></li><li>Take FairPattern&apos;s course: <a href='https://fairpatterns.com/masterclass-30-may-2024/'>Dark Patterns 101</a> </li><li>Read <a href='https://www.deceptive.design/'>Deceptive 
Design Patterns</a></li><li>Listen to FairPatterns&apos; <a href='https://shows.acast.com/fighting-dark-patterns'>Fighting Dark Patterns</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14961753-s3e11-decision-making-governance-design-combating-dark-patterns-with-fair-patterns-with-marie-potel-saville-amurabi-fairpatterns.mp3" length="39070478" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/631702kjwlwwlrv2kmril3zcz6b5?.jpg" />
    <itunes:author>Debra J. Farber (Shifting Privacy Left)</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14961753</guid>
    <pubDate>Tue, 30 Apr 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14961753/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14961753/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14961753/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14961753/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14961753/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E11: &#39;Decision-Making Governance &amp; Design: Combating Dark Patterns with Fair Patterns&#39; with Marie Potel-Saville (Amurabi &amp; FairPatterns)" />
  <psc:chapter start="1:54" title="Introducing Marie Potel-Saville, CEO &amp; Founder at Amurabi and Founder of FairPatterns.com" />
  <psc:chapter start="3:25" title="How Marie moved from a career practicing law to a focus on legal UX and combatting dark patterns" />
  <psc:chapter start="6:50" title="Marie tells us about Amurabi, her legal design firm, and what she means by &quot;innovation by design&quot;" />
  <psc:chapter start="12:21" title="Marie explains her motivation behind founding FairPatterns and her science-based methodology to combat dark patterns" />
  <psc:chapter start="18:15" title="Why business execs should generally care about preventing dark patterns from being coded into their websites and apps and interfaces" />
  <psc:chapter start="21:14" title="How dark patterns exploit our cognitive biases to our detriment" />
  <psc:chapter start="24:44" title="What global laws say about dark patterns" />
  <psc:chapter start="28:37" title="How dark patterns create structural risks for our economies and then, ultimately, our democratic models" />
  <psc:chapter start="32:44" title="Marie defines &quot;fair patterns&quot; and explains the benefits to using them as countermeasures to dark patterns" />
  <psc:chapter start="44:15" title="Marie&#39;s advice for designers &amp; developers to ensure that they design fair patterns as they build products &amp; features" />
  <psc:chapter start="47:08" title="How companies can boost sales, gain customer trust, and increase privacy and agency by using fair patterns " />
  <psc:chapter start="49:54" title="Marie shares resources on where to learn more about dark patterns and their countermeasures" />
</psc:chapters>
    <itunes:duration>3252</itunes:duration>
    <itunes:keywords>fair design, dark patterns</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>11</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E10: &#39;How a Privacy Engineering Center of Excellence Shifts Privacy Left&#39; with Aaron Weller (HP)</itunes:title>
    <title>S3E10: &#39;How a Privacy Engineering Center of Excellence Shifts Privacy Left&#39; with Aaron Weller (HP)</title>
    <itunes:summary><![CDATA[In this episode, I sat down with Aaron Weller, the Leader of HP's Privacy Engineering Center of Excellence (CoE), focused on providing technical solutions for privacy engineering across HP's global operations. Throughout our conversation, we discuss: what motivated HP's leadership to stand up a CoE for Privacy Engineering; Aaron's approach to staffing the CoE; how a CoE can shift privacy left in a large, matrixed organization like HP's; and how to leverage the CoE to proactively manage pri...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I sat down with <a href='https://www.linkedin.com/in/aaronweller/'>Aaron Weller</a>, the Leader of HP&apos;s Privacy Engineering Center of Excellence (CoE), focused on providing technical solutions for privacy engineering across HP&apos;s global operations. Throughout our conversation, we discuss: what motivated HP&apos;s leadership to stand up a CoE for Privacy Engineering; Aaron&apos;s approach to staffing the CoE; how a CoE can shift privacy left in a large, matrixed organization like HP&apos;s; and how to leverage the CoE to proactively manage privacy risk.<br/><br/>Aaron emphasizes the importance of understanding an organization&apos;s strategy when creating a CoE and shares his methods for gathering data to inform the center&apos;s roadmap and team building. He also highlights the great impact that a Center of Excellence can offer and gives advice for implementing one in your organization. We touch on the main challenges in privacy engineering today and the value of designing user-friendly privacy experiences. In addition, Aaron provides his perspective on selecting the right combination of Privacy Enhancing Technologies (PETs) for anonymity, how to go about implementing PETs, and the role that AI governance plays in his work. 
</p><p><b>Topics Covered: </b></p><ul><li>Aaron’s deep privacy and consulting background and how he ended up leading HP&apos;s Privacy Engineering Center of Excellence </li><li>The definition of a &quot;Center of Excellence&quot; (CoE) and how a Privacy Engineering CoE can drive value for an organization and shift privacy left</li><li>What motivates a company like HP to launch a CoE for Privacy Engineering and what its reporting line should be</li><li>Aaron&apos;s approach to creating a Privacy Engineering CoE roadmap; his strategy for staffing this CoE; and the skills &amp; abilities that he sought</li><li>How HP&apos;s Privacy Engineering CoE works with the business to advise on, and select, the right PETs for each business use case</li><li>Why it&apos;s essential to know the privacy guarantees that your organization wants to assert before selecting the right PETs to get you there</li><li>Lessons Learned from setting up a Privacy Engineering CoE and how to get executive sponsorship</li><li>The amount of time that Privacy teams have had to work on AI issues over the past year, and advice on preventing burnout</li><li>Aaron&apos;s hypothesis about the value of getting an early handle on governance over the adoption of innovative technologies</li><li>The importance of being open to continuous learning in the field of privacy engineering </li></ul><p><b>Guest Info: </b></p><ul><li>Connect with <a href='https://www.linkedin.com/in/aaronweller/'>Aaron on LinkedIn</a></li><li>Learn about <a href='https://iapp.org/news/a/a-look-at-hps-privacy-engineering-center-of-excellence/'>HP&apos;s Privacy Engineering Center of Excellence</a></li><li>Review the <a href='https://owasp.org/www-project-machine-learning-security-top-10/'>OWASP Machine Learning Security Top 10</a></li><li>Review the <a href='https://owasp.org/www-project-top-10-for-large-language-model-applications/'>OWASP Top 10 for LLM Applications</a></li></ul><p><a target="_blank" 
href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I sat down with <a href='https://www.linkedin.com/in/aaronweller/'>Aaron Weller</a>, the Leader of HP&apos;s Privacy Engineering Center of Excellence (CoE), focused on providing technical solutions for privacy engineering across HP&apos;s global operations. Throughout our conversation, we discuss: what motivated HP&apos;s leadership to stand up a CoE for Privacy Engineering; Aaron&apos;s approach to staffing the CoE; how a CoE can shift privacy left in a large, matrixed organization like HP&apos;s; and how to leverage the CoE to proactively manage privacy risk.<br/><br/>Aaron emphasizes the importance of understanding an organization&apos;s strategy when creating a CoE and shares his methods for gathering data to inform the center&apos;s roadmap and team building. He also highlights the great impact that a Center of Excellence can offer and gives advice for implementing one in your organization. We touch on the main challenges in privacy engineering today and the value of designing user-friendly privacy experiences. In addition, Aaron provides his perspective on selecting the right combination of Privacy Enhancing Technologies (PETs) for anonymity, how to go about implementing PETs, and the role that AI governance plays in his work. 
</p><p><b>Topics Covered: </b></p><ul><li>Aaron’s deep privacy and consulting background and how he ended up leading HP&apos;s Privacy Engineering Center of Excellence </li><li>The definition of a &quot;Center of Excellence&quot; (CoE) and how a Privacy Engineering CoE can drive value for an organization and shift privacy left</li><li>What motivates a company like HP to launch a CoE for Privacy Engineering and what its reporting line should be</li><li>Aaron&apos;s approach to creating a Privacy Engineering CoE roadmap; his strategy for staffing this CoE; and the skills &amp; abilities that he sought</li><li>How HP&apos;s Privacy Engineering CoE works with the business to advise on, and select, the right PETs for each business use case</li><li>Why it&apos;s essential to know the privacy guarantees that your organization wants to assert before selecting the right PETs to get you there</li><li>Lessons Learned from setting up a Privacy Engineering CoE and how to get executive sponsorship</li><li>The amount of time that Privacy teams have had to work on AI issues over the past year, and advice on preventing burnout</li><li>Aaron&apos;s hypothesis about the value of getting an early handle on governance over the adoption of innovative technologies</li><li>The importance of being open to continuous learning in the field of privacy engineering </li></ul><p><b>Guest Info: </b></p><ul><li>Connect with <a href='https://www.linkedin.com/in/aaronweller/'>Aaron on LinkedIn</a></li><li>Learn about <a href='https://iapp.org/news/a/a-look-at-hps-privacy-engineering-center-of-excellence/'>HP&apos;s Privacy Engineering Center of Excellence</a></li><li>Review the <a href='https://owasp.org/www-project-machine-learning-security-top-10/'>OWASP Machine Learning Security Top 10</a></li><li>Review the <a href='https://owasp.org/www-project-top-10-for-large-language-model-applications/'>OWASP Top 10 for LLM Applications</a></li></ul><p><a target="_blank" 
href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14809102-s3e10-how-a-privacy-engineering-center-of-excellence-shifts-privacy-left-with-aaron-weller-hp.mp3" length="29003302" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/86sms5zkpn2vpus47f7cjkuqe2pe?.jpg" />
    <itunes:author>Debra J. Farber / Aaron Weller</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14809102</guid>
    <pubDate>Tue, 09 Apr 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14809102/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14809102/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14809102/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14809102/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14809102/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E10: &#39;How a Privacy Engineering Center of Excellence Shifts Privacy Left&#39; with Aaron Weller (HP)" />
  <psc:chapter start="2:43" title="Introducing Aaron Weller, Leader of the HP Privacy Engineering Center of Excellence" />
  <psc:chapter start="4:43" title="The definition of a &quot;Center of Excellence&quot; (CoE); and how a Privacy Engineering CoE can drive value for an organization &amp; enable it to shift privacy left" />
  <psc:chapter start="6:50" title="What motivates a company like HP to launch a CoE for Privacy Engineering and what its reporting line should be" />
  <psc:chapter start="9:54" title="Aaron&#39;s approach to creating a Privacy Engineering CoE roadmap; his strategy for staffing this CoE; and the skills &amp; abilities that he sought" />
  <psc:chapter start="13:57" title="How HP&#39;s Privacy Engineering CoE works with the business to advise on and select the right PETs for each use case" />
  <psc:chapter start="21:23" title="Lessons Learned from setting up a Privacy Engineering CoE; and advice on how to convince business leaders at other companies of the ROI for a CoE" />
  <psc:chapter start="27:04" title="The amount of time that Privacy teams have had to work on AI issues over the past year; and Aaron&#39;s advice on preventing burnout for Privacy Engineering teams" />
  <psc:chapter start="30:15" title="Aaron&#39;s hypothesis about the value of getting an early handle on governance over the adoption of innovative technologies" />
  <psc:chapter start="34:36" title="Aaron&#39;s advice for those who want to get into privacy engineering" />
</psc:chapters>
    <itunes:duration>2413</itunes:duration>
    <itunes:keywords>Privacy Engineering Center of Excellence, Privacy Engineering, CoE, HP, hiring, PETs, Privacy Enhancing Technologies, OWASP</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>10</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E9: &#39;Building a Culture of Privacy &amp; Achieving Compliance without Sacrificing Innovation&#39; with Amaka Ibeji (Cruise)</itunes:title>
    <title>S3E9: &#39;Building a Culture of Privacy &amp; Achieving Compliance without Sacrificing Innovation&#39; with Amaka Ibeji (Cruise)</title>
    <itunes:summary><![CDATA[Today, I’m joined by Amaka Ibeji, Privacy Engineer at Cruise where she designs and implements robust privacy programs and controls. In this episode, we discuss Amaka's passion for creating a culture of privacy and compliance within organizations and engineering teams. Amaka also hosts the PALS Parlor Podcast, where she speaks to business leaders and peers about privacy, AI governance, leadership, and security and explains technical concepts in a digestible way. The podcast aims to enable busi...]]></itunes:summary>
    <description><![CDATA[<p>Today, I’m joined by Amaka Ibeji, Privacy Engineer at Cruise, where she designs and implements robust privacy programs and controls. In this episode, we discuss Amaka&apos;s passion for creating a culture of privacy and compliance within organizations and engineering teams. Amaka also hosts the PALS Parlor Podcast, where she speaks to business leaders and peers about privacy, AI governance, leadership, and security and explains technical concepts in a digestible way. The podcast aims to enable business leaders to do more with their data and provides a way for the community to share knowledge with one another.<br/><br/>In our conversation, we touch on her career trajectory from security engineer to privacy engineer and the intersection of cybersecurity, privacy engineering, and AI governance. We highlight the importance of early engagement with various technical teams to enable innovation while still achieving privacy compliance. Amaka also shares the privacy-enhancing technologies (PETs) that she is most excited about, and she recommends resources for those who want to learn more about strategic privacy engineering. Amaka emphasizes that privacy is a systemic, &apos;wicked problem&apos; and offers her tips for understanding and approaching it.
<br/><br/><b>Topics Covered</b>:</p><ul><li>How Amaka&apos;s compliance-focused experience at Microsoft helped prepare her for her Privacy Engineering role at Cruise</li><li>Where privacy overlaps with the development of AI </li><li>Advice for shifting privacy left to make privacy stretch beyond a compliance exercise</li><li>What works well and what doesn&apos;t when building a &apos;Culture of Privacy&apos;</li><li>Privacy by Design approaches that make privacy &amp; innovation a win-win rather than a zero-sum game</li><li>Privacy Engineering trends that Amaka sees; and, the PETs about which she&apos;s most excited</li><li>Amaka&apos;s Privacy Engineering resource recommendations, including: <ul><li>Hoepman&apos;s &quot;Privacy Design Strategies&quot; book;</li><li>The LINDDUN Privacy Threat Modeling Framework; and</li><li>The PLOT4AI Framework</li></ul></li><li>&quot;The PALS Parlor Podcast,&quot; focused on Privacy Engineering, AI Governance, Leadership, &amp; Security<ul><li>Why Amaka launched the podcast;</li><li>Her intended audience; and</li><li>Topics that she plans to cover this year</li></ul></li><li>The importance of collaboration, building a community of passionate privacy engineers, and addressing the systemic issue of privacy </li></ul><p><b>Guest Info &amp; Resources</b>:</p><ul><li>Follow <a href='https://www.linkedin.com/in/amakai'>Amaka on LinkedIn</a></li><li>Listen to <a href='https://pals.buzzsprout.com/'>The PALS Parlor Podcast</a></li><li>Read Jaap-Henk Hoepman&apos;s &quot;<a href='https://www.cs.ru.nl/~jhh/publications/pds-booklet.pdf'>Privacy Design Strategies (The Little Blue Book)</a>&quot;</li><li>Read Jason Cronk&apos;s &quot;<a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design, 2nd Edition</a>&quot;</li><li>Check out <a href='https://linddun.org/'>The LINDDUN Privacy Threat Modeling Framework</a></li><li>Check out <a href='http://plot4.ai'>The Privacy Library of Threats for Artificial
Intelligence (PLOT4AI)</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Today, I’m joined by Amaka Ibeji, Privacy Engineer at Cruise, where she designs and implements robust privacy programs and controls. In this episode, we discuss Amaka&apos;s passion for creating a culture of privacy and compliance within organizations and engineering teams. Amaka also hosts the PALS Parlor Podcast, where she speaks to business leaders and peers about privacy, AI governance, leadership, and security and explains technical concepts in a digestible way. The podcast aims to enable business leaders to do more with their data and provides a way for the community to share knowledge with one another.<br/><br/>In our conversation, we touch on her career trajectory from security engineer to privacy engineer and the intersection of cybersecurity, privacy engineering, and AI governance. We highlight the importance of early engagement with various technical teams to enable innovation while still achieving privacy compliance. Amaka also shares the privacy-enhancing technologies (PETs) that she is most excited about, and she recommends resources for those who want to learn more about strategic privacy engineering. Amaka emphasizes that privacy is a systemic, &apos;wicked problem&apos; and offers her tips for understanding and approaching it.
<br/><br/><b>Topics Covered</b>:</p><ul><li>How Amaka&apos;s compliance-focused experience at Microsoft helped prepare her for her Privacy Engineering role at Cruise</li><li>Where privacy overlaps with the development of AI </li><li>Advice for shifting privacy left to make privacy stretch beyond a compliance exercise</li><li>What works well and what doesn&apos;t when building a &apos;Culture of Privacy&apos;</li><li>Privacy by Design approaches that make privacy &amp; innovation a win-win rather than a zero-sum game</li><li>Privacy Engineering trends that Amaka sees; and, the PETs about which she&apos;s most excited</li><li>Amaka&apos;s Privacy Engineering resource recommendations, including: <ul><li>Hoepman&apos;s &quot;Privacy Design Strategies&quot; book;</li><li>The LINDDUN Privacy Threat Modeling Framework; and</li><li>The PLOT4AI Framework</li></ul></li><li>&quot;The PALS Parlor Podcast,&quot; focused on Privacy Engineering, AI Governance, Leadership, &amp; Security<ul><li>Why Amaka launched the podcast;</li><li>Her intended audience; and</li><li>Topics that she plans to cover this year</li></ul></li><li>The importance of collaboration, building a community of passionate privacy engineers, and addressing the systemic issue of privacy </li></ul><p><b>Guest Info &amp; Resources</b>:</p><ul><li>Follow <a href='https://www.linkedin.com/in/amakai'>Amaka on LinkedIn</a></li><li>Listen to <a href='https://pals.buzzsprout.com/'>The PALS Parlor Podcast</a></li><li>Read Jaap-Henk Hoepman&apos;s &quot;<a href='https://www.cs.ru.nl/~jhh/publications/pds-booklet.pdf'>Privacy Design Strategies (The Little Blue Book)</a>&quot;</li><li>Read Jason Cronk&apos;s &quot;<a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design, 2nd Edition</a>&quot;</li><li>Check out <a href='https://linddun.org/'>The LINDDUN Privacy Threat Modeling Framework</a></li><li>Check out <a href='http://plot4.ai'>The Privacy Library of Threats for Artificial
Intelligence (PLOT4AI)</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14772627-s3e9-building-a-culture-of-privacy-achieving-compliance-without-sacrificing-innovation-with-amaka-ibeji-cruise.mp3" length="31302152" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/cnrlfifm47pdw8fanclxvlqlrp3u?.jpg" />
    <itunes:author>Debra J. Farber / Amaka Ibeji</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14772627</guid>
    <pubDate>Tue, 02 Apr 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14772627/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14772627/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14772627/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14772627/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14772627/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E9: &#39;Building a Culture of Privacy &amp; Achieving Compliance without Sacrificing Innovation&#39; with Amaka Ibeji (Cruise)" />
  <psc:chapter start="2:35" title="Introducing Amaka Ibeji, Privacy Engineer at Cruise; how she&#39;s moved from security engineer to privacy engineer; how privacy issues overlap with AI issues" />
  <psc:chapter start="7:42" title="How Amaka&#39;s compliance-focused experience at Microsoft has helped prepare her for her privacy engineering role at Cruise" />
  <psc:chapter start="10:32" title="Amaka&#39;s advice for engineers who want to shift privacy left into design, architecture, engineering, &amp; data science to actually impact privacy instead of viewing privacy as just a compliance exercise" />
  <psc:chapter start="14:27" title="What works well and what doesn&#39;t when building a &#39;Culture of Privacy&#39;" />
  <psc:chapter start="18:39" title="Privacy by Design approaches that make privacy &amp; innovation a win-win rather than a zero-sum game" />
  <psc:chapter start="22:41" title="Privacy Engineering trends that Amaka sees; and, the PETs that she&#39;s most excited about" />
  <psc:chapter start="28:04" title="Amaka&#39;s recommended Privacy Engineering resources, including: Hoepman&#39;s &quot;Privacy Design Strategies&quot; book, The LINDDUN Privacy Threat Modeling Framework &amp; the PLOT4AI Framework" />
  <psc:chapter start="31:49" title="Amaka shares why she launched &quot;The PALS Parlor Podcast&quot; focused on Privacy Engineering, AI Governance, Leadership, &amp; Security; who is her intended audience; and some topics that she plans for her show to cover" />
</psc:chapters>
    <itunes:duration>2604</itunes:duration>
    <itunes:keywords>privacy by design, innovation, PALS Parlor Podcast, Amaka Ibeji, LINDDUN, PLOT4AI, Hoepman Privacy Design Strategies, Strategic Privacy by Design, Jaap-Henk Hoepman, Jason Cronk, Kim Wuyts, Isabel Barbera, PETs, Cruise, Microsoft, culture of privacy</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>9</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E8: &#39;Recent FTC Enforcement: What Privacy Engineers Need to Know&#39; with Heidi Saas (H.T. Saas)</itunes:title>
    <title>S3E8: &#39;Recent FTC Enforcement: What Privacy Engineers Need to Know&#39; with Heidi Saas (H.T. Saas)</title>
    <itunes:summary><![CDATA[In this week's episode, I am joined by Heidi Saas, a privacy lawyer with a reputation for advocating for products and services built with privacy by design and against the abuse of personal data. In our conversation, she dives into recent FTC enforcement actions, analyzing five FTC actions and some enforcement sweeps by Colorado &amp; Connecticut.   Heidi shares her insights on the effect of the FTC enforcement actions and what privacy engineers need to know, emphasizing the need for data man...]]></itunes:summary>
    <description><![CDATA[<p>In this week&apos;s episode, I am joined by Heidi Saas, a privacy lawyer with a reputation for advocating for products and services built with privacy by design and against the abuse of personal data. In our conversation, she dives into recent FTC enforcement actions, analyzing five FTC actions and some enforcement sweeps by Colorado &amp; Connecticut. <br/><br/>Heidi shares her insights on the effect of the FTC enforcement actions and what privacy engineers need to know, emphasizing the need for data management practices to be transparent, accountable, and based on affirmative consent. We cover the role of privacy engineers in ensuring compliance with data privacy laws; why &apos;browsing data&apos; is &apos;sensitive data;&apos; the challenges companies face regarding data deletion; and the need for clear consent mechanisms, especially with the collection and use of location data. We also discuss the need to audit the privacy posture of products and services - which includes a requirement to document who made certain decisions - and how to prioritize risk analysis to proactively address risks to privacy.<br/><br/><b>Topics Covered</b>: </p><ul><li>Heidi’s journey into privacy law and advocacy for privacy by design and default</li><li>How the FTC brings enforcement actions, the effect of their settlements, and why privacy engineers should pay closer attention</li><li>Case 1: FTC v. InMarket Media - Heidi explains the implication of the decision: where data that are linked to a mobile advertising identifier (MAID) or an individual&apos;s home are not considered de-identified</li><li>Case 2: FTC v. X-Mode Social / OutLogic - Heidi explains the implication of the decision, focused on: affirmative express consent for location data collection; definition of a &apos;data product assessment&apos; and audit programs; and data retention &amp; deletion requirements</li><li>Case 3: FTC v. 
Avast - Heidi explains the implication of the decision: &apos;browsing data&apos; is considered &apos;sensitive data&apos;</li><li>Case 4: The People (CA) v. DoorDash - Heidi explains the implications of the decision, based on CalOPPA: where companies that share personal data with one another as part of a &apos;marketing cooperative&apos; are, in fact, selling data</li><li>Heidi discusses recent State Enforcement Sweeps for privacy, specifically in Colorado and Connecticut, and clarity around breach reporting timelines</li><li>The need to prioritize independent third-party audits for privacy</li><li>Case 5: FTC v. Kroger - Heidi explains why the FTC&apos;s blocking of Kroger&apos;s merger with Albertson&apos;s was based on antitrust and privacy harms given the sheer amount of personal data that they process</li><li>Tools and resources for keeping up with FTC cases and connecting with your privacy community </li></ul><p><b>Guest Info</b>: </p><ul><li>Follow Heidi on <a href='https://www.linkedin.com/in/heidi-saas-31a7a16/'>LinkedIn</a></li><li>Read (book): <a href='https://www.amazon.com/Means-Control-Alliance-Government-Surveillance/dp/0593443225'> &apos;Means of Control: How the Hidden Alliance of Tech and Government is Creating a New American Surveillance State&apos;</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links.
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this week&apos;s episode, I am joined by Heidi Saas, a privacy lawyer with a reputation for advocating for products and services built with privacy by design and against the abuse of personal data. In our conversation, she dives into recent FTC enforcement actions, analyzing five FTC actions and some enforcement sweeps by Colorado &amp; Connecticut. <br/><br/>Heidi shares her insights on the effect of the FTC enforcement actions and what privacy engineers need to know, emphasizing the need for data management practices to be transparent, accountable, and based on affirmative consent. We cover the role of privacy engineers in ensuring compliance with data privacy laws; why &apos;browsing data&apos; is &apos;sensitive data;&apos; the challenges companies face regarding data deletion; and the need for clear consent mechanisms, especially with the collection and use of location data. We also discuss the need to audit the privacy posture of products and services - which includes a requirement to document who made certain decisions - and how to prioritize risk analysis to proactively address risks to privacy.<br/><br/><b>Topics Covered</b>: </p><ul><li>Heidi’s journey into privacy law and advocacy for privacy by design and default</li><li>How the FTC brings enforcement actions, the effect of their settlements, and why privacy engineers should pay closer attention</li><li>Case 1: FTC v. InMarket Media - Heidi explains the implication of the decision: where data that are linked to a mobile advertising identifier (MAID) or an individual&apos;s home are not considered de-identified</li><li>Case 2: FTC v. X-Mode Social / OutLogic - Heidi explains the implication of the decision, focused on: affirmative express consent for location data collection; definition of a &apos;data product assessment&apos; and audit programs; and data retention &amp; deletion requirements</li><li>Case 3: FTC v. 
Avast - Heidi explains the implication of the decision: &apos;browsing data&apos; is considered &apos;sensitive data&apos;</li><li>Case 4: The People (CA) v. DoorDash - Heidi explains the implications of the decision, based on CalOPPA: where companies that share personal data with one another as part of a &apos;marketing cooperative&apos; are, in fact, selling data</li><li>Heidi discusses recent State Enforcement Sweeps for privacy, specifically in Colorado and Connecticut, and clarity around breach reporting timelines</li><li>The need to prioritize independent third-party audits for privacy</li><li>Case 5: FTC v. Kroger - Heidi explains why the FTC&apos;s blocking of Kroger&apos;s merger with Albertson&apos;s was based on antitrust and privacy harms given the sheer amount of personal data that they process</li><li>Tools and resources for keeping up with FTC cases and connecting with your privacy community </li></ul><p><b>Guest Info</b>: </p><ul><li>Follow Heidi on <a href='https://www.linkedin.com/in/heidi-saas-31a7a16/'>LinkedIn</a></li><li>Read (book): <a href='https://www.amazon.com/Means-Control-Alliance-Government-Surveillance/dp/0593443225'> &apos;Means of Control: How the Hidden Alliance of Tech and Government is Creating a New American Surveillance State&apos;</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links.
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14768200-s3e8-recent-ftc-enforcement-what-privacy-engineers-need-to-know-with-heidi-saas-h-t-saas.mp3" length="54445550" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/6vd6u1ufjea7c2is2kmu40puv31l?.jpg" />
    <itunes:author>Debra J. Farber / Heidi Saas</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14768200</guid>
    <pubDate>Tue, 26 Mar 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14768200/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14768200/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14768200/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14768200/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14768200/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E8: &#39;Recent FTC Enforcement: What Privacy Engineers Need to Know&#39; with Heidi Saas (H.T. Saas)" />
  <psc:chapter start="1:56" title="Introducing Heidi Saas" />
  <psc:chapter start="4:23" title="Heidi&#39;s journey into privacy law and why advocating for privacy by design and default has been so important to her" />
  <psc:chapter start="10:15" title="How the FTC brings enforcement actions, the effect of their settlements, and why privacy engineers should pay closer attention" />
  <psc:chapter start="15:51" title="Case 1: FTC v. InMarket - Heidi explains the implication of the decision - data linked to a mobile advertising identifier or an individual&#39;s home is not considered de-identified" />
  <psc:chapter start="21:51" title="Case 2: FTC v. X-Mode Social / OutLogic - Heidi explains the implication of the decision, focused on: affirmative, express consent; definition of a &#39;data product assessment&#39; and audit programs; and data retention &amp; deletion requirements" />
  <psc:chapter start="32:09" title="Case 3: FTC v. Avast - Heidi explains the implication of the decision where &#39;browsing data&#39; is considered &#39;sensitive data&#39;" />
  <psc:chapter start="45:20" title="Case 4: The People (CA) v. DoorDash - Heidi explains the implications of the holding: where sharing personal data as part of a &#39;marketing cooperative&#39; constitutes a &#39;sale of data&#39;" />
  <psc:chapter start="49:45" title="Heidi discusses recent State Enforcement Sweeps, specifically in Colorado and Connecticut" />
  <psc:chapter start="1:01:04" title="Case 5: Heidi explains how the FTC blocked the Kroger merger with Albertson&#39;s based on the personal data they have" />
</psc:chapters>
    <itunes:duration>4533</itunes:duration>
    <itunes:keywords>FTC enforcement, Avast, Doordash, OutLogic, InMarket Media, de-identification, affirmative express consent, independent third-party audits, location data, browsing data, antitrust, privacy policies, privacy by design</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>8</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E7: &#39;Personal CRM: Embracing Digital Minimalism &amp; Privacy Empowerment&#39; with Chris Zeunstrom (Yorba)</itunes:title>
    <title>S3E7: &#39;Personal CRM: Embracing Digital Minimalism &amp; Privacy Empowerment&#39; with Chris Zeunstrom (Yorba)</title>
    <itunes:summary><![CDATA[This week's episode, I chat with Chris Zeunstrom, the Founder and CEO of Ruca and Yorba. Ruca is a global design cooperative and founder support network, while Yorba is a reverse CRM that aims to reduce your digital footprint and keep your personal information safe. Through his businesses, Chris focuses on solving common problems and creating innovative products. In our conversation, we talk about building a privacy-first company, the digital minimalist movement, and the future of decentraliz...]]></itunes:summary>
    <description><![CDATA[<p>In this week&apos;s episode, I chat with Chris Zeunstrom, the Founder and CEO of Ruca and Yorba. Ruca is a global design cooperative and founder support network, while Yorba is a reverse CRM that aims to reduce your digital footprint and keep your personal information safe. Through his businesses, Chris focuses on solving common problems and creating innovative products. In our conversation, we talk about building a privacy-first company, the digital minimalist movement, and the future of decentralized identity and storage.</p><p>Chris shares his journey as a privacy-focused entrepreneur and his mission to prioritize privacy and decentralization in managing personal data. He also explains the digital minimalist movement and why its teachings reach beyond the industry. Chris touches on Yorba&apos;s collaboration with Consumer Reports to implement Permission Slip and create a Data Rights Protocol ecosystem that automates data deletion for consumers. Chris also emphasizes the benefits of decentralized identity and storage solutions in improving personal privacy and security.
Finally, he gives you a sneak peek at what&apos;s next in store for Yorba.</p><p><br/><b>Topics Covered: </b></p><ul><li>How Yorba was designed as a privacy-first consumer CRM platform; the problems that Yorba solves; and key product functionality &amp; privacy features</li><li>Why Chris decided to bring a consumer product to market for privacy rather than a B2B product</li><li>Why Chris incorporated Yorba as a &apos;Public Benefit Corporation&apos; (PBC) and sought B Corp status</li><li>Exploring &apos;Digital Minimalism&apos; </li><li>How Yorba is working with Consumer Reports to advance the CR Data Rights Protocol, leveraging &apos;Permission Slip&apos; - an authorized agent for consumers to submit data deletion requests</li><li>The architectural design decisions behind Yorba’s personal CRM system </li><li>The benefits of using Matomo Analytics or Fathom Analytics for greater privacy vs. using Google Analytics </li><li>The privacy benefits of deploying &apos;Decentralized Identity&apos; &amp; &apos;Decentralized Storage&apos; architectures</li><li>Chris&apos; vision for the next stage of the Internet; and, the future of Yorba</li></ul><p><b>Guest Info: </b></p><ul><li>Follow/Connect with <a href='https://www.linkedin.com/in/chriszeunstrom/?originalSubdomain=pt'>Chris on LinkedIn</a></li><li>Check out <a href='https://yorba.co/'>Yorba&apos;s website</a> </li></ul><p><b>Resources Mentioned: </b></p><ul><li>Read: <a href='https://techcrunch.com/2024/02/22/yorbas-service-is-like-mint-for-uncluttering-your-entire-digital-life/'>TechCrunch&apos;s review of Yorba</a></li><li>Read: &apos;<a href='https://www.amazon.com/Digital-Minimalism-Choosing-Focused-Noisy/dp/0525536515'>Digital Minimalism - Choosing a Focused Life In a Noisy World</a>&apos; by Cal Newport</li><li>Subscribe to the <a href='https://bulletjournal.com/'>Bullet Journal</a> (AKA Bujo) on Digital Minimalism by Ryder Carroll</li><li>Learn about <a href='https://www.permissionslipcr.com/'>Consumer
Reports&apos; Permission Slip Protocol </a></li><li>Check out <a href='https://matomo.org/'>Matomo Analytics</a> and <a href='https://usefathom.com/'>Fathom</a> for privacy-first analytics</li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this week&apos;s episode, I chat with Chris Zeunstrom, the Founder and CEO of Ruca and Yorba. Ruca is a global design cooperative and founder support network, while Yorba is a reverse CRM that aims to reduce your digital footprint and keep your personal information safe. Through his businesses, Chris focuses on solving common problems and creating innovative products. In our conversation, we talk about building a privacy-first company, the digital minimalist movement, and the future of decentralized identity and storage.</p><p>Chris shares his journey as a privacy-focused entrepreneur and his mission to prioritize privacy and decentralization in managing personal data. He also explains the digital minimalist movement and why its teachings reach beyond the industry. Chris touches on Yorba&apos;s collaboration with Consumer Reports to implement Permission Slip and create a Data Rights Protocol ecosystem that automates data deletion for consumers. Chris also emphasizes the benefits of decentralized identity and storage solutions in improving personal privacy and security.
Finally, he gives you a sneak peek at what&apos;s next in store for Yorba.</p><p><br/><b>Topics Covered: </b></p><ul><li>How Yorba was designed as a privacy-1st consumer CRM platform; the problems that Yorba solves; and key product functionality &amp; privacy features</li><li>Why Chris decided to bring a consumer product to market for privacy rather than a B2B product</li><li>Why Chris incorporated Yorba as a &apos;Public Benefit Corporation&apos; (PBC) and sought B Corp status</li><li>Exploring &apos;Digital Minimalism&apos; </li><li>How Yorba is working with Consumer Reports to advance the CR Data Rights Protocol, leveraging &apos;Permission Slip&apos; - an authorized agent for consumers to submit data deletion requests</li><li>The architectural design decisions behind Yorba’s personal CRM system </li><li>The benefits of using Matomo Analytics or Fathom Analytics for greater privacy vs. using Google Analytics </li><li>The privacy benefits of deploying &apos;Decentralized Identity&apos; &amp; &apos;Decentralized Storage&apos; architectures</li><li>Chris&apos; vision for the next stage of the Internet; and the future of Yorba</li></ul><p><b>Guest Info: </b></p><ul><li>Follow/Connect with <a href='https://www.linkedin.com/in/chriszeunstrom/?originalSubdomain=pt'>Chris on LinkedIn</a></li><li>Check out <a href='https://yorba.co/'>Yorba&apos;s website</a> </li></ul><p><b>Resources Mentioned: </b></p><ul><li>Read: <a href='https://techcrunch.com/2024/02/22/yorbas-service-is-like-mint-for-uncluttering-your-entire-digital-life/'>TechCrunch&apos;s review of Yorba</a></li><li>Read: &apos;<a href='https://www.amazon.com/Digital-Minimalism-Choosing-Focused-Noisy/dp/0525536515'>Digital Minimalism - Choosing a Focused Life In a Noisy World</a>&apos; by Cal Newport</li><li>Subscribe to the <a href='https://bulletjournal.com/'>Bullet Journal</a> (AKA Bujo) on Digital Minimalism by Ryder Carroll</li><li>Learn about <a href='https://www.permissionslipcr.com/'>Consumer 
Reports&apos; Permission Slip Protocol </a></li><li>Check out <a href='https://matomo.org/'>Matomo Analytics</a> and <a href='https://usefathom.com/'>Fathom</a> for privacy-first analytics</li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14687390-s3e7-personal-crm-embracing-digital-minimalism-privacy-empowerment-with-chris-zeunstrom-yorba.mp3" length="31135791" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/qk30n7wl9bnir4jj9azcjw7an0vz?.jpg" />
    <itunes:author>Debra J. Farber / Chris Zeunstrom</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14687390</guid>
    <pubDate>Tue, 19 Mar 2024 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14687390/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14687390/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14687390/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14687390/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14687390/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E7: &#39;Personal CRM: Embracing Digital Minimalism &amp; Privacy Empowerment&#39; with Chris Zeunstrom (Yorba)" />
  <psc:chapter start="2:09" title="Introducing Chris Zeunstrom, Founder &amp; CEO at Ruca &amp; Yorba" />
  <psc:chapter start="6:29" title="Chris describes Yorba, a privacy-1st consumer CRM platform; the problems that Yorba solves; and key product functionality &amp; privacy features" />
  <psc:chapter start="11:24" title="Why Chris decided to bring a consumer product to market to help achieve privacy rather than a B2B product" />
  <psc:chapter start="13:54" title="Chris explains why he incorporated Yorba as a &#39;Public Benefit Corporation&#39; (PBC) and sought B Corp status" />
  <psc:chapter start="18:26" title="Chris shares his passion for decluttering one&#39;s digital life with a &#39;Digital Minimalism&#39; design approach" />
  <psc:chapter start="22:41" title="Chris describes Yorba&#39;s work with Consumer Reports on CR&#39;s Data Rights Protocol and its associated app, Permission Slip - an authorized agent for you to submit data deletion requests through that platform" />
  <psc:chapter start="26:24" title="Chris shares how his team thought about &amp; implemented privacy-enabling architecture" />
  <psc:chapter start="28:21" title="Chris &amp; Debra discuss the benefits of using privacy-first analytics tools like Matomo" />
  <psc:chapter start="32:39" title="Debra &amp; Chris discuss the privacy benefits of deploying &#39;Decentralized Identity&#39; &amp; &#39;Decentralized Storage&#39;" />
  <psc:chapter start="37:31" title="Chris describes his vision of the next phase of the Internet where people will be able to manage relationships with companies and platforms, essentially via a combination of a &#39;Personal CRM System&#39; &amp; &#39;Private Profiles&#39;" />
  <psc:chapter start="38:40" title="Chris shares what&#39;s on Yorba&#39;s product roadmap and how to collaborate" />
</psc:chapters>
    <itunes:duration>2591</itunes:duration>
    <itunes:keywords>digital minimalism, personal CRM, data minimization, minimalist design, decentralized identity, decentralized storage, consumer privacy, Yorba, Consumer Reports, Permission Slip, Fathom, Matomo, B Corp, Public Benefit Corporation</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>7</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E6: &#39;Keys to Good Privacy Implementation: Exploring Anonymization, Consent, &amp; DSARs&#39; with Jake Ottenwaelder (Integrative Privacy)</itunes:title>
    <title>S3E6: &#39;Keys to Good Privacy Implementation: Exploring Anonymization, Consent, &amp; DSARs&#39; with Jake Ottenwaelder (Integrative Privacy)</title>
    <itunes:summary><![CDATA[In this week's episode, I sat down with Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy LLC. Throughout our conversation, we discuss Jake’s holistic approach to privacy implementation that considers business, engineering, and personal objectives, as well as the role of anonymization, consent management, and DSAR processes for greater privacy. Jake believes privacy implementation must account for the interconnectedness of privacy technologies and human interac...]]></itunes:summary>
    <description><![CDATA[<p>In this week&apos;s episode, I sat down with <a href='https://www.linkedin.com/in/jake-ottenwaelder/'>Jake Ottenwaelder</a>,  Principal Privacy Engineer at <a href='https://integrativeprivacy.com/'>Integrative Privacy LLC</a>. Throughout our conversation, we discuss Jake’s holistic approach to privacy implementation that considers business, engineering, and personal objectives, as well as the role of anonymization, consent management, and DSAR processes for greater privacy. <br/><br/>Jake believes privacy implementation must account for the interconnectedness of privacy technologies and human interactions. He highlights what a successful implementation looks like and the negative consequences when done poorly. We also dive into the challenges of implementing privacy in fast-paced, engineering-driven organizations. We talk about the complexities of anonymizing data (a very high bar) and he offers valuable suggestions and strategies for achieving anonymity while making the necessary resources more accessible. Plus, Jake shares his advice for organizational leaders to see themselves as servant-leaders, leaving a positive legacy in the field of privacy. 
</p><p><b>Topics Covered: </b></p><ul><li>What inspired Jake’s initial shift from security engineering to privacy engineering, with a focus on privacy implementation</li><li>How Jake&apos;s previous role at Axon helped him shift his mindset to privacy</li><li>Jake’s holistic approach to implementing privacy </li><li>The qualities of a successful implementation and the consequences of an unsuccessful implementation</li><li>The challenges of implementing privacy in large organizations </li><li>Common blockers to the deployment of anonymization</li><li>Jake’s perspective on using differential privacy techniques to achieve anonymity</li><li>Common blockers to implementing consent management capabilities</li><li>The importance of understanding data flow &amp; lineage, and auditing data deletion </li><li>Holistic approaches to implementing a streamlined and compliant DSAR process with minimal business disruption </li><li>Why Jake believes it&apos;s important to maintain a servant-leader mindset in privacy</li></ul><p><b>Guest Info: </b></p><ul><li>Connect with Jake on <a href='https://www.linkedin.com/in/jake-ottenwaelder/'>LinkedIn</a></li><li><a href='https://integrativeprivacy.com/'>Integrative Privacy LLC</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this week&apos;s episode, I sat down with <a href='https://www.linkedin.com/in/jake-ottenwaelder/'>Jake Ottenwaelder</a>,  Principal Privacy Engineer at <a href='https://integrativeprivacy.com/'>Integrative Privacy LLC</a>. Throughout our conversation, we discuss Jake’s holistic approach to privacy implementation that considers business, engineering, and personal objectives, as well as the role of anonymization, consent management, and DSAR processes for greater privacy. <br/><br/>Jake believes privacy implementation must account for the interconnectedness of privacy technologies and human interactions. He highlights what a successful implementation looks like and the negative consequences when done poorly. We also dive into the challenges of implementing privacy in fast-paced, engineering-driven organizations. We talk about the complexities of anonymizing data (a very high bar) and he offers valuable suggestions and strategies for achieving anonymity while making the necessary resources more accessible. Plus, Jake shares his advice for organizational leaders to see themselves as servant-leaders, leaving a positive legacy in the field of privacy. 
</p><p><b>Topics Covered: </b></p><ul><li>What inspired Jake’s initial shift from security engineering to privacy engineering, with a focus on privacy implementation</li><li>How Jake&apos;s previous role at Axon helped him shift his mindset to privacy</li><li>Jake’s holistic approach to implementing privacy </li><li>The qualities of a successful implementation and the consequences of an unsuccessful implementation</li><li>The challenges of implementing privacy in large organizations </li><li>Common blockers to the deployment of anonymization</li><li>Jake’s perspective on using differential privacy techniques to achieve anonymity</li><li>Common blockers to implementing consent management capabilities</li><li>The importance of understanding data flow &amp; lineage, and auditing data deletion </li><li>Holistic approaches to implementing a streamlined and compliant DSAR process with minimal business disruption </li><li>Why Jake believes it&apos;s important to maintain a servant-leader mindset in privacy</li></ul><p><b>Guest Info: </b></p><ul><li>Connect with Jake on <a href='https://www.linkedin.com/in/jake-ottenwaelder/'>LinkedIn</a></li><li><a href='https://integrativeprivacy.com/'>Integrative Privacy LLC</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14595541-s3e6-keys-to-good-privacy-implementation-exploring-anonymization-consent-dsars-with-jake-ottenwaelder-integrative-privacy.mp3" length="39033691" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/qhrle62xiklq93p2b6oeutgc3fwt?.jpg" />
    <itunes:author>Debra J. Farber / Jake Ottenwaelder</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14595541</guid>
    <pubDate>Tue, 05 Mar 2024 12:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14595541/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14595541/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14595541/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14595541/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14595541/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E6: &#39;Keys to Good Privacy Implementation: Exploring Anonymization, Consent, &amp; DSARs&#39; with Jake Ottenwaelder (Integrative Privacy)" />
  <psc:chapter start="2:03" title="Introducing Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy" />
  <psc:chapter start="2:55" title="How Jake moved from Security Engineer to Privacy Engineer and how he shifted his mindset to foster solutions for privacy" />
  <psc:chapter start="11:45" title="Jake talks about how his previous role at Axon helped him shift into a privacy mindset" />
  <psc:chapter start="13:10" title="Jake explains why taking an integrative approach to privacy is important to him" />
  <psc:chapter start="16:09" title="Jake&#39;s definition of a &#39;successful implementation&#39; and what makes for a &#39;bad implementation&#39; or not as successful" />
  <psc:chapter start="20:06" title="Jake discusses consequences of bad implementations of privacy, like technical debt" />
  <psc:chapter start="24:23" title="Debra &amp; Jake discuss the challenges of working in privacy at an engineering-heavy organization where you want to understand the privacy implications of everything but usually don&#39;t have the ability to do so" />
  <psc:chapter start="31:13" title="Jake shares common blockers to the deployment of anonymization in orgs" />
  <psc:chapter start="43:19" title="Jake shares the current blockers to implementing Consent Management capabilities into organizations" />
  <psc:chapter start="45:59" title="Jake describes the current blockers to implementing rights management capabilities through DSARs" />
  <psc:chapter start="52:17" title="Jake explains why it&#39;s important to have a servant-leader mindset in privacy" />
</psc:chapters>
    <itunes:duration>3249</itunes:duration>
    <itunes:keywords></itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>6</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E5: &#39;Nonconformist Innovation in Modern Digital Identity&#39; with Steve Tout (Integrated Solutions Group)</itunes:title>
    <title>S3E5: &#39;Nonconformist Innovation in Modern Digital Identity&#39; with Steve Tout (Integrated Solutions Group)</title>
    <itunes:summary><![CDATA[In this week's episode, I am joined by Steve Tout, Practice Lead at Integrated Solutions Group (ISG) and Host of The Nonconformist Innovation Podcast to discuss the intersection of privacy and identity. Steve has 18+ years of experience in global Identity &amp; Access Management (IAM) and is currently completing his MBA from Santa Clara University. Throughout our conversation, Steve shares his journey as a reformed technologist and advocate for 'Nonconformist Innovation' &amp; 'Tipping Point ...]]></itunes:summary>
    <description><![CDATA[<p>In this week&apos;s episode, I am joined by <a href='https://www.linkedin.com/in/stevetout/'>Steve Tout</a>, Practice Lead at <a href='https://www.isg-nw.com/'>Integrated Solutions Group </a>(ISG) and Host of The Nonconformist Innovation Podcast to discuss the intersection of privacy and identity. Steve has 18+ years of experience in global Identity &amp; Access Management (IAM) and is currently completing his MBA from Santa Clara University. Throughout our conversation, Steve shares his journey as a reformed technologist and advocate for &apos;Nonconformist Innovation&apos; &amp; &apos;Tipping Point Leadership.&apos;<br/><br/>Steve&apos;s approach to identity involves breaking it down into 4 components: 1) philosophy, 2) politics, 3) economics &amp; 4) technology, highlighting their interconnectedness. We also discuss his work with Washington State and its efforts to modernize Consumer Identity Access Management (IAM). We address concerns around AI, biometrics &amp; mobile driver&apos;s licenses. 
Plus, Steve offers his perspective on tipping point leadership and the challenges organizations face in achieving privacy change at scale.<br/><br/><b>Topics Covered: </b></p><ul><li>Steve&apos;s origin story; his accidental entry into identity &amp; access management (IAM)</li><li>Steve&apos;s perspective as a &apos;Nonconformist Innovator&apos; and why he launched &apos;The Nonconformist Innovation Podcast&apos;</li><li>The intersection of privacy &amp; identity</li><li>How to address organizational resistance to change, especially with lean resources</li><li>Benefits gained from &apos;Tipping Point Leadership&apos;</li><li>4 common hurdles to tipping point leadership </li><li>How to be a successful tipping point leader within a very bottom-up focused organization</li><li>&apos;Consumer IAM&apos; &amp; the driving need for modernizing identity in Washington State</li><li>How Steve has approached the challenges related to privacy, ethics &amp; equity </li><li>Differences between the mobile driver&apos;s license (mDL) &amp; verified credentials (VC) standards &amp; technology</li><li>How States are approaching the implementation of mDL in different ways and the privacy benefits of &apos;selective disclosure&apos;</li><li>Steve&apos;s advice for privacy technologists to best position themselves and their orgs at the forefront of privacy and security innovation</li><li>Steve&apos;s recommended books for learning more about tipping point leadership</li></ul><p><b>Guest Info: </b></p><ul><li>Connect with Steve on <a href='https://www.linkedin.com/in/stevetout/'>LinkedIn</a></li><li>Listen to <a href='https://www.nonconformistinnovation.com/episodes'>The Nonconformist Innovation Podcast </a></li></ul><p><b>Resources Mentioned: </b></p><ul><li>Steve&apos;s <a href='https://www.nonconformistinnovation.com/tom-kemp/'>Interview with Tom Kemp</a></li><li>Tipping Point Leadership books:<ul><li><a href='https://www.amazon.com/Change-Management-including-featured-Leading/dp/1422158004'>On 
Change Management </a></li><li><a href='https://open.umn.edu/opentextbooks/textbooks/organizational-behavior'>Organizational Behavior</a></li><li><a href='https://www.scu.edu/institute-for-technology-ethics-and-culture/itec-handbook/'>Ethics in the Age of Disruptive Technologies: An Operational Roadmap</a></li></ul></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this week&apos;s episode, I am joined by <a href='https://www.linkedin.com/in/stevetout/'>Steve Tout</a>, Practice Lead at <a href='https://www.isg-nw.com/'>Integrated Solutions Group </a>(ISG) and Host of The Nonconformist Innovation Podcast to discuss the intersection of privacy and identity. Steve has 18+ years of experience in global Identity &amp; Access Management (IAM) and is currently completing his MBA from Santa Clara University. Throughout our conversation, Steve shares his journey as a reformed technologist and advocate for &apos;Nonconformist Innovation&apos; &amp; &apos;Tipping Point Leadership.&apos;<br/><br/>Steve&apos;s approach to identity involves breaking it down into 4 components: 1) philosophy, 2) politics, 3) economics &amp; 4) technology, highlighting their interconnectedness. We also discuss his work with Washington State and its efforts to modernize Consumer Identity Access Management (IAM). We address concerns around AI, biometrics &amp; mobile driver&apos;s licenses. 
Plus, Steve offers his perspective on tipping point leadership and the challenges organizations face in achieving privacy change at scale.<br/><br/><b>Topics Covered: </b></p><ul><li>Steve&apos;s origin story; his accidental entry into identity &amp; access management (IAM)</li><li>Steve&apos;s perspective as a &apos;Nonconformist Innovator&apos; and why he launched &apos;The Nonconformist Innovation Podcast&apos;</li><li>The intersection of privacy &amp; identity</li><li>How to address organizational resistance to change, especially with lean resources</li><li>Benefits gained from &apos;Tipping Point Leadership&apos;</li><li>4 common hurdles to tipping point leadership </li><li>How to be a successful tipping point leader within a very bottom-up focused organization</li><li>&apos;Consumer IAM&apos; &amp; the driving need for modernizing identity in Washington State</li><li>How Steve has approached the challenges related to privacy, ethics &amp; equity </li><li>Differences between the mobile driver&apos;s license (mDL) &amp; verified credentials (VC) standards &amp; technology</li><li>How States are approaching the implementation of mDL in different ways and the privacy benefits of &apos;selective disclosure&apos;</li><li>Steve&apos;s advice for privacy technologists to best position themselves and their orgs at the forefront of privacy and security innovation</li><li>Steve&apos;s recommended books for learning more about tipping point leadership</li></ul><p><b>Guest Info: </b></p><ul><li>Connect with Steve on <a href='https://www.linkedin.com/in/stevetout/'>LinkedIn</a></li><li>Listen to <a href='https://www.nonconformistinnovation.com/episodes'>The Nonconformist Innovation Podcast </a></li></ul><p><b>Resources Mentioned: </b></p><ul><li>Steve&apos;s <a href='https://www.nonconformistinnovation.com/tom-kemp/'>Interview with Tom Kemp</a></li><li>Tipping Point Leadership books:<ul><li><a href='https://www.amazon.com/Change-Management-including-featured-Leading/dp/1422158004'>On 
Change Management </a></li><li><a href='https://open.umn.edu/opentextbooks/textbooks/organizational-behavior'>Organizational Behavior</a></li><li><a href='https://www.scu.edu/institute-for-technology-ethics-and-culture/itec-handbook/'>Ethics in the Age of Disruptive Technologies: An Operational Roadmap</a></li></ul></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14467700-s3e5-nonconformist-innovation-in-modern-digital-identity-with-steve-tout-integrated-solutions-group.mp3" length="39585281" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/nzdx70gjbp6l2blpbwsoig6ywmq1?.jpg" />
    <itunes:author>Debra J Farber / Steve Tout</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14467700</guid>
    <pubDate>Tue, 27 Feb 2024 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467700/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467700/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467700/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467700/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14467700/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E5: &#39;Nonconformist Innovation in Modern Digital Identity&#39; with Steve Tout (Integrated Solutions Group)" />
  <psc:chapter start="2:25" title="Introducing Steve Tout, Practice Lead at ISG and Host of The Nonconformist Innovation Podcast" />
  <psc:chapter start="4:14" title="Steve discusses his origin story and career path" />
  <psc:chapter start="7:18" title="Steve defines &quot;Nonconformist Innovation&quot; and describes how he&#39;s a nonconformist innovator" />
  <psc:chapter start="14:10" title="Steve tells us about his podcast, The Nonconformist Innovation Podcast" />
  <psc:chapter start="17:44" title="Debra &amp; Steve discuss the overlap of identity and privacy" />
  <psc:chapter start="26:42" title="Steve&#39;s advice on how to address organizational resistance to change, especially when your current resources are lean" />
  <psc:chapter start="30:10" title="Debra &amp; Steve discuss the benefits gained from &#39;Tipping Point Leadership,&#39; where business transformation can occur when a critical mass is driven by a small number of influential leaders or change agents within the organization" />
  <psc:chapter start="33:40" title="Steve shares how to be a successful tipping point leader if you work in a very bottom-up focused organization" />
  <psc:chapter start="39:02" title="Steve explains what &#39;consumer IAM&#39; is and the driving need for modernizing identity in Washington State" />
  <psc:chapter start="42:31" title="Steve shares some of the challenges to privacy, ethics and equity that he&#39;s come across and how his team has addressed them" />
  <psc:chapter start="46:12" title="Debra &amp; Steve discuss the differences between mobile driver&#39;s license (mDL) and verified credentials technology" />
  <psc:chapter start="48:59" title="Steve describes how the States are approaching their implementation of mobile driver&#39;s licenses (mDL) in different ways and articulates the benefits of &#39;selective disclosure&#39;" />
  <psc:chapter start="50:32" title="Steve&#39;s advice for privacy technologists so that they can best position themselves and their organizations at the forefront of privacy and security innovation" />
  <psc:chapter start="51:38" title="Steve lists the books and resources that he recommends so that people can learn more about the tipping point leadership style" />
</psc:chapters>
    <itunes:duration>3295</itunes:duration>
    <itunes:keywords>mobile drivers license, mDL, verified credentials, IAM, tipping point leadership, innovation, The Nonconformist Innovation Podcast</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>5</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E4: &#39;Supporting Developer Accountability for Privacy&#39; with Jake Ward (Data Protocol)</itunes:title>
    <title>S3E4: &#39;Supporting Developer Accountability for Privacy&#39; with Jake Ward (Data Protocol)</title>
    <itunes:summary><![CDATA[This week, I chat with Jake Ward, the Co-Founder and CEO of Data Protocol, to discuss how the Data Protocol platform supports developers' accountability for privacy by giving developers the relevant information in the way that they want it. Throughout the episode, we cover the Privacy Engineering course offerings and certification program; how to improve communication with developers; and trends that Jake sees across his customers after 2 years of offering these courses to engineers. I...]]></itunes:summary>
    <description><![CDATA[<p>This week, I chat with <a href='https://www.linkedin.com/in/jacobmward/'>Jake Ward</a>, the Co-Founder and CEO of <a href='https://app.dataprotocol.com/'>Data Protocol</a>, to discuss how the Data Protocol platform supports developers&apos; accountability for privacy by giving developers the relevant information in the way that they want it. Throughout the episode, we cover the Privacy Engineering course offerings and certification program; how to improve communication with developers; and trends that Jake sees across his customers after 2 years of offering these courses to engineers.<br/><br/>In our conversation, we dive into the topics covered in the Privacy Engineering Certification Program course offering, led by instructor Nishant Bhajaria, and the impact that engineers can make in their organization after completing it. Jake shares why he&apos;s so passionate about empowering developers, enabling them to build safer products. We talk about the effects of privacy engineering on large tech companies and how to bridge the gap between developers and the support they need with collaboration and accountability. Plus, Jake reflects on his own career path as the Press Secretary for a U.S. 
Senator and the experiences that shaped his perspectives and brought him to where he is now.<br/><br/><b>Topics Covered</b>: </p><ul><li>Jake’s career journey and why he landed on supporting software developers </li><li>How Jake build Data Protocol and it’s community </li><li>What &apos;shifting privacy left&apos; means to Jake</li><li>Data Protocol&apos;s Privacy Engineering Courses, Labs, &amp; Certification Program and what developers will take away</li><li>The difference between Data Protocol&apos;s free Privacy Courses and paid Certification</li><li>Feedback from customers and &amp; trends observed</li><li>Whether tech companies have seen improvement in engineers&apos; ability to embed privacy into the development of products &amp; services after completing the Privacy Engineering courses and labs </li><li>Other privacy-related courses available on Data Protocol, and privacy courses  on the roadmap</li><li>Ways to leverage communications to surmount current challenges</li><li>How organizations can make their developers accountable for privacy, and the importance of aligning responsibility, accountability &amp; business processes</li><li>How Debra would operationalize this accountability into an organization</li><li>How you can use the PrivacyCode.ai privacy tech platform to enable the operationalization of privacy accountability for developers</li></ul><p><b>Resources Mentioned</b>: </p><ul><li>Check out <a href='https://app.dataprotocol.com/'>Data Protocol&apos;s courses</a>, based on topic</li><li>Enroll in <a href='https://app.dataprotocol.com/certifications/1'>The Privacy Engineering Certification Program</a> (courses are free)</li><li>Check out <a href='https://podcasts.apple.com/us/podcast/s3e2-my-top-20-privacy-engineering-resources-for-2024/id1651019312?i=1000643143674'>S3E2: &apos;My Top 20 Privacy Engineering Resources for 2024&apos; </a></li></ul><p><b>Guest Info</b>: </p><ul><li>Connect with Jake on <a 
href='https://www.linkedin.com/in/jacobmward/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I chat with <a href='https://www.linkedin.com/in/jacobmward/'>Jake Ward</a>, the Co-Founder and CEO of <a href='https://app.dataprotocol.com/'>Data Protocol</a>, to discuss how the Data Protocol platform supports developers&apos; accountability for privacy by giving developers the relevant information in the way that they want it. Throughout the episode, we cover the Privacy Engineering course offerings and certification program; how to improve communication with developers; and trends that Jake sees across his customers after 2 years of offering these courses to engineers.<br/><br/>In our conversation, we dive into the topics covered in the Privacy Engineering Certification Program course offering, led by instructor Nishant Bhajaria, and the impact that engineers can make in their organization after completing it. Jake shares why he&apos;s so passionate about empowering developers, enabling them to build safer products. We talk about the effects of privacy engineering on large tech companies and how to bridge the gap between developers and the support they need with collaboration and accountability. Plus, Jake reflects on his own career path as the Press Secretary for a U.S. Senator and the experiences that shaped his perspectives and brought him to where he is now.<br/><br/><b>Topics Covered</b>: </p><ul><li>Jake’s career journey and why he landed on supporting software developers</li><li>How Jake built Data Protocol and its community</li><li>What &apos;shifting privacy left&apos; means to Jake</li><li>Data Protocol&apos;s Privacy Engineering Courses, Labs, &amp; Certification Program and what developers will take away</li><li>The difference between Data Protocol&apos;s free Privacy Courses and paid Certification</li><li>Feedback from customers and trends observed</li><li>Whether tech companies have seen improvement in engineers&apos; ability to embed privacy into the development of products &amp; services after completing the Privacy Engineering courses and labs</li><li>Other privacy-related courses available on Data Protocol, and privacy courses on the roadmap</li><li>Ways to leverage communications to surmount current challenges</li><li>How organizations can make their developers accountable for privacy, and the importance of aligning responsibility, accountability &amp; business processes</li><li>How Debra would operationalize this accountability into an organization</li><li>How you can use the PrivacyCode.ai privacy tech platform to enable the operationalization of privacy accountability for developers</li></ul><p><b>Resources Mentioned</b>: </p><ul><li>Check out <a href='https://app.dataprotocol.com/'>Data Protocol&apos;s courses</a>, based on topic</li><li>Enroll in <a href='https://app.dataprotocol.com/certifications/1'>The Privacy Engineering Certification Program</a> (courses are free)</li><li>Check out <a href='https://podcasts.apple.com/us/podcast/s3e2-my-top-20-privacy-engineering-resources-for-2024/id1651019312?i=1000643143674'>S3E2: &apos;My Top 20 Privacy Engineering Resources for 2024&apos;</a></li></ul><p><b>Guest Info</b>: </p><ul><li>Connect with Jake on <a 
href='https://www.linkedin.com/in/jacobmward/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14467641-s3e4-supporting-developer-accountability-for-privacy-with-jake-ward-data-protocol.mp3" length="32203678" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/vjaiggbg9w9s5nauck36829iab9z?.jpg" />
    <itunes:author>Debra J Farber / Jake Ward</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14467641</guid>
    <pubDate>Tue, 13 Feb 2024 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467641/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467641/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467641/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14467641/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14467641/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E4: &#39;Supporting Developer Accountability for Privacy&#39; with Jake Ward (Data Protocol)" />
  <psc:chapter start="1:20" title="Introducing Jake Ward, Co-Founder &amp; CEO at Data Protocol" />
  <psc:chapter start="2:35" title="Jake&#39;s origin story, including his experience on Capitol Hill as a Press Secretary for a U.S. Senator" />
  <psc:chapter start="5:49" title="The Data Protocol platform, which offers educational support for developers" />
  <psc:chapter start="11:41" title="What &#39;Shifting Privacy Left&#39; means to Jake" />
  <psc:chapter start="15:24" title="Data Protocol&#39;s Privacy Engineering Courses &amp; Certification Program" />
  <psc:chapter start="20:09" title="What developers will learn from taking Data Protocol&#39;s Privacy Engineering Courses" />
  <psc:chapter start="21:52" title="The difference between Data Protocol&#39;s free Privacy Courses and paid Certification" />
  <psc:chapter start="23:52" title="Feedback on the course and trends observed" />
  <psc:chapter start="26:19" title="Whether tech companies have seen improvement in engineers&#39; ability to embed privacy into the development of products &amp; services" />
  <psc:chapter start="27:49" title="Jake discusses other available privacy-related courses, courses on the roadmap, and ways to leverage communications to surmount current challenges" />
  <psc:chapter start="33:04" title="How organizations can make their developers accountable for privacy" />
  <psc:chapter start="38:04" title="Debra describes how she would operationalize this accountability into an organization and Jake shares his thoughts" />
  <psc:chapter start="42:11" title="Debra highlights PrivacyCode.ai, a privacy tech platform that enables the operationalization of privacy accountability for developers, aligning to other areas of the business" />
</psc:chapters>
    <itunes:duration>2680</itunes:duration>
    <itunes:keywords>developer support, privacy accountability, Data Protocol, Privacy Engineering Certification, Privacy Engineering courses</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>4</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E3: &#39;Shifting Left from Practicing Attorney to Privacy Engineer’ with Jay Averitt (Microsoft)</itunes:title>
    <title>S3E3: &#39;Shifting Left from Practicing Attorney to Privacy Engineer’ with Jay Averitt (Microsoft)</title>
    <itunes:summary><![CDATA[My guest this week is Jay Averitt, Senior Privacy Product Manager and Privacy Engineer at Microsoft, where he transitioned his career from Technology Attorney to Privacy Counsel, and most recently to Privacy Engineer.  In this episode, we hear from Jay about: his professional path from a degree in Management Information Systems to Privacy Engineer; how Twitter and Microsoft navigated a privacy setup, and how to determine privacy program maturity; multiple of his Privacy Engineering community ...]]></itunes:summary>
    <description><![CDATA[<p>My guest this week is <a href='https://www.linkedin.com/in/jay-averitt/'>Jay Averitt</a>, Senior Privacy Product Manager and Privacy Engineer at Microsoft, where he transitioned his career from Technology Attorney to Privacy Counsel, and most recently to Privacy Engineer.<br/><br/>In this episode, we hear from Jay about: his professional path from a degree in Management Information Systems to Privacy Engineer; how Twitter and Microsoft each navigated their privacy setups, and how to determine privacy program maturity; several of his Privacy Engineering community projects; and tips on how to spread privacy awareness and stay active within the industry.</p><p><br/><b>Topics Covered</b>:</p><ul><li>Jay’s unique professional journey from Attorney to Privacy Engineer</li><li>Jay’s big mindset shift from serving as Privacy Counsel to Privacy Engineer, from a day-to-day and internal perspective</li><li>Why constant learning is essential in the field of privacy engineering, requiring us to keep up with ever-changing laws, standards, and technologies</li><li>Jay’s comparison of what it&apos;s like to work for Twitter vs. Microsoft when it comes to how each company focuses on privacy and data protection</li><li>Two ways to determine Privacy Program Maturity, according to Jay</li><li>How engineering-focused organizations can unify around a corporate privacy strategy and how privacy pros can connect to people beyond their siloed teams</li><li>Why building and maintaining relationships is the key for privacy engineers to be seen as enablers instead of blockers</li><li>A detailed look at the &apos;Technical Privacy Review&apos; process</li><li>A peek into Privacy Quest’s gamified privacy engineering platform and the events that Jay &amp; Debra are leading as part of its DPD&apos;24 Festival Village month-long puzzles and events</li><li>Debra&apos;s &amp; Jay&apos;s experiences at the USENIX PEPR&apos;23; why it provided so much value for them both; and why you should consider attending PEPR&apos;24</li><li>Ways to utilize online Slack communities, LinkedIn, and other tools to stay active in the privacy engineering world</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Review talks from the University of Illinois <a href='https://cybersecurity.illinois.edu/2024-privacy-everywhere-conference/'>&apos;Privacy Everywhere Conference 2024&apos;</a></li><li>Join the Privacy Quest Village&apos;s &apos;<a href='https://play.privacyquest.org/quests/privacy-quest-village-dpd24-festival-hub'>Data Privacy Day’24 Festival</a>&apos; (through Feb 18th)</li><li>Submit a Proposal / Register for the <a href='https://www.usenix.org/conference/pepr24'>USENIX PEPR ‘24 Conference</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Connect with Jay on <a href='https://www.linkedin.com/in/jay-averitt/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>My guest this week is <a href='https://www.linkedin.com/in/jay-averitt/'>Jay Averitt</a>, Senior Privacy Product Manager and Privacy Engineer at Microsoft, where he transitioned his career from Technology Attorney to Privacy Counsel, and most recently to Privacy Engineer.<br/><br/>In this episode, we hear from Jay about: his professional path from a degree in Management Information Systems to Privacy Engineer; how Twitter and Microsoft each navigated their privacy setups, and how to determine privacy program maturity; several of his Privacy Engineering community projects; and tips on how to spread privacy awareness and stay active within the industry.</p><p><br/><b>Topics Covered</b>:</p><ul><li>Jay’s unique professional journey from Attorney to Privacy Engineer</li><li>Jay’s big mindset shift from serving as Privacy Counsel to Privacy Engineer, from a day-to-day and internal perspective</li><li>Why constant learning is essential in the field of privacy engineering, requiring us to keep up with ever-changing laws, standards, and technologies</li><li>Jay’s comparison of what it&apos;s like to work for Twitter vs. Microsoft when it comes to how each company focuses on privacy and data protection</li><li>Two ways to determine Privacy Program Maturity, according to Jay</li><li>How engineering-focused organizations can unify around a corporate privacy strategy and how privacy pros can connect to people beyond their siloed teams</li><li>Why building and maintaining relationships is the key for privacy engineers to be seen as enablers instead of blockers</li><li>A detailed look at the &apos;Technical Privacy Review&apos; process</li><li>A peek into Privacy Quest’s gamified privacy engineering platform and the events that Jay &amp; Debra are leading as part of its DPD&apos;24 Festival Village month-long puzzles and events</li><li>Debra&apos;s &amp; Jay&apos;s experiences at the USENIX PEPR&apos;23; why it provided so much value for them both; and why you should consider attending PEPR&apos;24</li><li>Ways to utilize online Slack communities, LinkedIn, and other tools to stay active in the privacy engineering world</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Review talks from the University of Illinois <a href='https://cybersecurity.illinois.edu/2024-privacy-everywhere-conference/'>&apos;Privacy Everywhere Conference 2024&apos;</a></li><li>Join the Privacy Quest Village&apos;s &apos;<a href='https://play.privacyquest.org/quests/privacy-quest-village-dpd24-festival-hub'>Data Privacy Day’24 Festival</a>&apos; (through Feb 18th)</li><li>Submit a Proposal / Register for the <a href='https://www.usenix.org/conference/pepr24'>USENIX PEPR ‘24 Conference</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Connect with Jay on <a href='https://www.linkedin.com/in/jay-averitt/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14389554-s3e3-shifting-left-from-practicing-attorney-to-privacy-engineer-with-jay-averitt-microsoft.mp3" length="37375513" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/jb14j7n2jnayjlf01lxr6m0m9gye?.jpg" />
    <itunes:author>Debra J. Farber / Jay Averitt</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14389554</guid>
    <pubDate>Tue, 30 Jan 2024 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14389554/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14389554/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14389554/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14389554/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14389554/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="Introducing Jay Averitt, Sr. Privacy Product Manager &amp; Privacy Engineer (Microsoft)" />
  <psc:chapter start="9:05" title="Shifting Privacy Left from a Legal Mindset to a Privacy Engineering Mindset" />
  <psc:chapter start="24:49" title="Building Relationships within Your Org and Enabling Privacy Engineering" />
  <psc:chapter start="30:02" title="Conducting Technical Privacy Reviews" />
  <psc:chapter start="35:03" title="Jay Shares His Data Privacy Day Activities Related to Privacy Engineering" />
  <psc:chapter start="46:54" title="How Jay Stays Updated on Privacy Engineering Topics" />
</psc:chapters>
    <itunes:duration>3111</itunes:duration>
    <itunes:keywords>shifting left, privacy engineer, privacy counsel, technical privacy reviews</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>3</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E2: &#39;My Top 20 Privacy Engineering Resources for 2024&#39; with Debra Farber (Shifting Privacy Left)</itunes:title>
    <title>S3E2: &#39;My Top 20 Privacy Engineering Resources for 2024&#39; with Debra Farber (Shifting Privacy Left)</title>
    <itunes:summary><![CDATA[In Honor of Data Privacy Week 2024, we're publishing a special episode. Instead of interviewing a guest, Debra shares her 'Top 20 Privacy Engineering Resources' and why. Check out her favorite free privacy engineering courses, books, podcasts, creative learning platforms, privacy threat modeling frameworks, conferences, government resources, and more.  DEBRA's TOP 20 PRIVACY ENGINEERING RESOURCES (in no particular order) Privado's Free Course: 'Technical Privacy Masterclass'OpenMined's Free C...]]></itunes:summary>
    <description><![CDATA[<p>In Honor of Data Privacy Week 2024, we&apos;re publishing a special episode. Instead of interviewing a guest, Debra shares her &apos;Top 20 Privacy Engineering Resources&apos; and why. Check out her favorite free privacy engineering courses, books, podcasts, creative learning platforms, privacy threat modeling frameworks, conferences, government resources, and more.<br/><br/><b>DEBRA&apos;s</b> <b>TOP 20 PRIVACY ENGINEERING RESOURCES</b> (in no particular order)</p><ol><li>Privado&apos;s Free Course: &apos;Technical Privacy Masterclass&apos;</li><li>OpenMined&apos;s Free Course: &apos;Our Privacy Opportunity&apos;</li><li>Data Protocol&apos;s Privacy Engineering Certification Program</li><li>The Privacy Quest Platform &amp; Games; Bonus: The Hitchhiker&apos;s Guide to Privacy Engineering</li><li>&apos;Data Privacy: a runbook for engineers&apos; by Nishant Bhajaria</li><li>&apos;Privacy Engineering, a Data Flow and Ontological Approach&apos; by Ian Oliver</li><li>&apos;Practical Data Privacy: enhancing privacy and security in data&apos; by Katharine Jarmul</li><li>&apos;Strategic Privacy by Design&apos;, 2nd Edition, by R. Jason Cronk</li><li>&apos;The Privacy Engineer&apos;s Manifesto: getting from policy to code to QA to value&apos; by Michelle Finneran-Dennedy, Jonathan Fox and Thomas R. Finneran</li><li>USENIX Conference on Privacy Engineering Practice and Respect (PEPR)</li><li>IEEE&apos;s International Workshop on Privacy Engineering (IWPE)</li><li>Institute of Operational Privacy Design (IOPD)</li><li>&apos;The Shifting Privacy Left Podcast,&apos; produced and hosted by Debra J Farber and sponsored by Privado</li><li>Monitaur&apos;s &apos;The AI Fundamentalists Podcast&apos; hosted by Andrew Clark &amp; Sid Mangalik</li><li>Skyflow&apos;s &apos;Partially Redacted Podcast&apos; with Sean Falconer</li><li>The LINDDUN Privacy Threat Model Framework &amp; LINDDUN GO Card Game</li><li>The Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) Framework &amp; PLOT4ai Card Game</li><li>The IAPP Privacy Engineering Section</li><li>The NIST Privacy Engineering Program Collaboration Space</li><li>The EDPS Internet Privacy Engineering Network (IPEN)</li></ol><p>Read “<a href='https://www.privado.ai/post/privacy-engineering-resources'>Top 20 Privacy Engineering Resources</a>” on Privado’s Blog.</p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In Honor of Data Privacy Week 2024, we&apos;re publishing a special episode. Instead of interviewing a guest, Debra shares her &apos;Top 20 Privacy Engineering Resources&apos; and why. Check out her favorite free privacy engineering courses, books, podcasts, creative learning platforms, privacy threat modeling frameworks, conferences, government resources, and more.<br/><br/><b>DEBRA&apos;s</b> <b>TOP 20 PRIVACY ENGINEERING RESOURCES</b> (in no particular order)</p><ol><li>Privado&apos;s Free Course: &apos;Technical Privacy Masterclass&apos;</li><li>OpenMined&apos;s Free Course: &apos;Our Privacy Opportunity&apos;</li><li>Data Protocol&apos;s Privacy Engineering Certification Program</li><li>The Privacy Quest Platform &amp; Games; Bonus: The Hitchhiker&apos;s Guide to Privacy Engineering</li><li>&apos;Data Privacy: a runbook for engineers&apos; by Nishant Bhajaria</li><li>&apos;Privacy Engineering, a Data Flow and Ontological Approach&apos; by Ian Oliver</li><li>&apos;Practical Data Privacy: enhancing privacy and security in data&apos; by Katharine Jarmul</li><li>&apos;Strategic Privacy by Design&apos;, 2nd Edition, by R. Jason Cronk</li><li>&apos;The Privacy Engineer&apos;s Manifesto: getting from policy to code to QA to value&apos; by Michelle Finneran-Dennedy, Jonathan Fox and Thomas R. Finneran</li><li>USENIX Conference on Privacy Engineering Practice and Respect (PEPR)</li><li>IEEE&apos;s International Workshop on Privacy Engineering (IWPE)</li><li>Institute of Operational Privacy Design (IOPD)</li><li>&apos;The Shifting Privacy Left Podcast,&apos; produced and hosted by Debra J Farber and sponsored by Privado</li><li>Monitaur&apos;s &apos;The AI Fundamentalists Podcast&apos; hosted by Andrew Clark &amp; Sid Mangalik</li><li>Skyflow&apos;s &apos;Partially Redacted Podcast&apos; with Sean Falconer</li><li>The LINDDUN Privacy Threat Model Framework &amp; LINDDUN GO Card Game</li><li>The Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) Framework &amp; PLOT4ai Card Game</li><li>The IAPP Privacy Engineering Section</li><li>The NIST Privacy Engineering Program Collaboration Space</li><li>The EDPS Internet Privacy Engineering Network (IPEN)</li></ol><p>Read “<a href='https://www.privado.ai/post/privacy-engineering-resources'>Top 20 Privacy Engineering Resources</a>” on Privado’s Blog.</p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14385777-s3e2-my-top-20-privacy-engineering-resources-for-2024-with-debra-farber-shifting-privacy-left.mp3" length="39085229" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/pcpxu6svon47lk1ztx7j6oo8jjdb?.jpg" />
    <itunes:author>Debra J. Farber (Shifting Privacy Left)</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14385777</guid>
    <pubDate>Tue, 23 Jan 2024 14:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14385777/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14385777/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14385777/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14385777/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14385777/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E2: &#39;My Top 20 Privacy Engineering Resources for 2024&#39; with Debra Farber (Shifting Privacy Left)" />
  <psc:chapter start="2:39" title="1) Free Course: Privado&#39;s Technical Privacy Masterclass" />
  <psc:chapter start="5:28" title="2) Free Course: Our Privacy Opportunity" />
  <psc:chapter start="10:20" title="3) Free Course / Paid Cert: Data Protocol&#39;s Privacy Engineering Certification Program" />
  <psc:chapter start="14:22" title="4) Creative Privacy: Privacy Quest Platform &amp; Games; Bonus: The Hitchhiker&#39;s Guide to Privacy Engineering" />
  <psc:chapter start="19:00" title="5) Book: Data Privacy - a runbook for engineers" />
  <psc:chapter start="20:48" title="6) Book: Privacy Engineering, a Data Flow and Ontological Approach" />
  <psc:chapter start="22:09" title="7) Book: Practical Data Privacy, enhancing privacy and security in data" />
  <psc:chapter start="24:17" title="8) IAPP Textbook: Strategic Privacy by Design, 2nd Edition" />
  <psc:chapter start="27:19" title="9) Book: The Privacy Engineer&#39;s Manifesto: getting from policy to code to QA to value" />
  <psc:chapter start="29:23" title="10) USENIX Conference: Privacy Engineering Practice and Respect (PEPR)" />
  <psc:chapter start="32:33" title="11) IEEE Workshop: The International Workshop on Privacy Engineering (IWPE)" />
  <psc:chapter start="34:16" title="12) Non-Profit: Institute of Operational Privacy Design (IOPD)" />
  <psc:chapter start="36:43" title="13) Independent Podcast: The Shifting Privacy Left Podcast" />
  <psc:chapter start="38:58" title="14) Monitaur&#39;s Podcast: The AI Fundamentalists" />
  <psc:chapter start="40:42" title="15) Skyflow&#39;s Podcast: Partially Redacted" />
  <psc:chapter start="41:27" title="16) Threat Model Framework: LINDDUN Privacy Threat Model Framework &amp; LINDDUN GO Card Game" />
  <psc:chapter start="43:24" title="17) Threat Model Framework: The Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) Framework &amp; PLOT4ai Card Game" />
  <psc:chapter start="45:13" title="18) IAPP Privacy Engineering Section" />
  <psc:chapter start="46:57" title="19) NIST Privacy Engineering Program Collaboration Space" />
  <psc:chapter start="49:36" title="20) EDPS Internet Privacy Engineering Network (IPEN)" />
</psc:chapters>
    <itunes:duration>3253</itunes:duration>
    <itunes:keywords>privacy engineering resources</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>2</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S3E1: &quot;Privacy-preserving Machine Learning and NLP&quot; with Patricia Thaine (Private AI)</itunes:title>
    <title>S3E1: &quot;Privacy-preserving Machine Learning and NLP&quot; with Patricia Thaine (Private AI)</title>
    <itunes:summary><![CDATA[My guest this week is Patricia Thaine, Co-founder and CEO of Private AI, where she leads a team of experts in developing cutting-edge solutions using AI to identify, reduce, and remove Personally Identifiable Information (PII) in 52 languages across text, audio, images, and documents.  In this episode, we hear from Patricia about: her transition from starting a Ph.D. to co-founding an AI company; how Private AI set out to solve fundamental privacy problems to provide control and understanding...]]></itunes:summary>
    <description><![CDATA[<p>My guest this week is <a href='https://www.linkedin.com/in/patricia-thaine/'>Patricia Thaine</a>, Co-founder and CEO of <a href='https://www.private-ai.com/'>Private AI,</a> where she leads a team of experts in developing cutting-edge solutions using AI to identify, reduce, and remove Personally Identifiable Information (PII) in 52 languages across text, audio, images, and documents.<br/><br/>In this episode, we hear from Patricia about: her transition from starting a Ph.D. to co-founding an AI company; how Private AI set out to solve fundamental privacy problems to provide control and understanding of data collection; misunderstandings about how best to leverage AI regarding privacy-preserving machine learning; Private AI’s intention when designing their software, plus newly deployed features; and whether global AI regulations can help with current risks around privacy, rogue AI and copyright.<br/><br/><b>Topics Covered</b>:</p><ul><li>Patricia’s professional journey from starting a Ph.D. 
in Acoustic Forensics to co-founding an AI company</li><li>Why Private AI’s mission is to solve privacy problems and create a platform for developers to modularly and flexibly integrate it anywhere they want in their software pipeline, including model ingress &amp; egress</li><li>How companies can avoid mishandling personal information when leveraging AI / machine learning, and Patricia’s advice for doing so</li><li>Why keeping track of ever-changing data collection and regulations makes it hard to find personal information</li><li>Private AI&apos;s privacy-enabling architectural approach to finding personal data to prevent it from being used by or stored in an AI model</li><li>The approach that Private AI took to design its software</li><li>Private AI&apos;s extremely high matching rate, and how they aim for 99%+ accuracy</li><li>Private AI&apos;s roadmap &amp; R&amp;D efforts</li><li>Debra &amp; Patricia discuss AI Regulation and Patricia&apos;s insights from her article &apos;Thoughts on AI Regulation&apos;</li><li>A foreshadowing of AI’s copyright risk problem and whether regulations or licenses can help</li><li>ChatGPT’s popularity, copyright, and the need for embedding privacy, security, and safety by design from the beginning (in the MVP)</li><li>How to reach out to Patricia to connect, collaborate, or access a demo</li><li>How thinking about the fundamentals goes a long way toward ensuring privacy &amp; security</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li>Read: Yoshua Bengio’s blog post: <a href='https://yoshuabengio.org/2023/05/22/how-rogue-ais-may-arise/'>&quot;How Rogue AIs May Arise&quot;</a></li><li>Read: <a href='https://www.microsoft.com/en-us/security/security-insider/microsoft-digital-defense-report-2023'>Microsoft&apos;s Digital Defense Report 2023</a></li><li>Read Patricia’s article, <a
href='https://www.linkedin.com/pulse/thoughts-ai-regulation-patricia-thaine-x2ubc/?trackingId=tylqAD0iSyCwM6fyRFpe%2Fw%3D%3D'>“Thoughts on AI Regulation”</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Connect with Patricia on <a href='https://www.linkedin.com/in/patricia-thaine/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>My guest this week is <a href='https://www.linkedin.com/in/patricia-thaine/'>Patricia Thaine</a>, Co-founder and CEO of <a href='https://www.private-ai.com/'>Private AI,</a> where she leads a team of experts in developing cutting-edge solutions using AI to identify, reduce, and remove Personally Identifiable Information (PII) in 52 languages across text, audio, images, and documents.<br/><br/>In this episode, we hear from Patricia about: her transition from starting a Ph.D. to co-founding an AI company; how Private AI set out to solve fundamental privacy problems to provide control and understanding of data collection; misunderstandings about how best to leverage AI regarding privacy-preserving machine learning; Private AI’s intention when designing their software, plus newly deployed features; and whether global AI regulations can help with current risks around privacy, rogue AI and copyright.<br/><br/><b>Topics Covered</b>:</p><ul><li>Patricia’s professional journey from starting a Ph.D. 
in Acoustic Forensics to co-founding an AI company</li><li>Why Private AI’s mission is to solve privacy problems and create a platform for developers to modularly and flexibly integrate it anywhere they want in their software pipeline, including model ingress &amp; egress</li><li>How companies can avoid mishandling personal information when leveraging AI / machine learning, and Patricia’s advice for doing so</li><li>Why keeping track of ever-changing data collection and regulations makes it hard to find personal information</li><li>Private AI&apos;s privacy-enabling architectural approach to finding personal data to prevent it from being used by or stored in an AI model</li><li>The approach that Private AI took to design its software</li><li>Private AI&apos;s extremely high matching rate, and how they aim for 99%+ accuracy</li><li>Private AI&apos;s roadmap &amp; R&amp;D efforts</li><li>Debra &amp; Patricia discuss AI Regulation and Patricia&apos;s insights from her article &apos;Thoughts on AI Regulation&apos;</li><li>A foreshadowing of AI’s copyright risk problem and whether regulations or licenses can help</li><li>ChatGPT’s popularity, copyright, and the need for embedding privacy, security, and safety by design from the beginning (in the MVP)</li><li>How to reach out to Patricia to connect, collaborate, or access a demo</li><li>How thinking about the fundamentals goes a long way toward ensuring privacy &amp; security</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li>Read: Yoshua Bengio’s blog post: <a href='https://yoshuabengio.org/2023/05/22/how-rogue-ais-may-arise/'>&quot;How Rogue AIs May Arise&quot;</a></li><li>Read: <a href='https://www.microsoft.com/en-us/security/security-insider/microsoft-digital-defense-report-2023'>Microsoft&apos;s Digital Defense Report 2023</a></li><li>Read Patricia’s article, <a
href='https://www.linkedin.com/pulse/thoughts-ai-regulation-patricia-thaine-x2ubc/?trackingId=tylqAD0iSyCwM6fyRFpe%2Fw%3D%3D'>“Thoughts on AI Regulation”</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Connect with Patricia on <a href='https://www.linkedin.com/in/patricia-thaine/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.trustaffingpartners.com/data-privacy-solutions">TRU Staffing Partners</a><br>Top privacy talent - when you need it, where you need it.<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14215661-s3e1-privacy-preserving-machine-learning-and-nlp-with-patricia-thaine-private-ai.mp3" length="26610236" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/igm06spen3geqj3e3mo46oyyqz1o?.jpg" />
    <itunes:author>Debra J Farber / Patricia Thaine</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14215661</guid>
    <pubDate>Tue, 02 Jan 2024 12:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14215661/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14215661/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14215661/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14215661/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14215661/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S3E1: &quot;Privacy-preserving Machine Learning and NLP&quot; with Patricia Thaine (Private AI)" />
  <psc:chapter start="1:38" title="Introducing Patricia Thaine, Founder &amp; CEO at Private AI." />
  <psc:chapter start="3:35" title="Why Patricia chose to co-found Private AI, the company&#39;s mission, and some key privacy-enabling features " />
  <psc:chapter start="7:26" title="How companies can avoid mishandling personal information when leveraging AI / machine learning" />
  <psc:chapter start="8:56" title="Why it is so difficult to discover personal information in the first place" />
  <psc:chapter start="12:10" title="Private AI&#39;s privacy-enabling architectural approach to finding personal data and preventing it from being used by or stored in an AI model" />
  <psc:chapter start="13:51" title="Private AI&#39;s extremely high matching rate, and how they aim for 99%+ accuracy" />
  <psc:chapter start="15:21" title="Private AI&#39;s roadmap &amp; R&amp;D efforts" />
  <psc:chapter start="17:40" title="Debra &amp; Patricia discuss AI Regulation and Patricia&#39;s insights from her article &#39;Thoughts on AI Regulation&#39;" />
  <psc:chapter start="28:40" title="The importance of licensing data sets to respect copyright and enfranchise consumers" />
  <psc:chapter start="34:35" title="How listeners can reach out to Patricia, collaborate, or access a demo" />
  <psc:chapter start="35:21" title="How thinking about the fundamentals goes a long way toward ensuring privacy &amp; security" />
</psc:chapters>
    <itunes:duration>2214</itunes:duration>
    <itunes:keywords>#AI, Private AI, Patricia Thaine</itunes:keywords>
    <itunes:season>3</itunes:season>
    <itunes:episode>1</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E39: &#39;Contextual Responsive Intelligence &amp; Data Minimization for AI Training &amp; Testing&#39; with Kevin Killens (AHvos)</itunes:title>
    <title>S2E39: &#39;Contextual Responsive Intelligence &amp; Data Minimization for AI Training &amp; Testing&#39; with Kevin Killens (AHvos)</title>
    <itunes:summary><![CDATA[My guest this week is Kevin Killens, CEO of AHvos, a technology service that provides AI solutions for data-heavy businesses using a proprietary technology called Contextually Responsive Intelligence (CRI), which can act upon a business's private data and produce results without storing that data.  In this episode, we delve into this technology and learn more from Kevin about: his transition from serving in the Navy to founding an AI-focused company; AHvos’ architectural approach in support o...]]></itunes:summary>
    <description><![CDATA[<p>My guest this week is Kevin Killens, CEO of AHvos, a technology service that provides AI solutions for data-heavy businesses using a proprietary technology called Contextually Responsive Intelligence (CRI), which can act upon a business&apos;s private data and produce results without storing that data.<br/><br/>In this episode, we delve into this technology and learn more from Kevin about: his transition from serving in the Navy to founding an AI-focused company; AHvos’ architectural approach in support of data minimization and reduced attack surface; AHvos&apos; CRI technology and its ability to provide accurate answers based on private data sets; and how AHvos’ Data Crucible product helps AI teams to identify and correct inaccurate dataset labels.<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Kevin’s origin story, from serving in the Navy to founding AHvos</li><li>How Kevin thinks about privacy and the architectural approach he took when building AHvos</li><li>The challenges of processing personal data, &apos;security for privacy,&apos; and the applicability of the GDPR when using AHvos</li><li>Kevin explains the benefits of Contextually Responsive Intelligence (CRI), which abstracts out raw data to protect privacy; finds &amp; creates relevant data in response to a query; and identifies &amp; corrects inaccurate dataset labels</li><li>How human-created algorithms and oversight influence AI parameters and model bias; and why transparency is so important</li><li>How customer data is ingested into models via AHvos</li><li>Why it is important to remove bias from Testing Data, not only Training Data; and how AHvos ensures accuracy</li><li>How AHvos&apos; Data Crucible identifies &amp; corrects inaccurate data set labels</li><li>Kevin&apos;s advice for privacy engineers as they tackle AI challenges in their own organizations</li><li>The impact of technical debt on companies and the importance of building slowly &amp; correctly 
rather than racing to market with insecure and biased AI models</li><li>The importance of baking security and privacy into your minimum viable product (MVP), even for products that are still in &apos;beta&apos; </li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Kevin on <a href='https://www.linkedin.com/in/kevinlkillens/'>LinkedIn</a></li><li>Check out <a href='https://www.ahvos.com/'>AHvos</a></li><li>Check out <a href='https://trinsic.id/'>Trinsic Technologies</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>My guest this week is Kevin Killens, CEO of AHvos, a technology service that provides AI solutions for data-heavy businesses using a proprietary technology called Contextually Responsive Intelligence (CRI), which can act upon a business&apos;s private data and produce results without storing that data.<br/><br/>In this episode, we delve into this technology and learn more from Kevin about: his transition from serving in the Navy to founding an AI-focused company; AHvos’ architectural approach in support of data minimization and reduced attack surface; AHvos&apos; CRI technology and its ability to provide accurate answers based on private data sets; and how AHvos’ Data Crucible product helps AI teams to identify and correct inaccurate dataset labels.<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Kevin’s origin story, from serving in the Navy to founding AHvos</li><li>How Kevin thinks about privacy and the architectural approach he took when building AHvos</li><li>The challenges of processing personal data, &apos;security for privacy,&apos; and the applicability of the GDPR when using AHvos</li><li>Kevin explains the benefits of Contextually Responsive Intelligence (CRI), which abstracts out raw data to protect privacy; finds &amp; creates relevant data in response to a query; and identifies &amp; corrects inaccurate dataset labels</li><li>How human-created algorithms and oversight influence AI parameters and model bias; and why transparency is so important</li><li>How customer data is ingested into models via AHvos</li><li>Why it is important to remove bias from Testing Data, not only Training Data; and how AHvos ensures accuracy</li><li>How AHvos&apos; Data Crucible identifies &amp; corrects inaccurate data set labels</li><li>Kevin&apos;s advice for privacy engineers as they tackle AI challenges in their own organizations</li><li>The impact of technical debt on companies and the importance of building slowly &amp; correctly 
rather than racing to market with insecure and biased AI models</li><li>The importance of baking security and privacy into your minimum viable product (MVP), even for products that are still in &apos;beta&apos; </li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Kevin on <a href='https://www.linkedin.com/in/kevinlkillens/'>LinkedIn</a></li><li>Check out <a href='https://www.ahvos.com/'>AHvos</a></li><li>Check out <a href='https://trinsic.id/'>Trinsic Technologies</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14188363-s2e39-contextual-responsive-intelligence-data-minimization-for-ai-training-testing-with-kevin-killens-ahvos.mp3" length="31244888" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/bq7s34jnhixjmc85cjw3oeuqkz0b?.jpg" />
    <itunes:author>Debra J. Farber / Kevin Killens</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14188363</guid>
    <pubDate>Tue, 26 Dec 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14188363/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14188363/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14188363/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14188363/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14188363/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E39: &#39;Contextual Responsive Intelligence &amp; Data Minimization for AI Training &amp; Testing&#39; with Kevin Killens (AHvos)" />
  <psc:chapter start="1:49" title="Introducing Kevin Killens, Founder &amp; CEO at AHvos" />
  <psc:chapter start="3:19" title="Kevin tells us his origin story and how that led him to found AHvos" />
  <psc:chapter start="6:49" title="How Kevin thinks about privacy and the architectural approach he took when building AHvos" />
  <psc:chapter start="10:42" title="Debra &amp; Kevin discuss processing personal data, &quot;Security for Privacy,&quot; and the applicability of the GDPR when using AHvos; Kevin tells us about AHvos&#39; Contextually Responsive Intelligence (CRI)" />
  <psc:chapter start="15:42" title="Kevin describes several use cases for CRI, including the ability to identify and correct inaccurate dataset labels" />
  <psc:chapter start="18:44" title="Kevin tells us about the leading cause of AI model bias; and why transparency is so important" />
  <psc:chapter start="22:41" title="Kevin delves deeper into how customer data is ingested into models via AHvos and leveraging Trinsic as a backend" />
  <psc:chapter start="25:02" title="Why it is important to remove bias from Testing Data, not only Training Data" />
  <psc:chapter start="29:43" title="Kevin tells us about Data Crucible, AHvos&#39; solution for identifying and correcting inaccurate data set labels" />
  <psc:chapter start="32:42" title="Kevin&#39;s advice for privacy engineers as they tackle AI challenges in their own organizations" />
  <psc:chapter start="35:38" title="Debra &amp; Kevin discuss the impact of technical debt and the importance of building slowly and correctly rather than racing to market with insecure and biased AI" />
  <psc:chapter start="41:49" title="How to reach out to Kevin and learn more about AHvos" />
</psc:chapters>
    <itunes:duration>2600</itunes:duration>
    <itunes:keywords>AHvos, Data Crucible, Contextually Responsive Intelligence, CRI, Data Minimization, AI bias, AI accuracy, Trinsic Technologies</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>39</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E38: &quot;PrivacyGPT: Bringing an AI Privacy Startup to Market&quot; with Nabanita De (Privacy License)</itunes:title>
    <title>S2E38: &quot;PrivacyGPT: Bringing an AI Privacy Startup to Market&quot; with Nabanita De (Privacy License)</title>
    <itunes:summary><![CDATA[My guest this week is Nabanita De, Software Engineer, Serial Entrepreneur, and Founder &amp; CEO at Privacy License, where she's on a mission to transform the AI landscape. In this episode, we discuss Nabanita's transition from Engineering Manager at Remitly to startup founder; what she's learned from her experience at Antler's accelerator program; her first product to market, PrivacyGPT; and her work to educate Privacy Champions.    Topics Covered: Nabanita’s origin story, from conducting...]]></itunes:summary>
    <description><![CDATA[<p>My guest this week is Nabanita De, Software Engineer, Serial Entrepreneur, and Founder &amp; CEO at Privacy License, where she&apos;s on a mission to transform the AI landscape. In this episode, we discuss Nabanita&apos;s transition from Engineering Manager at Remitly to startup founder; what she&apos;s learned from her experience at Antler&apos;s accelerator program; her first product to market, PrivacyGPT; and her work to educate Privacy Champions.<br/><br/><b>Topics Covered:</b></p><ul><li>Nabanita’s origin story, from conducting AI research at Microsoft as an intern all the way to founding Privacy License</li><li>How Privacy License supports enterprises entering the global market while protecting privacy as a human right</li><li>A comparison of Nabanita&apos;s corporate role as Privacy Engineering Manager at Remitly with her entrepreneurial role as Founder-in-Residence at Antler</li><li>How PrivacyGPT, a Chrome browser plugin, empowers people to use ChatGPT with added privacy protections and without compromising data privacy standards by redacting sensitive and personal data before sending to ChatGPT</li><li>NLP techniques that Nabanita leveraged to build out PrivacyGPT, including: &apos;regular expressions,&apos; &apos;part-of-speech tagging,&apos; &amp; &apos;named entity recognition&apos;</li><li>How PrivacyGPT can be used to protect privacy across nearly all languages, even where a user has no Internet connection</li><li>How to use Product Hunt to gain visibility around a newly-launched product; and whether it&apos;s easier to raise a funding round in the AI space right now</li><li>Nabanita’s advice for software engineers who might found a privacy or AI startup in the near future</li><li>Why Nabanita created a Privacy Champions Program; and how it provides (non)-privacy folks with recommendations to prioritize privacy within their organizations</li><li>How to sign up for PrivacyGPT’s paid pilot app, 
connect with Nabanita to collaborate, or subscribe to &quot;Nabanita&apos;s Moonshots Newsletter&quot; on LinkedIn</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Check out <a href='https://privacyos.ai/'>Privacy License</a></li><li>Learn more about <a href='https://privacyos.ai/privacygpt'>PrivacyGPT</a></li><li>Install the <a href='https://chromewebstore.google.com/detail/privacygpt/iobeegngilbjemenccdplkjndkkpplbe?hl=en&amp;pli=1'>PrivacyGPT Chrome Extension</a></li><li>Learn about <a href='https://staysafeonline.org/programs/data-privacy-week/about/'>Data Privacy Week 2024</a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Nabanita on <a href='https://www.linkedin.com/in/nabanitaai'>LinkedIn</a></li><li>Subscribe to <a href='https://www.linkedin.com/newsletters/nabanita-s-moonshots-6896880807227588608/'>Nabanita&apos;s Moonshots Newsletter</a></li><li>Learn more about <a href='https://www.nabanitadefoundation.org/'>The Nabanita De Foundation</a></li><li>Learn more about <a href='https://www.covidhelpforindia.com/about'>Covid Help for India</a></li><li>Learn more about <a href='https://chromewebstore.google.com/detail/project-fib/njfkbbdphllgkbdomopoiibhdkkohnbf'>Project FiB</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>My guest this week is Nabanita De, Software Engineer, Serial Entrepreneur, and Founder &amp; CEO at Privacy License, where she&apos;s on a mission to transform the AI landscape. In this episode, we discuss Nabanita&apos;s transition from Engineering Manager at Remitly to startup founder; what she&apos;s learned from her experience at Antler&apos;s accelerator program; her first product to market, PrivacyGPT; and her work to educate Privacy Champions.<br/><br/><b>Topics Covered:</b></p><ul><li>Nabanita’s origin story, from conducting AI research at Microsoft as an intern all the way to founding Privacy License</li><li>How Privacy License supports enterprises entering the global market while protecting privacy as a human right</li><li>A comparison of Nabanita&apos;s corporate role as Privacy Engineering Manager at Remitly with her entrepreneurial role as Founder-in-Residence at Antler</li><li>How PrivacyGPT, a Chrome browser plugin, empowers people to use ChatGPT with added privacy protections and without compromising data privacy standards by redacting sensitive and personal data before sending to ChatGPT</li><li>NLP techniques that Nabanita leveraged to build out PrivacyGPT, including: &apos;regular expressions,&apos; &apos;part-of-speech tagging,&apos; &amp; &apos;named entity recognition&apos;</li><li>How PrivacyGPT can be used to protect privacy across nearly all languages, even where a user has no Internet connection</li><li>How to use Product Hunt to gain visibility around a newly-launched product; and whether it&apos;s easier to raise a funding round in the AI space right now</li><li>Nabanita’s advice for software engineers who might found a privacy or AI startup in the near future</li><li>Why Nabanita created a Privacy Champions Program; and how it provides (non)-privacy folks with recommendations to prioritize privacy within their organizations</li><li>How to sign up for PrivacyGPT’s paid pilot app, 
connect with Nabanita to collaborate, or subscribe to &quot;Nabanita&apos;s Moonshots Newsletter&quot; on LinkedIn</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Check out <a href='https://privacyos.ai/'>Privacy License</a></li><li>Learn more about <a href='https://privacyos.ai/privacygpt'>PrivacyGPT</a></li><li>Install the <a href='https://chromewebstore.google.com/detail/privacygpt/iobeegngilbjemenccdplkjndkkpplbe?hl=en&amp;pli=1'>PrivacyGPT Chrome Extension</a></li><li>Learn about <a href='https://staysafeonline.org/programs/data-privacy-week/about/'>Data Privacy Week 2024</a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Nabanita on <a href='https://www.linkedin.com/in/nabanitaai'>LinkedIn</a></li><li>Subscribe to <a href='https://www.linkedin.com/newsletters/nabanita-s-moonshots-6896880807227588608/'>Nabanita&apos;s Moonshots Newsletter</a></li><li>Learn more about <a href='https://www.nabanitadefoundation.org/'>The Nabanita De Foundation</a></li><li>Learn more about <a href='https://www.covidhelpforindia.com/about'>Covid Help for India</a></li><li>Learn more about <a href='https://chromewebstore.google.com/detail/project-fib/njfkbbdphllgkbdomopoiibhdkkohnbf'>Project FiB</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14143077-s2e38-privacygpt-bringing-an-ai-privacy-startup-to-market-with-nabanita-de-privacy-license.mp3" length="29921067" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/krkd3gvtss6vi3fj9v03mdt7n1cq?.jpg" />
    <itunes:author>Debra J. Farber / Nabanita De</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14143077</guid>
    <pubDate>Tue, 19 Dec 2023 12:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14143077/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14143077/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14143077/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14143077/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14143077/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E38: &quot;PrivacyGPT: Bringing an AI Privacy Startup to Market&quot; with Nabanita De (Privacy License)" />
  <psc:chapter start="2:10" title="Introducing Nabanita De, Founder &amp; CEO at Privacy License" />
  <psc:chapter start="4:14" title="Nabanita describes her career path" />
  <psc:chapter start="6:17" title="Nabanita tells us about Privacy License, its mission, and future plans for the org" />
  <psc:chapter start="8:49" title="What it was like for Nabanita to transition from working as a Privacy Engineering Manager to privacy / AI startup Founder." />
  <psc:chapter start="12:41" title="Nabanita discusses PrivacyGPT, the first product from Privacy License, which she describes as a &#39;Privacy Firewall for ChatGPT Prompts&#39;" />
  <psc:chapter start="15:17" title="How PrivacyGPT works in practice, with discussion of its architecture" />
  <psc:chapter start="17:08" title="Nabanita describes some of the NLP techniques that she leveraged to build PrivacyGPT, including: regular expressions, part-of-speech tagging, and named entity recognition" />
  <psc:chapter start="22:52" title="Debra &amp; Nabanita discuss what it&#39;s like to bring a tech startup to market, including the use of Product Hunt for product feedback and what it&#39;s like to raise money in this market" />
  <psc:chapter start="33:04" title="Nabanita describes the Privacy Champions Program that she&#39;s launching via Privacy License" />
  <psc:chapter start="38:27" title="Nabanita shares info about how to collaborate with her as well as resources like her newsletter: Nabanita&#39;s Moonshots" />
</psc:chapters>
    <itunes:duration>2489</itunes:duration>
    <itunes:keywords>privacy tech, PrivacyGPT, Privacy License, Privacy Champions, privacy-by-design, privacy engineer</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>38</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>true</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E37: &quot;Embedding Privacy Engineering into Real Estate&quot; with Yusra Ahmad and Luke Beckley (The RED Foundation)</itunes:title>
    <title>S2E37: &quot;Embedding Privacy Engineering into Real Estate&quot; with Yusra Ahmad and Luke Beckley (The RED Foundation)</title>
    <itunes:summary><![CDATA[My guests this week are Yusra Ahmad, CEO of Acuity Data, and Luke Beckley, Data Protection Officer and Privacy Governance Manager at Correla, who work with The RED (Real Estate Data) Foundation, a sector-wide alliance that enables the real estate sector to benefit from an increased use of data, while avoiding some of the risks that this presents, and better serving society. We discuss the current drivers for change within the real estate industry and the complexities of the real estate indust...]]></itunes:summary>
    <description><![CDATA[<p>My guests this week are Yusra Ahmad, CEO of <a href='https://www.acuitydata.io/'>Acuity Data</a>, and Luke Beckley, Data Protection Officer and Privacy Governance Manager at <a href='https://www.correla.com/'>Correla</a>, who work with <a href='https://www.theredfoundation.org/'>The RED (Real Estate Data) Foundation</a>, a sector-wide alliance that enables the real estate sector to benefit from an increased use of data, while avoiding some of the risks that this presents, and better serving society.<br/><br/>We discuss the current drivers for change within the real estate industry and the complexities of an industry that utilizes incredible amounts of data. You’ll learn the types of data protection, privacy, and ethical challenges The RED Foundation seeks to solve, especially now with the advent of new technologies. Yusra and Luke discuss some ethical questions facing the real estate sector as it considers leveraging new technology. Yusra and Luke come to the conversation with knowledgeable perspectives as The RED Foundation’s Chair of the Data Ethics Steering Group and Chair of the Engagement and Awareness Group, respectively.</p><p><br/><b>Topics Covered:</b></p><ul><li>Introducing Luke Beckley (DPO, Privacy &amp; Governance Manager at Correla) and Yusra Ahmad (CEO of Acuity Data), who are here to talk about their data ethics work at The RED Foundation</li><li>How the scope, sophistication, &amp; connectivity of data is increasing exponentially in the real estate industry</li><li>Why ESG, workplace experience, &amp; smart city development are drivers of data collection; and the need for data ethics reform within the real estate industry</li><li>Discussion of types of personal data these real estate companies collect &amp; use across stakeholders: owners, operators, occupiers, employees, residents, etc.</li><li>Current approaches that retailers take to protect location data, when collected; and why it&apos;s important to simplify
language, increase transparency, &amp; make consumers aware of tracking in in-store WiFi privacy notices</li><li>Overview of The RED Foundation &amp; its mission: to ensure the real estate sector benefits from an increased use of data, avoids some of the risks that this presents, and is better placed to serve society</li><li>Some ethical questions with which the real estate sector still needs to align, along with examples</li><li>Why there’s a need to educate the real estate industry on privacy-enhancing tech</li><li>The need for privacy engineers and PETs in real estate; and why this will build trust with the different stakeholders</li><li>Guidance for privacy engineers who want to work in the real estate sector</li><li>Ways to collaborate with The RED Foundation to standardize data ethics practices across the real estate industry</li><li>Why there&apos;s great opportunity to embed privacy into real estate; and why its current challenges are really obstacles, rather than blockers</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out <a href='https://www.theredfoundation.org/'>The RED Foundation</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow <a href='https://www.linkedin.com/in/yusra-ahmad-581345a/'>Yusra on LinkedIn</a></li><li>Follow <a href='https://www.linkedin.com/in/luke-beckley/'>Luke on LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC.
All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>My guests this week are Yusra Ahmad, CEO of <a href='https://www.acuitydata.io/'>Acuity Data</a>, and Luke Beckley, Data Protection Officer and Privacy Governance Manager at <a href='https://www.correla.com/'>Correla</a>, who work with <a href='https://www.theredfoundation.org/'>The RED (Real Estate Data) Foundation</a>, a sector-wide alliance that enables the real estate sector to benefit from an increased use of data, while avoiding some of the risks that this presents, and better serving society.<br/><br/>We discuss the current drivers for change within the real estate industry and the complexities of an industry that utilizes incredible amounts of data. You’ll learn the types of data protection, privacy, and ethical challenges The RED Foundation seeks to solve, especially now with the advent of new technologies. Yusra and Luke discuss some ethical questions facing the real estate sector as it considers leveraging new technology. Yusra and Luke come to the conversation with knowledgeable perspectives as The RED Foundation’s Chair of the Data Ethics Steering Group and Chair of the Engagement and Awareness Group, respectively.</p><p><br/><b>Topics Covered:</b></p><ul><li>Introducing Luke Beckley (DPO, Privacy &amp; Governance Manager at Correla) and Yusra Ahmad (CEO of Acuity Data), who are here to talk about their data ethics work at The RED Foundation</li><li>How the scope, sophistication, &amp; connectivity of data is increasing exponentially in the real estate industry</li><li>Why ESG, workplace experience, &amp; smart city development are drivers of data collection; and the need for data ethics reform within the real estate industry</li><li>Discussion of types of personal data these real estate companies collect &amp; use across stakeholders: owners, operators, occupiers, employees, residents, etc.</li><li>Current approaches that retailers take to protect location data, when collected; and why it&apos;s important to
simplify language, increase transparency, &amp; make consumers aware of tracking in in-store WiFi privacy notices</li><li>Overview of The RED Foundation &amp; its mission: to ensure the real estate sector benefits from an increased use of data, avoids some of the risks that this presents, and is better placed to serve society</li><li>Some ethical questions with which the real estate sector still needs to align, along with examples</li><li>Why there’s a need to educate the real estate industry on privacy-enhancing tech</li><li>The need for privacy engineers and PETs in real estate; and why this will build trust with the different stakeholders</li><li>Guidance for privacy engineers who want to work in the real estate sector</li><li>Ways to collaborate with The RED Foundation to standardize data ethics practices across the real estate industry</li><li>Why there&apos;s great opportunity to embed privacy into real estate; and why its current challenges are really obstacles, rather than blockers</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out <a href='https://www.theredfoundation.org/'>The RED Foundation</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow <a href='https://www.linkedin.com/in/yusra-ahmad-581345a/'>Yusra on LinkedIn</a></li><li>Follow <a href='https://www.linkedin.com/in/luke-beckley/'>Luke on LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC.
All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/14044232-s2e37-embedding-privacy-engineering-into-real-estate-with-yusra-ahmad-and-luke-beckley-the-red-foundation.mp3" length="46613865" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/wv8mr8nx2a7yxf9oypto44z7fp7z?.jpg" />
    <itunes:author>Debra J. Farber (Shifting Privacy Left)</itunes:author>
    <guid isPermaLink="false">Buzzsprout-14044232</guid>
    <pubDate>Tue, 05 Dec 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14044232/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14044232/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14044232/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/14044232/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/14044232/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E37: &quot;Embedding Privacy Engineering into Real Estate&quot; with Yusra Ahmad and Luke Beckley (The RED Foundation)" />
  <psc:chapter start="1:59" title="Introducing Luke Beckley (DPO, Privacy &amp; Governance Manager at Correla) and Yusra Ahmad (CEO of Acuity Data), who are here to talk about their data ethics work at The RED Foundation" />
  <psc:chapter start="5:02" title="How the scope, sophistication, &amp; connectivity of data is increasing exponentially in the real estate industry and the drivers for this" />
  <psc:chapter start="14:06" title="Discussion of types of personal data these real estate companies are collecting and using across different stakeholders: owners, operators, occupiers, employees, residents, etc." />
  <psc:chapter start="16:47" title="Current approaches that retail establishments take to protect location data, when collected." />
  <psc:chapter start="26:57" title="Yusra gives us an overview of The RED Foundation &amp; its mission: to ensure the real estate sector benefits from an increased use of data, avoids some of the risks that this presents, and is better placed to serve society" />
  <psc:chapter start="33:28" title="Yusra highlights some ethical questions with which the real estate sector still needs to align, with examples" />
  <psc:chapter start="42:06" title="Luke discusses what it&#39;ll take to ensure that real estate companies start leveraging PETs for data capture &amp; sharing data in a privacy-preserving way" />
  <psc:chapter start="51:26" title="Luke&#39;s guidance for privacy engineers who want to work in the real estate sector." />
  <psc:chapter start="58:19" title="Ways to collaborate with The RED Foundation to standardize data ethics practices across the real estate industry" />
</psc:chapters>
    <itunes:duration>3880</itunes:duration>
    <itunes:keywords></itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>37</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E36: &quot;Privacy Engineering Contracting: State of the Market &amp; 2024 Predictions&quot; with Jared Coseglia (TRU Staffing)</itunes:title>
    <title>S2E36: &quot;Privacy Engineering Contracting: State of the Market &amp; 2024 Predictions&quot; with Jared Coseglia (TRU Staffing)</title>
    <itunes:summary><![CDATA[This week, I welcome Jared Coseglia, co-founder and CEO at TRU Staffing Partners, a contract staffing &amp; executive placement search firm that represents talent across 3 core industry verticals: data privacy, eDiscovery, &amp; cybersecurity. We discuss the current and future state of the contracting market for privacy engineering roles and the market drivers that affect hiring. You’ll learn about the hiring trends and the allure of 'part-time impact,' 'part-time perpetual,' and 'secondee' co...]]></itunes:summary>
    <description><![CDATA[<p>This week, I welcome <a href='https://www.linkedin.com/in/jaredcoseglia/'>Jared Coseglia</a>, co-founder and CEO at <a href='https://www.trustaffingpartners.com/'>TRU Staffing Partners</a>, a contract staffing &amp; executive placement search firm that represents talent across 3 core industry verticals: data privacy, eDiscovery, &amp; cybersecurity. We discuss the current and future state of the contracting market for privacy engineering roles and the market drivers that affect hiring. You’ll learn about the hiring trends and the allure of &apos;part-time impact,&apos; &apos;part-time perpetual,&apos; and &apos;secondee&apos; contract work. Jared illustrates the challenges that hiring managers face with a &apos;Do-it-Yourself&apos; staffing process; and he shares his predictions about the job market for privacy engineers over the next 2 years. Jared comes to the conversation with a lot of data that supports his predictions and sage advice for privacy engineering hiring managers and job seekers.
</p><p><br/><b>Topics Covered</b>:</p><ul><li>How the privacy contracting market compares and contrasts to the full-time hiring market; and, why we currently see a steep rise in privacy contracting</li><li>Why full-time hiring for privacy engineers won&apos;t likely rebound until Q4 2024; and, how hiring for privacy typically follows a 2-year cycle</li><li>Why companies &amp; employees benefit from fractional contracts; and, the differences between contracting types: &apos;Part-Time - Impact,&apos; &apos;Part-Time - Perpetual,&apos; and &apos;Secondee&apos;</li><li>How hiring managers typically find privacy engineering candidates</li><li>Why it&apos;s far more difficult to hire privacy engineers for contracts; and, how a staffing partner like TRU can supercharge your hiring efforts and avoid the pitfalls of a &quot;do-it-yourself&quot; approach</li><li>How contract work benefits privacy engineers financially, while also providing them with project diversity</li><li>How salaries are calculated for privacy engineers; and, the driving forces behind pay discrepancies across privacy roles</li><li>Jared&apos;s advice to 2024 job seekers, based on his market predictions; and, why privacy contracting increases &apos;speed to hire&apos; compared to hiring FTEs</li><li>Why privacy engineers can earn more money by changing jobs in 2024 than they could by seeking raises in their current companies; and discussion of 2024 salary ranges across industry segments</li><li>Jared&apos;s advice on how privacy engineers can best position themselves to contract hiring managers in 2024</li><li>Recommended resources for privacy engineering employers and job seekers</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read: <a href='https://info.trustaffingpartners.com/hubfs/PDFs/State%20of%20the%20Privacy%20Job%20Market%202023.pdf'>&quot;State of the Privacy Job Market Q3 2023”</a></li><li>Subscribe to <a href='https://www.trustaffingpartners.com/insights/tag/tru-tips'>TRU 
Insights</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Connect with Jared on <a href='https://www.linkedin.com/in/jaredcoseglia/'>LinkedIn</a></li><li>Learn more about <a href='https://www.trustaffingpartners.com/'>TRU Staffing Partners</a></li><li>Engineering Managers: Check out <a href='https://www.trustaffingpartners.com/data-privacy-solutions'>TRU&apos;s Data Privacy Staffing solutions</a></li><li>PE Candidates: Apply to <a href='https://jobs.trustaffingpartners.com/#/privacy'>TRU&apos;s open privacy roles</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I welcome <a href='https://www.linkedin.com/in/jaredcoseglia/'>Jared Coseglia</a>, co-founder and CEO at <a href='https://www.trustaffingpartners.com/'>TRU Staffing Partners</a>, a contract staffing &amp; executive placement search firm that represents talent across 3 core industry verticals: data privacy, eDiscovery, &amp; cybersecurity. We discuss the current and future state of the contracting market for privacy engineering roles and the market drivers that affect hiring. You’ll learn about the hiring trends and the allure of &apos;part-time impact,&apos; &apos;part-time perpetual,&apos; and &apos;secondee&apos; contract work. Jared illustrates the challenges that hiring managers face with a &apos;Do-it-Yourself&apos; staffing process; and he shares his predictions about the job market for privacy engineers over the next 2 years. Jared comes to the conversation with a lot of data that supports his predictions and sage advice for privacy engineering hiring managers and job seekers.
</p><p><br/><b>Topics Covered</b>:</p><ul><li>How the privacy contracting market compares and contrasts to the full-time hiring market; and, why we currently see a steep rise in privacy contracting</li><li>Why full-time hiring for privacy engineers won&apos;t likely rebound until Q4 2024; and, how hiring for privacy typically follows a 2-year cycle</li><li>Why companies &amp; employees benefit from fractional contracts; and, the differences between contracting types: &apos;Part-Time - Impact,&apos; &apos;Part-Time - Perpetual,&apos; and &apos;Secondee&apos;</li><li>How hiring managers typically find privacy engineering candidates</li><li>Why it&apos;s far more difficult to hire privacy engineers for contracts; and, how a staffing partner like TRU can supercharge your hiring efforts and avoid the pitfalls of a &quot;do-it-yourself&quot; approach</li><li>How contract work benefits privacy engineers financially, while also providing them with project diversity</li><li>How salaries are calculated for privacy engineers; and, the driving forces behind pay discrepancies across privacy roles</li><li>Jared&apos;s advice to 2024 job seekers, based on his market predictions; and, why privacy contracting increases &apos;speed to hire&apos; compared to hiring FTEs</li><li>Why privacy engineers can earn more money by changing jobs in 2024 than they could by seeking raises in their current companies; and discussion of 2024 salary ranges across industry segments</li><li>Jared&apos;s advice on how privacy engineers can best position themselves to contract hiring managers in 2024</li><li>Recommended resources for privacy engineering employers and job seekers</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read: <a href='https://info.trustaffingpartners.com/hubfs/PDFs/State%20of%20the%20Privacy%20Job%20Market%202023.pdf'>&quot;State of the Privacy Job Market Q3 2023”</a></li><li>Subscribe to <a href='https://www.trustaffingpartners.com/insights/tag/tru-tips'>TRU 
Insights</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Connect with Jared on <a href='https://www.linkedin.com/in/jaredcoseglia/'>LinkedIn</a></li><li>Learn more about <a href='https://www.trustaffingpartners.com/'>TRU Staffing Partners</a></li><li>Engineering Managers: Check out <a href='https://www.trustaffingpartners.com/data-privacy-solutions'>TRU&apos;s Data Privacy Staffing solutions</a></li><li>PE Candidates: Apply to <a href='https://jobs.trustaffingpartners.com/#/privacy'>TRU&apos;s open privacy roles</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13973139-s2e36-privacy-engineering-contracting-state-of-the-market-2024-predictions-with-jared-coseglia-tru-staffing.mp3" length="41655004" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/82kt7rpb69tj18bi15p9g5at8swe?.jpg" />
    <itunes:author>Debra J. Farber (Shifting Privacy Left) / Jared Coseglia</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13973139</guid>
    <pubDate>Tue, 21 Nov 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13973139/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13973139/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13973139/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13973139/transcript.vtt" type="text/vtt" />
    <podcast:soundbite startTime="803.021" duration="59.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13973139/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E36: &quot;Privacy Engineering Contracting: State of the Market &amp; 2024 Predictions&quot; with Jared Coseglia (TRU Staffing)" />
  <psc:chapter start="2:15" title="Introducing Jared Coseglia, Founder and CEO at TRU Staffing Partners" />
  <psc:chapter start="4:06" title="How the market for privacy contracts compares and contrasts to that of full-time privacy hires" />
  <psc:chapter start="9:41" title="Why the full-time hiring market for privacy engineers won&#39;t likely rebound until Q4 2024; and, how privacy full-time hiring vs. contract hiring tends to go in waves on a 2-year cycle" />
  <psc:chapter start="13:38" title="The difference between contracting types: &#39;Part-Time - Impact,&#39; &#39;Part-Time - Perpetual,&#39; and &#39;Secondee&#39;" />
  <psc:chapter start="22:23" title="Trends and stats: How hiring managers typically find candidates for privacy engineering roles; why it&#39;s far more difficult to hire privacy engineers for a contract; and how a staffing partner like TRU can supercharge your hiring efforts" />
  <psc:chapter start="31:46" title="Jared shares his insight into how salaries are calculated for privacy engineers; and why posted salaries vary greatly across the board" />
  <psc:chapter start="40:37" title="Jared&#39;s advice to privacy engineering job seekers for 2024, based on his market predictions; and how privacy contracting enables a much faster &#39;speed to hire&#39; in this market than hiring for an FTE" />
  <psc:chapter start="47:10" title="Why Jared believes that privacy engineers can earn more money by changing jobs in 2024 than they could by asking for a raise; and a discussion of those 2024 salary ranges" />
  <psc:chapter start="53:01" title="Jared&#39;s advice to privacy engineers who will seek contracting roles in 2024 on how they can best position themselves to hiring managers" />
  <psc:chapter start="55:01" title="Jared&#39;s recommended resources for privacy engineering job seekers" />
</psc:chapters>
    <itunes:duration>3467</itunes:duration>
    <itunes:keywords>privacy engineering contracting, contracts, staff augmentation, privacy hiring</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>36</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E35: &quot;Embed Ethics into Your SDLC: From Reactive Firefighting to &#39;Responsible Firekeeping&#39;&quot; with Mathew Mytka &amp; Alja Isaković (Tethix)</itunes:title>
    <title>S2E35: &quot;Embed Ethics into Your SDLC: From Reactive Firefighting to &#39;Responsible Firekeeping&#39;&quot; with Mathew Mytka &amp; Alja Isaković (Tethix)</title>
    <itunes:summary><![CDATA[This week’s guests are Mathew Mytka and Alja Isaković, Co-Founders of Tethix, a company that builds products that embed ethics into the fabric of your organization. We discuss Matt and Alja’s core mission to bring ethical tech to the world, and Tethix’s services that work with your Agile development processes. You’ll learn about Tethix’s solution to address 'The Intent to Action Gap,' and what Elemental Ethics can provide organizations beyond other ethics frameworks. We discuss ways to becom...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guests are <a href='https://www.linkedin.com/in/mathewmytka/'>Mathew Mytka</a> and <a href='https://www.linkedin.com/in/ialja/'>Alja Isaković</a>, Co-Founders of <a href='https://tethix.co/'>Tethix</a>, a company that builds products that embed ethics into the fabric of your organization. We discuss Matt and Alja’s core mission to bring ethical tech to the world, and Tethix’s services that work with your Agile development processes. You’ll learn about Tethix’s solution to address &apos;The Intent to Action Gap,&apos; and what Elemental Ethics can provide organizations beyond other ethics frameworks. We discuss ways to become a proactive Responsible Firekeeper, rather than remaining a reactive Firefighter, and how ETHOS, Tethix&apos;s suite of apps, can help organizations embody and embed ethics into everyday practice.</p><p><br/><b>TOPICS COVERED</b>:</p><ul><li>What inspired Mat &amp; Alja to co-found Tethix and the company&apos;s core mission</li><li>What the &apos;Intent to Action Gap&apos; is and how Tethix addresses it</li><li>Overview of Tethix&apos;s Elemental Ethics framework; and how it empowers product development teams to close the &apos;Intent to Action Gap&apos; and move orgs from a state of &apos;Agile Firefighting&apos; to &apos;Responsible Firekeeping&apos;</li><li>Why Agile is an insufficient process for embedding ethics into software and product development; and how you can turn to Elemental Ethics and Responsible Firekeeping to embed &apos;Ethics-by-Design&apos; into your Agile workflows</li><li>The definition of &apos;Responsible Firekeeping&apos; and its benefits; and how Ethical Firekeeping transitions Agile teams from a reactive posture to a proactive one</li><li>Why you should choose Elemental Ethics over conventional ethics frameworks</li><li>Tethix&apos;s suite of apps called ETHOS: The Ethical Tension and Health Operating System apps, which help teams embed ethics into their collaboration tech
stack (e.g., JIRA, Slack, Figma, Zoom, etc.)</li><li>How you can become a Responsible Firekeeper</li><li>The level of effort required to implement Elemental Ethics &amp; Responsible Firekeeping into Product Development based on org size and level of maturity</li><li>Alja&apos;s contribution to ResponsibleTech.Work, an open-source Responsible Product Development Framework; core elements of the Framework; and why we need it</li><li>Where to learn more about Responsible Firekeeping</li></ul><p><b>RESOURCES MENTIONED:</b></p><ul><li>Read: <a href='https://tethix.co/fire/day-in-the-life-of-a-responsible-firekeeper-your-journey-from-installing-to-embodying-ethos/'>&quot;Day in the Life of a Responsible Firekeeper&quot;</a></li><li>Review the <a href='https://responsibletech.work/'>ResponsibleTech.Work Framework</a></li><li>Subscribe to the <a href='https://tethix.co/pathfinders/'>Pathfinders Newmoonsletter</a></li></ul><p><b>GUEST INFO:</b></p><ul><li>Connect with Mat on <a href='https://www.linkedin.com/in/mathewmytka/'>LinkedIn</a></li><li>Connect with Alja on <a href='https://www.linkedin.com/in/ialja/'>LinkedIn</a></li><li>Check out <a href='https://tethix.co/'>Tethix’s Website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guests are <a href='https://www.linkedin.com/in/mathewmytka/'>Mathew Mytka</a> and <a href='https://www.linkedin.com/in/ialja/'>Alja Isaković</a>, Co-Founders of <a href='https://tethix.co/'>Tethix</a>, a company that builds products that embed ethics into the fabric of your organization. We discuss Matt and Alja’s core mission to bring ethical tech to the world, and Tethix’s services that work with your Agile development processes. You’ll learn about Tethix’s solution to address &apos;The Intent to Action Gap,&apos; and what Elemental Ethics can provide organizations beyond other ethics frameworks. We discuss ways to become a proactive Responsible Firekeeper, rather than remaining a reactive Firefighter, and how ETHOS, Tethix&apos;s suite of apps, can help organizations embody and embed ethics into everyday practice.</p><p><br/><b>TOPICS COVERED</b>:</p><ul><li>What inspired Mat &amp; Alja to co-found Tethix and the company&apos;s core mission</li><li>What the &apos;Intent to Action Gap&apos; is and how Tethix addresses it</li><li>Overview of Tethix&apos;s Elemental Ethics framework; and how it empowers product development teams to close the &apos;Intent to Action Gap&apos; and move orgs from a state of &apos;Agile Firefighting&apos; to &apos;Responsible Firekeeping&apos;</li><li>Why Agile is an insufficient process for embedding ethics into software and product development; and how you can turn to Elemental Ethics and Responsible Firekeeping to embed &apos;Ethics-by-Design&apos; into your Agile workflows</li><li>The definition of &apos;Responsible Firekeeping&apos; and its benefits; and how Ethical Firekeeping transitions Agile teams from a reactive posture to a proactive one</li><li>Why you should choose Elemental Ethics over conventional ethics frameworks</li><li>Tethix&apos;s suite of apps called ETHOS: The Ethical Tension and Health Operating System apps, which help teams embed ethics into their collaboration
tech stack (e.g., JIRA, Slack, Figma, Zoom, etc.)</li><li>How you can become a Responsible Firekeeper</li><li>The level of effort required to implement Elemental Ethics &amp; Responsible Firekeeping into Product Development based on org size and level of maturity</li><li>Alja&apos;s contribution to ResponsibleTech.Work, an open source Responsible Product Development Framework; the Framework&apos;s core elements; and why we need it</li><li>Where to learn more about Responsible Firekeeping</li></ul><p><b>RESOURCES MENTIONED:</b></p><ul><li>Read: <a href='https://tethix.co/fire/day-in-the-life-of-a-responsible-firekeeper-your-journey-from-installing-to-embodying-ethos/'>&quot;Day in the Life of a Responsible Firekeeper&quot;</a></li><li>Review the <a href='https://responsibletech.work/'>ResponsibleTech.Work Framework</a></li><li>Subscribe to the <a href='https://tethix.co/pathfinders/'>Pathfinders Newmoonsletter</a></li></ul><p><b>GUEST INFO:</b></p><ul><li>Connect with Mat on <a href='https://www.linkedin.com/in/mathewmytka/'>LinkedIn</a></li><li>Connect with Alja on <a href='https://www.linkedin.com/in/ialja/'>LinkedIn</a></li><li>Check out <a href='https://tethix.co/'>Tethix’s Website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13890617-s2e35-embed-ethics-into-your-sdlc-from-reactive-firefighting-to-responsible-firekeeping-with-mathew-mytka-alja-isakovic-tethix.mp3" length="32350279" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/xnqygc28nrr5gzv3ed0k6lzyw63v?.jpg" />
    <itunes:author>Debra J Farber / Mathew Mytka and Alja Isaković</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13890617</guid>
    <pubDate>Tue, 14 Nov 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13890617/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13890617/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13890617/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13890617/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13890617/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E35: &quot;Embed Ethics into Your SDLC: From Reactive Firefighting to &#39;Responsible Firekeeping&#39;&quot; with Mathew Mytka &amp; Alja Isaković (Tethix)" />
  <psc:chapter start="1:41" title="Introducing Mathew Mytka, Co-Founder of Tethix" />
  <psc:chapter start="2:32" title="Introducing Alja Isaković, Co-Founder of Tethix and Principal Contributor to the ResponsibleTech.Work Framework" />
  <psc:chapter start="3:36" title="What inspired Mat &amp; Alja to co-found Tethix and the company&#39;s core mission" />
  <psc:chapter start="4:39" title="What the &#39;Intent to Action Gap&#39; is and how Tethix addresses it" />
  <psc:chapter start="7:18" title="Alja explains The Elemental Ethics framework; how it empowers product development teams to close the &#39;Intent to Action Gap&#39; and move orgs from a state of &#39;Agile Firefighting&#39; to &#39;Responsible Firekeeping&#39;" />
  <psc:chapter start="10:39" title="Mat shares why Agile is an insufficient process for embedding ethics into software and product development; and how you can turn to Elemental Ethics and Responsible Firekeeping to embed ethics-by-design into your Agile workflows." />
  <psc:chapter start="13:12" title="Mat defines &#39;Responsible Firekeeping,&#39; the benefits of this approach, and how Ethical Firekeeping transitions Agile teams from being reactive to more proactive when it comes to privacy and ethics" />
  <psc:chapter start="17:58" title="Alja shares why organizations should consider choosing Elemental Ethics over conventional ethics frameworks" />
  <psc:chapter start="20:45" title="Alja tells us about Tethix&#39;s suite of apps called ETHOS: The Ethical Tension and Health Operating System apps, which help teams embed ethics into their collaboration tech stack (JIRA, Slack, Figma, Zoom, etc.)" />
  <psc:chapter start="24:05" title="Mat explains the path to becoming a Responsible Firekeeper and urges us to read Alja&#39;s blog post on the subject" />
  <psc:chapter start="32:45" title="Mat &amp; Alja discuss the level of effort required to implement Elemental Ethics &amp; Responsible Firekeeping into Product Development based on organization size and level of maturity" />
  <psc:chapter start="36:41" title="Alja shares info about ResponsibleTech.Work, an open source Responsible Product Development Framework, its core elements, and why we need such a framework" />
  <psc:chapter start="40:49" title="Alja &amp; Mat share resources including: Tethix&#39;s Pathfinder Newmoonsletter, their website, their Substack publication, workshops, and Virtual Tea Garden; and they leave us with some words of wisdom" />
</psc:chapters>
    <itunes:duration>2691</itunes:duration>
    <itunes:keywords>responsible firekeeping, elemental ethics, responsibletech.work, ethical intent to action gap, ethical tech, Tethix, tech ethics</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>35</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E34: &quot;Embedding Privacy by Design &amp; Threat Modeling for AI&quot; with Isabel Barberá (Rhite &amp; PLOT4ai)</itunes:title>
    <title>S2E34: &quot;Embedding Privacy by Design &amp; Threat Modeling for AI&quot; with Isabel Barberá (Rhite &amp; PLOT4ai)</title>
    <itunes:summary><![CDATA[This week’s guest is Isabel Barberá, Co-founder, AI Advisor, and Privacy Engineer at Rhite, a consulting firm specializing in responsible and trustworthy AI and privacy engineering, and creator of The Privacy Library Of Threats 4 Artificial Intelligence Framework and card game. In our conversation, we discuss: Isabel’s work with privacy-by-design, privacy engineering, privacy threat modeling, and building trustworthy AI; and info about Rhite’s forthcoming Self-Assessment Open-Source fra...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/showcase/89686970/admin/feed/posts/#'>Isabel Barberá</a>, Co-founder, AI Advisor, and Privacy Engineer at <a href='https://www.linkedin.com/showcase/89686970/admin/feed/posts/#'>Rhite</a>, a consulting firm specializing in responsible and trustworthy AI and privacy engineering, and creator of <a href='https://plot4.ai/'>The Privacy Library Of Threats 4 Artificial Intelligence</a> Framework and card game. In our conversation, we discuss: Isabel’s work with privacy-by-design, privacy engineering, privacy threat modeling, and building trustworthy AI; and info about Rhite’s forthcoming Self-Assessment Open-Source framework for AI maturity, SARAI®. As we wrap up the episode, Isabel shares details about PLOT4ai, her AI threat modeling framework and card game based on a library of threats for artificial intelligence. </p><p><b>Topics Covered:</b></p><ul><li>How Isabel became interested in privacy engineering, data protection, privacy by design, threat modeling, and trustworthy AI</li><li>How companies are thinking (or not) about incorporating privacy-by-design strategies &amp; tactics and privacy engineering approaches within their orgs today</li><li>What steps can be taken so companies start investing in privacy engineering approaches; and whether AI has become a driver for such approaches.</li><li>Background on Isabel’s company, Rhite, and its mission to build responsible solutions for society and its individuals using a technical mindset. 
</li><li>What “Responsible &amp; Trustworthy AI” means to Isabel</li><li>The 5 core values that make up the acronym, R-H-I-T-E, and why they’re important for designing and building products &amp; services.</li><li>Isabel&apos;s advice for organizations as they approach AI risk assessments, analysis, &amp; remediation</li><li>The steps orgs can take in order to build responsible AI products &amp; services</li><li>What Isabel hopes to accomplish through Rhite&apos;s new framework: SARAI® (for AI maturity), an open source AI Self-Assessment Tool and Framework, and an extension of the Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) Framework (i.e., a library of AI risks)</li><li>What motivated Isabel to focus on threat modeling for privacy</li><li>How PLOT4ai builds on LINDDUN (which focuses on software development) and extends threat modeling to the AI lifecycle stages: Design, Input, Modeling, &amp; Output</li><li>How Isabel’s experience with the LINDDUN Go card game inspired her to develop a PLOT4ai card game to make it more accessible to teams.</li><li>Isabel calls for collaborators to contribute to the PLOT4ai open source database of AI threats as the community grows.</li></ul><p><b>Resources Mentioned:</b></p><ul><li><a href='https://plot4.ai/'>Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai)</a></li><li><a href='https://github.com/PLOT4ai/plot4ai-library'>PLOT4ai&apos;s Github Threat Repository</a></li><li><a href='https://rhite.tech/files/Threat-Modeling-Generative-AI-Systems_April-2023.pdf'>&quot;Threat Modeling Generative AI Systems with PLOT4ai”</a></li><li><a href='https://rhite.tech/en/services/audits-and-assessments'>Self-Assessment for Responsible AI (SARAI®)</a></li><li><a href='https://linddun.org/'>LINDDUN Privacy Threat Model Framework</a></li><li><a href='https://www.buzzsprout.com/2059470/13089122-s2e19-privacy-threat-modeling-mitigating-privacy-threats-in-software-with-kim-wuyts-ku-leuven'>&quot;S2E19: Privacy 
Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)”</a></li><li><a href='https://www.amazon.com/Data-Privacy-engineers-Nishant-Bhajaria/dp/1617298999'>&quot;Data Privacy: a runbook for engineers&quot;</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Isabel on <a href='https://www.linkedin.com/in/isabelbarbera/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/showcase/89686970/admin/feed/posts/#'>Isabel Barberá</a>, Co-founder, AI Advisor, and Privacy Engineer at <a href='https://www.linkedin.com/showcase/89686970/admin/feed/posts/#'>Rhite</a>, a consulting firm specializing in responsible and trustworthy AI and privacy engineering, and creator of <a href='https://plot4.ai/'>The Privacy Library Of Threats 4 Artificial Intelligence</a> Framework and card game. In our conversation, we discuss: Isabel’s work with privacy-by-design, privacy engineering, privacy threat modeling, and building trustworthy AI; and info about Rhite’s forthcoming Self-Assessment Open-Source framework for AI maturity, SARAI®. As we wrap up the episode, Isabel shares details about PLOT4ai, her AI threat modeling framework and card game based on a library of threats for artificial intelligence. </p><p><b>Topics Covered:</b></p><ul><li>How Isabel became interested in privacy engineering, data protection, privacy by design, threat modeling, and trustworthy AI</li><li>How companies are thinking (or not) about incorporating privacy-by-design strategies &amp; tactics and privacy engineering approaches within their orgs today</li><li>What steps can be taken so companies start investing in privacy engineering approaches; and whether AI has become a driver for such approaches.</li><li>Background on Isabel’s company, Rhite, and its mission to build responsible solutions for society and its individuals using a technical mindset. 
</li><li>What “Responsible &amp; Trustworthy AI” means to Isabel</li><li>The 5 core values that make up the acronym, R-H-I-T-E, and why they’re important for designing and building products &amp; services.</li><li>Isabel&apos;s advice for organizations as they approach AI risk assessments, analysis, &amp; remediation</li><li>The steps orgs can take in order to build responsible AI products &amp; services</li><li>What Isabel hopes to accomplish through Rhite&apos;s new framework: SARAI® (for AI maturity), an open source AI Self-Assessment Tool and Framework, and an extension of the Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai) Framework (i.e., a library of AI risks)</li><li>What motivated Isabel to focus on threat modeling for privacy</li><li>How PLOT4ai builds on LINDDUN (which focuses on software development) and extends threat modeling to the AI lifecycle stages: Design, Input, Modeling, &amp; Output</li><li>How Isabel’s experience with the LINDDUN Go card game inspired her to develop a PLOT4ai card game to make it more accessible to teams.</li><li>Isabel calls for collaborators to contribute to the PLOT4ai open source database of AI threats as the community grows.</li></ul><p><b>Resources Mentioned:</b></p><ul><li><a href='https://plot4.ai/'>Privacy Library Of Threats 4 Artificial Intelligence (PLOT4ai)</a></li><li><a href='https://github.com/PLOT4ai/plot4ai-library'>PLOT4ai&apos;s Github Threat Repository</a></li><li><a href='https://rhite.tech/files/Threat-Modeling-Generative-AI-Systems_April-2023.pdf'>&quot;Threat Modeling Generative AI Systems with PLOT4ai”</a></li><li><a href='https://rhite.tech/en/services/audits-and-assessments'>Self-Assessment for Responsible AI (SARAI®)</a></li><li><a href='https://linddun.org/'>LINDDUN Privacy Threat Model Framework</a></li><li><a href='https://www.buzzsprout.com/2059470/13089122-s2e19-privacy-threat-modeling-mitigating-privacy-threats-in-software-with-kim-wuyts-ku-leuven'>&quot;S2E19: Privacy 
Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)”</a></li><li><a href='https://www.amazon.com/Data-Privacy-engineers-Nishant-Bhajaria/dp/1617298999'>&quot;Data Privacy: a runbook for engineers&quot;</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Isabel on <a href='https://www.linkedin.com/in/isabelbarbera/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13838867-s2e34-embedding-privacy-by-design-threat-modeling-for-ai-with-isabel-barbera-rhite-plot4ai.mp3" length="36086475" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/thiulyoehagdrxlcwg38x2cj6ijb?.jpg" />
    <itunes:author>Debra J Farber / Isabel Barberá</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13838867</guid>
    <pubDate>Tue, 07 Nov 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838867/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838867/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838867/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838867/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13838867/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E34: &quot;Embedding Privacy by Design &amp; Threat Modeling for AI&quot; with Isabel Barberá (Rhite &amp; PLOT4ai)" />
  <psc:chapter start="1:46" title="Introducing Isabel Barberá, Co-Founder &amp; CEO at Rhite" />
  <psc:chapter start="3:29" title="How Isabel became interested in privacy engineering, data protection, privacy by design, threat modeling, &amp; trustworthy AI" />
  <psc:chapter start="6:37" title="How companies are thinking about incorporating privacy-by-design strategies &amp; tactics and privacy engineering approaches within their orgs today" />
  <psc:chapter start="9:13" title="The actions required to educate companies adequately so that they start investing in privacy engineering approaches; and whether AI is becoming a driver" />
  <psc:chapter start="12:00" title="Isabel talks about her PbD, Responsible AI &amp; Privacy Engineering consulting company, Rhite, where she guides companies on how to build responsible solutions with society &amp; individuals in mind (with a technical mindset)" />
  <psc:chapter start="14:20" title="What &#39;Responsible &amp; Trustworthy AI&#39; means to Isabel" />
  <psc:chapter start="18:04" title="Isabel describes Rhite&#39;s 5 core values" />
  <psc:chapter start="21:30" title="How companies are approaching risk assessments, analysis, &amp; remediation of AI harms today" />
  <psc:chapter start="26:15" title="Isabel shares the steps orgs can take so that they build responsible AI products &amp; services" />
  <psc:chapter start="29:45" title="Isabel shares details about SARAI, an open source AI Self-Assessment Tool and Framework that assesses AI maturity level - an extension of the PLOT4ai framework (a library of AI risks)" />
  <psc:chapter start="36:15" title="What motivated Isabel to focus on threat modeling for privacy" />
  <psc:chapter start="38:22" title="Isabel shares information about the LINDDUN Privacy Threat Modeling Framework for software development" />
  <psc:chapter start="40:39" title="How PLOT4ai builds on LINDDUN (which focuses on software development) and extends threat modeling to the AI lifecycle stages: Design, Input, Modeling, &amp; Output" />
  <psc:chapter start="46:43" title="Isabel shares how she&#39;s seeking collaborators to help add to PLOT4ai&#39;s open source database of AI threats. " />
</psc:chapters>
    <itunes:duration>3003</itunes:duration>
    <itunes:keywords>privacy by design, privacy threat modeling, AI threat modeling, Rhite, PLOT4ai</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>34</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>true</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E33: &quot;Using Privacy Code Scans to Shift Left into DevOps&quot; with Vaibhav Antil (Privado)</itunes:title>
    <title>S2E33: &quot;Using Privacy Code Scans to Shift Left into DevOps&quot; with Vaibhav Antil (Privado)</title>
    <itunes:summary><![CDATA[This week, I sat down with Vaibhav Antil ('Vee'), Co-founder &amp; CEO at Privado, a privacy tech platform that leverages privacy code scanning &amp; data mapping to bridge the privacy engineering gap. Vee shares his personal journey into privacy, where he started out in Product Management and saw the need for privacy automation in DevOps. We discuss obstacles created by the rapid pace of engineering teams and a lack of a shared vocabulary with Legal / GRC. You'll learn how code scanning ...]]></itunes:summary>
    <description><![CDATA[<p>This week, I sat down with <a href='https://www.linkedin.com/in/vaibhav-antil-a8103938/'>Vaibhav Antil</a> (&apos;Vee&apos;), Co-founder &amp; CEO at <a href='https://www.privado.ai/'>Privado</a>, a privacy tech platform that leverages privacy code scanning &amp; data mapping to bridge the privacy engineering gap. Vee shares his personal journey into privacy, where he started out in Product Management and saw the need for privacy automation in DevOps. We discuss obstacles created by the rapid pace of engineering teams and a lack of a shared vocabulary with Legal / GRC. You&apos;ll learn how code scanning enables privacy teams to move swiftly and avoid blocking engineering. We then discuss the future of privacy engineering, its growth trends, and the need for cross-team collaboration. We highlight the importance of making privacy-by-design programmatic and discuss ways to scale up privacy reviews without stifling product innovation. </p><p><b>Topics Covered:</b></p><ul><li>How Vee moved from Product Manager to Co-Founding Privado, and why he focused on bringing Privacy Code Scanning to market.</li><li>What it means to &quot;Bridge the Privacy Engineering Gap&quot; and 3 reasons why Vee believes the gap exists.</li><li>How engineers can provide visibility into personal data collected and used by applications via Privacy Code Scans.</li><li>Why engineering teams should &apos;shift privacy left&apos; into DevOps.</li><li>How a Privacy Code Scanner differs from traditional static code analysis tools in security.</li><li>How Privado&apos;s Privacy Code Scanning &amp; Data Mapping capabilities (for the SDLC) differ from personal data discovery, correlation, &amp; data mapping tools (for the data lifecycle).</li><li>How Privacy Code Scanning helps engineering teams comply with new laws like Washington State&apos;s &apos;My Health My Data Act.&apos;</li><li>A breakdown of Privado’s FREE &quot;Technical Privacy 
Masterclass.&quot;</li><li>Exciting features on Privado’s roadmap, which support its vision to be the platform for collaboration between privacy operations &amp; engineering teams.</li><li>Privacy engineering trends and Vee’s predictions for the next two years.</li></ul><p><b>Privado Resources Mentioned:</b></p><ul><li>Free Course: <a href='https://learn.privado.ai/courses/technical-privacy-masterclass'>&quot;Technical Privacy Masterclass&quot; (led by Nishant Bhajaria)</a></li><li>Guide: <a href='https://www.privado.ai/ebooks/privacy-code-scanning-a-guide'>Introduction to Privacy Code Scanning</a></li><li>Guide: <a href='https://www.privado.ai/ebooks/privacy-data-mapping-a-guide'>Code Scanning Approach to Data Mapping</a></li><li>Slack: <a href='https://www.privado.ai/privacy-engineering-community'>Privado&apos;s Privacy Engineering Community</a></li><li>Open Source Tool: <a href='https://www.privado.ai/data-safety-report'>Play Store Data Safety Report Builder</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Vee on <a href='https://www.linkedin.com/in/vaibhav-antil-a8103938/'>LinkedIn</a></li><li>Check out <a href='https://www.privado.ai/'>Privado&apos;s website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I sat down with <a href='https://www.linkedin.com/in/vaibhav-antil-a8103938/'>Vaibhav Antil</a> (&apos;Vee&apos;), Co-founder &amp; CEO at <a href='https://www.privado.ai/'>Privado</a>, a privacy tech platform that leverages privacy code scanning &amp; data mapping to bridge the privacy engineering gap. Vee shares his personal journey into privacy, where he started out in Product Management and saw the need for privacy automation in DevOps. We discuss obstacles created by the rapid pace of engineering teams and a lack of a shared vocabulary with Legal / GRC. You&apos;ll learn how code scanning enables privacy teams to move swiftly and avoid blocking engineering. We then discuss the future of privacy engineering, its growth trends, and the need for cross-team collaboration. We highlight the importance of making privacy-by-design programmatic and discuss ways to scale up privacy reviews without stifling product innovation. </p><p><b>Topics Covered:</b></p><ul><li>How Vee moved from Product Manager to Co-Founding Privado, and why he focused on bringing Privacy Code Scanning to market.</li><li>What it means to &quot;Bridge the Privacy Engineering Gap&quot; and 3 reasons why Vee believes the gap exists.</li><li>How engineers can provide visibility into personal data collected and used by applications via Privacy Code Scans.</li><li>Why engineering teams should &apos;shift privacy left&apos; into DevOps.</li><li>How a Privacy Code Scanner differs from traditional static code analysis tools in security.</li><li>How Privado&apos;s Privacy Code Scanning &amp; Data Mapping capabilities (for the SDLC) differ from personal data discovery, correlation, &amp; data mapping tools (for the data lifecycle).</li><li>How Privacy Code Scanning helps engineering teams comply with new laws like Washington State&apos;s &apos;My Health My Data Act.&apos;</li><li>A breakdown of Privado’s FREE &quot;Technical Privacy 
Masterclass.&quot;</li><li>Exciting features on Privado’s roadmap, which support its vision to be the platform for collaboration between privacy operations &amp; engineering teams.</li><li>Privacy engineering trends and Vee’s predictions for the next two years.</li></ul><p><b>Privado Resources Mentioned:</b></p><ul><li>Free Course: <a href='https://learn.privado.ai/courses/technical-privacy-masterclass'>&quot;Technical Privacy Masterclass&quot; (led by Nishant Bhajaria)</a></li><li>Guide: <a href='https://www.privado.ai/ebooks/privacy-code-scanning-a-guide'>Introduction to Privacy Code Scanning</a></li><li>Guide: <a href='https://www.privado.ai/ebooks/privacy-data-mapping-a-guide'>Code Scanning Approach to Data Mapping</a></li><li>Slack: <a href='https://www.privado.ai/privacy-engineering-community'>Privado&apos;s Privacy Engineering Community</a></li><li>Open Source Tool: <a href='https://www.privado.ai/data-safety-report'>Play Store Data Safety Report Builder</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Vee on <a href='https://www.linkedin.com/in/vaibhav-antil-a8103938/'>LinkedIn</a></li><li>Check out <a href='https://www.privado.ai/'>Privado&apos;s website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13838702-s2e33-using-privacy-code-scans-to-shift-left-into-devops-with-vaibhav-antil-privado.mp3" length="40439713" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/7pnxdfhxye5pyd96g1lgtd0rocat?.jpg" />
    <itunes:author>Debra J Farber / Vaibhav Antil</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13838702</guid>
    <pubDate>Tue, 31 Oct 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838702/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838702/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838702/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13838702/transcript.vtt" type="text/vtt" />
    <podcast:soundbite startTime="0.0" duration="40.5" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13838702/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E33: &quot;Using Privacy Code Scans to Shift Left into DevOps&quot; with Vaibhav Antil (Privado)" />
  <psc:chapter start="1:34" title="Introducing Vaibhav Antil (&quot;Vee&quot;), Co-Founder &amp; CEO at Privado.ai" />
  <psc:chapter start="4:22" title="How Vee made the move from Product Manager to Co-Founding Privado, and why he chose to focus on Privacy Code Scanning" />
  <psc:chapter start="9:01" title="The meaning behind Privado&#39;s mantra: &quot;Bridging the Privacy Engineering Gap&quot; and the 3 reasons why there is a gap" />
  <psc:chapter start="16:57" title="How engineers can provide visibility into the personal data that applications are collecting &amp; using via Privacy Code Scans" />
  <psc:chapter start="22:57" title="Vee explains why &#39;shifting privacy left&#39; into DevOps is an imperative" />
  <psc:chapter start="28:44" title="Vee describes the purpose of Privacy Code Scanning &amp; how it differs from traditional static code analysis tools in security" />
  <psc:chapter start="32:10" title="Vee &amp; Debra compare how Privado&#39;s privacy code scanning &amp; data mapping capabilities (SDLC) differ from personal data discovery, correlation, data mapping tools - platforms like BigID, Secuvy, Securiti.ai, etc. (data lifecycle)" />
  <psc:chapter start="38:16" title="Vee discusses how privacy code scanning enables engineering teams to stay compliant with new laws like Washington State&#39;s My Health My Data Act" />
  <psc:chapter start="41:13" title="Vee shares info about Privado&#39;s free Technical Privacy Masterclass, led by Nishant Bhajaria" />
  <psc:chapter start="47:43" title="Vee discusses some exciting features on Privado&#39;s Product Roadmap" />
  <psc:chapter start="51:30" title="Trends that Vee is seeing as the Privacy Engineering Profession grows" />
</psc:chapters>
    <itunes:duration>3366</itunes:duration>
    <itunes:keywords>DevPrivOps, shift privacy left, privacy code scanning</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>33</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E32: &quot;Privacy Red Teams, Protecting People &amp; 23andme&#39;s Data Leak&quot; with Rebecca Balebako (Balebako Privacy Engineer)</itunes:title>
    <title>S2E32: &quot;Privacy Red Teams, Protecting People &amp; 23andme&#39;s Data Leak&quot; with Rebecca Balebako (Balebako Privacy Engineer)</title>
    <itunes:summary><![CDATA[This week’s guest is Rebecca Balebako, Founder and Principal Consultant at Balebako Privacy Engineer, where she enables data-driven organizations to build the privacy features that their customers love. In our conversation, we discuss all things privacy red teaming, including: how to disambiguate adversarial privacy tests from other software development tests; the importance of privacy-by-infrastructure; why privacy maturity influences the benefits received from investing in privacy red...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/rebecca-balebako-42b688/'>Rebecca Balebako</a>, Founder and Principal Consultant at <a href='https://www.privacyengineer.ch/'>Balebako Privacy Engineer</a>, where she enables data-driven organizations to build the privacy features that their customers love. In our conversation, we discuss all things privacy red teaming, including: how to disambiguate adversarial privacy tests from other software development tests; the importance of privacy-by-infrastructure; why privacy maturity influences the benefits received from investing in privacy red teaming; and why any database that identifies vulnerable populations should consider adversarial privacy as a form of protection. <br/><br/>We also discuss the 23andMe security incident that took place in October 2023 and affected over 1 million Ashkenazi Jews (a genealogical ethnic group). Rebecca brings to light how Privacy Red Teaming and privacy threat modeling may have prevented this incident. As we wrap up the episode, Rebecca gives her advice to Engineering Managers looking to set up a Privacy Red Team and shares key resources. 
<br/><br/><b>Topics Covered:</b></p><ul><li>How Rebecca switched from software development to a focus on privacy &amp; adversarial privacy testing</li><li>What motivated Debra to shift left from her legal training to privacy engineering</li><li>What &apos;adversarial privacy tests&apos; are; why they&apos;re important; and how they differ from other software development tests</li><li>Defining &apos;Privacy Red Teams&apos; (a type of adversarial privacy test) &amp; what differentiates them from &apos;Security Red Teams&apos;</li><li>Why Privacy Red Teams are best for orgs with mature privacy programs</li><li>The 3 steps for conducting a Privacy Red Team attack</li><li>How a Red Team differs from other privacy tests like conducting a vulnerability analysis or managing a bug bounty program</li><li>How 23andme&apos;s recent data leak, affecting 1 million Ashkenazi Jews, may have been avoided via Privacy Red Team testing</li><li>How BigTech companies are staffing up their Privacy Red Teams</li><li>Frugal ways for small and mid-sized organizations to approach adversarial privacy testing</li><li>The future of Privacy Red Teaming and whether we should upskill security engineers or train privacy engineers on adversarial testing</li><li>Advice for Engineering Managers who seek to set up a Privacy Red Team for the first time</li><li>Rebecca&apos;s Red Teaming resources for the audience</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Listen to: <a href='https://www.buzzsprout.com/2059470/11807121-s1e7-privacy-engineers-the-next-generation-with-lorrie-cranor-cmu'>&quot;S1E7: Privacy Engineers: The Next Generation&quot; with Lorrie Cranor (CMU)</a></li><li>Review Rebecca&apos;s <a href='https://www.privacyengineer.ch/shiftleft'>Red Teaming Resources</a> </li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Rebecca on <a href='https://www.linkedin.com/in/rebecca-balebako-42b688/'>LinkedIn</a></li><li>Visit <a href='https://www.privacyengineer.ch/'>Balebako Privacy Engineer&apos;s 
website</a></li></ul><p><br/></p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/rebecca-balebako-42b688/'>Rebecca Balebako</a>, Founder and Principal Consultant at <a href='https://www.privacyengineer.ch/'>Balebako Privacy Engineer</a>, where she enables data-driven organizations to build the privacy features that their customers love. In our conversation, we discuss all things privacy red teaming, including: how to disambiguate adversarial privacy tests from other software development tests; the importance of privacy-by-infrastructure; why privacy maturity influences the benefits received from investing in privacy red teaming; and why any database that identifies vulnerable populations should consider adversarial privacy as a form of protection. <br/><br/>We also discuss the 23andMe security incident that took place in October 2023 and affected over 1 million Ashkenazi Jews (a genealogical ethnic group). Rebecca brings to light how Privacy Red Teaming and privacy threat modeling may have prevented this incident. As we wrap up the episode, Rebecca gives her advice to Engineering Managers looking to set up a Privacy Red Team and shares key resources. 
<br/><br/><b>Topics Covered:</b></p><ul><li>How Rebecca switched from software development to a focus on privacy &amp; adversarial privacy testing</li><li>What motivated Debra to shift left from her legal training to privacy engineering</li><li>What &apos;adversarial privacy tests&apos; are; why they&apos;re important; and how they differ from other software development tests</li><li>Defining &apos;Privacy Red Teams&apos; (a type of adversarial privacy test) &amp; what differentiates them from &apos;Security Red Teams&apos;</li><li>Why Privacy Red Teams are best for orgs with mature privacy programs</li><li>The 3 steps for conducting a Privacy Red Team attack</li><li>How a Red Team differs from other privacy tests like conducting a vulnerability analysis or managing a bug bounty program</li><li>How 23andme&apos;s recent data leak, affecting 1 million Ashkenazi Jews, may have been avoided via Privacy Red Team testing</li><li>How BigTech companies are staffing up their Privacy Red Teams</li><li>Frugal ways for small and mid-sized organizations to approach adversarial privacy testing</li><li>The future of Privacy Red Teaming and whether we should upskill security engineers or train privacy engineers on adversarial testing</li><li>Advice for Engineering Managers who seek to set up a Privacy Red Team for the first time</li><li>Rebecca&apos;s Red Teaming resources for the audience</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Listen to: <a href='https://www.buzzsprout.com/2059470/11807121-s1e7-privacy-engineers-the-next-generation-with-lorrie-cranor-cmu'>&quot;S1E7: Privacy Engineers: The Next Generation&quot; with Lorrie Cranor (CMU)</a></li><li>Review Rebecca&apos;s <a href='https://www.privacyengineer.ch/shiftleft'>Red Teaming Resources</a> </li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Rebecca on <a href='https://www.linkedin.com/in/rebecca-balebako-42b688/'>LinkedIn</a></li><li>Visit <a href='https://www.privacyengineer.ch/'>Balebako Privacy Engineer&apos;s 
website</a></li></ul><p><br/></p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13816051-s2e32-privacy-red-teams-protecting-people-23andme-s-data-leak-with-rebecca-balebako-balebako-privacy-engineer.mp3" length="35312002" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/dryn1566aa7wy6gua9i86nag64k2?.jpg" />
    <itunes:author>Debra J. Farber (Shifting Privacy Left)</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13816051</guid>
    <pubDate>Tue, 24 Oct 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13816051/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13816051/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13816051/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13816051/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13816051/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E32: &quot;Privacy Red Teams, Protecting People &amp; 23andme&#39;s Data Leak&quot; with Rebecca Balebako (Balebako Privacy Engineer)" />
  <psc:chapter start="1:47" title="Introducing Rebecca Balebako, Founder and Principal Consultant at Balebako Privacy Engineer" />
  <psc:chapter start="3:25" title="How Rebecca switched from software development to a focus on privacy and adversarial privacy testing" />
  <psc:chapter start="4:42" title="What motivated Debra to shift left from her legal training to privacy engineering" />
  <psc:chapter start="8:15" title="What &#39;adversarial privacy tests&#39; are; why they&#39;re important; and how they differ from other software development tests" />
  <psc:chapter start="11:41" title="Defining &#39;Privacy Red Teams,&#39; one type of adversarial privacy test, and what differentiates them from &#39;Security Red Teams&#39;" />
  <psc:chapter start="16:58" title="Why Privacy Red Teams should primarily be used by organizations with mature privacy programs" />
  <psc:chapter start="21:03" title="The 3 steps to conducting a Privacy Red Team attack" />
  <psc:chapter start="24:14" title="How leveraging a Red Team differs from other privacy tests like conducting a vulnerability analysis or managing a bug bounty program" />
  <psc:chapter start="32:01" title="How 23andme&#39;s recent data leak, affecting 1 million Ashkenazi Jews, could have been avoided via Privacy Red Team testing" />
  <psc:chapter start="40:53" title="The trend where BigTech companies are staffing up their Privacy Red Teams" />
  <psc:chapter start="42:28" title="How small and mid-sized organizations with mature privacy programs can approach adversarial privacy testing" />
  <psc:chapter start="44:12" title="The future of Privacy Red Teaming and whether we should upskill security engineers or train privacy engineers on adversarial testing" />
  <psc:chapter start="45:19" title="Rebecca&#39;s advice for Engineering Managers who seek to set up a Privacy Red Team for the first time" />
  <psc:chapter start="46:29" title="Rebecca shares information about Red Teaming resources that she put together for the audience" />
</psc:chapters>
    <itunes:duration>2938</itunes:duration>
    <itunes:keywords></itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>32</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E31: &quot;Leveraging a Privacy Ontology to Scale Privacy Processes&quot; with Steve Hickman (Epistimis)</itunes:title>
    <title>S2E31: &quot;Leveraging a Privacy Ontology to Scale Privacy Processes&quot; with Steve Hickman (Epistimis)</title>
    <itunes:summary><![CDATA[This week’s guest is Steve Hickman, the founder of Epistimis, a privacy-first process design tooling startup that evaluates rules and enables fixing privacy issues before they ever take effect. In our conversation, we discuss: why the biggest impediment to protecting and respecting privacy within organizations is the lack of a common language; why we need a common Privacy Ontology in addition to a Privacy Taxonomy; Epistimis' ontological approach and how it leverages semantic modeling f...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/stevehickman/'>Steve Hickman</a>, the founder of <a href='https://www.epistimis.com/'>Epistimis</a>, a privacy-first process design tooling startup that evaluates rules and enables fixing privacy issues before they ever take effect. In our conversation, we discuss: why the biggest impediment to protecting and respecting privacy within organizations is the lack of a common language; why we need a common Privacy Ontology in addition to a Privacy Taxonomy; Epistimis&apos; ontological approach and how it leverages semantic modeling for privacy rules checking; and examples of how Epistimis Privacy Design Process tooling complements privacy tech solutions on the market rather than competing with them.</p><p><b>Topics Covered:</b></p><ul><li>How Steve’s deep engineering background in aerospace, retail, telecom, and then a short stint at Meta, led him to found Epistimis</li><li>Why it&apos;s been hard for companies to get privacy right at scale</li><li>How Epistimis leverages &apos;semantic modeling&apos; for rule checking and how this helps to scale privacy as part of an ontological approach</li><li>The definition of a Privacy Ontology and Steve&apos;s belief that all should use one for common understanding at all levels of the business</li><li>Advice for designers, architects, and developers when it comes to creating and implementing privacy ontology, taxonomies &amp; semantic models</li><li>How to make a Privacy Ontology usable</li><li>How Epistimis&apos; process design tooling works with discovery and mapping platforms like BigID &amp; Secuvy.ai</li><li>How Epistimis&apos; process design tooling works along with a platform like Privado.ai, which scans a company&apos;s product code and then surfaces privacy risks in the code and detects processing activities for creating dynamic data maps</li><li>How Epistimis&apos; process design tooling works with PrivacyCode, which has a library of privacy 
objects, agile privacy implementations (e.g., success criteria &amp; sample code), and delivers metrics on how the privacy engineering process is going</li><li>Steve calls for collaborators who are interested in POCs and/or who can provide feedback on Epistimis&apos; PbD processing tooling</li><li>Steve describes what&apos;s next on the Epistimis roadmap, including wargaming</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Read Dan Solove&apos;s article, &quot;<a href='https://deliverypdf.ssrn.com/delivery.php?ID=321102020088080121086092103017105069021063020068087078105085030089003088108097120119059058002121107109114094101106071114095016050009068002021115006114107021006104067078056073122025107116024076125070070072083119069004105007123020126004079072095074083&amp;EXT=pdf&amp;INDEX=TRUE'>Data is What Data Does: Regulating Based on Harm and Risk Instead of Sensitive Data</a>&quot;</li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Steve on <a href='https://www.linkedin.com/in/stevehickman/'>LinkedIn</a></li><li>Reach out to Steve via <a href='mailto:steve@epistimis.com'>Email</a></li><li>Learn more about <a href='https://www.epistimis.com/'>Epistimis</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/stevehickman/'>Steve Hickman</a>, the founder of <a href='https://www.epistimis.com/'>Epistimis</a>, a privacy-first process design tooling startup that evaluates rules and enables fixing privacy issues before they ever take effect. In our conversation, we discuss: why the biggest impediment to protecting and respecting privacy within organizations is the lack of a common language; why we need a common Privacy Ontology in addition to a Privacy Taxonomy; Epistimis&apos; ontological approach and how it leverages semantic modeling for privacy rules checking; and examples of how Epistimis Privacy Design Process tooling complements privacy tech solutions on the market rather than competing with them.</p><p><b>Topics Covered:</b></p><ul><li>How Steve’s deep engineering background in aerospace, retail, telecom, and then a short stint at Meta, led him to found Epistimis</li><li>Why it&apos;s been hard for companies to get privacy right at scale</li><li>How Epistimis leverages &apos;semantic modeling&apos; for rule checking and how this helps to scale privacy as part of an ontological approach</li><li>The definition of a Privacy Ontology and Steve&apos;s belief that all should use one for common understanding at all levels of the business</li><li>Advice for designers, architects, and developers when it comes to creating and implementing privacy ontology, taxonomies &amp; semantic models</li><li>How to make a Privacy Ontology usable</li><li>How Epistimis&apos; process design tooling works with discovery and mapping platforms like BigID &amp; Secuvy.ai</li><li>How Epistimis&apos; process design tooling works along with a platform like Privado.ai, which scans a company&apos;s product code and then surfaces privacy risks in the code and detects processing activities for creating dynamic data maps</li><li>How Epistimis&apos; process design tooling works with PrivacyCode, which has a library of 
privacy objects, agile privacy implementations (e.g., success criteria &amp; sample code), and delivers metrics on how the privacy engineering process is going</li><li>Steve calls for collaborators who are interested in POCs and/or who can provide feedback on Epistimis&apos; PbD processing tooling</li><li>Steve describes what&apos;s next on the Epistimis roadmap, including wargaming</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Read Dan Solove&apos;s article, &quot;<a href='https://deliverypdf.ssrn.com/delivery.php?ID=321102020088080121086092103017105069021063020068087078105085030089003088108097120119059058002121107109114094101106071114095016050009068002021115006114107021006104067078056073122025107116024076125070070072083119069004105007123020126004079072095074083&amp;EXT=pdf&amp;INDEX=TRUE'>Data is What Data Does: Regulating Based on Harm and Risk Instead of Sensitive Data</a>&quot;</li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Steve on <a href='https://www.linkedin.com/in/stevehickman/'>LinkedIn</a></li><li>Reach out to Steve via <a href='mailto:steve@epistimis.com'>Email</a></li><li>Learn more about <a href='https://www.epistimis.com/'>Epistimis</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13702869-s2e31-leveraging-a-privacy-ontology-to-scale-privacy-processes-with-steve-hickman-epistimis.mp3" length="37195731" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/1l7wyja360n2r9g1yo4piv20134l?.jpg" />
    <itunes:author>Debra J. Farber / Steve Hickman</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13702869</guid>
    <pubDate>Tue, 10 Oct 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13702869/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13702869/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13702869/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13702869/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13702869/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E31: &quot;Leveraging a Privacy Ontology to Scale Privacy Processes&quot; with Steve Hickman (Epistimis)" />
  <psc:chapter start="2:21" title="Introducing Steve Hickman, Founder of Epistimis: Privacy Process Design Tooling" />
  <psc:chapter start="7:14" title="Why it&#39;s been hard for companies to get privacy right at scale" />
  <psc:chapter start="12:25" title="How Epistimis leverages &#39;semantic modeling&#39; for rule checking and how this helps to scale privacy as part of an ontological approach" />
  <psc:chapter start="19:38" title="Definition of an &#39;ontology&#39; and why a privacy ontology is necessary for scaling privacy at large orgs" />
  <psc:chapter start="26:08" title="Steve&#39;s advice for designers, architects, and developers when it comes to creating and implementing a privacy ontology, taxonomy, and semantic model to get started in their orgs" />
  <psc:chapter start="34:46" title="Steve explains how Epistimis&#39; process design tooling works with discovery and mapping platforms like BigID &amp; Secuvy.ai" />
  <psc:chapter start="37:50" title="Steve explains how Epistimis&#39; process design tooling works along with a platform like Privado.ai, which scans a company&#39;s product code and then surfaces privacy risks in the code and detects processing activities for creating dynamic data maps" />
  <psc:chapter start="40:00" title="Steve explains how Epistimis&#39; process design tooling works w/ PrivacyCode, which has a library of privacy objects, agile privacy implementations (e.g., success criteria &amp; sample code), and delivers metrics on how the privacy engineering process is going" />
  <psc:chapter start="42:09" title="Steve calls for collaborators who are interested in POCs and/or who can provide feedback on Epistimis&#39; PbD processing tooling" />
  <psc:chapter start="44:34" title="Steve describes what&#39;s next on the Epistimis roadmap, including wargaming capabilities" />
</psc:chapters>
    <itunes:duration>3095</itunes:duration>
    <itunes:keywords>privacy ontology, privacy taxonomy, privacy by design, privacy design process, PbD, scaling privacy</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>31</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E30: &quot;LLMs, Knowledge Graphs, &amp; GenAI Architectural Considerations&quot; with Shashank Tiwari (Uno)</itunes:title>
    <title>S2E30: &quot;LLMs, Knowledge Graphs, &amp; GenAI Architectural Considerations&quot; with Shashank Tiwari (Uno)</title>
    <itunes:summary><![CDATA[This week's guest is Shashank Tiwari, a seasoned engineer and product leader who started with algorithmic systems on Wall Street before becoming Co-founder &amp; CEO of Uno.ai, a pathbreaking autonomous security company. He then transitioned to building Silicon Valley startups, including previous stints at Nutanix, Elementum, Medallia, &amp; StackRox. In this conversation, we discuss ML/AI, large language models (LLMs), temporal knowledge gr...]]></itunes:summary>
    <description><![CDATA[<p>This week&apos;s guest is <a href='https://www.linkedin.com/in/tshanky/'>Shashank Tiwari</a>, a seasoned engineer and product leader who started with algorithmic systems on Wall Street before becoming Co-founder &amp; CEO of <a href='https://uno.ai/'>Uno.ai</a>, a pathbreaking autonomous security company. He then transitioned to building Silicon Valley startups, including previous stints at Nutanix, Elementum, Medallia, &amp; StackRox. In this conversation, we discuss ML/AI, large language models (LLMs), temporal knowledge graphs, causal discovery inference models, and the Generative AI design &amp; architectural choices that affect privacy. <br/><br/><b>Topics Covered</b>:</p><ul><li>Shashank describes his origin story, how he became interested in security, privacy, &amp; AI while working on Wall Street; &amp; what motivated him to found Uno</li><li>The benefits of using &quot;temporal knowledge graphs,&quot; and how knowledge graphs are used with LLMs to create a &quot;causal discovery inference model&quot; to prevent privacy problems</li><li>The explosive growth of Generative AI, its impact on the privacy and confidentiality of sensitive and personal data, &amp; why a rushed approach could result in mistakes and societal harm</li><li>Architectural privacy and security considerations for: 1) leveraging Generative AI and which mechanisms to avoid at all costs; 2) verifying, assuring, &amp; testing against &quot;trustful data&quot; rather than &quot;derived data;&quot; and 3) thwarting common Generative AI attack vectors</li><li>Shashank&apos;s predictions for Enterprise adoption of Generative AI over the next several years</li><li>Shashank&apos;s thoughts on how proposed and future AI-related legislation may affect the Generative AI market overall and Enterprise adoption more specifically</li><li>Shashank&apos;s thoughts on the development of AI standards across tech 
stacks<br/><br/></li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out episode <a href='https://podcasts.apple.com/us/podcast/s2e29-synthetic-data-in-ai-challenges-techniques-use/id1651019312?i=1000629202559'>S2E29: Synthetic Data in AI: Challenges, Techniques &amp; Use Cases with Andrew Clark and Sid Mangalik (Monitaur.ai)</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Shashank on <a href='https://www.linkedin.com/in/tshanky/'>LinkedIn</a></li><li>Learn more about <a href='https://uno.ai/'>Uno.ai</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week&apos;s guest is <a href='https://www.linkedin.com/in/tshanky/'>Shashank Tiwari</a>, a seasoned engineer and product leader who started with algorithmic systems on Wall Street before becoming Co-founder &amp; CEO of <a href='https://uno.ai/'>Uno.ai</a>, a pathbreaking autonomous security company. He then transitioned to building Silicon Valley startups, including previous stints at Nutanix, Elementum, Medallia, &amp; StackRox. In this conversation, we discuss ML/AI, large language models (LLMs), temporal knowledge graphs, causal discovery inference models, and the Generative AI design &amp; architectural choices that affect privacy. <br/><br/><b>Topics Covered</b>:</p><ul><li>Shashank describes his origin story, how he became interested in security, privacy, &amp; AI while working on Wall Street; &amp; what motivated him to found Uno</li><li>The benefits of using &quot;temporal knowledge graphs,&quot; and how knowledge graphs are used with LLMs to create a &quot;causal discovery inference model&quot; to prevent privacy problems</li><li>The explosive growth of Generative AI, its impact on the privacy and confidentiality of sensitive and personal data, &amp; why a rushed approach could result in mistakes and societal harm</li><li>Architectural privacy and security considerations for: 1) leveraging Generative AI and which mechanisms to avoid at all costs; 2) verifying, assuring, &amp; testing against &quot;trustful data&quot; rather than &quot;derived data;&quot; and 3) thwarting common Generative AI attack vectors</li><li>Shashank&apos;s predictions for Enterprise adoption of Generative AI over the next several years</li><li>Shashank&apos;s thoughts on how proposed and future AI-related legislation may affect the Generative AI market overall and Enterprise adoption more specifically</li><li>Shashank&apos;s thoughts on the development of AI standards across 
tech stacks<br/><br/></li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out episode <a href='https://podcasts.apple.com/us/podcast/s2e29-synthetic-data-in-ai-challenges-techniques-use/id1651019312?i=1000629202559'>S2E29: Synthetic Data in AI: Challenges, Techniques &amp; Use Cases with Andrew Clark and Sid Mangalik (Monitaur.ai)</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Shashank on <a href='https://www.linkedin.com/in/tshanky/'>LinkedIn</a></li><li>Learn more about <a href='https://uno.ai/'>Uno.ai</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13680455-s2e30-llms-knowledge-graphs-genai-architectural-considerations-with-shashank-tiwari-uno.mp3" length="43476129" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/x0kqjg8nq46p4w0hmxu6z2dyppw5?.jpg" />
    <itunes:author>Debra J. Farber / Shashank Tiwari</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13680455</guid>
    <pubDate>Tue, 03 Oct 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13680455/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13680455/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13680455/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13680455/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13680455/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E30: &quot;LLMs, Knowledge Graphs, &amp; GenAI Architectural Considerations&quot; with Shashank Tiwari (Uno)" />
  <psc:chapter start="2:44" title="Shashank describes his origin story, why he became interested in security, privacy, &amp; AI; and what motivated him to found Uno" />
  <psc:chapter start="5:41" title="The benefits of using &quot;temporal knowledge graphs;&quot; how knowledge graphs are used with LLMs to create what Shashank calls &quot;causal discovery;&quot; and how this approach helps privacy" />
  <psc:chapter start="14:02" title="Shashank&#39;s thoughts on how the explosive growth of Generative AI has impacted privacy &amp; confidentiality" />
  <psc:chapter start="22:39" title="Architectural choices to consider when leveraging Generative AI, and which mechanisms to avoid at all costs" />
  <psc:chapter start="30:06" title="Architectural considerations for verifying, assuring, &amp; testing against &quot;trustful data;&quot; and what makes data &quot;trustful&quot;" />
  <psc:chapter start="37:09" title="Architectural considerations for thwarting common Generative AI attack vectors" />
  <psc:chapter start="45:36" title="Shashank&#39;s predictions for Enterprise adoption of Generative AI over the next few years" />
  <psc:chapter start="52:14" title="Shashank&#39;s thoughts on how proposed and future AI-related legislation may affect the Generative AI market overall and Enterprise adoption more specifically" />
  <psc:chapter start="56:16" title="Shashank&#39;s thoughts on the development of AI standards across tech stacks" />
</psc:chapters>
    <itunes:duration>3619</itunes:duration>
    <itunes:keywords>generative AI, GenAI, knowledge graphs, causal inference, AI, LLMs, trustful data</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>30</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E29 - &quot;Synthetic Data in AI: Challenges, Techniques &amp; Use Cases&quot; with Andrew Clark and Sid Mangalik (Monitaur)</itunes:title>
    <title>S2E29 - &quot;Synthetic Data in AI: Challenges, Techniques &amp; Use Cases&quot; with Andrew Clark and Sid Mangalik (Monitaur)</title>
    <itunes:summary><![CDATA[This week I welcome Dr. Andrew Clark, Co-founder &amp; CTO of Monitaur, a trusted domain expert on the topic of machine learning, auditing and assurance; and Sid Mangalik, Research Scientist at Monitaur and PhD student at Stony Brook University. I discovered Andrew and Sid's new podcast show, The AI Fundamentalists Podcast. I very much enjoyed their lively episode on Synthetic Data &amp; AI, and am delighted to introduce them to my audience of privacy engineers.   In our conversation, we expl...]]></itunes:summary>
    <description><![CDATA[<p>This week I welcome <a href='https://www.linkedin.com/in/andrew-clark-b326b767/'>Dr. Andrew Clark</a>, Co-founder &amp; CTO of Monitaur, a trusted domain expert on the topic of machine learning, auditing and assurance; and <a href='https://www.linkedin.com/in/sid-mangalik/'>Sid Mangalik</a>, Research Scientist at Monitaur and PhD student at Stony Brook University. I discovered Andrew and Sid&apos;s new podcast show, <a href='https://theaifundamentalists.buzzsprout.com/2186686'>The AI Fundamentalists Podcast</a>. I very much enjoyed their lively episode on Synthetic Data &amp; AI, and am delighted to introduce them to my audience of privacy engineers. <br/><br/>In our conversation, we explore why data scientists must stress test their model validations, especially for consequential systems that affect human safety and reliability. In fact, we have much to learn from the aerospace engineering field, which has been using ML/AI since the 1960s. We discuss the best and worst use cases for synthetic data; problems with LLM-generated synthetic data; what can go wrong when your AI models lack diversity; how to build fair, performant systems; &amp; synthetic data techniques for use with AI.<br/><br/><b>Topics Covered:</b></p><ul><li>What inspired Andrew to found Monitaur and focus on AI governance</li><li>Sid’s career path and his current PhD focus on NLP</li><li>What motivated Andrew &amp; Sid to launch their podcast, The AI Fundamentalists</li><li>Defining &apos;synthetic data&apos; &amp; why academia takes a more rigorous approach to synthetic data than industry</li><li>Whether the output of LLMs is synthetic data &amp; the problem with training LLM base models with this data</li><li>The best and worst &apos;synthetic data&apos; use cases for ML/AI</li><li>Why the &apos;quality&apos; of input data is so important when training AI models </li><li>Thoughts on OpenAI&apos;s announcement that it will use LLM-generated synthetic data; 
and critique of OpenAI&apos;s approach, the AI hype machine, and the problems with &apos;growth hacking&apos; corner-cutting</li><li>The importance of diversity when training AI models; using &apos;multi-objective modeling&apos; for building fair &amp; performant systems</li><li>Andrew unpacks the &quot;fairness through unawareness fallacy&quot;</li><li>How &apos;randomized data&apos; differs from &apos;synthetic data&apos;</li><li>4 techniques for using synthetic data with ML/AI: 1) the Monte Carlo method; 2) Latin hypercube sampling; 3) gaussian copulas; &amp; 4) random walking</li><li>What excites Andrew &amp; Sid about synthetic data and how it will be used with AI in the future</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out <a href='https://www.podchaser.com/'>Podchaser</a> </li><li>Listen to <a href='https://theaifundamentalists.buzzsprout.com/'>The AI Fundamentalists Podcast</a></li><li>Check out <a href='https://www.monitaur.ai/'>Monitaur</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow <a href='https://www.linkedin.com/in/andrew-clark-b326b767/'>Andrew on LinkedIn</a></li><li>Follow <a href='https://www.linkedin.com/in/sid-mangalik/'>Sid on LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week I welcome <a href='https://www.linkedin.com/in/andrew-clark-b326b767/'>Dr. Andrew Clark</a>, Co-founder &amp; CTO of Monitaur, a trusted domain expert on the topic of machine learning, auditing and assurance; and <a href='https://www.linkedin.com/in/sid-mangalik/'>Sid Mangalik</a>, Research Scientist at Monitaur and PhD student at Stony Brook University. I discovered Andrew and Sid&apos;s new podcast show, <a href='https://theaifundamentalists.buzzsprout.com/2186686'>The AI Fundamentalists Podcast</a>. I very much enjoyed their lively episode on Synthetic Data &amp; AI, and am delighted to introduce them to my audience of privacy engineers. <br/><br/>In our conversation, we explore why data scientists must stress test their model validations, especially for consequential systems that affect human safety and reliability. In fact, we have much to learn from the aerospace engineering field, which has been using ML/AI since the 1960s. We discuss the best and worst use cases for synthetic data; problems with LLM-generated synthetic data; what can go wrong when your AI models lack diversity; how to build fair, performant systems; &amp; synthetic data techniques for use with AI.<br/><br/><b>Topics Covered:</b></p><ul><li>What inspired Andrew to found Monitaur and focus on AI governance</li><li>Sid’s career path and his current PhD focus on NLP</li><li>What motivated Andrew &amp; Sid to launch their podcast, The AI Fundamentalists</li><li>Defining &apos;synthetic data&apos; &amp; why academia takes a more rigorous approach to synthetic data than industry</li><li>Whether the output of LLMs is synthetic data &amp; the problem with training LLM base models with this data</li><li>The best and worst &apos;synthetic data&apos; use cases for ML/AI</li><li>Why the &apos;quality&apos; of input data is so important when training AI models </li><li>Thoughts on OpenAI&apos;s announcement that it will use LLM-generated synthetic 
data; and critique of OpenAI&apos;s approach, the AI hype machine, and the problems with &apos;growth hacking&apos; corner-cutting</li><li>The importance of diversity when training AI models; using &apos;multi-objective modeling&apos; for building fair &amp; performant systems</li><li>Andrew unpacks the &quot;fairness through unawareness fallacy&quot;</li><li>How &apos;randomized data&apos; differs from &apos;synthetic data&apos;</li><li>4 techniques for using synthetic data with ML/AI: 1) the Monte Carlo method; 2) Latin hypercube sampling; 3) gaussian copulas; &amp; 4) random walking</li><li>What excites Andrew &amp; Sid about synthetic data and how it will be used with AI in the future</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out <a href='https://www.podchaser.com/'>Podchaser</a> </li><li>Listen to <a href='https://theaifundamentalists.buzzsprout.com/'>The AI Fundamentalists Podcast</a></li><li>Check out <a href='https://www.monitaur.ai/'>Monitaur</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow <a href='https://www.linkedin.com/in/andrew-clark-b326b767/'>Andrew on LinkedIn</a></li><li>Follow <a href='https://www.linkedin.com/in/sid-mangalik/'>Sid on LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13634980-s2e29-synthetic-data-in-ai-challenges-techniques-use-cases-with-andrew-clark-and-sid-mangalik-monitaur.mp3" length="39319416" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/cvk32p5eng931jmcpdxmrdo0jvbd?.jpg" />
    <itunes:author>Debra J. Farber / Andrew Clark and Sid Mangalik</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13634980</guid>
    <pubDate>Tue, 26 Sep 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13634980/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13634980/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13634980/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13634980/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13634980/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E29 - &quot;Synthetic Data in AI: Challenges, Techniques &amp; Use Cases&quot; with Andrew Clark and Sid Mangalik (Monitaur)" />
  <psc:chapter start="1:47" title="Introducing Andrew Clark &amp; Sid Mangalik" />
  <psc:chapter start="4:06" title="What motivated Andrew to found Monitaur and focus on AI governance" />
  <psc:chapter start="9:09" title="Sid shares his career path, why he chose to focus on AI governance, and how he ended up at Monitaur" />
  <psc:chapter start="11:54" title="What motivated Andrew &amp; Sid to launch their own podcast show, The AI Fundamentalists, &amp; their intended audience" />
  <psc:chapter start="14:43" title="The definition of &#39;synthetic data&#39; and why academia takes a more rigorous approach to deploying and testing synthetic data than industry does" />
  <psc:chapter start="16:56" title="Whether the output of LLMs is synthetic data and the problem with continuing to train LLM base models with this data" />
  <psc:chapter start="22:34" title="What &#39;synthetic data&#39; use cases are most helpful when it comes to AI, and which ones are the most unhelpful?" />
  <psc:chapter start="26:59" title="Andrew &amp; Sid discuss why the &#39;quality&#39; of input data is so important for training AI models; and discussion of OpenAI&#39;s announcement that it plans to use LLM-generated synthetic data" />
  <psc:chapter start="29:48" title="Andrew &amp; Sid critique OpenAI&#39;s approach, the AI hype machine, and the problems with cutting corners via &#39;growth hacking&#39;" />
  <psc:chapter start="33:43" title="Andrew emphasizes the importance of diversity when training AI models and using &#39;multi-objective modeling&#39;" />
  <psc:chapter start="41:53" title="Andrew unpacks the &quot;fairness through unawareness fallacy&quot; for us" />
  <psc:chapter start="44:27" title="Sid explains the difference between using &#39;randomized data&#39; and &#39;synthetic data&#39; with a fun example" />
  <psc:chapter start="45:11" title="Andrew &amp; Sid describe 4 techniques for using synthetic data with ML/AI: 1) the Monte Carlo method; 2) Latin hypercube sampling; 3) gaussian copulas; &amp; 4) random walking" />
  <psc:chapter start="50:46" title="Andrew &amp; Sid describe what they are each most excited about when it comes to synthetic data and how it will be used in the future" />
</psc:chapters>
    <itunes:duration>3272</itunes:duration>
    <itunes:keywords>synthetic data, multi-objective modeling, model validation, stress testing, consequential systems</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>29</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E28: &quot;BigTech Privacy; Responsible AI; and Bias Bounties at DEF CON&quot; with Jutta Williams (Reddit)</itunes:title>
    <title>S2E28: &quot;BigTech Privacy; Responsible AI; and Bias Bounties at DEF CON&quot; with Jutta Williams (Reddit)</title>
    <itunes:summary><![CDATA[This week, I welcome Jutta Williams, Head of Privacy &amp; Assurance at Reddit, Co-founder of Humane Intelligence and BiasBounty.ai, Privacy &amp; Responsible AI Evangelist, and Startup Board Advisor. With a long history of accomplishments in privacy engineering, Jutta has a unique perspective on the growing field.  In our conversation, we discuss her transition from security engineering to privacy engineering; how privacy cultures differ across social media companies where she's worked: Goog...]]></itunes:summary>
    <description><![CDATA[<p>This week, I welcome <a href='https://www.linkedin.com/in/juttawms/'>Jutta Williams</a>, Head of Privacy &amp; Assurance at Reddit, Co-founder of Humane Intelligence and BiasBounty.ai, Privacy &amp; Responsible AI Evangelist, and Startup Board Advisor. With a long history of accomplishments in privacy engineering, Jutta has a unique perspective on the growing field.<br/><br/>In our conversation, we discuss her transition from security engineering to privacy engineering; how privacy cultures differ across social media companies where she&apos;s worked: Google, Facebook, Twitter, and now Reddit; the overlap of privacy engineering &amp; responsible AI; how her non-profit, Humane Intelligence, supports AI model owners; her experience launching the largest Generative AI Red Teaming challenge ever at DEF CON; and how a curious, knowledge-enhancing approach to privacy will create engagement and allow for fun. <br/><br/><b>Topics Covered:</b></p><ul><li>How Jutta’s unique transition from security engineering landed her in the privacy engineering space. </li><li>A comparison of privacy cultures across Google, Facebook, Twitter (now &apos;X&apos;), and Reddit based on her privacy engineering experiences there.</li><li>Two open Privacy Engineering roles at Reddit, and Jutta&apos;s advice for those wanting to transition from security engineering to privacy engineering.</li><li>Whether Privacy Pros will be responsible for owning new regulatory obligations under the EU&apos;s Digital Services Act (DSA) &amp; the Digital Markets Act (DMA); and the role of the Privacy Engineer when overlapping with Responsible AI issues</li><li>Humane Intelligence, Jutta&apos;s &apos;side quest,&apos; which she co-leads with Dr. 
Rumman Chowdhury, and which supports AI model owners seeking &apos;Product Readiness Reviews&apos; at scale.</li><li>When, during the product development life cycle, companies should perform &apos;AI Readiness Reviews&apos;</li><li>How to de-bias at scale, and whether attempting to do so is &apos;chasing windmills&apos;</li><li>Who should be hunting for biases in an AI Bias Bounty challenge</li><li>DEF CON 31&apos;s AI Village&apos;s &apos;Generative AI Red Teaming Challenge,&apos; which was a bias bounty that she co-designed; lessons learned; and what Jutta &amp; team have planned for DEF CON 32 next year</li><li>Why it&apos;s so important for people to &apos;love their side quests&apos;</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li><a href='https://aivillage.org/generative%20red%20team/generative-red-team'>DEF CON Generative Red Team Challenge</a></li><li><a href='https://www.humane-intelligence.org/'>Humane Intelligence</a></li><li><a href='https://www.biasbuccaneers.org/'>Bias Buccaneers Challenge</a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Jutta on <a href='https://www.linkedin.com/in/juttawms/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I welcome <a href='https://www.linkedin.com/in/juttawms/'>Jutta Williams</a>, Head of Privacy &amp; Assurance at Reddit, Co-founder of Humane Intelligence and BiasBounty.ai, Privacy &amp; Responsible AI Evangelist, and Startup Board Advisor. With a long history of accomplishments in privacy engineering, Jutta has a unique perspective on the growing field.<br/><br/>In our conversation, we discuss her transition from security engineering to privacy engineering; how privacy cultures differ across social media companies where she&apos;s worked: Google, Facebook, Twitter, and now Reddit; the overlap of privacy engineering &amp; responsible AI; how her non-profit, Humane Intelligence, supports AI model owners; her experience launching the largest Generative AI Red Teaming challenge ever at DEF CON; and how a curious, knowledge-enhancing approach to privacy will create engagement and allow for fun. <br/><br/><b>Topics Covered:</b></p><ul><li>How Jutta’s unique transition from security engineering landed her in the privacy engineering space. </li><li>A comparison of privacy cultures across Google, Facebook, Twitter (now &apos;X&apos;), and Reddit based on her privacy engineering experiences there.</li><li>Two open Privacy Engineering roles at Reddit, and Jutta&apos;s advice for those wanting to transition from security engineering to privacy engineering.</li><li>Whether Privacy Pros will be responsible for owning new regulatory obligations under the EU&apos;s Digital Services Act (DSA) &amp; the Digital Markets Act (DMA); and the role of the Privacy Engineer when overlapping with Responsible AI issues</li><li>Humane Intelligence, Jutta&apos;s &apos;side quest,&apos; which she co-leads with Dr. 
Rumman Chowdhury, and which supports AI model owners seeking &apos;Product Readiness Reviews&apos; at scale.</li><li>When, during the product development life cycle, companies should perform &apos;AI Readiness Reviews&apos;</li><li>How to de-bias at scale, and whether attempting to do so is &apos;chasing windmills&apos;</li><li>Who should be hunting for biases in an AI Bias Bounty challenge</li><li>DEF CON 31&apos;s AI Village&apos;s &apos;Generative AI Red Teaming Challenge,&apos; which was a bias bounty that she co-designed; lessons learned; and what Jutta &amp; team have planned for DEF CON 32 next year</li><li>Why it&apos;s so important for people to &apos;love their side quests&apos;</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li><a href='https://aivillage.org/generative%20red%20team/generative-red-team'>DEF CON Generative Red Team Challenge</a></li><li><a href='https://www.humane-intelligence.org/'>Humane Intelligence</a></li><li><a href='https://www.biasbuccaneers.org/'>Bias Buccaneers Challenge</a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Jutta on <a href='https://www.linkedin.com/in/juttawms/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13507510-s2e28-bigtech-privacy-responsible-ai-and-bias-bounties-at-def-con-with-jutta-williams-reddit.mp3" length="39611823" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/0mi40g0kar5w72spoi5qw3moc4jm?.jpg" />
    <itunes:author>Debra J. Farber / Jutta Williams</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13507510</guid>
    <pubDate>Tue, 19 Sep 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13507510/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13507510/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13507510/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13507510/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13507510/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E28: &quot;BigTech Privacy; Responsible AI; and Bias Bounties at DEF CON&quot; with Jutta Williams (Reddit)" />
  <psc:chapter start="2:19" title="Introducing Jutta Williams, Head of Privacy &amp; Assurance at Reddit; and how Jutta moved from security engineering to privacy engineering" />
  <psc:chapter start="8:42" title="Jutta compares the different &#39;Privacy Cultures&#39; across Google, Facebook, Twitter, &amp; Reddit based on her privacy engineering experiences there." />
  <psc:chapter start="19:50" title="Jutta&#39;s advice for security engineers who want to transition to privacy engineering or expand their roles to take on some privacy engineering activities" />
  <psc:chapter start="25:55" title="Jutta&#39;s thoughts on whether Privacy Pros will be responsible for owning new regulatory obligations under the EU&#39;s Digital Services Act (DSA) &amp; the Digital Markets Act (DMA); and the role of the Privacy Engineer when overlapping with Responsible AI issues" />
  <psc:chapter start="30:28" title="Jutta talks about her side quest, Humane Intelligence, which she co-leads with Dr. Rumman Chowdhury and which supports AI model owners seeking product readiness reviews at scale." />
  <psc:chapter start="36:07" title="Jutta explains when, during the product development life cycle, companies should perform &#39;AI Readiness Reviews&#39;" />
  <psc:chapter start="37:33" title="Jutta shares her wisdom on how to de-bias at scale and whether attempting to do so is &#39;chasing windmills&#39;" />
  <psc:chapter start="40:29" title="Jutta explains who should be hunting for biases in an AI bias bounty challenge" />
  <psc:chapter start="41:55" title="Jutta describes the DEF CON AI Village, Generative AI Red Teaming Challenge, which was a bias bounty that she co-designed" />
  <psc:chapter start="47:55" title="Lessons learned from this year&#39;s DEF CON Generative AI Red Teaming Challenge, and what Jutta &amp; team have planned for next year" />
  <psc:chapter start="52:46" title="Jutta shares why it&#39;s so important for people to &#39;love their side quests.&#39;" />
</psc:chapters>
    <itunes:duration>3297</itunes:duration>
    <itunes:keywords>bias bounty, bias bounties, red teaming, privacy engineer, privacy engineering, responsible AI, ethical AI, trust</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>28</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E27: &quot;Automated Privacy Decisions: Usability vs. Lawfulness&quot; with Simone Fischer-Hübner &amp; Victor Morel</itunes:title>
    <title>S2E27: &quot;Automated Privacy Decisions: Usability vs. Lawfulness&quot; with Simone Fischer-Hübner &amp; Victor Morel</title>
    <itunes:summary><![CDATA[Today, I welcome Victor Morel, PhD and Simone Fischer-Hübner, PhD to discuss their recent paper, "Automating Privacy Decisions – where to draw the line?" and their proposed classification scheme. We dive into the complexity of automating privacy decisions and emphasize the importance of maintaining both compliance and usability (e.g., via user control and informed consent). Simone is a Professor of Computer Science at Karlstad University with over 30 years of privacy &amp; security research e...]]></itunes:summary>
    <description><![CDATA[<p>Today, I welcome <a href='https://victor-morel.net/'>Victor Morel, PhD</a> and <a href='https://www.kau.se/en/researchers/simone-fischer-hubner'>Simone Fischer-Hübner, PhD</a> to discuss their recent paper, &quot;Automating Privacy Decisions – where to draw the line?&quot; and their proposed classification scheme. We dive into the complexity of automating privacy decisions and emphasize the importance of maintaining both compliance and usability (e.g., via user control and informed consent). Simone is a Professor of Computer Science at Karlstad University with over 30 years of privacy &amp; security research experience. Victor is a post-doc researcher at Chalmers University&apos;s Security &amp; Privacy Lab, focusing on privacy, data protection, and technology ethics.</p><p>Together, they share their privacy decision-making classification scheme and research across two dimensions: (1) the type of privacy decisions: privacy permissions, privacy preference settings, consent to processing, or rejection to processing; and (2) the level of decision automation: manual, semi-automated, or fully-automated. Each type of privacy decision plays a critical role in users&apos; ability to control the disclosure and processing of their personal data. They emphasize the significance of tailored recommendations to help users make informed decisions and discuss the potential of on-the-fly privacy decisions. We wrap up with organizations&apos; approaches to achieving usable and transparent privacy across various technologies, including web, mobile, and IoT. 
</p><p><br/><b>Topics Covered</b>:</p><ul><li>Why Simone &amp; Victor focused their research on automating privacy decisions </li><li>How GDPR &amp; ePrivacy have shaped requirements for privacy automation tools</li><li>The &apos;types&apos; of privacy decisions &amp; associated &apos;levels of automation&apos;: privacy permissions, privacy preference settings, consent to processing, &amp; rejection to processing</li><li>The &apos;levels of automation&apos; for each privacy decision type: manual, semi-automated &amp; fully-automated; and the pros / cons of automating each privacy decision type</li><li>Preferences &amp; concerns regarding IoT Trigger Action Platforms</li><li>Why the only privacy decisions that you should &apos;fully automate&apos; are the rejection of processing: i.e., revoking consent or opting out</li><li>Best practices for achieving informed control</li><li>Automation challenges across web, mobile, &amp; IoT</li><li>Mozilla&apos;s automated cookie banner management &amp; why it&apos;s problematic (i.e., unlawful)</li></ul><p><b>Resources Mentioned</b>:</p><ul><li><a href='https://arxiv.org/pdf/2305.08747.pdf'>&quot;Automating Privacy Decisions – where to draw the line?&quot;</a></li><li><a href='https://www.cse.chalmers.se/research/group/security/cybersecit/'>CyberSecIT</a> at Chalmers University of Technology</li><li><a href='https://arxiv.org/pdf/2308.06148.pdf'>&quot;Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms&quot;</a></li><li><a href='https://consentomatic.au.dk/'>Consent O Matic </a>browser extension</li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Today, I welcome <a href='https://victor-morel.net/'>Victor Morel, PhD</a> and <a href='https://www.kau.se/en/researchers/simone-fischer-hubner'>Simone Fischer-Hübner, PhD</a> to discuss their recent paper, &quot;Automating Privacy Decisions – where to draw the line?&quot; and their proposed classification scheme. We dive into the complexity of automating privacy decisions and emphasize the importance of maintaining both compliance and usability (e.g., via user control and informed consent). Simone is a Professor of Computer Science at Karlstad University with over 30 years of privacy &amp; security research experience. Victor is a post-doc researcher at Chalmers University&apos;s Security &amp; Privacy Lab, focusing on privacy, data protection, and technology ethics.</p><p>Together, they share their privacy decision-making classification scheme and research across two dimensions: (1) the type of privacy decisions: privacy permissions, privacy preference settings, consent to processing, or rejection to processing; and (2) the level of decision automation: manual, semi-automated, or fully-automated. Each type of privacy decision plays a critical role in users&apos; ability to control the disclosure and processing of their personal data. They emphasize the significance of tailored recommendations to help users make informed decisions and discuss the potential of on-the-fly privacy decisions. We wrap up with organizations&apos; approaches to achieving usable and transparent privacy across various technologies, including web, mobile, and IoT. 
</p><p><br/><b>Topics Covered</b>:</p><ul><li>Why Simone &amp; Victor focused their research on automating privacy decisions </li><li>How GDPR &amp; ePrivacy have shaped requirements for privacy automation tools</li><li>The &apos;types&apos; of privacy decisions &amp; associated &apos;levels of automation&apos;: privacy permissions, privacy preference settings, consent to processing, &amp; rejection to processing</li><li>The &apos;levels of automation&apos; for each privacy decision type: manual, semi-automated &amp; fully-automated; and the pros / cons of automating each privacy decision type</li><li>Preferences &amp; concerns regarding IoT Trigger Action Platforms</li><li>Why the only privacy decisions that you should &apos;fully automate&apos; are the rejection of processing: i.e., revoking consent or opting out</li><li>Best practices for achieving informed control</li><li>Automation challenges across web, mobile, &amp; IoT</li><li>Mozilla&apos;s automated cookie banner management &amp; why it&apos;s problematic (i.e., unlawful)</li></ul><p><b>Resources Mentioned</b>:</p><ul><li><a href='https://arxiv.org/pdf/2305.08747.pdf'>&quot;Automating Privacy Decisions – where to draw the line?&quot;</a></li><li><a href='https://www.cse.chalmers.se/research/group/security/cybersecit/'>CyberSecIT</a> at Chalmers University of Technology</li><li><a href='https://arxiv.org/pdf/2308.06148.pdf'>&quot;Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms&quot;</a></li><li><a href='https://consentomatic.au.dk/'>Consent O Matic </a>browser extension</li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13499257-s2e27-automated-privacy-decisions-usability-vs-lawfulness-with-simone-fischer-hubner-victor-morel.mp3" length="31946530" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/wvcmwsn68dsoy5il7lkx24gdj0j3?.jpg" />
    <itunes:author>Debra J Farber / Simone Fischer-Hübner &amp; Victor Morel</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13499257</guid>
    <pubDate>Tue, 12 Sep 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13499257/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13499257/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13499257/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13499257/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13499257/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E27: &quot;Automated Privacy Decisions: Usability vs. Lawfulness&quot; with Simone Fischer-Hübner &amp; Victor Morel" />
  <psc:chapter start="1:45" title="Introducing Victor Morel, PhD and Professor Simone Fischer-Hübner, PhD" />
  <psc:chapter start="4:03" title="What motivated Victor &amp; Simone to focus their research on the automation of privacy decisions and publish their paper, &quot;Automating Privacy Decisions – where to draw the line?&quot;" />
  <psc:chapter start="9:21" title="Discussion on the Types of Privacy Decisions identified and Levels of Automation of those decisions - determining whether each is lawful and provides individuals with meaningful control (usability)" />
  <psc:chapter start="16:42" title="Simone&#39;s &amp; Victor&#39;s findings around the different levels of automation for each type of privacy decision: manual, semi-automated, and fully-automated" />
  <psc:chapter start="22:54" title="Victor describes preferences and concerns regarding IoT Trigger Action Platforms, and his recent co-authored paper, &quot;Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms&quot;" />
  <psc:chapter start="26:00" title="We discuss under which conditions organizations should enable the automation of privacy decisions while complying with regulations, and which ones they should not" />
  <psc:chapter start="28:27" title="We discuss best practices for informed control" />
  <psc:chapter start="32:02" title="Simone &amp; Victor explain how organizations should think about achieving usable and transparent privacy with automation across technologies through a comprehensive approach" />
  <psc:chapter start="35:40" title="Victor explains the next steps for their research, which will focus on the lawfulness and usability issues of automating privacy decisions in the context of IoT technology" />
  <psc:chapter start="39:29" title="Victor shares Mozilla&#39;s approach to automated cookie banner management, and why it&#39;s problematic (i.e., unlawful)" />
</psc:chapters>
    <itunes:duration>2658</itunes:duration>
    <itunes:keywords>privacy usability, lawfulness, automated decisions, automated decision-making, privacy research</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>27</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E26: &quot;Building Ethical Machines&quot; with Reid Blackman, PhD (Virtue Consultants)</itunes:title>
    <title>S2E26: &quot;Building Ethical Machines&quot; with Reid Blackman, PhD (Virtue Consultants)</title>
    <itunes:summary><![CDATA[This week, I welcome philosopher, author, &amp; AI ethics expert, Reid Blackman, Ph.D., to discuss Ethical AI. Reid authored the book, "Ethical Machines," and is the CEO &amp; Founder of Virtue Consultants, a digital ethical risk consultancy. His extensive background in philosophy &amp; ethics, coupled with his engagement with orgs like AWS, U.S. Bank, the FBI, &amp; NASA, offers a unique perspective on the challenges &amp; misconceptions surrounding AI ethics.  In our conversation, we discus...]]></itunes:summary>
    <description><![CDATA[<p>This week, I welcome philosopher, author, &amp; AI ethics expert, <a href='https://www.linkedin.com/in/reid-blackman/'>Reid Blackman, Ph.D.</a>, to discuss Ethical AI. Reid authored the book, &quot;Ethical Machines,&quot; and is the CEO &amp; Founder of <a href='https://www.virtueconsultants.com/'>Virtue Consultants</a>, a digital ethical risk consultancy. His extensive background in philosophy &amp; ethics, coupled with his engagement with orgs like AWS, U.S. Bank, the FBI, &amp; NASA, offers a unique perspective on the challenges &amp; misconceptions surrounding AI ethics.<br/><br/>In our conversation, we discuss &apos;passive privacy&apos; &amp; &apos;active privacy&apos; and the need for individuals to exercise control over their data. Reid explains how the quest for data to train ML/AI models can lead to privacy violations, particularly for BigTech companies. We touch on many concepts in the AI space including: automated decision making vs. keeping &quot;humans in the loop;&quot; combating AI ethics fatigue; and advice for technical staff involved in AI product development. Reid stresses the importance of protecting privacy, educating users, &amp; deciding whether to utilize external APIs or on-prem servers. 
<br/><br/>We end by highlighting his HBR article - &quot;Generative AI-xiety&quot; - and discuss the 4 primary areas of ethical concern for LLMs: </p><ol><li>the hallucination problem; </li><li>the deliberation problem; </li><li>the sleazy salesperson problem; &amp; </li><li>the problem of shared responsibility</li></ol><p><b>Topics Covered:</b></p><ul><li>What motivated Reid to write his book, &quot;Ethical Machines&quot;</li><li>The key differences between &apos;active privacy&apos; &amp; &apos;passive privacy&apos;</li><li>Why engineering incentives to collect more data to train AI models, especially in big tech, pose challenges to data minimization</li><li>The importance of aligning privacy agendas with business priorities</li><li>Why what companies infer about people can be a privacy violation; what engineers should know about &apos;input privacy&apos; when training AI models; and how that affects the output of inferred data</li><li>Automated decision making: when it&apos;s necessary to have a &apos;human in the loop&apos;</li><li>Approaches for mitigating &apos;AI ethics fatigue&apos;</li><li>The need to back up a company&apos;s stated &apos;values&apos; with actions; and why there should always be 3 - 7 guardrails put in place for each stated value</li><li>The differences between &apos;Responsible AI&apos; &amp; &apos;Ethical AI,&apos; and why companies seem reluctant to talk about ethics</li><li>Reid&apos;s article, &quot;Generative AI-xiety,&quot; &amp; the 4 main risks related to generative AI</li><li>Reid&apos;s advice for technical staff building products &amp; services that leverage LLMs</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Read the book, &quot;<a href='https://www.reidblackman.com/ethical-machines/'>Ethical Machines</a>&quot;</li><li>Reid&apos;s podcast, <a href='https://www.reidblackman.com/ethical-machines-podcast/'>Ethical Machines</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Reid on <a 
href='https://www.linkedin.com/in/reid-blackman/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I welcome philosopher, author, &amp; AI ethics expert, <a href='https://www.linkedin.com/in/reid-blackman/'>Reid Blackman, Ph.D.</a>, to discuss Ethical AI. Reid authored the book, &quot;Ethical Machines,&quot; and is the CEO &amp; Founder of <a href='https://www.virtueconsultants.com/'>Virtue Consultants</a>, a digital ethical risk consultancy. His extensive background in philosophy &amp; ethics, coupled with his engagement with orgs like AWS, U.S. Bank, the FBI, &amp; NASA, offers a unique perspective on the challenges &amp; misconceptions surrounding AI ethics.<br/><br/>In our conversation, we discuss &apos;passive privacy&apos; &amp; &apos;active privacy&apos; and the need for individuals to exercise control over their data. Reid explains how the quest for data to train ML/AI models can lead to privacy violations, particularly for BigTech companies. We touch on many concepts in the AI space including: automated decision making vs. keeping &quot;humans in the loop;&quot; combating AI ethics fatigue; and advice for technical staff involved in AI product development. Reid stresses the importance of protecting privacy, educating users, &amp; deciding whether to utilize external APIs or on-prem servers. 
<br/><br/>We end by highlighting his HBR article - &quot;Generative AI-xiety&quot; - and discuss the 4 primary areas of ethical concern for LLMs: </p><ol><li>the hallucination problem; </li><li>the deliberation problem; </li><li>the sleazy salesperson problem; &amp; </li><li>the problem of shared responsibility</li></ol><p><b>Topics Covered:</b></p><ul><li>What motivated Reid to write his book, &quot;Ethical Machines&quot;</li><li>The key differences between &apos;active privacy&apos; &amp; &apos;passive privacy&apos;</li><li>Why engineering incentives to collect more data to train AI models, especially in big tech, pose challenges to data minimization</li><li>The importance of aligning privacy agendas with business priorities</li><li>Why what companies infer about people can be a privacy violation; what engineers should know about &apos;input privacy&apos; when training AI models; and how that affects the output of inferred data</li><li>Automated decision making: when it&apos;s necessary to have a &apos;human in the loop&apos;</li><li>Approaches for mitigating &apos;AI ethics fatigue&apos;</li><li>The need to back up a company&apos;s stated &apos;values&apos; with actions; and why there should always be 3 - 7 guardrails put in place for each stated value</li><li>The differences between &apos;Responsible AI&apos; &amp; &apos;Ethical AI,&apos; and why companies seem reluctant to talk about ethics</li><li>Reid&apos;s article, &quot;Generative AI-xiety,&quot; &amp; the 4 main risks related to generative AI</li><li>Reid&apos;s advice for technical staff building products &amp; services that leverage LLMs</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Read the book, &quot;<a href='https://www.reidblackman.com/ethical-machines/'>Ethical Machines</a>&quot;</li><li>Reid&apos;s podcast, <a href='https://www.reidblackman.com/ethical-machines-podcast/'>Ethical Machines</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Reid on <a 
href='https://www.linkedin.com/in/reid-blackman/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13483508-s2e26-building-ethical-machines-with-reid-blackman-phd-virtue-consultants.mp3" length="37271657" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/xf2yihxmsvsrmc3ou1f9o1vpqj5d?.jpg" />
    <itunes:author>Debra J. Farber / Reid Blackman</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13483508</guid>
    <pubDate>Tue, 05 Sep 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13483508/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13483508/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13483508/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13483508/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13483508/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E26: &quot;Building Ethical Machines&quot; with Reid Blackman, PhD (Virtue Consultants)" />
  <psc:chapter start="1:41" title="Introducing Reid Blackman, Founder &amp; CEO, Virtue Consultants and Host, Ethical Machines Podcast; and how Reid got interested in AI Ethics" />
  <psc:chapter start="5:01" title="Reid discusses what motivated him, a Philosophy Professor, to write his book, &quot;Ethical Machines;&quot; and who he wrote it for" />
  <psc:chapter start="5:03" title="Reid makes a distinction between &#39;active privacy&#39; &amp; &#39;passive privacy&#39;" />
  <psc:chapter start="12:31" title="Challenges with the fact that &#39;the fuel of AI is other people&#39;s data&#39; and how business leaders should put guardrails around this based on business goals and public privacy commitments" />
  <psc:chapter start="19:07" title="Why what you infer about people can be a privacy violation, and what engineers should know regarding AI training data &amp; &#39;input privacy&#39; and whether that affects the output of inferred data" />
  <psc:chapter start="24:32" title="Automated decision making: when it&#39;s necessary to have a &#39;human in the loop&#39; when making decisions with AI" />
  <psc:chapter start="29:00" title="Reid shares how we can avoid &#39;AI ethics fatigue&#39; to encourage technologists to take action" />
  <psc:chapter start="33:43" title="Reid explains how to back a company&#39;s stated &#39;values&#39; around privacy and AI with actions; why there should always be 3 - 7 guardrails put in place for each stated value" />
  <psc:chapter start="35:57" title="The differences between the terms &#39;Responsible AI&#39; &amp; &#39;Ethical AI,&#39; and why companies seem reluctant to talk about ethics" />
  <psc:chapter start="38:21" title="Reid&#39;s article, &quot;Generative AI-xiety&quot; (Harvard Business Review) and the 4 main risks related to Generative AI: 1) the hallucination problem; 2) the deliberation problem; 3) the sleazy salesperson problem; &amp; 4) the problem of shared responsibility" />
  <psc:chapter start="47:49" title="Reid&#39;s advice for technical staff (e.g., data scientists, architects, product managers, devs) as they build products &amp; services that leverage LLMs in a &#39;responsible&#39; manner" />
</psc:chapters>
    <itunes:duration>3101</itunes:duration>
    <itunes:keywords>ethical AI, ethics, responsible AI, privacy, LLMs, sleazy salesperson problem, hallucination problem, deliberation problem, shared responsibility, values, guardrails, input privacy, AI ethics fatigue</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>26</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E25: &quot;Anonymization &amp; Deletion at Scale&quot; with Engin Bozdag (Uber) &amp; Stefano Bennati (HERE)</itunes:title>
    <title>S2E25: &quot;Anonymization &amp; Deletion at Scale&quot; with Engin Bozdag (Uber) &amp; Stefano Bennati (HERE)</title>
    <itunes:summary><![CDATA[This week, we're chatting with Engin Bozdag, Senior Staff Privacy Architect at Uber, and Stefano Bennati, Privacy Engineer at HERE Technologies. Today, we explore their recent IWPE'23 talk, "Can Location Data Truly be Anonymized: a risk-based approach to location data anonymization" and discuss the technical &amp; business challenges to obtain anonymization. We also discuss the role of Privacy Engineers, how to choose a career path, and the importance of embedding privacy into product develop...]]></itunes:summary>
    <description><![CDATA[<p>This week, we&apos;re chatting with <a href='https://www.linkedin.com/in/enginbozdag/'>Engin Bozdag</a>, Senior Staff Privacy Architect at <a href='https://www.uber.com/'>Uber</a>, and <a href='https://www.linkedin.com/in/stefano-bennati-47007560/?originalSubdomain=ch'>Stefano Bennati</a>, Privacy Engineer at <a href='https://www.here.com/'>HERE Technologies</a>. Today, we explore their recent IWPE&apos;23 talk, &quot;Can Location Data Truly be Anonymized: a risk-based approach to location data anonymization&quot; and discuss the technical &amp; business challenges of obtaining anonymization. We also discuss the role of Privacy Engineers, how to choose a career path, and the importance of embedding privacy into product development &amp; DevPrivOps; collaborating with cross-functional teams; &amp; staying up-to-date with emerging trends.</p><p><br/><b>Topics Covered</b>:</p><ul><li>Common roadblocks privacy engineers face with anonymization techniques &amp; how to overcome them</li><li>How to get budgets for anonymization tools; challenges with scaling &amp; regulatory requirements &amp; how to overcome them</li><li>What it means to be a &apos;Privacy Engineer&apos; today; good career paths; and necessary skill sets</li><li>How third-party data deletion tools can be integrated into a company&apos;s distributed architecture</li><li>What Privacy Engineers should understand about vendor privacy requirements for LLMs before bringing them into their orgs</li><li>The need to monitor code changes in data or source code via code scanning; how HERE Technologies uses Privado to monitor the compliance of its products &amp; data lineage; and how Privado detects new assets added to your inventory &amp; any new API endpoints</li><li>Advice on how to deal with conflicts between engineering, legal &amp; operations teams, and on how to get privacy issues fixed within an org</li><li>Strategies for addressing privacy issues within orgs, including 
collaboration, transparency, and continuous refinement</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li><a href='https://iapp.org/resources/article/defining-privacy-engineering/'>IAPP Defining Privacy Engineering Infographic</a></li><li><a href='https://www.artificial-intelligence-act.com/'>EU AI Act</a></li><li><a href='https://www.politico.eu/wp-content/uploads/2019/04/POLITICO-AI-ethics-guidelines-HLEG-final-April-4.pdf'>Ethics Guidelines for Trustworthy AI</a></li><li><a href='https://cacm.acm.org/magazines/2021/11/256380-privacy-engineering-superheroes/fulltext'>Privacy Engineering Superheroes</a></li><li><a href='https://www.washingtonpost.com/technology/2023/07/13/ftc-openai-chatgpt-sam-altman-lina-khan/'>FTC Investigates OpenAI over Data Leak and ChatGPT’s Inaccuracy</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li><a href='https://www.linkedin.com/in/enginbozdag/'>Follow Engin</a></li><li><a href='https://www.linkedin.com/in/stefano-bennati-47007560/'>Follow Stefano</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, we&apos;re chatting with <a href='https://www.linkedin.com/in/enginbozdag/'>Engin Bozdag</a>, Senior Staff Privacy Architect at <a href='https://www.uber.com/'>Uber</a>, and <a href='https://www.linkedin.com/in/stefano-bennati-47007560/?originalSubdomain=ch'>Stefano Bennati</a>, Privacy Engineer at <a href='https://www.here.com/'>HERE Technologies</a>. Today, we explore their recent IWPE&apos;23 talk, &quot;Can Location Data Truly be Anonymized: a risk-based approach to location data anonymization&quot; and discuss the technical &amp; business challenges of obtaining anonymization. We also discuss the role of Privacy Engineers, how to choose a career path, and the importance of embedding privacy into product development &amp; DevPrivOps; collaborating with cross-functional teams; &amp; staying up-to-date with emerging trends.</p><p><br/><b>Topics Covered</b>:</p><ul><li>Common roadblocks privacy engineers face with anonymization techniques &amp; how to overcome them</li><li>How to get budgets for anonymization tools; challenges with scaling &amp; regulatory requirements &amp; how to overcome them</li><li>What it means to be a &apos;Privacy Engineer&apos; today; good career paths; and necessary skill sets</li><li>How third-party data deletion tools can be integrated into a company&apos;s distributed architecture</li><li>What Privacy Engineers should understand about vendor privacy requirements for LLMs before bringing them into their orgs</li><li>The need to monitor code changes in data or source code via code scanning; how HERE Technologies uses Privado to monitor the compliance of its products &amp; data lineage; and how Privado detects new assets added to your inventory &amp; any new API endpoints</li><li>Advice on how to deal with conflicts between engineering, legal &amp; operations teams, and on how to get privacy issues fixed within an org</li><li>Strategies for addressing privacy issues within orgs, including 
collaboration, transparency, and continuous refinement</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li><a href='https://iapp.org/resources/article/defining-privacy-engineering/'>IAPP Defining Privacy Engineering Infographic</a></li><li><a href='https://www.artificial-intelligence-act.com/'>EU AI Act</a></li><li><a href='https://www.politico.eu/wp-content/uploads/2019/04/POLITICO-AI-ethics-guidelines-HLEG-final-April-4.pdf'>Ethics Guidelines for Trustworthy AI</a></li><li><a href='https://cacm.acm.org/magazines/2021/11/256380-privacy-engineering-superheroes/fulltext'>Privacy Engineering Superheroes</a></li><li><a href='https://www.washingtonpost.com/technology/2023/07/13/ftc-openai-chatgpt-sam-altman-lina-khan/'>FTC Investigates OpenAI over Data Leak and ChatGPT’s Inaccuracy</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li><a href='https://www.linkedin.com/in/enginbozdag/'>Follow Engin</a></li><li><a href='https://www.linkedin.com/in/stefano-bennati-47007560/'>Follow Stefano</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13424366-s2e25-anonymization-deletion-at-scale-with-engin-bozdag-uber-stefano-bennati-here.mp3" length="36222875" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/t8q8eps40j2sggh884dcm7o6n0m4?.jpg" />
    <itunes:author>Debra J Farber / Engin Bozdag &amp; Stefano Bennati</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13424366</guid>
    <pubDate>Tue, 29 Aug 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424366/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424366/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424366/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424366/transcript.vtt" type="text/vtt" />
    <podcast:soundbite startTime="363.389" duration="58.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13424366/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E25: &quot;Anonymization &amp; Deletion at Scale&quot; with Engin Bozdag (Uber) &amp; Stefano Bennati (HERE)" />
  <psc:chapter start="2:25" title="Introducing Engin Bozdag (Uber) &amp; Stefano Bennati (HERE)" />
  <psc:chapter start="4:27" title="Engin &amp; Stefano describe their recent talk, &quot;Can location data truly be anonymized: a risk-based approach to location data anonymization&quot; and some of the technical &amp; business challenges in obtaining anonymization" />
  <psc:chapter start="11:38" title="Roadblocks when it comes to deploying anonymization techniques &amp; how to overcome them" />
  <psc:chapter start="16:24" title="How to get budgets for anonymization tools; some of the challenges with scaling &amp; regulatory requirements &amp; how you can overcome them" />
  <psc:chapter start="18:49" title="What it means to be a &#39;Privacy Engineer&#39; today" />
  <psc:chapter start="24:47" title="How third-party data deletion tools can be integrated into a company&#39;s distributed architecture" />
  <psc:chapter start="28:25" title="Stefano &amp; Engin describe good career paths / skill sets for becoming a Privacy Engineer" />
  <psc:chapter start="32:40" title="What Privacy Engineers should understand about vendor privacy requirements for LLMs if they&#39;re bringing that into their organization" />
  <psc:chapter start="37:32" title="Engin recommends reading the FTC&#39;s Demand Letter to OpenAI" />
  <psc:chapter start="38:03" title="The need to monitor code changes in data or source code via code scanning; how HERE Technologies uses Privado to monitor the compliance of its products &amp; data lineage; and how Privado detects new assets added to your inventory &amp; any new API endpoints" />
  <psc:chapter start="41:20" title="Advice on how to deal with conflicts between engineering, legal &amp; operations teams" />
  <psc:chapter start="44:06" title="Advice on how to get privacy issues fixed in an organization" />
  <psc:chapter start="47:13" title="Stefano&#39;s advice to fellow Privacy Engineers: &quot;Do not overlook the less technical aspects of the work, as they constitute the foundation of privacy-by-design&quot;" />
  <psc:chapter start="48:14" title="Engin&#39;s advice to Privacy Engineers: &quot;You should really have some passion for privacy in order to sustain in this field.&quot;" />
</psc:chapters>
    <itunes:duration>3014</itunes:duration>
    <itunes:keywords>anonymization, anonymized, privacy engineer, privacy architect, LLMs, AI, Privado, code scanning, DevPrivOps, vendor privacy, HERE Technologies, Uber</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>25</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E24: &quot;Cloud-Native Privacy Engineering via DevPrivOps&quot; with Elias Grünewald (TU Berlin)</itunes:title>
    <title>S2E24: &quot;Cloud-Native Privacy Engineering via DevPrivOps&quot; with Elias Grünewald (TU Berlin)</title>
    <itunes:summary><![CDATA[This week’s guest is Elias Grünewald, Privacy Engineering Research Associate at Technical University, Berlin, where he focuses on cloud-native privacy engineering, transparency, accountability, distributed systems, &amp; privacy regulation.   In this conversation, we discuss the challenge of designing privacy into modern cloud architectures; how shifting left into DevPrivOps can embed privacy within agile development methods; how to blend privacy engineering &amp; cloud engineering; the ...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/eliasgruenewald/'>Elias Grünewald</a>, Privacy Engineering Research Associate at <a href='https://www.tu.berlin/en/'>Technical University, Berlin</a>, where he focuses on cloud-native privacy engineering, transparency, accountability, distributed systems, &amp; privacy regulation. <br/><br/>In this conversation, we discuss the challenge of designing privacy into modern cloud architectures; how shifting left into DevPrivOps can embed privacy within agile development methods; how to blend privacy engineering &amp; cloud engineering; the Hawk DevOps Framework; and what the Shared Responsibilities Model for cloud lacks. <br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Elias&apos;s courses at TU Berlin: &quot;Programming Practical Privacy: Web-based Application Engineering &amp; Data Management&quot; &amp; &quot;Advanced Distributed Systems Prototyping: Cloud-native Privacy Engineering&quot;</li><li>Elias&apos; 2022 paper, &quot;Cloud Native Privacy Engineering through DevPrivOps&quot; - his approach, findings, and framework</li><li>The Shared Responsibilities Model for cloud and how to improve it to account for privacy goals</li><li>Defining DevPrivOps &amp; how it works with agile development</li><li>How DevPrivOps can enable formal privacy-by-design (PbD) &amp; default strategies</li><li>Elias&apos; June 2023 paper, &quot;Hawk: DevOps-Driven Transparency &amp; Accountability in Cloud Native Systems,&quot; which helps data controllers align cloud-native DevOps with regulatory requirements for transparency &amp; accountability</li><li>Engineering challenges when trying to determine the details of personal data processing when responding to access &amp; deletion requests</li><li>A deep-dive into the Hawk 3-phase approach for implementing privacy into each DevOps phase: Hawk Release; Hawk Operate; &amp; Hawk Monitor</li><li>How the open-source project TOUCAN is documenting conceptual 
best practices for corresponding phases in the SDLC, and a call for collaboration</li><li>How privacy engineers can convince their management to adopt a DevPrivOps approach</li></ul><p><br/><b>Read Elias&apos; papers, talks, &amp; projects:</b></p><ul><li><a href='https://arxiv.org/pdf/2108.00927'>Cloud Native Privacy Engineering through DevPrivOps</a></li><li><a href='https://arxiv.org/pdf/2306.02496'>Hawk: DevOps-driven Transparency and Accountability in Cloud Native Systems </a></li><li><a href='https://www.youtube.com/watch?v=37oCWJLT-TU'>CPDP Talk: Privacy Engineering for Transparency &amp; Accountability </a></li><li><a href='https://github.com/Transparency-Information-Language/meta'>TILT: A GDPR-Aligned Transparency Information Language &amp; Toolkit for Practical Privacy Engineering</a></li><li><a href='https://www.tu.berlin/en/ise/research-projects/toucan'>TOUCAN </a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Elias on <a href='https://www.linkedin.com/in/eliasgruenewald/'>LinkedIn</a></li><li>Contact Elias at <a href='https://www.tu.berlin/en/ise/about/elias-gruenewald'>TU Berlin</a></li></ul><p><br/></p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/eliasgruenewald/'>Elias Grünewald</a>, Privacy Engineering Research Associate at <a href='https://www.tu.berlin/en/'>Technical University, Berlin</a>, where he focuses on cloud-native privacy engineering, transparency, accountability, distributed systems, &amp; privacy regulation. <br/><br/>In this conversation, we discuss the challenge of designing privacy into modern cloud architectures; how shifting left into DevPrivOps can embed privacy within agile development methods; how to blend privacy engineering &amp; cloud engineering; the Hawk DevOps Framework; and what the Shared Responsibilities Model for cloud lacks. <br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Elias&apos; courses at TU Berlin: &quot;Programming Practical Privacy: Web-based Application Engineering &amp; Data Management&quot; &amp; &quot;Advanced Distributed Systems Prototyping: Cloud-native Privacy Engineering&quot;</li><li>Elias&apos; 2022 paper, &quot;Cloud Native Privacy Engineering through DevPrivOps&quot; - his approach, findings, and framework</li><li>The Shared Responsibilities Model for cloud and how to improve it to account for privacy goals</li><li>Defining DevPrivOps &amp; how it works with agile development</li><li>How DevPrivOps can enable formal privacy-by-design (PbD) &amp; default strategies</li><li>Elias&apos; June 2023 paper, &quot;Hawk: DevOps-Driven Transparency &amp; Accountability in Cloud Native Systems,&quot; which helps data controllers align cloud-native DevOps with regulatory requirements for transparency &amp; accountability</li><li>Engineering challenges in determining the details of personal data processing when responding to access &amp; deletion requests</li><li>A deep-dive into the Hawk 3-phase approach for implementing privacy into each DevOps phase: Hawk Release; Hawk Operate; &amp; Hawk Monitor</li><li>How the open-source project TOUCAN is documenting
conceptual best practices for corresponding phases in the SDLC, and a call for collaboration</li><li>How privacy engineers can convince their management to adopt a DevPrivOps approach</li></ul><p><br/><b>Read Elias&apos; papers, talks, &amp; projects:</b></p><ul><li><a href='https://arxiv.org/pdf/2108.00927'>Cloud Native Privacy Engineering through DevPrivOps</a></li><li><a href='https://arxiv.org/pdf/2306.02496'>Hawk: DevOps-driven Transparency and Accountability in Cloud Native Systems </a></li><li><a href='https://www.youtube.com/watch?v=37oCWJLT-TU'>CPDP Talk: Privacy Engineering for Transparency &amp; Accountability </a></li><li><a href='https://github.com/Transparency-Information-Language/meta'>TILT: A GDPR-Aligned Transparency Information Language &amp; Toolkit for Practical Privacy Engineering</a></li><li><a href='https://www.tu.berlin/en/ise/research-projects/toucan'>TOUCAN </a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Elias on <a href='https://www.linkedin.com/in/eliasgruenewald/'>LinkedIn</a></li><li>Contact Elias at <a href='https://www.tu.berlin/en/ise/about/elias-gruenewald'>TU Berlin</a></li></ul><p><br/></p><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13424374-s2e24-cloud-native-privacy-engineering-via-devprivops-with-elias-grunewald-tu-berlin.mp3" length="46354875" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/pgwn121cnv8w5fne8syjhmaa3hjw?.jpg" />
    <itunes:author>Debra J Farber / Elias Grünewald</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13424374</guid>
    <pubDate>Tue, 22 Aug 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424374/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424374/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424374/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13424374/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13424374/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E24: &quot;Cloud-Native Privacy Engineering via DevPrivOps&quot; with Elias Grünewald (TU Berlin)" />
  <psc:chapter start="2:15" title="Introducing Elias Grünewald" />
  <psc:chapter start="5:33" title="Elias discusses the courses that he teaches at TU Berlin: &quot;Programming Practical Privacy: Web-based Application Engineering &amp; Data Management&quot; and &quot;Advanced Distributed Systems Prototyping: Cloud-native Privacy Engineering&quot;" />
  <psc:chapter start="11:42" title="Discussion of Elias&#39; 2022 paper, &quot;Cloud Native Privacy Engineering through DevPrivOps&quot; - his approach, findings, and framework " />
  <psc:chapter start="18:58" title="Discussion of the Shared Responsibilities Model for cloud and how it can be improved to better account for privacy goals" />
  <psc:chapter start="21:50" title="Defining DevPrivOps and how it works with agile development" />
  <psc:chapter start="28:17" title="How DevPrivOps can enable formal privacy-by-design (PbD) &amp; default strategies" />
  <psc:chapter start="31:01" title="Discussion of Elias&#39; June 2023 paper, &quot;Hawk: DevOps-Driven Transparency &amp; Accountability in Cloud Native Systems,&quot; which helps data controllers align cloud-native DevOps with regulatory requirements for transparency &amp; accountability" />
  <psc:chapter start="36:08" title="The challenges that engineers run into when they try to determine the details of personal data processing, as they respond to access or deletion requests" />
  <psc:chapter start="39:39" title="Elias describes his approach to integrating privacy into 3 phases of DevOps: 1) Hawk Release; 2) Hawk Operate; &amp; 3) Hawk Monitor" />
  <psc:chapter start="52:12" title="Elias describes how the Hawk framework can benefit regulators as well as data controllers" />
  <psc:chapter start="57:12" title="Elias discusses open source project: TOUCAN (which is funded by the German Federal Ministry of Education &amp; Research). TOUCAN is creating conceptual best practices for corresponding phases in the SDLC" />
  <psc:chapter start="1:00:44" title="How privacy engineers can convince their Head of Engineering and management to adopt a DevPrivOps approach" />
</psc:chapters>
    <itunes:duration>3859</itunes:duration>
    <itunes:keywords>DevPrivOps, DevOps, cloud native, cloud privacy</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>24</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E23: &quot;Navigating the Privacy Engineering Job Market&quot; with George Ratcliffe (Stott &amp; May) </itunes:title>
    <title>S2E23: &quot;Navigating the Privacy Engineering Job Market&quot; with George Ratcliffe (Stott &amp; May) </title>
    <itunes:summary><![CDATA[This week, my guest is George Ratcliffe, Head of the Privacy GRC &amp; Cryptography Executive Search Practice at recruitment firm, Stott &amp; May. In this conversation, we discuss the current market climate &amp; hiring trends for technical privacy roles; the need for higher technical capabilities across the industry;  pay ranges within different technical privacy roles; and George’s tips and tools for applicants interested in, entering, and/or transitioning into the privacy industry.&n...]]></itunes:summary>
    <description><![CDATA[<p>This week, my guest is <a href='https://www.linkedin.com/in/george-ratcliffe-24314154/'>George Ratcliffe</a>, Head of the Privacy GRC &amp; Cryptography Executive Search Practice at recruitment firm, <a href='https://www.stottandmay.com/'>Stott &amp; May</a>.</p><p>In this conversation, we discuss the current market climate &amp; hiring trends for technical privacy roles; the need for higher technical capabilities across the industry; pay ranges within different technical privacy roles; and George’s tips and tools for applicants interested in, entering, and/or transitioning into the privacy industry. </p><p><br/><b>Topics Covered:</b></p><ul><li>Whether the hiring trends are picking back up for technical privacy roles</li><li>The three &apos;Privacy Engineering&apos; roles that companies seek to hire for and core competencies: Privacy Engineer, Privacy Software Engineer, &amp; Privacy Research Engineer</li><li>The demand for &apos;Privacy Architects&apos;</li><li>IAPP&apos;s new Privacy Engineering infographic &amp; if it maps with how companies approach hiring </li><li>Overall hiring trends for privacy engineers &amp; technical privacy roles</li><li>Advice for technologists who want to grow into Privacy Engineer, Researcher, or Architect roles</li><li>Capabilities that companies need or want in candidates that they can&apos;t seem to find; &amp; whether there are roles that are harder to fill because of a lack of candidates &amp; skill sets</li><li>Whether a PhD is necessary to become a &apos;Privacy Research Engineer&apos;</li><li>Typical pay ranges across technical privacy roles: Privacy Engineer, Privacy Software Engineer, Privacy Researcher, Privacy Architect</li><li>Differences in pay for a Privacy Engineering Manager vs an Individual Contributor (IC) and the web apps for crowd-sourced info about roles &amp; salary ranges</li><li>Whether companies seek to fill entry level positions for technical privacy roles</li><li>How privacy
technologists can stay up-to-date on hiring trends</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Check out episode <a href='https://podcasts.apple.com/us/podcast/s2e11-lessons-learned-as-a-privacy-engineering/id1651019312?i=1000605212463'>S2E11: Lessons Learned as a Privacy Engineering Manager with Menotti Minutillo (ex-Twitter &amp; Uber)</a></li><li><a href='https://iapp.org/resources/article/defining-privacy-engineering/'>IAPP Defining Privacy Engineering Infographic </a></li><li>Check out <a href='https://www.teamblind.com/'>Blind</a> and <a href='https://www.levels.fyi/?compare=Nike,Intel,Amazon&amp;track=Software%20Engineer'>Levels</a> for compensation benchmarking</li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with <a href='https://www.linkedin.com/in/george-ratcliffe-24314154/'>George on LinkedIn</a></li><li>Reach out to <a href='https://www.stottandmay.com/markets/privacy-recruitment'>Stott &amp; May</a> for your privacy recruiting needs</li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, my guest is <a href='https://www.linkedin.com/in/george-ratcliffe-24314154/'>George Ratcliffe</a>, Head of the Privacy GRC &amp; Cryptography Executive Search Practice at recruitment firm, <a href='https://www.stottandmay.com/'>Stott &amp; May</a>.</p><p>In this conversation, we discuss the current market climate &amp; hiring trends for technical privacy roles; the need for higher technical capabilities across the industry; pay ranges within different technical privacy roles; and George’s tips and tools for applicants interested in, entering, and/or transitioning into the privacy industry. </p><p><br/><b>Topics Covered:</b></p><ul><li>Whether the hiring trends are picking back up for technical privacy roles</li><li>The three &apos;Privacy Engineering&apos; roles that companies seek to hire for and core competencies: Privacy Engineer, Privacy Software Engineer, &amp; Privacy Research Engineer</li><li>The demand for &apos;Privacy Architects&apos;</li><li>IAPP&apos;s new Privacy Engineering infographic &amp; if it maps with how companies approach hiring </li><li>Overall hiring trends for privacy engineers &amp; technical privacy roles</li><li>Advice for technologists who want to grow into Privacy Engineer, Researcher, or Architect roles</li><li>Capabilities that companies need or want in candidates that they can&apos;t seem to find; &amp; whether there are roles that are harder to fill because of a lack of candidates &amp; skill sets</li><li>Whether a PhD is necessary to become a &apos;Privacy Research Engineer&apos;</li><li>Typical pay ranges across technical privacy roles: Privacy Engineer, Privacy Software Engineer, Privacy Researcher, Privacy Architect</li><li>Differences in pay for a Privacy Engineering Manager vs an Individual Contributor (IC) and the web apps for crowd-sourced info about roles &amp; salary ranges</li><li>Whether companies seek to fill entry level positions for technical privacy roles</li><li>How privacy
technologists can stay up-to-date on hiring trends</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Check out episode <a href='https://podcasts.apple.com/us/podcast/s2e11-lessons-learned-as-a-privacy-engineering/id1651019312?i=1000605212463'>S2E11: Lessons Learned as a Privacy Engineering Manager with Menotti Minutillo (ex-Twitter &amp; Uber)</a></li><li><a href='https://iapp.org/resources/article/defining-privacy-engineering/'>IAPP Defining Privacy Engineering Infographic </a></li><li>Check out <a href='https://www.teamblind.com/'>Blind</a> and <a href='https://www.levels.fyi/?compare=Nike,Intel,Amazon&amp;track=Software%20Engineer'>Levels</a> for compensation benchmarking</li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with <a href='https://www.linkedin.com/in/george-ratcliffe-24314154/'>George on LinkedIn</a></li><li>Reach out to <a href='https://www.stottandmay.com/markets/privacy-recruitment'>Stott &amp; May</a> for your privacy recruiting needs</li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13395029-s2e23-navigating-the-privacy-engineering-job-market-with-george-ratcliffe-stott-may.mp3" length="33321647" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/4mhy007imqnhl3pdrfxbykp1fsfl?.jpg" />
    <itunes:author>Debra J. Farber / George Ratcliffe</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13395029</guid>
    <pubDate>Tue, 15 Aug 2023 12:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13395029/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13395029/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13395029/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13395029/transcript.vtt" type="text/vtt" />
    <podcast:soundbite startTime="988.9" duration="57.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13395029/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E23: &quot;Navigating the Privacy Engineering Job Market&quot; with George Ratcliffe (Stott &amp; May) " />
  <psc:chapter start="1:30" title="Introducing George Ratcliffe from Stott &amp; May" />
  <psc:chapter start="2:33" title="Whether the hiring trends are picking back up for technical privacy roles" />
  <psc:chapter start="5:45" title="The skill sets &amp; competencies companies seek when they say they&#39;re looking to hire for a &#39;Privacy Engineering&#39; role" />
  <psc:chapter start="10:28" title="What the current demand is like for a &#39;Privacy Architect&#39;" />
  <psc:chapter start="13:41" title="George&#39;s thoughts on whether the IAPP&#39;s new Privacy Engineering infographic maps well with how companies approach hiring for privacy engineers today" />
  <psc:chapter start="16:45" title="Overall trends George sees as companies hire for privacy engineers &amp; technical privacy roles" />
  <psc:chapter start="18:54" title="The capabilities that companies desperately need or want in candidates that they can&#39;t seem to find; whether there are roles that are harder to fill than others because of a lack of candidates and skill sets " />
  <psc:chapter start="23:21" title="The trends George sees when it comes to filling roles for &#39;Privacy Researchers&#39;" />
  <psc:chapter start="27:24" title="Advice for those who are already technical and want to grow into the privacy space &amp; become a Privacy Engineer, Researcher, or Architect" />
  <psc:chapter start="31:41" title="Debra&#39;s advice on the importance of networking" />
  <psc:chapter start="33:52" title="Typical pay ranges across technical privacy roles: Privacy Engineer, Privacy Software Engineer, Privacy Researcher, Privacy Architect" />
  <psc:chapter start="37:32" title="Difference in pay for a Privacy Engineering Manager vs an Individual Contributor (IC)" />
  <psc:chapter start="39:54" title="Whether George sees companies seeking to fill entry level positions for technical privacy roles" />
  <psc:chapter start="42:22" title="The best way for privacy technologists to stay up-to-date on hiring trends" />
</psc:chapters>
    <itunes:duration>2772</itunes:duration>
    <itunes:keywords>privacy engineer, privacy engineering, job market, hiring trends, privacy architect, privacy software engineer</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>23</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E22: Why You Need an &#39;Outside-In&#39; Approach to Privacy Risk Monitoring with Sanjay Saini (Privaini)</itunes:title>
    <title>S2E22: Why You Need an &#39;Outside-In&#39; Approach to Privacy Risk Monitoring with Sanjay Saini (Privaini)</title>
    <itunes:summary><![CDATA[Get ready for an eye-opening conversation with Sanjay Saini, the founder and CEO of Privaini, a groundbreaking privacy tech company. Sanjay's journey is impressive not only for his role in creating high-performance teams that have built entirely new product categories, but also for the invaluable lessons he learned from his grandfather about the pillars of successful companies: trust and human connections. In our discussion, Sanjay shares how Privaini is raising the privacy bar by constru...]]></itunes:summary>
    <description><![CDATA[<p>Get ready for an eye-opening conversation with <a href='https://www.linkedin.com/in/sanjaysainisf/'>Sanjay Saini</a>, the founder and CEO of <a href='https://privaini.com/'>Privaini</a>, a groundbreaking privacy tech company. Sanjay&apos;s journey is impressive not only for his role in creating high-performance teams that have built entirely new product categories, but also for the invaluable lessons he learned from his grandfather about the pillars of successful companies: trust and human connections. In our discussion, Sanjay shares how Privaini is raising the privacy bar by constructing the world&apos;s largest repository of company privacy policies and practices. It&apos;s a fascinating dive into the future of privacy risk management.<br/><br/>Imagine being able to gain full coverage of your external privacy risks with continuous monitoring. Wouldn&apos;t that revolutionize your approach to risk management? That&apos;s exactly what Privaini is doing! Sanjay explains how Privaini utilizes AI to analyze, standardize, and derive meaningful &quot;privacy views&quot; and insights from vast volumes of publicly-available data. Listen in to understand how Privaini&apos;s innovative approach is helping companies gain visibility into their entire business network to make quicker, more informed decisions.
<br/><br/><b>Topics Covered:</b></p><ul><li>What motivated Sanjay to found companies that bring trusted systems to market and why he founded Privaini  to  focus on continuous privacy risk monitoring</li><li>How to quantitatively analyze &amp; monitor privacy risk throughout an entire &apos;business network&apos; and what Sanjay means by &apos;business network&apos;</li><li>Which stakeholders benefit from using the Privaini platform</li><li>The benefits to calculating a &quot;quantified privacy risk score&quot; for each company in your business network to effectively monitor privacy risk</li><li>How Privaini leverages AI to discover external data about companies&apos; privacy posture and why it must be used in a responsible and deliberate way</li><li>Why effective privacy risk monitoring of a company&apos;s business network requires an “outside-in” approach</li><li>The importance of continuous monitoring &amp; the benefits to using an &apos;outside-in&apos; approach</li><li>What it takes to set up an enterprise&apos;s network with Privaini for full coverage of external privacy risks</li><li>The recent Criteo fines and how Privaini could have helped Criteo surface privacy risks about its vendors</li><li>Why Sanjay believes learning about the “right side” of the equation is necessary in order to &quot;shift privacy left.&quot;</li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Sanjay on <a href='https://www.linkedin.com/in/sanjaysainisf/'>LinkedIn</a></li><li>Learn more about <a href='https://privaini.com/'>Privaini</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Get ready for an eye-opening conversation with <a href='https://www.linkedin.com/in/sanjaysainisf/'>Sanjay Saini</a>, the founder and CEO of <a href='https://privaini.com/'>Privaini</a>, a groundbreaking privacy tech company. Sanjay&apos;s journey is impressive not only for his role in creating high-performance teams that have built entirely new product categories, but also for the invaluable lessons he learned from his grandfather about the pillars of successful companies: trust and human connections. In our discussion, Sanjay shares how Privaini is raising the privacy bar by constructing the world&apos;s largest repository of company privacy policies and practices. It&apos;s a fascinating dive into the future of privacy risk management.<br/><br/>Imagine being able to gain full coverage of your external privacy risks with continuous monitoring. Wouldn&apos;t that revolutionize your approach to risk management? That&apos;s exactly what Privaini is doing! Sanjay explains how Privaini utilizes AI to analyze, standardize, and derive meaningful &quot;privacy views&quot; and insights from vast volumes of publicly-available data. Listen in to understand how Privaini&apos;s innovative approach is helping companies gain visibility into their entire business network to make quicker, more informed decisions.
<br/><br/><b>Topics Covered:</b></p><ul><li>What motivated Sanjay to found companies that bring trusted systems to market and why he founded Privaini  to  focus on continuous privacy risk monitoring</li><li>How to quantitatively analyze &amp; monitor privacy risk throughout an entire &apos;business network&apos; and what Sanjay means by &apos;business network&apos;</li><li>Which stakeholders benefit from using the Privaini platform</li><li>The benefits to calculating a &quot;quantified privacy risk score&quot; for each company in your business network to effectively monitor privacy risk</li><li>How Privaini leverages AI to discover external data about companies&apos; privacy posture and why it must be used in a responsible and deliberate way</li><li>Why effective privacy risk monitoring of a company&apos;s business network requires an “outside-in” approach</li><li>The importance of continuous monitoring &amp; the benefits to using an &apos;outside-in&apos; approach</li><li>What it takes to set up an enterprise&apos;s network with Privaini for full coverage of external privacy risks</li><li>The recent Criteo fines and how Privaini could have helped Criteo surface privacy risks about its vendors</li><li>Why Sanjay believes learning about the “right side” of the equation is necessary in order to &quot;shift privacy left.&quot;</li></ul><p><br/><b>Guest Info:</b></p><ul><li>Connect with Sanjay on <a href='https://www.linkedin.com/in/sanjaysainisf/'>LinkedIn</a></li><li>Learn more about <a href='https://privaini.com/'>Privaini</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13304143-s2e22-why-you-need-an-outside-in-approach-to-privacy-risk-monitoring-with-sanjay-saini-privaini.mp3" length="26914655" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/k4msi2gv3qmyzsct2jtpskmn2qct?.jpg" />
    <itunes:author>Debra J Farber / Sanjay Saini</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13304143</guid>
    <pubDate>Tue, 01 Aug 2023 07:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13304143/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13304143/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13304143/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13304143/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13304143/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E22: Why You Need an &#39;Outside-In&#39; Approach to Privacy Risk Monitoring with Sanjay Saini (Privaini)" />
  <psc:chapter start="1:14" title="Introducing Sanjay Saini, Founder at Privaini" />
  <psc:chapter start="3:07" title="What motivated Sanjay to found companies that bring trusted systems to market" />
  <psc:chapter start="6:04" title="What motivated Sanjay to found Privaini and focus on privacy risk monitoring" />
  <psc:chapter start="8:48" title="Sanjay&#39;s approach to quantitatively analyze &amp; monitor privacy risk throughout an entire &#39;business network&#39; and what he means by &#39;business network&#39;" />
  <psc:chapter start="12:01" title="Sanjay explains which stakeholders benefit from using the Privaini platform" />
  <psc:chapter start="15:57" title="The benefits to calculating a &quot;quantified privacy risk score&quot; for each company in your business network for monitoring privacy risk" />
  <psc:chapter start="20:18" title="How Privaini leverages AI to discover external data about companies&#39; privacy posture" />
  <psc:chapter start="23:00" title="The importance of continuous monitoring for external privacy risks and the benefits to using an &#39;outside-in&#39; approach" />
  <psc:chapter start="26:40" title="What it takes to set up an enterprise&#39;s network with Privaini for full coverage of external privacy risks" />
  <psc:chapter start="30:26" title="Debra &amp; Sanjay discuss the recent Criteo fines and how Privaini could have helped Criteo surface privacy risks about its vendors" />
  <psc:chapter start="33:58" title="How to reach Sanjay and his advice about shifting privacy left" />
</psc:chapters>
    <itunes:duration>2239</itunes:duration>
    <itunes:keywords>privacy risk monitoring, privacy risk, privacy policies, Sanjay Saini, Privaini, privacy tech, Criteo, privacy posture</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>22</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E21: Containing Big Tech, Federal Privacy Law, &amp; Investing in Privacy Tech with Tom Kemp (Kemp Au Ventures)</itunes:title>
    <title>S2E21: Containing Big Tech, Federal Privacy Law, &amp; Investing in Privacy Tech with Tom Kemp (Kemp Au Ventures)</title>
    <itunes:summary><![CDATA[This week’s guest is Tom Kemp: author; entrepreneur; former Co-Founder &amp; CEO of Centrify (now called Delinea), a leading cybersecurity cloud provider; and a Silicon Valley-based Seed Investor and Policy Advisor. Tom led campaign marketing efforts in 2020 to pass California Proposition 24, the California Privacy Rights Act (CPRA), and is currently co-authoring the California Delete Act bill. In this conversation, we discuss chapters within Tom’s new book, Containing Big Tech: How to ...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guest is <a href='https://www.tomkemp.ai/'>Tom Kemp</a>: author; entrepreneur; former Co-Founder &amp; CEO of Centrify (now called Delinea), a leading cybersecurity cloud provider; and a Silicon Valley-based Seed Investor and Policy Advisor. Tom led campaign marketing efforts in 2020 to pass California Proposition 24, the California Privacy Rights Act (CPRA), and is currently co-authoring the <a href='https://sd13.senate.ca.gov/news/press-release/april-11-2023/data-brokers-beware-californians-will-gain-new-privacy-protections'>California Delete Act</a> bill.</p><p>In this conversation, we discuss chapters within Tom’s new book, <a href='https://www.tomkemp.ai/containing-big-tech'><em>Containing Big Tech: How to Protect Our CIVIL RIGHTS, ECONOMY, and DEMOCRACY</em></a>; how big tech is using AI to feed into the attention economy; what should go into a U.S. federal privacy law and how it should be enforced; and a comprehensive look at some of Tom’s privacy tech investments. <br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Tom&apos;s new book - Containing Big Tech: How to Protect Our Civil Rights, Economy and Democracy</li><li>How and why Tom’s book is centered on data collection, artificial intelligence, and competition</li><li>U.S. state privacy legislation that Tom helped get passed &amp; what he&apos;s working on now, including: CPRA, the California Delete Act, &amp; Texas Data Broker Registry</li><li>Whether there will ever be a U.S.
federal, omnibus privacy law; what should be included in it; and how it should be enforced</li><li>Tom&apos;s work as a privacy tech and security tech Seed Investor with Kemp Au Ventures and what inspires him to invest in a startup or not</li><li>What inspired Tom to invest in <a href='https://www.privacycode.ai/contact-us'>PrivacyCode</a>, <a href='https://secuvy.ai/'>Secuvy</a> &amp; <a href='https://privaini.com/'>Privaini</a></li><li>Why team and market size are key factors Tom looks for when investing</li><li>The importance of designing for privacy from a &apos;user-interface perspective&apos; so that it’s consumer friendly</li><li>How consumers looking to trust companies are driving a shift left movement</li><li>Tom&apos;s advice for how companies can better shift left in their orgs &amp; within their business networks</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li><a href='https://oag.ca.gov/privacy/ccpa'>The California Consumer Privacy Act</a> (amended by the CPRA)</li><li><a href='https://sd13.senate.ca.gov/news/press-release/april-11-2023/data-brokers-beware-californians-will-gain-new-privacy-protections'>The California Delete Act</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Tom on <a href='https://www.linkedin.com/in/tomkemp'>LinkedIn</a></li><li><a href='https://www.tomkemp.ai/ventures'>Kemp Au Ventures</a></li><li>Pre-order <a href='https://www.tomkemp.ai/containing-big-tech'><em>Containing Big Tech: How to Protect Our CIVIL RIGHTS, ECONOMY, and DEMOCRACY</em></a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guest is <a href='https://www.tomkemp.ai/'>Tom Kemp</a>: author; entrepreneur; former Co-Founder &amp; CEO of Centrify (now called Delinea), a leading cybersecurity cloud provider; and a Silicon Valley-based Seed Investor and Policy Advisor. Tom led campaign marketing efforts in 2020 to pass California Proposition 24, the California Privacy Rights Act (CPRA), and is currently co-authoring the <a href='https://sd13.senate.ca.gov/news/press-release/april-11-2023/data-brokers-beware-californians-will-gain-new-privacy-protections'>California Delete Act</a> bill.</p><p>In this conversation, we discuss chapters within Tom’s new book, <a href='https://www.tomkemp.ai/containing-big-tech'><em>Containing Big Tech: How to Protect Our CIVIL RIGHTS, ECONOMY, and DEMOCRACY</em></a>; how big tech is using AI to feed into the attention economy; what should go into a U.S. federal privacy law and how it should be enforced; and a comprehensive look at some of Tom’s privacy tech investments. <br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Tom&apos;s new book - Containing Big Tech: How to Protect Our Civil Rights, Economy and Democracy</li><li>How and why Tom’s book is centered around data collection, artificial intelligence, and competition</li><li>U.S. state privacy legislation that Tom helped get passed &amp; what he&apos;s working on now, including: CPRA, the California Delete Act, &amp; Texas Data Broker Registry</li><li>Whether there will ever be a U.S. 
federal, omnibus privacy law; what should be included in it; and how it should be enforced</li><li>Tom&apos;s work as a privacy tech and security tech Seed Investor with Kemp Au Ventures and what inspires him to invest in a startup or not</li><li>What inspired Tom to invest in <a href='https://www.privacycode.ai/contact-us'>PrivacyCode</a>, <a href='https://secuvy.ai/'>Secuvy</a> &amp; <a href='https://privaini.com/'>Privaini</a></li><li>Why team and market size are key factors Tom looks for when investing</li><li>The importance of designing for privacy from a &apos;user-interface perspective&apos; so that it’s consumer friendly</li><li>How consumers looking to trust companies are driving a shift left movement</li><li>Tom&apos;s advice for how companies can better shift left in their orgs &amp; within their business networks</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li><a href='https://oag.ca.gov/privacy/ccpa'>The California Consumer Privacy Act</a> (amended by the CPRA)</li><li><a href='https://sd13.senate.ca.gov/news/press-release/april-11-2023/data-brokers-beware-californians-will-gain-new-privacy-protections'>The California Delete Act</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Tom on <a href='https://www.linkedin.com/in/tomkemp'>LinkedIn</a></li><li><a href='https://www.tomkemp.ai/ventures'>Kemp Au Ventures</a></li><li>Pre-order <a href='https://www.tomkemp.ai/containing-big-tech'><em>Containing Big Tech: How to Protect Our CIVIL RIGHTS, ECONOMY, and DEMOCRACY</em></a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13172356-s2e21-containing-big-tech-federal-privacy-law-investing-in-privacy-tech-with-tom-kemp-kemp-au-ventures.mp3" length="40047849" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/0nwntmq2bgqaifutxsftdbvae40i?.jpg" />
    <itunes:author>Debra J. Farber / Tom Kemp</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13172356</guid>
    <pubDate>Tue, 11 Jul 2023 15:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13172356/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13172356/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13172356/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13172356/transcript.vtt" type="text/vtt" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13172356/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E21: Containing Big Tech, Federal Privacy Law, &amp; Investing in Privacy Tech with Tom Kemp (Kemp Au Ventures)" />
  <psc:chapter start="1:59" title="Introducing Tom Kemp" />
  <psc:chapter start="3:13" title="Tom&#39;s new book - Containing Big Tech: How to Protect Our Civil Rights, Economy and Democracy" />
  <psc:chapter start="11:16" title="Tom describes U.S. state privacy legislation he helped get passed &amp; what he&#39;s working on now, including: CPRA, the California Delete Act, &amp; Texas Data Broker Registry" />
  <psc:chapter start="16:52" title="Tom and Debra give their thoughts on whether there will ever be a U.S. federal, omnibus privacy law" />
  <psc:chapter start="21:27" title="Tom explains what should be included in a comprehensive U.S. federal privacy law" />
  <psc:chapter start="28:24" title="Tom explains why a federal U.S. privacy law should require the use of Global Privacy Control (GPC)" />
  <psc:chapter start="33:02" title="Tom explains how a federal privacy law should be appropriately enforced" />
  <psc:chapter start="38:47" title="Tom describes his work as a privacy and security tech Seed Investor with Kemp Au Ventures and what inspires him to invest or not" />
  <psc:chapter start="44:50" title="Tom describes why he was prompted to invest in privacy tech companies: PrivacyCode, Secuvy, &amp; Privaini" />
  <psc:chapter start="49:52" title="Tom&#39;s advice for how companies can better shift left in their orgs &amp; within their business networks" />
</psc:chapters>
    <itunes:duration>3333</itunes:duration>
    <itunes:keywords>Containing Big Tech, Kemp Au Ventures, U.S. privacy legislation, federal privacy law, privacy tech, seed investment, angel investment, Global Privacy Control, GPC, PrivacyCode, Secuvy, Privaini, privacy law enforcement</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>21</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E20: Location Privacy, Data Brokers &amp; Privacy Datasets with Jeff Jockisch</itunes:title>
    <title>S2E20: Location Privacy, Data Brokers &amp; Privacy Datasets with Jeff Jockisch</title>
    <itunes:summary><![CDATA[This week’s guest is Jeff Jockisch, Partner at Avantis Privacy and co-host of the weekly LinkedIn Live event, Your Bytes = Your Rights, a town hall-style discussion around ownership, digital rights, and privacy. Jeff is currently a data privacy researcher at PrivacyPlan, where he focuses specifically on privacy data sets.  In this conversation, we delve into current risks to location privacy; how precise location data really is; how humans can have more control over their data; and what ...]]></itunes:summary>
    <description><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/jozian/'>Jeff Jockisch</a>, Partner at <a href='https://avantisprivacy.com/'>Avantis Privacy</a> and co-host of the weekly LinkedIn Live event, <a href='https://www.linkedin.com/groups/9166631/'>Your Bytes = Your Rights</a>, a town hall-style discussion around ownership, digital rights, and privacy. Jeff is currently a data privacy researcher at <a href='https://privacyplan.net/'>PrivacyPlan</a>, where he focuses specifically on privacy data sets. </p><p>In this conversation, we delve into current risks to location privacy; how precise location data really is; how humans can have more control over their data; and what organizations can do to protect humans’ data privacy. </p><p>For access to a dataset of data resources and privacy podcasts, check out Jeff’s robust <a href='https://privacyplan.net/privacy-datasets/privacy-podcast-db/'>database</a> — the Shifting Privacy Left podcast was recently added.</p><p><br/><b>Topics Covered:</b></p><ul><li>Jeff’s approach to creating privacy data sets and what “gaining insight into the privacy landscape” means.</li><li>How law enforcement can be a threat actor to someone’s privacy, using the example of Texas&apos; abortion law.</li><li>Whether data brokers are getting exact location information or are inferring someone’s location.</li><li>Why geolocation brokers had not considered themselves data brokers.</li><li>Why anonymization is insufficient for location privacy.</li><li>How &apos;consent theater&apos; coupled with location leakage is an existential threat to our privacy.</li><li>How people can protect themselves from having data collected and sold by data and location brokers.</li><li>Why app permissions should be more specific when notifying users about personal data collection and use. 
</li><li>How Apple and Android devices treat Mobile Ad ID (MAID) differently and how that affects your historical location data.</li><li>How companies can protect data by using broader geolocation information instead of precise geolocation information.</li><li>More information about Jeff&apos;s LinkedIn Live show, Your Bytes = Your Rights.</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li><a href='https://avantisprivacy.com/'>Avantis Privacy</a></li><li><a href='https://privacyplan.net/'>PrivacyPlan</a></li><li><a href='https://shiftingprivacyleft.buzzsprout.com/2059470/13089122-s2e19-privacy-threat-modeling-mitigating-privacy-threats-in-software-with-kim-wuyts-ku-leuven'>Threat modeling episode with Kim Wuyts</a></li><li><a href='https://www.linkedin.com/groups/9166631/'>&quot;Your Bytes = Your Rights&quot; LinkedIn Live</a></li><li><a href='https://sd13.senate.ca.gov/news/press-release/april-11-2023/data-brokers-beware-californians-will-gain-new-privacy-protections'>The California Delete Act</a></li><li><a href='https://privacyplan.net/privacy-datasets/privacy-podcast-db/'>Privacy Podcast Database</a></li><li><a href='https://www.tomkemp.ai/'><em>Containing Big Tech</em></a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Jeff on <a href='https://www.linkedin.com/in/jozian/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. 
All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week’s guest is <a href='https://www.linkedin.com/in/jozian/'>Jeff Jockisch</a>, Partner at <a href='https://avantisprivacy.com/'>Avantis Privacy</a> and co-host of the weekly LinkedIn Live event, <a href='https://www.linkedin.com/groups/9166631/'>Your Bytes = Your Rights</a>, a town hall-style discussion around ownership, digital rights, and privacy. Jeff is currently a data privacy researcher at <a href='https://privacyplan.net/'>PrivacyPlan</a>, where he focuses specifically on privacy data sets. </p><p>In this conversation, we delve into current risks to location privacy; how precise location data really is; how humans can have more control over their data; and what organizations can do to protect humans’ data privacy. </p><p>For access to a dataset of data resources and privacy podcasts, check out Jeff’s robust <a href='https://privacyplan.net/privacy-datasets/privacy-podcast-db/'>database</a> — the Shifting Privacy Left podcast was recently added.</p><p><br/><b>Topics Covered:</b></p><ul><li>Jeff’s approach to creating privacy data sets and what “gaining insight into the privacy landscape” means.</li><li>How law enforcement can be a threat actor to someone’s privacy, using the example of Texas&apos; abortion law.</li><li>Whether data brokers are getting exact location information or are inferring someone’s location.</li><li>Why geolocation brokers had not considered themselves data brokers.</li><li>Why anonymization is insufficient for location privacy.</li><li>How &apos;consent theater&apos; coupled with location leakage is an existential threat to our privacy.</li><li>How people can protect themselves from having data collected and sold by data and location brokers.</li><li>Why app permissions should be more specific when notifying users about personal data collection and use. 
</li><li>How Apple and Android devices treat Mobile Ad ID (MAID) differently and how that affects your historical location data.</li><li>How companies can protect data by using broader geolocation information instead of precise geolocation information.</li><li>More information about Jeff&apos;s LinkedIn Live show, Your Bytes = Your Rights.</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li><a href='https://avantisprivacy.com/'>Avantis Privacy</a></li><li><a href='https://privacyplan.net/'>PrivacyPlan</a></li><li><a href='https://shiftingprivacyleft.buzzsprout.com/2059470/13089122-s2e19-privacy-threat-modeling-mitigating-privacy-threats-in-software-with-kim-wuyts-ku-leuven'>Threat modeling episode with Kim Wuyts</a></li><li><a href='https://www.linkedin.com/groups/9166631/'>&quot;Your Bytes = Your Rights&quot; LinkedIn Live</a></li><li><a href='https://sd13.senate.ca.gov/news/press-release/april-11-2023/data-brokers-beware-californians-will-gain-new-privacy-protections'>The California Delete Act</a></li><li><a href='https://privacyplan.net/privacy-datasets/privacy-podcast-db/'>Privacy Podcast Database</a></li><li><a href='https://www.tomkemp.ai/'><em>Containing Big Tech</em></a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Jeff on <a href='https://www.linkedin.com/in/jozian/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. 
All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13089130-s2e20-location-privacy-data-brokers-privacy-datasets-with-jeff-jockisch.mp3" length="31116141" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/ffq21w6affbju2mrhw0xb641twdh?.jpg" />
    <itunes:author>Debra J. Farber / Jeff Jockisch</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13089130</guid>
    <pubDate>Wed, 05 Jul 2023 12:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13089130/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13089130/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13089130/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13089130/transcript.vtt" type="text/vtt" />
    <itunes:duration>2589</itunes:duration>
    <itunes:keywords>location privacy, geolocation</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>20</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E19: Privacy Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)</itunes:title>
    <title>S2E19: Privacy Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)</title>
    <itunes:summary><![CDATA[This week's guest is Kim Wuyts, Senior Postdoctoral Researcher at the DistriNet Research Group at the Department of Computer Science at KU Leuven. Kim is one of the leading minds behind the development and extension of LINDDUN, a privacy threat modeling framework that mitigates privacy threats in software systems.  In this conversation, we discuss threat modeling based on the Threat Modeling Manifesto Kim co-authored; the benefits to using the LINDDUN privacy threat model framework; and how t...]]></itunes:summary>
    <description><![CDATA[<p>This week&apos;s guest is <a href='https://www.linkedin.com/in/kwuyts/'>Kim Wuyts</a>, Senior Postdoctoral Researcher at the <a href='https://distrinet.cs.kuleuven.be/'>DistriNet Research Group</a> at the Department of Computer Science at <a href='https://www.kuleuven.be/english/kuleuven'>KU Leuven</a>. Kim is one of the leading minds behind the development and extension of LINDDUN, a privacy threat modeling framework that mitigates privacy threats in software systems.<br/><br/>In this conversation, we discuss threat modeling based on the Threat Modeling Manifesto Kim co-authored; the benefits to using the LINDDUN privacy threat model framework; and how to bridge the gap between privacy-enhancing technologies (PETs) in academia and the commercial world.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>Kim&apos;s career journey &amp; why she moved into threat modeling.</li><li>The definition of &apos;threat modeling,&apos; who should threat model, and what&apos;s included in her &quot;Threat Modeling Manifesto.&quot;</li><li>The connection between threat modeling &amp; a &apos;shift left&apos; mindset / strategy.</li><li>Design patterns that benefit threat modeling &amp; anti-patterns that inhibit it.</li><li>Benefits to using the LINDDUN Privacy Threat Modeling framework for mitigating privacy threats in software, including the 7 &apos;privacy threat types,&apos; associated &apos;privacy threat trees,&apos; and examples.</li><li>How &apos;privacy threat trees&apos; refine each threat type into concrete threat characteristics, examples, criteria &amp; impact info.</li><li>Benefits &amp; differences between LINDDUN GO and LINDDUN PRO.</li><li>How orgs can combine threat modeling approaches with PETs to address privacy risk.</li><li>Kim&apos;s work as Program Chair for the International Workshop on Privacy Engineering (IWPE), highlighting some anticipated talks.</li><li>The overlap of privacy &amp; AI threats, and Kim&apos;s 
recommendation of The Privacy Library of Threats 4 AI (&quot;PLOT4AI&quot;) Threat Modeling Card Deck</li><li>Recommended resources for privacy threat modeling, privacy engineering &amp; PETs.</li><li>How the LINDDUN model &amp; methodologies have been adopted by global orgs.</li><li>How to bridge the gap between the academic &amp; commercial world to advance &amp; deploy PETs.<br/><br/></li></ul><p><b>Resources Mentioned:</b></p><ul><li><a href='https://www.threatmodelingmanifesto.org/'>The Threat Modeling Manifesto</a></li><li><a href='https://linddun.org/'>LINDDUN Privacy Threat Model</a>  </li><li><a href='https://en.wikipedia.org/wiki/STRIDE_(security)'>STRIDE threat model</a></li><li><a href='https://www.threatmodelingconnect.com/'>Threat Modeling Connect Community</a></li><li><a href='https://shostack.org/games/elevation-of-privilege'>Elevation of Privilege card game</a></li><li><a href='https://plot4.ai/'>Plot4AI (privacy &amp; AI threat modeling) card deck</a></li><li><a href='https://www.iwpe.info/index.html'>International Workshop on Privacy Engineering (IWPE)</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Kim on <a href='https://www.linkedin.com/in/kwuyts/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week&apos;s guest is <a href='https://www.linkedin.com/in/kwuyts/'>Kim Wuyts</a>, Senior Postdoctoral Researcher at the <a href='https://distrinet.cs.kuleuven.be/'>DistriNet Research Group</a> at the Department of Computer Science at <a href='https://www.kuleuven.be/english/kuleuven'>KU Leuven</a>. Kim is one of the leading minds behind the development and extension of LINDDUN, a privacy threat modeling framework that mitigates privacy threats in software systems.<br/><br/>In this conversation, we discuss threat modeling based on the Threat Modeling Manifesto Kim co-authored; the benefits to using the LINDDUN privacy threat model framework; and how to bridge the gap between privacy-enhancing technologies (PETs) in academia and the commercial world.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>Kim&apos;s career journey &amp; why she moved into threat modeling.</li><li>The definition of &apos;threat modeling,&apos; who should threat model, and what&apos;s included in her &quot;Threat Modeling Manifesto.&quot;</li><li>The connection between threat modeling &amp; a &apos;shift left&apos; mindset / strategy.</li><li>Design patterns that benefit threat modeling &amp; anti-patterns that inhibit it.</li><li>Benefits to using the LINDDUN Privacy Threat Modeling framework for mitigating privacy threats in software, including the 7 &apos;privacy threat types,&apos; associated &apos;privacy threat trees,&apos; and examples.</li><li>How &apos;privacy threat trees&apos; refine each threat type into concrete threat characteristics, examples, criteria &amp; impact info.</li><li>Benefits &amp; differences between LINDDUN GO and LINDDUN PRO.</li><li>How orgs can combine threat modeling approaches with PETs to address privacy risk.</li><li>Kim&apos;s work as Program Chair for the International Workshop on Privacy Engineering (IWPE), highlighting some anticipated talks.</li><li>The overlap of privacy &amp; AI threats, and Kim&apos;s 
recommendation of The Privacy Library of Threats 4 AI (&quot;PLOT4AI&quot;) Threat Modeling Card Deck</li><li>Recommended resources for privacy threat modeling, privacy engineering &amp; PETs.</li><li>How the LINDDUN model &amp; methodologies have been adopted by global orgs.</li><li>How to bridge the gap between the academic &amp; commercial world to advance &amp; deploy PETs.<br/><br/></li></ul><p><b>Resources Mentioned:</b></p><ul><li><a href='https://www.threatmodelingmanifesto.org/'>The Threat Modeling Manifesto</a></li><li><a href='https://linddun.org/'>LINDDUN Privacy Threat Model</a>  </li><li><a href='https://en.wikipedia.org/wiki/STRIDE_(security)'>STRIDE threat model</a></li><li><a href='https://www.threatmodelingconnect.com/'>Threat Modeling Connect Community</a></li><li><a href='https://shostack.org/games/elevation-of-privilege'>Elevation of Privilege card game</a></li><li><a href='https://plot4.ai/'>Plot4AI (privacy &amp; AI threat modeling) card deck</a></li><li><a href='https://www.iwpe.info/index.html'>International Workshop on Privacy Engineering (IWPE)</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Kim on <a href='https://www.linkedin.com/in/kwuyts/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/13089122-s2e19-privacy-threat-modeling-mitigating-privacy-threats-in-software-with-kim-wuyts-ku-leuven.mp3" length="32506305" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/5i3lwarp57plzrv0q3twf173gdnr?.jpg" />
    <itunes:author>Debra J. Farber / Kim Wuyts</itunes:author>
    <guid isPermaLink="false">Buzzsprout-13089122</guid>
    <pubDate>Tue, 27 Jun 2023 11:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/13089122/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/13089122/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E19: Privacy Threat Modeling - Mitigating Privacy Threats in Software with Kim Wuyts (KU Leuven)" />
  <psc:chapter start="1:15" title="Introducing Kim Wuyts, Sr. Postdoctoral Researcher at the IMEC-DistriNet Research Group at the Department of Computer Science at KU Leuven." />
  <psc:chapter start="2:28" title="Kim describes her career journey and how she became interested in threat modeling." />
  <psc:chapter start="4:28" title="Kim defines &#39;threat modeling,&#39; explains who should threat model, and discusses her co-authored &#39;Threat Modeling Manifesto.&#39;" />
  <psc:chapter start="6:36" title="Kim describes the connection between threat modeling and a &#39;shift left&#39; mindset / strategy." />
  <psc:chapter start="9:20" title="Kim describes basic design patterns that benefit threat modeling and anti-patterns that inhibit threat modeling." />
  <psc:chapter start="13:13" title="Kim explains the benefits to using the (free) LINDDUN Privacy Threat Modeling Framework for mitigating privacy threats in software. She also describes the 7 &#39;privacy threat types,&#39; &#39;privacy threat trees,&#39; and some examples." />
  <psc:chapter start="20:56" title="Kim describes &#39;privacy threat trees&#39; and how they help you refine each threat type into more concrete threat characteristics." />
  <psc:chapter start="23:47" title="Kim uses the privacy threat type of &quot;linking&quot; to illustrate what concrete threat characteristics, examples, criteria, and impact info would be included in a &#39;privacy threat tree.&#39;" />
  <psc:chapter start="26:32" title="Kim explains the benefits and differences between LINDDUN&#39;s methodologies: LINDDUN GO &amp; LINDDUN PRO. " />
  <psc:chapter start="30:59" title="We discuss how orgs can combine threat modeling approaches with privacy enhancing technologies to address privacy risks; and Kim recommends multiple resources." />
  <psc:chapter start="33:18" title="Kim describes her work as Program Chair for the International Workshop on Privacy Engineering (IWPE) Conference and highlights some anticipated talks." />
  <psc:chapter start="35:58" title="Kim discusses the topic of privacy &amp; AI and refers listeners to the Privacy Library of Threats 4 AI (&quot;PLOT4AI&quot;) Threat Modeling Card Deck" />
  <psc:chapter start="36:45" title="Kim lists her favorite resources for privacy threat modeling, privacy engineering, and PETs." />
  <psc:chapter start="38:54" title="Kim talks about how the LINDDUN model and methodologies have been adopted by organizations over the past few years. " />
  <psc:chapter start="40:44" title="Kim shares how we can better bridge the gap between the academic and commercial world when it comes to advancing and deploying PETs." />
</psc:chapters>
    <itunes:duration>2705</itunes:duration>
    <itunes:keywords>privacy threat modeling, LINDDUN, privacy threat types, privacy threat trees, threat model, privacy engineering, privacy threats</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>19</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E18: Making Digital Contact Cards Private, Shareable &amp; Updatable with Brad Dominy (Neucards)</itunes:title>
    <title>S2E18: Making Digital Contact Cards Private, Shareable &amp; Updatable with Brad Dominy (Neucards)</title>
    <itunes:summary><![CDATA[I am delighted to welcome my next guest, Brad Dominy. Brad is a macOS and iOS developer and Founder &amp; Inventor of Neucards, a privacy-preserving app that enables secure, shareable, and updatable digital contacts. In this conversation, we delve into why personally managing our digital contacts has been so difficult and Brad's novel approach to securely managing our contacts, architected with privacy by design and default. Contacts have always been the “junk drawer” of digital data, where peopl...]]></itunes:summary>
    <description><![CDATA[<p>I am delighted to welcome my next guest, <a href='https://www.linkedin.com/in/braddominy/'>Brad Dominy</a>. Brad is a MacOS and iOS developer and Founder &amp; Inventor of <a href='https://www.neucards.com/'>Neucards</a>, a privacy-preserving app that enables secure, shareable, and updatable digital contacts. In this conversation, we delve into why personally managing our digital contacts has been so difficult and Brad&apos;s novel approach to securely manage our contacts, architected with privacy by design and default.</p><p>Contacts have always been the “junk drawer” of digital data, where people have information that they want to keep up-to-date, but are rarely able to based on current technology. The vCard standard is outdated, but is the only standard that works across iOS, Android, and Microsoft. It is still the most commonly used contact format, but lacks any capacity for updating contacts. Once someone exchanges their contact information with you, it then falls on <em>you</em> to keep that up-to-date. 
This is why Brad created <a href='https://www.neucards.com/'>Neucards</a>: to gain the benefits of sharing information easily, privately (with E2EE) and receiving updates across all platforms.</p><p><br/><b>Topics Covered:</b></p><ul><li>Why it is difficult to keep our digital contacts up-to-date across devices and platforms.</li><li>Brad describes his career journey that inspired him to invent Neucards; the problems Neucards solves for; and why this became his passion project for over a decade</li><li>Why companies haven’t innovated more in the digital contacts space</li><li>The 3 main features that make Neucards different from other contact apps</li><li>How Neucards enables you to share digital contacts data easily &amp; securely</li><li>Neucards&apos; privacy by design and default approach to sharing and updating digital contacts</li><li>How you can use NFC tap tags with Neucards to make the process of sharing digital contacts much easier</li><li>Whether Neucards can solve the &quot;New phone, who dis?&quot; problem</li><li>Whether we will see an update to the vCard standard or new standards for digital contacts</li><li>Neucards&apos; roadmap, including a &apos;mask communications&apos; feature</li><li>The importance of language; the difference between &apos;privacy-preserving&apos; vs. &apos;privacy-enabling&apos; architectural approaches</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn about <a href='https://www.neucards.com/'>Neucards</a></li><li>Download the <a href='https://apps.apple.com/us/app/neucards-secure-contact-card/id1599851881'>Neucards iOS app</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Brad on <a href='https://www.linkedin.com/in/braddominy/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>I am delighted to welcome my next guest, <a href='https://www.linkedin.com/in/braddominy/'>Brad Dominy</a>. Brad is a MacOS and iOS developer and Founder &amp; Inventor of <a href='https://www.neucards.com/'>Neucards</a>, a privacy-preserving app that enables secure, shareable, and updatable digital contacts. In this conversation, we delve into why personally managing our digital contacts has been so difficult and Brad&apos;s novel approach to securely manage our contacts, architected with privacy by design and default.</p><p>Contacts have always been the “junk drawer” of digital data, where people have information that they want to keep up-to-date, but are rarely able to based on current technology. The vCard standard is outdated, but is the only standard that works across iOS, Android, and Microsoft. It is still the most commonly used contact format, but lacks any capacity for updating contacts. Once someone exchanges their contact information with you, it then falls on <em>you</em> to keep that up-to-date. 
This is why Brad created <a href='https://www.neucards.com/'>Neucards</a>: to gain the benefits of sharing information easily, privately (with E2EE) and receiving updates across all platforms.</p><p><br/><b>Topics Covered:</b></p><ul><li>Why it is difficult to keep our digital contacts up-to-date across devices and platforms.</li><li>Brad describes his career journey that inspired him to invent Neucards; the problems Neucards solves for; and why this became his passion project for over a decade</li><li>Why companies haven’t innovated more in the digital contacts space</li><li>The 3 main features that make Neucards different from other contact apps</li><li>How Neucards enables you to share digital contacts data easily &amp; securely</li><li>Neucards&apos; privacy by design and default approach to sharing and updating digital contacts</li><li>How you can use NFC tap tags with Neucards to make the process of sharing digital contacts much easier</li><li>Whether Neucards can solve the &quot;New phone, who dis?&quot; problem</li><li>Whether we will see an update to the vCard standard or new standards for digital contacts</li><li>Neucards&apos; roadmap, including a &apos;mask communications&apos; feature</li><li>The importance of language; the difference between &apos;privacy-preserving&apos; vs. &apos;privacy-enabling&apos; architectural approaches</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn about <a href='https://www.neucards.com/'>Neucards</a></li><li>Download the <a href='https://apps.apple.com/us/app/neucards-secure-contact-card/id1599851881'>Neucards iOS app</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Brad on <a href='https://www.linkedin.com/in/braddominy/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12853299-s2e18-making-digital-contact-cards-private-shareable-updatable-with-brad-dominy-neucards.mp3" length="34622925" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/pi55oc78k9outhq9r8o3yavsf499?.jpg" />
    <itunes:author>Debra J. Farber / Brad Dominy</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12853299</guid>
    <pubDate>Tue, 16 May 2023 13:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12853299/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12853299/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E18: Making Digital Contact Cards Private, Shareable &amp; Updatable with Brad Dominy (Neucards)" />
  <psc:chapter start="1:16" title="Introducing Brad Dominy, MacOS and iOS developer and founder and inventor of Neucards" />
  <psc:chapter start="2:38" title="Why it&#39;s so difficult to keep our digital contacts up-to-date across devices and platforms" />
  <psc:chapter start="6:43" title="Brad describes his career journey that led him to invent Neucards; the problems it solves for; and why this became his passion project for over a decade" />
  <psc:chapter start="10:22" title="Brad describes the features that make Neucards different from other contacts apps out there" />
  <psc:chapter start="13:47" title="How Neucards enables you to share digital contacts data easily &amp; securely" />
  <psc:chapter start="17:06" title="Neucards&#39; privacy by design and default approach to sharing and updating digital contacts" />
  <psc:chapter start="20:12" title="Brad explains how you can use NFC tap tags with Neucards to make the process of sharing digital contacts much easier" />
  <psc:chapter start="24:29" title="Whether Neucards can solve the &quot;New phone, who dis?&quot; problem" />
  <psc:chapter start="31:36" title="Whether we will see an update to the vCard standard or new standards for digital contacts" />
  <psc:chapter start="33:25" title="Brad describes features on Neucards&#39; product roadmap, including a &#39;mask communications&#39; feature" />
  <psc:chapter start="41:02" title="Brad makes a call for collaborators who want to help make digital contacts easier to share, update, and to keep private" />
</psc:chapters>
    <itunes:duration>2881</itunes:duration>
    <itunes:keywords>digital contacts, E2EE, end to end encryption, privacy architecture, iOS, updatable contacts, shareable contacts, digital contacts management</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>18</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E17 - Noise in the Machine: How to Assess, Design &amp; Deploy &#39;Differential Privacy&#39; with Damien Desfontaines (Tumult Labs)</itunes:title>
    <title>S2E17 - Noise in the Machine: How to Assess, Design &amp; Deploy &#39;Differential Privacy&#39; with Damien Desfontaines (Tumult Labs)</title>
    <itunes:summary><![CDATA[In this week’s episode, I speak with Damien Desfontaines, also known by the pseudonym “Ted”, who is the Staff Scientist at Tumult Labs, a startup leading the way on differential privacy. In Damien’s career, he has led an Anonymization Consulting Team at Google and specializes in making it easy to safely anonymize data. Damien earned his PhD and wrote his thesis at ETH Zurich, as well as his Master's Degree in Mathematical Logic and Theoretical Computer Science. Tumult Labs’ platform makes dif...]]></itunes:summary>
    <description><![CDATA[<p>In this week’s episode, I speak with <a href='https://www.linkedin.com/in/desfontaines/'>Damien Desfontaines</a>, also known by the pseudonym “Ted”, who is the Staff Scientist at <a href='https://www.tmlt.io/'>Tumult Labs</a>, a startup leading the way on differential privacy. In Damien’s career, he has led an Anonymization Consulting Team at Google and specializes in making it easy to safely anonymize data. Damien earned his PhD and wrote his thesis at <a href='https://ethz.ch/en.html'>ETH Zurich</a>, as well as his Master&apos;s Degree in Mathematical Logic and Theoretical Computer Science.</p><p><a href='https://www.tmlt.io/'>Tumult Labs</a>’ platform makes differential privacy useful by making it easy to create innovative, privacy-enabling data products that can be safely shared and used widely. In this conversation, we focus our discussion on Differential Privacy techniques, including what’s next in its evolution, common vulnerabilities, and how to implement differential privacy in your platform.</p><p>When it comes to protecting personal data, <a href='https://www.tmlt.io/'>Tumult Labs</a> has three stages in their approach. These are Assess, Design, and Deploy. 
Damien takes us on a deep dive into each with use cases provided.<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Why there&apos;s such a gap between academia and the corporate world</li><li>How differential privacy&apos;s strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education &amp; usability</li><li>When to use &quot;local&quot; vs &quot;central&quot; differential privacy techniques</li><li>Advancements in technology that enable the private collection of data</li><li>Tumult Labs&apos; Assessment approach to deploying differential privacy, where a customer defines its &apos;data publication&apos; problem or question</li><li>How the Tumult Analytics platform can help you build differential privacy algorithms that satisfy &apos;fitness for use&apos; requirements</li><li>Why using gold standard techniques like differential privacy to safely release, publish, or share data has value far beyond compliance</li><li>How data scientists can make the analysis &amp; design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks &amp; number of tasks that you can possibly answer</li><li>Damien&apos;s work assisting the IRS &amp; DOE in deploying differential privacy to safely publish and share data publicly via the College Scorecards project</li><li>How to address security vulnerabilities (i.e. 
potential attacks) to differentially private datasets</li><li>Where you can learn more about differential privacy</li><li>How Damien sees this space evolving over the next several years</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Join the <a href='https://www.tmlt.dev/slack'>Tumult Labs Slack</a></li><li>Learn about <a href='https://www.tmlt.io/'>Tumult Labs</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Damien on <a href='https://www.linkedin.com/in/desfontaines/'>LinkedIn</a></li><li>Learn more on Damien’s <a href='https://desfontain.es/serious.html'>website</a></li><li>Follow &apos;Ted&apos; on <a href='https://twitter.com/TedOnPrivacy'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this week’s episode, I speak with <a href='https://www.linkedin.com/in/desfontaines/'>Damien Desfontaines</a>, also known by the pseudonym “Ted”, who is the Staff Scientist at <a href='https://www.tmlt.io/'>Tumult Labs</a>, a startup leading the way on differential privacy. In Damien’s career, he has led an Anonymization Consulting Team at Google and specializes in making it easy to safely anonymize data. Damien earned his PhD and wrote his thesis at <a href='https://ethz.ch/en.html'>ETH Zurich</a>, as well as his Master&apos;s Degree in Mathematical Logic and Theoretical Computer Science.</p><p><a href='https://www.tmlt.io/'>Tumult Labs</a>’ platform makes differential privacy useful by making it easy to create innovative, privacy-enabling data products that can be safely shared and used widely. In this conversation, we focus our discussion on Differential Privacy techniques, including what’s next in its evolution, common vulnerabilities, and how to implement differential privacy in your platform.</p><p>When it comes to protecting personal data, <a href='https://www.tmlt.io/'>Tumult Labs</a> has three stages in their approach. These are Assess, Design, and Deploy. 
Damien takes us on a deep dive into each with use cases provided.<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Why there&apos;s such a gap between academia and the corporate world</li><li>How differential privacy&apos;s strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education &amp; usability</li><li>When to use &quot;local&quot; vs &quot;central&quot; differential privacy techniques</li><li>Advancements in technology that enable the private collection of data</li><li>Tumult Labs&apos; Assessment approach to deploying differential privacy, where a customer defines its &apos;data publication&apos; problem or question</li><li>How the Tumult Analytics platform can help you build differential privacy algorithms that satisfy &apos;fitness for use&apos; requirements</li><li>Why using gold standard techniques like differential privacy to safely release, publish, or share data has value far beyond compliance</li><li>How data scientists can make the analysis &amp; design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks &amp; number of tasks that you can possibly answer</li><li>Damien&apos;s work assisting the IRS &amp; DOE in deploying differential privacy to safely publish and share data publicly via the College Scorecards project</li><li>How to address security vulnerabilities (i.e. 
potential attacks) to differentially private datasets</li><li>Where you can learn more about differential privacy</li><li>How Damien sees this space evolving over the next several years</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Join the <a href='https://www.tmlt.dev/slack'>Tumult Labs Slack</a></li><li>Learn about <a href='https://www.tmlt.io/'>Tumult Labs</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Damien on <a href='https://www.linkedin.com/in/desfontaines/'>LinkedIn</a></li><li>Learn more on Damien’s <a href='https://desfontain.es/serious.html'>website</a></li><li>Follow &apos;Ted&apos; on <a href='https://twitter.com/TedOnPrivacy'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12457123-s2e17-noise-in-the-machine-how-to-assess-design-deploy-differential-privacy-with-damien-desfontaines-tumult-labs.mp3" length="33253391" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/ih9mcahinle4v4vuytko4wgwd896?.jpg" />
    <itunes:author>Debra J. Farber / Damien Desfontaines</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12457123</guid>
    <pubDate>Tue, 09 May 2023 13:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12457123/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12457123/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E17 - Noise in the Machine: How to Assess, Design &amp; Deploy &#39;Differential Privacy&#39; with Damien Desfontaines (Tumult Labs)" />
  <psc:chapter start="1:15" title="Introducing Damien Desfontaines, PhD" />
  <psc:chapter start="3:34" title="Why there&#39;s such a gap between academia and the corporate world" />
  <psc:chapter start="5:19" title="How differential privacy&#39;s strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education &amp; usability" />
  <psc:chapter start="8:03" title="When to use &quot;local&quot; vs &quot;central&quot; differential privacy techniques" />
  <psc:chapter start="11:56" title="Damien describes advancements in technology that enable the private collection of data (i.e., multi-party computation, secure computation, federated learning) that can be used with local DP" />
  <psc:chapter start="14:32" title="Damien describes Tumult Labs&#39; Assessment approach to deploying differential privacy, where a customer would define its &#39;data publication&#39; problem or question." />
  <psc:chapter start="17:08" title="Damien describes how the open source Tumult Analytics platform can help you build differential privacy algorithms that satisfy &#39;fitness for use&#39; requirements" />
  <psc:chapter start="19:13" title="Why using gold standard techniques like differential privacy to safely release, publish, or share data goes beyond compliance to unlock the value of company data" />
  <psc:chapter start="20:37" title="What&#39;s involved with deploying differentially private algorithms via Tumult Labs&#39; platform" />
  <psc:chapter start="21:49" title="Damien&#39;s litmus test for when it&#39;s appropriate to use differential privacy" />
  <psc:chapter start="26:34" title="How data scientists can make the analysis &amp; design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks and number of tasks that you can possibly answer" />
  <psc:chapter start="30:36" title="Damien describes his work assisting the IRS &amp; DOE in deploying differential privacy to safely publish and share data publicly via the College Scorecards project" />
  <psc:chapter start="33:11" title="Damien discusses security vulnerabilities (i.e. potential attacks) to differentially private datasets" />
  <psc:chapter start="37:33" title="Where you can learn more about differential privacy" />
  <psc:chapter start="40:27" title="How Damien sees this space evolving over the next several years" />
</psc:chapters>
    <itunes:duration>2767</itunes:duration>
    <itunes:keywords>differential privacy</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>17</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E16: Words with Impact; Communication Tips for Privacy Technologists with Melanie Ensign (Discernible)</itunes:title>
    <title>S2E16: Words with Impact; Communication Tips for Privacy Technologists with Melanie Ensign (Discernible)</title>
    <itunes:summary><![CDATA[I'm delighted to welcome guest, Melanie Ensign, Founder and CEO of Discernible, where she helps organizations adopt effective communication strategies to improve risk-related outcomes. She's managed security &amp; privacy communications for some of the world's most notable brands, including Facebook, Uber &amp; AT&amp;T.  Melanie counsels executives and technical teams to cut through internal politics, dysfunctional inertia &amp; meaningless metrics. For the past 10 years, she's also led the ...]]></itunes:summary>
    <description><![CDATA[<p>I&apos;m delighted to welcome guest, <a href='https://www.linkedin.com/in/melanieensign'>Melanie Ensign</a>, Founder and CEO of <a href='https://discernibleinc.com/'>Discernible</a>, where she helps organizations adopt effective communication strategies to improve risk-related outcomes. She&apos;s managed security &amp; privacy communications for some of the world&apos;s most notable brands, including Facebook, Uber &amp; AT&amp;T.<br/><br/>Melanie counsels executives and technical teams to cut through internal politics, dysfunctional inertia &amp; meaningless metrics. For the past 10 years, she&apos;s also led the press department &amp; communication strategy for DEF CON. Also, Melanie is an accomplished scuba diver and brings lessons learned preventing, preparing for &amp; navigating unexpected high-risk underwater incidents to her work in security &amp; privacy. Today&apos;s discussion focuses on the importance of communication strategies and tactics for privacy engineering teams. </p><p><b>Topics Covered</b>:</p><ul><li>Melanie&apos;s career journey and how she leveraged her experience in shark science to help executives get over their initial fears of the unknown in security &amp; privacy</li><li>How Melanie guides and supports technical teams at Discernible on effective communications</li><li>How to prevent &apos;Privacy Outrage&apos;</li><li>The value of preventing privacy snafus rather than focusing only on crisis comms</li><li>How companies can use technical communication strategies &amp; tactics to earn trust with the public</li><li>The problem with incentives - why most social media metrics have been bullshit for far too long</li><li>Why Melanie decided to leave big tech to start Discernible</li><li>Insight into the 7 Arthur W. 
Page Society Principles, a &apos;code of ethics&apos; for communications professionals</li><li>What makes for a good PR story that the media would want to cover</li><li>Why press releases are mostly ineffective except for announcing funding raises</li><li>The importance of educating the community for which you&apos;re building</li><li>Melanie&apos;s advice to Elon Musk, who does not invest in a comms team</li><li>What OpenAI could have done differently, and whether their go-to-market strategy was effective</li><li>The importance of elevating Compliance teams to Business Advisors in the eyes of stakeholders</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Subscribe to the <a href='https://discernibleinc.com/newsletter-signup'>Discernible newsletter</a></li><li>Discover <a href='https://github.com/readme'>Github&apos;s ReadMe Newsletter</a></li><li>Learn about the <a href='https://www.bellisario.psu.edu/page-center/about/arthur-w-page/the-page-principles'>Arthur W. Page Principles</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Melanie on <a href='https://www.linkedin.com/in/melanieensign'>LinkedIn</a></li><li>Follow Melanie (Wednesday@defcon.social) on <a href='https://defcon.social/@Wednesday'>Mastodon</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>I&apos;m delighted to welcome guest, <a href='https://www.linkedin.com/in/melanieensign'>Melanie Ensign</a>, Founder and CEO of <a href='https://discernibleinc.com/'>Discernible</a>, where she helps organizations adopt effective communication strategies to improve risk-related outcomes. She&apos;s managed security &amp; privacy communications for some of the world&apos;s most notable brands, including Facebook, Uber &amp; AT&amp;T.<br/><br/>Melanie counsels executives and technical teams to cut through internal politics, dysfunctional inertia &amp; meaningless metrics. For the past 10 years, she&apos;s also led the press department &amp; communication strategy for DEF CON. Also, Melanie is an accomplished scuba diver and brings lessons learned preventing, preparing for &amp; navigating unexpected high-risk underwater incidents to her work in security &amp; privacy. Today&apos;s discussion focuses on the importance of communication strategies and tactics for privacy engineering teams. </p><p><b>Topics Covered</b>:</p><ul><li>Melanie&apos;s career journey and how she leveraged her experience in shark science to help executives get over their initial fears of the unknown in security &amp; privacy</li><li>How Melanie guides and supports technical teams at Discernible on effective communications</li><li>How to prevent &apos;Privacy Outrage&apos;</li><li>The value of preventing privacy snafus rather than focusing only on crisis comms</li><li>How companies can use technical communication strategies &amp; tactics to earn trust with the public</li><li>The problem with incentives - why most social media metrics have been bullshit for far too long</li><li>Why Melanie decided to leave big tech to start Discernible</li><li>Insight into the 7 Arthur W. 
Page Society Principles, a &apos;code of ethics&apos; for communications professionals</li><li>What makes for a good PR story that the media would want to cover</li><li>Why press releases are mostly ineffective except for announcing funding raises</li><li>The importance of educating the community for which you&apos;re building</li><li>Melanie&apos;s advice to Elon Musk, who does not invest in a comms team</li><li>What OpenAI could have done differently, and whether their go-to-market strategy was effective</li><li>The importance of elevating Compliance teams to Business Advisors in the eyes of stakeholders</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Subscribe to the <a href='https://discernibleinc.com/newsletter-signup'>Discernible newsletter</a></li><li>Discover <a href='https://github.com/readme'>Github&apos;s ReadMe Newsletter</a></li><li>Learn about the <a href='https://www.bellisario.psu.edu/page-center/about/arthur-w-page/the-page-principles'>Arthur W. Page Principles</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Melanie on <a href='https://www.linkedin.com/in/melanieensign'>LinkedIn</a></li><li>Follow Melanie (Wednesday@defcon.social) on <a href='https://defcon.social/@Wednesday'>Mastodon</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12697654-s2e16-words-with-impact-communication-tips-for-privacy-technologists-with-melanie-ensign-discernible.mp3" length="43023533" type="audio/mpeg" />
    <itunes:image href="https://storage.buzzsprout.com/641ygcrjcx6k826lft1chqyg3tbx?.jpg" />
    <itunes:author>Debra J Farber / Melanie Ensign</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12697654</guid>
    <pubDate>Tue, 02 May 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12697654/transcript" type="text/html" />
    <podcast:soundbite startTime="1257.439" duration="53.5" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12697654/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E16: Words with Impact; Communication Tips for Privacy Technologists with Melanie Ensign (Discernible)" />
  <psc:chapter start="1:15" title="Introducing Melanie Ensign, CEO of Discernible" />
  <psc:chapter start="2:39" title="Melanie describes her journey from studying shark science and marine biology to focusing on corporate communications, and then specializing in privacy and security communications" />
  <psc:chapter start="7:41" title="Melanie explains how she leveraged her experience working in shark science to help executives get over their initial fear of the unknown in the realm of security" />
  <psc:chapter start="11:31" title="How Melanie guides and supports technical teams like privacy &amp; security engineers for effective communications" />
  <psc:chapter start="14:05" title="How to prevent &#39;Privacy Outrage&#39;" />
  <psc:chapter start="16:36" title="The importance of preventing privacy snafus rather than over-focusing on crisis comms" />
  <psc:chapter start="19:49" title="Authenticity - how companies can use technical communication strategies &amp; tactics to earn trust with the public" />
  <psc:chapter start="23:29" title="The problem with incentives - why most social media metrics have been bullshit &amp; remained that way for so long" />
  <psc:chapter start="26:38" title="Why Melanie decided to leave big tech to start Discernible" />
  <psc:chapter start="31:15" title="Melanie outlines the 7 Arthur W. Page Society Principles, a sort of &#39;code of ethics&#39; for communications professionals" />
  <psc:chapter start="35:07" title="The do&#39;s &amp; don&#39;ts when communicating for public relations purposes" />
  <psc:chapter start="39:30" title="Why press releases are mostly ineffective except for announcing funding raises" />
  <psc:chapter start="42:45" title="The importance of educating the community you&#39;re building for" />
  <psc:chapter start="46:28" title="Melanie&#39;s advice to Elon Musk, who devalues the importance of a comms team" />
  <psc:chapter start="47:40" title="Debra &amp; Melanie discuss ChatGPT, what OpenAI could have done differently, and whether their go-to-market strategy was effective" />
  <psc:chapter start="51:39" title="The importance of elevating Compliance teams to Business Advisors in the eyes of stakeholders in order to demonstrate business value" />
  <psc:chapter start="55:21" title="Melanie shares helpful resources for effective communications" />
</psc:chapters>
    <itunes:duration>3581</itunes:duration>
    <itunes:keywords>communications, privacy, trust, authenticity, Discernible Inc, crisis comms, PR, public relations</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>16</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E15: &#39;Watching the Watchers: Transparency &amp; Control Research&#39; with Umar Iqbal, PhD (University of Washington)</itunes:title>
    <title>S2E15: &#39;Watching the Watchers: Transparency &amp; Control Research&#39; with Umar Iqbal, PhD (University of Washington)</title>
    <itunes:summary><![CDATA[This week's guest is Umar Iqbal, PhD, a Postdoctoral Scholar at the Paul G. Allen School of Computer Science &amp; Engineering at the University of Washington, working in the Security and Privacy Research Lab. Umar focuses his research on two themes: 1) bringing transparency into data collection and usage practices, and 2) enabling individuals to have control over their own data by identifying &amp; restricting privacy-invasive data collection &amp; usage practices of online services  His lon...]]></itunes:summary>
    <description><![CDATA[<p>This week&apos;s guest is <a href='https://www.linkedin.com/in/umar-iqbal-380a6752/'>Umar Iqbal, PhD</a>, a Postdoctoral Scholar at the <a href='https://cs.washington.edu/'>Paul G. Allen School of Computer Science &amp; Engineering</a> at the University of Washington, working in the <a href='https://seclab.cs.washington.edu/'>Security and Privacy Research Lab</a>. Umar focuses his research on two themes: 1) bringing <em>transparency</em> into data collection and usage practices, and 2) enabling individuals to have <em>control over their own data</em> by identifying &amp; restricting privacy-invasive data collection &amp; usage practices of online services.</p><p><br/>His long-term research vision is to create an environment where users can reap the benefits of technology without losing their privacy by enabling preemptive privacy protections and establishing &apos;checks &amp; balances&apos; on the Internet. In this conversation, we discuss his previous and current research, with the goal of empowering people to protect their privacy on the Internet. 
</p><p><br/><b>Topics Covered:</b></p><ul><li>Why Umar focused his research on transparency</li><li>Umar&apos;s research relating to transparency, data collection &amp; use, with a focus on Amazon&apos;s smart speaker &amp; metadata privacy and potential EU regulatory enforcement</li><li>His transparency-related work on browsers &amp; APIs, and the growing problem of using fingerprinting techniques to track people without consent</li><li>How Umar plans to bring control to individuals by restricting privacy-invasive online data collection</li><li>How he used an ML technique to detect browser fingerprinting scripts based on their functionality</li><li>Umar&apos;s research to determine the prevalence of online tracking &amp; measure how effective currently-available tracker detection tools are</li><li>His research on early detection of emerging privacy threats (e.g., &apos;browser fingerprinting&apos; &amp; &apos;navigational tracking&apos;) and his investigation of privacy issues related to IoT (e.g., smart speakers &amp; health &amp; fitness bands that analyze people&apos;s voices)</li><li>How we can ensure strong privacy guarantees and make a more accountable Internet</li><li>Why regulations need technological support to be effective for enforcement</li><li>Umar&apos;s advice to developers / hackers looking for &apos;privacy bugs&apos; via dynamic code analysis and a discussion of the future of &apos;privacy bug bounties&apos;</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Read Umar&apos;s papers: <a href='https://scholar.google.com/citations?user=K1GBN0MAAAAJ'>Google Scholar Citations</a></li></ul><p><b>Guest Info:</b></p><ul><li>Learn about Umar on <a href='https://umariqbal.com/'>his website</a></li><li>Connect with Umar on <a href='https://www.linkedin.com/in/umar-iqbal-380a6752/'>LinkedIn</a></li><li>Follow Umar on <a href='https://twitter.com/umaarr6'>Twitter</a></li></ul><p><a target="_blank" 
href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week&apos;s guest is <a href='https://www.linkedin.com/in/umar-iqbal-380a6752/'>Umar Iqbal, PhD</a>, a Postdoctoral Scholar at the <a href='https://cs.washington.edu/'>Paul G. Allen School of Computer Science &amp; Engineering</a> at the University of Washington, working in the <a href='https://seclab.cs.washington.edu/'>Security and Privacy Research Lab</a>. Umar focuses his research on two themes: 1) bringing <em>transparency</em> into data collection and usage practices, and 2) enabling individuals to have <em>control over their own data</em> by identifying &amp; restricting privacy-invasive data collection &amp; usage practices of online services.</p><p><br/>His long-term research vision is to create an environment where users can reap the benefits of technology without losing their privacy by enabling preemptive privacy protections and establishing &apos;checks &amp; balances&apos; on the Internet. In this conversation, we discuss his previous and current research, with the goal of empowering people to protect their privacy on the Internet. 
</p><p><br/><b>Topics Covered:</b></p><ul><li>Why Umar focused his research on transparency</li><li>Umar&apos;s research relating to transparency, data collection &amp; use, with a focus on Amazon&apos;s smart speaker &amp; metadata privacy and potential EU regulatory enforcement</li><li>His transparency-related work on browsers &amp; APIs, and the growing problem of using fingerprinting techniques to track people without consent</li><li>How Umar plans to bring control to individuals by restricting privacy-invasive online data collection</li><li>How he used an ML technique to detect browser fingerprinting scripts based on their functionality</li><li>Umar&apos;s research to determine the prevalence of online tracking &amp; measure how effective currently-available tracker detection tools are</li><li>His research on early detection of emerging privacy threats (e.g., &apos;browser fingerprinting&apos; &amp; &apos;navigational tracking&apos;) and his investigation of privacy issues related to IoT (e.g., smart speakers &amp; health &amp; fitness bands that analyze people&apos;s voices)</li><li>How we can ensure strong privacy guarantees and make a more accountable Internet</li><li>Why regulations need technological support to be effective for enforcement</li><li>Umar&apos;s advice to developers / hackers looking for &apos;privacy bugs&apos; via dynamic code analysis and a discussion of the future of &apos;privacy bug bounties&apos;</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Read Umar&apos;s papers: <a href='https://scholar.google.com/citations?user=K1GBN0MAAAAJ'>Google Scholar Citations</a></li></ul><p><b>Guest Info:</b></p><ul><li>Learn about Umar on <a href='https://umariqbal.com/'>his website</a></li><li>Connect with Umar on <a href='https://www.linkedin.com/in/umar-iqbal-380a6752/'>LinkedIn</a></li><li>Follow Umar on <a href='https://twitter.com/umaarr6'>Twitter</a></li></ul><p><a target="_blank" 
href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12401515-s2e15-watching-the-watchers-transparency-control-research-with-umar-iqbal-phd-university-of-washington.mp3" length="28512703" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/446086</link>
    <itunes:image href="https://storage.buzzsprout.com/oagfval9ypb1ajr5g1j4vobd3c87?.jpg" />
    <itunes:author>Debra J Farber / Umar Iqbal</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12401515</guid>
    <pubDate>Tue, 18 Apr 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12401515/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12401515/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="Intro" />
  <psc:chapter start="1:15" title="Introducing Umar Iqbal" />
  <psc:chapter start="2:21" title="Umar explains why he focuses his research on bringing transparency into data collection and usage practices" />
  <psc:chapter start="3:48" title="How Umar addresses this transparency challenge with his work / research" />
  <psc:chapter start="6:11" title="Umar details his research relating to transparency, data collection &amp; use, with a focus on Amazon&#39;s smart speaker &amp; metadata, &amp; potential regulatory enforcement in the EU" />
  <psc:chapter start="11:41" title="Umar describes his transparency-related work related to browsers &amp; API&#39;s and the growing problem of using fingerprinting techniques to track people without consent" />
  <psc:chapter start="13:57" title="Umar explains his approach to bring control to individuals by restricting privacy invasive data collection on the web" />
  <psc:chapter start="17:07" title="Umar walks us through his ML technique, which detects browser fingerprinting scripts based on their functionality" />
  <psc:chapter start="18:43" title="Umar&#39;s research to determine the prevalence of online tracking &amp; measure the effectiveness of existing tracker detection tools like ad blockers" />
  <psc:chapter start="19:37" title="Umar&#39;s research on early detection of emerging privacy threats: &#39;browser fingerprinting&#39; &amp; &#39;navigational tracking&#39;" />
  <psc:chapter start="20:17" title="Umar&#39;s investigation of privacy issues related to IoT: smart speakers and health &amp; fitness bands that analyze user voice" />
  <psc:chapter start="21:04" title="Umar&#39;s long-term research vision to enable an environment where users can reap the benefits of technology without losing their privacy by 1) enabling preemptive privacy protections; and 2) establishing &#39;checks &amp; balances&#39; on the Internet" />
  <psc:chapter start="23:44" title="Umar describes a big developer &amp; researcher challenge: the information they need to understand and mitigate the threats is not readily available" />
  <psc:chapter start="24:48" title="Umar addresses how we can ensure strong privacy guarantees, especially around his research to make for a more accountable Internet" />
  <psc:chapter start="28:21" title="Why regulations need technological support to be effective, and Umar&#39;s focus on building tools for regulators to detect infringements" />
  <psc:chapter start="30:20" title="Umar describes his approach to building tools for individuals to exercise their data protection rights" />
  <psc:chapter start="32:31" title="Umar&#39;s advice to developers / hackers looking for privacy issues via dynamic code analysis; the future of &#39;privacy bug bounties&#39;" />
  <psc:chapter start="36:19" title="Discussion of the recent academic-policy workshop: &quot;Beyond the FTC: The Future of Privacy Enforcement&quot;" />
  <psc:chapter start="38:54" title="Outro" />
</psc:chapters>
    <itunes:duration>2371</itunes:duration>
    <itunes:keywords>transparency, control, ethical hacking, online tracking, privacy enhancing technologies, PETs, privacy guarantees, privacy assurance</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>15</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E14: Addressing Privacy with Static Analysis Techniques Like ‘Taint-Tracking’ &amp; ‘Data Flow Analysis’ with Suchakra Sharma (Privado.ai)</itunes:title>
    <title>S2E14: Addressing Privacy with Static Analysis Techniques Like ‘Taint-Tracking’ &amp; ‘Data Flow Analysis’ with Suchakra Sharma (Privado.ai)</title>
    <itunes:summary><![CDATA[This week, we welcome Suchakra Sharma, Chief Scientist at Privado.ai, where he builds code analysis tools for data privacy &amp; security. Previously, he earned his PhD in Computer Engineering from Polytechnique Montreal, where he worked on eBPF Technology and hardware-assisted tracing techniques for OS Analysis. In this conversation, we delve into Suchakra’s background in shifting left for security and how he applies traditional, tested static analysis techniques — such as 'taint tracking' a...]]></itunes:summary>
    <description><![CDATA[<p>This week, we welcome <a href='https://www.linkedin.com/in/suchakrasharma/'>Suchakra Sharma</a>, Chief Scientist at <a href='https://www.privado.ai/'>Privado.ai</a>, where he builds code analysis tools for data privacy &amp; security. Previously, he earned his PhD in Computer Engineering from Polytechnique Montreal, where he worked on eBPF Technology and hardware-assisted tracing techniques for OS Analysis. In this conversation, we delve into Suchakra’s background in shifting left for security and how he applies traditional, tested static analysis techniques — such as &apos;taint tracking&apos; and &apos;data flow analysis&apos; — for use on large code bases at scale to help fix privacy leaks right at the source.<br/><br/>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform.<br/></b>---------</p><p>Suchakra aligns himself with the philosophical aspects of privacy and wishes to work on anything that helps in limiting the erosion of privacy in modern society, since privacy is fundamental to all of us. These kinds of needs have always been here, and as societies have advanced, this is a time when we require more guarantees of privacy. After all, it is humans that are behind systems and it is humans that are going to be affected by the machines that we build. 
Check out this fascinating discussion on how to shift privacy left in your organization.<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Why Suchakra was interested in privacy after focusing on static code analysis for security</li><li>What &apos;shift left&apos; means and lessons learned from the &apos;shift security left&apos; movement that can be applied to &apos;shift privacy left&apos; efforts</li><li>Sociological perspectives on how humans developed a need for keeping things &apos;private&apos; from others</li><li>How to provide engineering-focused guarantees around privacy today &amp; what the role of engineers should be within this &apos;shift privacy left&apos; paradigm</li><li>Suchakra&apos;s USENIX Enigma talk &amp; discussion of &apos;taint tracking&apos; &amp; &apos;data flow analysis&apos; techniques</li><li>Which companies should build in-house tooling for static analysis, and which should outsource to experienced vendors like Privado</li><li>How to address &apos;privacy bugs&apos; in code; why it&apos;s important to have an &apos;auditor&apos;s mindset&apos;; and why we&apos;ll see &apos;Privacy Bug Bounty Programs&apos; soon</li><li>Suchakra&apos;s advice to engineering managers to move the needle on privacy in their orgs</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Join Privado&apos;s <a href='https://privado-community.slack.com/join/shared_invite/zt-yk5zcxh3-gj8sS9w6SvL5lNYZLMbIpw#/shared-invite/email'>Slack Community</a></li><li>Review Privado&apos;s <a href='https://www.privado.ai/open-source'>Open Source Code Scanning Tools</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Suchakra on <a href='https://www.linkedin.com/in/suchakrasharma/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, we welcome <a href='https://www.linkedin.com/in/suchakrasharma/'>Suchakra Sharma</a>, Chief Scientist at <a href='https://www.privado.ai/'>Privado.ai</a>, where he builds code analysis tools for data privacy &amp; security. Previously, he earned his PhD in Computer Engineering from Polytechnique Montreal, where he worked on eBPF Technology and hardware-assisted tracing techniques for OS Analysis. In this conversation, we delve into Suchakra’s background in shifting left for security and how he applies traditional, tested static analysis techniques — such as &apos;taint tracking&apos; and &apos;data flow analysis&apos; — for use on large code bases at scale to help fix privacy leaks right at the source.<br/><br/>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform.<br/></b>---------</p><p>Suchakra aligns himself with the philosophical aspects of privacy and wishes to work on anything that helps in limiting the erosion of privacy in modern society, since privacy is fundamental to all of us. These kinds of needs have always been here, and as societies have advanced, this is a time when we require more guarantees of privacy. After all, it is humans that are behind systems and it is humans that are going to be affected by the machines that we build. 
Check out this fascinating discussion on how to shift privacy left in your organization.<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>Why Suchakra was interested in privacy after focusing on static code analysis for security</li><li>What &apos;shift left&apos; means and lessons learned from the &apos;shift security left&apos; movement that can be applied to &apos;shift privacy left&apos; efforts</li><li>Sociological perspectives on how humans developed a need for keeping things &apos;private&apos; from others</li><li>How to provide engineering-focused guarantees around privacy today &amp; what the role of engineers should be within this &apos;shift privacy left&apos; paradigm</li><li>Suchakra&apos;s USENIX Enigma talk &amp; discussion of &apos;taint tracking&apos; &amp; &apos;data flow analysis&apos; techniques</li><li>Which companies should build in-house tooling for static analysis, and which should outsource to experienced vendors like Privado</li><li>How to address &apos;privacy bugs&apos; in code; why it&apos;s important to have an &apos;auditor&apos;s mindset&apos;; and why we&apos;ll see &apos;Privacy Bug Bounty Programs&apos; soon</li><li>Suchakra&apos;s advice to engineering managers to move the needle on privacy in their orgs</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Join Privado&apos;s <a href='https://privado-community.slack.com/join/shared_invite/zt-yk5zcxh3-gj8sS9w6SvL5lNYZLMbIpw#/shared-invite/email'>Slack Community</a></li><li>Review Privado&apos;s <a href='https://www.privado.ai/open-source'>Open Source Code Scanning Tools</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Suchakra on <a href='https://www.linkedin.com/in/suchakrasharma/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12308815-s2e14-addressing-privacy-with-static-analysis-techniques-like-taint-tracking-data-flow-analysis-with-suchakra-sharma-privado-ai.mp3" length="25032756" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/436088</link>
    <itunes:image href="https://storage.buzzsprout.com/4iffm98hywtg1aokmffhtvedqkm8?.jpg" />
    <itunes:author>Debra J. Farber / Suchakra Sharma</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12308815</guid>
    <pubDate>Tue, 11 Apr 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12308815/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12308815/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E14: Addressing Privacy with Static Analysis Techniques Like ‘Taint-Tracking’ &amp; ‘Data Flow Analysis’ with Suchakra Sharma (Privado.ai)" />
  <psc:chapter start="1:25" title="Introducing Suchakra Sharma" />
  <psc:chapter start="2:51" title="How Suchakra got interested in privacy after focusing on static code analysis for security" />
  <psc:chapter start="5:43" title="What &#39;shift left&#39; means to Suchakra, and lessons learned from the &#39;shift security left&#39; movement that can be applied to &#39;shift privacy left&#39; efforts" />
  <psc:chapter start="9:17" title="Suchakra shares some sociological stories of how humans developed a need to keep certain things &#39;private&#39;" />
  <psc:chapter start="12:56" title="How we can provide guarantees around privacy today in an engineering-focused way, and what the role of engineers should be in this &#39;shift privacy left&#39; paradigm" />
  <psc:chapter start="14:58" title="Debra &amp; Suchakra discuss his USENIX Enigma talk: &#39;Building an Automated Machine for Discovering Privacy Violations at Scale;&#39; and Suchakra describes techniques like &#39;taint tracking&#39; and &#39;data flow analysis&#39; for static code analysis" />
  <psc:chapter start="19:02" title="Suchakra describes what it takes to build static code analysis tooling and gives examples of large companies that have built their own (e.g., Facebook, Microsoft, GitLab, GitHub)" />
  <psc:chapter start="23:09" title="Suchakra addresses how developers &amp; privacy engineers can find &amp; fix &#39;privacy bugs&#39; in code; why it&#39;s important to have an &#39;auditor&#39;s mindset&#39;; and why he believes we&#39;ll see &#39;Privacy Bug Bounty Programs&#39; in our future" />
  <psc:chapter start="28:35" title="Suchakra&#39;s advice to engineering managers to move the needle on privacy in their organizations" />
  <psc:chapter start="32:13" title="Suchakra recommends relevant conferences and events to stay plugged into this space" />
</psc:chapters>
    <itunes:duration>2082</itunes:duration>
    <itunes:keywords>taint tracking, static code analysis, data flow analysis, privacy bug bounty, privacy vulnerabilities, privacy attacks</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>14</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E13: Diving Deep into Fully Homomorphic Encryption (FHE) with Kurt R. Rohloff (Duality Technologies)</itunes:title>
    <title>S2E13: Diving Deep into Fully Homomorphic Encryption (FHE) with Kurt R. Rohloff (Duality Technologies)</title>
    <itunes:summary><![CDATA[I am delighted to welcome this week’s guest, Kurt Rohloff. Kurt is the CTO and Co-Founder of Duality Technologies, a privacy tech company that enables organizations to leverage data across their ecosystem and generate joint insights for better business while preserving privacy. Kurt was also Co-Founder of the OpenFHE Homomorphic Encryption Software Library that enables practical and usable privacy and collaborative data analytics.   He's successfully led teams that are developing, transitioni...]]></itunes:summary>
    <description><![CDATA[<p>I am delighted to welcome this week’s guest, Kurt Rohloff. Kurt is the CTO and Co-Founder of <a href='https://dualitytech.com/'>Duality Technologies</a>, a privacy tech company that enables organizations to leverage data across their ecosystem and generate joint insights for better business while preserving privacy. Kurt was also Co-Founder of the <a href='https://www.openfhe.org/'>OpenFHE Homomorphic Encryption Software Library</a> that enables practical and usable privacy and collaborative data analytics.<br/><br/></p><p>He&apos;s successfully led teams that are developing, transitioning, and applying first-in-the-world technology capabilities for both the <a href='https://www.defense.gov/'>Department of Defense</a> as well as for commercial use. Kurt specializes in generating, developing, and commercializing innovative secure computing technologies with a focus on privacy and AI/ML at scale. In this episode, we discuss use cases for leveraging Fully Homomorphic Encryption (FHE) and other PETs.<br/><br/></p><p>In a previous episode, we spoke about federated learning; and in this episode, we learn how to achieve secure federated learning using fully homomorphic encryption (FHE) techniques.</p><p><br/>Kurt has been focused on and supported homomorphic encryption since it was first discovered, including his involvement in one of the seminal projects, funded by <a href='https://www.darpa.mil/'>DARPA</a>, where he ran an implementation team, called <a href='https://www.darpa.mil/program/programming-computation-on-encrypted-data'>PROCEED</a>.</p><p><br/>FHE, as opposed to other kinds of privacy technologies, is more general and malleable. 
As each organization has different needs when it comes to data collaboration, Duality Technologies offers three separate models for collaboration, which enable organizations to secure sensitive data while still allowing different types of sharing.</p><p><br/><b>Topics Covered</b>:</p><ul><li>How companies can gain utility from a dataset while protecting the privacy of individuals or entities</li><li>How FHE helps with fraud prevention, secure investigations, real-world evidence &amp; genome-wide association studies</li><li>Use cases for the three collaboration models Duality offers: Single Data Set, Horizontal Data Analysis, and Vertical Data Analysis</li><li>Comparison &amp; trade-offs between federated learning and homomorphic encryption</li><li>Proliferation of FHE Standards</li><li>OpenFHE.org, the leading open source library for implementations of fully homomorphic encryption protocols</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Review the <a href='https://www.openfhe.org/'>OpenFHE encryption software library</a></li><li>Learn about <a href='https://dualitytech.com/'>Duality</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Kurt on <a href='https://www.linkedin.com/in/kurt-rohloff/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>I am delighted to welcome this week’s guest, Kurt Rohloff. Kurt is the CTO and Co-Founder of <a href='https://dualitytech.com/'>Duality Technologies</a>, a privacy tech company that enables organizations to leverage data across their ecosystem and generate joint insights for better business while preserving privacy. Kurt was also Co-Founder of the <a href='https://www.openfhe.org/'>OpenFHE Homomorphic Encryption Software Library</a> that enables practical and usable privacy and collaborative data analytics.<br/><br/></p><p>He&apos;s successfully led teams that are developing, transitioning, and applying first-in-the-world technology capabilities for both the <a href='https://www.defense.gov/'>Department of Defense</a> as well as for commercial use. Kurt specializes in generating, developing, and commercializing innovative secure computing technologies with a focus on privacy and AI/ML at scale. In this episode, we discuss use cases for leveraging Fully Homomorphic Encryption (FHE) and other PETs.<br/><br/></p><p>In a previous episode, we spoke about federated learning; and in this episode, we learn how to achieve secure federated learning using fully homomorphic encryption (FHE) techniques.</p><p><br/>Kurt has been focused on and supported homomorphic encryption since it was first discovered, including his involvement in one of the seminal projects, funded by <a href='https://www.darpa.mil/'>DARPA</a>, where he ran an implementation team, called <a href='https://www.darpa.mil/program/programming-computation-on-encrypted-data'>PROCEED</a>.</p><p><br/>FHE, as opposed to other kinds of privacy technologies, is more general and malleable. 
As each organization has different needs when it comes to data collaboration, Duality Technologies offers three separate models for collaboration, which enable organizations to secure sensitive data while still allowing different types of sharing.</p><p><br/><b>Topics Covered</b>:</p><ul><li>How companies can gain utility from a dataset while protecting the privacy of individuals or entities</li><li>How FHE helps with fraud prevention, secure investigations, real-world evidence &amp; genome-wide association studies</li><li>Use cases for the three collaboration models Duality offers: Single Data Set, Horizontal Data Analysis, and Vertical Data Analysis</li><li>Comparison &amp; trade-offs between federated learning and homomorphic encryption</li><li>Proliferation of FHE Standards</li><li>OpenFHE.org, the leading open source library for implementations of fully homomorphic encryption protocols</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Review the <a href='https://www.openfhe.org/'>OpenFHE encryption software library</a></li><li>Learn about <a href='https://dualitytech.com/'>Duality</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Kurt on <a href='https://www.linkedin.com/in/kurt-rohloff/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12308804-s2e13-diving-deep-into-fully-homomorphic-encryption-fhe-with-kurt-r-rohloff-duality-technologies.mp3" length="35132733" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/429036</link>
    <itunes:image href="https://storage.buzzsprout.com/qotarok94nc8igr8s759stugnjox?.jpg" />
    <itunes:author>Debra J. Farber / Kurt R. Rohloff</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12308804</guid>
    <pubDate>Tue, 04 Apr 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12308804/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12308804/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E13: Diving Deep into Fully Homomorphic Encryption (FHE) with Kurt R. Rohloff (Duality Technologies)" />
  <psc:chapter start="1:15" title="Introducing Kurt R. Rohloff, CTO &amp; Co-Founder of Duality Technologies" />
  <psc:chapter start="2:28" title="How and why companies have been struggling to find ways to conduct analytics while preserving privacy" />
  <psc:chapter start="4:27" title="Introducing &#39;homomorphic encryption&#39;" />
  <psc:chapter start="6:07" title="Duality Collaboration Model 1: Analytics of a single data set" />
  <psc:chapter start="7:55" title="Duality Collaboration Model 2: Analytics of a union of datasets (Horizontal Data Analysis)" />
  <psc:chapter start="11:54" title="Duality Collaboration Model 3: Analytics of joined datasets (Vertical Data Analysis)" />
  <psc:chapter start="14:07" title="Comparing &#39;federated learning&#39; to &#39;federated analysis&#39; using FHE" />
  <psc:chapter start="16:25" title="Tradeoffs involved with FHE" />
  <psc:chapter start="18:58" title="Kurt describes his work with DARPA&#39;s PROCEED program using FHE a decade ago" />
  <psc:chapter start="24:57" title="FHE Use Case 1: financial services - fraud prevention" />
  <psc:chapter start="28:35" title="FHE Use Case 2: government - secure investigations" />
  <psc:chapter start="31:39" title="FHE Use Cases 3 &amp; 4: real-world evidence and genome-wide association studies" />
  <psc:chapter start="34:34" title="Proliferation of FHE Standards" />
  <psc:chapter start="41:37" title="Kurt discusses OpenFHE.org, the leading open source library for implementations of fully homomorphic encryption protocols" />
  <psc:chapter start="44:25" title="Resources and conferences that Kurt recommends to privacy technologists and data scientists who want to learn more about this space" />
</psc:chapters>
    <itunes:duration>2923</itunes:duration>
    <itunes:keywords>fully homomorphic encryption, PETs, FHE, privacy enhancing technologies, duality technologies</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>13</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E12: &#39;Building Powerful ML Models with Privacy &amp; Ethics&#39; with Katharine Jarmul (ThoughtWorks)</itunes:title>
    <title>S2E12: &#39;Building Powerful ML Models with Privacy &amp; Ethics&#39; with Katharine Jarmul (ThoughtWorks)</title>
    <itunes:summary><![CDATA[This week, I'm joined by Katharine Jarmul, Principal Data Scientist at Thoughtworks &amp; author of the forthcoming book, "Practical Data Privacy: Enhancing Privacy and Security in Data." Katharine began asking questions similar to those of today's ethical machine learning community as a university student working on her undergrad thesis during the war in Iraq. She focused that research on natural language processing and investigated the statistical differences between embedded &amp; non-...]]></itunes:summary>
    <description><![CDATA[<p>This week, I&apos;m joined by <a href='https://kjamistan.com/'>Katharine Jarmul</a>, Principal Data Scientist at Thoughtworks &amp; author of the forthcoming book, &quot;Practical Data Privacy: Enhancing Privacy and Security in Data.&quot; Katharine began asking questions similar to those of today&apos;s ethical machine learning community as a university student working on her undergrad thesis during the war in Iraq. She focused that research on natural language processing and investigated the statistical differences between embedded &amp; non-embedded reporters. In our conversation, we discuss ethical &amp; secure machine learning approaches, threat modeling against adversarial attacks, the importance of distributed data setups, and what Katharine wants data scientists to know about privacy and ethical ML.<br/><br/></p><p>Katharine believes that we should never fall victim to a &apos;techno-solutionist&apos; mindset where we believe that we can solve a deep societal problem simply with tech alone. However, by solving issues around privacy &amp; consent with data collection, we can more easily address the challenges with ethical ML. In fact, ML research is finally beginning to broaden and include the intersections of law, privacy, and ethics. Katharine anticipates that data scientists will embrace PETs that facilitate data sharing in a privacy-preserving way; and she evangelizes the un-normalization of sending ML data from one company to another. 
</p><p><br/><b>Topics Covered:</b></p><ul><li>Katharine&apos;s motivation for writing a book on privacy for a data scientist audience and what she hopes readers will learn from it</li><li>What areas must be addressed for ML to be considered ethical</li><li>Overlapping AI/ML &amp; Privacy goals</li><li>Challenges with sharing data for analytics</li><li>The need for data scientists to embrace PETs</li><li>How PETs will likely mature across orgs over the next 2 years</li><li>Katharine&apos;s &amp; Debra&apos;s favorite PETs</li><li>The importance of threat modeling ML models: discussing &apos;adversarial attacks&apos; like &apos;model inversion&apos; &amp; &apos;membership inference&apos; attacks</li><li>Why companies that train LLMs must be accountable for the safety of their models</li><li>New ethical approaches to data sharing</li><li>Why scraping data off the Internet to train models is the harder, lazier, unethical way to train ML models</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Pre-order the forthcoming book: &quot;<a href='https://www.oreilly.com/library/view/practical-data-privacy/9781098129453/'>Practical Data Privacy</a>&quot;</li><li>Subscribe to Katharine’s newsletter: <a href='https://probablyprivate.com/'>Probably Private</a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Follow Katharine on <a href='https://www.linkedin.com/in/katharinejarmul/'>LinkedIn</a></li><li>Follow Katharine on <a href='https://twitter.com/kjam'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I&apos;m joined by <a href='https://kjamistan.com/'>Katharine Jarmul</a>, Principal Data Scientist at Thoughtworks &amp; author of the forthcoming book, &quot;Practical Data Privacy: Enhancing Privacy and Security in Data.&quot; Katharine began asking questions similar to those of today&apos;s ethical machine learning community as a university student working on her undergrad thesis during the war in Iraq. She focused that research on natural language processing and investigated the statistical differences between embedded &amp; non-embedded reporters. In our conversation, we discuss ethical &amp; secure machine learning approaches, threat modeling against adversarial attacks, the importance of distributed data setups, and what Katharine wants data scientists to know about privacy and ethical ML.<br/><br/></p><p>Katharine believes that we should never fall victim to a &apos;techno-solutionist&apos; mindset where we believe that we can solve a deep societal problem simply with tech alone. However, by solving issues around privacy &amp; consent with data collection, we can more easily address the challenges with ethical ML. In fact, ML research is finally beginning to broaden and include the intersections of law, privacy, and ethics. Katharine anticipates that data scientists will embrace PETs that facilitate data sharing in a privacy-preserving way; and she evangelizes the un-normalization of sending ML data from one company to another. 
</p><p><br/><b>Topics Covered:</b></p><ul><li>Katharine&apos;s motivation for writing a book on privacy for a data scientist audience and what she hopes readers will learn from it</li><li>What areas must be addressed for ML to be considered ethical</li><li>Overlapping AI/ML &amp; Privacy goals</li><li>Challenges with sharing data for analytics</li><li>The need for data scientists to embrace PETs</li><li>How PETs will likely mature across orgs over the next 2 years</li><li>Katharine&apos;s &amp; Debra&apos;s favorite PETs</li><li>The importance of threat modeling ML models: discussing &apos;adversarial attacks&apos; like &apos;model inversion&apos; &amp; &apos;membership inference&apos; attacks</li><li>Why companies that train LLMs must be accountable for the safety of their models</li><li>New ethical approaches to data sharing</li><li>Why scraping data off the Internet to train models is the harder, lazier, unethical way to train ML models</li></ul><p><br/><b>Resources Mentioned:</b></p><ul><li>Pre-order the forthcoming book: &quot;<a href='https://www.oreilly.com/library/view/practical-data-privacy/9781098129453/'>Practical Data Privacy</a>&quot;</li><li>Subscribe to Katharine’s newsletter: <a href='https://probablyprivate.com/'>Probably Private</a></li></ul><p><br/><b>Guest Info:</b></p><ul><li>Follow Katharine on <a href='https://www.linkedin.com/in/katharinejarmul/'>LinkedIn</a></li><li>Follow Katharine on <a href='https://twitter.com/kjam'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12308797-s2e12-building-powerful-ml-models-with-privacy-ethics-with-katharine-jarmul-thoughtworks.mp3" length="39986899" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/423609</link>
    <itunes:image href="https://storage.buzzsprout.com/abll7csf2chrr4babinuctfoer8t?.jpg" />
    <itunes:author>Debra J Farber / Katharine Jarmul</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12308797</guid>
    <pubDate>Tue, 28 Mar 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12308797/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12308797/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E12: &#39;Building Powerful ML Models with Privacy &amp; Ethics&#39; with Katharine Jarmul (ThoughtWorks)" />
  <psc:chapter start="1:15" title="Introducing Katharine Jarmul, Principal Data Scientist at ThoughtWorks, where she shares her data science &amp; privacy origin story" />
  <psc:chapter start="5:01" title="Katharine describes why she was inspired to write the book, Practical Data Privacy" />
  <psc:chapter start="11:30" title="Katharine explains why she wrote a book on privacy for a data scientist audience and what she hopes readers will learn from it" />
  <psc:chapter start="14:44" title="Katharine &amp; Debra discuss what areas must be addressed for machine learning to be considered ethical" />
  <psc:chapter start="20:18" title="Katharine &amp; Debra discuss the overlapping AI/ML &amp; Privacy goals of: fairness, accountability, &amp; transparency as well as overlapping research" />
  <psc:chapter start="23:11" title="Debra &amp; Katharine discuss challenges with sharing data for analytics; the need for data scientists to embrace PETs; and why we should stop normalizing the idea of sending data to another company...ever" />
  <psc:chapter start="26:26" title="Debra shares her belief that there soon will be an &#39;explosion of PET&#39; uses and cites a new report from The United Nations" />
  <psc:chapter start="27:38" title="Katharine shares her thoughts on how PETs are going to mature across orgs over the next 2 years" />
  <psc:chapter start="29:54" title="Debra &amp; Katharine share each of their favorite PETs - Debra discusses self-sovereign identity and Katharine talks up Secure Multi-Party Computation" />
  <psc:chapter start="33:48" title="Debra discusses her renewed sense of optimism about privacy and technology due to the research, implementation, &amp; standardization of PETs" />
  <psc:chapter start="36:15" title="Katharine discusses the need to threat model ML models, discussing &#39;adversarial attacks,&#39; including: &#39;model inversion&#39; attacks &amp; &#39;membership inference&#39; attacks" />
  <psc:chapter start="44:32" title="Debra &amp; Katharine discuss the issue of lack of consent when it comes to training LLMs" />
  <psc:chapter start="49:37" title="Katharine explains why companies that train LLMs must be accountable for the safety of their models, never putting that onus on users" />
  <psc:chapter start="51:19" title="Katharine explains why scraping data off the Internet to train models is actually the harder, lazier way to train ML models" />
  <psc:chapter start="53:52" title="Katharine plugs her ethical data science newsletter, &#39;Probably Private&#39;" />
</psc:chapters>
    <itunes:duration>3328</itunes:duration>
    <itunes:keywords>PETs, ethical machine learning, ThoughtWorks, Practical Data Privacy, Data Wrangling with Python,ChatGPT, fairness, accountability, transparency, ML, privacy enhancing technologies, distributed data, data scraping, consented models, ML attacks</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>12</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E11: Lessons Learned as a Privacy Engineering Manager with Menotti Minutillo (ex-Twitter &amp; Uber)</itunes:title>
    <title>S2E11: Lessons Learned as a Privacy Engineering Manager with Menotti Minutillo (ex-Twitter &amp; Uber)</title>
    <itunes:summary><![CDATA[This week, we gain insights into the profession of privacy engineering with guest Menotti Minutillo, a Sr. Privacy Engineering Manager with 15+ years of experience leading critical programs and product delivery at companies like Uber, Thrive Global &amp; Twitter. He started his career in 2007 on Wall Street as a DevOps &amp; Infrastructure Engineer; and now, Menotti is a sought-after technical privacy expert and Privacy Tech Advisor. In this conversation, we discuss privacy engineering approa...]]></itunes:summary>
    <description><![CDATA[<p>This week, we gain insights into the profession of privacy engineering with guest <a href='https://www.linkedin.com/in/menotti/'>Menotti Minutillo</a>, a Sr. Privacy Engineering Manager with 15+ years of experience leading critical programs and product delivery at companies like Uber, Thrive Global &amp; Twitter. He started his career in 2007 on Wall Street as a DevOps &amp; Infrastructure Engineer; and now, Menotti is a sought-after technical privacy expert and Privacy Tech Advisor. In this conversation, we discuss privacy engineering approaches that have worked, the skillsets required for privacy engineering, and the current climate for landing privacy engineering roles.</p><p>Menotti sees privacy engineering as the practice of building or improving info systems to advance a set of privacy goals. It&apos;s like a &apos;layer cake&apos; in that you have different protections and risk reductions based on threat modeling, as well as different specialization capabilities for larger orgs.<br/><br/>It makes a lot of sense that he&apos;s held weaving roles from company to company. His journey into privacy engineering was originally &apos;adjacent work&apos; and today, he shares lessons learned from taking a PET like differential privacy from the lab, to systematizing it into an organization, to deploying it in the real world. 
In this episode, we delve into tools, technical processes, technical standards, the maturing landscape for privacy engineers, and how the success of privacy is coupled with the success of each product shipped.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>How Menotti found his way to managing privacy engineering teams</li><li>Menotti&apos;s definition of &apos;privacy engineer&apos; &amp; the skillsets required</li><li>What it was like to work at Uber &amp; Twitter, which have multiple privacy engineering teams</li><li>Best practices for setting up teams &amp; deploying solutions</li><li>Privacy outcomes that privacy engineers should keep top of mind</li><li>Best practices for privacy architecture</li><li>Menotti&apos;s positive experience while at Uber working with Privacy Researchers from UC Berkeley to take differential privacy from the lab to a real-world deployment</li><li>Lessons learned from times of transition, including while at Twitter during Musk&apos;s takeover</li><li>Whether privacy was a &apos;zero interest rate bet,&apos; and what that means for privacy engineering roles given current economic realities</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Check out the <a href='https://www.usenix.org/conference/pepr23'>PEPR conference</a></li><li>Read &apos;<a href='https://www.linkedin.com/pulse/privacy-zero-interest-rate-bet-menotti-minutillo/?trackingId=DdUDD8IqTqGhZxoZnXE7Kg%3D%3D'>Was Privacy a Zero Interest Rate Bet?</a>&apos;</li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Menotti on <a href='https://www.linkedin.com/in/menotti/'>LinkedIn</a></li><li>Connect with Menotti on <a href='https://macaw.social/@44'>Mastodon</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, we gain insights into the profession of privacy engineering with guest <a href='https://www.linkedin.com/in/menotti/'>Menotti Minutillo</a>, a Sr. Privacy Engineering Manager with 15+ years of experience leading critical programs and product delivery at companies like Uber, Thrive Global &amp; Twitter. He started his career in 2007 on Wall Street as a DevOps &amp; Infrastructure Engineer; and now, Menotti is a sought-after technical privacy expert and Privacy Tech Advisor. In this conversation, we discuss privacy engineering approaches that have worked, the skillsets required for privacy engineering, and the current climate for landing privacy engineering roles.</p><p>Menotti sees privacy engineering as the practice of building or improving info systems to advance a set of privacy goals. It&apos;s like a &apos;layer cake&apos; in that you have different protections and risk reductions based on threat modeling, as well as different specialization capabilities for larger orgs.<br/><br/>It makes a lot of sense that he&apos;s held weaving roles from company to company. His journey into privacy engineering was originally &apos;adjacent work&apos; and today, he shares lessons learned from taking a PET like differential privacy from the lab, to systematizing it into an organization, to deploying it in the real world. 
In this episode, we delve into tools, technical processes, technical standards, the maturing landscape for privacy engineers, and how the success of privacy is coupled with the success of each product shipped.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>How Menotti found his way to managing privacy engineering teams</li><li>Menotti&apos;s definition of &apos;privacy engineer&apos; &amp; the skillsets required</li><li>What it was like to work at Uber &amp; Twitter, which have multiple privacy engineering teams</li><li>Best practices for setting up teams &amp; deploying solutions</li><li>Privacy outcomes that privacy engineers should keep top of mind</li><li>Best practices for privacy architecture</li><li>Menotti&apos;s positive experience while at Uber working with Privacy Researchers from UC Berkeley to take differential privacy from the lab to a real-world deployment</li><li>Lessons learned from times of transition, including while at Twitter during Musk&apos;s takeover</li><li>Whether privacy was a &apos;zero interest rate bet,&apos; and what that means for privacy engineering roles given current economic realities</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Check out the <a href='https://www.usenix.org/conference/pepr23'>PEPR conference</a></li><li>Read &apos;<a href='https://www.linkedin.com/pulse/privacy-zero-interest-rate-bet-menotti-minutillo/?trackingId=DdUDD8IqTqGhZxoZnXE7Kg%3D%3D'>Was Privacy a Zero Interest Rate Bet?</a>&apos;</li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Menotti on <a href='https://www.linkedin.com/in/menotti/'>LinkedIn</a></li><li>Connect with Menotti on <a href='https://macaw.social/@44'>Mastodon</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12363553-s2e11-lessons-learned-as-a-privacy-engineering-manager-with-menotti-minutillo-ex-twitter-uber.mp3" length="38484127" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/419910</link>
    <itunes:image href="https://storage.buzzsprout.com/80k4sjtpjjnf1m3cb2fv9hp9elb4?.jpg" />
    <itunes:author>Debra J Farber / Menotti Minutillo</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12363553</guid>
    <pubDate>Tue, 21 Mar 2023 11:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12363553/transcript" type="text/html" />
    <podcast:soundbite startTime="362.083" duration="59.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12363553/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E11: Lessons Learned as a Privacy Engineering Manager with Menotti Minutillo (ex-Twitter &amp; Uber)" />
  <psc:chapter start="1:15" title="Introducing Menotti Minutillo" />
  <psc:chapter start="2:22" title="Menotti shares his origin story and how he found his way to managing privacy engineering teams" />
  <psc:chapter start="6:07" title="Menotti gives his definition of a &#39;privacy engineer&#39; &amp; the skillsets required under this job type as the industry matures" />
  <psc:chapter start="12:48" title="Menotti shares what it&#39;s been like to work at Uber and Twitter, which have multiple privacy engineering teams; and best practices for setting up teams &amp; deploying solutions" />
  <psc:chapter start="14:49" title="Why Menotti prefers privacy engineering teams that are multi-disciplinary" />
  <psc:chapter start="17:32" title="The privacy outcomes that privacy engineers should keep top of mind" />
  <psc:chapter start="23:40" title="Menotti shares his views on best practices for privacy architecture" />
  <psc:chapter start="30:21" title="Menotti relays his experience at Uber working with Privacy Researchers (from UC Berkeley) to take differential privacy from the lab to a real-world deployment" />
  <psc:chapter start="36:10" title="Menotti describes lessons learned from working at a company during a transfer of ownership (like Elon Musk&#39;s takeover of Twitter)" />
  <psc:chapter start="38:38" title="Debra &amp; Menotti discuss his recent LinkedIn article, &quot;Was Privacy a Zero Interest Rate Bet?&quot;" />
  <psc:chapter start="42:20" title="Menotti gives his assessment on the current outlook for Privacy Engineering roles given the economic climate" />
  <psc:chapter start="48:45" title="Menotti outlines some of the resources he uses to stay current on privacy engineering" />
</psc:chapters>
    <itunes:duration>3203</itunes:duration>
    <itunes:keywords>privacy engineer, privacy engineering manager, zero interest rate bet, ZIRB, PEPR, Twitter, Uber, privacy outcomes, privacy architecture</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>11</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E10: Leveraging Synthetic Data and Privacy Guarantees with Lipika Ramaswamy (Gretel.ai)</itunes:title>
    <title>S2E10: Leveraging Synthetic Data and Privacy Guarantees with Lipika Ramaswamy (Gretel.ai)</title>
    <itunes:summary><![CDATA[This week, we welcome Lipika Ramaswamy, Senior Applied Scientist at Gretel AI, a privacy tech company that makes it simple to generate anonymized and safe synthetic data via APIs. Previously, Lipika worked as a Data Scientist at LeapYear Technologies, and was the Machine Learning Researcher at Harvard University's Privacy Tools Project.   Lipika’s interest in both machine learning and privacy comes from her love of math and things that can be defined with equations. Her interest was piqued in...]]></itunes:summary>
    <description><![CDATA[<p>This week, we welcome <a href='https://www.linkedin.com/in/lipikaramaswamy/'>Lipika Ramaswamy</a>, Senior Applied Scientist at <a href='https://gretel.ai/'>Gretel AI</a>, a privacy tech company that makes it simple to generate anonymized and safe synthetic data via APIs. Previously, Lipika worked as a Data Scientist at LeapYear Technologies, and was the Machine Learning Researcher at <a href='https://privacytools.seas.harvard.edu/'>Harvard University&apos;s Privacy Tools Project</a>.</p><p><br/></p><p>Lipika’s interest in both machine learning and privacy comes from her love of math and things that can be defined with equations. Her interest was piqued in grad school when she accidentally walked into a classroom holding a lecture on Applying Differential Privacy for Data Science. The intersection of data combined with the privacy guarantees that we have available today has kept her hooked ever since.<br/><br/></p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>There&apos;s a lot to unpack when it comes to synthetic data &amp; privacy guarantees, and she takes listeners on a deep dive into these compelling topics. Lipika finds it elegant how privacy assurances like differential privacy revolve around math and statistics at their core. Essentially, she loves building things with &apos;usable privacy&apos; &amp; security that people can easily use. 
We also delve into the metrics tracked in the Gretel Synthetic Data Report, which assesses both &apos;statistical integrity&apos; &amp; &apos;privacy levels&apos; of a customer&apos;s training data.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>The definition of &apos;synthetic data,&apos; &amp; good use cases</li><li>The process of creating synthetic data</li><li>How to ensure that synthetic data is &apos;privacy-preserving&apos;</li><li>Privacy problems that may arise from overtraining ML models</li><li>When to use synthetic data rather than other techniques like tokenization, anonymization, aggregation &amp; others</li><li>Examples of good use cases vs poor use cases for using synthetic data</li><li>Common misperceptions around synthetic data</li><li>Gretel.ai&apos;s approach to &apos;privacy assurance,&apos; including a focus on &apos;privacy filters,&apos; which prevent some privacy harms outputted by LLMs</li><li>How to plug into the &apos;synthetic data&apos; community</li><li>Who bears the responsibility for educating the public about new technology like LLMs and potential harms</li><li>Highlights from Gretel.ai&apos;s Synthesize 2023 conference</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Join Gretel&apos;s <a href='https://discord.com/invite/U6dAWX4CFP'>Synthetic Data Community on Discord</a></li><li>Watch <a href='https://www.youtube.com/@gretel_ai/featured'>Talks on Synthetic Data on YouTube</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Lipika on <a href='https://www.linkedin.com/in/lipikaramaswamy/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, we welcome <a href='https://www.linkedin.com/in/lipikaramaswamy/'>Lipika Ramaswamy</a>, Senior Applied Scientist at <a href='https://gretel.ai/'>Gretel AI</a>, a privacy tech company that makes it simple to generate anonymized and safe synthetic data via APIs. Previously, Lipika worked as a Data Scientist at LeapYear Technologies, and was the Machine Learning Researcher at <a href='https://privacytools.seas.harvard.edu/'>Harvard University&apos;s Privacy Tools Project</a>.</p><p><br/></p><p>Lipika’s interest in both machine learning and privacy comes from her love of math and things that can be defined with equations. Her interest was piqued in grad school, when she accidentally walked into a classroom holding a lecture on Applying Differential Privacy for Data Science. The intersection of data and the privacy guarantees available today has kept her hooked ever since.<br/><br/></p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>There&apos;s a lot to unpack when it comes to synthetic data &amp; privacy guarantees, and Lipika takes listeners on a deep dive into these compelling topics. She finds it elegant how privacy assurances like differential privacy revolve around math and statistics at their core. Essentially, she loves building &apos;usable privacy&apos; &amp; security tools that people can easily adopt. 
We also delve into the metrics tracked in the Gretel Synthetic Data Report, which assesses both &apos;statistical integrity&apos; &amp; &apos;privacy levels&apos; of a customer&apos;s training data.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>The definition of &apos;synthetic data,&apos; &amp; good use cases</li><li>The process of creating synthetic data</li><li>How to ensure that synthetic data is &apos;privacy-preserving&apos;</li><li>Privacy problems that may arise from overtraining ML models</li><li>When to use synthetic data rather than other techniques like tokenization, anonymization, aggregation &amp; others</li><li>Examples of good use cases vs poor use cases for using synthetic data</li><li>Common misperceptions around synthetic data</li><li>Gretel.ai&apos;s approach to &apos;privacy assurance,&apos; including a focus on &apos;privacy filters,&apos; which prevent certain privacy harms in LLM outputs</li><li>How to plug into the &apos;synthetic data&apos; community</li><li>Who bears the responsibility for educating the public about new technology like LLMs and potential harms</li><li>Highlights from Gretel.ai&apos;s Synthesize 2023 conference</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Join Gretel&apos;s <a href='https://discord.com/invite/U6dAWX4CFP'>Synthetic Data Community on Discord</a></li><li>Watch <a href='https://www.youtube.com/@gretel_ai/featured'>Talks on Synthetic Data on YouTube</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Lipika on <a href='https://www.linkedin.com/in/lipikaramaswamy/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12308780-s2e10-leveraging-synthetic-data-and-privacy-guarantees-with-lipika-ramaswamy-gretel-ai.mp3" length="32913902" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/412005</link>
    <itunes:image href="https://storage.buzzsprout.com/ydq3iz5h4bml63eflbxoeudfzp5l?.jpg" />
    <itunes:author>Debra J. Farber / Lipika Ramaswamy</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12308780</guid>
    <pubDate>Tue, 14 Mar 2023 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12308780/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12308780/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E10: Leveraging Synthetic Data and Privacy Guarantees with Lipika Ramaswamy (Gretel.ai)" />
  <psc:chapter start="1:15" title="Debra introduces Lipika Ramaswamy, Sr. Applied Scientist at Gretel.ai" />
  <psc:chapter start="1:56" title="Lipika discusses her origin story: her interest in ML &amp; privacy, and how she ended up in the field" />
  <psc:chapter start="3:18" title="Lipika defines &#39;synthetic data&#39; &amp; good use cases for synthetic data" />
  <psc:chapter start="6:12" title="Lipika discusses the process of creating synthetic data" />
  <psc:chapter start="8:18" title="How to ensure that synthetic data is &#39;privacy-preserving&#39;" />
  <psc:chapter start="10:12" title="Privacy problems that may arise from overtraining ML models" />
  <psc:chapter start="11:04" title="When to use synthetic data rather than other techniques like tokenization, anonymization, aggregation &amp; others" />
  <psc:chapter start="17:14" title="Selecting the right PET to use for a specific use case depends on: 1) the data model; 2) the adversarial modeling you&#39;re working with; and, 3) your analytics goals" />
  <psc:chapter start="19:46" title="Good &amp; poor use cases for &#39;synthetic data&#39;" />
  <psc:chapter start="21:20" title="Common misperceptions around synthetic data" />
  <psc:chapter start="24:10" title="Gretel.ai&#39;s approach to &#39;privacy assurance,&#39; including a focus on &#39;privacy filters,&#39; which prevent certain privacy harms in LLM outputs" />
  <psc:chapter start="30:07" title="Lipika recommends several communities to plug into to keep up-to-date with the &#39;synthetic data&#39; community" />
  <psc:chapter start="32:38" title="We discuss the problem of bias in ML/AI" />
  <psc:chapter start="33:26" title="Debra &amp; Lipika discuss who bears the responsibility for educating the public about new technology like LLMs and potential harms" />
  <psc:chapter start="40:38" title="Debra &amp; Lipika discuss the privacy problems involved with training on publicly available data that is still considered personal data" />
  <psc:chapter start="43:24" title="Lipika highlights Gretel.ai&#39;s recent &#39;synthetic data&#39; conference, Synthesize 2023" />
</psc:chapters>
    <itunes:duration>2738</itunes:duration>
    <itunes:keywords>synthetic data, privacy guarantees, ML overtraining, privacy assurance, privacy filters, Gretel.ai, Synthesize conference, large language models, LLMs</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>10</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E9: Personalized Noise, Decaying Photos, &amp; Digital Forgetting with Apu Kapadia (Indiana University Bloomington)</itunes:title>
    <title>S2E9: Personalized Noise, Decaying Photos, &amp; Digital Forgetting with Apu Kapadia (Indiana University Bloomington)</title>
    <itunes:summary><![CDATA[In this episode, I'm delighted to welcome Apu Kapadia, Professor of Computer Science and Informatics at the School of Informatics and Computing, Indiana University. His research is focused on the privacy implications of ubiquitous cameras and online photo sharing. More recently, he has examined the cybersecurity and privacy challenges posed by AI-based smart voice assistants that can listen and converse with us.  Prof. Kapadia has been excited by anonymized networks since childhood. He has me...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I&apos;m delighted to welcome <a href='https://www.linkedin.com/in/akapadia/'>Apu Kapadia</a>, Professor of Computer Science and Informatics at the School of Informatics and Computing, Indiana University. His research is focused on the privacy implications of ubiquitous cameras and online photo sharing. More recently, he has examined the cybersecurity and privacy challenges posed by AI-based smart voice assistants that can listen and converse with us.</p><p><br/>Prof. Kapadia has been excited by anonymized networks since childhood. He has memories of watching movies where a telephone call was being routed around the world so that it became impossible to trace. What really fascinates him now is how much there is to understand mathematically and technically in order to measure that amount of privacy. In more recent years, he has been interested in privacy in the context of digital photography and audio shared online and on social media. His current research is focused on understanding privacy issues around photo sharing in a world with cameras everywhere.<br/><br/></p><p>In this conversation, we delve into how users are affected once privacy violations have already occurred, the implications of privacy of children when it comes to parents sharing photos of them online, the fascinating future of trusted hardware that will help ensure &quot;digital forgetting,&quot; and how all of this is a people problem as much as it is a technical problem.</p><p><br/></p><p><b>Topics Covered</b>:</p><ul><li>Can we trick &apos;automated speech recognition&apos; (ASR)?</li><li>Apu&apos;s co-authored paper: &apos;Defending Against Microphone-based Attacks with Personalized Noise&apos;</li><li>What Apu means by &apos;tangible privacy&apos; &amp; what design approaches he recommends</li><li>Apu&apos;s view on &apos;bystander privacy&apos; &amp; the approach that he took in his research</li><li>How to leverage &apos;temporal redactions&apos; 
via &apos;trusted hardware&apos; for &apos;digital forgetting&apos;</li><li>Apu’s surprising finding in his research on &quot;interpersonal privacy&quot; in the context of social media and photos</li><li>Guidance for developers building privacy-respecting social media apps</li><li>Apu&apos;s research focused on cybersecurity &amp; privacy for marginalized &amp; vulnerable populations</li><li>How we can make privacy &amp; security more &apos;usable&apos;</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read <a href='https://homes.luddy.indiana.edu/kapadia/papers/popets21-noise.pdf'>Defending Against Microphone-Based Attacks with Personalized Noise</a></li><li>Read <a href='https://homes.luddy.indiana.edu/kapadia/papers/pias-cscw22.pdf'>Decaying Photos for Enhanced Privacy: User Perceptions Towards Temporal Redactions and &apos;Trusted&apos; Platforms</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Prof. Kapadia on <a href='https://www.linkedin.com/in/akapadia/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I&apos;m delighted to welcome <a href='https://www.linkedin.com/in/akapadia/'>Apu Kapadia</a>, Professor of Computer Science and Informatics at the School of Informatics and Computing, Indiana University. His research is focused on the privacy implications of ubiquitous cameras and online photo sharing. More recently, he has examined the cybersecurity and privacy challenges posed by AI-based smart voice assistants that can listen and converse with us.</p><p><br/>Prof. Kapadia has been excited by anonymized networks since childhood. He has memories of watching movies where a telephone call was being routed around the world so that it became impossible to trace. What really fascinates him now is how much there is to understand mathematically and technically in order to measure that amount of privacy. In more recent years, he has been interested in privacy in the context of digital photography and audio shared online and on social media. His current research is focused on understanding privacy issues around photo sharing in a world with cameras everywhere.<br/><br/></p><p>In this conversation, we delve into how users are affected once privacy violations have already occurred, the implications of privacy of children when it comes to parents sharing photos of them online, the fascinating future of trusted hardware that will help ensure &quot;digital forgetting,&quot; and how all of this is a people problem as much as it is a technical problem.</p><p><br/></p><p><b>Topics Covered</b>:</p><ul><li>Can we trick &apos;automated speech recognition&apos; (ASR)?</li><li>Apu&apos;s co-authored paper: &apos;Defending Against Microphone-based Attacks with Personalized Noise&apos;</li><li>What Apu means by &apos;tangible privacy&apos; &amp; what design approaches he recommends</li><li>Apu&apos;s view on &apos;bystander privacy&apos; &amp; the approach that he took in his research</li><li>How to leverage &apos;temporal 
redactions&apos; via &apos;trusted hardware&apos; for &apos;digital forgetting&apos;</li><li>Apu’s surprising finding in his research on &quot;interpersonal privacy&quot; in the context of social media and photos</li><li>Guidance for developers building privacy-respecting social media apps</li><li>Apu&apos;s research focused on cybersecurity &amp; privacy for marginalized &amp; vulnerable populations</li><li>How we can make privacy &amp; security more &apos;usable&apos;</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read <a href='https://homes.luddy.indiana.edu/kapadia/papers/popets21-noise.pdf'>Defending Against Microphone-Based Attacks with Personalized Noise</a></li><li>Read <a href='https://homes.luddy.indiana.edu/kapadia/papers/pias-cscw22.pdf'>Decaying Photos for Enhanced Privacy: User Perceptions Towards Temporal Redactions and &apos;Trusted&apos; Platforms</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Prof. Kapadia on <a href='https://www.linkedin.com/in/akapadia/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12267602-s2e9-personalized-noise-decaying-photos-digital-forgetting-with-apu-kapadia-indiana-university-bloomington.mp3" length="33543087" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/402170</link>
    <itunes:image href="https://storage.buzzsprout.com/id6c0s4mj337dm436ti75va9ardf?.jpg" />
    <itunes:author>Debra J. Farber / Apu Kapadia</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12267602</guid>
    <pubDate>Tue, 07 Mar 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12267602/transcript" type="text/html" />
    <podcast:soundbite startTime="1741.793" duration="58.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12267602/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E9: Personalized Noise, Decaying Photos, &amp; Digital Forgetting with Apu Kapadia (Indiana University Bloomington)" />
  <psc:chapter start="1:15" title="Introducing Apu Kapadia" />
  <psc:chapter start="2:59" title="Apu describes his current research exploring security &amp; privacy issues related to audio &amp; voice" />
  <psc:chapter start="6:22" title="Debra gives an overview of Apu&#39;s co-authored paper: &#39;Defending Against Microphone-based Attacks with Personalized Noise&#39;" />
  <psc:chapter start="8:01" title="Apu describes the approach he took to solve for these problems and the results of his research" />
  <psc:chapter start="15:01" title="Apu explains what he means by &#39;tangible privacy&#39; &amp; what design approaches he recommends" />
  <psc:chapter start="17:53" title="Apu explains what he means by &#39;bystander privacy&#39; &amp; the approach that he took in his research" />
  <psc:chapter start="23:04" title="&#39;Digital Forgetting.&#39; Apu describes research on &#39;temporal redactions&#39; that can be applied through &#39;trusted hardware,&#39; which he raised in his paper: &quot;Decaying Photos for Enhanced Privacy: user perceptions towards temporal redactions &amp; trusted platforms&quot;" />
  <psc:chapter start="29:35" title="Apu gives guidance for developers building privacy-respecting social media apps" />
  <psc:chapter start="32:55" title="Apu tells us about his grant with The National Science Foundation with a focus on cybersecurity &amp; privacy for marginalized &amp; vulnerable populations" />
  <psc:chapter start="35:17" title="Apu gives some use cases regarding his NSF research" />
  <psc:chapter start="37:45" title="Apu describes his love and fascination for anonymizing networks like Tor" />
  <psc:chapter start="44:01" title="Apu describes how we can make privacy &amp; security more &#39;usable&#39;" />
</psc:chapters>
    <itunes:duration>2791</itunes:duration>
    <itunes:keywords>privacy research, IoT privacy, voice assistants, digital forgetting, personalized noise, decaying photos, interpersonal privacy, tangible privacy, bystander privacy, temporal redactions, useable privacy, automated speech recognition</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>9</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E8: Leveraging Federated Learning for Input Privacy with Victor Platt</itunes:title>
    <title>S2E8: Leveraging Federated Learning for Input Privacy with Victor Platt</title>
    <itunes:summary><![CDATA[Victor Platt is a Senior AI Security and Privacy Strategist who previously served as Head of Security and Privacy for privacy tech company, Integrate.ai. Victor was formerly a founding member of the Risk AI Team with Omnia AI, Deloitte’s artificial intelligence practice in Canada. He joins today to discuss privacy enhancing technologies (PETs) that are shaping industries around the world, with a focus on federated learning. --------- Thank you to our sponsor, Privado, the developer-friendly pr...]]></itunes:summary>
    <description><![CDATA[<p>Victor Platt is a Senior AI Security and Privacy Strategist who previously served as Head of Security and Privacy for privacy tech company, Integrate.ai. Victor was formerly a founding member of the Risk AI Team with Omnia AI, Deloitte’s artificial intelligence practice in Canada. He joins today to discuss privacy enhancing technologies (PETs) that are shaping industries around the world, with a focus on federated learning.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/>Victor views PETs as functional requirements and says they shouldn’t be buried in your design document as nonfunctional obligations. In his work, he has found key gaps where organizations were only doing “security for security’s sake.” Rather, he believes organizations should be thinking about privacy at the forefront. Not only that, we should all be getting excited about it because we all have a stake in privacy.</p><p><br/></p><p>With federated learning, you have the tools available to train ML models on large data sets with precision at scale without risking user privacy. 
In this conversation, Victor demystifies what federated learning is, describes the two different types: at the edge and across data silos, and explains how it works and how it compares to traditional machine learning. We dive deep into how an organization knows when to use federated learning, with specific advice for developers and data scientists as they implement it in their organizations.</p><p><br/></p><p><b>Topics Covered</b>:</p><ul><li>What &apos;federated learning&apos; is and how it compares to traditional machine learning</li><li>When an organization should use vertical federated learning vs horizontal federated learning, or instead a hybrid version</li><li>A key challenge in &apos;transfer learning&apos;: knowing whether two data sets are related to each other and techniques to overcome this, like &apos;private set intersection&apos;</li><li>How the future of technology will be underpinned by a &apos;constellation of PETs&apos;</li><li>The distinction between &apos;input privacy&apos; vs. 
&apos;output privacy&apos;</li><li>Different kinds of federated learning with use case examples</li><li>Where the responsibility for adding PETs lies within an organization</li><li>The key barriers to adopting federated learning and other PETs within different industries and use cases</li><li>How to move the needle on data privacy when it comes to legislation and regulation</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Take this outstanding, free class from OpenMined:  <a href='https://courses.openmined.org/'>Our Privacy Opportunity</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Victor on <a href='https://www.linkedin.com/in/victor-platt/'>LinkedIn</a></li></ul><p><b>Follow the SPL Show</b>:</p><ul><li>Follow us on <a href='https://twitter.com/ShiftPrivacyPod'>Twitter</a></li><li>Follow us on <a href='https://www.linkedin.com/showcase/shifting-privacy-left-podcast/?'>LinkedIn</a></li><li>Check out our <a href='https://shiftingprivacyleft.com/'>website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Victor Platt is a Senior AI Security and Privacy Strategist who previously served as Head of Security and Privacy for privacy tech company, Integrate.ai. Victor was formerly a founding member of the Risk AI Team with Omnia AI, Deloitte’s artificial intelligence practice in Canada. He joins today to discuss privacy enhancing technologies (PETs) that are shaping industries around the world, with a focus on federated learning.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/>Victor views PETs as functional requirements and says they shouldn’t be buried in your design document as nonfunctional obligations. In his work, he has found key gaps where organizations were only doing “security for security’s sake.” Rather, he believes organizations should be thinking about privacy at the forefront. Not only that, we should all be getting excited about it because we all have a stake in privacy.</p><p><br/></p><p>With federated learning, you have the tools available to train ML models on large data sets with precision at scale without risking user privacy. 
In this conversation, Victor demystifies what federated learning is, describes the two different types: at the edge and across data silos, and explains how it works and how it compares to traditional machine learning. We dive deep into how an organization knows when to use federated learning, with specific advice for developers and data scientists as they implement it in their organizations.</p><p><br/></p><p><b>Topics Covered</b>:</p><ul><li>What &apos;federated learning&apos; is and how it compares to traditional machine learning</li><li>When an organization should use vertical federated learning vs horizontal federated learning, or instead a hybrid version</li><li>A key challenge in &apos;transfer learning&apos;: knowing whether two data sets are related to each other and techniques to overcome this, like &apos;private set intersection&apos;</li><li>How the future of technology will be underpinned by a &apos;constellation of PETs&apos;</li><li>The distinction between &apos;input privacy&apos; vs. 
&apos;output privacy&apos;</li><li>Different kinds of federated learning with use case examples</li><li>Where the responsibility for adding PETs lies within an organization</li><li>The key barriers to adopting federated learning and other PETs within different industries and use cases</li><li>How to move the needle on data privacy when it comes to legislation and regulation</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Take this outstanding, free class from OpenMined:  <a href='https://courses.openmined.org/'>Our Privacy Opportunity</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Victor on <a href='https://www.linkedin.com/in/victor-platt/'>LinkedIn</a></li></ul><p><b>Follow the SPL Show</b>:</p><ul><li>Follow us on <a href='https://twitter.com/ShiftPrivacyPod'>Twitter</a></li><li>Follow us on <a href='https://www.linkedin.com/showcase/shifting-privacy-left-podcast/?'>LinkedIn</a></li><li>Check out our <a href='https://shiftingprivacyleft.com/'>website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12209272-s2e8-leveraging-federated-learning-for-input-privacy-with-victor-platt.mp3" length="29812808" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/396366</link>
    <itunes:image href="https://storage.buzzsprout.com/edek2px9ip1umlh1wxv4530iv3dh?.jpg" />
    <itunes:author>Debra J Farber / Victor Platt</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12209272</guid>
    <pubDate>Tue, 28 Feb 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12209272/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12209272/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E8: Leveraging Federated Learning for Input Privacy with Victor Platt" />
  <psc:chapter start="1:15" title="Introducing Victor Platt" />
  <psc:chapter start="2:20" title="Victor&#39;s &#39;origin story&#39; and how he got interested in PETs like &#39;federated learning&#39;" />
  <psc:chapter start="5:37" title="Victor describes what &#39;federated learning&#39; is and how it compares to traditional machine learning" />
  <psc:chapter start="6:52" title="Victor explains the use of federated learning &#39;at the edge&#39; &amp; example use cases" />
  <psc:chapter start="8:45" title="Victor explains federated learning &#39;across silos&#39; &amp; example use cases" />
  <psc:chapter start="11:36" title="Victor explains when an org would consider using vertical vs. horizontal federated learning or a hybrid of both" />
  <psc:chapter start="14:16" title="Victor outlines a key challenge in &#39;transfer learning&#39;: knowing whether two data sets are related to each other. He also explains how &#39;private set intersection&#39; techniques can help blindly identify common rows between data sets" />
  <psc:chapter start="16:49" title="Victor describes the two paths to when you should consider federated learning for your use case" />
  <psc:chapter start="19:44" title="Victor shares his excitement for the future of technology, which he believes will be underpinned by a &#39;constellation of PETs&#39;. He also distinguishes between PETs that assist with &#39;input privacy&#39; vs. &#39;output privacy&#39;" />
  <psc:chapter start="22:38" title="Victor shares his view on where the responsibility for leveraging PETs lies within an org" />
  <psc:chapter start="25:42" title="Victor describes key barriers to adoption of federated learning &amp; other PETs within different industries" />
  <psc:chapter start="29:21" title="Victor shares how we can move the needle on updating laws that have antiquated approaches to anonymity / &#39;de-identification&#39; (e.g. HIPAA) and to include the use of new PETs" />
  <psc:chapter start="35:54" title="Debra &amp; Victor discuss standards in development related to PETs like federated learning, differential privacy &amp; homomorphic encryption" />
  <psc:chapter start="37:46" title="Victor shares his advice for developers and data scientists as they implement privacy by design into their organizations. He recommends taking OpenMined&#39;s (free) course: &#39;Our Privacy Opportunity&#39;" />
</psc:chapters>
    <itunes:duration>2480</itunes:duration>
    <itunes:keywords>federated learning, PETs, Victor Platt, privacy enhancing technologies, input privacy, output privacy, private set intersection, OpenMined, horizontal federated learning, vertical federated learning, AI, machine learning, Our Privacy Opportunity, data val</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>8</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E7: Bring Your Own Data, ChatGPT &amp; Personal AIs with Markus Lampinen (Prifina)</itunes:title>
    <title>S2E7: Bring Your Own Data, ChatGPT &amp; Personal AIs with Markus Lampinen (Prifina)</title>
    <itunes:summary><![CDATA[In this conversation with Markus Lampinen, Co-founder and CEO at Prifina, a personal data platform, we discuss meaty topics like: Prifina’s approach to building privacy-respecting apps for consumer wearable sensors; LLMs (Large Language Models) like ChatGPT; and why we should consider training our own personal AIs.  Markus shares his entrepreneurial journey in the privacy world and how he is “the biggest data nerd you’ll find.” It started with tracking his own data, like his eating habits, ac...]]></itunes:summary>
    <description><![CDATA[<p>In this conversation with <a href='https://www.linkedin.com/in/markuslampinen/'>Markus Lampinen</a>, Co-founder and CEO at <a href='https://www.prifina.com/'>Prifina</a>, a personal data platform, we discuss meaty topics like: Prifina’s approach to building privacy-respecting apps for consumer wearable sensors; LLMs (Large Language Models) like ChatGPT; and why we should consider training our own personal AIs.<br/><br/>Markus shares his entrepreneurial journey in the privacy world and how he is “the biggest data nerd you’ll find.” It started with tracking his own data, like his eating habits, activity, sleep, and stress, and then he built his company around that interest. His curiosity about what you can glean from one&apos;s own data made him wonder how you could also improve your life or the lives of your customers with that data.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>We discuss how to approach building a privacy-first platform to unlock the value and use of IoT / sensor data. It began with the concept of individual ownership: who should actually benefit from the data that we generate? Markus says it should be individuals themselves. <br/><br/>Prifina boasts a strong community of 30,000 developers who align around common interests - <a href='https://www.prifina.com/slack.html'>liberty, equality &amp; data</a> - and build and test prototypes that gather and use data in service of individuals, as opposed to corporate entities. 
The aim is to empower individuals, companies &amp; developers to build apps that re-purpose individuals&apos; own sensor data to gain privacy-enabled insights.<br/><br/></p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------<br/><br/><b>Topics Covered</b>:</p><ul><li>Enabling true, consumer-grade &apos;data portability&apos; with personal data clouds (a &apos;bring your own data&apos; approach)</li><li>Use cases to illustrate the problems Prifina is solving with sensors</li><li>What large language models (LLMs) and the chatbots trained on them are, and why they are so hot right now</li><li>The dangers of using LLMs, with emphasis on privacy harms</li><li>How to benefit from our own data with personal AIs</li><li>Advice to data scientists, researchers and developers regarding how to architect for ethical uses of LLMs</li><li>Who&apos;s responsible for educating the public about LLMs, chatbots, and their potential harms &amp; limitations</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li>Learn more about <a href='https://www.prifina.com/'>Prifina</a></li><li>Join Prifina&apos;s Slack Community: <a href='https://www.prifina.com/slack.html'>Liberty.Equality.Data</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Follow Markus on <a href='https://www.linkedin.com/in/markuslampinen/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this conversation with <a href='https://www.linkedin.com/in/markuslampinen/'>Markus Lampinen</a>, Co-founder and CEO at <a href='https://www.prifina.com/'>Prifina</a>, a personal data platform, we discuss meaty topics like: Prifina’s approach to building privacy-respecting apps for consumer wearable sensors; LLMs (Large Language Models) like ChatGPT; and why we should consider training our own personal AIs.<br/><br/>Markus shares his entrepreneurial journey in the privacy world and how he is “the biggest data nerd you’ll find.” It started with tracking his own data, like his eating habits, activity, sleep, and stress, and then he built his company around that interest. His curiosity about what you can glean from one&apos;s own data made him wonder how you could also improve your life or the lives of your customers with that data.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>We discuss how to approach building a privacy-first platform to unlock the value and use of IoT / sensor data. It began with the concept of individual ownership: who should actually benefit from the data that we generate? Markus says it should be individuals themselves. <br/><br/>Prifina boasts a strong community of 30,000 developers who align around common interests - <a href='https://www.prifina.com/slack.html'>liberty, equality &amp; data</a> - and build and test prototypes that gather and use data in service of individuals, as opposed to corporate entities. 
The aim is to empower individuals, companies &amp; developers to build apps that re-purpose individuals&apos; own sensor data to gain privacy-enabled insights.<br/><br/></p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------<br/><br/><b>Topics Covered</b>:</p><ul><li>Enabling true, consumer-grade &apos;data portability&apos; with personal data clouds (a &apos;bring your own data&apos; approach)</li><li>Use cases to illustrate the problems Prifina is solving with sensors</li><li>What large language models (LLMs) and the chatbots trained on them are, and why they are so hot right now</li><li>The dangers of using LLMs, with emphasis on privacy harms</li><li>How to benefit from our own data with personal AIs</li><li>Advice to data scientists, researchers and developers regarding how to architect for ethical uses of LLMs</li><li>Who&apos;s responsible for educating the public about LLMs, chatbots, and their potential harms &amp; limitations</li></ul><p><br/><b>Resources Mentioned</b>:</p><ul><li>Learn more about <a href='https://www.prifina.com/'>Prifina</a></li><li>Join Prifina&apos;s Slack Community: <a href='https://www.prifina.com/slack.html'>Liberty.Equality.Data</a></li></ul><p><br/><b>Guest Info</b>:</p><ul><li>Follow Markus on <a href='https://www.linkedin.com/in/markuslampinen/'>LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12209256-s2e7-bring-your-own-data-chatgpt-personal-ais-with-markus-lampinen-prifina.mp3" length="42531385" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/391573</link>
    <itunes:image href="https://storage.buzzsprout.com/f8szrqapajhlijqftiico8zsysj7?.jpg" />
    <itunes:author>Debra J. Farber / Markus Lampinen</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12209256</guid>
    <pubDate>Tue, 21 Feb 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12209256/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12209256/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E7: Bring Your Own Data, ChatGPT &amp; Personal AIs with Markus Lampinen (Prifina)" />
  <psc:chapter start="1:15" title="Introducing Markus Lampinen" />
  <psc:chapter start="2:23" title="Markus relays his &#39;data geek&#39; origin story" />
  <psc:chapter start="5:24" title="Why Markus founded Prifina, his approach to building a privacy-first platform, and how it unlocks the value and use of IOT / sensor data" />
  <psc:chapter start="13:29" title="Markus describes the Prifina community of over 30,000 developers" />
  <psc:chapter start="17:09" title="Debra &amp; Markus discuss the importance of building privacy constraints into engineering tools and why it&#39;s essential to build with privacy by design and default" />
  <psc:chapter start="23:54" title="Enabling true, consumer-grade &#39;data portability&#39; with personal data clouds" />
  <psc:chapter start="26:58" title="Debra explains what large language models (LLMs) and the chatbots trained on them are, and why they are so hot right now" />
  <psc:chapter start="28:48" title="Markus relays his belief that we&#39;re only at the top of the 1st wave of the hype cycle for LLMs" />
  <psc:chapter start="30:13" title="Markus opines on why technologists are so excited about generative AI" />
  <psc:chapter start="33:00" title="Markus differentiates between ChatGPT and GPT3" />
  <psc:chapter start="35:04" title="Markus describes some of the dangers of using LLMs with a focus on privacy harms" />
  <psc:chapter start="40:16" title="Debra describes the privacy harm of &#39;decisional interference&#39;" />
  <psc:chapter start="40:53" title="Debra wonders whether global regulators may force companies to throw away unethically-trained chatbots" />
  <psc:chapter start="44:08" title="Markus envisions a future where we all have our own &#39;personal AIs&#39;" />
  <psc:chapter start="45:20" title="Markus&#39;s advice to data scientists, researchers, and developers regarding how to architect for ethical uses of LLMs" />
  <psc:chapter start="48:42" title="Discussion over whose job it is to educate the public on LLMs, chatbots, their potential harms &amp; limitations, etc." />
  <psc:chapter start="52:28" title="Markus is optimistic about future investments in a &#39;consumer market&#39; for privacy-enablement and portability that rivals previous investment in enterprise tech" />
  <psc:chapter start="55:57" title="Markus talks about Prifina&#39;s Slack community: &#39;Liberty. Equality. Data.&#39;" />
</psc:chapters>
    <itunes:duration>3540</itunes:duration>
    <itunes:keywords>personal AIs, privacy, ChatGPT, LLMs, large language models, sensors, model, developers, generative AI, Prifina</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>7</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E6: &#39;Privacy Left Trust&#39; with Gary LaFever (Anonos)</itunes:title>
    <title>S2E6: &#39;Privacy Left Trust&#39; with Gary LaFever (Anonos)</title>
    <itunes:summary><![CDATA[Today, I welcome Gary LaFever, co-CEO &amp; GC at Anonos; WEF Global Innovator; and a solutions-oriented futurist with a computer science and legal background. Gary has over 35 years of technical, legal and policy experience that enables him to approach issues from multiple perspectives. I last saw Gary when we shared the stage at a RegTech conference in London six years ago, and it was a pleasure to speak with him again to discuss how the Schrems II decision coupled with the increasing preva...]]></itunes:summary>
    <description><![CDATA[<p>Today, I welcome Gary LaFever, co-CEO &amp; GC at Anonos; WEF Global Innovator; and a solutions-oriented futurist with a computer science and legal background. Gary has over 35 years of technical, legal and policy experience that enables him to approach issues from multiple perspectives. I last saw Gary when we shared the stage at a RegTech conference in London six years ago, and it was a pleasure to speak with him again to discuss how the Schrems II decision, coupled with the increasing prevalence of data breaches and ransomware attacks, has shifted privacy left from optional to mandatory, necessitating a &quot;privacy left trust&quot; approach.<br/><br/>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>Gary describes the 7 Universal Data Use Cases with relatable examples and how they are applicable across orgs and industries, regardless of jurisdiction. We then dive into what Gary is seeing in the market in regard to the use cases. He then reveals the 3 Main Data Use Obstacles to accomplishing these use cases and how to overcome them with &quot;statutory pseudonymization&quot; and &quot;synthetic data.&quot;</p><p><br/></p><p>In this conversation that evaluates how we can do business in a de-risked environment, we discuss why you can&apos;t approach privacy with just words - contracts, policies, and treaties; why it&apos;s essential to protect data in use; and how you can embed technical controls that move with data for protection that meets regulatory thresholds while &quot;in use&quot; to unlock additional data use cases. 
I.e., these effective controls equate to competitive advantage.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>Why trust must be updated to be technologically enforced - &quot;privacy left trust&quot;</li><li>The increasing prevalence of data breaches and ransomware attacks and how they have shifted privacy left from optional to mandatory</li><li>7 Data Use Cases, 3 Data Use Obstacles, and deployable technologies to unlock new data use cases</li><li>How the market is adopting technology for the 7 use cases and trends that Gary is seeing</li><li>What it means to &quot;de-risk&quot; data</li><li>Beneficial uses of &quot;variant twins&quot; technology</li><li>Building privacy in by design, so it increases revenue generation</li><li>&quot;Statutory pseudonymization&quot; and how it will help you reduce data privacy risks while increasing utility and value</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Learn about <a href='http://www.anonos.com'>Anonos</a></li><li>Read: <a href='https://www.anonos.com/hubfs/Documents/Whitepapers/Jan_2023_Journal_of_Data_Protection_and_Privacy_Article.pdf?hsLang=en'>&quot;Technical Controls that Protect Data When in Use and Prevent Misuse&quot;</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Gary on <a href='https://www.linkedin.com/in/garylafever/'>LinkedIn</a></li><li>Follow Gary on <a href='https://twitter.com/GaryLaFever'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Today, I welcome Gary LaFever, co-CEO &amp; GC at Anonos; WEF Global Innovator; and a solutions-oriented futurist with a computer science and legal background. Gary has over 35 years of technical, legal and policy experience that enables him to approach issues from multiple perspectives. I last saw Gary when we shared the stage at a RegTech conference in London six years ago, and it was a pleasure to speak with him again to discuss how the Schrems II decision, coupled with the increasing prevalence of data breaches and ransomware attacks, has shifted privacy left from optional to mandatory, necessitating a &quot;privacy left trust&quot; approach.<br/><br/>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>Gary describes the 7 Universal Data Use Cases with relatable examples and how they are applicable across orgs and industries, regardless of jurisdiction. We then dive into what Gary is seeing in the market in regard to the use cases. He then reveals the 3 Main Data Use Obstacles to accomplishing these use cases and how to overcome them with &quot;statutory pseudonymization&quot; and &quot;synthetic data.&quot;</p><p><br/></p><p>In this conversation that evaluates how we can do business in a de-risked environment, we discuss why you can&apos;t approach privacy with just words - contracts, policies, and treaties; why it&apos;s essential to protect data in use; and how you can embed technical controls that move with data for protection that meets regulatory thresholds while &quot;in use&quot; to unlock additional data use cases. 
I.e., these effective controls equate to competitive advantage.<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>Why trust must be updated to be technologically enforced - &quot;privacy left trust&quot;</li><li>The increasing prevalence of data breaches and ransomware attacks and how they have shifted privacy left from optional to mandatory</li><li>7 Data Use Cases, 3 Data Use Obstacles, and deployable technologies to unlock new data use cases</li><li>How the market is adopting technology for the 7 use cases and trends that Gary is seeing</li><li>What it means to &quot;de-risk&quot; data</li><li>Beneficial uses of &quot;variant twins&quot; technology</li><li>Building privacy in by design, so it increases revenue generation</li><li>&quot;Statutory pseudonymization&quot; and how it will help you reduce data privacy risks while increasing utility and value</li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Learn about <a href='http://www.anonos.com'>Anonos</a></li><li>Read: <a href='https://www.anonos.com/hubfs/Documents/Whitepapers/Jan_2023_Journal_of_Data_Protection_and_Privacy_Article.pdf?hsLang=en'>&quot;Technical Controls that Protect Data When in Use and Prevent Misuse&quot;</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Gary on <a href='https://www.linkedin.com/in/garylafever/'>LinkedIn</a></li><li>Follow Gary on <a href='https://twitter.com/GaryLaFever'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12179640-s2e6-privacy-left-trust-with-gary-lafever-anonos.mp3" length="42208704" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/384440</link>
    <itunes:image href="https://storage.buzzsprout.com/4q0me520n8nvneqgr599583hr27i?.jpg" />
    <itunes:author>Debra J Farber / Gary LaFever</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12179640</guid>
    <pubDate>Tue, 14 Feb 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12179640/transcript" type="text/html" />
    <podcast:soundbite startTime="3211.328" duration="56.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12179640/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E6: &#39;Privacy Left Trust&#39; with Gary LaFever (Anonos)" />
  <psc:chapter start="1:15" title="Introducing Gary LaFever" />
  <psc:chapter start="3:30" title="Gary opines on what &quot;Shifting Privacy Left&quot; means to him" />
  <psc:chapter start="4:09" title="Why Gary believes the Schrems II decision and the increasing prevalence of data breaches and ransomware attacks have shifted privacy left from optional to mandatory" />
  <psc:chapter start="8:12" title="Why trust must be updated to be technologically enforced" />
  <psc:chapter start="10:16" title="Introducing the seven universal data use cases" />
  <psc:chapter start="11:14" title="1) Using Data for Application Development &amp; Testing" />
  <psc:chapter start="13:11" title="2) Internal Data Sharing across Business Boundaries" />
  <psc:chapter start="15:23" title="3) Using Data for Analytics, ML &amp; AI Model Building" />
  <psc:chapter start="17:02" title="4) Using Data to Generate Inferences &amp; Predictions (using AI &amp; ML models in production)" />
  <psc:chapter start="19:42" title="5) Sharing Data with a 3rd Party Service Provider" />
  <psc:chapter start="24:49" title="6) Sharing Data with 3rd Parties for Monetization" />
  <psc:chapter start="29:20" title="7) Using Data for Enrichment (inbound, outbound, or bi-directional)" />
  <psc:chapter start="32:05" title="Gary&#39;s assessment as to how the market is adopting technology for the 7 use cases" />
  <psc:chapter start="35:28" title="Debra &amp; Gary share views on what it means to &quot;de-risk&quot; data" />
  <psc:chapter start="40:12" title="Data Use Obstacle 1: Data must be protected when in use / Gary unpacks what he means by &quot;statutory pseudonymization&quot;" />
  <psc:chapter start="46:42" title="Data Use Obstacle 2: Data is too sparse or biased" />
  <psc:chapter start="51:27" title="Data Use Obstacle 3: Satisfying regulatory requirements for lawful international transfer and surveillance-proof processing" />
  <psc:chapter start="53:05" title="Gary discusses the use of &quot;variant twins&quot;" />
  <psc:chapter start="54:56" title="Gary discusses how technologically enforced privacy can make it easier to achieve &quot;proportionality&quot; goals under GDPR / Schrems II" />
</psc:chapters>
    <itunes:duration>3513</itunes:duration>
    <itunes:keywords>privacy left trust, universal data use cases, Schrems II, Anonos, Gary LaFever, PETs, privacy enhancing technologies, anonymization, pseudonymization, pseudonymisation, proportionality, GDPR, privacy, statutory pseudonymization, data sharing, analytics</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>6</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E5 - What&#39;s New in Privacy-by-Design with R. Jason Cronk (IOPD)</itunes:title>
    <title>S2E5 - What&#39;s New in Privacy-by-Design with R. Jason Cronk (IOPD)</title>
    <itunes:summary><![CDATA[R. Jason Cronk is the Founder of the Institute of Operational Privacy Design (IOPD) and CEO of Enterprivacy Consulting Group, as well as the author of Strategic Privacy by Design. I recently caught up with Jason at the annual Privacy Law Salon event and had a conversation about the socio-technical challenges of privacy, different privacy-by-design frameworks that he’s worked on, and his thoughts on some hot topics in the web privacy space.  --------- Thank you to our sponsor, Privado, the dev...]]></itunes:summary>
    <description><![CDATA[<p>R. Jason Cronk is the Founder of the <a href='https://instituteofprivacydesign.org/'>Institute of Operational Privacy Design</a> (IOPD) and CEO of <a href='https://enterprivacy.com/'>Enterprivacy Consulting Group</a>, as well as the author of <a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design</a>. I recently caught up with Jason at the annual <a href='https://www.privacylawsalon.com/'>Privacy Law Salon </a>event and had a conversation about the socio-technical challenges of privacy, different privacy-by-design frameworks that he’s worked on, and his thoughts on some hot topics in the web privacy space.<br/><br/>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p>We start off discussing updates to <a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design</a>, now in its 2nd edition. We chat about the brand new ISO 31700 Privacy by Design for Consumer Goods and Services standard and consensus process and compare it to the NIST Privacy Framework, IEEE 7002 Standard for Data Privacy, and Jason&apos;s work with the Institute of Operational Privacy Design (IOPD) and its newly-published Design Process Standard v1. </p><p><br/>Jason and I also explore risk tolerance through the lens of privacy using <a href='https://www.fairinstitute.org/fair-risk-management'>FAIR</a>. There’s a lot of room for subjective interpretation, particularly of non-monetary harm, and Jason provides many thought-provoking examples of how this plays out in our society. 
We round out our conversation by talking about the challenges of Global Privacy Control (GPC) and what deceptive design strategies to look out for.</p><p><br/><b>Topics Covered</b>:</p><ul><li>Why we should think of privacy beyond &quot;digital privacy&quot;</li><li>What readers can expect from Jason’s book, <a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design</a>, and what’s included in the 2nd edition</li><li>IOPD’s B2B third-party privacy audit</li><li>Why you should leverage the FAIR quantitative risk analysis model to define and address effective privacy risk management programs</li><li>The NIST Privacy Framework and developments of its Privacy Workforce Working Group</li><li>Dark patterns &amp; why just asking the wrong question can be a privacy harm (interrogation)</li><li>How there are 15 privacy harms &amp; only 1 of them is about security</li></ul><p><b>Resources Mentioned</b>:</p><ul><li><a href='https://www.iso.org/obp/ui/#iso:std:iso:31700:-1:ed-1:v1:en'>Learn about the ISO 31700 Privacy by Design Standard</a></li><li><a href='https://instituteofprivacydesign.org/2023/01/12/introducing-the-design-process-standard-v-1-0/'>Review the IOPD Design Process Standard v1</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Jason on <a href='https://www.linkedin.com/in/rjc06c/'>LinkedIn</a></li><li>Follow Enterprivacy Consulting Group on <a href='https://twitter.com/enterprivacy'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>R. Jason Cronk is the Founder of the <a href='https://instituteofprivacydesign.org/'>Institute of Operational Privacy Design</a> (IOPD) and CEO of <a href='https://enterprivacy.com/'>Enterprivacy Consulting Group</a>, as well as the author of <a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design</a>. I recently caught up with Jason at the annual <a href='https://www.privacylawsalon.com/'>Privacy Law Salon </a>event and had a conversation about the socio-technical challenges of privacy, different privacy-by-design frameworks that he’s worked on, and his thoughts on some hot topics in the web privacy space.<br/><br/>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p>We start off discussing updates to <a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design</a>, now in its 2nd edition. We chat about the brand new ISO 31700 Privacy by Design for Consumer Goods and Services standard and consensus process and compare it to the NIST Privacy Framework, IEEE 7002 Standard for Data Privacy, and Jason&apos;s work with the Institute of Operational Privacy Design (IOPD) and its newly-published Design Process Standard v1. </p><p><br/>Jason and I also explore risk tolerance through the lens of privacy using <a href='https://www.fairinstitute.org/fair-risk-management'>FAIR</a>. There’s a lot of room for subjective interpretation, particularly of non-monetary harm, and Jason provides many thought-provoking examples of how this plays out in our society. 
We round out our conversation by talking about the challenges of Global Privacy Control (GPC) and what deceptive design strategies to look out for.</p><p><br/><b>Topics Covered</b>:</p><ul><li>Why we should think of privacy beyond &quot;digital privacy&quot;</li><li>What readers can expect from Jason’s book, <a href='https://iapp.org/resources/article/strategic-privacy-by-design/'>Strategic Privacy by Design</a>, and what’s included in the 2nd edition</li><li>IOPD’s B2B third-party privacy audit</li><li>Why you should leverage the FAIR quantitative risk analysis model to define and address effective privacy risk management programs</li><li>The NIST Privacy Framework and developments of its Privacy Workforce Working Group</li><li>Dark patterns &amp; why just asking the wrong question can be a privacy harm (interrogation)</li><li>How there are 15 privacy harms &amp; only 1 of them is about security</li></ul><p><b>Resources Mentioned</b>:</p><ul><li><a href='https://www.iso.org/obp/ui/#iso:std:iso:31700:-1:ed-1:v1:en'>Learn about the ISO 31700 Privacy by Design Standard</a></li><li><a href='https://instituteofprivacydesign.org/2023/01/12/introducing-the-design-process-standard-v-1-0/'>Review the IOPD Design Process Standard v1</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Jason on <a href='https://www.linkedin.com/in/rjc06c/'>LinkedIn</a></li><li>Follow Enterprivacy Consulting Group on <a href='https://twitter.com/enterprivacy'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12177884-s2e5-what-s-new-in-privacy-by-design-with-r-jason-cronk-iopd.mp3" length="42193950" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/383160</link>
    <itunes:image href="https://storage.buzzsprout.com/a8r26qikxo313ud87mwyv5x3w8v9?.jpg" />
    <itunes:author>Debra J. Farber / R. Jason Cronk</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12177884</guid>
    <pubDate>Tue, 07 Feb 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12177884/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12177884/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E5 - What&#39;s New in Privacy-by-Design with R. Jason Cronk (IOPD)" />
  <psc:chapter start="2:37" title="What are the &quot;socio-technical challenges of privacy&quot; and how can we overcome them?" />
  <psc:chapter start="3:34" title="Why we should think of privacy beyond &quot;digital privacy.&quot;" />
  <psc:chapter start="5:42" title="Jason discusses his book, Strategic Privacy by Design, and what&#39;s included in the updated 2nd edition" />
  <psc:chapter start="12:49" title="ISO 31700 Privacy by Design Standard for Consumer Goods and Services" />
  <psc:chapter start="21:13" title="Jason describes the Institute of Operational Privacy Design (IOPD) and its newly published Design Process Standard v1" />
  <psc:chapter start="23:40" title="IEEE 7002 Standard for Data Privacy Process" />
  <psc:chapter start="30:22" title="Leveraging the FAIR quantitative risk analysis model to define the necessary building blocks for implementing effective privacy risk management programs" />
  <psc:chapter start="37:57" title="Jason discusses The NIST Privacy Framework and The NIST Privacy Workforce Working Group developments" />
  <psc:chapter start="42:05" title="Hot topics in web privacy: do not sell; global privacy control (GPC)" />
  <psc:chapter start="47:48" title="Discussing dark patterns and why just asking the wrong question can be a privacy harm (interrogation)" />
  <psc:chapter start="54:12" title="Discussing data minimization and &quot;data devaluation&quot;" />
  <psc:chapter start="56:07" title="How there are 15 privacy harms, and only 1 of them is about security" />
</psc:chapters>
    <itunes:duration>3512</itunes:duration>
    <itunes:keywords>privacy by design, PbD, consent, IOPD, Institute of Privacy Design, privacy design, ISO 31700</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>5</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E4: Training the Next Wave of Privacy Engineers with Nishant Bhajaria (Uber)</itunes:title>
    <title>S2E4: Training the Next Wave of Privacy Engineers with Nishant Bhajaria (Uber)</title>
    <itunes:summary><![CDATA[Nishant Bhajaria is the Director of Privacy Engineering, Architecture, &amp; Analytics at Uber and Author of "Data Privacy: A Runbook for Engineers." He’s also an Advisor to Data Protocol, Privado &amp; Piiano. In our conversation, we discuss privacy engineering trends, educational materials that Nishant has developed, and his advice to privacy technologists, engineers, and hiring managers.  --------- Thank you to our sponsor, Privado, the developer-friendly privacy platform ---------  N...]]></itunes:summary>
    <description><![CDATA[<p>Nishant Bhajaria is the Director of Privacy Engineering, Architecture, &amp; Analytics at <a href='http://www.uber.com'>Uber</a> and Author of &quot;<a href='https://www.manning.com/books/data-privacy'>Data Privacy: A Runbook for Engineers</a>.&quot; He’s also an Advisor to <a href='https://dataprotocol.com/advisory-board/'>Data Protocol</a>, <a href='https://www.privado.ai/'>Privado</a> &amp; <a href='https://piiano.com/'>Piiano</a>. In our conversation, we discuss privacy engineering trends, educational materials that Nishant has developed, and his advice to privacy technologists, engineers, and hiring managers. </p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/>Nishant is a great example of a cross-functional, influential agent who has adapted to the ever-growing privacy discipline. He describes himself as an engineer for the attorneys and an attorney for the engineers, which has helped him secure positions at <a href='https://www.webmd.com/'>WebMD</a>, <a href='https://www.nike.com/'>Nike</a>, <a href='https://www.netflix.com/'>Netflix</a>, and now <a href='http://www.uber.com'>Uber</a>. <br/><br/></p><p>Nishant shares his advice for career development, both through the lens of how to break into the privacy space and also how to grow within your role. He explains how he’s been able to get board-level understanding about the importance of privacy as a product, not an afterthought. 
He also highlights takeaways from his book and online courses.</p><p><br/><b>Topics Covered</b>:</p><ul><li>How privacy engineers can secure their jobs during widespread tech industry layoffs </li><li>Privacy tech as the glue between different teams and in-house services</li><li>How to make privacy more visible to the business as something that benefits the bottom line </li><li>Common mistakes that Nishant sees engineers make when it comes to privacy </li><li>What’s covered in Nishant’s ‘Privacy by Design’ courses </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Buy <a href='https://www.manning.com/books/data-privacy'>Data Privacy: A Runbook for Engineers </a></li><li>Check out the <a href='https://dataprotocol.com/certifications/privacy-engineering-certification'>Privacy Engineering Certification Course</a> </li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Nishant on <a href='https://www.linkedin.com/in/nishantjb/'>LinkedIn</a> </li></ul><p><b>Follow the SPL Show</b>:</p><ul><li>Follow us on <a href='https://twitter.com/ShiftPrivacyPod'>Twitter</a> </li><li>Follow us on <a href='https://www.linkedin.com/showcase/shifting-privacy-left-podcast/?'>LinkedIn</a></li><li>Check out our <a href='https://shiftingprivacyleft.com/'>website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Nishant Bhajaria is the Director of Privacy Engineering, Architecture, &amp; Analytics at <a href='http://www.uber.com'>Uber</a> and Author of &quot;<a href='https://www.manning.com/books/data-privacy'>Data Privacy: A Runbook for Engineers</a>.&quot; He’s also an Advisor to <a href='https://dataprotocol.com/advisory-board/'>Data Protocol</a>, <a href='https://www.privado.ai/'>Privado</a> &amp; <a href='https://piiano.com/'>Piiano</a>. In our conversation, we discuss privacy engineering trends, educational materials that Nishant has developed, and his advice to privacy technologists, engineers, and hiring managers. </p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/>Nishant is a great example of a cross-functional, influential agent who has adapted to the ever-growing privacy discipline. He describes himself as an engineer for the attorneys and an attorney for the engineers, which has helped him secure positions at <a href='https://www.webmd.com/'>WebMD</a>, <a href='https://www.nike.com/'>Nike</a>, <a href='https://www.netflix.com/'>Netflix</a>, and now <a href='http://www.uber.com'>Uber</a>. <br/><br/></p><p>Nishant shares his advice for career development, both through the lens of how to break into the privacy space and also how to grow within your role. He explains how he’s been able to get board-level understanding about the importance of privacy as a product, not an afterthought. 
He also highlights takeaways from his book and online courses.</p><p><br/><b>Topics Covered</b>:</p><ul><li>How privacy engineers can secure their jobs during widespread tech industry layoffs </li><li>Privacy tech as the glue between different teams and in-house services</li><li>How to make privacy more visible to the business as something that benefits the bottom line </li><li>Common mistakes that Nishant sees engineers make when it comes to privacy </li><li>What’s covered in Nishant’s ‘Privacy by Design’ courses </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Buy <a href='https://www.manning.com/books/data-privacy'>Data Privacy: A Runbook for Engineers </a></li><li>Check out the <a href='https://dataprotocol.com/certifications/privacy-engineering-certification'>Privacy Engineering Certification Course</a> </li></ul><p><b>Guest Info</b>:</p><ul><li>Follow Nishant on <a href='https://www.linkedin.com/in/nishantjb/'>LinkedIn</a> </li></ul><p><b>Follow the SPL Show</b>:</p><ul><li>Follow us on <a href='https://twitter.com/ShiftPrivacyPod'>Twitter</a> </li><li>Follow us on <a href='https://www.linkedin.com/showcase/shifting-privacy-left-podcast/?'>LinkedIn</a></li><li>Check out our <a href='https://shiftingprivacyleft.com/'>website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12147778-s2e4-training-the-next-wave-of-privacy-engineers-with-nishant-bhajaria-uber.mp3" length="30734127" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/376909</link>
    <itunes:image href="https://storage.buzzsprout.com/zaxy2jqtoff2mo2rvbs77hqtwbns?.jpg" />
    <itunes:author>Debra J. Farber / Nishant Bhajaria</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12147778</guid>
    <pubDate>Tue, 31 Jan 2023 02:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12147778/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12147778/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E4: Training the Next Wave of Privacy Engineers with Nishant Bhajaria (Uber)" />
  <psc:chapter start="2:02" title="Why Nishant became interested in privacy engineering" />
  <psc:chapter start="7:33" title="Nishant&#39;s definition of a &quot;privacy engineer&quot; &amp; how he thinks the role will evolve" />
  <psc:chapter start="14:01" title="The trends Nishant sees in the market regarding the hiring of privacy engineers" />
  <psc:chapter start="16:49" title="Advice to hiring managers who are having a difficult time hiring for privacy engineering roles" />
  <psc:chapter start="19:32" title="Career development advice for privacy engineers, including Nishant&#39;s LinkedIn Learning courses, Privacy Engineering Certification with Data Protocol &amp; his book, Data Privacy: A Runbook for Engineers" />
  <psc:chapter start="27:02" title="Nishant describes the privacy tech tools on the market that he finds useful" />
  <psc:chapter start="31:06" title="Nishant&#39;s advice on how to shift privacy left" />
  <psc:chapter start="33:31" title="Venture capital&#39;s role in understanding privacy tech and why it&#39;s essential to build community amongst privacy engineers" />
  <psc:chapter start="36:55" title="How to appeal to privacy engineers when selling privacy tech solutions" />
  <psc:chapter start="39:55" title="Nishant discusses the updates to his courses and book" />
</psc:chapters>
    <itunes:duration>2557</itunes:duration>
    <itunes:keywords>privacy engineering, privacy engineer, privacy engineering certification, LinkedIn Learning, tech, hiring, Data Protocol, Privacy Engineering Certification, Privado</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>4</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E3: Fixing Consent &amp; Transparency on the Web with Mark Lizar (Digital Transparency Lab)</itunes:title>
    <title>S2E3: Fixing Consent &amp; Transparency on the Web with Mark Lizar (Digital Transparency Lab)</title>
    <itunes:summary><![CDATA[To kick off Data Privacy Week 2023, I’m joined by Mark Lizar, CEO of the Digital Transparency Lab and Founder of 0PN: Open Privacy Network.  Mark is also the Vice Chair of the IEEE Cybersecurity for Next-Generation Connectivity Systems' Human Control &amp; Flow Sub-Committee and Editor &amp; Lead Author of the ANCR Notice Record Specification and Framework at the Kantara Initiative.    In our conversation, we unpack the current standards and specifications for transparency and data ...]]></itunes:summary>
    <description><![CDATA[<p>To kick off Data Privacy Week 2023, I’m joined by Mark Lizar, CEO of the <a href='https://transparencylab.ca/'>Digital Transparency Lab</a> and Founder of <a href='https://www.0pn.org/'>0PN: Open Privacy Network</a>. </p><p>Mark is also the Vice Chair of the IEEE Cybersecurity for Next-Generation Connectivity Systems&apos; Human Control &amp; Flow Sub-Committee and Editor &amp; Lead Author of the ANCR Notice Record Specification and Framework at the Kantara Initiative. <br/><br/></p><p>In our conversation, we unpack the current standards and specifications for transparency and data control in the digital space. Mark shares some of the innovative solutions he and his colleagues are working on to bridge the gap in web consent. </p><p><br/></p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/></p><p>Mark unpacks his interpretation of the open transparency standards, laws, and tech required for privacy to scale digitally. One of the major use cases he’s working on at 0PN is called ‘Do Track,’ which is a response to the shortcomings of the current ‘Do Not Track’ mechanism that we have in place today. The Controller Credential Standard allows users to specify or direct consent, and he shares some exciting examples of how users can use ‘Do Track’ to take back control over their own data. <br/><br/></p><p>Mark breaks down the four levels of privacy assurance achieved by the Controller Credential Framework and explains what’s needed to gain market traction for this privacy-enabling tech standard. 
He also gives us a peek into what else they’re working on over at the Digital Transparency Lab and how to get involved with the organization and their efforts.<br/><br/></p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>A simple way to understand online consents vs. system permissions </li><li>Why it’s important to see who&apos;s controlling our data </li><li>How the new Controller Credential gives people autonomy over their own data</li><li>International privacy instruments that can be scaled for local use </li><li>A new digital model for representing physical privacy </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Learn more about <a href='https://transparencylab.ca/'>Digital Transparency Lab</a> </li><li>RSVP to the 1/27/23 <a href='https://transparencylab.ca/store/p/jan-27-digital-privacy-transparency-launch-event'>Digital Privacy Transparency Launch</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Mark on <a href='https://www.linkedin.com/in/marklizar/'>LinkedIn</a> </li><li>Follow Mark on <a href='https://twitter.com/smartopian'>Twitter</a> </li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>To kick off Data Privacy Week 2023, I’m joined by Mark Lizar, CEO of the <a href='https://transparencylab.ca/'>Digital Transparency Lab</a> and Founder of <a href='https://www.0pn.org/'>0PN: Open Privacy Network</a>. </p><p>Mark is also the Vice Chair of the IEEE Cybersecurity for Next-Generation Connectivity Systems&apos; Human Control &amp; Flow Sub-Committee and Editor &amp; Lead Author of the ANCR Notice Record Specification and Framework at the Kantara Initiative. <br/><br/></p><p>In our conversation, we unpack the current standards and specifications for transparency and data control in the digital space. Mark shares some of the innovative solutions he and his colleagues are working on to bridge the gap in web consent. </p><p><br/></p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/></p><p>Mark unpacks his interpretation of the open transparency standards, laws, and tech required for privacy to scale digitally. One of the major use cases he’s working on at 0PN is called ‘Do Track,’ which is a response to the shortcomings of the current ‘Do Not Track’ mechanism that we have in place today. The Controller Credential Standard allows users to specify or direct consent, and he shares some exciting examples of how users can use ‘Do Track’ to take back control over their own data. <br/><br/></p><p>Mark breaks down the four levels of privacy assurance achieved by the Controller Credential Framework and explains what’s needed to gain market traction for this privacy-enabling tech standard. 
He also gives us a peek into what else they’re working on over at the Digital Transparency Lab and how to get involved with the organization and their efforts.<br/><br/></p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------<br/><br/></p><p><b>Topics Covered</b>:</p><ul><li>A simple way to understand online consents vs. system permissions </li><li>Why it’s important to see who&apos;s controlling our data </li><li>How the new Controller Credential gives people autonomy over their own data</li><li>International privacy instruments that can be scaled for local use </li><li>A new digital model for representing physical privacy </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Learn more about <a href='https://transparencylab.ca/'>Digital Transparency Lab</a> </li><li>RSVP to the 1/27/23 <a href='https://transparencylab.ca/store/p/jan-27-digital-privacy-transparency-launch-event'>Digital Privacy Transparency Launch</a></li></ul><p><b>Guest Info</b>:</p><ul><li>Connect with Mark on <a href='https://www.linkedin.com/in/marklizar/'>LinkedIn</a> </li><li>Follow Mark on <a href='https://twitter.com/smartopian'>Twitter</a> </li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12096997-s2e3-fixing-consent-transparency-on-the-web-with-mark-lizar-digital-transparency-lab.mp3" length="36428615" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/371456</link>
    <itunes:image href="https://storage.buzzsprout.com/w45l3qwc7b90k3o0wquovn38cvrd?.jpg" />
    <itunes:author>Debra J. Farber / Mark Lizar</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12096997</guid>
    <pubDate>Tue, 24 Jan 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12096997/transcript" type="text/html" />
    <itunes:duration>3032</itunes:duration>
    <itunes:keywords>privacy, consent, standards, transparency, credential, permissions, security, controller, digital privacy, surveillance, web, trust, control, Consent Receipts, Digital Transparency Lab, Kantara Initiative, 0PN, Open Privacy, ZPN, Zero Public Network</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>3</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E2: &quot;Software Libraries, SBOMs &amp; Wicked Privacy, Oh My!&quot; with Michelle Dennedy (PrivacyCode)</itunes:title>
    <title>S2E2: &quot;Software Libraries, SBOMs &amp; Wicked Privacy, Oh My!&quot; with Michelle Dennedy (PrivacyCode)</title>
    <itunes:summary><![CDATA[Michelle Dennedy is Co-Founder &amp; CEO of PrivacyCode, Inc., Partner at Privatus Consulting, and the Co-Author of The Privacy Engineer's Manifesto. In our lively conversation, we discuss the digital cost of information, the privacy problems that her company solves for, and how the Privatus Wicked Privacy™ framework differs from other approaches. --------- Thank you to our sponsor, Privado, the developer-friendly privacy platform ---------  As Michelle puts it, we’re living in an ‘innovation...]]></itunes:summary>
    <description><![CDATA[<p>Michelle Dennedy is Co-Founder &amp; CEO of <a href='https://privacycode.ai/'>PrivacyCode, Inc</a>., Partner at <a href='https://privatus.online/'>Privatus Consulting</a>, and the Co-Author of <a href='https://www.amazon.com/Privacy-Engineers-Manifesto-Getting-Policy/dp/1430263555'>The Privacy Engineer&apos;s Manifesto</a>. In our lively conversation, we discuss the digital cost of information, the privacy problems that her company solves for, and how the Privatus Wicked Privacy™ framework differs from other approaches.</p><p><b>---------<br/>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/>As Michelle puts it, we’re living in an ‘innovation palooza’ right now. But there’s still progress to be made. Michelle highlights how we can change the investment proposition to get more VCs and investors to see privacy as a strategic business enabler. At PrivacyCode, they’re focused on creating a simple way to communicate the language of ‘people data’ across specialties.<br/><br/></p><p>Part of the solution includes having a software bill of materials (SBOM), which is essentially a list of ingredients that make up software components. Michelle shares a tangible example of how an SBOM creates flow, compliance, and transparency in new areas of tech. 
She also touches on her consulting work, including her simple strategy for determining privacy benefit metrics.</p><p><br/><b>Topics Covered:</b></p><ul><li>Privacy as a strategic enabler</li><li>Why Michelle thinks &quot;today&apos;s VCs are more of a mood than an algorithm&quot;</li><li>How PrivacyCode allows users to orchestrate requirements across various departments and lets specialists operate in their &quot;zone of genius&quot;</li><li>What a Software Bill of Materials (SBOM) is &amp; why we need one to ensure privacy</li><li>Michelle&apos;s advice to privacy engineers on how to leverage an SBOM for quality code</li><li>Michelle&apos;s work at Privatus Consulting and their Wicked Privacy Framework</li><li>Examples of creative, straightforward privacy metrics</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn more about <a href='https://privacycode.ai/'>PrivacyCode &amp; schedule a demo</a></li><li>Learn more about <a href='https://privatus.online/'>Privatus Consulting</a></li><li><a href='https://www.amazon.com/Trillions-Thriving-Emerging-Information-Ecology/dp/1118176073'>Trillions: Thriving in the Emerging Information Ecology</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Michelle on <a href='https://www.linkedin.com/in/michelledennedy/'>LinkedIn</a></li><li>Follow Michelle on <a href='https://twitter.com/mdennedy'>Twitter</a></li><li>Read <a href='https://www.amazon.com/Privacy-Engineers-Manifesto-Getting-Policy/dp/1430263555'>The Privacy Engineer&apos;s Manifesto: Getting from Policy to Code to QA to Value</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Michelle Dennedy is Co-Founder &amp; CEO of <a href='https://privacycode.ai/'>PrivacyCode, Inc</a>., Partner at <a href='https://privatus.online/'>Privatus Consulting</a>, and the Co-Author of <a href='https://www.amazon.com/Privacy-Engineers-Manifesto-Getting-Policy/dp/1430263555'>The Privacy Engineer&apos;s Manifesto</a>. In our lively conversation, we discuss the digital cost of information, the privacy problems that her company solves for, and how the Privatus Wicked Privacy™ framework differs from other approaches.</p><p><b>---------<br/>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/>As Michelle puts it, we’re living in an ‘innovation palooza’ right now. But there’s still progress to be made. Michelle highlights how we can change the investment proposition to get more VCs and investors to see privacy as a strategic business enabler. At PrivacyCode, they’re focused on creating a simple way to communicate the language of ‘people data’ across specialties.<br/><br/></p><p>Part of the solution includes having a software bill of materials (SBOM), which is essentially a list of ingredients that make up software components. Michelle shares a tangible example of how an SBOM creates flow, compliance, and transparency in new areas of tech. 
She also touches on her consulting work, including her simple strategy for determining privacy benefit metrics.</p><p><br/><b>Topics Covered:</b></p><ul><li>Privacy as a strategic enabler</li><li>Why Michelle thinks &quot;today&apos;s VCs are more of a mood than an algorithm&quot;</li><li>How PrivacyCode allows users to orchestrate requirements across various departments and lets specialists operate in their &quot;zone of genius&quot;</li><li>What a Software Bill of Materials (SBOM) is &amp; why we need one to ensure privacy</li><li>Michelle&apos;s advice to privacy engineers on how to leverage an SBOM for quality code</li><li>Michelle&apos;s work at Privatus Consulting and their Wicked Privacy Framework</li><li>Examples of creative, straightforward privacy metrics</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn more about <a href='https://privacycode.ai/'>PrivacyCode &amp; schedule a demo</a></li><li>Learn more about <a href='https://privatus.online/'>Privatus Consulting</a></li><li><a href='https://www.amazon.com/Trillions-Thriving-Emerging-Information-Ecology/dp/1118176073'>Trillions: Thriving in the Emerging Information Ecology</a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Michelle on <a href='https://www.linkedin.com/in/michelledennedy/'>LinkedIn</a></li><li>Follow Michelle on <a href='https://twitter.com/mdennedy'>Twitter</a></li><li>Read <a href='https://www.amazon.com/Privacy-Engineers-Manifesto-Getting-Policy/dp/1430263555'>The Privacy Engineer&apos;s Manifesto: Getting from Policy to Code to QA to Value</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/12005078-s2e2-software-libraries-sboms-wicked-privacy-oh-my-with-michelle-dennedy-privacycode.mp3" length="41485775" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/359647</link>
    <itunes:image href="https://storage.buzzsprout.com/uvnfu0sw9ujqvdqt92p3zyzh5d2u?.jpg" />
    <itunes:author>Debra J Farber / Michelle Dennedy</itunes:author>
    <guid isPermaLink="false">Buzzsprout-12005078</guid>
    <pubDate>Tue, 10 Jan 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/12005078/transcript" type="text/html" />
    <podcast:soundbite startTime="1196.0" duration="60.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/12005078/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E2: &quot;Software Libraries, SBOMs &amp; Wicked Privacy, Oh My!&quot; with Michelle Dennedy (PrivacyCode)" />
  <psc:chapter start="1:15" title="Introducing Michelle Dennedy" />
  <psc:chapter start="2:12" title="Privacy as a Strategic Enabler" />
  <psc:chapter start="9:00" title="Why Today&#39;s VCs are More of a Mood Than an Algorithm" />
  <psc:chapter start="16:14" title="How PrivacyCode Solves for Privacy Problems" />
  <psc:chapter start="28:52" title="Clubhouse: A Case Study in Scaling a Privacy Disaster" />
  <psc:chapter start="30:58" title="Why we Need a Software Bill of Materials (SBOM) for Ensuring Privacy" />
  <psc:chapter start="37:47" title="Michelle&#39;s Advice to Privacy Engineers on How to Leverage an SBOM for Quality Code" />
  <psc:chapter start="41:47" title="Michelle Discusses Her Work as a Partner at Privatus Consulting and their Wicked Privacy Framework" />
  <psc:chapter start="46:58" title="How to Approach Privacy Metrics" />
  <psc:chapter start="55:13" title="How to Contact Michelle" />
</psc:chapters>
    <itunes:duration>3453</itunes:duration>
    <itunes:keywords>privacy, metrics, Wicked Privacy, PrivacyCode, Privatus Consulting, Michelle Dennedy, software libraries, data value, privacy by design, SBOM, software bill of materials, engineers, developers, shift left, VC, privacy tech</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>2</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S2E1: Driving Privacy Left: Vehicular Privacy with Andrea Amico (Privacy4Cars)</itunes:title>
    <title>S2E1: Driving Privacy Left: Vehicular Privacy with Andrea Amico (Privacy4Cars)</title>
    <itunes:summary><![CDATA[Of the almost 300 million cars that are in circulation in the U.S., the vast majority collect consumers’ personal information. Every time you connect your phone via USB or Bluetooth, your car is designed to download data and store it locally. The automotive industry is grossly behind when it comes to data privacy and safety, but that’s where Privacy4Cars comes in.  Privacy4Cars is the first (and only) privacy tech company focused on identifying the challenges posed by vehicle data. They ...]]></itunes:summary>
    <description><![CDATA[<p>Of the almost 300 million cars that are in circulation in the U.S., the vast majority collect consumers’ personal information. Every time you connect your phone via USB or Bluetooth, your car is designed to download data and store it locally. The automotive industry is grossly behind when it comes to data privacy and safety, but that’s where Privacy4Cars comes in. </p><p>Privacy4Cars is the first (and only) privacy tech company focused on identifying the challenges posed by vehicle data. They create solutions to better protect consumers and businesses by offering improved privacy, safety, security, and compliance. </p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p>In our conversation, Andrea reveals how personal data flows through vehicular systems and networks. He highlights the type of data that can be easily found in cars, such as your frequently visited addresses, garage codes, text messages, emails, and so on. Andrea explains the different privacy concerns that have so far remained unaddressed across the industry and his theory on why these gaps exist. </p><p><br/>It might be unsettling to hear about the state of privacy in the automotive industry, but fortunately, the folks at Privacy4Cars are dedicated to creating new standards. Andrea shares what the industry reaction has been to Privacy4Cars’ initiatives and highlights some other organizations that are leading innovation on this issue. 
</p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.</b><br/>---------</p><p><b>Topics Covered:</b></p><ul><li>Andrea’s professional background and what inspired him to launch Privacy4Cars</li><li>Debunking common myths about data storage and security in cars </li><li>Where car data privacy falls under EU GDPR</li><li>How Privacy4Cars helps companies solve compliance issues</li><li>Feedback from third-party wholesalers, dealerships, and service providers </li><li>Advice for automotive software developers when architecting systems and networks in this space </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read STOP&apos;s paper, <a href='https://www.stopspying.org/wiretaps-on-wheels'>Wiretaps on Wheels </a></li><li>Read the <a href='https://www.wardsauto.com/vehicles/european-data-protection-guidelines-connected-vehicles'>European Data Protection Guidelines for Connected Vehicles</a></li><li>Learn about <a href='https://privacy4cars.com/'>Privacy4Cars</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Andrea on <a href='https://www.linkedin.com/in/andrea-amico-a44aa/'>LinkedIn</a></li><li>Follow <a href='https://twitter.com/Privacy4Cars'>@Privacy4Cars on Twitter </a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Of the almost 300 million cars that are in circulation in the U.S., the vast majority collect consumers’ personal information. Every time you connect your phone via USB or Bluetooth, your car is designed to download data and store it locally. The automotive industry is grossly behind when it comes to data privacy and safety, but that’s where Privacy4Cars comes in. </p><p>Privacy4Cars is the first (and only) privacy tech company focused on identifying the challenges posed by vehicle data. They create solutions to better protect consumers and businesses by offering improved privacy, safety, security, and compliance. </p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p>In our conversation, Andrea reveals how personal data flows through vehicular systems and networks. He highlights the type of data that can be easily found in cars, such as your frequently visited addresses, garage codes, text messages, emails, and so on. Andrea explains the different privacy concerns that have so far remained unaddressed across the industry and his theory on why these gaps exist. </p><p><br/>It might be unsettling to hear about the state of privacy in the automotive industry, but fortunately, the folks at Privacy4Cars are dedicated to creating new standards. Andrea shares what the industry reaction has been to Privacy4Cars’ initiatives and highlights some other organizations that are leading innovation on this issue. 
</p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.</b><br/>---------</p><p><b>Topics Covered:</b></p><ul><li>Andrea’s professional background and what inspired him to launch Privacy4Cars</li><li>Debunking common myths about data storage and security in cars </li><li>Where car data privacy falls under EU GDPR</li><li>How Privacy4Cars helps companies solve compliance issues</li><li>Feedback from third-party wholesalers, dealerships, and service providers </li><li>Advice for automotive software developers when architecting systems and networks in this space </li></ul><p><b>Resources Mentioned</b>:</p><ul><li>Read STOP&apos;s paper, <a href='https://www.stopspying.org/wiretaps-on-wheels'>Wiretaps on Wheels </a></li><li>Read the <a href='https://www.wardsauto.com/vehicles/european-data-protection-guidelines-connected-vehicles'>European Data Protection Guidelines for Connected Vehicles</a></li><li>Learn about <a href='https://privacy4cars.com/'>Privacy4Cars</a></li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Andrea on <a href='https://www.linkedin.com/in/andrea-amico-a44aa/'>LinkedIn</a></li><li>Follow <a href='https://twitter.com/Privacy4Cars'>@Privacy4Cars on Twitter </a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11949739-s2e1-driving-privacy-left-vehicular-privacy-with-andrea-amico-privacy4cars.mp3" length="41769110" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/353742</link>
    <itunes:image href="https://storage.buzzsprout.com/z8hmn1m4nh8oigbhzcik7c5fj4bw?.jpg" />
    <itunes:author>Debra J Farber / Andrea Amico</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11949739</guid>
    <pubDate>Tue, 03 Jan 2023 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11949739/transcript" type="text/html" />
    <podcast:soundbite startTime="141.462" duration="50.5" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/11949739/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S2E1: Driving Privacy Left: Vehicular Privacy with Andrea Amico (Privacy4Cars)" />
  <psc:chapter start="1:15" title="Introducing Andrea Amico: CEO, Privacy4Cars" />
  <psc:chapter start="2:51" title="What Personal Data is Collected by and Stored in Vehicle Systems?" />
  <psc:chapter start="8:46" title="Andrea&#39;s Theory: Why Privacy has been Neglected by Auto Manufacturers and Dealers" />
  <psc:chapter start="11:51" title="Andrea Reveals His &quot;Origin Story&quot;: From Engineering Consultant to Founder of Automotive Privacy Tech Company, Privacy4Cars" />
  <psc:chapter start="16:29" title="What or Who is Listening To You in Your Vehicle?" />
  <psc:chapter start="19:36" title="Despite GDPR, European Auto Manufacturers &amp; Fleet Managers Lag Behind When it Comes to Privacy &amp; Data Protection" />
  <psc:chapter start="20:09" title="Privacy Harms that the Automotive Industry Must Prevent" />
  <psc:chapter start="27:16" title="Andrea Discusses Privacy4Cars and the Privacy Problems it Solves" />
  <psc:chapter start="31:22" title="The Need for Automotive Privacy Standards" />
  <psc:chapter start="38:12" title="Andrea Believes That Automotive Privacy Will Evolve as a Component of Vehicle Safety as Well as Customer Service" />
  <psc:chapter start="47:20" title="Andrea&#39;s Advice to Developers in the Automotive Space When Designing for Privacy" />
  <psc:chapter start="50:47" title="Debra &amp; Andrea Talk about the FTC and its Guidance Regarding Vehicular Privacy" />
  <psc:chapter start="51:48" title="Andrea highlights Industry Orgs &amp; Non-profits Focused on Vehicular Privacy &amp; Civil Liberties" />
</psc:chapters>
    <itunes:duration>3476</itunes:duration>
    <itunes:keywords>privacy, data protection, cars, vehicles, automotive, dealerships, fleets, surveillance, data retention, data deletion, Privacy4Cars, Andrea Amico, ethics, hacking</itunes:keywords>
    <itunes:season>2</itunes:season>
    <itunes:episode>1</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E9: Funding Web3 Privacy &amp; Recent Web3 Trust Fails with Jim Nasr</itunes:title>
    <title>S1E9: Funding Web3 Privacy &amp; Recent Web3 Trust Fails with Jim Nasr</title>
    <itunes:summary><![CDATA[This week, I continue my conversation with Jim Nasr, CEO of Acoer, about privacy and using distributed ledger technology (DLT). We discuss his work leading The HBAR Foundation's Privacy Market Development Fund and the trends he sees across grant applicants. We also chat about the collapse of FTX and the ripple effect it’s had on the crypto space.  --------- Thank you to our sponsor, Privado, the developer-friendly privacy platform ---------   Jim tells us about the types of innovations Th...]]></itunes:summary>
    <description><![CDATA[<p>This week, I continue my conversation with <a href='https://www.linkedin.com/in/jnasr/'>Jim Nasr</a>, CEO of <a href='https://www.acoer.com/'>Acoer</a>, about privacy and using distributed ledger technology (DLT). We discuss his work leading The HBAR Foundation&apos;s <a href='https://www.hbarfoundation.org/blog-post/introducing-the-hbar-foundation-privacy-market-development-fund'>Privacy Market Development Fund </a>and the trends he sees across grant applicants. We also chat about the collapse of FTX and the ripple effect it’s had on the crypto space. </p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/></p><p>Jim tells us about the types of innovations <a href='https://www.hbarfoundation.org/'>The HBAR Foundation</a> seeks to fund; why privacy &amp; security usability is an imperative; use cases for decentralized identifiers (DIDs) and new &quot;DID methods&quot; like PKH. We also discuss FTX&apos;s collapse and how to provide real transparency and data regulation in DLT technology. 
</p><p><br/></p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------</p><p><br/></p><p><b>Topics Covered:</b></p><ul><li>The HBAR Foundation’s search for projects to fund that enhance privacy usability</li><li>Exciting privacy use cases Jim has seen using Hedera&apos;s DLT, including those that enable high-value, privacy-preserving transactions </li><li>What went wrong with FTX and what we can learn from its collapse </li><li>How decentralized identity can enable the next iteration of web privacy</li><li>The tech behind <a href='https://metamask.io/snaps/'>MetaMask&apos;s Snap software</a> that allows anyone to safely extend capabilities of their wallet </li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn about <a href='https://www.acoer.com/'>Acoer</a></li><li>Learn about <a href='https://www.hbarfoundation.org/'>The HBAR Foundation</a></li><li>Read about <a href='https://www.hbarfoundation.org/blog-post/introducing-the-hbar-foundation-privacy-market-development-fund'>The HBAR Foundation&apos;s Privacy Market Development Fund </a></li></ul><p><b>Jim Nasr’s Info:</b></p><ul><li>Follow Jim on <a href='https://www.linkedin.com/in/jnasr/'>LinkedIn</a></li><li>Follow Jim on <a href='https://twitter.com/jnasr'>Twitter</a></li></ul><p><b>Follow the SPL Show:</b></p><ul><li>Follow us on <a href='https://twitter.com/ShiftPrivacyPod'>Twitter</a> </li><li>Follow us on <a href='https://www.linkedin.com/showcase/shifting-privacy-left-podcast/?'>LinkedIn</a></li><li>Check out our <a href='https://shiftingprivacyleft.com/'>website</a></li></ul><p><a target="_blank" 
href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I continue my conversation with <a href='https://www.linkedin.com/in/jnasr/'>Jim Nasr</a>, CEO of <a href='https://www.acoer.com/'>Acoer</a>, about privacy and using distributed ledger technology (DLT). We discuss his work leading The HBAR Foundation&apos;s <a href='https://www.hbarfoundation.org/blog-post/introducing-the-hbar-foundation-privacy-market-development-fund'>Privacy Market Development Fund </a>and the trends he sees across grant applicants. We also chat about the collapse of FTX and the ripple effect it’s had on the crypto space. </p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------</p><p><br/></p><p>Jim tells us about the types of innovations <a href='https://www.hbarfoundation.org/'>The HBAR Foundation</a> seeks to fund; why privacy &amp; security usability is an imperative; use cases for decentralized identifiers (DIDs) and new &quot;DID methods&quot; like PKH. We also discuss FTX&apos;s collapse and how to provide real transparency and data regulation in DLT technology. 
</p><p><br/></p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------</p><p><br/></p><p><b>Topics Covered:</b></p><ul><li>The HBAR Foundation’s search for projects to fund that enhance privacy usability</li><li>Exciting privacy use cases Jim has seen using Hedera&apos;s DLT, including those that enable high-value, privacy-preserving transactions </li><li>What went wrong with FTX and what we can learn from its collapse </li><li>How decentralized identity can enable the next iteration of web privacy</li><li>The tech behind <a href='https://metamask.io/snaps/'>MetaMask&apos;s Snap software</a> that allows anyone to safely extend capabilities of their wallet </li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn about <a href='https://www.acoer.com/'>Acoer</a></li><li>Learn about <a href='https://www.hbarfoundation.org/'>The HBAR Foundation</a></li><li>Read about <a href='https://www.hbarfoundation.org/blog-post/introducing-the-hbar-foundation-privacy-market-development-fund'>The HBAR Foundation&apos;s Privacy Market Development Fund </a></li></ul><p><b>Jim Nasr’s Info:</b></p><ul><li>Follow Jim on <a href='https://www.linkedin.com/in/jnasr/'>LinkedIn</a></li><li>Follow Jim on <a href='https://twitter.com/jnasr'>Twitter</a></li></ul><p><b>Follow the SPL Show:</b></p><ul><li>Follow us on <a href='https://twitter.com/ShiftPrivacyPod'>Twitter</a> </li><li>Follow us on <a href='https://www.linkedin.com/showcase/shifting-privacy-left-podcast/?'>LinkedIn</a></li><li>Check out our <a href='https://shiftingprivacyleft.com/'>website</a></li></ul><p><a target="_blank" 
href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11880124-s1e9-funding-web3-privacy-recent-web3-trust-fails-with-jim-nasr.mp3" length="42320210" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/342716</link>
    <itunes:image href="https://storage.buzzsprout.com/la5pn971nhkczsmhlm0kak18lowa?.jpg" />
    <itunes:author>Debra J Farber / Jim Nasr</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11880124</guid>
    <pubDate>Tue, 20 Dec 2022 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11880124/transcript" type="text/html" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/11880124/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S1E9: Funding Web3 Privacy &amp; Recent Web3 Trust Fails with Jim Nasr" />
  <psc:chapter start="1:58" title="Jim Nasr Discusses The HBAR Foundation&#39;s Privacy Market Development Fund, Which He Oversees" />
  <psc:chapter start="7:23" title="Privacy Trend 1: Verification, Proof of Events, Data Stamping" />
  <psc:chapter start="8:42" title="Privacy Trend 2: Separation of dApp Business Function from DLT Element" />
  <psc:chapter start="10:53" title="Privacy Trend 3: Leveraging DLT while Enabling the Usability of dApps" />
  <psc:chapter start="12:06" title="Stand-Out DLT Deployments that Received Grants from The HBAR Foundation&#39;s Privacy Market Development Fund" />
  <psc:chapter start="18:01" title="Discussing: decentralized identity, verified credentials, DID specifications, SSI &amp; crypto wallets (including MetaMask&#39;s Snaps Extension), etc." />
  <psc:chapter start="27:26" title="Jim Seeks Grant Applications that Can Take a Current Application and Make Incremental Improvements Using DLT" />
  <psc:chapter start="31:57" title="Debra &amp; Jim Discuss the FTX Crypto Exchange Collapse, Centralized Exchanges, Key Recovery, and Other Web3 Usability Challenges" />
</psc:chapters>
    <itunes:duration>3523</itunes:duration>
    <itunes:keywords>Privacy, DLT, distributed ledgers, Acoer, hashgraph, The HBAR Foundation, Hedera, cryptocurrency, crypto, FTX, trust, usable privacy, centralized exchanges, adoption, transactions, web3, DIDs, verified credentials</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>9</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E8: Leveraging Distributed Ledgers for Privacy Assurance with Jim Nasr</itunes:title>
    <title>S1E8: Leveraging Distributed Ledgers for Privacy Assurance with Jim Nasr</title>
    <itunes:summary><![CDATA[Today, I am joined by Jim Nasr, CEO of Acoer. I had the pleasure of collaborating with Jim on several projects during my 6-month stint as Privacy Strategist for Hedera. Jim joins me today to discuss the use of distributed ledger tech (DLT) to provide computational trust for real-time applications. Jim and I speak about the development of secure, privacy-preserving, and traceable technologies, which can gain adoption via open protocols and usable interfaces. --------- Thank you to our sponsor,...]]></itunes:summary>
    <description><![CDATA[<p>Today, I am joined by <a href='https://www.linkedin.com/in/jnasr/'>Jim Nasr</a>, CEO of Acoer. I had the pleasure of collaborating with Jim on several projects during my 6-month stint as Privacy Strategist for <a href='https://hedera.com/'>Hedera</a>. Jim joins me today to discuss the use of <a href='https://en.wikipedia.org/wiki/Distributed_ledger'>distributed ledger tech (DLT)</a> to provide <a href='https://en.wikipedia.org/wiki/Computational_trust'>computational trust</a> for real-time applications. Jim and I speak about the development of secure, privacy-preserving, and traceable technologies, which can gain adoption via open protocols and usable interfaces.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>In part one of this two-episode conversation, Jim explains Acoer&apos;s approach to building DLT-enabled software and its initial application to healthcare and clinical trials. Jim shares his background and experience in tech both academically and professionally; as an entrepreneur in software development; his roles in large-scale tech companies and with the government at the CDC; and how he enjoyed “getting his hands dirty” in public health to bring automated trust and accountability to the space. At Acoer, Jim continues his previous work - to build open technologies - by leveraging DLT and also building interfaces with usable privacy and security. 
</p><p><br/>In this conversation, Jim also covers the security and privacy approaches that Acoer takes to ensure that its products work as advertised and so that the machinery of its clients is never compromised.<br/><br/></p><p>----------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>----------</p><p><br/><b>Topics Covered:</b></p><ul><li>How Acoer designs and builds its tech as components to be absorbed &amp; consumed by other machines</li><li>How using DLT reduces the need for intermediaries</li><li>Acoer&apos;s approach to building decentralized apps &amp; why it chose to build on <a href='https://en.wikipedia.org/wiki/Hashgraph'>hashgraph</a> tech instead of blockchain</li><li>Benefits gained from DLT&apos;s &quot;data stamping&quot; to computationally prove transactions &amp; to assist during data leakages, compliance issues, or to demonstrate privacy assurance</li><li>How you can use NFTs to represent individuals&apos; consents via RightsHash</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn about <a href='https://www.acoer.com/'>Acoer</a></li><li>Learn about <a href='https://www.rightshash.com/'>RightsHash</a></li></ul><p><b>Jim Nasr&apos;s Info:</b></p><ul><li>Follow Jim on <a href='https://www.linkedin.com/in/jnasr/'>LinkedIn</a></li><li>Follow Jim on <a href='https://twitter.com/jnasr'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>Today, I am joined by <a href='https://www.linkedin.com/in/jnasr/'>Jim Nasr</a>, CEO of Acoer. I had the pleasure of collaborating with Jim on several projects during my 6-month stint as Privacy Strategist for <a href='https://hedera.com/'>Hedera</a>. Jim joins me today to discuss the use of <a href='https://en.wikipedia.org/wiki/Distributed_ledger'>distributed ledger tech (DLT)</a> to provide <a href='https://en.wikipedia.org/wiki/Computational_trust'>computational trust</a> for real-time applications. Jim and I speak about the development of secure, privacy-preserving, and traceable technologies, which can gain adoption via open protocols and usable interfaces.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>---------<br/><br/></p><p>In part one of this two-episode conversation, Jim explains Acoer&apos;s approach to building DLT-enabled software and its initial application to healthcare and clinical trials. Jim shares his background and experience in tech both academically and professionally; as an entrepreneur in software development; his roles in large-scale tech companies and with the government at the CDC; and how he enjoyed “getting his hands dirty” in public health to bring automated trust and accountability to the space. At Acoer, Jim continues his previous work - to build open technologies - by leveraging DLT and also building interfaces with usable privacy and security. 
</p><p><br/>In this conversation, Jim also covers the security and privacy approaches that Acoer takes to ensure that its products work as advertised and so that the machinery of its clients is never compromised.<br/><br/></p><p>----------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>----------</p><p><br/><b>Topics Covered:</b></p><ul><li>How Acoer designs and builds its tech as components to be absorbed &amp; consumed by other machines</li><li>How using DLT reduces the need for intermediaries</li><li>Acoer&apos;s approach to building decentralized apps &amp; why it chose to build on <a href='https://en.wikipedia.org/wiki/Hashgraph'>hashgraph</a> tech instead of blockchain</li><li>Benefits gained from DLT&apos;s &quot;data stamping&quot; to computationally prove transactions &amp; to assist during data leakages, compliance issues, or to demonstrate privacy assurance</li><li>How you can use NFTs to represent individuals&apos; consents via RightsHash</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Learn about <a href='https://www.acoer.com/'>Acoer</a></li><li>Learn about <a href='https://www.rightshash.com/'>RightsHash</a></li></ul><p><b>Jim Nasr&apos;s Info:</b></p><ul><li>Follow Jim on <a href='https://www.linkedin.com/in/jnasr/'>LinkedIn</a></li><li>Follow Jim on <a href='https://twitter.com/jnasr'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11852476-s1e8-leveraging-distributed-ledgers-for-privacy-assurance-with-jim-nasr.mp3" length="37719663" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/336594</link>
    <itunes:image href="https://storage.buzzsprout.com/axmtvb60e1q3ov8u97fzoehet76d?.jpg" />
    <itunes:author>Debra J Farber / Jim Nasr</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11852476</guid>
    <pubDate>Tue, 13 Dec 2022 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11852476/transcript" type="text/html" />
    <podcast:soundbite startTime="103.083" duration="60.0" />
    <podcast:chapters url="https://www.buzzsprout.com/2059470/11852476/chapters.json" type="application/json" />
    <psc:chapters>
  <psc:chapter start="0:00" title="S1E8: Leveraging Distributed Ledgers for Privacy Assurance with Jim Nasr" />
  <psc:chapter start="1:15" title="Introducing Jim Nasr and his company, Acoer" />
  <psc:chapter start="4:33" title="Why Jim Chose to Build His Apps Using Distributed Ledger Technology (DLT)" />
  <psc:chapter start="20:59" title="We discuss DLT Benefits Like Proof of Action &amp; Why Jim Chose to Build on Hedera&#39;s Layer 1 DLT Network" />
  <psc:chapter start="24:07" title="Leveraging DLT for &quot;Usable Privacy&quot;" />
  <psc:chapter start="39:48" title="How DLT Enables Accountability of Data, Using RightsHash as an Example" />
  <psc:chapter start="45:26" title="Jim&#39;s Thoughts on DLT Adoption" />
</psc:chapters>
    <itunes:duration>3139</itunes:duration>
    <itunes:keywords>privacy, blockchain, hashgraph, Hedera, distributed ledgers, DLT, healthcare, consent, public ledger, intermediaries, decentralization, RightsHash, iPrivata, Jim Nasr, privacy assurance, instant audit, accountability, innovation</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>8</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E7: Privacy Engineers: The Next Generation with Lorrie Cranor (CMU)</itunes:title>
    <title>S1E7: Privacy Engineers: The Next Generation with Lorrie Cranor (CMU)</title>
    <itunes:summary><![CDATA[In this episode, I’m joined by Lorrie Cranor, FORE Systems Professor, Computer Science and Engineering &amp; Public Policy at Carnegie Mellon University (CMU); Director, CyLab Usable Privacy and Security Laboratory; and Co-Director of CMU's MSIT-Privacy Engineering Masters Program. We discuss the different tracks within the Privacy Engineering Program at CMU, privacy engineering hiring trends, the need for industry education, and Lorrie’s research outside of the classroom.  ---------- Thank ...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I’m joined by <a href='https://www.linkedin.com/in/lorriecranor/'>Lorrie Cranor</a>, FORE Systems Professor, Computer Science and Engineering &amp; Public Policy at Carnegie Mellon University (CMU); Director, CyLab Usable Privacy and Security Laboratory; and Co-Director of CMU&apos;s <a href='http://privacy.cs.cmu.edu/'>MSIT-Privacy Engineering Masters Program</a>. We discuss the different tracks within the Privacy Engineering Program at CMU, privacy engineering hiring trends, the need for industry education, and Lorrie’s research outside of the classroom.<br/><br/>----------<br/>Thank you to our sponsor, <a href='https://www.privado.ai/'>Privado</a>, the developer-friendly privacy platform<br/>----------<br/><br/>Lorrie explains how this next generation of privacy experts and engineers can work together to bring new architectures, innovations, and software to market. She describes the kind of hands-on work in which her students participate, including a capstone project sponsored by Meta that’s exploring ways the platform can integrate more privacy education into its UI/UX.<br/><br/>In addition, Lorrie shares her perspective on the privacy engineering job market for recent grads and explains how CMU’s Certificate Program in Privacy Engineering aims to meet the high demand for experienced privacy experts with knowledge of privacy engineering concepts. 
We also get into her research on cookie banners and privacy “nutrition labels” for IoT devices.<br/><br/><br/><b>Topics Covered:</b></p><ul><li>Lorrie’s professional background and what drew her into privacy engineering</li><li>What candidates can expect from the Privacy Engineering Program at CMU </li><li>Insights into how people interact with cookie banners and potential solutions to improve the user experience</li><li>Ways that we can bridge the hiring gap in our industry</li><li>Different sectors outside of tech that are looking for privacy experts, including finance and retail</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Apply to CMU&apos;s <a href='https://privacy.cs.cmu.edu/'>Privacy Engineering Program</a> (Applications due Dec 12th, 2022 for the next enrollment period)</li><li>Learn about CMU&apos;s <a href='https://www.cylab.cmu.edu/education/index.html'>CyLab Security &amp; Privacy Institute</a></li><li>Learn about the <a href='http://cups.cs.cmu.edu/'>CyLab Usable Privacy and Security (CUPS) Laboratory</a></li><li>Review CMU&apos;s research on <a href='https://www.iotsecurityprivacy.org/'>IoT Privacy &amp; Security Labels</a>.</li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Lorrie on <a href='https://www.linkedin.com/in/lorriecranor/'>LinkedIn</a></li><li>Follow Lorrie on <a href='https://twitter.com/lorrietweet?lang=en'>Twitter</a></li><li>Learn more <a href='https://lorrie.cranor.org/'>about</a> <a href='https://lorrie.cranor.org/'>Lorrie</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I’m joined by <a href='https://www.linkedin.com/in/lorriecranor/'>Lorrie Cranor</a>, FORE Systems Professor, Computer Science and Engineering &amp; Public Policy at Carnegie Mellon University (CMU); Director, CyLab Usable Privacy and Security Laboratory; and Co-Director of CMU&apos;s <a href='http://privacy.cs.cmu.edu/'>MSIT-Privacy Engineering Masters Program</a>. We discuss the different tracks within the Privacy Engineering Program at CMU, privacy engineering hiring trends, the need for industry education, and Lorrie’s research outside of the classroom.<br/><br/>----------<br/>Thank you to our sponsor, <a href='https://www.privado.ai/'>Privado</a>, the developer-friendly privacy platform<br/>----------<br/><br/>Lorrie explains how this next generation of privacy experts and engineers can work together to bring new architectures, innovations, and software to market. She describes the kind of hands-on work in which her students participate, including a capstone project sponsored by Meta that’s exploring ways the platform can integrate more privacy education into its UI/UX.<br/><br/>In addition, Lorrie shares her perspective on the privacy engineering job market for recent grads and explains how CMU’s Certificate Program in Privacy Engineering aims to meet the high demand for experienced privacy experts with knowledge of privacy engineering concepts. 
We also get into her research on cookie banners and privacy “nutrition labels” for IoT devices.<br/><br/><br/><b>Topics Covered:</b></p><ul><li>Lorrie’s professional background and what drew her into privacy engineering</li><li>What candidates can expect from the Privacy Engineering Program at CMU </li><li>Insights into how people interact with cookie banners and potential solutions to improve the user experience</li><li>Ways that we can bridge the hiring gap in our industry</li><li>Different sectors outside of tech that are looking for privacy experts, including finance and retail</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Apply to CMU&apos;s <a href='https://privacy.cs.cmu.edu/'>Privacy Engineering Program</a> (Applications due Dec 12th, 2022 for the next enrollment period)</li><li>Learn about CMU&apos;s <a href='https://www.cylab.cmu.edu/education/index.html'>CyLab Security &amp; Privacy Institute</a></li><li>Learn about the <a href='http://cups.cs.cmu.edu/'>CyLab Usable Privacy and Security (CUPS) Laboratory</a></li><li>Review CMU&apos;s research on <a href='https://www.iotsecurityprivacy.org/'>IoT Privacy &amp; Security Labels</a>.</li></ul><p><b>Guest Info:</b></p><ul><li>Connect with Lorrie on <a href='https://www.linkedin.com/in/lorriecranor/'>LinkedIn</a></li><li>Follow Lorrie on <a href='https://twitter.com/lorrietweet?lang=en'>Twitter</a></li><li>Learn more <a href='https://lorrie.cranor.org/'>about</a> <a href='https://lorrie.cranor.org/'>Lorrie</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11807121-s1e7-privacy-engineers-the-next-generation-with-lorrie-cranor-cmu.mp3" length="32490203" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/324486</link>
    <itunes:image href="https://storage.buzzsprout.com/j0w6muc2qzdjxmwvn0ebviafomvk?.jpg" />
    <itunes:author>Debra J Farber / Lorrie Cranor</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11807121</guid>
    <pubDate>Tue, 06 Dec 2022 04:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11807121/transcript" type="text/html" />
    <podcast:soundbite startTime="580.617" duration="57.5" />
    <itunes:duration>2703</itunes:duration>
    <itunes:keywords>privacy engineer, privacy engineering, CMU, Carnegie Mellon University, CyLab, Usable Privacy, Masters in Privacy Engineering, Certificate in Privacy Engineering, privacy by design, computer science, hiring, IoT nutrition labels, cookies</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>7</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E6: The Explosion of Privacy Tech with Lourdes Turrecha (TROPT)</itunes:title>
    <title>S1E6: The Explosion of Privacy Tech with Lourdes Turrecha (TROPT)</title>
    <itunes:summary><![CDATA[This week, I’m joined by Lourdes Turrecha, Founder &amp; Chief Privacy Tech Strategist at The Rise of Privacy Tech (TROPT). TROPT's mission is to fuel privacy innovation by bringing together privacy tech founders, investors, buyers, &amp; expert-advisors to bridge the existing tech-capital-expertise gaps in the field. As a member of TROPT's Advisory Board, I’ve seen firsthand the innovative resources and events that TROPT offers the industry. ---------- Thank you to our sponsor, Privado, the...]]></itunes:summary>
    <description><![CDATA[<p>This week, I’m joined by <a href='https://www.linkedin.com/in/lourdesmturrecha/'>Lourdes Turrecha</a>, Founder &amp; Chief Privacy Tech Strategist at <a href='https://www.riseofprivacytech.com/'>The Rise of Privacy Tech (TROPT)</a>. TROPT&apos;s mission is to fuel privacy innovation by bringing together privacy tech founders, investors, buyers, &amp; expert-advisors to bridge the existing tech-capital-expertise gaps in the field. As a member of TROPT&apos;s Advisory Board, I’ve seen firsthand the innovative resources and events that TROPT offers the industry.</p><p>----------<br/>Thank you to our sponsor, <a href='https://www.privado.ai/'>Privado</a>, the developer-friendly privacy platform<br/>----------<br/><br/></p><p>In our conversation, Lourdes and I explore the different facets of TROPT, particularly focusing on what’s included in the recently-published &quot;<a href='https://www.riseofprivacytech.com/wp-content/uploads/2022/11/TROPT-Privacy-Tech-Stack-Whitepaper-2022.pdf'>TROPT Privacy Tech Stack 2.0 Whitepaper 2022</a>.&quot; We discuss how buyers currently navigate the space, how TROPT supports privacy tech founders &amp; the 5 biggest challenges that we see across privacy tech. <br/><br/>The whitepaper is a first-of-its-kind landscape that maps the different categories of privacy tech so the market can better understand the breadth and depth of the space. It highlights current trends and visions for the future of privacy tech, and addresses solutions to those 5 major pain points. Lourdes also dives into what we can expect from the <a href='https://hopin.com/events/tropt-data-privacy-week-2023-339870e7-61e4-4a7f-9b50-39a1506c270c/registration'>TROPT Data Privacy Week 2023</a> in January and how to get involved. 
</p><p><br/>----------<br/>Listen to the episode on <a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'>Apple Podcasts</a>, <a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'>Spotify</a>, <a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'>iHeartRadio</a>, or on your favorite podcast platform.<br/>----------</p><p><br/><b>Topics Covered:</b></p><ul><li>TROPT’s free resources and paid offerings for privacy tech key players</li><li>The thought process behind the TROPT Privacy Tech Stack Review program </li><li>The current frustrations of many privacy tech buyers and users, especially on the B2B side</li><li>An overview of the 3 main topics covered in the whitepaper </li><li>Proposed solutions for the challenges we’re facing in privacy tech </li></ul><p><b>Resources Mentioned:</b></p><ul><li><a href='https://www.riseofprivacytech.com/wp-content/uploads/2022/11/TROPT-Privacy-Tech-Stack-Whitepaper-2022.pdf'>Read the TROPT Defining the Privacy Tech Landscape Whitepaper 2021</a></li><li><a href='https://www.riseofprivacytech.com/wp-content/uploads/2022/11/TROPT-Privacy-Tech-Stack-Scape-11.17.2022.pdf'>Bookmark the TROPT Privacy Tech Stack &apos;Scape </a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Lourdes on <a href='https://www.linkedin.com/in/lourdesmturrecha/'>LinkedIn</a> </li><li>Follow Lourdes on <a href='https://twitter.com/lourdesturrecha?lang=en'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>This week, I’m joined by <a href='https://www.linkedin.com/in/lourdesmturrecha/'>Lourdes Turrecha</a>, Founder &amp; Chief Privacy Tech Strategist at <a href='https://www.riseofprivacytech.com/'>The Rise of Privacy Tech (TROPT)</a>. TROPT&apos;s mission is to fuel privacy innovation by bringing together privacy tech founders, investors, buyers, &amp; expert-advisors to bridge the existing tech-capital-expertise gaps in the field. As a member of TROPT&apos;s Advisory Board, I’ve seen firsthand the innovative resources and events that TROPT offers the industry.</p><p>----------<br/>Thank you to our sponsor, <a href='https://www.privado.ai/'>Privado</a>, the developer-friendly privacy platform<br/>----------<br/><br/></p><p>In our conversation, Lourdes and I explore the different facets of TROPT, particularly focusing on what’s included in the recently-published &quot;<a href='https://www.riseofprivacytech.com/wp-content/uploads/2022/11/TROPT-Privacy-Tech-Stack-Whitepaper-2022.pdf'>TROPT Privacy Tech Stack 2.0 Whitepaper 2022</a>.&quot; We discuss how buyers currently navigate the space, how TROPT supports privacy tech founders &amp; the 5 biggest challenges that we see across privacy tech. <br/><br/>The whitepaper is a first-of-its-kind landscape that maps the different categories of privacy tech so the market can better understand the breadth and depth of the space. It highlights current trends and visions for the future of privacy tech, and addresses solutions to those 5 major pain points. Lourdes also dives into what we can expect from the <a href='https://hopin.com/events/tropt-data-privacy-week-2023-339870e7-61e4-4a7f-9b50-39a1506c270c/registration'>TROPT Data Privacy Week 2023</a> in January and how to get involved. 
</p><p><br/>----------<br/>Listen to the episode on <a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'>Apple Podcasts</a>, <a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'>Spotify</a>, <a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'>iHeartRadio</a>, or on your favorite podcast platform.<br/>----------</p><p><br/><b>Topics Covered:</b></p><ul><li>TROPT’s free resources and paid offerings for privacy tech key players</li><li>The thought process behind the TROPT Privacy Tech Stack Review program </li><li>The current frustrations of many privacy tech buyers and users, especially on the B2B side</li><li>An overview of the 3 main topics covered in the whitepaper </li><li>Proposed solutions for the challenges we’re facing in privacy tech </li></ul><p><b>Resources Mentioned:</b></p><ul><li><a href='https://www.riseofprivacytech.com/wp-content/uploads/2022/11/TROPT-Privacy-Tech-Stack-Whitepaper-2022.pdf'>Read the TROPT Defining the Privacy Tech Landscape Whitepaper 2021</a></li><li><a href='https://www.riseofprivacytech.com/wp-content/uploads/2022/11/TROPT-Privacy-Tech-Stack-Scape-11.17.2022.pdf'>Bookmark the TROPT Privacy Tech Stack &apos;Scape </a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Lourdes on <a href='https://www.linkedin.com/in/lourdesmturrecha/'>LinkedIn</a> </li><li>Follow Lourdes on <a href='https://twitter.com/lourdesturrecha?lang=en'>Twitter</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11646180-s1e6-the-explosion-of-privacy-tech-with-lourdes-turrecha-tropt.mp3" length="38765242" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/316708</link>
    <itunes:image href="https://storage.buzzsprout.com/bqab8a8zldl1kfrxyireyv2ud1he?.jpg" />
    <itunes:author>Debra J Farber / Lourdes Turrecha</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11646180</guid>
    <pubDate>Tue, 29 Nov 2022 08:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11646180/transcript" type="text/html" />
    <podcast:soundbite startTime="558.222" duration="59.5" />
    <itunes:duration>3226</itunes:duration>
    <itunes:keywords>privacy tech, privacy, privacy tech landscape, privacy-enhancing technologies, TROPT, The Rise of Privacy Tech, Lourdes Turrecha, privacy innovation</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>6</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E5: The Rise of Global Data Sharing Platforms with Stephen Wilson (Constellation Research)</itunes:title>
    <title>S1E5: The Rise of Global Data Sharing Platforms with Stephen Wilson (Constellation Research)</title>
    <itunes:summary><![CDATA[I’m joined by Stephen Wilson, accomplished data protection innovator, researcher, analyst and advisor who leads Digital Safety and Privacy efforts at Constellation Research and is Managing Director of Lockstep Technologies. In our conversation, we discuss the importance of information value chains and the emergence of data sharing platforms, explore why data should be like clean drinking water, and examine the problems with "data ownership."  -------- Thank you to our sponsor, Privado, the devel...]]></itunes:summary>
    <description><![CDATA[<p>I’m joined by <a href='https://www.linkedin.com/in/lockstep/'>Stephen Wilson</a>, accomplished data protection innovator, researcher, analyst and advisor who leads Digital Safety and Privacy efforts at <a href='https://www.constellationr.com/business-theme/digital-safety-privacy'>Constellation Research</a> and is Managing Director of <a href='https://lockstep.com.au/technologies/'>Lockstep Technologies</a>. In our conversation, we discuss the importance of information value chains and the emergence of data sharing platforms, explore why data should be like clean drinking water, and examine the problems with &quot;data ownership.&quot;<br/><br/>--------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>--------<br/><br/>Stephen explains the push for more data sharing and for user-centric business models that deliver value for businesses and benefits for individuals. We discuss emerging tools that assure the orderliness, fairness, and transparency of information value chains and why Stephen aims to take data processing &quot;out of the shadows&quot; with his research.</p><p>Lastly, we discuss key Facebook &amp; Google EU court cases that address collection &amp; use of facial biometrics from people without sufficient consent and the challenges that Google and search engines have with addressing &quot;the right to be forgotten.&quot; Plus, we discuss the privacy expectations within the ‘digital town square,’ particularly through the lens of Twitter and Facebook. 
</p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------</p><p>Topics Covered:</p><ul><li>Stephen’s assertion that privacy is about restraint: what you choose to not know.</li><li>The rise of data sharing platforms to facilitate and scale global information value chains.</li><li>How if data is like “crude oil,” then it requires safe handling, and why we should treat data like &quot;clean drinking water&quot; instead.</li><li>The importance of data quality, data originality, and data lineage.</li><li>Stephen’s analysis of the growing market for “Data Protection as a Service,&quot; which includes: data clean rooms, privacy APIs, and more.</li><li>Why you don’t need to own your own data to get good privacy outcomes.</li></ul><p><b>Resources Mentioned</b>:</p><ul><li><a href='https://www.worldbank.org/en/publication/wdr2021'>Read the 2021 Data for Better Lives report (World Bank) </a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>I’m joined by <a href='https://www.linkedin.com/in/lockstep/'>Stephen Wilson</a>, accomplished data protection innovator, researcher, analyst and advisor who leads Digital Safety and Privacy efforts at <a href='https://www.constellationr.com/business-theme/digital-safety-privacy'>Constellation Research</a> and is Managing Director of <a href='https://lockstep.com.au/technologies/'>Lockstep Technologies</a>. In our conversation, we discuss the importance of information value chains and the emergence of data sharing platforms, explore why data should be like clean drinking water, and examine the problems with &quot;data ownership.&quot;<br/><br/>--------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/></b>--------<br/><br/>Stephen explains the push for more data sharing and for user-centric business models that deliver value for businesses and benefits for individuals. We discuss emerging tools that assure the orderliness, fairness, and transparency of information value chains and why Stephen aims to take data processing &quot;out of the shadows&quot; with his research.</p><p>Lastly, we discuss key Facebook &amp; Google EU court cases that address collection &amp; use of facial biometrics from people without sufficient consent and the challenges that Google and search engines have with addressing &quot;the right to be forgotten.&quot; Plus, we discuss the privacy expectations within the ‘digital town square,’ particularly through the lens of Twitter and Facebook. 
</p><p>---------<br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>---------</p><p>Topics Covered:</p><ul><li>Stephen’s assertion that privacy is about restraint: what you choose to not know.</li><li>The rise of data sharing platforms to facilitate and scale global information value chains.</li><li>How if data is like “crude oil,” then it requires safe handling, and why we should treat data like &quot;clean drinking water&quot; instead.</li><li>The importance of data quality, data originality, and data lineage.</li><li>Stephen’s analysis of the growing market for “Data Protection as a Service,&quot; which includes: data clean rooms, privacy APIs, and more.</li><li>Why you don’t need to own your own data to get good privacy outcomes.</li></ul><p><b>Resources Mentioned</b>:</p><ul><li><a href='https://www.worldbank.org/en/publication/wdr2021'>Read the 2021 Data for Better Lives report (World Bank) </a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. 
If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11736717-s1e5-the-rise-of-global-data-sharing-platforms-with-stephen-wilson-constellation-research.mp3" length="42816048" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/307165</link>
    <itunes:image href="https://storage.buzzsprout.com/o2kzzvyggunz4nsw14qjrlclu11v?.jpg" />
    <itunes:author>Debra J. Farber / Stephen Wilson</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11736717</guid>
    <pubDate>Tue, 22 Nov 2022 06:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11736717/transcript" type="text/html" />
    <podcast:soundbite startTime="1248.15" duration="56.0" />
    <itunes:duration>3564</itunes:duration>
    <itunes:keywords>data sharing, value chain, privacy, information flow, data quality, data originality, data lineage, data minimization, Lockstep, Constellation Research, World Bank, data protection as a service, privacy APIs, data clean rooms, digital identity, data owner</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>5</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E4: The Hitchhiker&#39;s Guide to Privacy Engineering &amp; Creative Privacy with Mert Can Boyar (Privacy Innovation Lab)</itunes:title>
    <title>S1E4: The Hitchhiker&#39;s Guide to Privacy Engineering &amp; Creative Privacy with Mert Can Boyar (Privacy Innovation Lab)</title>
    <itunes:summary><![CDATA[In this episode, I interview Mert Can Boyar, Director of Privacy Innovation Lab at Bilgi University and Founder of privacy tech company, Verilogy. Mert walks us through his creative approach to educating on core privacy engineering concepts, particularly through the lens of storytelling, visual art &amp; music. He also shares his vision &amp; mission behind his passion project, “The Hitchhiker’s Guide to Privacy Engineering." --------- Thank you to our sponsor, Privado, the developer-fr...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I interview <a href='https://www.linkedin.com/in/mert-can-boyar-02bb49a4/'>Mert Can Boyar</a>, Director of <a href='https://privacyinnovationlab.com/'>Privacy Innovation Lab</a> at Bilgi University and Founder of privacy tech company, Verilogy. Mert walks us through his creative approach to educating on core privacy engineering concepts, particularly through the lens of storytelling, visual art &amp; music. He also shares his vision &amp; mission behind his passion project, “The Hitchhiker’s Guide to Privacy Engineering.&quot;</p><p>---------<br/>Thank you to our sponsor, <a href='https://www.privado.ai/'>Privado</a>, the developer-friendly privacy platform<br/>---------<br/><br/></p><p>Mert tells his &quot;origin story&quot; and dives into how he ended up in privacy and data protection. He highlights the thread of art &amp; entrepreneurship throughout his career, which has taken him from musician to lawyer to start-up founder, and now educator. </p><p><br/>Privacy Innovation Lab is a multi-stakeholder hub for privacy innovation. Mert highlights exciting projects that his students are working on, including an assessment tool to help practitioners build fair &amp; lawful AI models and new tech in the self-sovereign identity (SSI) space. </p><p><br/>While working at the lab, Mert came up with a “creative privacy&quot; strategy, which he uses to inspire young minds about privacy engineering. In this episode, he takes us behind the scenes of his comic book project that’s meant to educate people who want to understand how modern software and data processing technologies function. 
<br/><br/></p><p>---------<br/>Listen to the episode on <a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'>Apple Podcasts</a>, <a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'>Spotify</a>, <a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'>iHeartRadio</a>, or on your favorite podcast platform.<br/>---------<br/><br/></p><p>Topics Covered:</p><ul><li>What initially sparked Mert’s interest in data and privacy protection </li><li>How Mert uses his multifaceted &amp; creative skillsets to bridge knowledge gaps between privacy law &amp; engineering</li><li>Verilogy’s open source database tool that automates and streamlines the work that Mert was doing as a privacy lawyer </li><li>Fascinating projects underway at Privacy Innovation Lab </li><li>What Mert hopes to achieve with The Hitchhiker’s Guide to Privacy Engineering</li></ul><p>Resources Mentioned:</p><ul><li>Learn more about <a href='https://www.privacyinnovationlab.org/'>Privacy Innovation Lab</a></li><li>Subscribe on LinkedIn to <a href='https://www.linkedin.com/newsletters/hitchhiker-s-guide-to-privacy-6984858083546009601/'>The Hitchhiker&apos;s Guide to Privacy Engineering</a></li><li>Read about <a href='https://verilogy.com/'>Verilogy</a> </li></ul><p>Guest Info:</p><ul><li>Follow Mert on <a href='https://www.linkedin.com/in/mert-can-boyar-02bb49a4/'>LinkedIn</a> </li><li>Contact Mert at Privacy Innovation Lab:  <a href='mailto:mertcan.boyar@bilgi.edu.tr'>mertcan.boyar@bilgi.edu.tr</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I interview <a href='https://www.linkedin.com/in/mert-can-boyar-02bb49a4/'>Mert Can Boyar</a>, Director of <a href='https://privacyinnovationlab.com/'>Privacy Innovation Lab</a> at Bilgi University and Founder of privacy tech company, Verilogy. Mert walks us through his creative approach to educating on core privacy engineering concepts, particularly through the lens of storytelling, visual art &amp; music. He also shares his vision &amp; mission behind his passion project, “The Hitchhiker’s Guide to Privacy Engineering.&quot;</p><p>---------<br/>Thank you to our sponsor, <a href='https://www.privado.ai/'>Privado</a>, the developer-friendly privacy platform<br/>---------<br/><br/></p><p>Mert tells his &quot;origin story&quot; and dives into how he ended up in privacy and data protection. He highlights the thread of art &amp; entrepreneurship throughout his career, which has taken him from musician to lawyer to start-up founder, and now educator. </p><p><br/>Privacy Innovation Lab is a multi-stakeholder hub for privacy innovation. Mert highlights exciting projects that his students are working on, including an assessment tool to help practitioners build fair &amp; lawful AI models and new tech in the self-sovereign identity (SSI) space. </p><p><br/>While working at the lab, Mert came up with a “creative privacy&quot; strategy, which he uses to inspire young minds about privacy engineering. In this episode, he takes us behind the scenes of his comic book project that’s meant to educate people who want to understand how modern software and data processing technologies function. 
<br/><br/></p><p>---------<br/>Listen to the episode on <a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'>Apple Podcasts</a>, <a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'>Spotify</a>, <a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'>iHeartRadio</a>, or on your favorite podcast platform.<br/>---------<br/><br/></p><p>Topics Covered:</p><ul><li>What initially sparked Mert’s interest in data and privacy protection </li><li>How Mert uses his multifaceted &amp; creative skillsets to bridge knowledge gaps between privacy law &amp; engineering</li><li>Verilogy’s open source database tool that automates and streamlines the work that Mert was doing as a privacy lawyer </li><li>Fascinating projects underway at Privacy Innovation Lab </li><li>What Mert hopes to achieve with The Hitchhiker’s Guide to Privacy Engineering</li></ul><p>Resources Mentioned:</p><ul><li>Learn more about <a href='https://www.privacyinnovationlab.org/'>Privacy Innovation Lab</a></li><li>Subscribe on LinkedIn to <a href='https://www.linkedin.com/newsletters/hitchhiker-s-guide-to-privacy-6984858083546009601/'>The Hitchhiker&apos;s Guide to Privacy Engineering</a></li><li>Read about <a href='https://verilogy.com/'>Verilogy</a> </li></ul><p>Guest Info:</p><ul><li>Follow Mert on <a href='https://www.linkedin.com/in/mert-can-boyar-02bb49a4/'>LinkedIn</a> </li><li>Contact Mert at Privacy Innovation Lab:  <a href='mailto:mertcan.boyar@bilgi.edu.tr'>mertcan.boyar@bilgi.edu.tr</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11646230-s1e4-the-hitchhiker-s-guide-to-privacy-engineering-creative-privacy-with-mert-can-boyar-privacy-innovation-lab.mp3" length="32140220" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/294537</link>
    <itunes:image href="https://storage.buzzsprout.com/ff5pmx6peeqnhzwv4o6efzyehpzq?.jpg" />
    <itunes:author>Debra J Farber / Mert Can Boyar</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11646230</guid>
    <pubDate>Tue, 15 Nov 2022 00:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11646230/transcript" type="text/html" />
    <podcast:soundbite startTime="760.0" duration="58.5" />
    <itunes:duration>2674</itunes:duration>
    <itunes:keywords>privacy engineering, creative privacy, Hitchhiker&#39;s Guide to Privacy Engineering, Privacy Innovation Lab, Verilogy, Privacy Awareness</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>4</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E3: &quot;Will &#39;Global Privacy Control&#39; (GPC) Fix Web Privacy?&quot; with Roy Smith (PrivacyCheq)</itunes:title>
    <title>S1E3: &quot;Will &#39;Global Privacy Control&#39; (GPC) Fix Web Privacy?&quot; with Roy Smith (PrivacyCheq)</title>
    <itunes:summary><![CDATA[In this episode, I’m joined by Roy Smith, CEO and founder of PrivacyCheq, a privacy tech company that develops privacy-enhancing technologies for mobile and web. We discuss the history of online privacy and data protection laws, current challenges within the ad tech space, and GPC, a newly proposed web standard for signaling privacy preferences. ----------- Thank you to our sponsor, Privado, the developer friendly privacy platform ----------- One of the most common myths that Roy sees end-users buying ...]]></itunes:summary>
    <description><![CDATA[<p>In this episode, I’m joined by Roy Smith, CEO and founder of PrivacyCheq, a privacy tech company that develops privacy-enhancing technologies for mobile and web. We discuss the history of online privacy and data protection laws, current challenges within the ad tech space, and GPC, a newly proposed web standard for signaling privacy preferences.</p><p>-----------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform<br/></b>-----------</p><p>One of the most common myths Roy sees end-users buying into is that cookie banners are all that a company needs to deploy for compliance with modern privacy and data protection laws. Roy breaks down how adtech companies use &quot;the cookie myth&quot; to distort how people perceive what&apos;s required for operational compliance. He illustrates the tsunami of global privacy regulations related to adtech and the limitations that exist due to siloed consent data. <br/><br/></p><p>We dive deeper into the W3C&apos;s newly proposed Global Privacy Control (GPC) specification and how GPC lets users signal their desired privacy levels just by browsing the web. Roy unpacks why it was developed and what problems it solves on a legal level. He also highlights his concern that implementing GPC will create a false sense of privacy as GPC signals depart from consumer expectations. 
<br/><br/></p><p>Listen to our conversation on the benefits and drawbacks of GPC.<br/><br/></p><p>-----------<b><br/>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>-----------<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>How the regulatory framework for privacy and tracking has changed over time </li><li>The global response to surveillance capitalism</li><li>The challenges and downfalls of the IAB&apos;s Transparency &amp; Consent Framework (TCF)</li><li>The problem of “consent fragmentation”</li><li>The W3C’s newly-proposed Global Privacy Control (GPC) specification</li><li>Where Roy sees opportunities for improvement</li><li>The nuances between W3C’s &quot;do-not-sell or share interaction&quot; and &quot;do-not-sell or share preference&quot;</li><li>Roy&apos;s point of view regarding web privacy and whether GPC is sufficient for signaling privacy preferences, the benefits to the adtech industry, and potential drawbacks. 
</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out the GPC <a href='https://globalprivacycontrol.org/'>educational website</a> and the <a href='https://globalprivacycontrol.github.io/gpc-spec/'>proposed W3C technical specification </a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Roy on <a href='https://www.linkedin.com/in/rrsmithii/'>LinkedIn</a></li><li>Follow Roy on <a href='https://twitter.com/thegrail'>Twitter</a></li><li>Learn more on <a href='https://privacycheq.com/'>PrivacyCheq&apos;s website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>In this episode, I’m joined by Roy Smith, CEO and founder of PrivacyCheq, a privacy tech company that develops privacy-enhancing technologies for mobile and web. We discuss the history of online privacy and data protection laws, current challenges within the ad tech space, and GPC, a newly proposed web standard for signaling privacy preferences.</p><p>-----------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform<br/></b>-----------</p><p>One of the most common myths Roy sees end-users buying into is that cookie banners are all that a company needs to deploy for compliance with modern privacy and data protection laws. Roy breaks down how adtech companies use &quot;the cookie myth&quot; to distort how people perceive what&apos;s required for operational compliance. He illustrates the tsunami of global privacy regulations related to adtech and the limitations that exist due to siloed consent data. <br/><br/></p><p>We dive deeper into the W3C&apos;s newly proposed Global Privacy Control (GPC) specification and how GPC lets users signal their desired privacy levels just by browsing the web. Roy unpacks why it was developed and what problems it solves on a legal level. He also highlights his concern that implementing GPC will create a false sense of privacy as GPC signals depart from consumer expectations. 
<br/><br/></p><p>Listen to our conversation on the benefits and drawbacks of GPC.<br/><br/></p><p>-----------<b><br/>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.<br/></b>-----------<br/><br/></p><p><b>Topics Covered:</b></p><ul><li>How the regulatory framework for privacy and tracking has changed over time </li><li>The global response to surveillance capitalism</li><li>The challenges and downfalls of the IAB&apos;s Transparency &amp; Consent Framework (TCF)</li><li>The problem of “consent fragmentation”</li><li>The W3C’s newly-proposed Global Privacy Control (GPC) specification</li><li>Where Roy sees opportunities for improvement</li><li>The nuances between W3C’s &quot;do-not-sell or share interaction&quot; and &quot;do-not-sell or share preference&quot;</li><li>Roy&apos;s point of view regarding web privacy and whether GPC is sufficient for signaling privacy preferences, the benefits to the adtech industry, and potential drawbacks. 
</li></ul><p><b>Resources Mentioned:</b></p><ul><li>Check out the GPC <a href='https://globalprivacycontrol.org/'>educational website</a> and the <a href='https://globalprivacycontrol.github.io/gpc-spec/'>proposed W3C technical specification </a></li></ul><p><b>Guest Info:</b></p><ul><li>Follow Roy on <a href='https://www.linkedin.com/in/rrsmithii/'>LinkedIn</a></li><li>Follow Roy on <a href='https://twitter.com/thegrail'>Twitter</a></li><li>Learn more on <a href='https://privacycheq.com/'>PrivacyCheq&apos;s website</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11619786-s1e3-will-global-privacy-control-gpc-fix-web-privacy-with-roy-smith-privacycheq.mp3" length="36854780" type="audio/mpeg" />
    <link>https://twitter.com/shiftprivacypod/status/1590058529381306368?</link>
    <itunes:image href="https://storage.buzzsprout.com/ofrhiik38r18hwfqqksy26d0b3sc?.jpg" />
    <itunes:author>Debra J Farber / Roy Smith</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11619786</guid>
    <pubDate>Mon, 07 Nov 2022 22:00:00 -0800</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11619786/transcript" type="text/html" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11619786/transcript.json" type="application/json" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11619786/transcript.srt" type="application/x-subrip" />
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11619786/transcript.vtt" type="text/vtt" />
    <podcast:soundbite startTime="1854.083" duration="55.5" />
    <itunes:duration>3067</itunes:duration>
    <itunes:keywords>privacy, adtech, cookies, do not track, IAB, GPC, COPPA, W3C, GDPR, do not sell, CCPA, surveillance capitalism, DaisyBit</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>3</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E2: &quot;The Magic of Zero Knowledge Biometrics&quot; with Dave Burnett (ZeroBiometrics)</itunes:title>
    <title>S1E2: &quot;The Magic of Zero Knowledge Biometrics&quot; with Dave Burnett (ZeroBiometrics)</title>
    <itunes:summary><![CDATA[(Episode Transcription)  This week, I’m joined by Dave Burnett, VP of Strategy at ZeroBiometrics, to discuss his company’s cutting edge approach to using one’s face to biometrically-authenticate to systems w/o storing personal data, preventing breaches. We’ll discuss current approaches to deploying biometric authentication, unpack surrounding privacy &amp; security challenges, and explore his company’s tech &amp; why it may enable the biometrics industry to leapfrog over current tech hurdles ...]]></itunes:summary>
    <description><![CDATA[<p>(<a href='https://www.buzzsprout.com/2059470/episodes/11591992'>Episode Transcription</a>)<br/><br/>This week, I’m joined by <a href='https://www.linkedin.com/in/davidleeburnett'>Dave Burnett</a>, VP of Strategy at <a href='https://zerobiometrics.com/'>ZeroBiometrics</a>, to discuss his company’s cutting-edge approach to using one’s face to biometrically authenticate to systems w/o storing personal data, preventing breaches. We’ll discuss current approaches to deploying biometric authentication, unpack surrounding privacy &amp; security challenges, and explore his company’s tech &amp; why it may enable the biometrics industry to leapfrog over current tech hurdles, as there’s now a privacy-preserving method of biometric authentication.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform.<br/></b>---------<br/><br/>Rather than iterate on older technology, ZeroBiometrics approached its biometric authentication tech using a clean-sheet design. As a result, they created tech <em>that captures no personal data</em>, not even a biometric. Thus, it doesn&apos;t know what someone looks like and doesn’t save personal data to authenticate. In our conversation, Dave pulls back the curtain on this magical-sounding tech and shares compelling examples of how <a href='https://zerobiometrics.com/zeroface/'>ZeroFace</a> enables privacy-preserving biometric identification, verification, authentication, and account recovery. <br/><br/></p><p>The expansion of biometrics is unstoppable at this point. Security risks and privacy issues are too significant, and global legislation can&apos;t keep up. 
Dave illustrates why we can&apos;t keep working within the old biometric paradigm if we want to protect our identities and personal data and explains how his team works to bridge the gap between technologists and end-users.<br/><br/></p><p><b>Listen to the episode on</b> <a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'>Apple Podcasts</a>, <a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'>Spotify</a>, <a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'>iHeartRadio</a>, <b>or on your favorite podcast platform.<br/></b><br/></p><p><b>Topics Covered:<br/></b><br/></p><ul><li>Key challenges as we evolve from mobile biometrics to other use cases.</li><li>Technical &amp; policy differences that affect privacy.</li><li>How industry leaders like Apple have approached facial &amp; fingerprint biometrics.</li><li>How <a href='https://zerobiometrics.com/zeroface/'>ZeroFace</a> authenticates you w/o knowing what you look like</li><li>Addressing privacy usability challenges in the crypto space.</li><li>ZeroBiometrics’s impressive metrics for false acceptance (FAR) &amp; rejection rates (FRR)</li><li>How using a ZeroFace QR code can radically change the way we travel, ship goods &amp; authenticate to our devices</li></ul><p><br/><b>Guest Info:</b></p><ul><li>Follow <a href='https://www.linkedin.com/in/davidleeburnett'>David on LinkedIn</a></li><li>Learn about <a href='https://zerobiometrics.com/'>ZeroBiometrics</a></li><li>Follow <a href='https://www.linkedin.com/company/zerobiometrics/'>ZeroBiometrics on LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>(<a href='https://www.buzzsprout.com/2059470/episodes/11591992'>Episode Transcription</a>)<br/><br/>This week, I’m joined by <a href='https://www.linkedin.com/in/davidleeburnett'>Dave Burnett</a>, VP of Strategy at <a href='https://zerobiometrics.com/'>ZeroBiometrics</a>, to discuss his company’s cutting-edge approach to using one’s face to biometrically authenticate to systems w/o storing personal data, preventing breaches. We’ll discuss current approaches to deploying biometric authentication, unpack surrounding privacy &amp; security challenges, and explore his company’s tech &amp; why it may enable the biometrics industry to leapfrog over current tech hurdles, as there’s now a privacy-preserving method of biometric authentication.</p><p>---------<br/><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform.<br/></b>---------<br/><br/>Rather than iterate on older technology, ZeroBiometrics approached its biometric authentication tech using a clean-sheet design. As a result, they created tech <em>that captures no personal data</em>, not even a biometric. Thus, it doesn&apos;t know what someone looks like and doesn’t save personal data to authenticate. In our conversation, Dave pulls back the curtain on this magical-sounding tech and shares compelling examples of how <a href='https://zerobiometrics.com/zeroface/'>ZeroFace</a> enables privacy-preserving biometric identification, verification, authentication, and account recovery. <br/><br/></p><p>The expansion of biometrics is unstoppable at this point. Security risks and privacy issues are too significant, and global legislation can&apos;t keep up. 
Dave illustrates why we can&apos;t keep working within the old biometric paradigm if we want to protect our identities and personal data and explains how his team works to bridge the gap between technologists and end-users.<br/><br/></p><p><b>Listen to the episode on</b> <a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'>Apple Podcasts</a>, <a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'>Spotify</a>, <a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'>iHeartRadio</a>, <b>or on your favorite podcast platform.<br/></b><br/></p><p><b>Topics Covered:<br/></b><br/></p><ul><li>Key challenges as we evolve from mobile biometrics to other use cases.</li><li>Technical &amp; policy differences that affect privacy.</li><li>How industry leaders like Apple have approached facial &amp; fingerprint biometrics.</li><li>How <a href='https://zerobiometrics.com/zeroface/'>ZeroFace</a> authenticates you w/o knowing what you look like</li><li>Addressing privacy usability challenges in the crypto space.</li><li>ZeroBiometrics’s impressive metrics for false acceptance (FAR) &amp; rejection rates (FRR)</li><li>How using a ZeroFace QR code can radically change the way we travel, ship goods &amp; authenticate to our devices</li></ul><p><br/><b>Guest Info:</b></p><ul><li>Follow <a href='https://www.linkedin.com/in/davidleeburnett'>David on LinkedIn</a></li><li>Learn about <a href='https://zerobiometrics.com/'>ZeroBiometrics</a></li><li>Follow <a href='https://www.linkedin.com/company/zerobiometrics/'>ZeroBiometrics on LinkedIn</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11591992-s1e2-the-magic-of-zero-knowledge-biometrics-with-dave-burnett-zerobiometrics.mp3" length="42839614" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/276703</link>
    <itunes:image href="https://storage.buzzsprout.com/hbr3f5kuz7h7scnzr8qbkkvshj7f?.jpg" />
    <itunes:author>Debra J Farber / Dave Burnett</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11591992</guid>
    <pubDate>Tue, 01 Nov 2022 06:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11591992/transcript" type="text/html" />
    <podcast:soundbite startTime="1251.364" duration="48.0" />
    <itunes:duration>3566</itunes:duration>
    <itunes:keywords>biometrics, authentication, ZKPs, zero-knowledge proofs, usable privacy, privacy-preserving architecture</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>2</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
  <item>
    <itunes:title>S1E1: &quot;Guardians of the Metaverse&quot; with Kavya Pearlman (XRSI)</itunes:title>
    <title>S1E1: &quot;Guardians of the Metaverse&quot; with Kavya Pearlman (XRSI)</title>
    <itunes:summary><![CDATA[(Transcription)  Welcome to the first episode of Shifting Privacy Left. To kick off the show, I’m joined by Kavya Pearlman, Exec Director of The eXtended Reality Safety Initiative (XRSI) to discuss current challenges associated with extended reality (XR), the XRSI Privacy &amp; Safety Framework, and the importance of embedding privacy into today’s technology.  --------- Thank you to our sponsor, Privado, the developer friendly privacy platform --------- In our conversation, Kavya de...]]></itunes:summary>
    <description><![CDATA[<p>(<a href='https://www.buzzsprout.com/2059470/episodes/11543756'>Transcription</a>)<br/><br/>Welcome to the first episode of Shifting Privacy Left. To kick off the show, I’m joined by <a href='https://www.linkedin.com/in/kavya-pearlman/'>Kavya Pearlman</a>, Exec Director of <a href='https://xrsi.org/'>The eXtended Reality Safety Initiative (XRSI)</a> to discuss current challenges associated with <a href='https://xrsi.org/definition/extended-reality-xr'>extended reality (XR)</a>, the XRSI Privacy &amp; Safety Framework, and the importance of embedding privacy into today’s technology.<br/><br/><b>---------<br/></b><b>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer friendly privacy platform<br/>---------</b></p><p>In our conversation, Kavya describes her vision for bridging the gap between government &amp; technologists. While consulting for Facebook back in 2016, she witnessed firsthand the impacts on society when technology risks are ignored or misunderstood. As XR technology develops, there’s a dire need for human-centered safeguarding and designing for privacy &amp; ethics. <br/><br/>We also discuss what it’s been like to create standards while the XR industry is still evolving, and why it’s crucial to influence standards at the foundational code-level. Kavya also shares her advice for builders of immersive products (developers, architects, designers, engineers, etc.) and what she urges regulators to consider when making laws for web3 tech. 
</p><p><br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.</b></p><p><br/><b>Topics Covered:</b></p><ul><li>The story behind XRSI, its mission &amp; overview of key programs.</li><li>The differences between the &quot;XR&quot; &amp; &quot;metaverse.&quot;</li><li>XRSI&apos;s definitions for new subsets of &quot;<a href='https://xrsi.org/definition/personal-data'>personal data</a>&quot; w/in immersive experiences: biometrically-inferred data &amp; psychographically-inferred data.</li><li>Safety, privacy &amp; ethical implications of XR data collection &amp; use. </li><li>Kavya explains the importance of the human in the loop.</li></ul><p><b>Check out XRSI:</b></p><ul><li><a href='https://xrsi.org/publication/the-xrsi-privacy-framework'>XRSI Privacy &amp; Safety Framework - XRSI PSF1</a></li><li><a href='https://metaversesafetyweek.org/'>Metaverse Safety Week</a></li></ul><p><b>Guest Info (Kavya Pearlman):</b></p><ul><li>Follow on Twitter: <a href='https://twitter.com/KavyaPearlman'>@KavyaPearlman</a></li><li>Connect on LinkedIn: <a href='https://www.linkedin.com/in/kavya-pearlman/'>Kavya Pearlman</a></li><li>Email Kavya: <a href='mailto:kavya@xrsi.org'>kavya@xrsi.org</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p> <p><br></p><p><br></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. 
Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></description>
    <content:encoded><![CDATA[<p>(<a href='https://www.buzzsprout.com/2059470/episodes/11543756'>Transcription</a>)<br/><br/>Welcome to the first episode of Shifting Privacy Left. To kick off the show, I’m joined by <a href='https://www.linkedin.com/in/kavya-pearlman/'>Kavya Pearlman</a>, Executive Director of <a href='https://xrsi.org/'>The eXtended Reality Safety Initiative (XRSI)</a>, to discuss current challenges associated with <a href='https://xrsi.org/definition/extended-reality-xr'>extended reality (XR)</a>, the XRSI Privacy &amp; Safety Framework, and the importance of embedding privacy into today’s technology.<br/><br/><b>---------<br/>Thank you to our sponsor, </b><a href='https://www.privado.ai/'><b>Privado</b></a><b>, the developer-friendly privacy platform<br/>---------</b></p><p>In our conversation, Kavya describes her vision for bridging the gap between government &amp; technologists. While consulting for Facebook back in 2016, she witnessed first-hand the impacts on society when technology risks are ignored or misunderstood. As XR technology develops, there’s a dire need for human-centered safeguarding and for designing with privacy &amp; ethics in mind.<br/><br/>We also discuss what it’s been like to create standards while the XR industry is still evolving, and why it’s crucial to influence standards at the foundational code level. Kavya also shares her advice for builders of immersive products (developers, architects, designers, engineers, etc.) and what she urges regulators to consider when making laws for web3 tech.</p><p><br/><b>Listen to the episode on </b><a href='https://podcasts.apple.com/us/podcast/the-shifting-privacy-left-podcast/id1651019312'><b>Apple Podcasts</b></a><b>, </b><a href='https://open.spotify.com/show/4bz3M4Bo0tigNrBBG3X7Im?si=fd03c98c67314d9e'><b>Spotify</b></a><b>, </b><a href='https://www.iheart.com/podcast/269-the-shifting-privacy-left-103595150/'><b>iHeartRadio</b></a><b>, or on your favorite podcast platform.</b></p><p><br/><b>Topics Covered:</b></p><ul><li>The story behind XRSI, its mission &amp; an overview of key programs.</li><li>The differences between &quot;XR&quot; &amp; the &quot;metaverse.&quot;</li><li>XRSI&apos;s definitions for new subsets of &quot;<a href='https://xrsi.org/definition/personal-data'>personal data</a>&quot; w/in immersive experiences: biometrically-inferred data &amp; psychographically-inferred data.</li><li>Safety, privacy &amp; ethical implications of XR data collection &amp; use.</li><li>Kavya explains the importance of keeping a human in the loop.</li></ul><p><b>Check out XRSI:</b></p><ul><li><a href='https://xrsi.org/publication/the-xrsi-privacy-framework'>XRSI Privacy &amp; Safety Framework - XRSI PSF1</a></li><li><a href='https://metaversesafetyweek.org/'>Metaverse Safety Week</a></li></ul><p><b>Guest Info (Kavya Pearlman):</b></p><ul><li>Follow on Twitter: <a href='https://twitter.com/KavyaPearlman'>@KavyaPearlman</a></li><li>Connect on LinkedIn: <a href='https://www.linkedin.com/in/kavya-pearlman/'>Kavya Pearlman</a></li><li>Email Kavya: <a href='mailto:kavya@xrsi.org'>kavya@xrsi.org</a></li></ul><p><a target="_blank" href="https://www.buzzsprout.com/twilio/text_messages/2059470/open_sms">Send a text</a></p><a target="_blank" href="https://privado.ai">Privado.ai</a><br>Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.<br><br><a target="_blank" href="https://shiftingprivacyleft.com">Shifting Privacy Left Media</a><br>Where privacy engineers gather, share, &amp; learn<br><br><a target="_blank" href="https://www.buzzsprout.com/?referrer_id=2041385">Buzzsprout - Launch your podcast</a><br><br><br>Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.<br><br><p>Copyright © 2022 - 2024 Principled LLC. All rights reserved.</p>]]></content:encoded>
    <enclosure url="https://www.buzzsprout.com/2059470/episodes/11543756-s1e1-guardians-of-the-metaverse-with-kavya-pearlman-xrsi.mp3" length="40880066" type="audio/mpeg" />
    <link>https://shiftingprivacyleft.com/audio/8323/269795</link>
    <itunes:image href="https://storage.buzzsprout.com/9l98c3y2wgkrigj1y2zg9wn4mrnx?.jpg" />
    <itunes:author>Debra J Farber / Kavya Pearlman</itunes:author>
    <guid isPermaLink="false">Buzzsprout-11543756</guid>
    <pubDate>Mon, 24 Oct 2022 21:00:00 -0700</pubDate>
    <podcast:transcript url="https://www.buzzsprout.com/2059470/11543756/transcript" type="text/html" />
    <podcast:soundbite startTime="1510.35" duration="49.0" />
    <itunes:duration>3403</itunes:duration>
    <itunes:keywords>metaverse, ethics, privacy, safety, XR, virtual reality, augmented reality, developers</itunes:keywords>
    <itunes:season>1</itunes:season>
    <itunes:episode>1</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <itunes:explicit>false</itunes:explicit>
  </item>
</channel>
</rss>
