<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:podcast="https://podcastindex.org/namespace/1.0"
    xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"
    xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"
    xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:spotify="http://www.spotify.com/ns/rss">
    <channel>
        <title>Cookies: Tech Security &amp; Privacy</title>
        <generator>Castos</generator>
        <atom:link href="https://feeds.castos.com/dokq" rel="self" type="application/rss+xml" />
        <link>https://cookies.castos.com</link>
        <description>Technology has transformed our lives, but there are hidden tradeoffs we make as we take advantage of these new tools. Cookies, as you know, can be a tasty snack -- but they can also be something that takes your data. This podcast is presented by the Princeton University School of Engineering and Applied Science.</description>
        <lastBuildDate>Tue, 23 Nov 2021 19:15:00 +0000</lastBuildDate>
        <language>en</language>
        <copyright>© 2020</copyright>
        
        <spotify:limit recentCount="8" />
        <spotify:countryOfOrigin>US CA GB</spotify:countryOfOrigin>
        <image>
            <url>https://episodes.castos.com/5f08829b8ac454-19621136/images/image001.jpg</url>
            <title>Cookies: Tech Security &amp; Privacy</title>
            <link>https://cookies.castos.com</link>
        </image>
        <itunes:subtitle>Technology has transformed our lives, but there are hidden tradeoffs we make as we take advantage of these new tools. Cookies, as you know, can be a tasty snack -- but they can also be something that takes your data. This podcast is presented by the Princeton University School of Engineering and Applied Science.</itunes:subtitle>
        <itunes:author>Princeton University School of Engineering and Applied Science</itunes:author>
        <itunes:type>episodic</itunes:type>
        <itunes:summary>Technology has transformed our lives, but there are hidden tradeoffs we make as we take advantage of these new tools. Cookies, as you know, can be a tasty snack -- but they can also be something that takes your data. This podcast is presented by the Princeton University School of Engineering and Applied Science.</itunes:summary>
        <itunes:owner>
            <itunes:name>Aaron Nathans</itunes:name>
            <itunes:email>anathans@princeton.edu</itunes:email>
        </itunes:owner>
        <itunes:explicit>false</itunes:explicit>
        <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/image001.jpg" />
        <itunes:category text="Technology" />
        <itunes:category text="News">
            <itunes:category text="Tech News" />
        </itunes:category>
        <itunes:new-feed-url>https://feeds.castos.com/dokq</itunes:new-feed-url>
        <podcast:locked>yes</podcast:locked>
        <item>
            <title><![CDATA[Those Pesky Privacy Policies: Lorrie Cranor, Carnegie Mellon University]]></title>
            <pubDate>Tue, 23 Nov 2021 19:15:00 +0000</pubDate>
            <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
            <guid isPermaLink="true">https://cookies.castos.com/podcasts/11517/episodes/those-pesky-privacy-policies-lorrie-cranor-carnegie-mellon-university-1</guid>
            <link>https://cookies.castos.com/episodes/those-pesky-privacy-policies-lorrie-cranor-carnegie-mellon-university-1</link>
            <description><![CDATA[<p>Does anyone actually read privacy policies? What's in them, and why can't we usually understand them? On our second season finale, we’ll talk with Professor Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University. The lab brings together more than 100 faculty from across campus to study security and privacy and help shape public policy in those areas. One of her specialties is how humans interact with security and privacy technologies, to make sure the mechanisms we build are not just secure in theory, but are actually things that we can use. Her TED Talk about password security has been viewed more than 1.5 million times. But today, we’ll talk about another pesky aspect of our digital lives – privacy policies, those mysterious terms and conditions we sign off on – often without reading them – before we can use an app on our smartphone or laptop.</p>]]></description>
            <itunes:subtitle><![CDATA[Does anyone actually read privacy policies? What's in them, and why can't we usually understand them? On our second season finale, we’ll talk with Professor Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University. The lab brings together more than 100 faculty from across campus to study security and privacy and help shape public policy in those areas. One of her specialties is how humans interact with security and privacy technologies, to make sure the mechanisms we build are not just secure in theory, but are actually things that we can use. Her TED Talk about password security has been viewed more than 1.5 million times. But today, we’ll talk about another pesky aspect of our digital lives – privacy policies, those mysterious terms and conditions we sign off on – often without reading them – before we can use an app on our smartphone or laptop.]]></itunes:subtitle>
            <itunes:episodeType>full</itunes:episodeType>
            <itunes:title><![CDATA[Those Pesky Privacy Policies: Lorrie Cranor, Carnegie Mellon University]]></itunes:title>
            <itunes:episode>8</itunes:episode>
            <itunes:season>2</itunes:season>
            <itunes:explicit>false</itunes:explicit>
            <content:encoded><![CDATA[<p>Does anyone actually read privacy policies? What's in them, and why can't we usually understand them? On our second season finale, we’ll talk with Professor Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University. The lab brings together more than 100 faculty from across campus to study security and privacy and help shape public policy in those areas. One of her specialties is how humans interact with security and privacy technologies, to make sure the mechanisms we build are not just secure in theory, but are actually things that we can use. Her TED Talk about password security has been viewed more than 1.5 million times. But today, we’ll talk about another pesky aspect of our digital lives – privacy policies, those mysterious terms and conditions we sign off on – often without reading them – before we can use an app on our smartphone or laptop.</p>]]></content:encoded>
            <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517/63a2a98a-3b7b-4431-9ed5-976d97547f89/Cookies-with-Lorrie-Cranor.mp3" length="18787762" type="audio/mpeg" />
            <itunes:summary><![CDATA[Does anyone actually read privacy policies? What's in them, and why can't we usually understand them? On our second season finale, we’ll talk with Professor Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University. The lab brings together more than 100 faculty from across campus to study security and privacy and help shape public policy in those areas. One of her specialties is how humans interact with security and privacy technologies, to make sure the mechanisms we build are not just secure in theory, but are actually things that we can use. Her TED Talk about password security has been viewed more than 1.5 million times. But today, we’ll talk about another pesky aspect of our digital lives – privacy policies, those mysterious terms and conditions we sign off on – often without reading them – before we can use an app on our smartphone or laptop.]]></itunes:summary>
            <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/cookies2-8-square.jpg" />
            <itunes:duration>00:26:04</itunes:duration>
            <itunes:author><![CDATA[Princeton University School of Engineering and Applied Science]]></itunes:author>
        </item>
        <item>
            <title><![CDATA[When It’s Best to Let A.I. Go Unused: Annette Zimmermann, Harvard University and University of York]]></title>
            <pubDate>Wed, 17 Nov 2021 10:55:00 +0000</pubDate>
            <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
            <guid isPermaLink="true">https://cookies.castos.com/podcasts/11517/episodes/when-its-best-to-let-ai-go-unused-annette-zimmermann-harvard-university-and-university-of-york</guid>
            <link>https://cookies.castos.com/episodes/when-its-best-to-let-ai-go-unused-annette-zimmermann-harvard-university-and-university-of-york</link>
            <description><![CDATA[<p>Annette Zimmermann makes the provocative argument that there are times it might be better to take cutting-edge artificial intelligence tools and leave them unused. Annette is a political philosopher working on the ethics of artificial intelligence and machine learning. She’s a technology and human rights fellow at the Carr Center for Human Rights Policy at Harvard University, and an assistant professor in philosophy at the University of York in the United Kingdom. Annette was previously a postdoc at Princeton’s Center for Information Technology Policy as well as at Princeton's University Center for Human Values.</p>]]></description>
                <itunes:subtitle>
                    <![CDATA[Annette Zimmermann makes the provocative argument that there are times it might be better to take cutting-edge artificial intelligence tools and leave them unused. Annette is a political philosopher working on the ethics of artificial intelligence and machine learning. She’s a technology and human rights fellow at the Carr Center for Human Rights Policy at Harvard University, and an assistant professor in philosophy at the University of York in the United Kingdom. Annette was previously a postdoc at Princeton’s Center for Information Technology Policy as well as at Princeton's University Center for Human Values. ]]>
                </itunes:subtitle>
            <itunes:episodeType>full</itunes:episodeType>
            <itunes:title><![CDATA[When It’s Best to Let A.I. Go Unused: Annette Zimmermann, Harvard University and University of York]]></itunes:title>
            <itunes:episode>7</itunes:episode>
            <itunes:season>2</itunes:season>
            <itunes:explicit>false</itunes:explicit>
            <content:encoded><![CDATA[<p>Annette Zimmermann makes the provocative argument that there are times it might be better to take cutting-edge artificial intelligence tools and leave them unused. Annette is a political philosopher working on the ethics of artificial intelligence and machine learning. She’s a technology and human rights fellow at the Carr Center for Human Rights Policy at Harvard University, and an assistant professor in philosophy at the University of York in the United Kingdom. Annette was previously a postdoc at Princeton’s Center for Information Technology Policy as well as at Princeton's University Center for Human Values.</p>]]></content:encoded>
            <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2F1d1304b4-4be5-4d7b-8d8e-fdb6fe010cdb%2FCookies-with-Annette-Zimmermann.mp3" length="35112629" type="audio/mpeg" />
            <itunes:summary><![CDATA[Annette Zimmermann makes the provocative argument that there are times it might be better to take cutting-edge artificial intelligence tools and leave them unused. Annette is a political philosopher working on the ethics of artificial intelligence and machine learning. She’s a technology and human rights fellow at the Carr Center for Human Rights Policy at Harvard University, and an assistant professor in philosophy at the University of York in the United Kingdom. Annette was previously a postdoc at Princeton’s Center for Information Technology Policy as well as at Princeton's University Center for Human Values.]]></itunes:summary>
            <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/episode7s2square.jpg" />
            <itunes:duration>00:48:44</itunes:duration>
            <itunes:author><![CDATA[Princeton University School of Engineering and Applied Science]]></itunes:author>
        </item>
        <item>
            <title><![CDATA[How to Fend Off a SIM-card Attack on Your Cell Phone: Kevin Lee, Princeton University]]></title>
            <pubDate>Tue, 09 Nov 2021 06:04:00 +0000</pubDate>
            <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
            <guid isPermaLink="true">https://cookies.castos.com/podcasts/11517/episodes/how-to-fend-off-a-sim-card-attack-on-your-cell-phone-kevin-lee-princeton-university</guid>
            <link>https://cookies.castos.com/episodes/how-to-fend-off-a-sim-card-attack-on-your-cell-phone-kevin-lee-princeton-university</link>
            <description><![CDATA[<p>Kevin Lee recently co-wrote a fascinating study about how easy it is for an attacker to gain control of another person’s cell phone. From there, the attacker can use the phone’s multi-factor authentication tool – usually a security code provided over a text message – to do all kinds of damage, including making unauthorized purchases. As part of the study, his research team managed to fool five wireless carriers, including Verizon Wireless, AT&amp;T and T-Mobile, into moving a customer’s account to a different phone’s SIM card without their permission. He’s a doctoral student in computer science at Princeton, affiliated with the Center for Information Technology Policy.</p>]]></description>
                <itunes:subtitle>
                    <![CDATA[Kevin Lee recently co-wrote a fascinating study about how easy it is for an attacker to gain control of another person’s cell phone. From there, the attacker can use the phone’s multi-factor authentication tool – usually a security code provided over a text message -- to do all kinds of damage, including making unauthorized purchases. As part of the study, his research team managed to fool five wireless carriers, including Verizon Wireless, AT&T and T-Mobile, into moving a customer’s account to a different phone’s SIM card without their permission. He’s a doctoral student in computer science at Princeton, affiliated with the Center for Information Technology Policy. ]]>
                </itunes:subtitle>
            <itunes:episodeType>full</itunes:episodeType>
            <itunes:title><![CDATA[How to Fend Off a SIM-card Attack on Your Cell Phone: Kevin Lee, Princeton University]]></itunes:title>
            <itunes:episode>6</itunes:episode>
            <itunes:season>2</itunes:season>
            <itunes:explicit>false</itunes:explicit>
            <content:encoded><![CDATA[<p>Kevin Lee recently co-wrote a fascinating study about how easy it is for an attacker to gain control of another person’s cell phone. From there, the attacker can use the phone’s multi-factor authentication tool – usually a security code provided over a text message – to do all kinds of damage, including making unauthorized purchases. As part of the study, his research team managed to fool five wireless carriers, including Verizon Wireless, AT&amp;T and T-Mobile, into moving a customer’s account to a different phone’s SIM card without their permission. He’s a doctoral student in computer science at Princeton, affiliated with the Center for Information Technology Policy.</p>]]></content:encoded>
            <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2F141135d8-929a-4fee-9d6a-81f028f559ff%2FCookies-with-Kevin-Lee.mp3" length="24104424" type="audio/mpeg" />
            <itunes:summary><![CDATA[Kevin Lee recently co-wrote a fascinating study about how easy it is for an attacker to gain control of another person’s cell phone. From there, the attacker can use the phone’s multi-factor authentication tool – usually a security code provided over a text message – to do all kinds of damage, including making unauthorized purchases. As part of the study, his research team managed to fool five wireless carriers, including Verizon Wireless, AT&T and T-Mobile, into moving a customer’s account to a different phone’s SIM card without their permission. He’s a doctoral student in computer science at Princeton, affiliated with the Center for Information Technology Policy.]]></itunes:summary>
            <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/cookies2-6.jpg" />
            <itunes:duration>00:33:27</itunes:duration>
            <itunes:author><![CDATA[Princeton University School of Engineering and Applied Science]]></itunes:author>
        </item>
        <item>
            <title><![CDATA[The Security Flaws of Online Learning: Mihir Kshirsagar, Princeton University]]></title>
            <pubDate>Wed, 03 Nov 2021 12:15:00 +0000</pubDate>
            <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
            <guid isPermaLink="true">https://cookies.castos.com/podcasts/11517/episodes/the-security-flaws-of-online-learning-mihir-kshirsagar-princeton-university</guid>
            <link>https://cookies.castos.com/episodes/the-security-flaws-of-online-learning-mihir-kshirsagar-princeton-university</link>
            <description><![CDATA[<p>Are online learning platforms really secure? Mihir <a href="https://scholar.princeton.edu/kshirsagar" target="_blank" rel="noreferrer noopener">Kshirsagar</a> co-wrote a paper that spells out in startling detail everything you’ve wondered about – but didn’t want to know – about how online platforms are allowing students to have their personal data exploited as the students use them for online learning. And he discusses the one mistake instructors often make that could compromise the security of their students' data. He has served at the New York Attorney General’s Bureau of Internet and Technology as the lead trial counsel on matters of consumer protection law and technology.</p>]]></description>
                <itunes:subtitle>
                    <![CDATA[Are online learning platforms really secure? Mihir Kshirsagar co-wrote a paper that spells out in startling detail everything you’ve wondered about -- but didn’t want to know -- about how online platforms are allowing students to have their personal data exploited as the students use them for online learning. And he discusses the one mistake instructors often make that could compromise the security of their students' data. He has served at the New York Attorney General’s Bureau of Internet and Technology as the lead trial counsel on matters of consumer protection law and technology. ]]>
                </itunes:subtitle>
            <itunes:episodeType>full</itunes:episodeType>
            <itunes:title><![CDATA[The Security Flaws of Online Learning: Mihir Kshirsagar, Princeton University]]></itunes:title>
            <itunes:episode>5</itunes:episode>
            <itunes:season>2</itunes:season>
            <itunes:explicit>false</itunes:explicit>
            <content:encoded><![CDATA[<p>Are online learning platforms really secure? Mihir <a href="https://scholar.princeton.edu/kshirsagar" target="_blank" rel="noreferrer noopener">Kshirsagar</a> co-wrote a paper that spells out in startling detail everything you’ve wondered about – but didn’t want to know – about how online platforms are allowing students to have their personal data exploited as the students use them for online learning. And he discusses the one mistake instructors often make that could compromise the security of their students' data. He has served at the New York Attorney General’s Bureau of Internet and Technology as the lead trial counsel on matters of consumer protection law and technology.</p>]]></content:encoded>
            <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2Fe1896ebb-5927-4de9-b7b0-35075681957e%2FCookies-with-Mihir-Kshirsagar.mp3" length="24249468" type="audio/mpeg" />
            <itunes:summary><![CDATA[Are online learning platforms really secure? Mihir Kshirsagar co-wrote a paper that spells out in startling detail everything you’ve wondered about – but didn’t want to know – about how online platforms are allowing students to have their personal data exploited as the students use them for online learning. And he discusses the one mistake instructors often make that could compromise the security of their students' data. He has served at the New York Attorney General’s Bureau of Internet and Technology as the lead trial counsel on matters of consumer protection law and technology.]]></itunes:summary>
            <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/cookies2-5-square.jpg" />
            <itunes:duration>00:33:39</itunes:duration>
            <itunes:author><![CDATA[Princeton University School of Engineering and Applied Science]]></itunes:author>
        </item>
        <item>
            <title><![CDATA[A few easy tips to up your privacy game: David Sherry, Princeton University]]></title>
            <pubDate>Tue, 26 Oct 2021 09:49:00 +0000</pubDate>
            <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
            <guid isPermaLink="true">https://cookies.castos.com/podcasts/11517/episodes/a-few-easy-tips-to-up-your-privacy-game-david-sherry-princeton-university</guid>
            <link>https://cookies.castos.com/episodes/a-few-easy-tips-to-up-your-privacy-game-david-sherry-princeton-university</link>
            <description><![CDATA[<p>How can you improve your privacy in your everyday use of web browsers, email, text messaging and other apps? Our guest is David Sherry, the chief information security officer here at Princeton. He’s responsible for shoring up security at this Ivy League campus of more than 15,000 people. He has 20 years of experience in information security management. He can – and often does – speak publicly about how he manages to herd all those cats to make Princeton safer for technology. But today, he’s agreed to provide tips that anyone can use to improve their privacy in their own digital lives.</p>]]></description>
            <itunes:subtitle><![CDATA[How can you improve your privacy in your everyday use of web browsers, email, text messaging and other apps? Our guest is David Sherry, the chief information security officer here at Princeton. He’s responsible for shoring up security at this Ivy League campus of more than 15,000 people. He has 20 years of experience in information security management. He can – and often does – speak publicly about how he manages to herd all those cats to make Princeton safer for technology. But today, he’s agreed to provide tips that anyone can use to improve their privacy in their own digital lives.]]></itunes:subtitle>
            <itunes:episodeType>full</itunes:episodeType>
            <itunes:title><![CDATA[A few easy tips to up your privacy game: David Sherry, Princeton University]]></itunes:title>
            <itunes:episode>4</itunes:episode>
            <itunes:season>2</itunes:season>
            <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[<p>How can you improve your privacy in your everyday use of web browsers, email, text messaging and other apps? Our guest is David Sherry, the chief information security officer here at Princeton. He’s responsible for shoring up security at this Ivy League campus of more than 15,000 people. He has 20 years of experience in information security management. He can -- and often does -- speak publicly about how he manages to herd all those cats to make Princeton safer for technology. But today, he’s agreed to provide tips that anyone can use to improve their privacy in their own digital lives.</p>]]>
                </content:encoded>
                                    <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2F54bdd2ab-686f-446d-8446-90cf284f9e45%2FCookies-with-David-Sherry.mp3" length="27290486"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[How can you improve your privacy in your everyday use of web browsers, email, text messaging and other apps? Our guest is David Sherry, the chief information security officer here at Princeton. He’s responsible for shoring up security at this Ivy League campus of more than 15,000 people. He has 20 years of experience in information security management. He can -- and often does -- speak publicly about how he manages to herd all those cats to make Princeton safer for technology. But today, he’s agreed to provide tips that anyone can use to improve their privacy in their own digital lives.]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/episode4s2.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:52</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Princeton University School of Engineering and Applied Science]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[How Search Engines Show Their Bias: Orestis Papakyriakopoulos, Princeton University, and Arwa Michelle Mboya, MIT]]>
                </title>
                <pubDate>Wed, 20 Oct 2021 04:50:00 +0000</pubDate>
                <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
                <guid isPermaLink="true">
                    https://cookies.castos.com/podcasts/11517/episodes/how-search-engines-show-their-bias-orestis-papakyriakopoulos-princeton-university-and-arwa-michelle-mboya-mit</guid>
                                    <link>https://cookies.castos.com/episodes/how-search-engines-show-their-bias-orestis-papakyriakopoulos-princeton-university-and-arwa-michelle-mboya-mit</link>
                                <description>
                                            <![CDATA[<p>Today’s guests have written a study about the Google Search engine, and the subtle – and not-so-subtle – ways in which it shows its bias, and in many ways perpetuates tired old stereotypes. Orestis Papakyriakopoulos is a postdoctoral research associate at Princeton’s Center for Information Technology Policy. His research showcases political issues and provides ideas, frameworks, and practical solutions towards just, inclusive and participatory algorithms. Arwa Michelle Mboya is a research assistant at the MIT Media Lab. She is a virtual reality programmer and researcher who investigates the socio-economic effects of enhanced imagination.</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[Today’s guests have written a study about the Google Search engine, and the subtle – and not-so-subtle – ways in which it shows its bias, and in many ways perpetuates tired old stereotypes. Orestis Papakyriakopoulos is a postdoctoral research associate at Princeton’s Center for Information Technology Policy. His research showcases political issues and provides ideas, frameworks, and practical solutions towards just, inclusive and participatory algorithms. Arwa Michelle Mboya is a research assistant at the MIT Media Lab. She is a virtual reality programmer and researcher who investigates the socio-economic effects of enhanced imagination.  ]]>
                </itunes:subtitle>
                                    <itunes:episodeType>full</itunes:episodeType>
                                <itunes:title>
                    <![CDATA[How Search Engines Show Their Bias: Orestis Papakyriakopoulos, Princeton University, and Arwa Michelle Mboya, MIT]]>
                </itunes:title>
                                    <itunes:episode>3</itunes:episode>
                                                    <itunes:season>2</itunes:season>
                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[<p>Today’s guests have written a study about the Google Search engine, and the subtle – and not-so-subtle – ways in which it shows its bias, and in many ways perpetuates tired old stereotypes. Orestis Papakyriakopoulos is a postdoctoral research associate at Princeton’s Center for Information Technology Policy. His research showcases political issues and provides ideas, frameworks, and practical solutions towards just, inclusive and participatory algorithms. Arwa Michelle Mboya is a research assistant at the MIT Media Lab. She is a virtual reality programmer and researcher who investigates the socio-economic effects of enhanced imagination.</p>]]>
                </content:encoded>
                                    <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2F205f6f71-588f-4870-89e0-addfb73ecd7b%2FCookies-with-Orestis-Papakyriakopoulos-Arwa-Michelle-Mboya-.mp3" length="28300559"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[Today’s guests have written a study about the Google Search engine, and the subtle – and not-so-subtle – ways in which it shows its bias, and in many ways perpetuates tired old stereotypes. Orestis Papakyriakopoulos is a postdoctoral research associate at Princeton’s Center for Information Technology Policy. His research showcases political issues and provides ideas, frameworks, and practical solutions towards just, inclusive and participatory algorithms. Arwa Michelle Mboya is a research assistant at the MIT Media Lab. She is a virtual reality programmer and researcher who investigates the socio-economic effects of enhanced imagination.  ]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/Season-2-Episode-03-Podcast-Cover.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:16</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Princeton University School of Engineering and Applied Science]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Tech Consumers Should Demand Better Security: Ruby Lee, Princeton Uni]]>
                </title>
                <pubDate>Tue, 12 Oct 2021 16:00:00 +0000</pubDate>
                <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
                <guid isPermaLink="true">
                    https://cookies.castos.com/podcasts/11517/episodes/tech-consumers-should-demand-better-security-ruby-lee-princeton-uni</guid>
                                    <link>https://cookies.castos.com/episodes/tech-consumers-should-demand-better-security-ruby-lee-princeton-uni</link>
                                <description>
                                            <![CDATA[<p>As a chief computer architect at Hewlett-Packard in the 1980s, Ruby Lee was a leader in changing the way computers are built, simplifying their core instructions so they could do more. And she revolutionized the way computers use multimedia. If you’ve watched a video or streamed music on your computer or smart phone, Ruby had a lot to do with making that possible. In more recent years here at Princeton, her research has focused on security in computer architecture without sacrificing performance, which is what we’ll talk about today. And she’ll discuss why, even though it’s possible to build more secure devices, the marketplace doesn’t demand it. Ruby Lee is the Forest G. Hamrick Professor in Engineering, and Professor of Electrical and Computer Engineering.</p>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[As a chief computer architect at Hewlett-Packard in the 1980s, Ruby Lee was a leader in changing the way computers are built, simplifying their core instructions so they could do more. And she revolutionized the way computers use multimedia. If you’ve watched a video or streamed music on your computer or smart phone, Ruby had a lot to do with making that possible. In more recent years here at Princeton, her research has focused on security in computer architecture without sacrificing performance, which is what we’ll talk about today. And she’ll discuss why, even though it’s possible to build more secure devices, the marketplace doesn’t demand it. Ruby Lee is the Forest G. Hamrick Professor in Engineering, and Professor of Electrical and Computer Engineering.
 ]]>
                </itunes:subtitle>
                                    <itunes:episodeType>full</itunes:episodeType>
                                <itunes:title>
                    <![CDATA[Tech Consumers Should Demand Better Security: Ruby Lee, Princeton Uni]]>
                </itunes:title>
                                    <itunes:episode>2</itunes:episode>
                                                    <itunes:season>2</itunes:season>
                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[<p>As a chief computer architect at Hewlett-Packard in the 1980s, Ruby Lee was a leader in changing the way computers are built, simplifying their core instructions so they could do more. And she revolutionized the way computers use multimedia. If you’ve watched a video or streamed music on your computer or smart phone, Ruby had a lot to do with making that possible. In more recent years here at Princeton, her research has focused on security in computer architecture without sacrificing performance, which is what we’ll talk about today. And she’ll discuss why, even though it’s possible to build more secure devices, the marketplace doesn’t demand it. Ruby Lee is the Forest G. Hamrick Professor in Engineering, and Professor of Electrical and Computer Engineering.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2F0aeb4900-46e0-4600-b96f-37e006596bdd%2FCookies-with-Ruby-Lee-1.mp3" length="35537491"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[As a chief computer architect at Hewlett-Packard in the 1980s, Ruby Lee was a leader in changing the way computers are built, simplifying their core instructions so they could do more. And she revolutionized the way computers use multimedia. If you’ve watched a video or streamed music on your computer or smart phone, Ruby had a lot to do with making that possible. In more recent years here at Princeton, her research has focused on security in computer architecture without sacrificing performance, which is what we’ll talk about today. And she’ll discuss why, even though it’s possible to build more secure devices, the marketplace doesn’t demand it. Ruby Lee is the Forest G. Hamrick Professor in Engineering, and Professor of Electrical and Computer Engineering.
 ]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/image002.jpg"></itunes:image>
                                                                            <itunes:duration>00:49:20</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Princeton University School of Engineering and Applied Science]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Barton Gellman Deletes His Account]]>
                </title>
                <pubDate>Wed, 06 Oct 2021 10:00:00 +0000</pubDate>
                <dc:creator>Princeton University School of Engineering and Applied Science</dc:creator>
                <guid isPermaLink="true">
                    https://cookies.castos.com/podcasts/11517/episodes/barton-gellman-deletes-his-account</guid>
                                    <link>https://cookies.castos.com/episodes/barton-gellman-deletes-his-account</link>
                                <description>
                                            <![CDATA[<p>To kick off our second season, we’re honored to welcome Barton Gellman, Princeton Class of 1982. Bart has won multiple Pulitzer Prizes, including for his groundbreaking work with The Washington Post in 2013 to reveal widespread surveillance by the National Security Agency. The stories showed that even though they weren’t the targets, law-abiding American citizens could still find their private email, social media content, and online activity swept up by our national security apparatus. Privacy has long been a passion of Gellman’s, and today we’ll ask him for tips we can use to make our own digital lives more private, from email to text messaging to apps and the cloud. He talks about tradeoffs he’s willing to make to be a full participant in the digital revolution, as well as one popular service he distrusts so much, he vows to delete his account entirely. And we’ll as talk about his book, “Dark Mirror: Edward Snowden and the American Surveillance State.” Bart Gellman was a visiting fellow at Princeton’s Center for Information Technology Policy a few years back.</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[To kick off our second season, we’re honored to welcome Barton Gellman, Princeton Class of 1982. Bart has won multiple Pulitzer Prizes, including for his groundbreaking work with The Washington Post in 2013 to reveal widespread surveillance by the National Security Agency. The stories showed that even though they weren’t the targets, law-abiding American citizens could still find their private email, social media content, and online activity swept up by our national security apparatus. Privacy has long been a passion of Gellman’s, and today we’ll ask him for tips we can use to make our own digital lives more private, from email to text messaging to apps and the cloud. He talks about tradeoffs he’s willing to make to be a full participant in the digital revolution, as well as one popular service he distrusts so much, he vows to delete his account entirely. And we’ll also talk about his book, “Dark Mirror: Edward Snowden and the American Surveillance State.” Bart Gellman was a visiting fellow at Princeton’s Center for Information Technology Policy a few years back.]]>
                </itunes:subtitle>
                                    <itunes:episodeType>full</itunes:episodeType>
                                <itunes:title>
                    <![CDATA[Barton Gellman Deletes His Account]]>
                </itunes:title>
                                    <itunes:episode>1</itunes:episode>
                                                    <itunes:season>2</itunes:season>
                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[<p>To kick off our second season, we’re honored to welcome Barton Gellman, Princeton Class of 1982. Bart has won multiple Pulitzer Prizes, including for his groundbreaking work with The Washington Post in 2013 to reveal widespread surveillance by the National Security Agency. The stories showed that even though they weren’t the targets, law-abiding American citizens could still find their private email, social media content, and online activity swept up by our national security apparatus. Privacy has long been a passion of Gellman’s, and today we’ll ask him for tips we can use to make our own digital lives more private, from email to text messaging to apps and the cloud. He talks about tradeoffs he’s willing to make to be a full participant in the digital revolution, as well as one popular service he distrusts so much, he vows to delete his account entirely. And we’ll also talk about his book, “Dark Mirror: Edward Snowden and the American Surveillance State.” Bart Gellman was a visiting fellow at Princeton’s Center for Information Technology Policy a few years back.</p>]]>
                </content:encoded>
                                    <enclosure url="https://episodes.castos.com/5f08829b8ac454-19621136/11517%2F29fb379b-e251-4ff2-a8e6-cd101e685cb6%2FCookies-with-Bart-Gellman-v2.mp3" length="37656139"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[To kick off our second season, we’re honored to welcome Barton Gellman, Princeton Class of 1982. Bart has won multiple Pulitzer Prizes, including for his groundbreaking work with The Washington Post in 2013 to reveal widespread surveillance by the National Security Agency. The stories showed that even though they weren’t the targets, law-abiding American citizens could still find their private email, social media content, and online activity swept up by our national security apparatus. Privacy has long been a passion of Gellman’s, and today we’ll ask him for tips we can use to make our own digital lives more private, from email to text messaging to apps and the cloud. He talks about tradeoffs he’s willing to make to be a full participant in the digital revolution, as well as one popular service he distrusts so much, he vows to delete his account entirely. And we’ll also talk about his book, “Dark Mirror: Edward Snowden and the American Surveillance State.” Bart Gellman was a visiting fellow at Princeton’s Center for Information Technology Policy a few years back.]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/5f08829b8ac454-19621136/images/image003.jpg"></itunes:image>
                                                                            <itunes:duration>00:52:16</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Princeton University School of Engineering and Applied Science]]>
                </itunes:author>
                            </item>
            </channel>
</rss>
