February 15, 2015

by Bruce Schneier

CTO, Co3 Systems, Inc.

schneier@schneier.com

https://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2015/…>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.

In this issue:

Earlier this week, we learned that Samsung televisions are eavesdropping on their owners. If you have one of their Internet-connected smart TVs, you can turn on a voice command feature that saves you the trouble of finding the remote, pushing buttons and scrolling through menus. But making that feature work requires the television to listen to everything you say. And what you say isn’t just processed by the television; it may be forwarded over the Internet for remote processing. It’s literally Orwellian.

This discovery surprised people, but it shouldn’t have. The things around us are increasingly computerized, and increasingly connected to the Internet. And most of them are listening.

Our smartphones and computers, of course, listen to us when we’re making audio and video calls. But the microphones are always there, and there are ways a hacker, government, or clever company can turn those microphones on without our knowledge. Sometimes we turn them on ourselves. If we have an iPhone, the voice-processing system Siri listens to us, but only when we push the iPhone’s button. Like Samsung’s TVs, iPhones with the “Hey Siri” feature enabled listen all the time. So do Android devices with the “OK Google” feature enabled, and so does an Amazon voice-activated system called Echo. Facebook has the ability to turn your smartphone’s microphone on when you’re using the app.

Even if you don’t speak, your computers are paying attention. Gmail “listens” to everything you write, and shows you advertising based on it. Facebook does the same with everything you write on that platform, and even listens to the things you type but don’t post. Skype doesn’t listen — we think — but as Der Spiegel notes, data from the service “has been accessible to the NSA’s snoops” since 2011. It can feel as if you’re never alone.

So the NSA certainly listens. It listens directly, and it listens to all these companies listening to you. So do other countries like Russia and China, which we really don’t want listening so closely to their citizens.

It’s not just the devices that listen; most of this data is transmitted over the Internet. Samsung sends it to what was referred to as a “third party” in its policy statement. It later revealed that third party to be a company you’ve never heard of — Nuance — that turns the voice into text for it. Samsung promises that the data is erased immediately. Most of the other companies that are listening promise no such thing and, in fact, save your data for a long time. Governments, of course, save it, too.

This data is a treasure trove for criminals, as we are learning again and again as tens and hundreds of millions of customer records are repeatedly stolen. Last week, it was reported that hackers had accessed the personal records of some 80 million Anthem Health customers and others. Last year, it was Home Depot, JP Morgan, Sony and many others. Do we think Nuance’s security is better than any of these companies’? I sure don’t.

At some level, we’re consenting to all this listening. A single sentence in Samsung’s 1,500-word privacy policy, the one most of us don’t read, stated: “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.” Other services could easily come with similar warnings: be aware that your e-mail provider knows what you’re saying to your colleagues and friends, and be aware that your cell phone knows where you sleep and whom you’re sleeping with — assuming that you both have smartphones, that is.

The Internet of Things is full of listeners. Newer cars contain computers that record speed, steering wheel position, pedal pressure, even tire pressure — and insurance companies want to listen. And, of course, your cell phone records your precise location at all times you have it on — and possibly even when you turn it off. If you have a smart thermostat, it records your house’s temperature, humidity, ambient light and any nearby movement. Any fitness tracker you’re wearing records your movements and some vital signs; so do many computerized medical devices. Add security cameras and recorders, drones and other surveillance airplanes, and we’re being watched, tracked, measured and listened to almost all the time.

It’s the age of ubiquitous surveillance, fueled by both Internet companies and governments. And because it’s largely happening in the background, we’re not really aware of it.

This has to change. We need to regulate the listening: both what is being collected and how it’s being used. But that won’t happen until we know the full extent of surveillance: who’s listening and what they’re doing with it. Samsung buried its listening details in its privacy policy — it has since amended the policy to be clearer — and we’re only having this discussion because a Daily Beast reporter stumbled upon it. We need more explicit conversation about the value of being able to speak freely in our living rooms without our televisions listening, or of having e-mail conversations without Google or the government listening. Privacy is a prerequisite for free expression, and losing that would be an enormous blow to our society.

This essay previously appeared on CNN.com.

http://www.cnn.com/2015/02/11/opinion/…

http://www.thedailybeast.com/articles/2015/02/05/…

http://www.bbc.com/news/technology-31296188

http://global.samsungtomorrow.com/…

FBI monitoring webcams:

https://twitter.com/xor/status/564356757007261696/…

http://gizmodo.com/…

Turning on webcams remotely:

http://www.washingtonpost.com/s/the-switch/wp/…

Amazon Echo:

http://www.washingtonpost.com/s/the-switch/wp/…

Facebook listening on your smartphone:

http://www.forbes.com/sites/kashmirhill/2014/05/22/…

Facebook collecting what you type but don’t post:

http://www.slate.com/articles/technology/…

Der Spiegel article:

http://www.spiegel.de/international/germany/…

Anthem Health hack:

http://money.cnn.com/2015/02/04/technology/…

2014 major hacks:

http://www.zdnet.com/pictures/…

Samsung’s privacy policy:

https://www.samsung.com/sg/info/privacy/smarttv.html

Surveillance and the Internet of Things:

https://www.schneier.com/essays/archives/2013/05/…

NSA tracking cell phones, even when they’re turned off:

http://www.slate.com/s/future_tense/2013/07/22/…

Age of ubiquitous surveillance:

http://www.cnn.com/2013/10/16/opinion/…

At a CATO surveillance event last month, Ben Wittes talked about inherent presidential powers of surveillance with this hypothetical: “What should Congress have to say about the rules when Barack Obama wants to know what Vladimir Putin is talking about?” His answer was basically that Congress should have no say: “I think most people, going back to my Vladimir Putin question, would say that is actually an area of inherent presidential authority.” Edward Snowden, a surprise remote participant at the event, said the opposite, although using the courts in general rather than specifically Congress as his example. “…there is no court in the world — well, at least, no court outside Russia — who would not go, ‘This man is an agent of the foreign government. I mean, he’s the *head* of the government.’ Of course, they will say, ‘this guy has access to some kind of foreign intelligence value. We’ll sign the warrant for him.'”

There’s a principle here worth discussing at length. I’m not talking about the legal principle, as in what kind of court should oversee US intelligence collection. I’m not even talking about the constitutional principle, as in what are the US president’s inherent powers. I am talking about the philosophical principle: what sorts of secret unaccountable actions do we want individuals to be able to take on behalf of their country?

Put that way, I think the answer is obvious: as little as possible.

I am not a lawyer or a political scientist. I am a security technologist. And to me, the separation of powers and the checks and balances written into the US constitution are a security system. The more Barack Obama can do by himself in secret, the more power he has — and the more dangerous that is to all of us. By limiting the actions individuals and groups can take on their own, and forcing differing institutions to approve the actions of each other, the system reduces the ability for those in power to abuse their power. It holds them accountable.

We have enshrined the principle of different groups overseeing each other in many of our social and political systems. The courts issue warrants, limiting police power. Independent audit companies verify corporate balance sheets, limiting corporate power. And the executive, the legislative, and the judicial branches of government get to have their say in our laws. Sometimes accountability takes the form of prior approval, and sometimes it takes the form of ex post facto review. It’s all inefficient, of course, but it’s an inefficiency we accept because it makes us all safer.

While this is a fine guiding principle, it quickly falls apart in the practicalities of running a modern government. It’s just not possible to run a country where *every* action is subject to review and approval. The complexity of society, and the speed with which some decisions have to be made, can require unilateral actions. So we make allowances. Congress passes broad laws, and agencies turn them into detailed rules and procedures. The president is the commander in chief of the entire US military when it comes time to fight wars. Policemen have a lot of discretion on the beat. And we only get to vote elected officials in and out of office every two, four, or six years.

The thing is, we can do better today. I’ve often said that the modern constitutional democracy is the best form of government mid-18th-century technology could produce. Because both communications and travel were difficult and expensive, it made sense for geographically proximate groups of people to choose one representative to go all the way over there and act for them over a long block of time.

Neither of these two limitations is true today. Travel is both cheap and easy, and communications are so cheap and easy as to be virtually free. Video conferencing and telepresence allow people to communicate without traveling. Surely if we were to design a democratic government today, we would come up with better institutions than the ones we are stuck with because of history.

And we can come up with more granular systems of checks and balances. So, yes, I think we would have a better government if a court had to approve all surveillance actions by the president, including those against Vladimir Putin. And today it might be possible to have a court do just that. Wittes argues that making some of these changes is impossible, given the current US constitution. He may be right, but that doesn’t mean they’re not good ideas.

Of course, the devil is always in the details. Efficiency is still a powerful counterargument. The FBI has procedures for temporarily bypassing prior approval processes if speed is essential. And granularity can still be a problem. Not every bullet fired by the US military can be subject to judicial approval or even a military court, even though every bullet fired by a US policeman is — at least in theory — subject to judicial review. And while every domestic surveillance decision made by the police and the NSA is (also in theory) subject to judicial approval, it’s hard to know whether this can work for international NSA surveillance decisions until we try.

We are all better off now that many of the NSA’s surveillance programs have been made public and are being debated in Congress and in the media — although I had hoped for more congressional action — and many of the FISA Court’s formerly secret decisions on surveillance are being made public. But we still have a long way to go, and it shouldn’t take someone like Snowden to force at least some openness to happen.

This essay previously appeared on Lawfare.com, where Ben Wittes responded.

http://www.lawfareblog.com/2015/01/…

Wittes’s original essay:

http://www.lawfareblog.com/2014/12/…

Wittes’s response to my essay:

http://www.lawfareblog.com/2015/01/…

Last year, two Swiss artists programmed a Random Botnet Shopper, which every week would spend $100 in bitcoin to buy a random item from an anonymous Internet black market…all for an art project on display in Switzerland. It was a clever concept, except there was a problem. Most of the stuff the bot purchased was benign — fake Diesel jeans, a baseball cap with a hidden camera, a stash can, a pair of Nike trainers — but it also purchased ten ecstasy tablets and a fake Hungarian passport.
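The bot’s buying logic needn’t be complicated. Here’s a minimal sketch of a fixed-budget random purchaser; the catalog, prices, and function name are all hypothetical, and the real bot presumably worked from scraped marketplace listings:

```python
import random

def pick_weekly_purchase(listings, budget_usd=100):
    """Pick a random listing whose price fits the weekly budget.

    `listings` is a list of (name, price_usd) tuples -- a stand-in for
    whatever the real bot scraped from the marketplace.
    """
    affordable = [item for item in listings if item[1] <= budget_usd]
    if not affordable:
        return None  # nothing in budget this week
    return random.choice(affordable)

catalog = [("fake Diesel jeans", 65), ("spy-cam baseball cap", 80),
           ("Nike trainers", 75), ("rare-earth magnet set", 210)]
print(pick_weekly_purchase(catalog, 100))
```

The legal puzzle starts exactly where this sketch ends: once random.choice runs, no human knows in advance what will be bought.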

What do we do when a machine breaks the law? Traditionally, we hold the person controlling the machine responsible. People commit the crimes; the guns, lockpicks, or computer viruses are merely their tools. But as machines become more autonomous, the link between machine and controller becomes more tenuous.

Who is responsible if an autonomous military drone accidentally kills a crowd of civilians? Is it the military officer who keyed in the mission, the programmers of the enemy detection software that misidentified the people, or the programmers of the software that made the actual kill decision? What if those programmers had no idea that their software was being used for military purposes? And what if the drone can improve its algorithms by modifying its own software based on what the entire fleet of drones learns on earlier missions?

Maybe our courts can decide where the culpability lies, but that’s only because while current drones may be autonomous, they’re not very smart. As drones get smarter, their links to the humans that originally built them become more tenuous.

What if there are no programmers, and the drones program themselves? What if they are both smart and autonomous, and make strategic as well as tactical decisions on targets? What if one of the drones decides, based on whatever means it has at its disposal, that it no longer maintains allegiance to the country that built it and goes rogue?

Our society has many approaches, using both informal social rules and more formal laws, for dealing with people who won’t follow the rules of society. We have informal mechanisms for small infractions, and a complex legal system for larger ones. If you are obnoxious at a party I throw, I won’t invite you back. Do it regularly, and you’ll be shamed and ostracized from the group. If you steal some of my stuff, I might report you to the police. Steal from a bank, and you’ll almost certainly go to jail for a long time. A lot of this might seem ad hoc, but we humans have spent millennia working it all out. Security is both political and social, but it’s also psychological. Door locks, for example, only work because our social and legal prohibitions on theft keep the overwhelming majority of us honest. That’s how we live peacefully together at a scale unimaginable for any other species on the planet.

How does any of this work when the perpetrator is a machine with whatever passes for free will? Machines probably won’t have any concept of shame or praise. They won’t refrain from doing something because of what other machines might think. They won’t follow laws simply because it’s the right thing to do, nor will they have a natural deference to authority. When they’re caught stealing, how can they be punished? What does it mean to fine a machine? Does it make any sense at all to incarcerate it? And unless they are deliberately programmed with a self-preservation function, threatening them with execution will have no meaningful effect.

We are already talking about programming morality into thinking machines, and we can imagine programming other human tendencies into our machines, but we’re certainly going to get it wrong. No matter how much we try to avoid it, we’re going to have machines that break the law.

This, in turn, will break our legal system. Fundamentally, our legal system doesn’t prevent crime. Its effectiveness is based on arresting and convicting criminals after the fact, and their punishment providing a deterrent to others. This completely fails if there’s no punishment that makes sense.

We already experienced a small example of this after 9/11, which was when most of us first started thinking about suicide terrorists and how post-facto security was irrelevant to them. That was just one change in motivation, and look at how those actions affected the way we think about security. Our laws will have the same problem with thinking machines, along with related problems we can’t even imagine yet. The social and legal systems that have dealt so effectively with human rulebreakers of all sorts will fail in unexpected ways in the face of thinking machines.

A machine that thinks won’t always think in the ways we want it to. And we’re not ready for the ramifications of that.

This essay previously appeared on Edge.org as one of the answers to the 2015 Edge Question: “What do you think about machines that think?”

http://edge.org/response-detail/26249

Random Botnet Shopper:

http://fusion.net/story/35883/…

Robot ethics:

http://www.nytimes.com/2015/01/11/magazine/…

The Random Botnet Shopper is “under arrest.”

http://animalnewyork.com/2015/…

I have long said that driving a car is the most dangerous thing we regularly do in our lives. Turns out deaths due to automobiles are declining, while deaths due to firearms are on the rise.

http://www.bloomberg.com/news/2012-12-19/…

http://www.economist.com/news/united-states/…

Appelbaum, Poitras, and others have another NSA article with an enormous Snowden document dump on Der Spiegel, giving details on a variety of offensive NSA cyberoperations to infiltrate and exploit networks around the world. There’s *a lot* here: 199 pages.

http://www.spiegel.de/international/world/…

Here they are in one compressed archive.

http://cryptome.org/2015/01/spiegel-15-0117.7z

Paired with the 666 pages released in conjunction with the December 28 Spiegel article on NSA cryptanalytic capabilities, we’ve seen a huge number of Snowden documents in the past few weeks. According to one tally, it runs 3,560 pages in all.

http://www.spiegel.de/international/germany/…

http://cryptome.org/2014/12/nsa-spiegel-14-1228.rar

http://cryptome.org/2013/11/snowden-tally.htm

Discussion:

https://news.ycombinator.com/item?id=8905321

http://politics.slashdot.org/story/15/01/18/202220/…

In related news, the New York Times is reporting that the NSA has infiltrated North Korea’s networks, and provided evidence to blame the country for the Sony hacks.

http://www.nytimes.com/2015/01/19/world/asia/…

Also related, the Guardian has an article based on the Snowden documents saying that GCHQ has been spying on journalists.

http://www.theguardian.com/uk-news/2015/jan/19/…

http://arstechnica.com/tech-policy/2015/01/…

It’s a common fraud on sites like eBay: buyers falsely claim that they never received a purchased item in the mail. Here’s a paper on defending against this fraud through basic psychological security measures. It’s preliminary research, but probably worth experimental follow-up.

https://isis.poly.edu/~hossein/publications/…

Remember back in 2013 when the then-director of the NSA Keith Alexander claimed that Section 215 bulk telephone metadata surveillance stopped “fifty-four different terrorist-related activities”? Remember when that number was backtracked several times, until all that was left was a single Somali taxi driver who was convicted of sending some money back home? This is the story of Basaaly Moalin.

http://www.newyorker.com/magazine/2015/01/26/…

Here’s an IDEA-variant with a 128-bit block length. While I think it’s a great idea to bring IDEA up to a modern block length, the paper has none of the cryptanalysis behind it that IDEA had. If nothing else, I would have expected more than eight rounds. If anyone wants to practice differential and linear cryptanalysis, here’s a new target for you.

http://eprint.iacr.org/2014/704.pdf
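For anyone who wants to play with the target before reading the paper: IDEA’s distinctive primitive is multiplication modulo 2^16 + 1, with an all-zero word standing in for 2^16. Here’s a quick Python sketch; the word-size parameter is my generalization, not the paper’s, and note that 2^32 + 1 isn’t prime, which is one reason scaling IDEA up is less trivial than it looks:

```python
def idea_mul(a, b, bits=16):
    """IDEA-style multiplication: a zero word represents 2**bits,
    and arithmetic is done modulo the prime 2**bits + 1."""
    m = (1 << bits) + 1          # 65537 for the classic 16-bit words
    x = a if a else (1 << bits)  # map 0 -> 2**bits
    y = b if b else (1 << bits)
    r = (x * y) % m
    return 0 if r == (1 << bits) else r  # map 2**bits back to 0
```

The appeal of the operation is that multiplication modulo a prime mixes bits nonlinearly; the burden on any variant is showing, with real cryptanalysis, that the mixing still holds up at the new block size and round count.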

In the latest example of a military technology that has secretly been used by the police, we have radar guns that can see through walls.

http://www.usatoday.com/story/news/2015/01/19/…

http://reason.com//2015/01/20/…

http://www.upi.com/Top_News/US/2015/01/20/…

I missed this paper when it was first published in 2012: “Neuroscience Meets Cryptography: Designing Crypto Primitives Secure Against Rubber Hose Attacks”

https://www.usenix.org/conference/usenixsecurity12/…

Canada is spying on Internet downloads. Another story from the Snowden documents.

https://firstlook.org/theintercept/2015/01/28/…

https://www.documentcloud.org/documents/…

http://www.thestar.com/news/canada/2014/10/31/…

http://www.huffingtonpost.ca/2015/01/29/…

http://www.cbc.ca/news/canada/…

https://openmedia.ca/news/…

http://www.theglobeandmail.com/globe-debate/…

Here’s a story of a fake bank in China — a brick-and-mortar bank, not an online bank — that stole $32m from depositors over a year. Pro tip: real banks never offer 2%/week interest.

http://www.scmp.com/news/china/article/1689855/…

Hiding a Morse code message in a pop song, and delivering it to hostages in Colombia.

http://www.theverge.com/2015/1/7/7483235/…
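The encoding side of this trick is trivial; the cleverness was acoustic, hiding the beeps inside the song’s instrumentation so that only people trained to listen for Morse would notice. A minimal encoder sketch (letters only; the word-separator convention is mine):

```python
# International Morse code for the letters A-Z.
MORSE = {"A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
         "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
         "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
         "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
         "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
         "Z": "--.."}

def to_morse(text):
    """Encode a message: letters become dot/dash groups separated by
    spaces, words are separated by ' / '."""
    words = text.upper().split()
    return " / ".join(" ".join(MORSE[c] for c in w if c in MORSE)
                      for w in words)
```

For example, to_morse("SOS") returns "... --- ...". Turning those symbols into tones buried in a chorus, at a tempo a captive could actually copy down, was the real production work.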

Seems that a Texas school has suspended a 9-year-old for threatening another student with a replica One Ring. (Yes, *that* One Ring.)

http://www.nydailynews.com/news/national/…

I’ve written about this sort of thing before:

https://www.schneier.com/blog/archives/2009/11/…

My guess is that the school administration ended up trapped by its own policies, probably even believing that they were correctly being applied. You can hear that in this hearsay quote reported by the boy’s father: “Steward said the principal said threats to another child’s safety would not be tolerated — whether magical or not.”

http://www.oaoa.com/news/education/…

http://entertainment.slashdot.org/story/15/02/02/…

https://www.reddit.com/r/rage/comments/2ug5rw/…

Interesting paper: “There’s No Free Lunch, Even Using Bitcoin: Tracking the Popularity and Profits of Virtual Currency Scams,” by Marie Vasek and Tyler Moore.

http://lyle.smu.edu/~tylerm/fc15.pdf

http://www.ecnmag.com/news/2015/01/…

GPG financial difficulties:

https://www.schneier.com/blog/archives/2015/02/…

In the latest article based on the Snowden documents, the Intercept is reporting that the NSA and GCHQ are piggy-backing on the work of hackers.

https://firstlook.org/theintercept/2015/02/04/…

Here are two essays trying to understand NSA malware and how it works, in light of the enormous number of documents released by Der Spiegel recently.

https://nex.sx//…

http://.thinkst.com/p/…

Long New York Times article based on “former American and Indian officials and classified documents disclosed by Edward J. Snowden” outlining the intelligence failures leading up to the 2008 Mumbai terrorist attacks.

http://www.nytimes.com/2014/12/22/world/asia/…

DJI is programming no-fly zones into its drone software.

http://www.roboticstrends.com/article/…

If this sounds like digital rights management, it basically is. And it will fail in all the ways that DRM fails. Cory Doctorow has explained it all very well.

http://boingboing.net/2012/01/10/lockdown.html
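Mechanically, a no-fly check is just a distance computation against a list of zones, and (this is Doctorow’s point) it runs on hardware the owner controls. Here’s a sketch with made-up coordinates and radii; DJI’s actual zone database and enforcement logic aren’t public in this form:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical no-fly zones: (latitude, longitude, radius in km).
NO_FLY = [(38.8977, -77.0365, 25.0),  # a capital-city exclusion zone
          (40.6413, -73.7781, 8.0)]   # an airport

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def takeoff_allowed(lat, lon):
    """Refuse takeoff inside any zone's radius."""
    return all(haversine_km(lat, lon, zlat, zlon) > r
               for zlat, zlon, r in NO_FLY)
```

Because this check ships in firmware on the customer’s own drone, a determined owner can patch NO_FLY down to an empty list — which is the DRM failure mode in one line.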

NSF award for cryptography for kids:

http://www.nsf.gov/awardsearch/showAward?AWD_ID=1518982

In an interview this week, President Obama said that terrorism does not pose an existential threat:

What I do insist on is that we maintain a proper perspective and that we do not provide a victory to these terrorist networks by overinflating their importance and suggesting in some fashion that they are an existential threat to the United States or the world order. You know, the truth of the matter is that they can do harm. But we have the capacity to control how we respond in ways that do not undercut what’s the — you know, what’s essence of who we are.

He said something similar in January.

On one hand, what he said is blindingly obvious, and overinflating terrorism’s risks plays into the terrorists’ hands. Climate change is an existential threat. So are a comet hitting the earth, intelligent robots taking over the planet, and genetically engineered viruses. There are lots of existential threats to humanity, and we can argue about their feasibility and probability. But terrorism is not one of them. Even things that actually kill tens of thousands of people each year — car accidents, handguns, heart disease — are not existential threats.

But no matter how obvious this is, until recently it hasn’t been something that serious politicians have been able to say. When Vice President Biden said something similar last year, one commentary carried the headline “Truth or Gaffe?” In 2004, when presidential candidate John Kerry gave a common-sense answer to a question about the threat of terrorism, President Bush used those words in an attack ad. As far as I know, these comments by Obama and Biden are the first time major politicians are admitting that terrorism does not pose an existential threat and are not being pilloried for it.

Overreacting to the threat is still common, and exaggeration and fear still make good politics. But maybe now, a dozen years after 9/11, we can finally start having rational conversations about terrorism and security: what works, what doesn’t, what’s worth it, and what’s not.

Obama interview:

http://www.realclearpolitics.com/video/2015/02/01/…

Earlier Obama interview:

http://www.realclearpolitics.com/video/2015/01/16/…

Article making the point that terrorism is not an existential threat:

http://www.foreignaffairs.com/articles/66186/…

How overreacting plays into the terrorists’ hands:

http://www.theatlantic.com/national/archive/2013/04/…

Some actual existential threats:

http://www.nickbostrom.com/existential/risks.html

Biden’s comments:

http://thehill.com/policy/international/…

http://reason.com//2014/10/03/…

Bush’s attack ad:

http://www.cnn.com/2004/ALLPOLITICS/10/10/…

A recent overreaction:

https://www.schneier.com/blog/archives/2015/01/…

The politics of exaggeration and fear:

http://www.cnn.com/2013/05/20/opinion/…

In January, the National Academies of Sciences (NAS) released a report on the bulk collection of signals intelligence. Basically, a year earlier, President Obama had tasked the Director of National Intelligence with assessing “the feasibility of creating software that would allow the Intelligence Community more easily to conduct targeted information acquisition rather than bulk collection.” The DNI asked the NAS to answer the question, and the result is this report.

The conclusion is about what you’d expect. From the NAS press release:

No software-based technique can fully replace the bulk collection of signals intelligence, but methods can be developed to more effectively conduct targeted collection and to control the usage of collected data, says a new report from the National Research Council. Automated systems for isolating collected data, restricting queries that can be made against those data, and auditing usage of the data can help to enforce privacy protections and allay some civil liberty concerns, the unclassified report says.

[…]

A key value of bulk collection is its record of past signals intelligence that may be relevant to subsequent investigations, the report notes. The committee was not asked to and did not consider whether the loss of effectiveness from reducing bulk collection would be too great, or whether the potential gain in privacy from adopting an alternative collection method is worth the potential loss of intelligence information. It did observe that other sources of information — for example, data held by third parties such as communications providers — might provide a partial substitute for bulk collection in some circumstances.

Right. The singular value of spying on everyone and saving all the data is that you can go back in time and use individual pieces of that data. There’s nothing that can substitute for that.
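The controls the committee describes — isolate the data, restrict queries to targeted selectors, audit every access — are easy to express in code, if harder to deploy. A toy sketch; the class and field names are mine, and any real system would add access control, tamper-evident logging, and minimization rules:

```python
import datetime

class AuditedStore:
    """Sketch of 'restrict queries, audit usage': analysts may only
    look up exact selectors, and every query is logged."""

    def __init__(self, records):
        self._records = records  # e.g. {selector: metadata records}
        self.audit_log = []      # (timestamp, analyst, selector) tuples

    def query(self, analyst, selector):
        # Log the access before answering it.
        self.audit_log.append((datetime.datetime.utcnow().isoformat(),
                               analyst, selector))
        # Exact-match lookup only: no wildcards, no bulk dumps.
        return self._records.get(selector)

store = AuditedStore({"+1-555-0100": ["call to +1-555-0199"]})
store.query("analyst7", "+1-555-0100")
```

Logging who asked is the easy part; deciding what they may ask, and who reviews the log, is the policy question the report leaves open.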

And what the report committee didn’t look at is very important. Here’s Herb Lin, cyber policy and security researcher and a staffer on this report:

…perhaps the most important point of the report is what it does not say. It concludes that giving up bulk surveillance entirely will entail some costs to national security, but it does not say that we should keep or abandon bulk surveillance. National security is an important national priority and so are civil liberties. We don’t do EVERYTHING we could do for national security — we accept some national security risks. And we don’t do everything we could do for civil liberties — we accept some reductions in civil liberties. Where, when, and under what circumstances we accept either — that’s the most important policy choice that the American people can make.

Just because something can be done does not mean that 1) it is effective, or 2) it should be done. There’s a lot of evidence that bulk collection is not valuable.

Here’s an overview of the report. And a news article. And the DNI press release.

http://www.nap.edu/catalog/19414/…

https://www8.nationalacademies.org/onpinews/…

http://www.whitehouse.gov/the-press-office/2014/01/…

http://www.dni.gov/index.php/newsroom/…

Commentary:

http://www.lawfareblog.com/2015/01/…

http://www.lawfareblog.com/2015/01/…

Bulk collection doesn’t stop terrorists:

http://www.newamerica.net/publications/policy/…

News article:

http://reason.com//2015/01/16/…

I’m speaking at Freedom to Connect in New York on 3/2.

http://freedom-to-connect.net/agenda.html

In early March I’m going on a book tour. These are the cities and dates:

New York: 3/2, 7:00 PM

http://store-locator.barnesandnoble.com/event/85908

Boston: 3/4, 7:00 PM

http://www.harvard.com/event/bruce_schneier/

Washington DC: 3/5, 7:00 PM

http://store-locator.barnesandnoble.com/event/85909

Seattle: 3/9, 7:00 PM

http://townhallseattle.org/event/bruce-schneier/

San Francisco: 3/10, 6:30 PM

http://www.commonwealthclub.org/events/2015-03-10/…

Minneapolis: 3/18, 7:00 PM

http://store-locator.barnesandnoble.com/event/85971

I’m speaking at South by Southwest (SXSW) in Austin on 3/14:

http://sxsw.com/

In January, as part of a Harvard computer science symposium, I had a public conversation with Edward Snowden. The topics were largely technical, ranging from cryptography to hacking to surveillance to what to do now.

https://www.youtube.com/watch?…

http://computefest.seas.harvard.edu/symposium

http://www.bostonglobe.com/business/2015/01/23/…

http://www.seas.harvard.edu/news/2015/01/…

http://www.forbes.com/sites/gilpress/2015/01/27/…

Co3 Systems is expanding into Europe. This was supposed to be a secret until the middle of February, but we were found out. We already have European customers; this is our European office.

http://www.channelweb.co.uk/crn-uk/news/2392248/…

And, by the way, we’re hiring, primarily in the Boston area.

https://www.co3sys.com/company/careers

For its “Top Influencers in Security You Should Be Following in 2015” blog post, TripWire asked me: “If you could have one infosec-related superpower, what would it be?” I answered:

Most superpowers are pretty lame: super strength, super speed, super sight, super stretchiness.

Teleportation would probably be the most useful given my schedule, but for subverting security systems, you can’t beat invisibility. You can bypass almost every physical security measure with invisibility, and when you trip an alarm — say, a motion sensor — the guards that respond will conclude that you’re a false alarm.

Oh, you want an “infosec” superpower. Hmmm. The ability to detect the origin of packets? The ability to bypass firewalls without a sound? The ability to mimic anyone’s biometric? Those are all too techy for me. Maybe the ability to translate my thoughts into articles and books without going through the tedious process of writing. But then, what would I do on long airplane flights? So maybe I need teleportation after all.

http://www.tripwire.com/state-of-security/featured/…

After a year of talking about it, my new book is finally published.

This is the copy from the inside front flap:

You are under surveillance right now.

Your cell phone provider tracks your location and knows who’s with you. Your online and in-store purchasing patterns are recorded, and reveal if you’re unemployed, sick, or pregnant. Your e-mails and texts expose your intimate and casual friends. Google knows what you’re thinking because it saves your private searches. Facebook can determine your sexual orientation without you ever mentioning it.

The powers that surveil us do more than simply store this information. Corporations use surveillance to manipulate not only the news articles and advertisements we each see, but also the prices we’re offered. Governments use surveillance to discriminate, censor, chill free speech, and put people in danger worldwide. And both sides share this information with each other or, even worse, lose it to cybercriminals in huge data breaches.

Much of this is voluntary: we cooperate with corporate surveillance because it promises us convenience, and we submit to government surveillance because it promises us protection. The result is a mass surveillance society of our own making. But have we given up more than we’ve gained? In Data and Goliath, security expert Bruce Schneier offers another path, one that values both security and privacy. He shows us exactly what we can do to reform our government surveillance programs and shake up surveillance-based business models, while also providing tips for you to protect your privacy every day. You’ll never look at your phone, your computer, your credit cards, or even your car in the same way again.

And there’s a great quote on the cover: “The public conversation about surveillance in the digital age would be a good deal more intelligent if we all read Bruce Schneier first.” –Malcolm Gladwell, author of David and Goliath.

I’ve gotten some great responses from people who read the bound galley, and hope for some good reviews in mainstream publications. So far, there’s one review.

You can buy the book everywhere online. The book’s webpage has links to all the major online retailers. I particularly like IndieBound, which routes your purchase through a local independent bookseller.

And if you can, please write a review for Amazon, Goodreads, or anywhere else.

https://www.schneier.com/book-dg.html

The review (so far):

https://www.schneier.com/news/archives/2015/01/…

Earlier blog posts about the book:

https://www.schneier.com/blog/archives/2014/03/…

https://www.schneier.com/blog/archives/2014/04/…

https://www.schneier.com/blog/archives/2014/10/…

Late last year, in a criminal case involving export violations, the US government disclosed a mysterious database of telephone call records that it had queried in the case.

The defendant argued that the database was the NSA’s, that the query was unconstitutional, and that the evidence should be suppressed. The government said that the database was not the NSA’s. As part of the back and forth, the judge ordered the government to explain the call records database.

Someone from the Drug Enforcement Administration did that last week. Apparently, there’s *another* bulk telephone metadata collection program: a “federal law enforcement database” authorized as part of a federal drug trafficking statute:

This database [redacted] consisted of telecommunications metadata obtained from United States telecommunications service providers pursuant to administrative subpoenas served upon the service providers under the provisions of 21 U.S.C. 876. This metadata related to international telephone calls originating in the United States and calling [redacted] designated foreign countries, one of which was Iran, that were determined to have a demonstrated nexus to international drug trafficking and related criminal activities.

The program began in the 1990s and was “suspended” in September 2013.

https://ia902702.us.archive.org/24/items/…

http://arstechnica.com/tech-policy/2015/01/…

http://www.wsj.com/articles/…

http://yro.slashdot.org/story/15/01/18/0215255/…

https://news.ycombinator.com/item?id=8901610

http://theweek.com/speedreads/534415/…

Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 12 books — including “Liars and Outliers: Enabling the Trust That Society Needs to Thrive” — as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Co3 Systems, Inc. See <https://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of Co3 Systems, Inc.

Copyright (c) 2015 by Bruce Schneier.