[Gpg4win-users-de] Fw: CRYPTO-GRAM, July 15, 2013

Peter Hennig peter.hennig at web.de
Mon Jul 15 13:17:08 CEST 2013


Hello,

The two 'items'

> Protecting E-Mail from Eavesdropping
> Is Cryptography Engineering or Science?

might be of interest to the readers in this 'circle'.

Regards,
Peter


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

In a slight variation on Werner's 'motto' / 'boilerplate':


"Die Gedanken sind frei - ja, aber man wüsste gern, was sie denken."

- Denis Scheck, literary critic and journalist, recently on Deutschlandfunk -


> Sent: Monday, July 15, 2013 at 05:25
> From: "Bruce Schneier" <schneier at SCHNEIER.COM>
> To: CRYPTO-GRAM-LIST at LISTSERV.MODWEST.COM
> Subject: CRYPTO-GRAM, July 15, 2013
>
>              CRYPTO-GRAM
>
>             July 15, 2013
>
>           by Bruce Schneier
>     Chief Security Technology Officer, BT
>         schneier at schneier.com
>        http://www.schneier.com
>
>
> A free monthly newsletter providing summaries, analyses, insights, and
> commentaries on security: computer and otherwise.
>
> For back issues, or to subscribe, visit
> <http://www.schneier.com/crypto-gram.html>.
>
> You can read this issue on the web at
> <http://www.schneier.com/crypto-gram-1307.html>. These same essays and
> news items appear in the "Schneier on Security" blog at
> <http://www.schneier.com/blog>, along with a lively and intelligent
> comment section. An RSS feed is available.
>
>
> ** *** ***** ******* *********** *************
>
> In this issue:
>       Blowback from the NSA Surveillance
>       Evidence that the NSA Is Storing Voice Content,
>         Not Just Metadata
>       NSA Secrecy and Personal Privacy
>       Petition the NSA to Subject its Surveillance Program to
>         Public Comment
>       New Details on Skype Eavesdropping
>       Pre-9/11 NSA Thinking
>       How the NSA Eavesdrops on Americans
>       NSA E-Mail Eavesdropping
>       News
>       US Offensive Cyberwar Policy
>       Finding Sociopaths on Facebook
>       Schneier News
>       My Fellowship at the Berkman Center
>       Protecting E-Mail from Eavesdropping
>       Is Cryptography Engineering or Science?
>       Sixth Movie-Plot Threat Contest Winner
>
>
> ** *** ***** ******* *********** *************
>
>       Blowback from the NSA Surveillance
>
>
>
> There's one piece of blowback that isn't being discussed -- aside from
> the fact that Snowden has killed the chances of any liberal arts major
> getting a DoD job for at least a decade -- and that's how the massive
> NSA surveillance of the Internet affects the US's role in Internet
> governance.
>
> Ron Deibert makes this point:
>
>      But there are unintended consequences of the NSA scandal that
>      will undermine U.S. foreign policy interests -- in particular,
>      the "Internet Freedom" agenda espoused by the U.S. State
>      Department and its allies.
>
>      The revelations that have emerged will undoubtedly trigger a
>      reaction abroad as policymakers and ordinary users realize the
>      huge disadvantages of their dependence on U.S.-controlled
>      networks in social media, cloud computing, and
>      telecommunications, and of the formidable resources that are
>      deployed by U.S. national security agencies to mine and monitor
>      those networks.
>
> Writing about the new Internet nationalism, I talked about the ITU
> meeting in Dubai last fall, and the attempt of some countries to wrest
> control of the Internet from the US.  That movement just got a huge PR
> boost.  Now, when countries like Russia and Iran say the US is simply 
> too untrustworthy to manage the Internet, no one will be able to argue.
>
> We can't fight for Internet freedom around the world, then turn around
> and destroy it back home.  Even if we don't see the contradiction, the 
> rest of the world does.
>
> http://www.cnn.com/2013/06/12/opinion/deibert-nsa-surveillance/
>
> The new Internet nationalism:
> https://www.schneier.com/essay-416.html
> http://www.cnas.org/theinternetyalta
>
>
> ** *** ***** ******* *********** *************
>
>       Evidence that the NSA Is Storing Voice Content, Not Just
>         Metadata
>
>
>
> There's been some interesting speculation that the NSA is storing
> everyone's phone calls, and not just metadata.  The first link, below,
> is definitely worth reading.
>
> I expressed skepticism about this just a month ago.  My assumption had
> always been that everyone's compressed voice calls are just too much
> data to move around and store.  Now, I don't know.
>
> There's a bit of a conspiracy-theory air to all of this speculation, but
> underestimating what the NSA will do is a mistake.  General Alexander
> has told members of Congress that they *can* record the contents of
> phone calls.  And they have the technical capability.
>
> I believe that, to the extent that the NSA is analyzing and storing
> conversations, they're doing speech-to-text as close to the source as
> possible and working with that.  Even if you have to store the audio for
> conversations in foreign languages, or for snippets of conversations the
> conversion software is unsure of, it's a lot fewer bits to move around
> and deal with.
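>
> To put rough numbers on that -- these are illustrative assumptions,
> not figures from any document -- compare a compressed call at about
> 8 kbit/s with a plain-text transcript at a typical speaking rate:
>
>     seconds     = 10 * 60                  # a ten-minute call
>     audio_bytes = 8000 / 8 * seconds       # ~600 KB at 8 kbit/s
>     text_bytes  = 150 / 60 * 6 * seconds   # ~9 KB: 150 wpm, 6 bytes/word
>     print(audio_bytes / text_bytes)        # roughly a 65x reduction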
>
> And, by the way, I hate the term "metadata."  What's wrong with "traffic
> analysis," which is what we've always called that sort of thing?
>
> http://blog.rubbingalcoholic.com/post/52913031241/its-not-just-metadata-the-nsa-is-getting-everything
> or http://tinyurl.com/l4e92ex
>
> My previous skepticism:
> https://www.schneier.com/blog/archives/2013/05/is_the_us_gover.html
>
> More:
> http://news.cnet.com/8301-13578_3-57589495-38/nsa-spying-flap-extends-to-contents-of-u.s-phone-calls/
> or http://tinyurl.com/mpjlpb5
> http://www.wired.com/threatlevel/2012/03/ff_nsadatacenter/all/
> http://dailycaller.com/2013/06/10/what-do-they-know-about-you-an-interview-with-nsa-analyst-william-binney/?print=1
> or http://tinyurl.com/ls5jp8m
> https://docs.google.com/spreadsheet/ccc?key=0AuqlWHQKlooOdGJrSzhBVnh0WGlzWHpCZFNVcURkX0E#gid=0
> or http://tinyurl.com/n62lojy
>
> Metadata:
> http://www.guardian.co.uk/technology/interactive/2013/jun/12/what-is-metadata-nsa-surveillance?CMP=twt_gu#meta=1111111
> or http://tinyurl.com/kvtq6tj
>
>
> ** *** ***** ******* *********** *************
>
>       NSA Secrecy and Personal Privacy
>
>
>
> In an excellent essay about privacy and secrecy, law professor Daniel
> Solove makes an important point.  There are two types of NSA secrecy
> being discussed.  It's easy to confuse them, but they're very different.
>
>      Of course, if the government is trying to gather data about a
>      particular suspect, keeping the specifics of surveillance
>      efforts secret will decrease the likelihood of that suspect
>      altering his or her behavior.
>
>      But secrecy at the level of an individual suspect is different
>      from keeping the very existence of massive surveillance
>      programs secret. The public must know about the general
>      outlines of surveillance activities in order to evaluate
>      whether the government is achieving the appropriate balance
>      between privacy and security. What kind of information is
>      gathered? How is it used? How securely is it kept? What kind of
>      oversight is there? Are these activities even legal? These
>      questions can't be answered, and the government can't be held
>      accountable, if surveillance programs are completely
>      classified.
>
> This distinction is also becoming important as Snowden keeps talking.
> There are a lot of articles about Edward Snowden cooperating with the
> Chinese government.  I have no idea if this is true -- Snowden denies it
> -- or if it's part of an American smear campaign designed to change the
> debate from the NSA surveillance programs to the whistleblower's
> actions.  (It worked against Assange.) In anticipation of the inevitable
> questions, I want to change a previous assessment statement: I consider
> Snowden a hero for whistleblowing on the existence and details of the
> NSA surveillance programs, but not for revealing specific operational
> secrets to the Chinese government.  Charles Pierce wishes Snowden would
> stop talking.  I agree; the more this story is about him the less it is
> about the NSA.  Stop giving interviews and let the documents do the talking.
>
> Back to Daniel Solove, this excellent 2011 essay on the value of privacy
> is making the rounds again.  And it should.
>
>      Many commentators had been using the metaphor of George
>      Orwell's "1984" to describe the problems created by the
>      collection and use of personal data. I contended that the
>      Orwell metaphor, which focuses on the harms of surveillance
>      (such as inhibition and social control) might be apt to
>      describe law enforcement's monitoring of citizens. But much of
>      the data gathered in computer databases is not particularly
>      sensitive, such as one's race, birth date, gender, address, or
>      marital status. Many people do not care about concealing the
>      hotels they stay at, the cars they own or rent, or the kind of
>      beverages they drink. People often do not take many steps to
>      keep such information secret. Frequently, though not always,
>      people's activities would not be inhibited if others knew this
>      information.
>
>      I suggested a different metaphor to capture the problems: Franz
>      Kafka's "The Trial," which depicts a bureaucracy with
>      inscrutable purposes that uses people's information to make
>      important decisions about them, yet denies the people the
>      ability to participate in how their information is used. The
>      problems captured by the Kafka metaphor are of a different sort
>      than the problems caused by surveillance. They often do not
>      result in inhibition or chilling. Instead, they are problems of
>      information processing -- the storage, use, or analysis of data
>      -- rather than information collection. They affect the power
>      relationships between people and the institutions of the modern
>      state. They not only frustrate the individual by creating a
>      sense of helplessness and powerlessness, but they also affect
>      social structure by altering the kind of relationships people
>      have with the institutions that make important decisions about
>      their lives.
>
> The whole essay is worth reading, as is -- I hope -- my essay on the
> value of privacy from 2006.
>
> I have come to believe that the solution to all of this is regulation.
> And it's not going to be the regulation of data collection; it's going
> to be the regulation of data use.
>
> Blog entry URL:
> http://www.schneier.com/blog/archives/2013/06/nsa_secrecy_and.html
>
> Solove's essay:
> http://www.washingtonpost.com/opinions/five-myths-about-privacy/2013/06/13/098a5b5c-d370-11e2-b05f-3ea3f0e7bb5a_story.html
> or http://tinyurl.com/kg228sk
>
> Snowden and the Chinese:
> http://online.wsj.com/article/SB10001424127887324049504578543101447528698.html 
> or http://tinyurl.com/ltaqtfu
> http://www.nytimes.com/2013/06/15/world/asia/ex-nsa-contractors-disclosures-could-complicate-his-fate.html
> or http://tinyurl.com/lkps3wv
> http://www.upi.com/Top_News/US/2013/06/14/Snowden-may-be-working-with-China-lawmakers-say/UPI-10511371196800/
> or http://tinyurl.com/mj2vj7s
> http://www.guardian.co.uk/world/2013/jun/17/edward-snowden-nsa-files-whistleblower
> or http://tinyurl.com/lpfl59w
>
> Wikileaks smears:
> http://www.fastcompany.com/1707146/anatomy-smear-wikileaks-assange-wanted-sex-surprise-not-rape
> or http://tinyurl.com/m4v2ma5
> http://www.fair.org/blog/2011/03/02/nyt-and-the-julian-assange-smear-campaign/
> or http://tinyurl.com/lrefrf9
>
> My previous Snowden essays:
> http://www.schneier.com/blog/archives/2013/06/government_secr.html
> http://www.schneier.com/blog/archives/2013/06/prosecuting_sno.html
>
> Charles Pierce on Snowden:
> http://www.esquire.com/blogs/politics/The_Snowden_Effect_Rolls_On
>
> Solove's 2011 essay:
> https://chronicle.com/article/Why-Privacy-Matters-Even-if/127461/
>
> My essay:
> https://www.schneier.com/essay-114.html
>
> A good rebuttal to the "nothing to hide" argument:
> http://www.wired.com/opinion/2013/06/why-i-have-nothing-to-hide-is-the-wrong-way-to-think-about-surveillance/
> or http://tinyurl.com/ly7tray
>
>
> ** *** ***** ******* *********** *************
>
>       Petition the NSA to Subject its Surveillance Program to Public
>         Comment
>
>
>
> I have signed a petition calling on the NSA to "suspend its domestic
> surveillance program pending public comment."  This is what's going on:
>
>      In a request today to National Security Agency director Keith
>      Alexander and Defense Secretary Chuck Hagel, the group argues
>      that the NSA's recently revealed domestic surveillance program
>      is "unlawful" because the agency neglected to request public
>      comments first. A federal appeals court previously ruled that
>      was necessary in a lawsuit involving airport body scanners.
>
>      "In simple terms, a line has been crossed," Marc Rotenberg,
>      executive director of the Electronic Privacy Information
>      Center, told CNET. "The agency's function has been transformed,
>      and we think the public should have an opportunity to say
>      something about that."
>
>      It's an ambitious -- and untested -- legal argument. No court
>      appears to have ever ruled that the Administrative Procedure
>      Act, which can require agencies to solicit public comment, has
>      applied to the supersecret intelligence community. The APA
>      explicitly excludes from judicial review, for instance,
>      "military authority exercised in the field in time of war."
>
>      EPIC is relying on a July 2011 decision (PDF) it obtained from
>      the U.S. Court of Appeals for the D.C. Circuit dealing with
>      installing controversial full-body scanners at airports. The
>      Transportation Security Agency, the court said, was required to
>      obtain comment on a rule that "substantively affects the
>      public."
>
> This isn't an empty exercise.  While it's unlikely that a judge will
> order the NSA to suspend the program pending public approval, the
> process will put pressure on Washington to subject the NSA to more
> oversight, and pressure the NSA into more transparency.  We've used
> these tactics before.  Two decades ago, EPIC launched a similar petition
> against the Clipper Chip, a  process that eventually led to the Clinton
> administration and the FBI abandoning the effort.  And EPIC's more
> recent action against TSA full-body scanners is one of the reasons we
> have privacy safeguards on the millimeter wave scanners they are still
> using.
>
> The more people who sign this petition, the clearer the message it
> sends to Washington: a message that people care about the privacy of
> their telephone records, Internet transactions, and online
> communications. Secret judges should not be allowed to use secret
> interpretations of secret laws to authorize the NSA to engage in
> domestic surveillance.  Sooner or later, a court is going to recognize
> that.  Until then, the more noise the better.
>
> Add your voice here.  It just might work.
>
> Petition:
> http://epic.org/2013/06/epic-bamford-diffie-schneier-c.html
> http://epic.org/NSApetition/
>
> News article:
> http://news.cnet.com/8301-13578_3-57589640-38/body-scanner-ruling-could-squelch-nsa-domestic-spying/
> or http://tinyurl.com/n7zbdae
>
>
> ** *** ***** ******* *********** *************
>
>       New Details on Skype Eavesdropping
>
>
>
> This article, on the cozy relationship between the commercial
> personal-data industry and the intelligence industry, has new
> information on the security of Skype.
>
>      Skype, the Internet-based calling service, began its own secret
>      program, Project Chess, to explore the legal and technical
>      issues in making Skype calls readily available to intelligence
>      agencies and law enforcement officials, according to people
>      briefed on the program who asked not to be named to avoid
>      trouble with the intelligence agencies.
>
>      Project Chess, which has never been previously disclosed, was
>      small, limited to fewer than a dozen people inside Skype, and
>      was developed as the company had sometimes contentious talks
>      with the government over legal issues, said one of the people
>      briefed on the project. The project began about five years ago,
>      before most of the company was sold by its parent, eBay, to
>      outside investors in 2009. Microsoft acquired Skype in an $8.5
>      billion deal that was completed in October 2011.
>
>      A Skype executive denied last year in a blog post that recent
>      changes in the way Skype operated were made at the behest of
>      Microsoft to make snooping easier for law enforcement. It
>      appears, however, that Skype figured out how to cooperate with
>      the intelligence community before Microsoft took over the
>      company, according to documents leaked by Edward J. Snowden, a
>      former contractor for the N.S.A. One of the documents about the
>      Prism program made public by Mr. Snowden says Skype joined
>      Prism on Feb. 6, 2011.
>
> Reread that Skype denial from last July, knowing that at the time the
> company knew that they were giving the NSA access to customer
> communications.  Notice how it is precisely worded to be technically
> accurate, yet leave the reader with the wrong conclusion.  This is where
> we are with all the tech companies right now; we can't trust their
> denials, just as we can't trust the NSA -- or the FBI -- when it denies
> programs, capabilities, or practices.
>
> Back in January, we wondered whom Skype lets spy on their users.  Now we
> know.
>
> The article quoted:
> https://www.nytimes.com/2013/06/20/technology/silicon-valley-and-spy-agency-bound-by-strengthening-web.html
> or http://tinyurl.com/qdl249l
>
> Skype's denial:
> http://blogs.skype.com/2012/07/26/what-does-skypes-architecture-do/
>
> We can't trust the NSA:
> http://www.schneier.com/blog/archives/2013/06/details_of_nsa.html
> https://www.eff.org/deeplinks/2013/06/director-national-intelligences-word-games-explained-how-government-deceived
> or http://tinyurl.com/ma7dk5j
> https://www.eff.org/nsa-spying/wordgames
> http://www.wired.com/threatlevel/2013/06/nsa-numbers/
> http://fabiusmaximus.com/2013/06/11/nsa-surveillance-51264/
>
> My post from last January:
> https://www.schneier.com/blog/archives/2013/01/who_does_skype.html
>
>
> ** *** ***** ******* *********** *************
>
>       Pre-9/11 NSA Thinking
>
>
>
> This quote is from the Spring 1997 issue of "CRYPTOLOG," the internal
> NSA newsletter.  The writer is William J. Black, Jr., the Director's
> Special Assistant for Information Warfare.
>
>      Specifically, the focus is on the potential abuse of the
>      Government's applications of this new information technology
>      that will result in an invasion of personal privacy. For us,
>      this is difficult to understand. We *are* "the government,"
>      and we have no interest in invading the personal privacy of
>      U.S. citizens.
>
> This is from a Seymour Hersh "New Yorker" interview with NSA Director
> General Michael Hayden in 1999:
>
>      When I asked Hayden about the agency's capability for
>      unwarranted spying on private citizens -- in the unlikely
>      event, of course, that the agency could somehow get the
>      funding, the computer scientists, and the knowledge to begin
>      making sense out of the Internet -- his response was heated.
>      "I'm a kid from Pittsburgh with two sons and a daughter who are
>      closet libertarians," he said. "I am not interested in doing
>      anything that threatens the American people, and threatens the
>      future of this agency. I can't emphasize enough to you how
>      careful we are. We have to be so careful -- to make sure that
>      America is never distrustful of the power and security we can
>      provide."
>
> It's easy to assume that both Black and Hayden were lying, but I believe
> them.  I believe that, 15 years ago, the NSA was entirely focused on
> intercepting communications outside the US.
>
> What changed?  What caused the NSA to abandon its non-US charter and
> start spying on Americans?  From what I've read, and from a bunch of
> informal conversations with NSA employees, it was the 9/11 terrorist
> attacks.  That's when everything changed, the gloves came off, and all 
> the rules were thrown out the window.  That the NSA's interests
> coincided with the business model of the Internet is just a -- lucky, in
> their view -- coincidence.
>
> Black quote:
> http://www.nsa.gov/public_info/_files/cryptologs/cryptolog_135.pdf
>
> Hayden quote:
> http://cryptome.org/nsa-hersh.htm
>
>
> ** *** ***** ******* *********** *************
>
>       How the NSA Eavesdrops on Americans
>
>
>
> A few weeks ago, the "Guardian" published two new Snowden documents.
> These outline how the NSA's data-collection procedures allow it to
> collect lots of data on Americans, and how the FISA court fails to
> provide oversight over these procedures.
>
> The documents are complicated, but I strongly recommend that people read
> both the "Guardian" analysis and the EFF analysis -- and possibly the
> "USA Today" story.
>
> Frustratingly, this has not become a major news story.  It isn't being
> widely reported in the media, and most people don't know about it.  At
> this point, the only aspect of the Snowden story that is in the news is
> the personal story.  The press seems to have had its fill of the far
> more important policy issues.
>
> I don't know what there is that can be done about this, but it's how we
> all lose.
>
> http://www.guardian.co.uk/world/interactive/2013/jun/20/exhibit-a-procedures-nsa-document
> or http://tinyurl.com/pz3j9wm
> http://www.guardian.co.uk/world/interactive/2013/jun/20/exhibit-b-nsa-procedures-document
> or http://tinyurl.com/oxp6hxo
>
> Analysis:
> http://www.guardian.co.uk/world/2013/jun/20/fisa-court-nsa-without-warrant
> or http://tinyurl.com/q3vudcd
> https://www.eff.org/deeplinks/2013/06/depth-review-new-nsa-documents-expose-how-americans-can-be-spied-without-warrant
> or http://tinyurl.com/muaypw8
> http://www.usatoday.com/story/news/nation/2013/06/20/nsa-surveillance-fisa-court/2442899/
> or http://tinyurl.com/puvsbkf
>
>
> ** *** ***** ******* *********** *************
>
>       NSA E-Mail Eavesdropping
>
>
>
> More Snowden documents analyzed by the "Guardian" -- two articles --
> discuss how the NSA collected e-mails and data on Internet activity of
> both Americans and foreigners.  The program might have ended in 2011, or
> it might have continued under a different name.  This is the program
> that resulted in that bizarre tale of Bush officials confronting
> then-Attorney General John Ashcroft in his hospital room; the "New York
> Times" story discusses that.  What's interesting is that the NSA
> collected this data under one legal pretense.  When that justification
> evaporated, they searched around until they found another pretense.
>
> This story is being picked up a bit more than the previous story, but
> it's obvious that the press is fatiguing of this whole thing.  Without
> the Ashcroft human interest bit, it would be just another story of the
> NSA eavesdropping on Americans -- and that's last week's news.
>
> http://www.guardian.co.uk/world/2013/jun/27/nsa-data-mining-authorised-obama
> or http://tinyurl.com/p4wa3x6
> http://www.guardian.co.uk/world/2013/jun/27/nsa-online-metadata-collection
> or http://tinyurl.com/pyrgcuy
>
> More stories:
> https://www.nytimes.com/2013/06/28/us/nsa-report-says-internet-metadata-were-focus-of-visit-to-ashcroft.html
> or http://tinyurl.com/mdubk3d
> http://reason.com/24-7/2013/06/28/nsa-surveillance-may-have-prompted-confr
> or http://tinyurl.com/m4g3teo
>
>
> ** *** ***** ******* *********** *************
> 
>       News
>
>
>
> "Final Report on Project C-43."  This finally explains what John Ellis
> was talking about in "The Possibility of Non-Secret Encryption" when he
> dropped a tantalizing hint about wartime work at Bell Labs.
> http://techpinions.com/an-old-mystery-solved-project-c-43-and-public-key-encryption/18205
> or http://tinyurl.com/kwbuldy
> Related:
> https://www.schneier.com/essay-377.html
>
> Details of NSA data requests from US corporations.
> http://www.schneier.com/blog/archives/2013/06/details_of_nsa.html
>
> John Mueller and Mark Stewart ask the important questions about the NSA
> surveillance programs: why were they secret, what have they
> accomplished, and what do they cost?
> https://chronicle.com/blogs/conversation/2013/06/13/3-questions-about-nsa-surveillance/
> or http://tinyurl.com/klmv6df
> This essay attempts to figure out if they accomplished anything.
> http://www.cnn.com/2013/06/17/opinion/bergen-nsa-spying/index.html
> This essay attempts to figure out if they can be effective at all.
> http://live.wsj.com/article_email/SB10001424127887324049504578543542258054884-lMyQjAxMTAzMDEwNDExNDQyWj.html?mod=wsj_valettop_email
> or http://tinyurl.com/k6mqdpg
>
> Companies allow US intelligence to exploit vulnerabilities before they
> patch them.  No word on whether these companies would delay a patch if
> asked nicely -- or if there's any way the government can require them
> to.  Anyone feel safer because of this?
> http://www.bloomberg.com/news/2013-06-14/u-s-agencies-said-to-swap-data-with-thousands-of-firms.html
> or http://tinyurl.com/mvaew4f
>
> A fine piece: "A Love Letter to the NSA Agent who is Monitoring my
> Online Activity."
> http://www.happyplace.com/24470/a-love-letter-to-the-nsa-agent-who-is-monitoring-my-online-activity
> or http://tinyurl.com/q7dvxns
> A similar sentiment is expressed in this video.
> http://www.funnyordie.com/videos/ba0cc80eec/nsa-wiretapping-public-service-announcement
> or http://tinyurl.com/q2dpkoz
>
> Lessons from Japan's response to terrorism by Aum Shinrikyo.
> http://lhote.blogspot.com/2013/06/to-understand-terrorism-and-threat.html or
> http://tinyurl.com/mwxpdu6
>
> The future of satellite surveillance is pretty scary -- and cool.
> http://www.wired.com/wiredscience/2013/06/startup-skybox/
> Remember, it's not any one thing that's worrisome; it's everything together.
>
> Interesting story of a spear phishing attack against the "Financial Times."
> http://labs.ft.com/2013/05/a-sobering-day/
>
> Rod Beckstrom gives a talk (video and transcript) about "Mutually
> Assured Destruction," "Mutually Assured Disruption," and "Mutually
> Assured Dependence."
> http://www.youtube.com/watch?v=uwxC1HslvQg
> http://www.beckstrom.com/PDFspeech.pdf
>
> Great story on the cracking of the Kryptos Sculpture at the CIA 
> headquarters.
> http://www.wired.com/threatlevel/2013/06/analyst-who-cracked-kryptos/ or
> http://tinyurl.com/m6gemqm
>
> Interesting article on the history of, and the relationship between,
> secrecy and privacy:  "As a matter of historical analysis, the
> relationship between secrecy and privacy can be stated in an axiom: the
> defense of privacy follows, and never precedes, the emergence of new
> technologies for the exposure of secrets. In other words, the case for
> privacy always comes too late. The horse is out of the barn. The post
> office has opened your mail. Your photograph is on Facebook. Google
> already knows that, notwithstanding your demographic, you hate kale."
> http://www.newyorker.com/reporting/2013/06/24/130624fa_fact_lepore
>
> Lessons from biological security.
> http://blogs.hbr.org/cs/2013/06/when_your_data_is_under_siege.html
> I recommend his book, "Learning from the Octopus: How Secrets from
> Nature Can Help Us Fight Terrorist Attacks, Natural Disasters, and Disease."
>
> This is an interesting article about a new breed of malware that also
> hijacks the victim's phone text messaging system, to intercept one-time
> passwords sent via that channel.
> http://www.americanbanker.com/issues/178_111/new-breed-of-banking-malware-hijacks-text-messages-1059745-1.html
> or http://tinyurl.com/psbamd3
>
> Adding a remote kill switch to cell phones would deter theft.
> http://money.cnn.com/2013/06/13/technology/mobile/smartphone-theft/index.html
> or http://tinyurl.com/matamdm
> http://www.nbcnews.com/business/law-enforcement-demands-smartphone-kill-switch-6C10315942
> or http://tinyurl.com/nhrs782
> Here we can see how the rise of the surveillance state permeates
> everything about computer security.  On the face of it, this is a good
> idea.  Assuming it works -- that 1) it's not possible for thieves to
> resurrect phones in order to resell them, and 2) that it's not possible
> to turn this system into a denial-of-service attack tool -- it would
> deter crime.  The general category of security is "benefit denial," like
> ink tags attached to garments in retail stores and car radios that no
> longer function if removed.  But given what we now know, do we trust
> that the government wouldn't abuse this system and kill phones for other
> reasons?  Do we trust that media companies won't kill phones they decide
> were sharing copyrighted materials?  Do we trust that phone companies
> won't kill phones from delinquent customers?  What might have been a
> straightforward security system becomes a dangerous tool of control,
> when you don't trust those in power.
>
> The NSA has published some new symmetric algorithms: SIMON and SPECK
> http://eprint.iacr.org/2013/404.pdf
> It's always fascinating to study NSA-designed ciphers.  I was
> particularly interested in the algorithms' similarity to Threefish, and
> how they improved on what we did.  I was most impressed with their key
> schedule.  I am *always* impressed with how the NSA does key schedules.
> And I enjoyed the discussion of requirements.  Missing, of course, is any
> cryptanalysis.
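>
> For the curious, the round function is compact enough to sketch in a
> few lines of Python.  This is a transcription of the Speck128/128
> variant as I read it in the paper (64-bit words, rotations of 8 and 3,
> 32 rounds); check it against the paper's test vectors before relying
> on it:
>
>     MASK = (1 << 64) - 1                   # 64-bit words
>     def ror(x, r): return ((x >> r) | (x << (64 - r))) & MASK
>     def rol(x, r): return ((x << r) | (x >> (64 - r))) & MASK
>
>     def speck_round(x, y, k):              # one round on the pair (x, y)
>         x = ((ror(x, 8) + y) & MASK) ^ k
>         y = rol(y, 3) ^ x
>         return x, y
>
>     def speck128_128_encrypt(x, y, l, k):  # key = two 64-bit words (l, k)
>         for i in range(32):
>             x, y = speck_round(x, y, k)    # encrypt with round key k
>             l, k = speck_round(l, k, i)    # key schedule reuses the round
>         return x, y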
>
> This is a really good paper describing the unique threat model of
> children in the home, and the sorts of security philosophies that are
> effective in dealing with them.  Stuart Schechter, "The User IS the
> Enemy, and (S)he Keeps Reaching for that Bright Shiny Power Button!"
> Definitely worth reading.
> http://research.microsoft.com/apps/pubs/?id=194484
>
> The US Department of Defense is blocking sites that are reporting about
> the Snowden documents.  I presume they're not censoring sites that are
> smearing him personally. Note that the DoD is only blocking those sites
> on its own network, not on the Internet at large.  The blocking is being
> done by automatic filters, presumably the same ones used to block porn
> or other sites it deems inappropriate.
> http://www.usnews.com/news/blogs/washington-whispers/2013/06/28/blackout-defense-department-blocks-all-articles-about-nsa-leaks-from-millions-of-computers
> or http://tinyurl.com/o2lp9q5
> http://www.huffingtonpost.com/2013/06/28/army-blocks-the-guardian_n_3515374.html
> or http://tinyurl.com/o6rlorf
>
> Interesting law journal article:  "Privacy Protests: Surveillance
> Evasion and Fourth Amendment Suspicion," by Elizabeth E. Joh.
> https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2285095
> Read this while thinking about the lack of any legal notion of civil
> disobedience in cyberspace.
>
> Here's a transcript of a panel discussion about NSA surveillance.
> There's a lot worth reading here, but I want to link to Bob Litt's
> opening remarks.  He's the General Counsel for ODNI, and he has a lot to
> say about the programs revealed so far in the Snowden documents.
> http://www.schneier.com/blog/archives/2013/07/the_office_of_t.html
> As always, the fundamental issue is trust. If you believe Litt, this is
> all very comforting.  If you don't, it's more lies and misdirection.
> Taken at face value, it explains why so many tech executives were able
> to say they had never heard of PRISM: it's the internal NSA name for the
> database, and not the name of the program.  I also note that Litt uses
> the word "collect" to mean what it actually means, and not the way his
> boss, Director of National Intelligence James Clapper, Jr., used it to
> deliberately lie to Congress.
>
> How Apple continues to make security invisible.
> http://www.macworld.com/article/2041724/apples-security-strategy-make-it-invisible.html
> or http://tinyurl.com/lpej6pz
> iOS security white paper:
> http://css.csail.mit.edu/6.858/2012/readings/ios-security-may12.pdf
>
> Evgeny Morozov makes a point about surveillance and big data: it just
> looks for useful correlations without worrying about causes, and leads
> people to implement "fixes" based simply on those correlations -- rather
> than understanding and correcting the underlying causes.
> http://www.slate.com/articles/technology/future_tense/2013/06/with_big_data_surveillance_the_government_doesn_t_need_to_know_why_anymore.single.html
> or http://tinyurl.com/nbtkzm5
>
> A philosophical perspective on the value of privacy.
> http://www.schneier.com/blog/archives/2013/07/another_perspec.html
>
> This study concludes that there is a benefit to forcing companies to
> undergo privacy audits: "The results show that there are empirical
> regularities consistent with the privacy disclosures in the audited
> financial statements having some effect. Companies disclosing privacy
> risks are less likely to incur a breach of privacy related to
> unintentional disclosure of privacy information; while companies
> suffering a breach of privacy related to credit cards are more likely to
> disclose privacy risks afterwards. Disclosure after a breach is
> negatively related to privacy breaches related to hacking, and
> disclosure before a breach is positively related to breaches concerning
> insider trading."
> http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2271871
>
> This is a really interesting article on secret languages.  It starts by
> talking about a "cant" dictionary of 16th-century thieves' argot, and
> ends up talking about secret languages in general.
> http://thejunket.org/2012/04/issue-three/language-turned-convict/
>
> Nice history of Project SHAMROCK, the NSA's illegal domestic
> surveillance program from the 1970s.  It targeted telegrams.
> http://arstechnica.com/tech-policy/2013/06/how-a-30-year-old-lawyer-exposed-nsa-mass-surveillance-of-americans-in-1975/
> or http://tinyurl.com/lrxvpsc
>
> We don't know what they mean, but there are a bunch of NSA code names on
> LinkedIn profiles: ANCHORY, AMHS, NUCLEON, TRAFFICTHIEF, ARCMAP, SIGNAV,
> COASTLINE, DISHFIRE, FASTSCOPE, OCTAVE/CONTRAOCTAVE, PINWALE, UTT,
> WEBCANDID, MICHIGAN, PLUS, ASSOCIATION, MAINWAY, FASCIA, OCTSKYWARD,
> INTELINK, METRICS, BANYAN, MARINA
> https://www.techdirt.com/articles/20130617/13482623512/discovering-names-secret-nsa-surveillance-programs-via-linkedin.shtml
>
> This is a *really* interesting article on something I've never thought
> about before: how free games trick players into paying for stuff.
> http://www.gamasutra.com/blogs/RaminShokrizade/20130626/194933/The_Top_F2P_Monetization_Tricks.php
> or http://tinyurl.com/nbto4vm
>
>
> ** *** ***** ******* *********** *************
>
>       US Offensive Cyberwar Policy
>
>
>
> Today, the United States is conducting offensive cyberwar actions around
> the world.
>
> More than passively eavesdropping, we're penetrating and damaging
> foreign networks for both espionage and to ready them for attack. We're
> creating custom-designed Internet weapons, pretargeted and ready to be
> "fired" against some piece of another country's electronic
> infrastructure on a moment's notice.
>
> This is much worse than what we're accusing China of doing to us. We're
> pursuing policies that are both expensive and destabilizing and aren't
> making the Internet any safer. We're reacting from fear, and causing
> other countries to counter-react from fear. We're ignoring resilience in
> favor of offense.
>
> Welcome to the cyberwar arms race, an arms race that will define the
> Internet in the 21st century.
>
> Presidential Policy Directive 20, issued last October and released by
> Edward Snowden, outlines US cyberwar policy. Most of it isn't very
> interesting, but there are two paragraphs about "Offensive Cyber Effect
> Operations," or OCEO, that are intriguing:
>
>      OCEO can offer unique and unconventional capabilities to
>      advance US national objectives around the world with little or
>      no warning to the adversary or target and with potential
>      effects ranging from subtle to severely damaging. The
>      development and sustainment of OCEO capabilities, however, may
>      require considerable time and effort if access and tools for a
>      specific target do not already exist.
>
>      The United States Government shall identify potential targets
>      of national importance where OCEO can offer a favorable balance
>      of effectiveness and risk as compared with other instruments of
>      national power, establish and maintain OCEO capabilities
>      integrated as appropriate with other US offensive capabilities,
>      and execute those capabilities in a manner consistent with the
>      provisions of this directive.
>
> These two paragraphs, and another paragraph about OCEO, are the only
> parts of the document classified "top secret." And that's because what
> they're saying is very dangerous.
>
> Cyberattacks have the potential to be both immediate and devastating.
> They can disrupt communications systems, disable national
> infrastructure, or, as in the case of Stuxnet, destroy nuclear reactors;
> but only if they've been created and targeted beforehand. Before
> launching cyberattacks against another country, we have to go through
> several steps.
>
> We have to study the details of the computer systems they're running and
> determine the vulnerabilities of those systems. If we can't find
> exploitable vulnerabilities, we need to create them: leaving "back
> doors," in hacker speak. Then we have to build new cyberweapons designed
> specifically to attack those systems.
>
> Sometimes we have to embed the hostile code in those networks -- these
> are called "logic bombs" -- to be unleashed in the future. And we have
> to keep penetrating those foreign networks, because computer systems
> always change and we need to ensure that the cyberweapons are still
> effective.
>
> Like our nuclear arsenal during the Cold War, our cyberweapons arsenal
> must be pretargeted and ready to launch.
>
> That's what Obama directed the US Cyber Command to do. We can see
> glimpses of how effective we are in Snowden's allegations that the NSA
> is currently penetrating foreign networks around the world: "We hack
> network backbones -- like huge Internet routers, basically -- that give
> us access to the communications of hundreds of thousands of computers
> without having to hack every single one."
>
> The NSA and the US Cyber Command are basically the same thing. They're
> both at Fort Meade in Maryland, and they're both led by Gen. Keith
> Alexander. The same people who hack network backbones are also building
> weapons to destroy those backbones. At a March Senate briefing,
> Alexander boasted of creating more than a dozen offensive cyber units.
>
> Longtime NSA watcher James Bamford reached the same conclusion in his
> recent profile of Alexander and the US Cyber Command (written before the
> Snowden revelations). He discussed some of the many cyberweapons the US
> purchases:
>
>      According to Defense News' C4ISR Journal and Bloomberg
>      Businessweek, Endgame also offers its intelligence clients --
>      agencies like Cyber Command, the NSA, the CIA, and British
>      intelligence -- a unique map showing them exactly where their
>      targets are located. Dubbed Bonesaw, the map displays the
>      geolocation and digital address of basically every device
>      connected to the Internet around the world, providing what's
>      called network situational awareness. The client locates a
>      region on the password-protected web-based map, then picks a
>      country and city -- say, Beijing, China. Next the client types
>      in the name of the target organization, such as the Ministry of
>      Public Security's No. 3 Research Institute, which is
>      responsible for computer security -- or simply enters its
>      address, 6 Zhengyi Road. The map will then display what
>      software is running on the computers inside the facility, what
>      types of malware some may contain, and a menu of
>      custom-designed exploits that can be used to secretly gain
>      entry. It can also pinpoint those devices infected with
>      malware, such as the Conficker worm, as well as networks turned
>      into botnets and zombies -- the equivalent of a back door left
>      open...
>
>      The buying and using of such a subscription by nation-states
>      could be seen as an act of war. 'If you are engaged in
>      reconnaissance on an adversary's systems, you are laying the
>      electronic battlefield and preparing to use it' wrote Mike
>      Jacobs, a former NSA director for information assurance, in a
>      McAfee report on cyberwarfare. 'In my opinion, these activities
>      constitute acts of war, or at least a prelude to future acts of
>      war.' The question is, who else is on the secretive company's
>      client list? Because there is as of yet no oversight or
>      regulation of the cyberweapons trade, companies in the
>      cyber-industrial complex are free to sell to whomever they
>      wish. "It should be illegal," said the former senior
>      intelligence official involved in cyberwarfare. "I knew about
>      Endgame when I was in intelligence. The intelligence community
>      didn't like it, but they're the largest consumer of that
>      business."
>
> That's the key question: How much of what the United States is currently
> doing is an act of war by international definitions? Already we're
> accusing China of penetrating our systems in order to map "military
> capabilities that could be exploited during a crisis." What PPD-20 and
> Snowden describe is much worse, and certainly China, and other
> countries, are doing the same.
>
> All of this mapping of vulnerabilities and keeping them secret for
> offensive use makes the Internet less secure, and these pretargeted,
> ready-to-unleash cyberweapons are destabilizing forces on international
> relationships. Rooting around other countries' networks, analyzing
> vulnerabilities, creating back doors, and leaving logic bombs could
> easily be construed as acts of war. And all it takes is one
> overachieving national leader for this all to tumble into actual war.
>
> It's time to stop the madness. Yes, our military needs to invest in
> cyberwar capabilities, but we also need international rules of cyberwar,
> more transparency from our own government on what we are and are not
> doing, international cooperation between governments, and viable 
> cyberweapons treaties. Yes, these are difficult. Yes, it's a long, slow 
> process. Yes, there won't be international consensus, certainly not in
> the beginning. But even with all of those problems, it's a better path
> to go down than the one we're on now.
>
> We can start by taking most of the money we're investing in offensive
> cyberwar capabilities and spending it on national cyberspace resilience.
> MAD, mutually assured destruction, made sense because there were two
> superpowers opposing each other. On the Internet there are all sorts of
> different powers, from nation-states to much less organized groups. An
> arsenal of cyberweapons begs to be used, and, as we learned from
> Stuxnet, there's always collateral damage to innocents when they are.
> We're much safer with a strong defense than with a counterbalancing offense.
>
> This essay originally appeared on CNN.com.  It had the title "Has U.S.
> Started an Internet War?" -- which I had nothing to do with.  Almost
> always, editors choose titles for my essays without asking my opinion --
> or telling me beforehand.
> http://www.cnn.com/2013/06/18/opinion/schneier-cyberwar-policy/index.html or
> http://tinyurl.com/mr2uwa5
>
> Cyberwar arms race:
> https://www.schneier.com/essay-421.html
> https://www.schneier.com/essay-411.html
>
> Presidential Policy Directive 20:
> http://www.guardian.co.uk/world/interactive/2013/jun/07/obama-cyber-directive-full-text
> or http://tinyurl.com/qa376yb
>
> EPIC's suit from last October:
> http://epic.org/privacy/cybersecurity/Pres-Policy-Dir-20-FactSheet.pdf
> or http://tinyurl.com/pfkp2qf
>
> Policy outline:
> http://www.guardian.co.uk/world/2013/jun/07/obama-china-targets-cyber-overseas
> or http://tinyurl.com/lfbwy4c
>
> Snowden's allegations:
> http://www.telegraph.co.uk/news/worldnews/northamerica/usa/10117478/Edward-Snowden-claims-US-hacks-Chinese-targets.html
> or http://tinyurl.com/mexfyxc
> 
> Alexander's statement:
> http://www.youtube.com/watch?v=A7GUraTzzPo
> 
> James Bamford's writing:
> http://www.wired.com/threatlevel/2013/06/general-keith-alexander-cyberwar/all/
> or http://tinyurl.com/ltv6npp
>
> US accuses China:
> https://www.nytimes.com/2013/05/07/world/asia/us-accuses-chinas-military-in-cyberattacks.html
> or http://tinyurl.com/ma54cbx
>
> Here's an essay on the NSA's -- or Cyber Command's -- TAO: the Office of
> Tailored Access Operations.  This is the group in charge of hacking China.
> http://www.foreignpolicy.com/articles/2013/06/10/inside_the_nsa_s_ultra_secret_china_hacking_group
> or http://tinyurl.com/kcvk8hk
>
> None of this is new.  Read this Seymour Hersh article on this subject
> from 2010.
> http://www.newyorker.com/reporting/2010/11/01/101101fa_fact_hersh?currentPage=all
> or http://tinyurl.com/2wkl2dv
>
>
> ** *** ***** ******* *********** *************
>
>       Finding Sociopaths on Facebook
>
>
>
> On his blog,  Scott Adams suggests that it might be possible to identify
> sociopaths based on their interactions on social media.
>
>      My hypothesis is that science will someday be able to identify
>      sociopaths and terrorists by their patterns of Facebook and
>      Internet use. I'll bet normal people interact with Facebook in
>      ways that sociopaths and terrorists couldn't duplicate.
>
>      Anyone can post fake photos and acquire lots of friends who are
>      actually acquaintances. But I'll bet there are so many patterns
>      and tendencies of "normal" use on Facebook that a terrorist
>      wouldn't be able to successfully fake it.
>
> Okay, but so what?  Imagine you had such an amazingly accurate
> test...then what?  Do we investigate those who test positive, even
> though there's no suspicion that they've actually done anything?  Do we
> follow them around?  Subject them to additional screening at airports?
> Throw them in jail because we *know* the streets will be safer because
> of it?  Do we want to live in a "Minority Report" world?
>
> The problem isn't just that such a system is wrong, it's that the
> mathematics of testing makes this sort of thing pretty ineffective in
> practice.  It's called the "base rate fallacy."  Suppose you have a test
> that's 90% accurate in identifying both sociopaths and non-sociopaths.
> If you assume that 4% of people are sociopaths, then the chance of
> someone who tests positive actually being a sociopath is about 27%.  (For
> every thousand people tested, 90% of the 40 sociopaths will test
> positive, but so will 10% of the 960 non-sociopaths.)  You have to
> postulate a test with an amazing 99% accuracy -- only a 1% false
> positive rate -- even to have an 80% chance of someone testing positive
> actually being a sociopath.
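>
> The arithmetic behind that number, written out in a few lines of
> Python:
>
>     base_rate   = 0.04
>     sensitivity = 0.90             # P(positive | sociopath)
>     specificity = 0.90             # P(negative | non-sociopath)
>
>     true_pos  = base_rate * sensitivity               # 0.036
>     false_pos = (1 - base_rate) * (1 - specificity)   # 0.096
>     print(true_pos / (true_pos + false_pos))          # about 0.27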
>
> This fallacy isn't new.  It's the same thinking that caused us to intern
> Japanese-Americans during World War II, stop people in their cars
> because they're black, and frisk them at airports because they're
> Muslim.  It's the same thinking behind massive NSA surveillance programs
> like PRISM.  It's one of the things that scares me about police DNA
> databases.
>
> Many authors have written stories about thoughtcrime.  Who has written
> about genecrime?
>
> http://dilbert.com/blog/entry/the_internet_fingerprint
>
> The 4% number:
> http://www.amazon.com/dp/0767915828/counterpane
>
> BTW, if you want to meet an actual sociopath, I recommend this book and
> blog.
> http://www.amazon.com/Confessions-Sociopath-Spent-Hiding-Plain/dp/0307956644/ref=sr_1_1?s=books&ie=UTF8&qid=1371329503&sr=1-1&keywords=confessions+of+a+sociopath/marginalrevol-20
> or http://tinyurl.com/kh25sk4
> http://www.nytimes.com/2013/06/16/books/review/confessions-of-a-sociopath-by-m-e-thomas.html
> or http://tinyurl.com/lngn4ho
> http://www.sociopathworld.com/
>
>
> ** *** ***** ******* *********** *************
>
>       Schneier News
>
>
>
> I'm now on the board of directors of the EFF.
> https://www.eff.org/press/releases/renowned-security-expert-bruce-schneier-joins-eff-board-directors
> or http://tinyurl.com/o5f865h
>
> Last month, I gave a talk at Google.  It's another talk about power and
> security, my continually evolving topic-of-the-moment that could very
> well become my next book.  This installment is different than the
> previous talks and interviews, but not different enough that you should
> feel the need to watch it if you've seen the others.
> https://www.youtube.com/watch?v=m3NJ-Ow2Lvg&feature=youtu.be
> There are things I got wrong.  There are contradictions.  There are
> questions I couldn't answer.  But that's my process, and I'm okay with
> doing it semi-publicly.  As always, I appreciate comments, criticisms,
> reading suggestions, and so on.
> http://boingboing.net/2013/06/29/beyond-feudal-security-what.html
> http://renaissancechambara.jp/2013/06/30/bruce-schneier-on-the-state-of-the-web/
> or http://tinyurl.com/l5vpfkr
>
> A long interview with me on EconTalk; this one is mostly about security
> and power.
> http://www.econtalk.org/archives/2013/06/schneier_on_pow.html
>
> I was on the Lou Dobbs Show in mid-June.
> http://video.foxbusiness.com/v/2484933934001
>
>
> ** *** ***** ******* *********** *************
>
>       My Fellowship at the Berkman Center
>
>
>
> I have been awarded a fellowship at the Berkman Center for Internet and
> Society at Harvard University, for the 2013-2014 academic year.  I'm
> excited about this; Berkman and Harvard is where a lot of the cool kids
> hang out, and I'm looking forward to working with them this coming year.
>
> In particular, I have three goals for the year:
>
> *  I want to have my own project.  I'll be continuing to work on my
> research -- and possible book -- on security and power and technology.
> There are a bunch of people I would like to work with at Harvard:
> Yochai Benkler, Larry Lessig, Jonathan Zittrain, Joseph Nye, Jr., Steven
> Pinker, Michael Sandel.  And others at MIT: Ethan Zuckerman, David
> Clark.  I know I've forgotten names.
>
> *  I want to make a difference on a few other Berkman projects.  I don't
> know what yet, but I know there will be options.
>
> *  I want to work with some students on their projects.  There are
> *always* interesting student projects, and I would like to be an
> informal advisor on a few of them.  So if any of you are Harvard or MIT
> students and have a project you think I would be interested in, please
> email me.
>
> I'm not moving to Boston for the year, but I'll be there a lot.
>
> https://cyber.law.harvard.edu/newsroom/2013_2014_community
>
>
> ** *** ***** ******* *********** *************
>
>       Protecting E-Mail from Eavesdropping
>
>
>
> In the wake of the Snowden NSA documents, reporters have been asking me
> whether encryption can solve the problem.  Leaving aside the fact that
> much of what the NSA is collecting can't be encrypted by the user --
> telephone metadata, e-mail headers, phone calling records, e-mail you're
> reading from a phone or tablet or cloud provider, anything you post on
> Facebook -- it's hard to give good advice.
>
> In theory, an e-mail program will protect you, but the reality is much
> more complicated.
>
> * The program has to be vulnerability-free.  If there is some back door
> in the program that bypasses, or weakens, the encryption, it's not
> secure.  It's very difficult, almost impossible, to verify that a
> program is vulnerability-free.
>
> * The user has to choose a secure password.  Luckily, there's advice on
> how to do this.
>
> * The password has to be managed securely.  The user can't store it in a
> file somewhere.  If he's worried about security for after the FBI has
> arrested him and searched his house, he shouldn't write it on a piece of
> paper, either.
>
> * Actually, he should understand the threat model he's operating under.
>   Is it the NSA trying to eavesdrop on *everything*,  or an FBI
> investigation that specifically targets him -- or a targeted attack,
> like dropping a Trojan on his computer, that bypasses e-mail encryption
> entirely?
>
> This is simply too much for  the poor reporter, who wants an
> easy-to-transcribe answer.
>
> We've known how to send cryptographically secure e-mail since the early
> 1990s.  Twenty years later, we're still working on the security
> engineering of e-mail programs.  And if the NSA is eavesdropping on
> encrypted e-mail, and if the FBI is decrypting messages from suspects'
> hard drives, they're both breaking the engineering, not the underlying
> cryptographic algorithms.
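>
> The cryptographic step itself is the easy part.  As a minimal sketch
> -- not a recommendation for any particular workflow -- here is a
> message body being OpenPGP-encrypted by calling the gpg binary from
> Python.  The recipient address is a placeholder, and none of the hard
> problems above (endpoint security, passphrases, threat model) are
> addressed:
>
>     import subprocess
>
>     body = b"The message, written before it touches the mail client.\n"
>     recipient = "alice@example.org"        # placeholder key/address
>
>     armored = subprocess.run(
>         ["gpg", "--encrypt", "--armor", "--recipient", recipient],
>         input=body, stdout=subprocess.PIPE, check=True).stdout
>     print(armored.decode())                # goes into the outgoing mail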
>
> On the other hand, the two adversaries can be very different.  The NSA
> has to process a ginormous amount of traffic.  It's the "drinking from a
> fire hose" problem; they cannot afford to devote a lot of time to
> decrypting everything, because they simply don't have the computing
> resources.  There's just too much data to collect.  In these situations,
> even a modest level of encryption is enough -- until you are
> specifically targeted.  This is why the NSA saves all encrypted data it
> encounters; it might want to devote cryptanalysis resources to it at
> some later time.
>
> Password advice:
> http://www.schneier.com/blog/archives/2013/06/a_really_good_a.html
>
> The NSA saves encrypted traffic:
> https://threatpost.com/new-nsa-leak-sheds-light-on-encrypted-data-retention/
> or http://tinyurl.com/kacpkaf
> http://www.forbes.com/sites/andygreenberg/2013/06/20/leaked-nsa-doc-says-it-can-collect-and-keep-your-encrypted-data-as-long-as-it-takes-to-crack-it/
> or http://tinyurl.com/lk9tlxs
>
>
> ** *** ***** ******* *********** *************
>
>       Is Cryptography Engineering or Science?
>
>
>
> Responding to a tweet by Thomas Ptacek saying, "If you're not learning
> crypto by coding attacks, you might not actually be learning crypto,"
> Colin Percival published a well-thought-out rebuttal, saying in part:
>
>      If we were still in the 1990s, I would agree with Thomas. 1990s
>      cryptography was full of holes, and the best you could hope for
>      was to know how your tools were broken so you could try to work
>      around their deficiencies. This was a time when DES and RC4
>      were widely used, despite having well-known flaws. This was a
>      time when people avoided using CTR mode to convert block
>      ciphers into stream ciphers, due to concern that a weak block
>      cipher could break if fed input blocks which shared many (zero)
>      bytes in common. This was a time when people cared about the
>      "error propagation" properties of block ciphers -- that is, how
>      much of the output would be mangled if a small number of bits
>      in the ciphertext are flipped. This was a time when people
>      routinely advised compressing data before encrypting it,
>      because that "compacted" the entropy in the message, and thus
>      made it "more difficult for an attacker to identify when he
>      found the right key". It should come as no surprise that SSL,
>      designed during this era, has had a long list of design flaws.
>
>      Cryptography in the 2010s is different. Now we start with basic
>      components which are believed to be highly secure -- e.g.,
>      block ciphers which are believed to be indistinguishable from
>      random permutations -- and which have been mathematically
>      proven to be secure against certain types of attacks -- e.g.,
>      AES is known to be immune to differential cryptanalysis. From
>      those components, we then build higher-order systems using
>      mechanisms which have been proven to not introduce
>      vulnerabilities. For example, if you generate an ordered
>      sequence of packets by encrypting data using an
>      indistinguishable-from-random-permutation block cipher (e.g.,
>      AES) in CTR mode using a packet sequence number as the CTR
>      nonce, and then append a weakly-unforgeable MAC (e.g.,
>      HMAC-SHA256) of the encrypted data and the packet sequence
>      number, the packets both preserve privacy and do not permit any
>      undetected tampering (including replays and reordering of
>      packets). Life will become even better once Keccak (aka. SHA-3)
>      becomes more widely reviewed and trusted, as its "sponge"
>      construction can be used to construct -- with provable security
>      -- a very wide range of important cryptographic components.
>
> He recommends a more modern approach to cryptography: "studying the
> theory and designing systems which you can *prove* are secure."
>
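> To see what that looks like in practice, here is a minimal sketch of the
> packet construction Percival describes -- AES in CTR mode with the packet
> sequence number as the nonce, followed by an HMAC-SHA256 tag over the
> sequence number and the ciphertext (encrypt-then-MAC).  It uses a recent
> version of Python's "cryptography" package; the function names and
> framing details are illustrative assumptions, not taken from his text:
>
>     import hmac, hashlib, struct
>     from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
>
>     def seal_packet(enc_key, mac_key, seq, payload):
>         # enc_key is an AES key (16/24/32 bytes); mac_key is a separate,
>         # independent key for the HMAC.
>         # CTR nonce: 64-bit sequence number padded to the 128-bit AES
>         # block size; the low zero bytes act as the block counter, so no
>         # (key, nonce) pair repeats as long as sequence numbers are not
>         # reused.
>         nonce = struct.pack(">Q", seq) + b"\x00" * 8
>         enc = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
>         ct = enc.update(payload) + enc.finalize()
>         # Encrypt-then-MAC: the tag binds the sequence number to the
>         # ciphertext, so replayed or reordered packets fail verification.
>         mac_data = struct.pack(">Q", seq) + ct
>         tag = hmac.new(mac_key, mac_data, hashlib.sha256).digest()
>         return ct + tag
>
>     def open_packet(enc_key, mac_key, seq, packet):
>         ct, tag = packet[:-32], packet[-32:]
>         mac_data = struct.pack(">Q", seq) + ct
>         expected = hmac.new(mac_key, mac_data, hashlib.sha256).digest()
>         if not hmac.compare_digest(tag, expected):
>             raise ValueError("packet failed authentication")
>         nonce = struct.pack(">Q", seq) + b"\x00" * 8
>         dec = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).decryptor()
>         return dec.update(ct) + dec.finalize()
>
> A receiver that checks the tag against the sequence number it expects,
> before decrypting, gets exactly the properties claimed above: privacy,
> and no undetected tampering, replay, or reordering.
>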
> I think *both* statements are true -- and not contradictory at all.
> The apparent disagreement stems from differing definitions of cryptography.
>
> Many years ago, on the Cryptographer's Panel at an RSA conference,
> then-chief scientist for RSA Bert Kaliski talked about the rise of
> something he called the "crypto engineer."  His point was that the
> practice of cryptography was changing.  There was the traditional
> mathematical cryptography -- designing and analyzing algorithms and
> protocols, and building up cryptographic theory -- but there was also a
> more practice-oriented cryptography: taking existing cryptographic
> building blocks and creating secure systems out of them.  It's this
> latter group he called crypto engineers.  It's the group of people for
> whom I wrote "Applied Cryptography" and, most recently, co-wrote
> "Cryptography Engineering."  Colin knows this, directing his advice to
> "developers" -- Kaliski's crypto engineers.
>
> Traditional cryptography is a science -- applied mathematics -- and
> applied cryptography is engineering.  I prefer the term "security
> engineering," because it necessarily encompasses a lot more than
> cryptography -- see Ross Anderson's great book of that name.  And
> mistakes in engineering are where a lot of real-world cryptographic
> systems break.
>
> Provable security has its limitations.  Cryptographer Lars Knudsen once
> said: "If it's provably secure, it probably isn't."  Yes, we have
> provably secure cryptography, but those proofs take very specific forms
> against very specific attacks.  They reduce the number of security
> assumptions we have to make about a system, but we still have to make a
> lot of security assumptions.
>
> And cryptography has its limitations in general, despite the apparent
> strengths.  Cryptography's great strength is that it gives the defender
> a natural advantage: adding a single bit to a cryptographic key
> increases the work to encrypt by only a small amount, but doubles the
> work required to break the encryption. This is how we design algorithms
> that -- in theory -- can't be broken until the universe collapses back
> on itself.
>
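> A back-of-the-envelope sketch makes the asymmetry concrete.  The guess
> rate below -- 10^12 keys per second -- is an arbitrary assumption, chosen
> to be generous to the attacker:
>
>     # Expected brute-force time for an n-bit key, trying half the
>     # keyspace at an assumed 10**12 guesses per second.
>     SECONDS_PER_YEAR = 60 * 60 * 24 * 365
>
>     def years_to_break(bits, guesses_per_second=10**12):
>         return (2 ** bits / 2) / guesses_per_second / SECONDS_PER_YEAR
>
>     for bits in (56, 64, 128, 256):
>         print(f"{bits:>3} bits: {years_to_break(bits):.2g} years")
>
> At that rate a 56-bit key (DES-sized) falls in about half a day, while a
> 128-bit key takes on the order of 10^18 years -- and each added bit
> doubles that, while barely changing the defender's cost.
>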
> Despite this, cryptographic systems are broken all the time: well before
> the heat death of the universe.  They're broken because of software
> mistakes in coding the algorithms.  They're broken because the
> computer's memory management system left a stray copy of the key lying
> around, and the operating system automatically copied it to disk.
> They're broken because of buffer overflows and other security flaws.
> They're broken by side-channel attacks.  They're broken because of bad
> user interfaces, or insecure user practices.
>
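> One small example of that kind of failure -- hypothetical code, but a
> classic pattern: verifying a MAC with a comparison that stops at the
> first mismatched byte leaks, through timing, how much of a forged tag was
> correct.  The cryptography in both functions below is identical; only the
> engineering differs:
>
>     import hmac, hashlib
>
>     def verify_tag_insecure(key, message, tag):
>         expected = hmac.new(key, message, hashlib.sha256).digest()
>         if len(tag) != len(expected):
>             return False
>         for a, b in zip(expected, tag):
>             if a != b:
>                 # Early exit: the running time reveals how many
>                 # leading bytes of the forged tag were correct.
>                 return False
>         return True
>
>     def verify_tag_secure(key, message, tag):
>         expected = hmac.new(key, message, hashlib.sha256).digest()
>         # Constant-time comparison: timing does not depend on the data.
>         return hmac.compare_digest(expected, tag)
>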
> Lots of people have said: "In theory, theory and practice are the same.
> But in practice, they are not."  It's true about cryptography.  If you
> want to be a cryptographer, study mathematics.  Study the mathematics of
> cryptography, and especially cryptanalysis.  There's a lot of art to the
> science, and you won't be able to design good algorithms and protocols
> until you gain experience in breaking existing ones.  If you want to be
> a security engineer, study implementations and coding.  Take the tools
> cryptographers create, and learn how to use them well.
>
> The world needs security engineers even more than it needs
> cryptographers.  We're great at mathematically secure cryptography, and
> terrible at using those tools to engineer secure systems.
>
> Ptacek's tweet:
> https://twitter.com/tqbf/status/346328557989007360
>
> Percival's rebuttal:
> http://www.daemonology.net/blog/2013-06-17-crypto-science-not-engineering.html
> or http://tinyurl.com/lr87lln
>
> Ross Anderson's book:
> http://www.cl.cam.ac.uk/~rja14/book.html
>
> After writing this, I found a conversation between the two where they
> both basically agreed with me:
> https://news.ycombinator.com/item?id=5896167
>
> See also Stefan Lucks' response to this essay on my blog.
> http://www.schneier.com/blog/archives/2013/07/is_cryptography.html#c1558384
> or http://tinyurl.com/krabumo
>
>
> ** *** ***** ******* *********** *************
>
>       Sixth Movie-Plot Threat Contest Winner
>
>
>
> On April 1, I announced the Sixth Mostly-Annual Movie-Plot Threat Contest:
>
>      For this year's contest, I want a cyberwar movie-plot threat.
>      (For those who don't know, a movie-plot threat is a scare story
>      that would make a great movie plot, but is much too specific to
>      build security policy around.) Not the Chinese attacking our
>      power grid or shutting off 911 emergency services -- people are
>      already scaring our legislators with that sort of stuff. I want
>      something good, something no one has thought of before.
>
> On May 15, I announced the five semi-finalists. Voting continued through
> the end of the month, and the winner is Russell Thomas:
>
>      It's November 2015 and the United Nations Climate Change
>      Conference (UNCCC) is underway in Amsterdam, Netherlands. Over
>      the past year, ocean level rise has done permanent damage to
>      critical infrastructure in Maldives, killing off tourism and
>      sending the economy into freefall. The Small Island Developing
>      States are demanding immediate relief from the Green Climate
>      Fund, but action has been blocked. Conspiracy theories
>      flourish. For months, the rhetoric between developed and
>      developing countries has escalated to veiled and not-so-veiled
>      threats. One person among the elites of the Small Island Developing
>      States sees an opportunity to force action.
>
>      He's Sayyid Abdullah bin Yahya, an Indonesian engineer and
>      construction magnate with interests in Bahrain, Bangladesh, and
>      Maldives, all directly threatened by recent sea level rise. Bin
>      Yahya's firm installed industrial control systems on several
>      flood control projects, including in the Maldives, but these
>      projects are all stalled and unfinished for lack of financing.
>      He also has a deep, abiding enmity against Holland and the
>      Dutch people, rooted in the 1947 Rawagede massacre that killed
>      his grandfather and father. Like many Muslims, he declared that
>      he was personally insulted by Queen Beatrix's gift to the
>      people of Indonesia on the 50th anniversary of the massacre --
>      a Friesian cow. "Very rude. That's part of the Dutch soul, this
>      rudeness", he said at the time. Also like many Muslims, he
>      became enraged and radicalized in 2005 when the Danish newspaper
>      Jyllands-Posten published cartoons of the Prophet.
>
>      Of all the EU nations, Holland is most vulnerable to rising sea
>      levels. It has spent billions on extensive barriers and flood
>      controls, including the massive Oosterscheldekering storm surge
>      barrier, designed and built in the 80s to protect against a
>      10,000-year storm surge. While it was only used 24 times
>      between 1986 and 2010, in the last two years the gates have
>      been closed 46 times.
>
>      As the UNCCC conference began in November 2015, the
>      Oosterscheldekering was closed yet again to hold off the surge
>      of an early winter storm. Even against low expectations, the
>      first day's meetings went very poorly. A radicalized and
>      enraged delegation from the Small Island Developing States
>      (SIDS) presented an ultimatum, leading to denunciations and
>      walkouts. "What can they do -- start a war?" asked the Dutch
>      Minister of Infrastructure and the Environment in an unguarded
>      moment. There was talk of canceling the rest of the conference.
>
>
>      Overnight, there is a series of news stories in China, South
>      America, and the United States reporting malfunctions of dams
>      that resulted in flash floods and the deaths of tens or hundreds
>      of people in several cases. Web sites associated with the dams
>      were all defaced with the text of the SIDS ultimatum. In the morning,
>      all over Holland there were reports of malfunctions of control
>      equipment associated with flood monitoring and control systems.
>      The winter storm was peaking that day with an expected surge of
>      7 meters (about 23 feet), larger than the Great Flood of 1953. With
>      the Oosterscheldekering working normally, this is no worry. But
>      at 10:43am, the storm gates unexpectedly open.
>
> Microsoft Word claims it's 501 words, but I'm letting that go.
>
> This is the first professional -- a researcher -- who has won the
> contest.  Be sure to check out his blogs, and his paper at WEIS this year.
>
> Congratulations, Russell Thomas.  Your box of fabulous prizes is on its
> way to you.
>
> http://www.schneier.com/blog/archives/2013/07/sixth_movie-plo_1.html
>
>
> ** *** ***** ******* *********** *************
>
> Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing
> summaries, analyses, insights, and commentaries on security: computer
> and otherwise. You can subscribe, unsubscribe, or change your address on
> the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are
> also available at that URL.
>
> Please feel free to forward CRYPTO-GRAM, in whole or in part, to
> colleagues and friends who will find it valuable. Permission is also 
> granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
>
> CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an
> internationally renowned security technologist, called a "security guru"
> by The Economist. He is the author of 12 books -- including "Liars and
> Outliers: Enabling the Trust Society Needs to Survive" -- as well as
> hundreds of articles, essays, and academic papers. His influential
> newsletter "Crypto-Gram" and his blog "Schneier on Security" are read by
> over 250,000 people. He has testified before Congress, is a frequent
> guest on television and radio, has served on several government
> committees, and is regularly quoted in the press. Schneier is a fellow
> at the Berkman Center for Internet and Society at Harvard Law School, a
> program fellow at the New America Foundation's Open Technology
> Institute, a board member of the Electronic Frontier Foundation, an
> Advisory Board Member of the Electronic Privacy Information Center, and
> the Security Futurologist for BT -- formerly British Telecom.  See
> <http://www.schneier.com>.
>
> Crypto-Gram is a personal newsletter. Opinions expressed are not
> necessarily those of BT.
>
> Copyright (c) 2013 by Bruce Schneier.
>
> ** *** ***** ******* *********** *************
>



More information about the Gpg4win-users-de mailing list