Photos from Seattle, Washington

Most everyone who knows me knows that I am leaving the Washington, DC area and moving to the great Northwest: Seattle, Washington. I recently had the opportunity to take a motorcycle ride through Whidbey Island and discovered that Deception Pass is a spectacular sight.

Obviously I didn’t take this picture but I figured I should include a recent photo.
Deception_Pass1 (68k image)

This picture focuses on Deception Pass State Park. I made minimal edits to lighten the trees. Shot from my Digital Rebel, 24mm focal length, 1/400 sec, f/10, iso 100. Really a spectacular photo.

Deception_Pass2 (62k image)

One more from the town of Langley. Langley is a tiny town on the coast of Whidbey Island. I’m not sure what compelled me, but I walked out into the middle of the street and took this picture. Shot from my Digital Rebel, 110mm focal length, 1/250 sec, f/5.6, iso 100.

Langley (70k image)

One more. Yes, it occasionally rains in Seattle. Afterwards, the rainbows look spectacular. Shot from my Digital Rebel, 24mm focal length, 1/125 sec, f/5.6, iso 100.

Rainbow-1-1 (35k image)

Generating Colliding X.509 Certificates

I just finished reading Colliding X.509 Certificates by Arjen Lenstra, Xiaoyun Wang, and Benne de Weger, and I now have chills running up my spine. If I understand the paper correctly, the researchers generated two different RSA moduli that produce the same MD5 hash, which means the contents of a certificate signed by a trusted third party could be swapped out while the same signature still verifies. The attack isn't on the public key itself, since the factors needed to recover the private key are still computationally hard to obtain; rather, it's on the content of the certificate. The key assumption is that the certificate is signed by a third-party signer, whose public key is used for verification.

Even as posed, this is a pretty scary paper. You could generate a certificate with your legitimate content in it (distinguished name, etc.), get it signed by a Trusted Third Party (TTP), and then replace the key with another one that was never actually signed by the TTP. In essence, the TTP signature no longer guarantees that the certificate holder actually has the private key corresponding to the public key that was originally signed. It also means that certificates signed using MD5 should not be trusted.
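The core problem can be sketched with a toy model. Everything below is a hypothetical stand-in — the weak hash replaces MD5, the XOR "signature" replaces RSA, and the certificate bodies are made up — but it shows the point: a signature binds only the digest, not the bytes, so any second message with the same digest inherits the signature.

```python
# Toy sketch of hash-then-sign. The hash here is deliberately weak
# (sum of bytes mod 256) so a collision is trivial to construct; it
# stands in for MD5, and the XOR "signature" stands in for RSA.

def toy_hash(data: bytes) -> int:
    """Hypothetical collidable hash: sum of bytes mod 256."""
    return sum(data) % 256

def sign(body: bytes, secret: int) -> int:
    """Toy 'TTP signature': the digest blinded with a secret key."""
    return toy_hash(body) ^ secret

def verify(body: bytes, signature: int, secret: int) -> bool:
    """Verification recomputes the digest -- it never sees the signed body."""
    return signature ^ secret == toy_hash(body)

SECRET = 0x5A  # the TTP's toy signing key

legit = b"CN=Alice, key=AAAA"
sig = sign(legit, SECRET)          # TTP signs the legitimate certificate

# A different body with the same toy hash: '@' + 'B' (64 + 66)
# sums to the same value as 'A' + 'A' (65 + 65).
forged = b"CN=Alice, key=AA@B"

assert forged != legit
assert toy_hash(forged) == toy_hash(legit)   # a collision
print(verify(forged, sig, SECRET))           # True: the forgery carries the real signature
```

The TTP never saw the forged body, yet verification succeeds, because the signature only ever committed to the digest. That is exactly why an MD5 collision inside a certificate undermines the signature on it.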

The Next Step In Online Music

I’ve been thinking about a post that I made a few days ago that linked to an article discussing the economics of Napster-To-Go versus iTunes. The conclusions from this article focused on the losses of the music industry as a result of P2P sharing.

This morning I read this USAToday article which describes how many big players are giving away MP3 players, Blackberries, and PDAs in exchange for some purchase (e.g. buy a round-trip ticket on United and get a BlackBerry). Of course, the catch is that you have to sign up for a service contract for some period of time, which is where they make their money back.

That brings me to what I think the next evolution in online music will be. Much as we get cell phones for a very low price in exchange for a two-year service agreement, the next step is that we will be able to get music players with the same sort of deal. It makes perfect sense to me as the music industry begins to embrace digital music subscriptions. And why not? Imagine getting an iPod for free in exchange for a two-year, $30/month subscription to iTunes. That's $720 paid over time, but you get all the music you can drink in that period. The music dies when the subscription dies, but since the service provider keeps it all for me, why do I care? I can get the music whenever I want. With extensions for multiple players (e.g. family plans), I end up with an unlimited, on-demand music library, which is probably my ideal.

So what's wrong with the existing Napster model? Too much money up front. I have to buy a $300 music player before I can even use the service. If the player is part of the subscription, then just like a cell phone, I'll throw it away when my two-year service agreement is up. After all, that's sort of what's happening with MP3 players anyway. I've had two disk-based units (an Archos and an iPod) over the span of about four years. They break, they become old tech, I want a new one. What a perfect scheme. If Apple's on the ball, that's what will happen next, but I'm betting that Napster or some other service provider will jump on this first.

SHA-1 Broken

Bruce Schneier reports on his blog that SHA-1 has been broken as described in a paper by Chinese researchers Xiaoyun Wang, Yiqun Lisa Yin, and Hongbo Yu. Federal Information Processing Standard 180-1 (FIPS 180-1) describes SHA-1 in the following way:

Explanation: This Standard specifies a Secure Hash Algorithm, SHA-1, for computing a condensed representation of a message or a data file. When a message of any length < 2^64 bits is input, the SHA-1 produces a 160-bit output called a message digest.

The SHA-1 is called secure because it is computationally infeasible to find a message which corresponds to a given message digest, or to find two different messages which produce the same message digest. Any change to a message in transit will, with very high probability, result in a different message digest, and the signature will fail to verify. SHA-1 is a technical revision of SHA (FIPS 180). A circular left shift operation has been added to the specifications in section 7, line b, page 9 of FIPS 180 and its equivalent in section 8, line c, page 10 of FIPS 180. This revision improves the security provided by this standard. The SHA-1 is based on principles similar to those used by Professor Ronald L. Rivest of MIT when designing the MD4 message digest algorithm (“The MD4 Message Digest Algorithm,” Advances in Cryptology – CRYPTO ’90 Proceedings, Springer-Verlag, 1991, pp. 303-311), and is closely modelled after that algorithm.
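To make the "condensed representation" concrete, here's what that 160-bit digest looks like in practice. The use of Python's hashlib is my own illustration, not something from the standard; the input "abc" is the standard's own test vector from FIPS 180-1.

```python
import hashlib

# SHA-1 condenses any input into a fixed 160-bit (20-byte) message digest.
digest = hashlib.sha1(b"abc").hexdigest()

print(len(hashlib.sha1(b"abc").digest()) * 8)  # 160 (bits)
print(digest)  # a9993e364706816aba3e25717850c26c9cd0d89d -- FIPS 180-1 test vector
```

Any one-bit change to the input produces a completely different digest, which is why the signature on a message is computed over the digest rather than the message itself.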

The general conclusion of the paper is that collisions can be found after about 2^69 hash operations, instead of the 2^80 required by brute force. A collision is when two different messages are found to produce the same digest. That is a factor of 2^11 (2,048) fewer operations. Put concretely: if computing the attack's 2^69 hash operations took a week, the brute-force 2^80 operations would take 2,048 weeks, which is about 39 years. That's a pretty significant reduction in the time needed to break the hash. Computing 2^69 hashes is still a huge amount of work, but as Moore's law continues to give us faster processors, a 2^11-fold reduction in operations is very, very important. It effectively renders SHA-1 useless for the long term, and maybe even for the short term.
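The arithmetic above is easy to check directly (the one-week baseline is just an illustrative assumption, not a measured figure):

```python
# Work factor: brute-force collision search vs. the reported attack on SHA-1.
brute_force = 2**80
attack = 2**69

speedup = brute_force // attack
print(speedup)  # 2048, i.e. a 2^11-fold reduction

# If the attack's 2^69 operations take one week, brute force would take:
weeks = speedup * 1  # 2048 weeks
years = weeks / 52
print(round(years, 1))  # ~39.4 years
```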