My primary source of Hollywood entertainment is the Videocenter movie rental store a stone's throw from my flat in Prenzlauer Berg. Each movie is available for 1 Euro per day, the collection is as current as the DVD release schedule, and the place is run by a courteous bunch of folks. There is a popcorn machine, a soda fridge, a snack aisle, and even the Ben & Jerry's cooler. All yours, for 1 Euro. 1 movie per weekend × 1 Euro per movie = 4 Euros per month.
That's the monthly Hollywood bill for me and my wife. Now that is a hard-to-beat deal.
If I were wearing a cable/IPTV VoD service provider's hat, I would be hard-pressed to beat this deal, because breaking even at 1 Euro on another distribution medium is a tough nut to crack (see my related post on Netflix's VoD distribution cost). Plus, cable/IPTV cannot deliver my Ben & Jerry's ice cream tub.
Wait, that gives me an idea. Maybe cable/IPTV VoD service providers can team up with ice cream trucks to deliver ice cream and take a cut from Ben & Jerry's. Pizza, snacks, popcorn, T-shirts, I don't know, movie-specific stuff. Perhaps that would allow service providers to compete with the Videocenters of the world. The thing is, they need to start looking beyond technology and into ice cream trucks.
Sunday, June 29, 2008
Sunday, June 22, 2008
Renewable energy : Fossil fuels :: David : Goliath
The driving force for innovation in alternative energy sources like wind energy, solar power, and bio-fuels is the steep increase in crude oil and natural gas futures (no, I don't believe it is out of love for the environment). I wanted to understand how much time is needed to make a significant dent in fossil fuel demand by diverting energy demand to alternative fuels. A beautiful figure from the 2006 Annual Energy Review released by the US Energy Information Administration is a nice starting point. The figure is US-specific and does not consider energy-hungry China or India, but if the US energy juggernaut can be tamed with, say, 50% alternative energy sources, then I am certain that China and India will happily adopt those viable alternative energy sources as well. (Plus, I don't have the beautiful figures for the rest of the world, so let's work with the US data!)
About 14% of total US energy comes from renewable sources and nuclear power; in fact, the figure also shows that only 6% of the energy comes from renewable sources if nuclear energy is excluded. The rest comes from fossil fuels (including natural gas). So let's try to guesstimate, based on this data and varying rates of renewable energy growth in the coming years, the time until we derive as much energy from renewables as we do from fossil fuels today.
I have plotted 4 scenarios based on 5%, 10%, 15%, and 20% annual growth of renewable energy starting from its base 2006 value (from the Energy flow diagram). The plot indicates that it is going to take 10% to 15% annual growth of renewable energy to catch up with the present fossil fuel energy contribution by 2030. While there is no hard-and-fast reason for the annual growth not to exceed 10-15%, I believe there are significant inertial factors, like deployed fossil-fuel-based equipment, a lack of skilled engineers, innovation lag, legal issues, etc., which will keep renewable energy from growing at higher annual rates.
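For the curious, here is a minimal sketch of the compound-growth arithmetic behind the plot. The starting shares (roughly 6% renewable and 86% fossil of about 100 quadrillion Btu in 2006) are my rough reading of the Energy flow diagram, so treat them as assumptions.

```python
# Back-of-the-envelope: years for renewables to match today's fossil-fuel
# contribution, under different annual growth rates. The 2006 shares below
# (~6% renewable, ~86% fossil, of roughly 100 quadrillion Btu total) are a
# rough reading of the Energy Information Administration flow diagram.
import math

TOTAL_2006 = 100.0                 # quadrillion Btu, approximate US primary energy
renewable_2006 = 0.06 * TOTAL_2006
fossil_2006 = 0.86 * TOTAL_2006    # target: today's fossil contribution

for growth in (0.05, 0.10, 0.15, 0.20):
    years = math.log(fossil_2006 / renewable_2006) / math.log(1 + growth)
    print(f"{growth:.0%} annual growth -> parity around {2006 + years:.0f}")

# Roughly: 5% -> ~2061, 10% -> ~2034, 15% -> ~2025, 20% -> ~2021, which is
# why 10-15% sustained growth is needed to get there by about 2030.
```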
Conclusions:
Basically, this simple back-of-the-envelope calculation indicates that we need sustained double-digit growth in renewable energy over the next two decades to challenge the fossil fuel Goliath. The figures also seem to indicate that in the short term (5-10 years) fossil fuels are going to remain the primary energy source. Therefore, the world is going to need either a huge increase in the supply of fossil fuels or an appreciable decrease in demand for them, notwithstanding any alternative sources of energy, in the short term.
Supply will grow as better technology is used for oil and gas exploration. It has already become worthwhile to use high-sulfur crude instead of sweet light crude oil. But the fact remains that demand will have to abate to meet the short supply, through higher prices and, unfortunately, slower economies.
Sunday, June 15, 2008
Net neutrality: The value of a byte travelling on the Internet
In response to Richard Bennett's article.
Let us say, for simplicity, that most of us are connected to the best-effort, statistically multiplexed Internet. What does this mean? It means that every byte on the Internet going from point A to point B will, on average, get the same service from the Internet (same probability of loss, same delay, same delay-jitter, etc.). Therefore the Internet tends to treat each byte traversing it as equal, because in our simple example of two bytes going from A to B, the fraction of service (or utility) that each byte receives from the Internet is equal.
However, most people agree that the importance, or utility, of every byte on the Internet is not equal. For example, it may be more important to quickly transfer a byte from a VoIP conversation than a byte from a file transfer. Or it may be more important to send bytes that update stock prices than to send bytes that play a YouTube video.
Or so I think. But what do you think? What does Skype think? What does Google think? What does Comcast think? What does the government of a country think? And if they think differently, then whose voice matters? Or should anyone's voice matter more than the others?
This is the key point of the Net Neutrality conundrum. Everyone agrees that the present design of the best-effort Internet is suboptimal in that it treats every byte as equal and gives equal precedence to equal fractions of content. But the issue with doing away with this Net Neutrality model is that vested interests will decide which particular byte is more important than another. Can we trust one single company, or authority, to make the correct decision on this one? As a market believer, I would first say: let the market decide, i.e., let the price per byte, and hence the value attached to that particular byte, be the deciding factor. But the big issue is whether a flexible, effective, and dynamic market of this sort can be set up and quickly integrated with existing and upcoming Internet protocols. Until this happens, I am more comfortable with time-tested simple statistical multiplexing, the fair but sub-optimal egalitarian algorithm, doing the job.
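To make the contrast concrete, here is a toy sketch of a single congested link served two ways: today's FIFO best effort, where every byte queues equally, and a hypothetical per-byte market where the bid price decides the order. The flows, packet sizes, and bids are invented for illustration and are not a protocol proposal.

```python
# Toy comparison at one congested link: best-effort FIFO (every byte equal)
# versus a hypothetical market where the bid price per byte decides order.
# All packets, sizes, and bids below are invented for illustration.
from collections import namedtuple

Packet = namedtuple("Packet", "flow size_bytes bid_per_byte")

queue = [
    Packet("voip",   200, 0.0010),
    Packet("file",  1500, 0.0001),
    Packet("stock",   60, 0.0050),
    Packet("video", 1500, 0.0002),
]

LINK_BYTES_PER_MS = 125  # a 1 Mbps link drains 125 bytes per millisecond

def drain_order(packets):
    """Return (flow, finish_time_ms) in the order packets leave the link."""
    t, out = 0.0, []
    for p in packets:
        t += p.size_bytes / LINK_BYTES_PER_MS
        out.append((p.flow, round(t, 2)))
    return out

print("FIFO (net-neutral):  ", drain_order(queue))
print("Priced (highest bid):", drain_order(
    sorted(queue, key=lambda p: p.bid_per_byte, reverse=True)))
```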
I am relieved that the question of Net Neutrality does have a technical solution - set up a market to do the job. I am just concerned about whether there is enough political patience to wait for the technology to develop the byte market we will need.
Wednesday, June 11, 2008
Scott McNealy's talk at TU Berlin
Scott McNealy gave a talk on "Open Wins: Leadership and Innovation" at the Technical University of Berlin today. Scott McNealy is of "The Network is the Computer" fame from the dotcom boom of the 90s and is presently the Chairman of Sun Microsystems. I was part of the audience and found a few things worth blogging about.
The key message of his talk was that Sun is, and has always been, an open source champion. I don't buy into that (Solaris wasn't open source till 2005), but Scott blamed a prior agreement with AT&T over Unix for this. Whatever the truth, I cannot place Solaris on the same bandwagon as Linux when I think open source. I suspect that Sun is trying to leverage the open source developer community to shoulder the costs of keeping Solaris updated (he himself alluded to the bug-squelching power of open source software). Perhaps like Red Hat's Fedora project?
Scott also touted the Sun UltraSPARC T2 processor, with lots of multi-threading and multi-core support and a low-power footprint of 1.5 Watts per thread. Of course there was the usual Microsoft/Oracle bashing (Microsoft and its patches... so un-open-source; Oracle and its $40k per-core licensing... so un-MySQL), but the underlying message was that Sun hopes these new T2s find their way into routers and energy-conscious data centers. In my opinion, though, Oracle's grip on enterprise database computing is not going to loosen anytime soon, so Mr. McNealy may find it a tad difficult to wean enterprises off their established Oracle databases and onto the Sun-acquired MySQL.
I asked Mr. McNealy why Sun-promoted OpenOffice does not enjoy the kind of integration with Java that MS Office has with .NET (think VB macros). The answer I got seemed to suggest that I should ask the open source community about this. Why shouldn't Sun take the lead here? Java is their baby, and if Sun is all for OpenOffice, then why not cure OpenOffice's Achilles heel through Java integration? This reinforces what I said about Solaris earlier - that Sun is trying to leverage the open source developer community to shoulder the costs, in this case of integrating a credible scripting platform into OpenOffice.
Undoubtedly Sun has done more for open source than most other companies (for example, as Scott rightly pointed out, Google has a rather poor record on this front). Let's hope Sun does a lot more in the future.
Saturday, June 7, 2008
Entertainment technology and diminishing returns
I am trained in information theory, a science which emphasizes squeezing every last bit of efficiency out of communication channels to get a voice or video signal across in its best form. A noble endeavor no doubt, and one that has spawned a whole analog, and later digital, entertainment industry. But how much does it matter to the end-user?
My basic thesis is this: up to a certain quality, people do care about how much error-free information gets across. But after that, the human brain's smoothing kicks in - the apparatus in us that can skillfully ignore small blemishes in the audio track or on the screen. While it is true that video technology already takes advantage of this "help" from the human brain (that's why finite frame rates and digitized pictures work), I have a feeling that technology sometimes needlessly pushes bits which are not useful, leading to "diminishing returns" on entertainment technology investment. After all, how many of us can make out the difference between a 192kbps-encoded MP3 file and its uncompressed counterpart that is 10 times bigger?
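For what it's worth, here is the arithmetic behind that last question, assuming a 4-minute stereo track at CD quality (44.1 kHz, 16-bit) against a 192 kbps MP3; the exact ratio comes out closer to seven than ten, but it is the same order of magnitude.

```python
# Rough size comparison: uncompressed CD-quality stereo vs. a 192 kbps MP3,
# for an assumed 4-minute track.
duration_s = 4 * 60
pcm_bps = 44_100 * 16 * 2          # sample rate * bits * channels = 1,411,200 b/s
mp3_bps = 192_000

pcm_mb = pcm_bps * duration_s / 8 / 1e6   # ~42 MB
mp3_mb = mp3_bps * duration_s / 8 / 1e6   # ~5.8 MB
print(f"PCM: {pcm_mb:.1f} MB, MP3: {mp3_mb:.1f} MB, ratio ~{pcm_mb / mp3_mb:.1f}x")
```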
For another example, let's look at HDTV. Unless you are watching from a close distance, the low-pass filter in your eyes will substantially smooth out the sharp images on the HDTV screen. Undoubtedly HDTV looks better (but how much better?) than SD TV, but is the delta enough to drive consumer pull in the mass market?
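Here is a rough sketch of that "close distance" threshold, using the common rule of thumb that the eye resolves about one arcminute; the 40-inch 16:9 screen size and the acuity figure are assumptions, not measurements.

```python
# Estimate the distance beyond which a viewer with ~1 arcminute of visual
# acuity can no longer resolve individual pixels, for an assumed 40" 16:9 TV.
import math

DIAGONAL_IN = 40.0
rows_by_format = {"1080p HD": 1080, "720p HD": 720, "480-line SD": 480}

height_in = DIAGONAL_IN * 9 / math.hypot(16, 9)   # screen height from diagonal
acuity_rad = math.radians(1 / 60)                 # one arcminute

for name, rows in rows_by_format.items():
    pixel_in = height_in / rows
    max_dist_ft = (pixel_in / math.tan(acuity_rad)) / 12
    print(f"{name}: pixel-level detail is lost beyond ~{max_dist_ft:.1f} ft")
```

Under these assumptions, 1080p detail washes out beyond roughly 5 feet on a 40-inch set, which is why the viewing distance matters so much.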
The rapid sales of HDTVs and Blu-ray discs seem to suggest so. But I'd like to know what fraction of the content being watched on these HDTVs is really HD. And when the world does shift to HD, will the lowly SD TV be forgotten, going the way of B&W TV? Probably not, because a vast library of content is stored in SD format. My kids will probably watch my old Friends and M*A*S*H DVDs or my father's music video collection (stored on VHS!). So their eyes and senses will probably accept fuzzy-ol' SD TV as well. Entertainment is about content quality first and technology quality second.
Sunday, May 25, 2008
Wildflowers in the Prenzlauer Berg spring
Wednesday, May 21, 2008
Finally, the "Net" in Netflix; plus, the bandwidth question
Netflix has released a set-top box that users can use to receive movies directly over their broadband Internet connections. The box, developed by the Silicon Valley company Roku, has received good reviews from CNET and PC Magazine for its nice interface and more-or-less good performance over most home broadband connections.
Advantages for users
- No propagation delay from snail-mail shipping DVDs - no more waiting for 2 days.
- No need to mail back DVDs.
- Ability to switch to another movie or show - you are not stuck with that wrong movie you placed in your Netflix queue.
- No extra cost except the broadband connection and the $99 cost of the box.
- Savings in storage, handling, and shipping costs (to and fro) of the DVDs. Theoretically, if all Netflix subscribers switch to this technology, then Netflix can close its nationwide distribution centers and also save on postage: assuming that Netflix pays the standard first-class mail rate of $0.42 to USPS, that's a $0.84 saving per mailed DVD. I think the present overall cost of circulating a DVD to a user may be well above a dollar for Netflix.
- Centralized content control and the ability to speedily deploy new movies, shows etc.
- Ability to expand beyond the US in a relatively painless way - no distribution centers to set up, no additional staffing costs (analogous to how iTunes operates in Europe).
The Netflix system delivers video streams at 2.2 Mbps, 1 Mbps, or an even lower bitrate depending on the connection between the server and the receiving box. The quality naturally degrades with the lower bitrates, but let us assume that a user has a great Internet connection, that no bandwidth bottleneck exists between the serving CDN and this user, and that s/he can therefore watch the best 2.2 Mbps quality for the entire 120 minutes of a movie.
Size of the movie:
2.2 Mbps × 7200 seconds (i.e. 120 minutes) = 15,840 Mb = 15,840/8 MB = 1,980 MB = 1,980/1,000 GB ≈ 1.98 GB
So downloading a movie at the best quality means the CDN serves about 2 GB of data to the end-user's Netflix-Roku box.
To arrive at the bandwidth costs, let's go with the figures presented in this PBS article about CDN pricing. Disclaimer: the article is more than a year old, and I have been reading about CDN price wars all along, so the current cost of bandwidth may actually be lower than stated.
From the PBS article, the costs of streaming 2 GB to a user (assuming volume wholesale pricing) are:
- Single server: $0.26
- Akamai: $0.32
- P2P: $0.0024
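Pulling the numbers together, here is a small sketch that recomputes the ~2 GB movie size, turns the per-2GB CDN figures above into per-movie delivery costs, and compares them with a rough DVD-mailing cost (the $0.84 round-trip postage from the list above plus an assumed, purely illustrative handling overhead):

```python
# Back-of-the-envelope: per-movie streaming cost vs. the DVD-mailing model.
# The CDN prices are the per-2GB figures quoted from the PBS article; the
# DVD handling overhead is an assumed, illustrative number.
STREAM_MBPS = 2.2
MOVIE_MINUTES = 120

movie_gb = STREAM_MBPS * MOVIE_MINUTES * 60 / 8 / 1000   # ~1.98 GB
print(f"Movie size at best quality: {movie_gb:.2f} GB")

cost_per_2gb = {"Single server": 0.26, "Akamai": 0.32, "P2P": 0.0024}
for name, cost in cost_per_2gb.items():
    print(f"{name}: ${cost * movie_gb / 2.0:.3f} per movie delivered")

round_trip_postage = 2 * 0.42   # first-class USPS each way, from the list above
handling_overhead = 0.30        # assumed: storage, labor, breakage, etc.
print(f"DVD mailing: ~${round_trip_postage + handling_overhead:.2f} per movie")
```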
OK, first off, I like the P2P number the most, but let's ignore that because P2P may not be able to compete in quality with CDNs (see my paper on this). Even if Netflix uses the most expensive option, the Akamai CDN, they are getting away with just 32 cents per movie instead of the dollar-plus cost of the DVD-mailing model. Even if we assume a few more cents of overhead per movie due to the technology costs, I think Netflix is well in the green with this.
The beauty of Netflix's strategy is that they will be able to gradually wean people from the DVD-mailing model to this online content delivery model because of the convenience of the latter. And this without jeopardizing the DVD-mailing model, because there is no cannibalization here - it is a perfect migration, with one less DVD-mailing customer corresponding to one more streaming customer. Every movie streamed instead of mailed will add up and lead to a drastic reduction in Netflix's operating costs.
Meanwhile, Roku will probably make some money out of their $99 box.
Last question: And the ISP?
That's for later. Enjoy your movies.
Update: Netflix bandwidth costs come to about 5 cents per movie as of June 2009, according to this article.
Tuesday, May 20, 2008
If there is a Microsoft-Yahoo deal, then startups will feel the pinch
A Microsoft takeover of all or some of Yahoo will be a good thing for Microsoft in its battle to unseat Google from the Internet's helm. Perhaps a later Microsoft-Facebook arrangement will finally present a credible challenge to Google. I doubt the Yahoo board will now agree to anything less than the 72% premium over the original share price that Microsoft had offered earlier, so Yahoo shareholders will also come out wealthier from the deal. End users are likely to benefit from a stronger alternative to Google as well.
But one quarter will suffer quietly in the short to medium term: startup companies. Yahoo and Microsoft are two of the most prolific startup acquirers (see Figure 1, from this blog). Yahoo merging with Microsoft removes a big buyer for many startup companies. Moreover, Microsoft will have that much less cash (approximately $44B less, based on the first MS offer) to throw at startup acquisitions. With the credit supply tightening and the economy slowing down, you can be sure of a capital drought ahead for many Internet and software startups.
Friday, May 16, 2008
Crude oil demand: India, China, and the USA

Crude oil prices have never been higher (Brent sweet crude is trading at about $125 a barrel on the NYMEX as of this post). Part of the reason is the growing demand from emerging economies like India and China, which is putting upward pressure on the price of oil. Alan Greenspan writes in his book that annual world demand for crude oil has grown by 1.6% since the late 80s while production has only grown by about 0.8% annually. The gap has led investors to bid up crude oil futures in anticipation of the tightening supply, further driving up prices as the buffer between supply and demand has narrowed significantly.
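To see why that 1.6% vs. 0.8% gap matters, here is a quick sketch that compounds the two growth rates from a common starting point; the ~65 million barrels per day late-1980s baseline is an assumed, illustrative figure, and in reality prices move to keep supply and demand equal, which is precisely the point.

```python
# Illustration of the Greenspan numbers quoted above: demand growing ~1.6%/yr
# against production growing ~0.8%/yr. The common late-1980s starting point of
# ~65 million barrels/day is an assumed, illustrative figure. In reality, price
# moves to keep the two equal - which is exactly the point of the post.
START_MBPD = 65.0
DEMAND_GROWTH, SUPPLY_GROWTH = 0.016, 0.008

for years in (10, 20):
    demand = START_MBPD * (1 + DEMAND_GROWTH) ** years
    supply = START_MBPD * (1 + SUPPLY_GROWTH) ** years
    print(f"After {years} yrs: demand ~{demand:.0f} vs supply ~{supply:.0f} mb/d "
          f"({100 * (demand / supply - 1):.0f}% gap to be priced away)")
```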
I downloaded crude oil import data from the UN data website for India, China, and the USA and plotted it (Figure 1). Unsurprisingly, the USA imports far more crude oil than India or China. It is more interesting to note that the growth in US crude oil imports has been of the same order as, or steeper than, that of India and China. Therefore, demand is being driven higher more by the USA than by India or China.
In his book, Alan Greenspan speaks about the "crude oil intensity" of a nation, defined as its crude oil consumption normalized by its GDP. He states that this number is far higher for China and India than it is for the USA because the latter has shifted to a less oil-intensive service economy in the past few decades. From my perspective, the real crude oil intensity of the USA may be much higher than Greenspan computes it to be, because of the USA's large volume of imports from China. For example, a plastic toy imported from China counts the crude oil used to manufacture it and transport it to the USA as crude oil used by China.
It seems clear that the biggest lever to reduce crude oil demand lies in the hands of the USA. India and China are emerging economies eager to lift hundreds of millions of people out of poverty. As such, they may not have the political capital to cut back on their increasing (but still small) usage of crude oil. On the other hand, even a small percentage cutback in the USA will reduce demand significantly. Let's hope that the USA moves towards more efficient cars and better public transport systems, and away from its suburban driving culture, in order to keep crude oil within reach of the poorer nations of the world.
Wednesday, May 14, 2008
On dear peer-to-peer
I will be presenting a paper* at IEEE IWQoS early next month containing the analysis of a large-scale peer-to-peer live video multicast streaming session on the Internet. Think of the P2P video multicast system as BitTorrent for video streaming (instead of file sharing). The system was sending a video stream of a baseball match to tens of thousands of viewers on the Internet using P2P technology.
The presentation and paper are available online. Here are two results from the paper that in my opinion warrant particular notice.
Figure 1 shows the aggregate download and upload bandwidth consumed by the P2P system. Note the scale on the Y axis - Gbps! And this is only one video stream. What happens to the Internet when thousands of such streams become available online? Were networks designed for such usage?
An interesting artifact in this figure is that the aggregate download rate of all peers exceeds the aggregate upload rate of all peers. The difference was made up by the content provider's "bandwidth-injecting super-servers". Still, it is absolutely remarkable that the amount of additional bandwidth required is almost constant even as the number of peers increases (Figure 2).

Figure 1: Total bandwidth.
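To give a feel for where the Gbps scale comes from, here is a small sketch with assumed numbers; the stream bitrate and audience sizes below are illustrative, not the measured values from the paper.

```python
# Rough feel for the traffic scale of P2P live streaming. The stream bitrate
# and audience sizes are assumed, illustrative numbers, not the measured
# values from the paper.
STREAM_KBPS = 500                      # assumed bitrate of one live stream

for peers in (10_000, 50_000, 100_000):
    gbps = peers * STREAM_KBPS / 1e6   # aggregate download rate of all peers
    print(f"{peers:>7} peers watching one stream: ~{gbps:.0f} Gbps of delivery")

# Now imagine thousands of concurrent streams like this one, most of the
# traffic carried peer-to-peer across ISP access networks rather than by a CDN.
```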
In Figure 2, you can see the number of concurrent peers in the P2P system over the time period of the streamed game (hour 4 to hour 8). Look at the rate of churn (peers joining, peers leaving) in the P2P system. Keeping in mind that most of the bandwidth comes from these very peers, it is remarkable that this highly dynamic pool of peers is able to sustain the P2P system. Things get very exciting at the end of the baseball game (hour 8): everybody wants to leave. Now that is a big challenge for any P2P system.

Figure 2: Peer dynamics.
*Joint work with Jatinder Pal Singh (T-Labs) and Aditya Mavlankar, Pierpaolo Baccichet, and Bernd Girod (Stanford University)
Saturday, May 10, 2008
Spinning a jetliner...in a loom!
There was an incredible article in Fortune about the manufacturing process of the Boeing 787. The first plane is slated to fly before the end of the year, and Boeing is already reporting that the 787 is the fastest-selling jetliner of all time. In fact, Boeing is taking flak from customers for delaying delivery of the aircraft for want of parts. Apparently, a handful of suppliers just cannot keep up with the demand!
But what fascinated me was that the aircraft's shell will not be made out of aluminum alloys. Instead, it is made of carbon composites. This material is created from carbon fibers that are spun in a way reminiscent of spinning thread (see picture below). Epoxy resins and subsequent heat treatment create the carbon composite material for the plane's body. The material is lighter, stronger, and amenable to better aerodynamic design. For example, the entire toilet of the 787 weighs just 170 lb!
Spinning the carbon fibers for the Boeing 787 (courtesy, Fortune). The complete slide-show is available here.
Needless to say, the lighter aircraft is significantly more fuel efficient. At the same time, the greater strength means that the cabin can be pressurized to a cabin altitude of about 6,000 ft, making flying more comfortable. The material also allows more comfortable humidity levels in the cabin because, unlike metals, carbon does not corrode.
Kudos to the Boeing engineers for designing this marvel!
Friday, May 9, 2008
The recession and US housing prices early in the decade

Figure 1: Average and median US home prices, 1980-2007 (source: US Census).
"In the United States, homes had increased in value so much that households, feeling flush, seemed more willing to spend"
It is widely acknowledged now (2008) that housing prices were inflated artificially by the easy credit available in the first half of the decade. For example, Figure 1 shows the average and median US home prices between 1980 and 2007 (source: US Census). Notice the very rapid increase in new home prices between 2000 and 2006, and the most recent slowdown.
Greenspan has touched on the fundamental force responsible for the 2001 recession being so mild - the economy's white knight in shining armor was the US consumer, with pockets full from soaring house prices. All those refinancing dollars kept up consumer spending. The 2000-01 dot-com debacle was perhaps larger than it seemed, but the cushioning effect of housing prices made the pain a lot less back then.
Does this mean that it is payback time now? Did we end up borrowing from the future in 2001? With housing prices plateauing, and actually decreasing in some markets, I very much doubt that housing will bail out the economy this time.
Wanted: New knight in shining armor, preferably resilient to speculative forces.
Well, I've read that the weak dollar is good for US exports and also makes foreign imports dearer (e.g. $120+ oil). Perhaps these two factors will buttress the US economy this time. Wait and watch.
Thursday, May 8, 2008
Alan Greenspan's book

The Guru's book. Picture courtesy Penguin.
I have been reading Alan Greenspan's "The Age of Turbulence: Adventures in a New World" (ISBN 1594201315) over the past couple of days. The book is part autobiography, but given the stature of the man, it gives a unique insight into how the global economy has progressed since the war, from the perspective of the best-known central banker of all time.
In the book, Alan declares his unflinching faith in Adam Smith's capitalist ideas. Interestingly, he admits to being deeply influenced by Ayn Rand's (sometimes extreme) belief in laissez-faire capitalism. Perhaps the fall of the Soviet system prior to his writing the book stoked the ferocity of his pro-market beliefs. Whatever the reasons, the fact that market capitalism remains the only time-tested, successful economic system makes me comfortable with much of what he says on this topic.
But the book has a lot more to it. One of its interesting features is the interplay of monetary policy and politics in Washington described in the book. It is absolutely remarkable that Alan Greenspan successfully navigated a continuous 18-year term as Fed Chairman under presidents from either side of the political spectrum. Although Alan is a Republican, he plainly states his poor opinion of George Bush and extols Bill Clinton's economic policies. I believe that his ability to work with multiple administrations may also explain his ability to successfully tackle situations like the '87 stock market crash and the post-9/11 economic landscape.
The book explains the basic workings of the Fed in an easy-to-comprehend manner. You learn of technicalities like the Fed Funds rate and the discount rate, the organizational structure of the Federal Reserve and its relation to the US Treasury, its mandate, etc. While this information is also available elsewhere, Alan brings the facts to life by introducing the workings of the Fed through anecdotes and examples from his experience.
I am still on page 200. This is one of those rare books about which I hesitate to comment more before better understanding what the Guru wants us to hear. Stay tuned for more commentary on this superb book.
Saturday, April 19, 2008
Tata's acquisition spree: an answer to the "why" question
I have known Tata since I was a child. I rode Tata buses to school, soaped myself with Tata soaps, stayed in Tata hotels, and probably lived in houses supported by Tata steel. More recently, both my brother and my wife have worked for a Tata company. But the Tata I knew then was different from today's Tata: the global, competitive, aggressive, and ambitious Tata.
India's Tata group has become very well known in Europe recently after taking over the UK's Corus and Ford's Land Rover and Jaguar businesses. It is also in talks to buy T-Systems from Deutsche Telekom in Germany. If you add to all this the excitement around the €1800 Nano car designed by Tata, you have a credible Asian multinational in the world's eyes.
Many question the business sense of taking over Land Rover and Jaguar, given that the premium attached to these brands has a minuscule market in India, where a Hyundai Sonata is considered a luxury saloon! Others question the ability of the Tatas to manage unionized European operations.
But I am bullish about their recent acquisitions and their ability to turn these into strategic wins. The foremost reason for my belief is the relative professionalism with which the Tatas conduct their business. For example, top managers of the holding company, Tata Sons, are selected and groomed through the Tata Administrative Service (TAS), admission to which is based on a merit-based, competitive, and thorough screening process. So we can be certain that the folks running the acquisition show from the Tata side will be competent and highly trained.
The second reason for my bullish assessment is the value of the technology transfer from Jaguar and Land Rover to Tata. About 8 years ago, Tata Motors rode a huge success with the "Tata SUMO", a rugged diesel-powered 4x4 that looked like an SUV but cost a lot less. The SUMO was an instant hit because it rode well on India's broken roads and, because the government of India subsidizes diesel, it was highly cost-effective. I still remember seeing caravans of hired SUMOs on highways during weekends and vacations.
Then Toyota came and stole the show with the Toyota "Qualis". This gem had more to offer: Toyota quality, a quieter engine, superior interiors, and better fuel efficiency. Although the Qualis was priced slightly higher than the SUMO, it quickly overtook the latter. Tata had lost out because Toyota had superior technology. Lesson learnt for Tata: India had stepped out of its socialist past, and quality and technology now mattered to Indian customers.
Jaguar and Land Rover will fill this important gap for Tata. Another example where Tata can benefit from better technology is its subcompact, the "Indica". I have ridden in this car and can confidently state that it is noisier and bumpier than its Suzuki counterpart on Indian roads. Jaguar and Land Rover technology will perfectly complement Tata technology. A company that produces some of the cheapest steel in the world (because Tata owns iron and coal mines in India), combined with an established brand and distribution network in India, will be unbeatable with an infusion of the latest technical know-how from the acquisitions.
Sunday, April 6, 2008
Beyond desktop search...can we make user-PC data more valuable?
I ordered an 8GB flash disk last week (turns out there is a 16GB one around, but I am modest ;-) ). Since I don't have a whole lot of media content to put on this fat-stick, I will instead end up putting all my work over the past few years on it. If I factor in the emails, I should easily fill up 8GB with a couple of years' worth of data. Wow. I remember having a hard time filling a DSDD 5.25 inch 576 kB diskette back in the early 90s.
The lowdown is that we have *lots* of data. 7 MP pictures, podcasts, email archives, documents and web downloads, not to mention audio/visual media - all of this quickly adds up. Fortunately, storage has kept up; or perhaps the pace of storage growth encourages more data generation in the first place? Whatever the truth, we have a situation where a whole lot of data is sitting on our computers.
There have been many instances of large volumes of data in non-PC computing scenarios. For example, enterprise databases have routinely run into terabytes. The key difference between user-PC data and these databases is the heterogeneity and the lack of structure in the former. User-PC data comes in various formats and is generated by completely different applications. Even when the same application generates the data (e.g. an email mailbox file), the goal has never been to store the data in a way that allows global information to be extracted later.
Desktop search software is the first step in mining information from user-PC data. But search is really a very preliminary tool because it only flags the existence of the information sought via the specified keywords. There is very little cognizance of the bigger picture. Data mining - that power tool which works so beautifully for databases and other highly structured data - does not yet exist for user-PC data.
Isn't it time we started building algorithms beyond just search to help users extract useful information from their gigabytes of data?
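As a hint of what "beyond search" could look like, here is a minimal sketch that groups a folder of plain-text files into rough topics using TF-IDF and k-means (via scikit-learn); the directory path and cluster count are placeholders, and real user-PC data would need per-format extractors for mailboxes, photos, PDFs, and so on, feeding the same pipeline.

```python
# Minimal sketch of going beyond keyword search: group a folder of plain-text
# files into rough topics with TF-IDF + k-means. The path and cluster count
# are placeholders; real user-PC data would need per-format extractors
# (mailboxes, PDFs, photos, ...) feeding the same pipeline.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

DOC_DIR = Path("~/Documents").expanduser()   # placeholder location
N_CLUSTERS = 5                               # placeholder topic count

paths = sorted(DOC_DIR.rglob("*.txt"))
texts = [p.read_text(errors="ignore") for p in paths]

vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
tfidf = vectorizer.fit_transform(texts)

labels = KMeans(n_clusters=N_CLUSTERS, n_init=10, random_state=0).fit_predict(tfidf)

for cluster in range(N_CLUSTERS):
    members = [p.name for p, label in zip(paths, labels) if label == cluster]
    print(f"Topic {cluster}: {len(members)} files, e.g. {members[:3]}")
```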