You've all heard the news: investment houses going bankrupt or being sold off to retail banks, large insurance companies being nationalized, all within the span of a couple of weeks. What amazes me most are the numbers floating around. A year ago the market capitalization of these busted companies was in the hundreds of billions of dollars. They employed tens of thousands of people, including Ivy League-educated finance jocks and MBAs. Their stock prices seemed to go up, up, and away - a complete endorsement of their magical money-making ability. Then, ignominiously, they went broke.
Monetary circulation is a closed system - dollars don't just float away into outer space. So I am left wondering: where did all those loaned dollars go? To understand that, let's look at Mr. John Doe's 4-bed/3-bath home in suburban San Diego, built ca. 2004 and bought by Mr. Doe at a hefty price via a loan. Say Mr. Doe fell behind on his payments in 2008, thereby adding to the toxicity of the CDOs - Collateralized Debt Obligations - that wrongly counted Mr. Doe's mortgage as AAA+ reliable. But all this happened in 2008. Where did the money go in 2004?
The developer bought the land from the state, so part of the loan capital went to the state. The house itself was built using superior building materials (it was an expensive house), so part of that capital flowed into the pockets of the building material companies' shareholders - the glass company, the wood company, the lighting company, etc. A big part of the house price was profit for the builder/architect firm, and so it went to those companies' shareholders. Latin American labor built the house, so some of the money went to Latin America via Western Union transfers. Some more must have flowed to China for building materials, or perhaps to Italy for the Italian marble.
Now the key point: the value of the asset handed to Mr. Doe was supposed to rise over time, because this house was in the San Diego area, with the beautiful Southern California climate, the wonderful, peaceful, and happy society, the good public school in the neighborhood, and consequently never-ending demand for housing as people from all over the world came looking for a piece of this beautiful part of the world. In fact, Mr. Doe bought the house factoring all this into the future equation for paying back the hefty mortgage. In the worst case (he thought), he could just sell the house and pay back the mortgage, making a neat sum for himself. And until he sold, he could live a good life in the expensive home.
Unfortunately for Mr. Doe (and everyone else), the price of his house actually fell, and this voided the whole argument of the previous paragraph. If Mr. Doe's house now goes into foreclosure, Mr. Doe's lending bank will only recover the reduced price of the house. The notional and fluffy value described above could not be converted back into hard money when it was needed in 2008. Money has been lost, and this fact bubbles up to all those CDOs on Wall Street. Until the value of the asset - that house - rises again, there is no way to fix the problem.
Saturday, September 27, 2008
Monday, August 4, 2008
Hype around "Hypertargeted" advertising, and what REALLY matters in click-based advertising
This article in today's Wall Street Journal discusses Myspace's "hypertargeted" advertising system. The system studies profiles, messages, and other information of Myspace users and divides users into more than 1,000 distinct "buckets," or categories. This classification can be used to target specific customer groups very effectively. Or so News Corp. (which bought Myspace for about $580m) hopes.
The article goes on to explain that hypertargeting has had checkered success: some advertising campaigns have done moderately well, while others have done perfectly well with less targeted, more generic location-based (ZIP code) online advertising.
There was one very interesting example in the article. Quoting from it:
"The New York Health & Racquet Club spent $5,000 on a MySpace campaign that displayed 2.3 million ads to users on the site. Though the health club could have chosen to target ads at people who say in their profiles that they enjoy rock climbing, yoga or working out, it chose instead to simply target by age and ZIP codes near its facilities. The club said it was relatively happy with the campaign, which generated roughly 1,000 clicks, a response rate of just 0.04%."
-Source: WSJ
Now let's see: the New York Health & Racquet Club spent $5,000 for 1,000 clicks, i.e., $5 per click. If we assume that 5% of the folks who clicked on the ad actually signed up for a club membership (i.e., 50 sign-ups), then the per-membership marketing cost is $100 per customer. Not bad, considering that the average membership is $75-$100 per month. On the other hand, if only 10 people signed up, then you have a much higher price of $500 per new customer.
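The arithmetic is simple enough to script. A minimal sketch - the conversion rates here are my own assumptions for illustration, not figures reported in the article:

```python
# Back-of-the-envelope customer-acquisition cost from the WSJ campaign numbers.
ad_spend = 5000.0   # dollars spent on the MySpace campaign
clicks = 1000       # clicks the campaign generated

print(f"Cost per click: ${ad_spend / clicks:.2f}")   # $5.00

# Assumed post-click sign-up rates -- hypothetical, not from the WSJ.
for conversion_rate in (0.05, 0.01):
    signups = clicks * conversion_rate
    print(f"{conversion_rate:.0%} conversion -> {signups:.0f} sign-ups, "
          f"${ad_spend / signups:.0f} per new member")
```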
The key question, therefore, is: what is the post-click conversion rate, i.e., the yield per click? This determines the value of a click to an advertiser (like the health club) and, by extension, the price a content syndicator like Myspace can charge per click. Ultimately it is the advertiser's landing website that needs to make customers out of users.
The Myspaces and Googles of the online world may well find it worth their while to start helping advertisers convert clicks into dollars instead of stopping at matching the exact user profile with the exact advertiser. Hypertargeting is good, but paying customers are much much better.
Tuesday, July 22, 2008
What matters to me, what's in my head, and this blog

Figure: My Wordle view
I came across Wordle - a service that lets you create a word cloud from any text, highlighting the words that occur most frequently. The figure above was generated from the text of this blog. The picture says it all!!!
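For the curious, the core of what Wordle does - counting word frequencies and letting the counts drive the word size - fits in a few lines. A toy sketch (the stopword list and input filename are mine; Wordle's actual layout algorithm is far fancier):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "that", "it"}

def word_weights(text, top_n=20):
    """Return the top_n most frequent non-stopword words with their counts."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(top_n)

# Crude rendering: frequency drives the bar length instead of the font size.
for word, count in word_weights(open("blog_posts.txt").read()):
    print(f"{word:15s} {'#' * count}")
```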
Sunday, July 20, 2008
Tesla, EVs, and their mass adoption

Figure: Tesla Roadster, the sporty Electric Vehicle
The current Fortune has an article about the teething troubles of the Tesla Roadster, an electric vehicle (EV) touted as an all-electric sports car (click here for more pictures). Apparently, more than 1,000 people, including some who's-whos, have signed up to take delivery of the first fully electric sports car. The article states that excitement remains high, never mind that Tesla is having problems keeping the delivery date for most orders.
Tesla's website says that the lithium-ion-cell-powered vehicle can cover 220 miles per recharge. That is quite impressive, considering that, according to the AAA, the average American drives only 29 miles per day. As long as you are not driving cross-country, the Tesla Roadster should almost replace your conventional sports car - almost, because recharging the 6,831 lithium-ion cells on the Tesla Roadster takes 3.5 hours, compared to the 5 minutes of tanking up a conventional Porsche Boxster. The long recharge time is still not a deal breaker: if you can remember to charge your cell phone every night, then plugging in the car every evening shouldn't be that hard either.
The question is, will EV technology follow the conventional wisdom that early adopter products migrate down to the mass market? Does it make economic sense to buy such a car for the John Doe on the street, if not now, then 5 years into the future?
The Tesla Motors website says that the operating cost for the Tesla Roadster is under 2 cents per mile. The operating cost for a comparable Porsche Boxster is about 20 cents per mile (calculated from this website, with gas at $4 per gallon)*.
Unfortunately, the Tesla Roadster 2009 edition costs about $109,000 while the Boxster costs less than half, about $50,000. Or, put another way, you will have to drive
(109000-50000)/(0.20-0.02) = 327,778 miles,
before the extra price of the Tesla Roadster can be justified!!!
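Here is the same break-even calculation as a script, using the prices and per-mile figures quoted above. The gas-tripling case assumes the Boxster's operating cost scales linearly with the gas price:

```python
tesla_price, boxster_price = 109_000, 50_000   # USD, 2009 Roadster vs. Boxster
tesla_per_mile = 0.02      # ~2 cents/mile (Tesla's claim)
boxster_per_mile = 0.20    # ~20 cents/mile with gas at $4/gallon

breakeven = (tesla_price - boxster_price) / (boxster_per_mile - tesla_per_mile)
print(f"Break-even: {breakeven:,.0f} miles")   # ~327,778 miles

# If gas triples to $12/gallon (assuming operating cost scales with gas price):
breakeven_3x = (tesla_price - boxster_price) / (3 * boxster_per_mile - tesla_per_mile)
print(f"Break-even at $12/gallon gas: {breakeven_3x:,.0f} miles")   # ~101,724 miles
```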
Since lithium-ion batteries will not last 327K miles (and neither will the rest of the car), I think EV technology is not getting into the mass market anytime soon. Even if the price of gas triples, you would have to drive more than 100K miles in your EV before it saves you any money. And I haven't even factored in the lost opportunity of investing the $59,000 difference elsewhere.
So clearly the argument of saving on energy costs is meaningless if the EV costs an arm and a leg. The question is: can EV manufacturers, or liberal government subsidies, narrow the price gap between EVs and gas-powered vehicles?
*assuming that the operating costs only cover energy costs
Sunday, July 13, 2008
Crude oil: how much do we have?

It's always good to see how much ice cream still remains in the tub in the freezer. So I visualized the world's proved crude oil reserves (Figure 1) with data from the Energy Information Administration. The EIA provides a wonderful Excel sheet with all this data; I just made the plot.
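For anyone who wants to reproduce the plot, a minimal sketch - the filename and sheet layout here are my own assumptions, and the actual EIA workbook may be arranged differently:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed layout: a 'Year' column plus one column of proved reserves
# (billion barrels) per region.
reserves = pd.read_excel("eia_proved_crude_reserves.xls").set_index("Year")
reserves["World total"] = reserves.sum(axis=1)   # aggregate proved reserves

reserves["World total"].plot()
plt.title("World proved crude oil reserves")
plt.ylabel("Billion barrels")
plt.show()
```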
I like some things here:
- The discontinuities - sharp upward jumps.
- The fact that worldwide proved crude oil reserves have more than doubled in about 3 decades.
- Most of this doubling happened in active oil-producing regions (the optimist in me thinks that more prospecting in other under-studied regions may yield some more discontinuities, in the right direction, i.e., up).
- We seem to be pumping out less than we are discovering (that's why the aggregate proved reserves point upwards).
So, why are crude oil prices shooting through the roof if there is so much buried under us? These are some supply-side* reasons:
- Crude oil is harder to get because new reserves are geographically challenging.
- Sweet light crude is harder to find, and oil companies need to look at harder-to-extract and harder-to-refine heavy crude.
- There is not enough refining capacity.
- There is not enough investment in new oil fields.
- Some of the crude oil lies in politically unstable regions.
Amen to my optimism!
*For some demand-side analysis see this post.
Saturday, July 5, 2008
Unlearning Google
Google's Viacom fiasco is an ominous wake-up call for anyone who cares about his or her online privacy. Today Viacom, tomorrow some other company, another day a government, can arm-twist Google into giving away log data containing user names, IP addresses, keywords, watched content, mouse-clicks, email, and any other information that Google collects.
So far, Google has only used this data for directed marketing - at worst, wringing money out of people's thoughts and desires through the AdSense infrastructure. The problem is that the same data can easily be massaged into revealing the political, ethical, racial, religious, sexual, and other personal leanings of a person. There may be money to be made out of this data as well, but more importantly, there is the real danger of this information being misused as a pretext for prosecution or blackmail.
Google publicly defends its privacy record. Unfortunately, user privacy is not the most important objective for a publicly traded company; shareholder value is. And to create shareholder value, a company needs to survive. A determined government can easily make the survival of a company contingent on compliance with the government's wishes. Google's motto says "Don't be evil." The trouble with this slogan is: who decides what "evil" is?
Another scary scenario can be built around theft of sensitive user data. The media reported that Google is handing over 4TB of YouTube log data to Viacom. Now, 4TB is substantial today, but not a lot for future data storage technology: we may have 4TB USB pen drives within the next 5 years. What if a disgruntled employee smuggled this data out of Google and auctioned it off to blackmailers for a few hundred grand?
No easy answers here.
I can keep ranting about Google and privacy, but I am writing this blog on Google property (Blogger)!!! My wife and I are avid users of Gmail, Orkut, Google Reader, Google search, and Google News. Are we toast? Or can we wean ourselves off Google?
I parsed our Firefox history over a few weeks to figure out the ratio of Google to non-Google websites visited, in order to get an idea of our Google dependence. The results are not pretty: Google properties accounted for just over 50% of all website visits (Figure 1).
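For reference, here is roughly how such a tally can be computed from Firefox's history database (Firefox 3 keeps it in places.sqlite). The list of Google domains is my own guess at what counts as a Google property:

```python
import sqlite3
from urllib.parse import urlparse

GOOGLE_DOMAINS = ("google.", "gmail.", "youtube.", "blogger.", "blogspot.", "orkut.")

conn = sqlite3.connect("places.sqlite")   # work on a copy of the profile file
rows = conn.execute("SELECT url, visit_count FROM moz_places").fetchall()

total = google = 0
for url, visits in rows:
    host = urlparse(url).netloc.lower()
    total += visits or 0                  # visit_count can be NULL
    if any(d in host for d in GOOGLE_DOMAINS):
        google += visits or 0

print(f"Google properties: {google / total:.1%} of {total} recorded visits")
```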
Fortunately, there are non-Google alternatives to all Google applications, so in theory we can start using other applications instead. Of course, there is nothing to guarantee that other websites will not yield to the same pressures as Google. But at least we can spread our web footprint - no single entity will have the complete view of our web presence that Google has today.
The Firefox history indicated that we visit a few websites often and the rest rarely (Figure 2). The often-visited websites were the usual suspects - search, web mail, social networking, blogs, and news - and Google dominated this space. This is a great sign, because it shows that even though Google is big in terms of visits, it is not very heterogeneous in the content and services it offers. Google is not my bank, not my bookstore, not my VoIP provider, not my university, and not my community. In fact, if I remove the top 6 Google properties from the data, the distribution starts looking much more uniform: my web log data would be spread across heterogeneous websites. Doesn't this flavor of obfuscation help privacy?
There may still be hope for privacy on the Internet.
Sunday, June 29, 2008
Hollywood DVD - a buck a pop, beat that...maybe with ice cream?
My primary source of Hollywood entertainment is the Videocenter movie rental store located a stone's throw from my flat in Prenzlauerberg. Each movie is available for 1 euro per day, the collection is as current as the DVD release schedule, and the place is run by a courteous bunch of folks. There is a popcorn machine, a soda fridge, a snack aisle, and even the Ben & Jerry's cooler. All yours, for 1 euro. 1 movie per weekend x 1 euro per movie x 4 weekends = 4 euros per month.
That's the monthly Hollywood bill for me and the wife. Now that is a hard-to-beat deal.
If I were wearing a cable/IPTV VoD service provider's hat, I would be hard-pressed to beat this deal, because breaking even at 1 euro in another distribution medium is a tough cookie (see my related post on Netflix's VoD distribution cost). Plus, cable/IPTV cannot deliver my Ben & Jerry's ice cream tub.
Wait, that gives me an idea. Maybe cable/IPTV VoD service providers can team up with ice cream trucks to deliver ice cream and take a cut from Ben & Jerry's. Pizza, snacks, popcorn, T-shirts - movie-specific stuff. Perhaps this would allow service providers to compete with the Videocenters of the world. The thing is, they need to start looking outside technology and into ice cream trucks.
Sunday, June 22, 2008
Renewable energy : Fossil fuels :: David : Goliath
The driving force for innovation in alternative energy sources like wind energy, solar power, and bio-fuels is the steep increase in crude oil and natural gas futures (no, I don't believe it is out of love for the environment). I wanted to understand how much time is needed to make a significant dent in fossil fuel demand by diverting energy demand to alternative fuels. A beautiful figure from the 2006 Annual Energy Review released by the US Energy Information Administration is a nice starting point. The figure is US-specific and does not consider energy-hungry China or India, but if the US energy juggernaut can be tamed with, say, 50% alternative energy sources, then I am certain that China and India will happily adopt these viable alternative energy sources as well. (Plus, I don't have the beautiful figures for the rest of the world, so let's work with the US data!)
About 14% of total US energy comes from renewable sources and nuclear power; in fact, the figure says that only 6% comes from renewable sources if nuclear energy is excluded. The rest comes from fossil fuels (including natural gas). So let's guesstimate, based on this data and varying rates of renewable energy growth in the coming years, the time until we derive as much energy from renewables as we do from fossil fuels today.
I have plotted 4 scenarios based on 5%, 10%, 15%, and 20% annual growth of renewable energy, starting from the base 2006 value (from the energy flow diagram). The plot indicates that it is going to take 10% to 15% annual growth of renewable energy to catch up with the present fossil fuel contribution by 2030. While there is no hard-written reason why annual growth cannot exceed 10-15%, I believe there are significant inertial factors - deployed fossil-fuel-based equipment, lack of skilled engineers, innovation lag, legal issues, etc. - that will keep renewable energy from growing at higher annual rates.
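The catch-up year falls out of simple compound growth. A sketch, assuming renewables start at 6% of total US energy and fossil fuels contribute roughly 86% (i.e., 100% minus the 14% renewable-plus-nuclear share), held constant for simplicity:

```python
import math

renewable_share = 6.0   # % of 2006 US energy from renewables (EIA figure above)
fossil_share = 86.0     # % from fossil fuels, approximated as 100% - 14%

for growth in (0.05, 0.10, 0.15, 0.20):
    years = math.log(fossil_share / renewable_share) / math.log(1 + growth)
    print(f"{growth:.0%} annual growth -> parity around {2006 + math.ceil(years)}")
# 5% -> ~2061, 10% -> ~2034, 15% -> ~2026, 20% -> ~2021
```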
Conclusions:
Basically, this simple back-of-the-envelope calculation indicates that we need sustained double-digit growth in renewable energy over the next 2 decades to challenge the fossil fuel Goliath. The figure also seems to indicate that in the short term (5-10 years), fossil fuels are going to remain the primary energy source. Therefore, notwithstanding any alternative sources of energy, the world is going to need either a huge increase in the supply of fossil fuels or an appreciable decrease in demand in the short term.
Supply will grow as better technology is used for oil and gas exploration; it has already become worthwhile to use high-sulfur crude instead of sweet light crude oil. But the fact remains that demand will have to abate to meet the short supply - through higher prices and, unfortunately, slower economies.
Sunday, June 15, 2008
Net neutrality: The value of a byte travelling on the Internet
In response to Richard Bannet's article.
Let us say, for simplicity, that most of us are connected to the best-effort, statistically multiplexed Internet. What does this mean? It means that every byte going from point A to point B will, on average, get the same service from the Internet (same probability of loss, same delay, same delay-jitter, etc.). The Internet therefore tends to treat each byte traversing it as equal: in our simple example of 2 bytes going from A to B, the fraction of service (or utility) that each byte receives from the Internet is equal.
However, most people agree that the importance, or utility, of every byte on the Internet is not equal. For example, it may be more important to quickly transfer a byte from a VoIP conversation than a byte from a file transfer. Or it may be more important to send bytes that update stock prices than bytes that play a YouTube video.
Or so I think. But what do you think? What does Skype think? What does Google think? What does Comcast think? What does the government of a country think? And if they think differently, then whose voice matters? Or should anyone's voice matter more than the others?
This is the key point of the Net Neutrality conundrum. Everyone agrees that the present design of the best-effort Internet is suboptimal in that it treats every byte as equal and gives equal precedence to equal fractions of content. But the issue with doing away with this Net Neutrality model is that vested interests will decide which particular byte is more important than another. Can we trust one single company, or authority, to make the correct decision here? As a market believer, I would first say: let the market decide, i.e., let the price per byte, and hence the value attached to that particular byte, be the deciding factor. But the big issue is whether a flexible, effective, and dynamic market of this sort can be set up and quickly integrated with existing and upcoming Internet protocols. Until this happens, I am more comfortable letting time-tested, simple statistical multiplexing - the fair but suboptimal egalitarian algorithm - do the job.
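To make the contrast concrete, here is a toy comparison of the two scheduling philosophies. The packet types and bids are invented for illustration; no deployed protocol carries per-byte prices today:

```python
import heapq
from collections import deque

packets = [            # (description, hypothetical bid in cents)
    ("VoIP frame", 5.0),
    ("file-transfer chunk", 0.1),
    ("stock-price update", 8.0),
    ("YouTube video segment", 0.5),
]

# Best-effort statistical multiplexing: first come, first served.
fifo = deque(packets)
print("FIFO order:  ", [name for name, _ in fifo])

# Hypothetical byte market: the highest bidder is transmitted first.
market = [(-bid, name) for name, bid in packets]
heapq.heapify(market)
print("Market order:", [heapq.heappop(market)[1] for _ in range(len(market))])
```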
I am relieved that the question of Net Neutrality does have a technical solution: set up a market to do the job. I am just concerned about whether there is enough political patience to wait for the technology behind this byte market to develop.
Wednesday, June 11, 2008
Scott McNealy's talk at TU Berlin
Scott McNealy gave a talk on "Open Wins: Leadership and Innovation" at the Technical University of Berlin today. McNealy is of "The Network is the Computer" fame from the dotcom boom of the 90s and is presently the Chairman of Sun Microsystems. I was in the audience and found a few things worth blogging.
The key message of his talk was that Sun is, and always has been, an open source champion. I don't buy that (Solaris wasn't open source till 2005), though Scott blamed a prior agreement with AT&T over Unix for this. Whatever the truth, I cannot place Solaris on the same bandwagon as Linux when I think open source. I suspect that Sun is trying to leverage the open source developer community to shoulder the costs of keeping Solaris updated (he himself alluded to the bug-squelching power of open source software). Perhaps like Red Hat's Fedora project?
Scott also touted the Sun UltraSPARC T2 processor, with lots of multi-threading and multi-core support and a low-power footprint of 1.5 watts per thread. Of course, there was the usual Microsoft/Oracle bashing (Microsoft and its patches... so un-open-source; Oracle and its $40k per-core licensing... so un-MySQL), but the hidden message was that Sun hopes these new T2s find their way into routers and energy-conscious data centers. In my opinion, though, Oracle's grip on enterprise database computing is not going to loosen anytime soon, so Mr. McNealy may find it a tad difficult to wean enterprises from their established Oracle databases to the Sun-acquired MySQL.
I asked Mr. McNealy why the Sun-promoted OpenOffice does not enjoy the kind of integration with Java that MS Office has with .NET (think VB macros). The answer I got seemed to suggest that I should ask the open source community about this. Why shouldn't Sun take the lead here? Java is their baby, and Sun says it is all for OpenOffice, so why not cure the Achilles heel of OpenOffice through Java integration? This reinforces my earlier suspicion about Solaris: Sun is trying to leverage the open source developer community to shoulder the costs of integrating a credible scripting platform into OpenOffice.
Undoubtedly, Sun has done more for open source than most other companies (for example, as Scott rightly pointed out, Google has a rather poor record on this front). Let's hope Sun does a lot more in the future.
Saturday, June 7, 2008
Entertainment technology and diminishing returns
I am trained in information theory, a science that emphasizes squeezing every last bit of efficiency out of communication channels to get a voice or video signal across in its best form. A noble endeavor no doubt, and one that has spawned a whole analog, and later digital, entertainment industry. But how much does it matter to the end user?
My basic thesis is this: up to a certain quality, people do care about how much error-free information gets across. But beyond that, the human brain's smoothing kicks in - the apparatus in us that can skillfully ignore small blemishes in the audio track or on the screen. While it is true that video technology already takes advantage of this "help" from the human brain (that's why finite frame rates and digitized pictures work), I have a feeling that technology sometimes needlessly pushes bits that are not useful, leading to diminishing returns on entertainment technology investment. After all, how many of us can tell the difference between a 192kbps-encoded MP3 file and its uncompressed counterpart that is roughly seven times bigger?
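The MP3 claim is easy to check. For CD-quality stereo, the uncompressed bitrate works out to roughly seven times the 192 kbps MP3:

```python
cd_bitrate = 44_100 * 16 * 2   # sample rate * bits/sample * channels = 1,411,200 bps
mp3_bitrate = 192_000          # 192 kbps

print(f"CD audio: {cd_bitrate / 1000:.0f} kbps")                   # ~1411 kbps
print(f"Uncompressed is {cd_bitrate / mp3_bitrate:.1f}x bigger")   # ~7.4x
```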
For another example, let's look at HDTV. Unless you are watching from a close distance, the low-pass filter in your eyes will substantially smooth out the sharp images on the HDTV screen. Undoubtedly HDTV looks better (but how much better?) than SD TV, but is the delta enough to drive consumer pull in the mass market?
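A rough check on that "low-pass filter": assuming the common rule of thumb that the eye resolves about 1 arcminute, here is the distance beyond which adjacent pixels blend together on a 42-inch 16:9 panel (the panel size and the SD pixel count are my own illustrative assumptions):

```python
import math

diagonal_in = 42.0
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~36.6 inches wide

for name, horizontal_pixels in (("1080p HD", 1920), ("480-line SD", 720)):
    pixel_pitch = width_in / horizontal_pixels
    blur_ft = pixel_pitch / math.tan(math.radians(1 / 60)) / 12
    print(f"{name}: pixels merge beyond ~{blur_ft:.1f} ft")
# 1080p: ~5.5 ft, SD: ~14.6 ft -- sit farther back and the HD detail is wasted
```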
The rapid sales of HDTVs and Blu-ray discs seem to suggest so. But I'd like to know what fraction of the content being watched on these HDTVs is really HD. And when the world does shift to HD, will the lowly SD TV be forgotten, going the way of the B&W TV? Probably not, because a vast library of content is stored in SD format. My kids will probably watch my old Friends and M*A*S*H DVDs or my father's music video collection (stored on VHS!). So their eyes and senses will probably accept fuzzy ol' SD TV as well. Entertainment is about content quality first and technology quality second.
Sunday, May 25, 2008
Wildflowers in the Prenzlauerberg spring
Wednesday, May 21, 2008
Finally, the "Net" in Netflix; plus, the bandwidth question
Netflix has released a set-top box that users can use to receive movies directly over their broadband Internet connections. The box, developed by the Silicon Valley company Roku, has received good reviews on CNET and PC Magazine for its nice interface and more-or-less good performance over most home broadband connections.
Advantages for users
- No propagation delay from snail-mail shipping DVDs - no more waiting for 2 days.
- No need to mail back DVDs.
- Ability to switch to another movie or show - you are not stuck with that wrong movie you placed in your Netflix queue.
- No extra cost except the broadband connection and the $99 cost of the box.
Advantages for Netflix
- Savings in the storage, handling, and (two-way) shipping costs of DVDs. Theoretically, if all Netflix subscribers switch to this technology, Netflix can close its nationwide distribution centers and also save on postage: assuming that Netflix pays USPS the standard first-class rate of $0.42, that's a $0.84 saving per mailed DVD. I think the present overall cost of circulating a DVD to a user may well be above a dollar for Netflix.
- Centralized content control and the ability to speedily deploy new movies, shows etc.
- Ability to expand beyond the US in a relatively painless way - no distribution centers to set up, no additional staffing costs (analogous to how iTunes operates in Europe).
The Netflix system delivers video streams at 2.2 Mbps, 1 Mbps, or an even lower bitrate, depending on the connection between the server and the receiving box. Quality naturally degrades at the lower bitrates, but let us assume that a user has a great Internet connection, that no bandwidth bottleneck exists between the serving CDN and this user, and that s/he can watch the best 2.2 Mbps quality for the entire 120 minutes of a movie.
Size of the movie:
2.2 Mbps x 7,200 seconds (i.e., 120 minutes) = 15,840 Mb = 15,840/8 MB = 1,980 MB = 1.98 GB
So downloading a movie at the best quality means the CDN serves about 2GB of data to the end-user's Netflix-Roku box.
To arrive at the bandwidth costs, let's go with the figures presented in this PBS article about CDN pricing. Disclaimer: the article is more than a year old, and I have been reading about CDN price wars all along, so the current cost of bandwidth may actually be lower than stated.
From the PBS article, the costs for streaming a 2GB movie to a user (assuming volume wholesale pricing) are:
Single server: $0.26
Akamai: $0.32
P2P: $0.0024
OK, first off, I like the P2P number the most, but let's ignore it because P2P may not be able to compete in quality with CDNs (see my paper on this). Even if Netflix uses the most expensive option, the Akamai CDN, they get away with just 32 cents per movie instead of the dollar-plus cost of the DVD-mailing model. Even if we assume a few more cents of overhead per movie due to technology costs, I think Netflix is well in the green here.
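The per-movie numbers follow directly from the stream size and the per-GB rates implied by the PBS figures (e.g., Akamai's $0.32 per 2GB stream works out to about $0.16/GB; the other rates below are back-calculated the same way):

```python
bitrate_mbps = 2.2
movie_minutes = 120

movie_gb = bitrate_mbps * movie_minutes * 60 / 8 / 1000   # ~1.98 GB
print(f"Movie size: {movie_gb:.2f} GB")

for cdn, usd_per_gb in (("Single server", 0.13), ("Akamai", 0.16), ("P2P", 0.0012)):
    print(f"{cdn}: ~${movie_gb * usd_per_gb:.2f} per movie")
```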
The beauty of Netflix's strategy is that the convenience of online delivery will let them gradually wean people from the DVD-mailing model. And this without jeopardizing the DVD-mailing model, because there is no cannibalization here - it is perfect migration, with one less DVD-mailing customer corresponding to one more streaming customer. Every movie streamed instead of mailed will add up to a drastic reduction in Netflix's operating costs.
Meanwhile, Roku will probably make some money out of their $99 box.
Last question: And the ISP?
That's for later. Enjoy your movies.
Update: Netflix's bandwidth costs come to about 5 cents per movie as of June 2009, according to this article.
Tuesday, May 20, 2008
If there is a Microsoft Yahoo deal then startups will feel the pinch
A Microsoft takeover of all or part of Yahoo would be a good thing for Microsoft in its battle to unseat Google from the Internet's helm. Perhaps a later Microsoft-Facebook arrangement will finally present a credible challenge to Google. I doubt the Yahoo board will now agree to anything less than the 72% premium over the original share price that Microsoft offered earlier, so Yahoo shareholders will also come out wealthier from the deal. End users are likely to benefit from a stronger alternative to Google as well.
But one quarter will suffer quietly in the short to medium term: startup companies. Yahoo and Microsoft are two of the most prolific startup acquirers (see Figure 1, from this blog). Yahoo merging with Microsoft removes a big buyer for many startup companies. Moreover, Microsoft will have that much less cash (approximately $44B less, based on the first MS offer) to throw at startup acquisitions. With the credit supply tightening and the economy slowing down, you can be sure of a capital drought ahead for many Internet and software startups.
Friday, May 16, 2008
Crude oil demand: India, China, and the USA

Crude oil prices have never been higher (Brent sweet crude is trading at about $125 a barrel as of this post). Part of the reason is the growing demand from emerging economies like India and China, which is putting upward pressure on the price of oil. Alan Greenspan writes in his book that the annual world demand for crude oil has grown by 1.6% since the late 80s, while production has only grown by about 0.8% annually. The gap has led investors to bid up crude oil futures in anticipation of tightening supply, further driving up prices as the buffer between supply and demand has narrowed significantly.
I downloaded crude oil import data for India, China, and the USA from the UN data website and plotted it (Figure 1). Unsurprisingly, the USA imports far more crude oil than India or China. More interesting is that the growth in US crude oil imports has been of the same order as, or steeper than, that of India and China. Therefore, demand is being driven higher more by the USA than by India or China.
In his book, Alan Greenspan speaks of the "crude oil intensity" of a nation, defined as its crude oil consumption normalized by its GDP. He states that this number is far higher for China and India than for the USA, because the latter has shifted to a less oil-intensive service economy over the past few decades. From my perspective, the real crude oil intensity of the USA may be much higher than Greenspan computes it to be, because of the USA's large volume of imports from China. For example, a plastic toy imported from China counts the crude oil used to manufacture it and transport it to the USA as crude oil used by China.
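Greenspan's metric is straightforward to compute. A sketch, with rough 2008-era magnitudes plugged in purely for illustration (about 20 million barrels/day against a ~$14,000B GDP for the USA, and about 8 million barrels/day against a ~$4,000B GDP for China):

```python
def oil_intensity(barrels_per_day, gdp_billion_usd):
    """Crude oil intensity: barrels consumed per day per billion dollars of GDP."""
    return barrels_per_day / gdp_billion_usd

print(f"USA:   {oil_intensity(20_000_000, 14_000):,.0f} bbl/day per $B GDP")   # ~1,400
print(f"China: {oil_intensity(8_000_000, 4_000):,.0f} bbl/day per $B GDP")     # ~2,000
```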
It seems clear that the biggest lever for reducing crude oil demand lies in the hands of the USA. India and China are emerging economies eager to lift hundreds of millions of people out of poverty; as such, they may not have the political capital to cut back on their increasing (but still small) usage of crude oil. On the other hand, even a small percentage cutback in the USA will reduce demand significantly. Let's hope the USA moves towards more efficient cars and better public transport systems, and away from its suburban driving culture, in order to keep crude oil within reach of the poorer nations of the world.