While walking in Media Markt's laptop aisle yesterday I was surprised to see the number of mini laptops (Asus Eee PC, Acer Aspire One, Toshiba netbook, etc.) in the €299-500 price range. Interestingly, some of them are being sold like cellphones - with a price tag of just 1 Euro with a 2-year "3G data plan". The data plan costs 37 Euros, and probably gives customers a few GB of bandwidth a month.
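Out of curiosity, here is a quick back-of-the-envelope on what that "1 Euro" really costs over the contract - a sketch assuming the 37 Euros is a monthly fee, which the price tag did not spell out:

```matlab
% Back-of-the-envelope cost of the "1 Euro" netbook offer.
% Assumption (not stated in the offer): the 37 Euros is a monthly plan fee.
upfront     = 1;      % Euros paid for the netbook itself
monthly_fee = 37;     % Euros per month for the 3G data plan (assumed monthly)
contract_mo = 24;     % 2-year contract

total_cost = upfront + monthly_fee * contract_mo;   % = 889 Euros
fprintf('Total over the contract: %d Euros\n', total_cost);
fprintf('Versus buying a 299-500 Euro netbook outright plus a separate data plan.\n');
```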
So adopters get to carry their *free* netbook and Internet connection anywhere, without paying outrageous Wi-Fi hotspot/hotel Internet charges. Perhaps in a few months this netbook offer will spur many to switch to 3G Internet and give up on tethered Internet (cable, DSL) entirely. Are we looking at this cannibalization in the next couple of years? Low-cost cellphones cannibalized fixed-line telephony. Can netbooks with 3G data plans cannibalize tethered DSL?
It all depends on the service quality of 3G data vs. DSL. Right now 3G data can seldom serve more than a few hundred kbps versus the multiple Mbps of DSL connections. But I hazard that there is a sizeable market segment to whom mobility (and a free netbook) will appeal more than a blazing-fast connection.
Another issue will be the scalability of 3G data - 3G infrastructure has some fundamental bandwidth limits, which makes mass deployment in dense areas problematic. LTE, the next generation of cellular wireless networks, promises much more bandwidth, but its deployment is only planned over the next decade. Perhaps mass data demand over wireless will speed things up for LTE (or WiMAX).
The multi-person home tethered service is probably safe from wireless 3G data for now. Splitting 3G bandwidth is certainly possible via Internet sharing, or with a router that accepts a 3G data card and shares the bandwidth over Wi-Fi to multiple home PCs, but the slow speed will be an issue for demanding applications like online video, gaming, etc. 3G data is still a 'midband' service. Moreover, netbooks are difficult to use with their small screens and cramped keyboards. Presently deployed laptops and desktops are certainly more comfortable (minus their limited mobility).
The netbooks' rise is absolutely remarkable. A couple of years ago MIT's OLPC project seeded the idea of creating cheap, affordable laptops for school children in developing countries. Then Intel threw in its own school laptop competitor. Asus stole the commercial show with its Eee PC 701. Intel's spectacular move of creating the low-power Atom processor gave it a captive netbook market and allowed for smaller, lighter batteries. Microsoft resurrected Windows XP as the netbook operating system, while Linux provided a free alternative. And then there is 3G data availability through USB sticks.
It all seems to have come together at the right time.
Sunday, November 30, 2008
Saturday, November 8, 2008
Pure gold statistics about the Internet
I found an absolutely remarkable report issued by the OECD in the summer of 2008. It has lots of goodies on OECD Internet usage. Here is the link:
The future of the Internet Economy, a Statistical profile
One graph in the report was particularly interesting, and it is pasted above. The graph shows OECD countries' Internet usage broken down by age group. It is clear that younger people use the Internet much more than older people.
Now as time progresses today's young people will age, and I do not think they will give up on the Internet as they do. Meanwhile, the next generations will be even more Internet-savvy than today's youngsters. So the lowdown is that the next 15-20 years will see a high-growth period for OECD Internet broadband demand. There are about 20-30 more years of growth in this space before Internet broadband growth saturates the way, say, electricity did.
Wednesday, October 29, 2008
My splintered digital world
I get the feeling that my digital life is divided, unequally, among my email accounts, my phone, my USB stick, my laptop, my desktop, my cameras, my Linux workstation, my disks, my backups, my Facebook, my LinkedIn, my blog, my webpage, my o my o my.
When I first started using PCs two decades ago I had a nice floppy disk to carry my digital data. Either things fit on it, or they were thrown out - great garbage removers, those 360 kB DSDD disks. Well, I am not going to go down the Luddite path here; I am perfectly happy with today's practically infinite storage. But I am concerned about data splintering. My data sits in all these various formats, versions, and names somewhere in various parts of my digital ecospace.
That's not to say I don't have a system for filing different things in different directories and the like. But too much information may result in suboptimal storage. For example, I have nicely named and dated folders for my pictures. But when 94 photos of my niece arrive via email, it takes precious minutes to download and store each picture. How do I aggregate information quickly and without manual labor?
Algorithmic search presents the next best alternative - just keep everything anywhere and then have your computers crawl and index the information. But search does not span devices (at least not right now). How do I pull up the phone number stored in my home phone's caller ID while sitting at work?
Problems, problems, problems. That's great, because it means there is a whole lot of work to do in this area. Start-up, anyone?
Monday, October 27, 2008
Smartphones or flash drives to replace the laptop (?)
I am intrigued by the possibility of leaving my laptop at home (or work) instead of carrying it around every single day, as discussed in this WSJ article. Are we already there?
I have been running a small experiment on myself for the past few months to test the feasibility of this approach. My use case involves heavy usage of my Outlook mailbox, lots of documents, and some software like MS Office, Emacs, and Matlab.
In my experiment I have stored all my working data files on an 8 GB SanDisk Cruzer USB drive. I plug the Cruzer drive into the various computers I have access to, just as I would a smartphone with that much flash memory. It mostly works (i.e., I don't miss my laptop), but here are the unresolved issues:
1. Security: Yes, I mean the consequence of losing the flash drive (or smartphone), but also the issue of secure corporate Outlook email, which on my laptop runs via VPN and certificates. It is impossible to have the same corporate setup at home on another computer (at least where I work). But this may not be an issue for those who use web mail.
2. Software: Well, let's face it, not all software can be installed on every computer. The other option is to install software on the flash drive itself, but many software installations bind themselves to the computer - for example, those registry keys of MS Office installations. Perhaps this is an area where more innovation is needed to untether software from hardware. For now, I use my computer-agnostic Emacs editor as my data input tool outside of my laptop. Oh, and I also use it when I am on my laptop. I love it!
3. Customization: There are ways to copy your browser favorites, screen savers, wallpapers, etc. onto your flash drive or smartphone, but figuring all this out is cumbersome. Instead there is this cool Mokafive concept of carrying your whole OS, data, and customized software on one flash drive! Just boot off the USB drive and you are done. I found out from the IT guys, though, that security software will complain about this. Another problem is that loading and running an OS off the USB drive will be slowwwwwwwww.
Wednesday, October 22, 2008
Lala, streaming for life, cloud computing, bandwidth, and the ISP
Lala says it will sell you the right to stream a song, for life, for 10 cents. That's quite a sweet deal for someone who only listens to music on the computer. In case you really want to take the song on the go, you can buy it outright for your music player for the usual 99 cents. Let's do some bandwidth math now.
Use case 1: Fixed line user
A user streams 8 hours of music every day @ 128 kbps from Lala. First off, 8 hours at 4 minutes per song is about 60*8/4 = 120 songs. If the user's Lala library has a different 120 songs for each weekday, s/he has 600 songs (= 120 * 5). An investment of $60, for a lifetime. Now this is quite a good deal compared to the corresponding ~$600 under the current 99-cents-a-song model. Of course you lose the right to download the song onto your iPod, but for this use case let's say that doesn't matter to the user. The critical point is, users will not download the song one time, as in the current model, but will download it every time they want to hear it.
Now let's do the bandwidth calculation:
(8 * 3600) s * 128 kbit/s ≈ 3.7 Gbit ≈ 460 MB
So we have a sustained streaming download of about 460 MB per day. For 20 weekdays a month, we are talking about a bandwidth usage of over 9 GB per month. This usage certainly puts use case 1 into the ISPs' "power user" category. Do we have enough bandwidth provisioning in the core and access networks to deal with large numbers of such users?
Use case 2: Mobile Internet user
Use case 2 is a mobile Internet user (think UMTS on a laptop). Even if we cut the music streaming to about 1 hour per day, we have a usage of over 1 GB per month just for music. Do we have that sort of bandwidth on 3G networks, and will flat-rate data plans tolerate such perfectly legal users?
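Here is the arithmetic for both use cases in one small Matlab sketch (the 8-hour and 1-hour listening times and the 128 kbps stream rate are the assumptions stated above):

```matlab
% Lala streaming back-of-the-envelope (numbers as assumed in the two use cases).
kbps      = 128;                                    % stream bit rate
song_min  = 4;                                      % minutes per song

% Use case 1: fixed-line user, 8 hours of streaming per weekday
hours1    = 8;
songs_day = hours1 * 60 / song_min;                 % 120 songs per day
library   = songs_day * 5;                          % 600 songs across a 5-weekday rotation
lib_cost  = library * 0.10;                         % $60 at 10 cents per streamed song
MB_day1   = hours1 * 3600 * kbps * 1000 / 8 / 1e6;  % ~460 MB per day
GB_month1 = MB_day1 * 20 / 1000;                    % ~9.2 GB over 20 weekdays

% Use case 2: mobile (UMTS) user, 1 hour of streaming per weekday
GB_month2 = 1 * 3600 * kbps * 1000 / 8 / 1e6 * 20 / 1000;  % ~1.2 GB per month

fprintf('Library: %d songs for $%.0f; fixed-line user: %.1f MB/day, %.1f GB/month\n', ...
        library, lib_cost, MB_day1, GB_month1);
fprintf('Mobile user: %.2f GB/month just for music\n', GB_month2);
```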
Sunday, October 19, 2008
Van Goghs from the S&P Stock Index

I was playing around with the S&P historical data (monthly averages from 1871 to 2008) and came up with Figure 1. In this figure I show the value of $100 invested in each month since 1871 in an S&P index fund (see my related post); this is the first independent axis. Another independent axis varies the lead time to sell, i.e., how long the investor waits before selling the invested fund. Finally, the dependent (z, vertical) axis shows the total return (principal + profit/loss) on the $100 that was invested initially.
This 3-D graph is in itself quite interesting although it is too dense to offer any direct insights. So I flew to the top of the graph (the virtual geek way - I set the viewing azimuth to 0 and the elevation to 90 degrees). And then I created my Van Goghs of the S&P Stock Index!
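Roughly, the whole exercise boils down to a few lines of Matlab - a minimal sketch, assuming the monthly averages sit in a hypothetical one-column file called sp_monthly.csv:

```matlab
% Sketch: return surface of $100 invested each month and sold after a lag,
% then viewed from the top. 'sp_monthly.csv' is a hypothetical file with one
% column of monthly S&P averages (1871-2008).
sp   = csvread('sp_monthly.csv');        % monthly index levels
nLag = 120;                              % holding periods up to 10 years
n    = numel(sp);
R    = nan(n - nLag, nLag);              % rows: buy month, cols: lag in months
for i = 1:n - nLag
    for j = 1:nLag
        R(i, j) = 100 * sp(i + j) / sp(i);   % sale value of the $100 investment
    end
end

surf(1:nLag, 1:n - nLag, R, 'EdgeColor', 'none');
xlabel('holding period (months)'); ylabel('month of investment'); zlabel('sale value ($)');
view(0, 90);     % fly to the top: azimuth 0, elevation 90 degrees
colorbar;        % cool colors = lower sale values, hot colors = higher
```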
In Figures 2a, 2b, and 2c, the vertical axis is the time of making the investment of $100 in the S&P index fund. Blue signifies losses, and hotter colors (reds, yellows) signify profits in the color maps - notice that Matlab has assigned different colormap scales to each of the figures.
The horizontal axis is the time for which the investor holds on to the index fund before selling it (in years). Note the dark blue streak around the Great Depression (1929-) in all the graphs. It gets thinner as you move from right to left - since someone who exited just before the big fall saved themselves, but those who had invested earlier but held on lost (blues). You see blue lines around the year 2000 - when the Internet stock bubble burst. But you also see the dark red streaks of pure profits interspersed throughout the graphs.
Search for the rare combinations of small lead times and large profits in the 3 figures. That's where investors were able to make large profits quickly - if they exited wisely. And that will make for a wistful "if only I had invested and divested in those red-streak times!"
Impressionist no doubt!
Wednesday, October 15, 2008
Why to (still) believe in the Stock market




One of the most exciting things about the stock market is its unpredictability. Some liken it to casino gambling in that there is randomness in the return on investment. Moreover, the conventional thinking is that the odds are stacked against the small investor, since he is competing against highly organized hedge funds and mutual funds with talented fund managers. So the question is, can the ordinary investor make money in the stock market? I know the answer is yes, but here by "ordinary investor" I mean someone who just uses the simple mechanism of buying stocks at a low price and then selling them after a certain time lag when prices are high. Nothing fancy like short selling, derivatives, etc.
I did some basic analysis on S&P historical data to debunk the first misgiving - that the stock market is a casino - and the events of the last couple of weeks (Oct. 2008) have debunked the myth of the know-it-all big fund. They are all bleeding red ink as much as small investors (no, portfolio diversity didn't save the day for them, but that is for another blog post).
Let's get back to the S&P historical data. I obtained the monthly S&P averages from January 1900 till May 2008. I then wrote a Matlab script that invests $100 in an S&P index "fund" each month and sells it after a pre-specified lag (1 month, 12 months, 5 years, 10 years). The figures for the different lags are given above (click to enlarge) and show the sale value (Y axis) of the $100 investment made in the S&P index at the time specified on the X axis. These graphs are not adjusted for inflation.
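The core of the script is tiny; a sketch, again assuming a hypothetical sp_monthly.csv with one column of monthly averages:

```matlab
% Sketch of the lag analysis: $100 invested each month, sold after a fixed lag.
% 'sp_monthly.csv' is a hypothetical file of monthly S&P averages.
sp   = csvread('sp_monthly.csv');
lags = [1 12 60 120];                            % months: 1 mo, 1 yr, 5 yr, 10 yr
for k = 1:numel(lags)
    L    = lags(k);
    sale = 100 * sp(1 + L:end) ./ sp(1:end - L); % sale value of each $100 investment
    subplot(2, 2, k); plot(sale);
    title(sprintf('lag = %d months', L));
    xlabel('month of investment'); ylabel('sale value ($)');
end
```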
For small lags (1 month, 12 months), the graphs look random and seem to support the casino effect. However, for larger lags (60 months, i.e., 5 years, and 120 months, i.e., 10 years) there is a clear trend. Some times were better investment times than others. For example, it was smart to buy in the late 80s when stocks were low and sell during the late 90s when stocks were high (120-month lag).
Some conclusions from this basic analysis are:
- There is room for applying basic intelligence, and hence, this is no casino play where winning follows a certain probability distribution.
- Timing is everything when considering stocks as an investment. It is as important to guess the selling time as it is the buying time. For example, folks who bought in 1920 did very well by selling in 1925 rather than 1930. Buying and then keeping stocks away like fixed-term treasury certificates is a bad idea.
- Short term gains in index funds are hard to come by. Try specific stocks for this (and assume the greater risk of no diversity in this case).
I am still working on this analysis and will keep this blog posted. If you want the Matlab scripts, just email me.
Saturday, September 27, 2008
Where is all that Wall Street money?
You've all heard the news - investment houses going bankrupt or being sold off to retail banks, large insurance companies being nationalized - all within the span of a couple of weeks. What amazes me the most is the sort of numbers floating around. A year ago the market capitalization of these busted companies was in the hundreds of billions of dollars. They employed tens of thousands of people, including Ivy League-educated finance jocks and MBAs. Their stock prices seemed to go up, up, and away - a complete endorsement of their magical money-making ability. Then, ignominiously, they went broke.
Monetary circulation is a closed system - dollars don't just float away into outer space. So I am thinking - where did all those loaned dollars go? To understand that, let's look at Mr. John Doe's 4-bed/3-bath home in suburban San Diego that was built ca. 2004 and bought by Mr. Doe at a hefty price via a loan. Say Mr. Doe has fallen behind on his payments in 2008, thereby adding to the toxicity of the CDOs - Collateralized Debt Obligations - that wrongly counted Mr. Doe's mortgage as AAA+ reliable. But all this happened in 2008. Where did the money go in 2004?
The developer bought the land from the state, so a part of the loan capital went to the state. The house itself was built using superior building materials (expensive house), and therefore part of that capital flowed into the pockets of the building material companies' shareholders - the glass company, the wood company, the lighting company, etc. A big part of the house price was profit for the builder/architect company, and therefore it went to those companies' shareholders. There was Latin American labor to build the house, so some of the money went to Latin America via Western Union transfers. Some more must have flowed to China for building materials, or perhaps to Italy for the Italian marble.
Now the key point is that the value of the asset handed to Mr. Doe was supposed to rise as time went by, because this house was in the San Diego area, with the beautiful Southern California climate, the wonderful, peaceful, and happy society, the good public school in the neighborhood, and consequently the never ending demand for housing as people from all over the world came looking for a piece of this beautiful part of the world. In fact, Mr. Doe bought the house factoring all this into the future equation to pay back the hefty mortgage. In the worst case (he thought), he could just sell the house and pay back the mortgage, making a neat sum for himself. And until he sold, he could live a good life in the expensive home.
Unfortunately for Mr. Doe (and everyone else), the price of his house actually fell, and this voided the whole argument of the previous paragraph. Now if Mr. Doe's house goes into foreclosure, Mr. Doe's lending bank will only recover the reduced price of the house. The notional and fluffy value described in the previous paragraph could not be converted back into hard money when it was needed in 2008. Money has been lost, and this fact bubbles up to all those CDOs on Wall Street. Until the value of the asset -that house - rises again, there is no way to fix the problem.
Monday, August 4, 2008
Hype around "Hypertargeted" advertising, and what REALLY matters in click-based advertising
This article in today's Wall Street Journal discusses Myspace's "Hyper targeted" advertising system. The system studies profiles, messages, and other information of Myspace users and divides users into more than 1000 distinct "buckets" or categories. This classification can be used to target specific customer groups very effectively. Or so News Corp. (which bought Myspace for about $580m) hopes.
The article goes on to explain that "hypertargeting" has had checkered success, with some advertising campaigns seeing moderate results while others do perfectly well with less "targeted" and more generic location-based (zip code) online advertising.
There was one very interesting example. Quoting from the article:
"The New York Health & Racquet Club spent $5,000 on a MySpace campaign that displayed 2.3 million ads to users on the site. Though the health club could have chosen to target ads at people who say in their profiles that they enjoy rock climbing, yoga or working out, it chose instead to simply target by age and ZIP codes near its facilities. The club said it was relatively happy with the campaign, which generated roughly 1,000 clicks, a response rate of just 0.04%."
-Source: WSJ
Now let's see: the New York Health & Racquet Club spent $5,000 for 1,000 clicks, i.e., $5 per click. If we assume that 5% of the folks who clicked on the ad actually signed up for club membership (that means 50 sign-ups), then the per-membership marketing cost is $100 per customer. Not bad, considering that the average membership is $75-$100 per month. On the other hand, if only 10 people signed up, then you have a much higher price of $500 per new customer.
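Here is that arithmetic with the conversion rate as the adjustable knob - a sketch; the 5% and 1% rates are illustrative guesses, not numbers from the article:

```matlab
% Cost-per-acquisition as a function of the post-click conversion rate.
spend  = 5000;               % campaign cost in dollars (from the article)
clicks = 1000;               % clicks generated (from the article)
cpc    = spend / clicks;     % $5 per click

conv = [0.05 0.01];          % assumed conversion rates: 5% (50 sign-ups) and 1% (10 sign-ups)
for c = conv
    signups = clicks * c;
    fprintf('conversion %.0f%%: %.0f sign-ups, $%.0f per new member\n', ...
            100 * c, signups, spend / signups);
end
```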
The key question therefore is: what is the post-click conversion rate, i.e., the yield per click? This determines the value of a click for an advertiser (like the health club), and by extension, the price of the click that a content syndicator like Myspace can set. Ultimately it is the advertiser's landing website that needs to make customers out of users.
The Myspaces and Googles of the online world may well find it worth their while to start helping advertisers convert clicks into dollars instead of stopping at matching the exact user profile with the exact advertiser. Hypertargeting is good, but paying customers are much much better.
Tuesday, July 22, 2008
What matters to me, what's in my head, and this blog

Figure: My Wordle view (Click to enlarge)
I came across Wordle - a service that lets you create a word cloud from any text, highlighting those words that occur frequently. The above figure is generated from the text of this blog. The picture says it all!!!
Sunday, July 20, 2008
Tesla, EVs, and their mass adoption

Figure: Tesla Roadster, the sporty Electric Vehicle
The current issue of Fortune has an article about the teething troubles of the Tesla Roadster, an electric vehicle (EV) being touted as an all-electric sports car (click here for more pictures). Apparently, more than 1,000 people, including some well-known names, have signed up to take delivery of the first fully electric sports car. The article states that excitement remains high, never mind that Tesla is having problems keeping the delivery date for most orders.
Tesla's website says that the lithium-ion cell powered vehicle can cover 220 miles per recharge. Now that is quite impressive, if you consider that according to the AAA an average American drives only 29 miles per day. As long as you are not driving cross-country, the Tesla Roadster should almost replace your conventional sports car - almost, because recharging the 6,831 lithium-ion cells on the Tesla Roadster takes 3.5 hours, compared to the 5 minutes it takes to fill up a conventional Porsche Boxster.
The question is, will EV technology follow the conventional wisdom that early adopter products migrate down to the mass market? Does it make economic sense to buy such a car for the John Doe on the street, if not now, then 5 years into the future?
The Tesla motors website says that the operating cost for the Tesla Roadster is under 2 cents per mile. The operating cost for a comparable Porsche Boxster is about 20 cents per mile (calculated from this website, with gas at $4 per gallon)*.
Unfortunately, the Tesla Roadster 2009 edition costs about $109,000 while the Boxster costs less than half, about $50,000. Or, put another way, you will have to drive
($109,000 - $50,000) / ($0.20 per mile - $0.02 per mile) = 327,778 miles,
before the extra price of the Tesla Roadster can be justified!!!
Since lithium-ion batteries will not last 327K miles (and neither will the rest of the car), I think EV technology is not getting into the mass market anytime soon. Even if the price of gas triples, you will have to drive more than 100K miles in your EV before it saves you any money. And I haven't even factored in the lost opportunity of investing the $59,000 difference elsewhere.
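The break-even arithmetic as a small Matlab sketch, using only the numbers quoted above; the tripled-gas case assumes the Boxster's 20 cents per mile is purely fuel cost (see the footnote):

```matlab
% Break-even mileage for the Tesla Roadster vs. a Porsche Boxster,
% using the per-mile operating costs quoted in the post (energy costs only).
price_ev    = 109000;    % Tesla Roadster 2009 (dollars)
price_gas   = 50000;     % Porsche Boxster (dollars)
cost_ev_mi  = 0.02;      % dollars per mile, electric
cost_gas_mi = 0.20;      % dollars per mile, gasoline at $4/gallon

breakeven = (price_ev - price_gas) / (cost_gas_mi - cost_ev_mi);
fprintf('Break-even at $4/gal: %.0f miles\n', breakeven);          % ~327,778 miles

% If gas triples (and the Boxster's operating cost is purely fuel):
breakeven3 = (price_ev - price_gas) / (3 * cost_gas_mi - cost_ev_mi);
fprintf('Break-even with gas tripled: %.0f miles\n', breakeven3);  % ~101,724 miles
```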
So clearly the argument of saving on energy costs is meaningless if the EV is going to cost an arm and a leg. Question is, can EV manufacturers, or liberal government subsidies, narrow the price gap between EVs and gas-powered vehicles?
*assuming that the operating costs only cover energy costs
Sunday, July 13, 2008
Crude oil: how much do we have?

It's always good to see how much ice cream still remains in the tub in the freezer. So I visualized the world's proved crude oil reserves (Figure 1) with data from the Energy Information Administration. The EIA provides a wonderful Excel sheet with all this data; I just made the plot.
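The plot itself is nothing fancy - a sketch, assuming the EIA numbers have been saved into a hypothetical two-column CSV (year, reserves):

```matlab
% Sketch: world proved crude oil reserves over time.
% 'crude_reserves.csv' is a hypothetical export of the EIA spreadsheet:
% column 1 = year, column 2 = proved reserves (billion barrels).
data = csvread('crude_reserves.csv');
plot(data(:, 1), data(:, 2), 'LineWidth', 2);
xlabel('Year'); ylabel('Proved reserves (billion barrels)');
title('World proved crude oil reserves (EIA data)');
```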
I like some things here:
- The discontinuities - sharp jumps - upwards.
- The fact that the world-wide proved crude oil reserves have more than doubled in about 3 decades.
- Most of this doubling happened in active oil producing regions (the optimist in me thinks that more prospecting in other under-studied regions may yield some more discontinuities, in the right direction, i.e., up).
- We seem to be pumping out less than we are discovering (that's why the aggregate proved reserves point upwards).
So, why are crude oil prices shooting through the roof if there is so much buried under us? These are some supply-side* reasons:
- Crude oil is harder to get because new reserves are geographically challenging.
- Sweet light crude is harder to find, and oil companies need to look at harder-to-extract and harder-to-refine heavy crude.
- There is not enough refining capacity.
- There is not enough investment in new oil fields.
- Some of the crude oil lies in politically unstable regions.
Amen to my optimism!
*For some demand-side analysis see this post.
Saturday, July 5, 2008
Unlearning Google
Google's Viacom fiasco is an ominous wake-up call for anyone who cares about his or her online privacy. Today Viacom, tomorrow some other company, another day a government, can arm-twist Google into giving away log data containing user names, IP addresses, keywords, watched content, mouse-clicks, email, and any other information that Google collects.
So far, Google has only used user data for directed marketing. At least that is only about wringing money out of people's thoughts and desires through the AdSense infrastructure. The problem is, the same data can be easily massaged into revealing the political, ethical, racial, religious, sexual, and other personal leanings of a person. There may be money to be made out of this data as well, but more importantly, there is the real danger of this information being misused as a pretext for prosecution or blackmail.
Google publicly defends its privacy record. Unfortunately, user privacy is not the most important objective for a publicly traded company; shareholder value is. And to create shareholder value, a company needs to survive. A determined government can easily make the survival of a company contingent on compliance with the government's wishes. Google's motto says "Don't be evil". Trouble with this slogan is, who decides what "evil" is?
Another scary scenario can be built around theft of sensitive user data. The media reported that Google is handing over 4 TB of YouTube log data to Viacom. Now 4 TB is substantial today, but not a lot for future data storage technology: we may have 4 TB USB pen drives within the next 5 years. What if a disgruntled employee smuggled this data out of Google and auctioned it off to blackmailers for a few hundred grand?
No easy answers here.
I can keep ranting about Google and privacy and all that, but I am writing this blog on Google property (Blogger)!!! My wife and I are avid Gmail and Orkut and Google Reader and Google search and Google News users. Are we toast? Or can we wean ourselves off Google?
I parsed our Firefox history over a few weeks to get an idea of our Google dependence - the ratio of Google to non-Google websites visited. The results are not pretty: Google properties accounted for just over 50% of all the websites visited (Figure 1).
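The tally is a simple domain count - a sketch, assuming the visited URLs have been exported into a hypothetical history_urls.txt (one URL per line); the list of Google-owned domains is illustrative and certainly incomplete:

```matlab
% Sketch: Google vs. non-Google share of visited URLs.
% 'history_urls.txt' is a hypothetical export of the Firefox history,
% one visited URL per line. The domain list is illustrative, not exhaustive.
urls = regexp(strtrim(fileread('history_urls.txt')), '\r?\n', 'split');
googlePatterns = {'google\.', 'gmail\.', 'youtube\.', 'blogger\.', 'orkut\.'};

isGoogle = false(size(urls));
for k = 1:numel(googlePatterns)
    isGoogle = isGoogle | ~cellfun(@isempty, regexpi(urls, googlePatterns{k}));
end
fprintf('Google properties: %.1f%% of %d visits\n', 100 * mean(isGoogle), numel(urls));
```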
Fortunately, there are non-Google alternatives to all Google applications, so in theory we can start using other applications instead of Google's. Of course, there is nothing to guarantee that other websites will not yield to the same pressures as Google. But at least we can spread our web footprint - no single entity will have as complete a view of our web presence as Google does today.
The Firefox history indicated that we visit a few websites often and the rest rarely (Figure 2). The often-visited websites were the usual suspects - search, web mail, social networking, blogs, and news - and Google dominated this space. This is a great sign, because it shows that even though Google is big in terms of visits, it is not very heterogeneous in the content/services it offers. Google is not my bank, not my bookstore, not my VoIP provider, not my university, and not my community. In fact, if I remove the top-6 Google properties from the data, the distribution starts looking much more uniform: my web log data spread over heterogeneous websites. Doesn't this flavor of obfuscation help privacy?
There may still be hope for privacy on the Internet.
Sunday, June 29, 2008
Hollywood DVD - a buck a pop, beat that...maybe with ice cream?
My primary source of Hollywood entertainment is the Videocenter movie rental store located a stone's throw from my flat in Prenzlauer Berg. Each movie is available for 1 Euro per day, the collection is as current as the DVD release schedule, and the place is run by a courteous bunch of folks. There is a popcorn machine, a soda fridge, a snack aisle, and even the Ben & Jerry's cooler. All yours, for 1 Euro. 1 movie per weekend * 1 Euro per movie = 4 Euros per month.
That's the monthly Hollywood bill for me and the wife. Now that is a hard-to-beat deal.
If I were wearing a cable/IPTV VoD service provider's hat, I would be hard-pressed to beat this deal, because breaking even at 1 Euro for another distribution medium is a tough cookie (see my related post on Netflix's VoD distribution cost). Plus cable/IPTV cannot deliver my Ben & Jerry's ice cream tub.
Wait, that gives me an idea. Maybe cable/IPTV VoD service providers can team up with ice cream trucks to deliver ice cream and get a cut from Ben & Jerry's. Pizza, snacks, popcorn, T-shirts, I don't know, movie-specific stuff. Perhaps this would allow service providers to compete with the Videocenters of the world. The thing is, they need to start looking outside technology and into ice cream trucks.
Sunday, June 22, 2008
Renewable energy : Fossil fuels :: David : Goliath
The driving force for innovation in alternative energy sources like wind energy, solar power, and bio-fuels is the steep increase in crude oil and natural gas futures (no, I don't believe it is out of love for the environment). I wanted to understand how much time is needed to make a significant dent in fossil fuel demand by diverting energy demand to alternative fuels. A beautiful figure from the 2006 Annual Energy Review released by the US Energy Information Administration is a nice starting point. The figure is US-specific and does not consider energy-hungry China or India, but if the US energy juggernaut can be tamed with, say, 50% alternative energy sources, then I am certain that China and India will happily adopt these viable alternative energy sources as well. (Plus, I don't have the beautiful figures for the rest of the world, so let's work with the US data!)
About 14% of total US energy comes from renewable sources and nuclear power; in fact, the figure also says that only 6% of the energy comes from renewable sources excluding nuclear. The rest comes from fossil fuels (including natural gas). So let's try to guesstimate, based on this data and varying rates of renewable energy growth in the coming years, the time until we derive as much energy from renewables as we do from fossil fuels today.
I have plotted 4 scenarios based on 5%, 10%, 15%, and 20% annual growth of renewable energy starting from its base 2006 value (from the energy flow diagram). The plot indicates that it is going to take between 10% and 15% annual growth of renewable energy to catch up with the present fossil fuel energy contribution by 2030. While there is no hard and fast reason for the annual growth not to exceed 10-15%, I believe that there are significant inertial factors - like deployed fossil-fuel based equipment, lack of skilled engineers, innovation lag, legal issues, etc. - which will keep renewable energy from growing at higher annual rates.
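The scenario arithmetic is plain compound growth - a sketch, using the 6% renewable and 86% fossil shares read off the 2006 energy flow figure and assuming total US energy demand stays at its 2006 level:

```matlab
% Years of compound growth needed for renewables (6% of US energy in 2006)
% to match today's fossil-fuel contribution (roughly 86% = 100% - 14%),
% assuming total US energy demand stays at the 2006 level.
renew0 = 6;                    % percent of total US energy from renewables, 2006
fossil = 86;                   % percent from fossil fuels
growth = [0.05 0.10 0.15 0.20];

for g = growth
    yrs = log(fossil / renew0) / log(1 + g);   % solve renew0*(1+g)^yrs = fossil
    fprintf('%2.0f%% annual growth: parity in %4.1f years (around %d)\n', ...
            100 * g, yrs, 2006 + ceil(yrs));
end
```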
Conclusions:
Basically, this simple back-of-the-envelope calculation indicates that we need sustained double-digit growth in renewable energy over the next 2 decades to challenge the fossil fuel Goliath. The figures also seem to indicate that in the short term (5-10 years) fossil fuels are going to remain the primary energy source. Therefore the world is going to need either a huge increase in supply or an appreciable decrease in demand for fossil fuels, notwithstanding any alternative sources of energy.
Supply will grow as better technology is used for oil and gas exploration. It has already become worthwhile to use high-sulfur crude instead of sweet light crude oil. But the fact remains that demand will have to abate to meet the short supply through higher prices and unfortunately, slower economies.