Saturday, July 26, 2008

Cloud Computing and High Availability

Last week, the fail whale, a concept that has become associated with Twitter’s recurring service outages, swam across the North Pacific and hit Amazon’s S3 service. I am talking about the already widely discussed outage of Amazon’s S3. It is fair to say that the services dependent on S3 – e.g., Polyvore.com – really felt the “business and user impact” of the outage. Did the users of those dependent services really care that those services were using Amazon’s S3 to save costs? Of course they did not. The dependent services wrote apologetic blog entries, and the never-ending debates on the pros and cons of cloud computing started yet again. But I won’t bore you with yet another synopsis of the outage.


UPDATE: Yesterday, Amazon did a great job of being transparent about the issue that caused the outage.


However, as a technology product leader who also runs a software-as-a-service product at IBM, I am always faced with new challenges related to the shared application code base and, more importantly, the shared application infrastructure. It is a no-brainer that specialized services (e.g., Amazon’s S3) can always do a better job at a lower cost than individual internal IT organizations. But at the same time, most people fail to realize that the more clients the cloud-based services get, the more widely the impact of an outage will be felt. Therefore, with the increased usage of a service, the tolerance for failure goes to zero and uptime expectations go through the roof. Loosely, we can represent it as: cloud computing uptime expectations = number of clients × cost of the service. Amazon’s S3 had an outage. But is that an anomaly? No. If your answer is yes, you have never run a large-scale system. However, the impact of the S3 outage was unbearable to most of its clients. Again, please keep in mind that cloud-based storage means nothing to the users of FriendFeed, Twitter or Polyvore.com.


I am a big proponent of both infrastructure cloud computing services and software-as-a-service applications. However, this S3 outage got me thinking about how we as an industry could come up with a solution. We know that no matter how much redundancy a distributed cloud-based system has, some day, something breaks. So the obvious armchair architects’ solution of redundant disks, redundant servers, “unbreakable” distributed system designs and other infrastructure elements just won’t prevent another outage.


I think one possible solution could be interoperability of the cloud-based infrastructure services. The concept is analogous to the SMTP and POP protocols for email-based services. Let’s take the example of online storage. Amazon S3 and participating competitors would agree on a standard API to store and retrieve data in the cloud. Users would initially select a service based on their own criteria. S3 and its competitors could then offer an “extra insurance” feature of redundant cloud storage at sign-up time. With the feature, a user could choose the cloud of a competitor of the selected company as a “redundant” cloud in case the selected company’s cloud fails.
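To make the idea concrete, here is a minimal sketch in Python of what such a standardized storage interface and a “redundant cloud” wrapper might look like. The interface, class names and failover behavior below are purely hypothetical illustrations of the concept, not any vendor’s actual API:

```python
from abc import ABC, abstractmethod

class CloudStore(ABC):
    """Hypothetical standard API that S3 and its competitors would agree on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class RedundantStore(CloudStore):
    """Mirrors writes to a competitor's cloud and fails reads over to it."""

    def __init__(self, primary: CloudStore, redundant: CloudStore):
        self.primary = primary
        self.redundant = redundant

    def put(self, key: str, data: bytes) -> None:
        self.primary.put(key, data)
        self.redundant.put(key, data)  # the "extra insurance" mirror copy

    def get(self, key: str) -> bytes:
        try:
            return self.primary.get(key)
        except Exception:  # primary outage: serve from the redundant cloud
            return self.redundant.get(key)
```

With a shared interface like this, demoting a provider to the “redundant” role, or switching to a competitor outright, becomes a one-line change for the customer.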


Now, this solution has not gone through any deep analysis and is more of a random thought. But I do wonder about the other factors that could play into it. The companies would have to compete hard to keep their customers, as customers would be one click away from switching to the competitor and perhaps demoting the original provider to the “redundant” cloud. Another factor is how one would price the service of being redundant. X% of the primary service’s price, and full charges during a failure of the primary provider? Also, what would be the economic advantage for the companies that interoperate with each other versus the ones that don’t cooperate? Open source foundations – e.g., the Apache Software Foundation – have pioneered standardization among a lot of locally installed software. Will we need a similar foundation to manage cloud-based services interoperability?
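As a purely illustrative back-of-the-envelope example (the numbers are made up, not any provider’s actual rates): if the primary provider charged $0.15 per GB-month, the redundant provider might charge 20% of that, i.e., $0.03 per GB-month, for standby mirroring, and bill at its full $0.15 rate only for the days it actually serves traffic during the primary’s outage. For 100 GB, the “extra insurance” would then cost about $3 a month on top of the $15 primary bill.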

Saturday, July 12, 2008

Public Companies and Wall Street

Since January, I have followed Microsoft’s acquisition proposal for Yahoo!, then its withdrawal, then the semi-proposal [search only], and finally the end of the discussions – with the last statement being that Microsoft would come back to the deal table only if Carl Icahn is able to replace the Yahoo! board. In between all of this, Yahoo! lost most of its senior executives and executed another re-organization; the executives of the two companies issued conflicting statements and blamed each other for tanking the discussion of a merger or partial acquisition.

In all of this, I have also concluded that Wall Street’s never-ending desire to make as much money as possible [in the short term] provoked discussions and actions that otherwise would have been much more civil and less controversial, and that could have resulted in a friendly, good deal.

So, I do wonder. Yes, we all want to make money. But would anyone be OK making money by selling his soul? In the case of an Internet company, the soul of the company is its products and the users of those products. If your CEO states that creating shareholder value is the most important part of his job, isn’t he putting the money before the soul – the products and users? So, is Wall Street capitalism such a vicious spiral that the more you spin around in it, the more you care about just the money, ignoring the products and users – who could make or break your business?

None of us will get to know the real stories of the meetings that happened between Microsoft and Yahoo!, but personally, I am just disappointed in how Yahoo!, Microsoft and Mr. Icahn have handled it. Microsoft’s approach made it hostile. Yahoo! has gone to the point of begging for the deal. And Mr. Icahn just wants to make money off the stock he has bought. In all of this, no one really cared about the product overlaps and the resulting confused users.

Perhaps the reality of Wall Street capitalism is to torpedo companies through its greedy approach of short-term gains. And the system recovers itself as new companies come along and users move on. On the Internet, we have seen that happen to AOL, Excite and other early Web 1.0 portals. However, I do consider Yahoo! a bit different, as it still has the right talent to make it happen. At the same time, the recent departures of executives and the stories of technical employees leaving for greener pastures could make it difficult if too many people end up leaving. Microsoft, which still loses money in its Internet business unit, is not the right answer due to the vast cultural differences between the two companies. And lastly, I still think the companies are too deep in the vicious Wall Street stock-price cycle to come out of it and make the best possible decisions for their users and products.

When I started working 8 years ago, I always wanted to complete my project as early as possible to move on to the next one. That attitude resulted in some bad decisions that taught me the lifetime lesson of “there is no shortcut to success”. Therefore, I strongly believe that the involved technology companies, competing in this hyper-competitive environment, can still bounce back and slowly become very strong players. All it would take is the right leadership, technical talent, and a maniacal focus on the long-term aspects – products and users – over the short-term forces of Wall Street.

Saturday, July 5, 2008

PC Migration in the Internet 2.0 Era

Last week, my company’s 4-year “forceful” auto-refresh program dispatched a ThinkPad T61 to replace my 4-year-old ThinkPad T41. The company’s policy is a 3-year refresh cycle, but I had been too lazy to ask for a new one over the last 12 months, as I really didn’t want to go through the painful and time-consuming migration process. Additionally, my ThinkPad T41 had been very stable and durable, except when I spilled tea on it twice, resulting in motherboard and keyboard replacements.

I was apprehensive of the migration because I thought it would be as painful and time-consuming as my previous ones had been: application re-installs (and who kept all those CDs?), CD/DVD burning of my data, and re-configuration of so many programs. I was so dead wrong.

Here is how it went. As soon as I got my new PC, I dug up my text file of “PC migration tasks” and started going through it.

My PC was already loaded with the corporate image containing all the security and office software, so I crossed those out quickly. Also, 3 years ago, I had migrated to an external disk drive “continuous data protection” solution to back up all of the data in my user directory. The new ThinkPad pulled the multiple GB of data from the external disk drive over the USB 2.0 port in just a few minutes, so there was no need to burn data CDs. But the shocker for me was that I no longer needed web bookmarks. I had stopped using del.icio.us bookmarks, as I had started to browse the entire web through Google Reader. And Firefox bookmarks were also not needed because my habits had changed: I just remembered every main website (yahoo.com, google.com, etc.) because I visited them every day, and the rest of my web browsing happened either through web search or Google Reader search, in case I wanted to revisit an article I had either tagged or had in the back of my mind. So, this PC migration signaled the “death of web bookmarks” to me.

Thereafter, I went through the software installs on my old PC. I had stopped using almost all of them, because they were either programming tools (I transitioned to product management full time 4 years ago) or desktop tools that were simply not needed in the era of Internet 2.0. I had moved to Quicken Online in lieu of the local installation, and had already uploaded my pictures to Flickr in lieu of local software. I had stopped using MSN, Yahoo and Google Talk, as my company’s communication took place exclusively over Sametime, and I rarely found any time for personal instant messaging; I preferred phone text messaging, voice calls (yes, my mom still wants to talk to me), twittering, Facebooking, and friendfeeding. I questioned myself: do I really need those IM clients? Not really, but I still took a few minutes to install them. Lastly, the most time-consuming task was the migration of my Lotus Notes (yeah, we are mandated to use that) local connections and references to the team rooms over to the new PC.

Lastly, I had to install iTunes, as there was no cloud-based version of it. I was all the more disappointed because iTunes didn’t allow me to re-download my songs from the iTunes cloud; I had to copy them manually from the old PC. I do wonder when we will see iTunes in the cloud, allowing us to just change PCs and de-authorize the old PC through the web. Maybe Apple needs some competitive pressure to work on it?

Now, if I were a programmer, my migration would have at least included a compiler/JDK installation and a code editor like Eclipse. I doubt that compilers will move into the cloud, but I do wonder whether the local tools (e.g., Eclipse) could just store the configuration of the workbench in the cloud and retrieve it on the second PC. Maybe they already do that; I wouldn’t know, as I don’t use programming tools anymore.
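The idea is simple enough to sketch. Assuming a hypothetical settings-sync endpoint (the URL and file name below are made up for illustration; this is not Eclipse’s actual mechanism), a tool would only need a “push” on the old PC and a “pull” on the new one:

```python
import urllib.request

SETTINGS_FILE = "workbench-settings.json"
# Hypothetical sync endpoint; no real IDE or vendor API is implied.
SYNC_URL = "https://example.com/api/workbench-settings"

def push_settings() -> None:
    """On the old PC: upload the local workbench configuration to the cloud."""
    with open(SETTINGS_FILE, "rb") as f:
        req = urllib.request.Request(SYNC_URL, data=f.read(), method="PUT")
        urllib.request.urlopen(req)

def pull_settings() -> None:
    """On the new PC: retrieve the configuration and write it locally."""
    with urllib.request.urlopen(SYNC_URL) as resp, open(SETTINGS_FILE, "wb") as f:
        f.write(resp.read())
```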

All of my actual migration steps took a little over an hour (Lotus Notes took most of it), and I was ready with the new laptop. I immediately fired up Firefox and was surfing the web.

So, going forward, I won’t be apprehensive about changing PCs. If my company moves mail to the web and Apple moves iTunes to the cloud, I will be 100% cloud-computing (my local external drive is part of that cloud) compatible.

Sunday, April 13, 2008

SCCC 2.0

Santa Clara Cricket Club 2.0
The link is the official communication of the technology transformation of my cricket club from the internally developed, hosted custom web applications era (Web 1.0) to the hosted SaaS web applications era (Web 2.0).

Friday, March 28, 2008

Leading Indicators-based Product Management

Carly Fiorina, the former CEO of HP, said in one of her speeches (I paraphrase): “The companies that survive in the long term are managed and measured through leading indicators versus lagging indicators. A company’s quarterly results denote a lagging indicator because they represent past decisions.”

I believe that Carly’s aforementioned quote is a very important principle for how one should manage a team, a company or a product. At work, I lead a large team to develop, maintain and continuously improve a large software-as-a-service product, and every day my team and I collectively make a lot of decisions on the product’s direction and day-to-day operations. However, at the end of every day, I always think hard: did we make the right decisions that day? Did we make sure that our decisions will work in both the short and the long term? Did we make sure people understood how those decisions will be carried out? Would our customers like the changes made through those decisions? Would our employees accept the change associated with our decisions? Would we achieve the product vision?

After understanding Carly’s approach to leading indicators-based management, I have concluded that as long as our decisions incorporate leading indicators, we will be fine for most of them. Personally, I always try to approach all product decision discussions with the following leading indicators in mind.

Follow the Users

This leading indicator has been proven again and again, and the consumerization of technology is taking it to a new level. If product features are not continuously developed and enhanced based on users’ feedback, the product will fail. The SaaS model and the Blogosphere have made user feedback-based development very fast and highly effective. Do we really need the old user group meetings and conferences? I don’t think so. I believe the Blogosphere can provide instant feedback, and the SaaS model has enabled instant feature deployment and beta testing.

Satisfy the Existing Users

This leading indicator has been proven more than once, even at very large scales. A classic example is AOL. At the start of the Internet, the AOL portal was the main hub for early Internet users. Today, go ask school kids about AOL – I can guarantee 95% of the responses will be “what is AOL?” In contrast, the word Google would get the opposite response. So why did AOL lose the brand when it had a head start of almost five years on Google? Simple answer: they didn’t satisfy the users, and with a single click, the users switched to better websites.

Prepare for the Growth

In February 2008, Yahoo! launched Yahoo! Live, a live video service – well ahead of the video industry leader, YouTube, which will release live video sometime later this year. However, the Yahoo! Live service went down on its go-live day and got a bad reputation from the start. What a colossal mistake! Yahoo! simply failed to understand contemporary users, who have no tolerance for a product failure until they really like the product. Yes, occasional outages are tolerated, but only after the users like the product, not before they can even get to use it. So, this indicator requires us to always plan the infrastructure for growth. If you cannot sustain a worldwide go-live, stage it by country. If the country’s population is too much to contain, do a limited invitation-based beta. As an example (though I cannot confirm this), the Gmail product entered the market through an invitation-only approach, and I would speculate that the creation of a scalable product infrastructure could be one reason behind that approach.
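To illustrate the staged approach, here is a minimal sketch of invitation-based gating (the names, the in-memory sets and the invite quota are hypothetical; I do not know Gmail’s actual mechanism):

```python
# Staged go-live: open one country at a time, and admit only invited users.
ALLOWED_COUNTRIES = {"US"}
INVITED_USERS = {"alice@example.com", "bob@example.com"}

def may_sign_up(email: str, country: str) -> bool:
    """Admit a user only if their country is live and they hold an invite."""
    return country in ALLOWED_COUNTRIES and email in INVITED_USERS

def grant_invites(new_emails: list, quota: int = 5) -> None:
    """Each admitted user may invite a few others, so the user base (and the
    load on the infrastructure) grows quickly but controllably."""
    for email in new_emails[:quota]:
        INVITED_USERS.add(email)
```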

Sunday, March 9, 2008

The Consumerization of Enterprise IT

Nicholas Carr, a disruptive technology author, writes in his latest book, The Big Switch, “A hundred years ago, companies stopped generating their own power with steam engines and dynamos and plugged into the newly built electric grid. The cheap power pumped out by electric utilities didn’t just change how businesses operate. It set off a chain reaction of economic and social transformations that brought the modern world into existence. Today, a similar revolution is under way. Hooked up to the Internet’s global computing grid, massive information-processing plants have begun pumping data and software code into our homes and businesses. This time, it’s computing that’s turning into a utility. The shift is already remaking the computer industry, bringing new competitors like Google and Salesforce.com to the fore and threatening stalwarts like Microsoft and Dell. But the effects will reach much further. Cheap, utility-supplied computing will ultimately change society as profoundly as cheap electricity did. We can already see the early effects — in the shift of control over media from institutions to individuals, in debates over the value of privacy, in the export of the jobs of knowledge workers, even in the growing concentration of wealth. As information utilities expand, the changes will only broaden, and their pace will only accelerate”.

I could not agree more. This change is very disruptive and will shake things upside down. However, nobody would dispute the difficulty of the challenges that lie ahead of us in making this change happen in the enterprises. Historically, enterprises have always resisted change due to a combination of antiquated, control-oriented senior leadership styles, job-security fears within middle management, and a simple resistance among employees to adapting to new ways. So, the question is simple: how can we enable our enterprises to make this switch? Those of us who have worked on enterprise projects know very well that enterprise users love “custom” solutions. Their “requirements” result in customizations of off-the-shelf software products or, in some cases, development of custom software projects. All of those customizations and custom projects are expensive to develop, are mostly late and over budget, and have a steep maintenance cost – yes, that gives job security to the same developers. Consequently, contemporary enterprises have IT budgets in the millions of dollars and are turning to offshore outsourcing to cut costs.

So, is it simply impossible? Do those enterprises really need to restart? Is the roll-out of a new strategy the magic answer? Well, I think a restart is not an option for almost all companies, and the roll-out of a new strategy is necessary but not sufficient. I believe the answer lies in what pundits are calling the “consumerization of enterprise IT”: the same consumer Internet-based technologies that have made us Internet-savvy users will make their way into the enterprises to simplify and standardize enterprise IT systems. However, from my own experience, I have witnessed two broad challenges that need to be overcome to move enterprises from the existing customized enterprise IT solutions to standardized online software solutions.

The first challenge is the features (or lack thereof) and the limited configurability (by design) of online software. In terms of features, I would say it is a function of time before online enterprise software catches up with local enterprise software; I would give it three to five more years based on my own research. However, as happens with any new technology, some customers will embrace the online software now, while it is good enough though not yet perfect, and this will give them a jump start on their competitors later down the road. In contrast, the limited configurability is, by design, an underlying principle of online software. Why? We all know from the product world that if a company cannot replicate a product through a standardized model, it won’t make profits and its customers will not get a low-priced product. Let me illustrate this with an example. Imagine a world without a standardized way to drive a car (gears, dashboard, accelerator, brake positions, etc.). If the car industry had not standardized on those elements, cars would still be expensive and less used, as they would have required too much training to drive and been almost impossible to switch between. In a similar fashion, through standardization on online software for the common processes – HR, Accounting, Procurement, and Contact Center – customers will reap the benefits over time and avoid the costly upgrades that result in business disruptions. Of course, the core business processes are the exception to this rule. What is an example of a core process? How an airline determines its ticket price for a particular flight; how a car manufacturer sets up its robot-based assembly line to manufacture more efficiently.

The second challenge to overcome is not about technology or business processes. It is about people. We all know that making a change in enterprises is like making elephants dance. So, this is where I believe the proliferation of Web 2.0-based consumer technologies is going to help us. Let me give you an example. Circa 2002, the Santa Clara Cricket Club, where I serve as CTO, had a very manual, email-based process for managing players’ availability. At that time, I took the initiative and developed a custom Java-based web application to manage players’ availability. It was a runaway success. However, over time, the home-grown application started to outgrow me, as I didn’t have much time to update the code and infrastructure with the latest changes and security fixes. Fast forward to 2007: the application was considered antiquated. It didn’t integrate with any portal, and it had a steep development cycle for minor change requests. In summary, the application was hindering the Santa Clara Cricket Club’s growth. Fortunately, around this time, the outside world had changed as well: we had Google Apps available to us. One weekend, a colleague and I spent a few hours configuring Google Apps’ Calendar for our club members. The next weekend, we moved all of our user accounts to Google Apps and, with the flip of a switch, abandoned our old availability application. This was a risky bet: what would happen if the users could not handle the standardized calendar-based availability? As it happened, almost everything went smoothly. We had a few user ID and password problems, but nobody reported any problem using the application itself. More than a hundred users were able to adapt to Google Calendar without any training. How did that happen? Isn’t that the dream for our enterprise applications? Well, it all happened because Google Apps’ calendar was similar to the online calendars our users were already using to manage their personal schedules. Our users’ experience with consumer technologies resulted in a smooth cut-over to the new enterprise app without any hurdles. My conclusion from this experiment? The consumerization of online enterprise software will be the most profound way to overcome the challenges in users’ adoption of online software over in-house customized software.

Saturday, February 9, 2008

Elections 2.0

Web 2.0 and related technologies – “Information Technology 2.0” – are creating profound changes in our socioeconomic structures, political campaigns, cultural behaviors and business environments. These changes first started to occur in the business-to-consumer space with the use of online social networks, wikis, personal blogs, podcasts and online videos. Over the last few years, businesses have joined the bandwagon, and cloud computing and on-demand SaaS architecture concepts have become a major part of enterprise technology discussions.

However, over the last 6 months, another change occurred – unrelated to businesses or consumers. Web 2.0 and related technologies started to make their impact on the 2008 election campaigns. Candidates had their individual websites in the 2000 and 2004 elections; for the 2008 elections, however, we have started to see Web 2.0 tools – online social networks, online videos (e.g., the CNN/YouTube debates), the CampaignForce SaaS application, RSS readers, blogs, twittering, and keyword-based advertisements on the Google, Microsoft and Yahoo search networks – as new communication channels in addition to traditional TV, radio and individual websites. The use of Web 2.0 and related technologies has enabled the 2008 candidates to collaborate and communicate with their supporters and undecided voters in unprecedented ways. More importantly, the candidates are reaching more people than before, and at a much lower cost. In the last few months, various industry bloggers and pundits have called this phenomenon Elections 2.0, and I would agree with them.

Personally, over the last few weeks, I have made the following three major observations about these Elections 2.0 campaigns.

First, frankly, I had never followed political campaigns closely until the 2008 campaigns came along. Yesterday, when I contemplated why and how that change occurred, I came up with a single answer: Google Reader. Why? I don’t watch much TV. I don’t read print newspapers. And for the last eight years, my source of information has always been online websites. Before I started using Google Reader, I would go to each individual news website and read only the sports, business and technology articles. Politics never interested me, so I never clicked on those links. Fast forward to today: I read news, articles and blogs only through my Google Reader subscriptions. Over the last six months, my subscriptions started to contain articles and blogs related to the political campaigns. Why did I start reading them in the Reader when I had not bothered on the websites? It is much easier and faster, and I don’t have to leave my screen, as I am rarely interested in the details. The headlines over the last six months generated curiosity, and I followed the links to read more. I developed awareness, formed my opinion and became more involved.

The second observation is the proliferation of online videos in the Elections 2.0 campaigns and debates. I strongly believe that a short online video is the most powerful communication mechanism of all the Web 2.0 technologies. Yes, blogs, podcasts and profiles on online social networks help, but we are still human: we put more trust in talk, and we change our minds more often when someone talks to us. Online videos have provided a very low-cost, yet very effective, communication vehicle to the Elections 2.0 candidates. The most profound result of this? Read the third observation.

The third observation is about the involvement of youth in the Elections 2.0 campaigns. It has become clear that when it comes to Elections 2.0 technologies, the youth (Generation Y) is way ahead of Generation X and the baby boomers. Why? Because Generation Y is more collaborative than all the generations before it. They like transparency from others, and at the same time, they don’t mind expressing their own opinions. I had always thought political choices were very private, except when you were at a rally. Today, however, I see people – especially the younger generation – openly expressing their political choices in their profiles and persuading others to follow their choices. There has never been such active participation of youth in any election campaign before. This has forced candidates to change their tactics; they cannot ignore the youth in their campaigns anymore.

The gist of all three observations is simple: the Internet and related advancements have changed the political arena as we knew it. This is yet more evidence of the profound non-technological changes that will be made possible by Internet-related advancements.

Saturday, February 2, 2008

Readers - Productivity Improvement or Information Overload?

My work day usually begins around 6 AM to accommodate my east coast colleagues and meetings. A typical work day of mine includes a lot of collaborative meetings, emails, chats, and of course readings of the hundreds of technology and business articles that pop up in my Google Reader. I have a habit of hitting my Google Reader after a few hours (around 8-9 AM PT), as most of my subscriptions are about Silicon Valley, Web 2.0, and Internet companies, so most of them get new content as the work day starts on the Pacific coast.

Yesterday, I followed the same routine, but little did I know the technology world had exploded within those two morning hours. My Reader was loaded with news, articles, and blogs talking about Microsoft’s proposed Yahoo! acquisition. I was just two hours late, but I felt like I had missed the whole party! Articles and opinions were being written left and right by the industry pundits and bloggers; almost every subscription had some twisted story about the announcement. Additionally, as usual, the stock market was reacting: the Yahoo! stock was going up and the Microsoft stock was going down! Around noon PT, things had calmed down and reality was setting in. The news and blog articles moved from the announcement to the consequences of the merger, and articles about technology clashes, cultural mismatch, product overlaps and layoffs started to pop up in my subscriptions.

As I finished the work day, I had read so much about the merger that it felt like the announcement had been made weeks or months ago. I didn’t need any more information to form my opinion or talk about the merger. I had all the numbers at my fingertips, and I could speak to both the positive and negative consequences of the acquisition.

Later in the day, I stayed away from the Internet for a few hours, and during those hours, I realized that I had just experienced a profound change in the way I consume breaking news in this new era of blogs and RSS readers. Only a few years ago, I would normally read an acquisition or merger announcement on a news or technology website. Later in the week, my weekly magazine subscriptions would carry articles on both the positive and negative consequences of the announcement. Only after reading those paper magazine articles would I understand the details of the announcement and form my own opinion. Typically, it would be a week-long process. With the Microsoft-Yahoo! announcement yesterday, I consumed the same information within a single day, and my opinion on the acquisition was ready by the end of it.

Is this a productivity improvement? Or is it information overload? Well, I believe it is more a matter of personal preference in consuming information. Personally, I like juggling and working on multiple projects or initiatives; at work, I always get bored if I only have one thing going on. The Blogosphere has enabled me to multi-task more and thus has increased my satisfaction with the technology industry. However, there is a caveat to the multi-tasking: each task takes more time and requires me to start much earlier to complete it with the quality and attention it needs. Therefore, I have adopted the habit of superior scheduling, and over time, I have come to realize that superior scheduling is the only way to remain productive and effective if you like multi-tasking and Blogosphere subscriptions.

I would like to know your thoughts on this shift toward consuming information through blogs and articles aggregated in a single view in your favorite RSS reader.

Sunday, January 27, 2008

Internet Services - Trust, Privacy and Safety

A week ago, I hosted a few friends for dinner. During and after the dinner, our conversations went from religion to politics to how Internet-based technologies are changing our personal, social and work lives. During the technology discussions, in which I was the most vocal for obvious reasons, one of my friends said that she felt safer having the TurboTax desktop application in lieu of the online version of TurboTax, which stores everything in Intuit’s data center. I immediately challenged this comment with a rhetorical question: why does she think that storing her information on her desktop is more secure than storing the same data in an Internet-based service? Thereafter, we all spent considerable time debating the privacy and trust of contemporary Web 2.0 Internet services.

On a related note, yesterday, I met a few ex-colleagues for lunch in San Francisco. During our discussions, one of them said that she was looking for the cheapest price for a book. My first recommendation was Amazon, but she had already looked at Amazon and didn’t feel that the available options were cheap. My next recommendation was eBay, and to my astonishment she replied that she does not feel safe opening accounts on multiple websites. Amazon was a known name to her, but she didn’t feel safe opening an account on eBay. I was baffled.

These two discussions got me thinking about why people have a perception that Internet-based personal business services are not safe and would expose their personal data. Is it because the use of Internet-based services makes us more susceptible to marketing and advertising schemes? Why do they think that a laptop or desktop at home or in a car is more secure than the hardened data centers of the industry giants? After some analysis, I have come up with three broad categories of reasons behind my friends’ data protection concerns.

The first category is the personal data loss statistics, the manual opt-out targeted advertising debacles, and the debates over personal data ownership of the last few years. The year 2007 was the worst year on record for personal data loss: more than 79 million personal records were part of identity theft, an increase of over 400% from 2006. These statistics instill fear in all of us and will require some major work from the involved companies to win back the trust of the employees or customers whose records were part of the loss or identity theft. In a similar fashion, Facebook’s Beacon debacle didn’t help the situation: users were furious as to why their web actions (e.g., purchases) were shared with their friends without their explicit permission. In another example, the recent Google Reader sharing feature is still generating debates about the boundaries of sharing.

The second category represents the proliferation of obscene pop-ups, phishing emails, and spam emails, among many other techniques used by Internet thieves to steal the personal information of Internet users. For technical people, this might be a non-issue because they can recognize phishing URLs, malware, spam, suspicious websites and hacker techniques. However, non-technical people (normal people) cannot necessarily discern a real email sent by a provider from a phishing email sent by a hacker, though the latest browser versions have come up with built-in features to protect these users. But as always happens in the technology industry, the hackers will come up with new ways to trick our normal users, and the safety issues will remain. We can only hope that the issues will decrease with time, because more of these safety issues will result in less and less use of Internet services.

The third category is the education (or lack thereof) around data privacy rights and laws. As I was writing this article, I came across this blog entry on data privacy protection. It talks about the Data Privacy Day of 2008, to be celebrated tomorrow (01/28/2008). I could not agree more with the concept and the effort. My recent discussions with co-workers, ex-colleagues and other friends have made me realize that most Internet users are simply not aware of their data privacy rights and the relevant laws.

I have described the major reasons behind my friends’ comments, but what we can do to address those reasons remains a challenge. Briefly, I would say that privacy education is a very good first step towards the solution. Additionally, we need advancements in database technology to assure anonymity and data encryption on database tapes and storage; these advancements will assure the privacy of users’ data even if hackers are able to get hold of the raw data or tapes. Lastly, we must make computers as safe as our cars. Of course, education for the drivers (the users of the computers) is also an important part of it, as a car’s (computer’s) safety has a major dependency on the skills of its driver (user).
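To make the anonymity point concrete, here is a minimal sketch, using only Python’s standard library, of pseudonymizing a personal identifier with a keyed hash before it is written to storage. The record layout is hypothetical, and a real deployment would add proper key management and field-level encryption on top of this:

```python
import hashlib
import hmac

# The secret key would live in a key-management system, never beside the data.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a personal identifier (e.g., an email address) with a keyed
    hash, so a stolen tape or raw dump reveals no identities by itself."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {
    "user": pseudonymize("jane.doe@example.com"),  # stored pseudonym, not the email
    "purchase": "book",
    "amount_usd": 12.99,
}
```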

If you have a suggestion, please feel free to leave it as a comment to this blog entry.

Thursday, January 17, 2008

The Next Digital Decade

Last week, as I settled down at work after a two-week vacation, my Google Reader subscriptions were loaded with news, blogs, gadget marketing ads, videos and podcasts from the 2008 Consumer Electronics Show. Bill Gates’ last CES speech, including a funny video of his last day at Microsoft, was quite impressive. In the speech, he discussed the digital media industry’s past, present and ever-changing future, and he made the comment that the next decade of digital media will be more connected than ever before and will mostly be driven by software innovations.

Personally, when I think about the next decade of digital media, I am unable to make a decision amid so many choices. The contemporary digital media gadget market is very fragmented. On top of that, applications are mostly locked to specific devices and sometimes to specific service providers. We have multiple competing hardware architectures and operating systems. I wonder: when will all mobile applications be available on all devices? Why can’t a digital device be as general-purpose as a PC? Why can’t I access mobile applications without buying specific devices?

I believe that if we want to realize Bill Gates’ dream, the industry must move from custom hardware gadgets and proprietary operating systems to fewer, more standard hardware architectures and operating systems. If we want ubiquitous connectivity and choice, we need fewer hardware architectures in our TVs, set-top boxes, DVD/DVR players, stereo systems, video systems, personal productivity gadgets, and gadgets that we cannot yet imagine. Once we have fewer hardware architectures, we would naturally have fewer operating systems. I would be living in a perfect world if I said there would be a single architecture with one OS running on it; in reality, we will have fewer, in the same way the PC industry has x86 as the dominant hardware architecture, and Linux, Windows and Mac OS as the main operating systems using it. Unfortunately, what I saw at the 2008 Consumer Electronics Show was the proliferation of even more custom hardware gadgets and operating systems. Yes, the devices were cool, but we had multiple different hardware architectures and different operating systems. This is bad for application services developers, because they need to develop and test on more gadgets and operating systems to increase their market penetration. Pragmatically, there will be no way for any single application services developer to keep up with the never-ending combinations. Therefore, the more vendors enter the market, the more fragmented it will become. Consequently, the next era of digital media may actually become less connected due to the proliferation of incompatible devices.

It would be presumptuous of me to give advice to any digital media provider. However, it is quite clear that unless industry leaders join forces to agree on a smaller set of digital media hardware architectures and operating systems, I do not see a ubiquitously connected digital media decade. Recent research has concluded that there are at least 30 realistic combinations of mobile handset, carrier, and operating system in the North American market alone. This makes application providers’ lives very difficult, as they have to develop and test on every single combination to reach everyone. Imagine if contemporary websites had to do the same. Or if the Internet browser concept had never existed. Remember those fat-client days?

Good or bad, as of now, the contemporary digital media vendors such as Apple, Palm, Microsoft, Sony, and Panasonic, among hundreds of others, mostly have their own operating systems and hardware architectures. Google’s Android is one step in the right direction, but its success is still debated. Nobody can predict what will happen in the next 5-10 years, but I sure hope that digital media hardware architectures become as common as x86, and that we can have operating systems such as Android, Windows Mobile, and Mac OS, among others, running on those common architectures.