Is Skype up for sale, again?

By Marguerite Reardon

Comments from eBay's CEO have sparked speculation that the online marketplace may be looking to unload Skype. But any suitor that comes knocking on eBay's door will likely have to pay a hefty price for the Internet phone service.


The Times of London reported Monday that John Donahoe, chief executive of eBay, described Skype as a "great standalone business" on a conference call last week with investors after the company reported disappointing fourth-quarter results.

The news has left many in the industry wondering if eBay will put Skype, which it paid a hefty US$2.6 billion to buy in 2005, on the auction block. Donahoe had said last year that eBay would consider selling the business unit if it could not be integrated with its auction or PayPal payment system.

And according to statements made during the conference call, it looks like Donahoe does not think there is much the Skype technology can do to help eBay's other businesses. When asked what eBay was doing to add shareholder value to Skype, Donahoe admitted that "the synergies between Skype and the other parts of our portfolio are minimal", the paper said.

So is eBay looking to unload Skype? The answer is probably yes and no. Because Skype is not core to eBay's online auction business, experts believe that eBay would be happy to let Skype go, at the right price. But Skype, which just posted a 26 percent gain in fourth-quarter revenue compared to a year ago, happens to be one of the few bright spots in eBay's overall business, which means that eBay is not desperate to let it go.

"Skype is not a drag on eBay at all," said Jim Friedland, senior Internet equity analyst for Cowen and Company. "In fact, it's one of the fastest-growing assets the company has right now. But I'm sure the company would sell it if they could get a hefty premium for it."
Indeed, eBay reported that its overall net income for the fourth quarter fell more than 30 percent compared to the same period last year, marking the first time ever that the company has seen its year-over-year earnings drop. It was also the second quarter in a row that the company saw the total value of all goods sold on the site fall, suggesting that the company's core business is struggling.
Meanwhile, PayPal and Skype, eBay's other two main business units, grew during the fourth quarter. PayPal revenues were up 11 percent to US$623 million. And Skype's revenue grew 26 percent to US$145 million.
Skype has also been adding new subscribers at a rapid pace. Scott Durchslag, the company's chief operating officer, told reporters at the Consumer Electronics Show this month that it has been adding about 30 million subscribers a quarter. It now has 370 million registered users worldwide. And these users are making lots of phone calls. Today, about 8 percent of the world's voice minutes originate from a Skype call, he said.
All told, Durchslag said Skype has been growing about 50 percent compared to the previous year in almost every metric--from minutes used to new subscribers to revenues. He also said the company just had its seventh straight quarter of profitability.
Because of this growth in Skype, eBay has little reason to sell Skype at this point. It could hold onto the service and run it as a separate business and still generate revenue.
Of course, any business or asset is for sale at the right price. But the price that a potential suitor would have to pay for Skype is probably too high.
Three years ago, eBay paid US$2.6 billion for Skype. There is no question now that the price tag was too high. In 2007, eBay said it would take a US$900 million so-called impairment write-down against the value of Skype. This means that eBay has been forced to reassess the value of the Internet telephony company relative to its overall business today. By recording a charge, the company is essentially saying that it has taken a loss on its original investment.
A "peak" in value?
Based on its current financials, the highest price that eBay could hope to get for Skype is about US$1.6 billion, Friedland said. And he said that would be a generous offer. eBay itself has valued the Skype assets on its balance sheet at US$2 billion, so it is unlikely the company would accept a lower figure, Friedland surmised.
Friedland also said the inflated price tag that eBay paid for Skype is already built into eBay's stock price, which means the company is under no pressure to sell off a bad asset.
"I'm sure eBay's shareholders probably think that money could have been better spent on something else, like paying them dividends," he said. "But that's water under the bridge at this point. Going forward, Skype doesn't really hurt the value of eBay."
In addition to the slumping economy, there are other reasons why potential suitors would likely not be willing to pay a premium for Skype.
For one, the strategic value of Skype today is not what it was three and a half years ago when eBay bought it. Today, the three major Internet and search companies that might be interested in Skype--Microsoft, Google, and Yahoo--already have comparable messaging services, including instant messaging and voice and video calling. So from a technology standpoint, Skype does not offer anything that the other companies do not already have. Even phone companies AT&T and Verizon Communications already have their own flavors of voice over IP technology.
So an acquirer would likely be buying Skype for its 370 million registered users, which is nothing to sneeze at. But the big question is how much money can be made from these users. Sure, people love Skype's free services, but most of the company's revenue comes from a small portion of its users--chiefly through its SkypeOut service, which charges users to make calls from Skype to regular landline phones and cell phones.
The SkypeOut revenue stream is sufficient to sustain Skype's business model today, but as IP networks are deployed throughout the world and all communications become IP-enabled, there will be fewer opportunities to make money from connecting Skype calls to the regular phone network. What's more, as Skype adds more subscribers, those users are more likely to talk to one another over the free Skype-to-Skype network rather than pay to call those friends and family on regular phones. It will likely take years for this scenario to play out, but it could color a potential acquirer's willingness to pay a premium for the service.
"As more people adopt Skype, there's potential for the asset to peak in value," Friedland said. "It won't likely happen for another five to eight years. And unless Skype comes up with a new meaningful revenue driver, it could start to decline."
Skype doesn't plan to sit around waiting for its business model to wither. The company sees mobile phone applications and video as big components of its future strategy. But again the question remains: How will Skype monetize these services?
Skype is also looking to make a push into the business market.
"We're seeing a whole new opportunity in the business market, as companies that I'd never have thought would be a target for Skype are proactively coming to us and asking for a solution," Durchslag said at CES earlier this month.
Most likely suitor: Microsoft
Despite all these factors, there is still a possibility that a company or two might be interested in buying Skype. Friedland said it would not surprise him if Microsoft made a play for Skype.
"We view Microsoft as the most likely candidate to acquire the Skype assets," he said. "Microsoft would probably be doing this to strengthen its Internet position and to buy market share."
He pointed to other deals that Microsoft has done lately that are geared more at grabbing market share than scoring profits. He used the recent mobile search deal with Verizon Wireless as a prime example. Microsoft has signed a deal with Verizon to become the default search engine on all Verizon Wireless phones. As part of the deal, Verizon agreed to pay a minimum of US$400 million. But Friedland said he doesn't think Microsoft will make much more than the minimum revenue on this deal.
"Microsoft has been very aggressive about making deals," he said. "So for them it might be all about buying market share."
This article was first published as a blog post on CNET News.com.

Technology to help Asian banks tackle crisis

By Sol E. Solomon

Negative economic factors affecting the global financial industry have led banks in the Asia-Pacific region to align their technology priorities with new strategies designed to help them respond effectively to the current crisis, a new report noted.

Financial Insights noted in its report, Top Ten Strategic IT Initiatives for Asia-Pacific Banks in 2009: What is Your Counter-Cyclical Strategy?, that banks are looking at "counter-cyclical" technology initiatives such as portfolio analytics, asset-liability management, and credit collections and recovery. The aim of these initiatives is to help banks minimize the impact of lower demand for banking products, shifts in customers' banking preferences, and rising credit delinquencies and defaults, noted the advisory firm, which is a subsidiary of IDC.

Michael Araneta, senior research manager at Financial Insights Asia-Pacific, said banks now have to find ways to generate new demand and find new sources of income.

"Technology imperatives here include CRM (customer relationship management) and customer-centric projects, and payments initiatives to generate fee income," Araneta said in a statement.

Li-May Chew, senior research manager at the research firm, said banks recognize that the modes of operation during boom times are ineffective in a crisis environment.

"IT optimization will be the key concern for bank IT leaders as they search for clarity in their existing technology assets, and see how these can be integrated more effectively to meet current and future requirements," Chew explained in the statement.

"The overarching objective is simply to do more with what you have," she added.

Despite the current economic crisis, Financial Insights said the Asia-Pacific region continues to provide opportunities to banks--thanks to recent wealth accumulation, relatively stable economic and corporate fundamentals, as well as pockets of under-served banking segments in the region.

According to the research firm, banks in the region are still expected to increase technology spending in several overarching priority areas in 2009, albeit at lower rates of growth compared to those seen in previous years.

The 2009 priority list is made up of discrete projects that are typically ad hoc and tactical in nature. These include virtualization, customer loyalty, customer retention, credit collections and recovery, and software-as-a-service, the report noted.

10 qualities of IT greatness

By Michael Krigsman

CIO success depends on connecting the chaotic IT environment to high-level strategic and business priorities that matter to the broader organization. Here's a list of 10 qualities of IT departments that have successfully bridged the strategic gap.

Successful IT groups are resilient, flexible, and highly responsive to organizational business needs. For many CIOs, this goal appears elusive and completely disconnected from the daily grind of servers, users, downtime, and help desks. Despite the difficulty, CIO success depends on connecting the chaotic, often crisis-driven, IT environment to high-level strategic and business priorities that matter to the broader organization.

Linking a tactical IT culture rooted in reaction and response to broader strategic goals is a worthy, if difficult, challenge, which requires understanding the areas of intersection between IT and the business. Despite the obstacles, IT must cross this bridge without disrupting its own operational ability to deliver projects on time and within budget while still achieving planned scope. To be sure, this balancing act requires careful and delicate choreography!

Paul M. Ingevaldson spent 40 years in IT, most recently as CIO of international retailer Ace Hardware. His recent column in Computerworld caught my attention because it presents ten qualities of IT departments that have successfully bridged the strategic gap.
I spoke with Paul and asked him to explain the list:
To accomplish the goal, you must take the list as a whole; it's not an à la carte menu and you can't leave pieces out.
For example, an effective steering committee ensures that IT projects reflect the consensus of C-level executives around business goals and the organization's automation strategy. Moreover, the organization needs a consistent set of rules around IT priorities, without the CEO randomly deciding how to manage IT. Unless the CIO reports directly to the CEO, and is therefore on equal footing with other officers, it's very hard to make this work.
Bridging the gap from tactical to strategic requires understanding the organization's business goals, which should be IT's basic context, and creating an IT group that's capable of executing well.
Paul's list of great IT qualities (which I have edited) addresses both considerations.
  1. The CIO reports to the CEO or, at least, the chief operating officer, giving the CIO clout and ensuring IT's independence.
  2. An IT steering committee, composed of C-level executives from the business units, makes allocation decisions based on a defined set of priorities and criteria such as ROI. The committee is necessary to ensure that investment decisions are made in the interests of the entire company and not just an individual department.
  3. The organization spends an appropriate percentage of corporate revenue on IT, indicating the company's level of commitment to IT.
  4. A well-managed, highly visible security team is in place, since this is one of the most vulnerable areas of IT.
  5. Disaster recovery plans and processes, involving users and a documented recovery plan, are well-established and tested regularly.
  6. An ongoing commitment to training keeps IT staffers up to date. Organizations that don't train IT folks and use lots of consultants are not sufficiently focused on in-house staff.
  7. Rigid adherence to an appropriate system development life cycle that both IT and the user community understand is a priority. Documenting the selection process offers insight into the professionalism of the IT organization.
  8. Well-defined technical and managerial career paths let all workers achieve higher pay and status. This is the only way to retain top technical people who don't want to manage others.
  9. A monthly major IT project status report is widely distributed throughout the company.
  10. The CIO participates in long-range, organizational strategic planning. If not, it's clear the business views IT as an implementer and not a strategic enabler.
For many IT departments, realizing every item on this list will take time, focus, and lots of internal selling and convincing. Despite the effort, that goal is in the best interest of all technology stakeholders within an organization: shareholders, senior management, users, and IT itself.
The goal is clear and the bridge available, making the new year a great time to take steps forward!
Michael Krigsman is CEO of Asuret, a software and consulting company dedicated to reducing software implementation failures. He is also CEO of Cambridge Publications, which specializes in developing tools and processes for software implementations and related business practice automation projects. This article was first published as a blog post on ZDNet.com.

IBM teams up with universities on cloud project

By Dawn Kawamoto

IBM on Monday announced that it has partnered with three universities to develop one of the first cloud computing platforms in the Middle East.


Big Blue, along with Carnegie Mellon University, Qatar University, and Texas A&M University in Qatar, plans to use the Qatar Cloud Computing Center to handle advanced research for search, data mining, scientific modeling and simulation, computational biology, and financial modeling.

"This will help us realize our vision of developing, evaluating, and extending a cloud-computing infrastructure in Qatar, to target regional applications and projects to help advance research," Majd F. Sakr, an associate professor at Carnegie Mellon University in Qatar, said in a statement.

The five pilot application projects include oil and gas seismic modeling and exploration, Arabic-language Web search engines, and a cloud-computing curriculum to be taught at the universities.

IBM and the universities will collaborate first on building the infrastructure for the cloud-computing platform, and then on developing applications based on Hadoop, the open-source programming model.
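To give a flavor of the programming model the project will build on, here is Hadoop's canonical word-count example: a map step that emits (word, 1) pairs and a reduce step that sums them across the cluster. This is a generic sketch against the standard Hadoop Java API, not code from the Qatar project, and exact signatures vary slightly between Hadoop releases.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hadoop's classic word count: map emits (word, 1); reduce sums per word.
public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE); // one (word, 1) pair per token
                }
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum)); // total count for this word
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```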

Conficker Worm Slimy and Sticky

Plus, Microsoft swaps security chiefs, IBM is unfazed by the economy, and more.

by Doug Barney

The Conficker worm, an RPC attack that's been in the wild since last October, is taking a squishing but it keeps on wriggling.

And the fact that the worm is still very much alive has been the source of much finger-pointing. CERT, for instance, claims that it's Windows' Autorun that makes it so easy for the worm to slink from machine to machine. CERT advises that Autorun be disabled and criticizes Microsoft for what it calls "ineffective" guidelines. Microsoft's answer? Poppycock!
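For administrators who side with CERT, the workaround that circulated with its advisory redirects Windows' handling of Autorun.inf files to a registry location that doesn't exist, so the files are effectively ignored. The sketch below applies that change through the standard reg.exe tool; it must run as an administrator, and the exact key and value should be double-checked against the advisory itself, since they are reproduced here from memory rather than from official Microsoft guidance.

```java
import java.io.IOException;

// Applies the registry workaround from the CERT advisory: map Autorun.inf
// handling to a location that does not exist, so Windows ignores autorun.inf
// files on all drives. Must be run from an elevated (administrator) prompt.
public class DisableAutorun {
    public static void main(String[] args) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "reg", "add",
                "HKLM\\SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\IniFileMapping\\Autorun.inf",
                "/ve",                     // the key's default value
                "/t", "REG_SZ",
                "/d", "@SYS:DoesNotExist", // point the mapping at a nonexistent location
                "/f")                      // overwrite without prompting
                .inheritIO()
                .start();
        System.exit(p.waitFor());
    }
}
```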

New NetBeans Certification

New year, new president...new certifications! Emmett previews a few of the latest exams to go live or beta in recent weeks. Plus, Books of the Week tackle VMware and Google.
by Emmett Dulaney  
1/21/2009 -- Sun has released a Sun Certified Specialist for NetBeans IDE 6.1 exam (CX-310-045), consisting of 61 multiple-choice questions, for those who develop on the Java desktop. Available through Prometric testing centers, the exam costs $300, and the passing score is 59 percent (36 correct answers). Here are the six sections that make up the exam:
  • IDE configuration
  • Project setup
  • Java SE development
  • Java EE Web development
  • Editing
  • Testing, profiling and debugging
A complete list of objectives can be found here.
New Network+ Exam Now Live
In that same vein, on Jan. 9, CompTIA went live with the updated Network+ exam (N10-004). The test now consists of 100 questions (up from 90) with a passing score of 720 (up from 554) on a scale of 100 to 900. The old version of the exam will remain available until July 31, allowing anyone still in training to finish their classes using existing material and earn the certification (which is good for life).
More information can be found here: http://certification.comptia.org/network/glancebox.aspx. The following is a list of the six top-level domains and their weighting:
  • Network technologies: 20 percent
  • Network media and topologies: 20 percent
  • Network devices: 17 percent
  • Network management: 20 percent
  • Network tools: 12 percent
  • Network security: 11 percent
Oracle 11g Performance Exam in Beta
The Oracle Database 11g: Performance Tuning exam (1Z1-054) has now entered the beta phase, with testing scheduled to end March 31. The exam costs $50 while in beta and consists of 18 sections and between 180 and 220 questions. You can find a complete list of objectives here.
Book of the Week: VCP Exam Cram
The VMware Certified Professional Exam Cram by Elias Khnaser packs a lot of information into an easy-to-follow text. It begins by looking at virtualization overall, then homes in on the intricacies of ESX Server and the infrastructure you need to master the technology. In doing so, it becomes one of the few exam crams that serves double duty: Not only can it prepare you to ace the exam, but it can also serve as a great reference tool after the fact.
As you read the book, it's easy to see and appreciate how well the author knows this material -- so well, in fact, that I feel confident saying that if you can't pass the VCP 310 exam with this book, you probably won't be able to do so with any other study aid.
Book of the Week: The Google Story
A number of books have been written about the hottest technology company of the past decade (if not ever), and David Vise's The Google Story is just one of them. What sets it apart from the others, however, is the fact that it has been updated and re-released with more current information about the company.
My favorite element is the inclusion of the Google Labs Aptitude Test (GLAT), which is used to see if you have the sort of intellect for the "mind-bending problems encountered each day at Google Labs." A fun read, "The Google Story" helps fill in the back story of a company you likely interact with on a daily basis.

How Will Barack Obama Change the Channel?

Barack Obama is set to become the 44th president of the United States. How should partners position themselves for the new administration's IT spending priorities?
by Rich Freeman
On the evening of Nov. 3, 2008, Sudhakar Shenoy was anxiously awaiting the outcome of a closely watched contest.

Shenoy is chairman and CEO of IMC Inc., a Reston, Va.-based solution provider and Gold Certified Partner that works extensively with the federal government. But it wasn't the next day's presidential face-off between Democrat Barack Obama and Republican John McCain that was commanding his attention -- at least, not exactly.

That evening, Shenoy was settling into his seat at FedEx Field in Landover, Md., to watch the hometown Washington Redskins take on the visiting Pittsburgh Steelers. As he well knew, long-standing legend held that this game's outcome could foreshadow the following day's events: In all but one United States presidential race since 1936, a win by the Redskins in their last pre-election home game heralded a victory for the party currently occupying the White House, while a Redskins loss signaled a defeat for the incumbents.

Shenoy has been in the IT industry for 27 years, a period spanning four presidential administrations. So he knew that, regardless of which candidate won on Nov. 4, business conditions were about to change for his company as well. He just didn't know how.

For Shenoy and many of his peers in the federal government IT market, the inevitable product of such uncertainty was fear. "Right now, we're all a little worried," he said before the game. "I wouldn't say we're scared stiff, but we're worried, because we don't know what's coming down the pike."

The final score that night? Steelers 23, Redskins 6. And sure enough, some 24 hours later, Barack Obama won a sweeping and historic victory to become the 44th president of the United States. Now, with Obama just weeks away from his inauguration, Shenoy and thousands of Microsoft partners like him are still nervously waiting to learn what that victory might mean for them. Says Shenoy: "The reality will only be known in the first 100 days of the new administration."
Playing the 'What If' Game
Though it was Barack Obama who emerged victorious from November's presidential election, partners that work with the federal government can't help but speculate on what a win by his Republican opponent, Senator John McCain of Arizona, might have meant for them.
Many believe that, as a military veteran, McCain would have cut defense spending less than Obama is expected to do, which would probably translate to more opportunities for IT companies serving the Pentagon. And in light of McCain's well-known aversion to government waste, demand for analytics software and other solutions that can help sniff out ineffective federal programs would probably have grown as well.
Meanwhile, McCain's fervent support for offshore drilling might have meant increased spending by oil and gas companies on exploration technologies and oil field management solutions. And some observers believe IT outsourcers would probably have done better under McCain than they will under Obama.
"It's fair to say that there's a more outsourcing-led bent in a Republican-led administration then there would be in a traditionally Democratic-led administration," says Rishi Sood, of analyst firm Gartner Inc.
On the other hand, McCain vowed late in his campaign to impose an across-the-board federal budget freeze. "[That] would mean maybe less use of cutting-edge programs, or new investments in technology," notes Brian Karlisch, CEO of Gold Certified Partner and solution provider Buchanan & Edwards Inc.
Or maybe not, says analyst Rob Enderle of IT research organization Enderle Group. He notes that both as a small-town mayor and as governor of Alaska, Republican vice presidential nominee Sarah Palin dramatically increased technology outlays. "She tends to be agreeable to the thought that technology can be used to increase the efficiency of government significantly," Enderle says. If McCain had put her in charge of the federal IT budget--an assignment Enderle considers highly possible--the result might have been major spending increases.
But, of course, the voters' decision on Election Day means that we'll never know for sure.
-- R.F.
Opportunities for the Shrewd

For partners serving the federal market, it's likely to be even longer before they fully feel the new administration's impact. The government is already well into its 2009 fiscal year, following the direction set for it by the outgoing administration some time ago. Though his newly installed department heads are likely to start shifting funding priorities by early summer 2009, Obama won't have a crack at developing his own budget before Q1 of the 2010 fiscal year, which begins Oct. 1, 2009.

Exactly how that budget document will affect federal IT spending remains a mystery, but most experts see outlays growing anemically--if at all. "I think overall they'll be leaner," says Thom Rubel, practice director for government programs at Government Insights, a division of analyst firm IDC. After all, Washington has already committed at least $700 billion to bailing out struggling financial institutions, and steering the nation through what's shaping up to be a severe and prolonged recession is sure to consume hundreds of billions more. With some observers predicting annual budget deficits of $1 trillion or higher in coming years, IT funding is likely to be in shorter supply than usual.

Indeed, Rubel expects Obama's first budget to raise overall IT investments by perhaps 1 percent to 2 percent at most. Yet most government-market veterans still believe that there will be plenty of opportunities for shrewd partners to exploit. For instance, with money tight, most federal agencies will be desperately seeking ways to lower costs. That could be good news for companies with virtualization expertise, notes Brian Karlisch, CEO of Buchanan & Edwards Inc., a Gold Certified Partner and infrastructure integration specialist with headquarters in Washington, D.C. By enabling organizations to consolidate applications on fewer and more powerful servers, virtualization can help cut bills for hardware, maintenance and electricity.
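A back-of-the-envelope sketch shows why that consolidation pitch is attractive. All of the utilization, capacity, and cost figures below are invented purely to illustrate the arithmetic.

```java
// Back-of-the-envelope arithmetic behind the consolidation pitch: many
// lightly loaded physical servers become VMs packed onto a few larger hosts.
// Every constant here is an invented illustration.
public class ConsolidationEstimate {
    public static void main(String[] args) {
        int physicalServers = 100;
        double avgUtilization = 0.10;        // assume each old box is ~10% busy
        double hostCapacityMultiple = 4.0;   // assume one host = 4x an old box
        double targetHostUtilization = 0.60; // assume a safe ceiling per host

        // Total demand, measured in "old-server" units, then packed onto hosts.
        double totalLoad = physicalServers * avgUtilization;
        int hostsNeeded = (int) Math.ceil(totalLoad / (hostCapacityMultiple * targetHostUtilization));

        double annualCostPerOldServer = 2000.0; // assumed power + maintenance per box
        double annualCostPerHost = 6000.0;      // assumed per larger host
        System.out.printf("Servers: %d -> hosts: %d%n", physicalServers, hostsNeeded);
        System.out.printf("Annual running cost: $%.0f -> $%.0f%n",
                physicalServers * annualCostPerOldServer, hostsNeeded * annualCostPerHost);
    }
}
```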

The Bush Legacy
As partners look ahead to Barack Obama's presidential administration, they're also reflecting back on President George W. Bush's two terms in office. For those serving the federal government, the memories are mostly fond ones.
"They've been good years," says Brian Karlisch, CEO of Buchanan & Edwards Inc., a Gold Certified Partner and infrastructure integration specialist. Indeed, though federal IT spending has grown modestly in the past year or so, it jumped from $30 billion to $66 billion between 2000 and 2006, according to analysts at Gartner Inc. Karlisch cites the wars in Afghanistan and Iraq, as well as the Bush administration's post-Sept. 11 focus on homeland security, as key reasons for that growth. "Technology has just played an enormous component in all of that, from back-office systems to Web systems to kiosks to bomb-detection equipment," he says.
Many IT companies without government clients, however, take a dimmer view of the Bush years. "It's hard to get excited when the economy just tanked," observes Enderle Group analyst Rob Enderle. In addition, he adds, Bush policymakers showed little interest in helping U.S. technology companies compete more effectively abroad. "They didn't do a lot of harm, but [they] also didn't do a lot of good," Enderle notes. "Generally, the technology industry will be glad to see this administration go away."
-- R.F.
Other partners predict an increased demand for project-management solutions. "The government has been running in circles on many projects," says Jose Marroig, CEO of Vienna, Va.-based Projility Inc., a Certified Partner that provides management and technology consulting services to a client list rich in government agencies. Federal managers are already hungry for high-tech ways to increase efficiency and operational discipline, Marroig notes, and that appetite is sure to increase in coming years.

Look for the nation's battered economy to drive increased spending by some federal agencies, too. For example, Rubel notes, the U.S. Treasury Department is likely to be in the market for financial software and reporting tools that can help it manage all those ailing bank stocks and illiquid mortgage-backed securities that it's buying. Similarly, some partners predict that the U.S. Department of Health and Human Services will be in need of workflow solutions and other technologies that can help its employees more effectively administer Medicare and Medicaid benefits. And don't be surprised if local and regional governments have more money for benefits-related technology as well, adds Rishi Sood, a research vice president at analyst company Gartner Inc. That's because Obama has vowed to help hard-pressed states, counties and cities cope with the impact of plummeting tax revenues and rising demand for social services.

Moreover, though some partners anticipate smaller military and homeland security budgets under Obama, many expect to see additional funds dedicated to shoring up the nation's cybersecurity defenses. Indeed, proof that such spending is overdue came just days after the November election, when news broke that overseas hackers had successfully broken into computers belonging to the White House and both the Obama and McCain campaigns.

"The U.S. has kind of been under siege with regard to hostile attacks, and the [Bush] administration just hasn't seemed all that interested in addressing the problem," says Rob Enderle, principal analyst at Enderle Group, an IT research organization based in San Jose, Calif. Both Enderle and Rubel expect that attitude to change under Obama, who pledged during the campaign to appoint the nation's first chief technology officer.
Geek in Chief
Budget priorities alone don't explain the affinity some in the IT community feel toward the president-elect, who raised untold millions from contributors in California's Silicon Valley during his successful presidential run.
Obama, his tech industry supporters believe, is one of them. From his campaign's innovative use of text messaging, blogs and social networking to the BlackBerry device that he's said to check compulsively, the president-elect appears more comfortable with technology than any of his predecessors. As Enderle Group analyst Rob Enderle puts it: "He seems to get technology from a very personal standpoint."
Then there's the fact that Obama's campaign platform included a pledge to appoint the nation's first-ever chief technology officer, "to ensure that our government and all its agencies have the right infrastructure, policies and services for the 21st century." (The CTO hadn't yet been named as this issue went to press.)
For many in the IT industry, those factors add up to a sense that the nation's new commander in chief has a strong understanding of the ways that technology can help address the nation's problems.
-- R.F.
Meanwhile, some government partners believe that opportunities related to Internet solutions and so-called Web 2.0 tools such as blogs and social-networking sites will spike under the new administration. Obama's campaign made sophisticated use of such technologies, and early signs suggest his White House will do the same: Within 48 hours of Obama's election, his transition team had launched an expansive and elegantly designed Web site, complete with detailed policy statements, a blog, a sign-up tool for receiving updates by e-mail and an online application form for jobs in the new administration. "Web 2.0 is already starting to catch on within [federal] agencies," notes Karlisch. With a tech-savvy administration taking office, that trend is only likely to accelerate.

Regulation and Reform
Of course, the government market isn't the only one that Obama administration policies are likely to affect. IT companies that support financial services companies will probably see demand shifts, too. With capital in short supply despite the government's massive relief program, technology spending by financial institutions will drop 4 percent in 2009, warns Jeanne Capachin, research vice president for global banking in IDC's Financial Insights division. But many financial services partners expect the Obama administration and a heavily Democratic Congress to impose strict new regulatory controls on Wall Street. That should be good news for ISVs and solution providers with compliance expertise. "Increased regulation always means more information and more systems requirements," says Marc Hebert, chief marketing officer at Gold Certified Partner Virtusa Corp., a consulting and outsourcing company in Westborough, Mass., that works heavily with financial-services companies.

Given the emphasis that Obama placed on universal health care during his campaign, many people saw flush times ahead for IT providers that serve doctors, hospitals and insurers. But slumping corporate earnings and mounting unemployment may force the incoming president to put his potentially costly reform plans on hold initially. "Right now, the primary focus is going to have to be on the economy," says Elizabeth Boehm, a principal analyst and health care expert at Forrester Research Inc. Even so, Obama has promised to invest $10 billion over the next five years on driving increased adoption of electronic health-information systems. In addition to electronic medical-record applications, which digitize patient information, those systems could include solutions that help doctors and hospitals lower costs as well as improve care by exchanging clinical data online.

The Obama years could also be good ones for partners that serve the energy sector--eventually, anyway. "In the short term, we think that IT spending in the utilities industry is going to decrease until the market recovers," says Jill Feblowitz, practice director for business technology at IDC's Energy Insights division. Still, Obama has vowed to spend $150 billion over the next 10 years on "climate-friendly" energy ventures such as biofuels and wind power. Feblowitz believes "smart meters" and "smart grids" could be part of that initiative as well. Such systems transmit real-time information on how much power households consume and when they consume it, enabling utilities to promote energy conservation by charging higher prices at peak times. Energy Insights believes that the smart-meter market will rise from roughly $3.4 billion today to $5.5 billion by 2011.
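As a rough illustration of the peak-time pricing a smart meter enables, the sketch below bills one day of hourly readings at two rates. The tariff, the peak window, and the readings are all made-up numbers.

```java
import java.util.Arrays;

// Sketch of the time-of-use pricing a smart meter enables: consumption is
// reported per hour, and peak hours are billed at a higher rate. The tariff,
// peak window, and readings are all made up for illustration.
public class TimeOfUseBill {
    public static void main(String[] args) {
        double peakRate = 0.30;    // assumed $/kWh during the peak window
        double offPeakRate = 0.10; // assumed $/kWh otherwise

        double[] hourlyKwh = new double[24];
        Arrays.fill(hourlyKwh, 1.0);                      // baseline: 1 kWh every hour
        for (int h = 17; h < 21; h++) hourlyKwh[h] = 2.5; // heavier evening use

        double bill = 0.0;
        for (int hour = 0; hour < 24; hour++) {
            boolean peak = hour >= 14 && hour < 20;       // assumed 2pm-8pm peak window
            bill += hourlyKwh[hour] * (peak ? peakRate : offPeakRate);
        }
        System.out.printf("Daily bill: $%.2f%n", bill);
    }
}
```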
Questions Without Answers

More broadly, many in the IT community think the new president's policies in other areas could prove beneficial to them. For example, Obama has promised to more vigorously protect copyrights and trademarks overseas and to crack down on punitive regulatory and tax barriers that tilt the playing field in foreign markets. Enderle believes that the Obama administration may also ease export restrictions that were designed to keep certain advanced technologies, like encryption, from falling into adversaries' hands. "The reality is, in many cases, the technology overseas is as advanced -- if not more advanced -- than technology here," Enderle notes. In those cases, what government controls ultimately do is put American hardware and software makers at a competitive disadvantage abroad.
Still, some partners fear that the incoming administration's tax policies could end up having a less welcome impact on them. Obama says he will eliminate capital gains taxes for small businesses as well as provide tax credits for employee health care coverage and "investments in innovation." However, Shenoy is one of many in the Microsoft channel who expect their corporate tax bills to rise just the same, putting added pressure on already strained balance sheets. "We'll probably have to watch our cost structure more carefully," Shenoy predicts.

Meanwhile, the tax question isn't the only one causing anxiety throughout the technology industry. Could Obama's pro-labor stance shift power from IT companies to their employees? "Right now, the technology industry is running reasonably well with the existing balance, and you wouldn't want to muck with it," says Enderle. Will the new president's stated intention to end tax breaks for companies that ship jobs overseas hurt service providers with operations in India and other emerging markets? "His populist perspective on job loss does suggest there's some risk to IT outsourcing companies like us," observes Virtusa's Hebert.

It's likely to be many months before we have definitive answers. In the meantime, however, Shenoy finds solace in two convictions: First, that Obama ultimately wants what's best for the country; second, that nothing the new commander in chief does is likely to change the popularity of Microsoft software. He sums up the latter point this way: "I don't care which administration comes in, [businesses] will still have to buy Microsoft products."

Conficker Worm Still Wreaking Havoc on Windows Systems

Users of the Windows Server service who haven't patched a previously disclosed worm hole (MS08-067) are taking a big risk.

by Jabulani Leffall


More and more enterprises continue to get hit by a Conficker worm variant, Roger Halbheer, chief security adviser for Microsoft's Europe, Middle East and Africa Group, wrote in a blog post on Wednesday.

Microsoft on Tuesday released its January security update. However, the company is still trying to get Windows pros to fix an issue that first appeared as an out-of-band bulletin back in October. That patch addressed a remote code execution (RCE) vulnerability in remote procedure call (RPC) requests that could affect a whole network if harnessed by hackers.

These stepped-up warnings from Microsoft come after independent IT security firm Panda Security said in several reports that Windows users haven't seen the last of the RCE exploit. Both Panda and Symantec officials have said they've seen increased attacks, as well as growth in malware, stemming from Conficker.

In cases where the security patch hasn't been applied, Conficker-type worms can hit Windows-based PCs with malicious RPC packets. Specifically, the flaw lets a specially crafted RPC request execute code on a vulnerable machine automatically, with no user interaction. The worm can affect the Windows 2000, XP and Vista operating systems, as well as Windows Server 2003 and 2008.

Because this month's patch cycle was so thin, now might be the moment to look seriously at the October fix, experts say.
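One quick spot check an administrator can run is to scan a machine's installed-hotfix list for the knowledge-base ID associated with MS08-067, which is KB958644. The sketch below shells out to the standard wmic utility from Java; it is a rough check rather than a substitute for proper patch management, and the KB number should be verified against the bulletin.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Rough spot check for the MS08-067 fix: list installed hotfixes with wmic
// and look for KB958644, the knowledge-base ID of the October update.
public class CheckMs08067 {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("wmic", "qfe", "get", "HotFixID").start();
        boolean patched = false;
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.contains("KB958644")) { // the MS08-067 security update
                    patched = true;
                }
            }
        }
        p.waitFor();
        System.out.println(patched
                ? "MS08-067 patch found."
                : "MS08-067 patch NOT found -- patch this machine.");
    }
}
```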

"For administrators who failed to patch the RPC vulnerability that was reported back in October 2008, this is the best time to go back and patch the issue," said Paul Henry, security and forensic analyst for endpoint security outfit Lumension. "Security experts are starting to see new variants appearing in the wild for this. We're seeing more widespread use of the vulnerability today than we did back in October."

The fact that so many Windows systems seem to be at risk raises questions about security patch turnaround times in the enterprise. A Qualys survey found that more than 50 percent of machines get patched within approximately 30 days.

"After that period, we see the patch rates go down and the overall number of machines that are attackable only slowly diminishing," said Wolfgang Kandek, chief technology officer of Qualys Inc. "Unfortunately this leaves enough machines to be exploited by the "Conficker" worm types even today, over 45 days later."
Kandek sees a general lack of alarm among IT pros.

"We would have liked to see a faster reaction by the computer users given the significance of the patch but there still seems to be a barrier to reach everybody and make them understand the urgency of patching."

Microsoft Releases Desktop Virtualization Beta

Microsoft released a beta of Microsoft Enterprise Desktop Virtualization (MED-V), which lets users run older OSes and applications on newer Windows systems.

by Kurt Mackie

Microsoft on Thursday released a beta of Microsoft Enterprise Desktop Virtualization (MED-V), a solution that lets users run older operating systems and applications on newer Windows-based systems. The MED-V 1.0 Beta is Microsoft's first release incorporating technology from Kidaro, a company Microsoft acquired in May. One of the main uses for MED-V is to give companies a bit more time to use so-called "legacy" software applications even while they upgrade Windows across the network. Microsoft's scenario is an IT shop's migration to Windows Vista.

A video explaining the technology on Microsoft's MED-V Web site suggests that IT professionals could upgrade their networks to Vista while still running Windows XP-based applications through the use of Microsoft Virtual PC 2007 and MED-V.

The product aims to smooth upgrade challenges for IT teams, according to Ran Oelgiesser, Microsoft's senior product manager for MED-V, in an announcement.

"With MED-V 1.0, you can easily create, deliver and centrally manage virtual Windows XP or 2000 environments (based on Microsoft Virtual PC 2007), and help your users to run legacy applications on their Windows Vista desktops," Oelgiesser said. "No need to wait for the testing and migration of those incompatible applications to complete."

What MED-V does is not the same as application virtualization. By virtualizing the desktop, MED-V lets users create virtual machines running a guest operating system on either a server or an individual PC. Those VMs can be used to provide a Windows XP desktop environment for the end user, even though the underlying system runs Vista.

MED-V is part of the Microsoft Desktop Optimization Pack (MDOP), which includes various "virtualization and management technologies" for Windows-based platforms, according to Microsoft's announcement. Microsoft plans to release the first version of the MED-V product "in the second quarter of this year" as a part of MDOP, Oelgiesser explained.

The company may also make it easier to run VMs on equipment outside the company as part of Microsoft's Vista Enterprise Centralized Desktop (VECD) licensing.

"It is our intention that in future releases, MED-V in conjunction with the new VECD licensing, may be used to deliver a corporate virtual image to 'unmanaged' PCs, and reduce the tension between IT control and user flexibility," the Microsoft's MED-V home page states.

Microsoft offered up changes to VECD licensing in September, as analyzed here. The licensing is part of Microsoft's Software Assurance plan.

MED-V, which works with Virtual PC 2007, may not be supported on the Windows 7 Beta, which Microsoft unveiled this past week. A Windows 7 TechNet Forum comment stated that Virtual PC 2007 isn't officially supported by the Windows 7 Beta. Still, some testers have reported success in using Virtual PC 2007 with the OS beta release (Build 7000).

Wi-Fi won't replace mobile broadband

By Victoria Ho

Mobile broadband is expected to weather the economic downturn because it is becoming mainstream, with Wi-Fi hotspots not posing a significant threat, according to analysts.


Daryl Schoolar, senior analyst, wireless broadband at In-Stat, said in an interview with ZDNet Asia that he doesn't expect mobile broadband usage to take a big hit because the technology is becoming a mainstream commodity.
"A lot of laptop data card subscriptions are paid for by businesses, so I don't expect that to be affected. Businesses may look for other areas to cut cost. Connectivity is indispensable," Schoolar said, but added that consumers and SMBs may think twice about signing up for new contracts.

According to a recent Parks Associates study, the mobile broadband market is expected to weather the economic downturn because it is "transitioning into a mainstream service".

The study projects the number of U.S. mobile broadband users to more than double between 2008 and 2013. The research firm expects reliance on mobile broadband to be boosted by smartphone sales throughout the period.

Mobile broadband is expected to continue to prosper in spite of Wi-Fi hotspots being "very popular" in the United States. Schoolar said operators such as AT&T offer bundled services such as free Wi-Fi connectivity at hotspots for their existing DSL or iPhone subscribers, which helps prevent Wi-Fi and cellular broadband from becoming two competing services.

Operators in Asia are adopting similar tactics. Bryan Wang, research director, Asia-Pacific connectivity, Springboard Research, said in an e-mail interview with ZDNet Asia that China Mobile has bundled 10 hours per month of Wi-Fi usage with its upper-tier cellular broadband subscriptions.

Wang said there is a possibility some corporations may consider Wi-Fi as an alternative to curb spending on mobile broadband, but noted that in countries like Singapore, Hong Kong and Taiwan, there is consumer momentum to switch over to 3G cellular broadband from current fixed broadband services.

"China Mobile uses both to provide comprehensive coverage for customers," he said, but added that "we see the trend of consumers moving to cellular broadband, as well as business users".

He said: "Cellular broadband is not necessarily more expensive than fixed broadband... This has been happening in the past six months for non-gaming consumers, who just need to surf anytime, anywhere."

Free is not good enough
One trump card mobile broadband has over Wi-Fi is ubiquity, said In-Stat's Schoolar. "Out in the field, hotspots are not an option," he said.

"It's a tough market for Wi-Fi operators. They have to branch out and provide other services than just selling access, [such as] running a managed LAN service, for example, or tying up to control valuable real estate like airports and business hotels," he added.

Wang said: "Wi-Fi coverage is still an issue for most countries."

And bundled offers from operators giving home users fixed broadband and cellular data could "kill" paid Wi-Fi hotspot services, he said.

In the United States, AT&T sells a data plan together with its iPhone, as does T-Mobile with its Android-based G1 smartphone, said Schoolar.

Five strategies for 2009 IT gold

By Michael Krigsman

For successful IT projects, consider these strategies, which cover the relationships between IT and its environment and also address culture and process.

Let's talk about running successful IT projects in 2009. This discussion is more important than ever, because IT problems remain common, with some estimates suggesting 68 percent of projects fail.

Despite the staggering odds, follow these five strategies to reach the IT pot of gold.

1. Meet business needs.
Every IT project must accomplish a business goal or risk becoming a wasteful boondoggle. Poor communication between business and technology groups complicates this simple concept inside many organizations.

If the business side routinely criticizes your IT team, get together and ask them for guidance. While isolation brings failure, discussion is a true harbinger of success. Conversation with the business is the right place to begin an IT improvement program for 2009.

2. Innovate.
Conversations with the business should help both sides work together with greater creativity and flexibility. Adaptability is fundamental to survival, especially in tough economic times, so being ready to accept change is a prerequisite for success.

Although listening carefully to user requirements is the first step, being self-critical as an organization is also necessary. Great things happen when IT embraces a culture of continuous change and improvement.

3. Be honest.
Denial is the handmaiden of failure and a leading cause of project death. Change is impossible until a team accurately recognizes its own weaknesses. Having done so, the team can take remedial measures that shore up weaknesses and support strengths.
Objective self-appraisal is the hardest item on this list to accomplish; few organizations do this well.

4. Align vendors.
Virtually all projects involve the IT Devil's Triangle: the customer, technology vendor, and services provider. As I have previously written, "These groups have interlocking, and often conflicting, agendas that drive many projects toward failure."
Given the great importance of these relationships, success depends on managing the vendors to your advantage. Use contractual incentives and penalties to ensure external vendors operate with your best interests in mind.

5. Arrange sponsorship.
Many IT initiatives cut across political boundaries within an organization. For that reason, gaining consensus among participants and stakeholders is sometimes hard.

Since problems inevitably arise, a strong executive sponsor is a critical success factor on all large projects. Make sure the sponsor fully understands his or her role and is committed to active participation. The best sponsors care passionately about the project's goals. Conversely, sponsors who don't play an appropriate advocacy role when needed can kill an otherwise healthy project.

These five points cover relationships between IT and its environment, which includes internal stakeholders and external partners. They also address culture and process, bringing together essential ingredients to overcome many of the problems that plague IT.

What do you think is the best path to achieving successful IT in 2009?

Michael Krigsman is CEO of Asuret, a software and consulting company dedicated to reducing software implementation failures. He is also CEO of Cambridge Publications, which specializes in developing tools and processes for software implementations and related business practice automation projects. This article was first published as a blog post on ZDNet.com.

Sun eyes cloudware position

By Victoria Ho

Sun Microsystems is gunning for the cloud space and eyeing, in particular, the position of being a platform provider.


Speaking at a media session Tuesday, Matt Thompson, senior director, developer cloud tools at Sun, said the company intends to be a platform-as-a-service (PaaS) provider, that is, to provide the underlying facilities supporting software-as-a-service (SaaS) applications.

This is distinct from providing the infrastructure for the cloud, such as data centers, which is a more basic layer in the cloud stack.

Thompson, four months into his role in Sun's cloud division, explained that the company had recently reshuffled to focus on the cloud space.

"The key to [succeeding in] the cloud computing business is to be a platform," he said.

He brought up Windows Azure as an example of a competitor eyeing the PaaS position. Azure is Microsoft's cloud-friendly version of its Windows OS, designed to run over the Internet from Microsoft's data centers.
Azure is positioned as an alternative platform for developers, allowing them to write programs outside of their business' servers.

Sun's OpenSolaris OS is also targeted toward the developer community, as a test bed for programs. "It will take more magic than [Azure] for Microsoft to be a platform provider though," Thompson said, adding that OpenSolaris was a better candidate because of added administrative tools, such as its ZFS filesystem.

Sun hopes OpenSolaris will be its entry point into the PaaS scene, by courting developers. Its Sun Tech Days conference is aimed at warming developers to the platform, by educating them on Sun's technology such as Java and Solaris.

"There is a demand for new developers to learn how to deploy apps through the cloud. And there is a huge demand for elastic compute power, even in large enterprises," said Thompson.

Jeff Jackson, senior vice president, Solaris Engineering, Sun, said: "A lot of financial institutions' IT departments will pay for a test bed to run a pilot outside of the network, if it is reliable. Small development firms too want to see if their apps will scale."

Jackson estimates there are some 250,000 registered OpenSolaris users, of whom "hundreds of thousands" are active.

He aims to grow the number of registered users to a million by the end of this year, he said, and to multiply that figure further next year.

Nortel plans a restructuring in Asia

By Joel D. Pinaroc

Nortel Networks will focus on a massive restructuring plan in its Asia-Pacific operations following news that the troubled network gear maker filed for bankruptcy last week.


In a statement Monday from Nortel's Asia-Pacific headquarters, the company said it "has initiated a business and financial restructuring process to strengthen the business for the long-term".

The statement also said "this is a solution that will allow Nortel to deal decisively with its cost and debt burden, to effectively restructure its operations and to narrow its strategic focus in an effective and timely manner".

The company, burdened by bond debts, is reportedly considering "shedding" some of its business units to remain afloat.

But the statement from the Asia-Pacific headquarters said that "Nortel is still very much in business and we will continue to be 100 percent focused on driving results for all of our stakeholders".

Reports said Nortel, saddled by debts amid a slowdown in sales from its major markets, filed for bankruptcy protection in the United States, Canada and Europe, in an apparent effort to "save" some of its business.
The bankruptcy filing came a day before the firm was due to make an interest payment of about US$107 million, reports also said.

As expected, Nortel shares took a dip following the news.

Major markets for Nortel, such as the United States and Europe, have slowed considerably due to intensifying competition, the reports also said.

Gavin Graham, director of investments at BMO Asset Management in Toronto, was quoted in a Reuters report, saying "it's obviously a remarkable transformation from where it was as the largest company in Canada worth about 35 percent of the (Toronto Stock Exchange) in 2000".

"But this is a reflection of the way that the telecommunications industry has changed." the analyst added, according to Reuters.

Nortel is facing tough competition from virtually all regions.

In Asia, the company faces formidable rivals in Huawei and ZTE, while Alcatel-Lucent, Ericsson, and other network equipment makers continue to compete with the firm.

Philippines operations
In the Philippines, meanwhile, the company remains mum on the issue.

In its latest press conference in December, Nortel Philippines reported that the company continues to be "liquid".

It also disclosed plans to offer new network technologies for large Philippine enterprises, a major contributor to its business.
Mandy Pascual, Nortel Philippines country manager for enterprise, said Nortel will focus on large enterprises, particularly on applications based on telepresence.

Pascual said that although there are no telepresence installations in the Philippines yet, large companies--including telecommunications providers, business process outsourcers (BPOs) and multinationals--are seriously looking at the technology.

The ongoing global economic crunch, Pascual said, could be one of the catalysts for telepresence to take off in the Philippines, as companies look for long-term products that deliver cost savings in their operations.
The executive said actual telepresence installations in the Philippines maybe seen in "one to two years" from now.

Joel D Pinaroc is a freelance IT writer based in the Philippines.

Gartner: 2009 a 'deciding year' for Sony Ericsson

By David Meyer

Sony Ericsson's latest quarterly results, which show a significant drop in revenue, have prompted analysts to suggest this year will be make-or-break for the mobile-phone manufacturer.


The company's results for the fourth quarter of 2008 were published on Friday. Sony Ericsson lost US$187 million in that quarter, compared with a loss of just US$36.9 million in the previous quarter. In the fourth quarter of 2007, Sony Ericsson made a US$552 million profit.

"In economic terms, 2008 has been a tumultuous year with world markets experiencing a serious downturn," said Sony Ericsson president Dick Komiyama in a statement. "The mobile-phone market has been greatly affected by this and as expected, the fourth quarter continued to be very challenging for Sony Ericsson. Our business alignment is progressing as planned, with the full effect of annual savings of around US$444 million expected by the second half of 2009. We foresee a continued deterioration in the market place in 2009, particularly in the first half."

Gartner research director Carolina Milanesi said in a statement last week that Sony Ericsson's sales for the fourth quarter of 2008 came in "at the low end" of the analyst house's expectations. "The market in the last quarter of 2008 continued to be very challenging especially for Sony Ericsson which remains particularly exposed to the weakness of the Western European market," Milanesi wrote.

"We continue to believe that maintaining the third position in the worldwide ranking achieved in the third quarter of 2008 will be very difficult for Sony Ericsson," Milanesi said. "With sales in 2009 forecasted to slow down and the weakness in the European and Japanese market expected to continue, Sony Ericsson needs to build presence in markets such as North America where market share has historically been limited."
(Photo caption: Palm stole the show at CES, revealing the Pre smartphone and WebOS operating system.)

Milanesi added that Sony Ericsson's decision to join the Google-led Open Handset Alliance and the Symbian Foundation had been "the right steps", partly because Sony Ericsson has not had a significant presence in the smartphone market so far.

"We believe that 2009 will be a deciding year for Sony Ericsson as it battles between profitability and market share growth," Milanesi wrote.

The average selling price (ASP) of a Sony Ericsson handset in the fourth quarter of 2008 was US$140, up slightly from the preceding quarter but down from US$164 in the fourth quarter of 2007. The company attributed the quarter-on-quarter ASP increase to "a positive impact of foreign exchange rate fluctuations and to the sale of a higher proportion of high-end models".

Currency fluctuations were also credited for a four percent quarter-on-quarter rise in sales (US$3,886 million, up from US$3,746 million), but blamed for having a "large negative impact" on costs. Sales in the fourth quarter of 2007 totaled US$5,030 million, so the fourth quarter of 2008 showed a 23 percent year-on-year drop in sales. According to Sony Ericsson, this was "driven by lower volumes, due to the global economic slowdown that resulted in contracting consumer demand and decreased availability of credit".
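
For readers who want to check the arithmetic, here is a quick sketch in Python (the figures are those quoted above; the rounding conventions are mine):

    # Verify the sales changes cited above (figures in millions of US dollars).
    q4_2008 = 3886
    q3_2008 = 3746
    q4_2007 = 5030

    def pct_change(new, old):
        """Percentage change from old to new."""
        return (new - old) / old * 100

    print(f"Quarter-on-quarter: {pct_change(q4_2008, q3_2008):+.1f}%")  # +3.7%, i.e. about four percent
    print(f"Year-on-year: {pct_change(q4_2008, q4_2007):+.1f}%")        # -22.7%, i.e. about 23 percent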

In its statement, Sony Ericsson estimated that its market share in the fourth quarter of 2008 was around eight percent. The company also forecast that "the global handset market will contract in 2009 and that the industry ASP will continue to decline".

Jockeying starts for Philippine automated polls

By Melvin G. Calimag

MANILA--With just about one and a half years remaining before the Philippines holds its first nationwide computerized elections, the government's main poll body and machine suppliers are racing against time to beat the deadline.


The Philippines, a vast archipelago comprising thousands of islands, is set to conduct its synchronized general elections in May 2010. In August last year, it piloted a poll automation exercise in the Autonomous Region in Muslim Mindanao (ARMM) to conform to a 2007 law mandating the implementation of computerized elections.

The Commission on Elections, better known to the public as Comelec, has submitted an 11.9 billion peso (US$252 million) supplemental budget request for the poll automation, but it remains uncertain when Congress will give its nod.

The budget for the counting machines was not included in the original appropriation for the Comelec because the agency was still evaluating which poll technology to use when the 2009 budget was passed in December 2008.

Congressional leaders, including Sen. Richard Gordon, principal author of the election automation law, have expressed interest in granting the additional funding swiftly once Congress resumes on Jan. 19.
However, Comelec spokesperson James Jimenez said in a phone interview that the poll agency is not expecting to have the budget approved until mid-February.

"That means we can only award the contract and start the procurement of the machines by March or early April," the Comelec official said.

The timetable cited by Jimenez jibes with the ideal target of one machine vendor, which hopes to deploy a mix of technologies for next year's polls.

Robert Cook, president for worldwide sales at Smartmatic, said in a press briefing Thursday that March would be a good time to award the contract, as this would give the winning bidder sufficient time to roll out the machines, as well as to train poll personnel.
"If they'll announce it on April or May, it may be a little bit more difficult to do it considering the geography of the country," said Cook, whose company participated in the ARMM elections by deploying 2,500 DRE (direct recording electronic) machines.
Although it specializes in DRE technology, Smartmatic has teamed up with two other foreign-based companies to offer an integrated package that includes Precinct Count Optical Scan (PCOS) and Central Count Optical Scan (CCOS).
Smartmatic initiated the tie-up with its technology partners after the Comelec Advisory Council (CAC) came out with a resolution on Nov. 29, 2008, recommending that DRE or PCOS technology be used in poll areas in 2010 where possible, with CCOS technology for areas not covered by either.
The CAC, an inter-agency advisory body created by the poll automation law and headed by the chairman of the Commission on Information and Communications Technology, is mandated to recommend the "most appropriate, secure, applicable and cost-effective technology to be applied in the automated election system".
Melvin G. Calimag is a freelance IT writer based in the Philippines.

Fake reviews prompt Belkin apology

By David Meyer

Fake positive reviews of Belkin products were actively solicited by one of its employees, the company admitted last week.


Belkin, a networking and peripheral manufacturer, apologized for the worker's actions, which sought to artificially boost Belkin's status on Amazon while denigrating existing bad reviews.

Last week, The Daily Background Web site revealed how someone, apparently Belkin business-development representative Mark Bayard, had used the Mechanical Turk service to ask users to write positive reviews of a Belkin product at a rate of 65 US cents per review. The requests made it clear that writers need have no experience of, nor even own, the product in question. Mechanical Turk is an online clearing-house for small jobs that cannot be done by machine, such as writing product descriptions. It is, coincidentally, run by Amazon.

In a letter posted on the company's Web site last week, Belkin president Mark Reynoso said the solicitations had been "an isolated incident".

"It was with great surprise and dismay when we discovered that one of our employees may have posted a number of queries on the Amazon Mechanical Turk Website inviting users to post positive reviews of Belkin products in exchange for payment," Reynoso wrote.

"Belkin does not participate in, nor does it endorse, unethical practices like this. We know that people look to online user reviews for unbiased opinions from fellow users and instances like this challenge the implicit trust that is placed in this interaction. We regard our responsibility to our user community as sacred, and we are extremely sorry that this happened."
Reynoso said Belkin had "acted swiftly" to remove all the review requests from the Mechanical Turk system, and was "working closely with our online channel partners to ensure that any reviews that may have been placed due to these postings have been removed".
"It's also important to recognize that our retail partners had no knowledge of, or participation in, these postings," Reynoso wrote. "Once again, we apologize for this occurrence, and we will work earnestly to regain the trust we have lost."
According to The Daily Background, the product for which the positive reviews were requested was Belkin's wireless F5U301 USB2.0 hub and dongle, listed on Amazon.com. On Monday, the listing for that product on Amazon.com showed a rating of one-and-a-half stars out of five.

Dell launches 256GB solid-state drives for XPS

Dell has increased the capacity of the solid-state drives on its range of XPS laptops, but not the price--at least as far as the United States is concerned.


Late last week a Samsung-manufactured 256GB SSD appeared on Dell's U.S. Web site as an option for the XPS M1330 and M1730 laptops. The SSD is available as an upgrade, at a US$400 premium, to the base specifications of those laptops.

Before the option of a 256GB SSD was made available, the largest SSD a customer could order as an upgrade for the XPS machines was 128GB in capacity. Upgrading to the 128GB SSD also cost US$400.
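
In per-gigabyte terms, the unchanged premium is worth noting; here is a quick sketch of the arithmetic (based solely on the upgrade prices quoted above):

    # Cost per gigabyte of Dell's SSD upgrade premium, per the figures above.
    premium_usd = 400  # upgrade price for either SSD option
    for capacity_gb in (128, 256):
        print(f"{capacity_gb}GB SSD: ${premium_usd / capacity_gb:.2f} per GB")
    # Prints $3.12 per GB for 128GB and $1.56 per GB for 256GB; the same
    # premium now buys twice the capacity.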
ZDNet Asia's sister site ZDNet UK asked Dell if and when the 256GB SSD option will be made available to U.K. customers, but had not received a reply at the time of writing.

M'sia telecom spending to stay resilient in 2009

By Lee Min Keong

KUALA LUMPUR--Telecommunications spending in Malaysia is expected to remain resilient this year, growing moderately to RM24 billion (US$6.7 billion) despite the slowing economy, IT research firm IDC said.
Esther Gan, market analyst for IDC Market Research, said: "Malaysia's telecommunications spending encompassing mobile services, Internet access, corporate data services and fixed telephony, are expected to grow at 4 percent in 2009."
However, the dismal economy will reduce the subscription of non-essential telecom services such as redundant fixed line services, said Gan at a media briefing on "Malaysia Telco Predictions" here Thursday. She added that the global economic meltdown will affect the telecom sector in the Asean markets, but to a lesser degree than the IT markets.
"The way telecom operators will rethink their strategies in response to the economic crisis, however, differ from one country to another. Malaysian telecom operators will look into managed services as new revenue streams that could help increase ARPU (average revenue per user) from basic voice and data services," said Gan.
This will likely result in service providers forging stronger alliances with vendors to deliver comprehensive managed offerings, she added.
IDC said the incumbents in Asean will have renewed interest in the small and midsize business (SMB) segment this year.
"The economic meltdown will hinder prospects of growth for large corporations. Any initial signs of recovery will be first witnessed in the SMB market. 2009 would be a good time for incumbents to invest in marketing [aimed at] SMBs," said Gan, adding that service providers can leverage on improved telecom infrastructure and technology advancements to develop a wide range of services specific to requirements of SMBs.
Affordable alternatives
With the tightening of spending, IDC anticipates a higher preference among businesses for alternative products this year. Gan explained: "SMB/SOHO (small office home office) establishments and larger corporations will look into more affordable networking products. Interest for new brands in the market may be seen in 2009."
In the mobile phone devices category, she expects users to select mid-range units over familiar high-end brands, with purchases of devices carrying steep price tags pushed toward the end of 2009. Broadband and mobile subscribers will seek affordable ways of staying accessible, whether online or on the move, she added.
Gan said strong demand and continuous marketing efforts by service providers will sustain strong growth in wireless broadband, both within key cities and in surrounding locations in Asean.
"On-going competition among technologies such as 3G, WiMax and iBurst will drive online access growth in Malaysia. Wired broadband service providers will not be slow in providing counter-offers. Such offerings may include wired broadband service bundles that include public hotspot access time in order to bring some 'mobility' element to their offerings," she said.
Although wireless broadband will gain a wide customer base, wired broadband will still be the dominant technology, added Gan.
IDC said despite the gloomy economic outlook, mobile and wireless broadband providers will continue to invest in 3G and WiMax infrastructure to expand their network and attract new subscribers.
Chua Fong Yang, associate market analyst, IDC Market Research, said: "Although 3G has been commercially available in Malaysia since 2005, the service coverage remains limited and concentrated in key urban areas. Competition in the 3G market space will intensify with DiGi.Com expressing their interest in rolling out 3G commercially in 2009."
WiMax services operator P1 will continue its aggressive expansion in 2009 while Malaysia's three other WiMax players will likely take the "wait-and-see approach", Chua said.
"Subscriber acquisition will become the most important objective of wireless service providers in year 2009," he added.
IDC also predicted that 2009 would be a year where the Internet data center model gets a second chance to shine. "Enterprises in Malaysia will fundamentally rethink their data center strategies in 2009 and will reconsider the managed route in order to drive cost down," said Chua.
"Various data center issues have driven up the capital expenditure and operating expenditure (Capex and Opex) over the years. Given the unfavorable economic outlook for 2009, many companies will consider outsourcing and migrating their data centers," he added.
The research firm also confirmed that the mini-notebook phenomenon had taken hold in Malaysia and would be a bright spot for the PC market this year. "Mini-notebooks were the fastest growing products in Malaysia in 2008 and the growth is not expected to slow down in 2009," said Chua.
IDC forecasts mini-notebooks will contribute more than 5 percent of the portable PC shipments in Malaysia in 2009. The emergence of mini-notebooks will also enable mobile operators to penetrate a new market by addressing the needs of consumers who require wireless broadband "on the go" minus the weight of the notebook and the high price, said Chua.
"Due to the devices' limited processing power and storage, mini-notebook users will be heavily dependent on being connected to the Internet. Hence, usage of these devices will spur demand for Internet access," he added.
Lee Min Keong is a freelance IT writer based in Malaysia.

Ballmer and Bostock break bread

Well, it's official, Microsoft and Yahoo have come to an agreement.


On lunch.

As first noted by Valleywag, Microsoft CEO Steve Ballmer and Yahoo Chairman Roy Bostock had lunch together this week in New York. The New York Times has a lengthy piece up now as well, confirming the meeting, but offering no clue as to what happened beyond, presumably, caloric consumption.

In any case, it's the second high-level contact this week, given that incoming Yahoo CEO Carol Bartz told employees that she, too, had talked with Ballmer.

Microsoft has made it clear in every way imaginable that it would like to do a search deal, but it remains less clear whether Yahoo is willing to take the relationship beyond the "lunch date" stage.

Five reasons to kill IT projects

By Michael Krigsman

A recent study highlighted the top five reasons IT organizations gave for terminating projects prior to completion. Here's some insight on each.

A survey of IT experts revealed 43 percent of their organizations had recently killed an IT project.
The study, conducted by ISACA, an independent IT governance group, highlighted the top five reasons these organizations named for terminating projects prior to completion.

Here's the list, with my commentary on each issue:

1. Business needs changed: 30 percent
There are many conditions and situations where a business legitimately changes its requirements after starting a project. If the project no longer provides meaningful value, then it's best to stop throwing good money after bad.
On the other hand, some organizations deliberately obscure a flawed project requirements process by claiming business needs evolved. Obviously, that's unhealthy and a true sign of failure.

2. Did not deliver as promised: 23 percent
This is a typical expectation-setting problem: promise anything to get funding and worry about the consequences later. Shortsighted managers don't realize that funding is less important than delivering substantive value. Failure is inevitable when managers don't clearly identify and deliver business value.
In some cases, the project really did provide value, which the organization did not recognize due to communication problems. I have blogged about one CIO seeking a publicist, presumably to address this issue:
Many organizations take a CIO for granted when his IT department consistently delivers the goods without fanfare and attention; sadly, this human failing is all too common. In that case, PR might be a great idea, especially if the CIO isn't a great communicator. Of course, the CIO should improve his communication skills, but that's another story.

3. Project was no longer a priority: 14 percent
If the organization shifted direction without good reason, thus making the project superfluous, then flawed strategic planning was the culprit. However, if business requirements changed for a good reason, as suggested in point one, there's not necessarily a problem.
In general, and this is an obvious point, canceling projects without a darn good reason is a definite sign of failure.

4. Project exceeded the budget: 13 percent
On the surface, over-budget projects are the basic metric for failure. I'm actually surprised this number isn't higher, because unanticipated cost is always such a clear red flag.
At the same time, some projects run over-budget due to intelligent scope increases that provide additional value. For example, while automating two departments, the project team realizes it can add a third department for only marginal increases in cost. In such cases, going forward is probably the right decision despite the higher spend.
Although it is tempting to use budget performance as a simple metric of success or failure, that approach can be overly simplistic and ignore important nuances related to business value. Nonetheless, anytime a project goes over budget, the team must offer a detailed explanation.

5. Project did not support the business strategy: 7 percent
This classic indicator of failure often suggests a project rooted in poor requirements analysis. However, as with previous points, it's also possible changing business needs made the original project goals obsolete.
Note: The survey is most interesting for highlighting significant issues related to project failure. However, some of the questions are too ambiguous to support straightforward conclusions. In general, understanding whether a project is successful requires examining the business environment and context.

Michael Krigsman is CEO of Asuret, a software and consulting company dedicated to reducing software implementation failures. He also serves as CEO of Cambridge Publications, which specializes in developing tools and processes for software implementations and related business practice automation projects. Michael contributes to the IT Project Failures blog at ZDNet Asia's sister site, ZDNet.

Why are IT systems so unreliable?

By Tony Lock

perspective A recent report by Freeform Dynamics shows that IT systems fail. What's more, its data show they fail far more frequently than one might expect in today's 'high availability' environments.

Such lack of resilience might be surprising if one just went by the content of press releases from many of the leading IT vendors, especially those in the virtualization markets.

So what's going wrong and who needs to step up and take responsibility? Are IT pros and business analysts getting something wrong or are the IT vendors selling lots of pups?


As the report shows, all components essential for application delivery are prone to failure. Most frequently cited are software failures, followed in second place by network failures or performance degradation, with the failure of physical components trailing in third place.

Despite much public beating of chests, power outages and brownouts have yet to cause application failure as often as any of the other three areas addressed. So the inference is that software, hardware and network failures account for the vast majority of systems interruptions.

One thing that is clear is that service disruptions also occur as a consequence of human interventions triggering an interruption to application availability. The research highlights that while hardware, software and networking considerations are important in ensuring service availability, it is essential that operational management processes and practices are also suited to the quality of service desired for each application.
It must be said frankly that if the people and process side of system availability is not addressed, the chances are that systems will fall over, probably time after time.

This all brings up the question: in which area should an organization seek to add additional resilience to its application delivery?

Is it a question of spicing up the software side of things, acquiring better hardware or getting more resilient networking in place?

Naturally enough, in an ideal world all three factors would receive adequate attention long before an application is due to enter live service, allowing its requirements for performance, availability and recovery/protection to be fully met.

Alas, as we know all too well, this is the real world, where such attention to service availability is often not considered early enough in projects.

If we consider the area of employing software-based solutions to enhance application availability, a lot of attention, especially from the vendor and channel community, is focused on applying a number of virtualization technologies to the problem. It is fair to say the many flavors of virtualization currently on offer, especially in the x86 server space, are being promoted as an answer to delivering greater service availability.

A closer look at such offerings, essentially any of the operating system/hypervisor virtualization technologies or one of the various application virtualization solutions available, highlights that the simple solutions mostly deliver faster recovery after failure rather than preventing failure in the first place.

A good first step, perhaps, but it is only now that the effective management of virtualized solutions is beginning to offer the high levels of proactive availability that established platforms such as the mainframe manage with ease. It is also worth noting that these virtualization solutions should not overshadow the need to ensure that the application itself is written in a way that enhances availability and can interoperate with new virtualization solutions to raise service availability.

As has been mentioned, over the course of the last few years software systems, especially virtualization offerings, have rather stolen the limelight when it comes to adding resilience to applications. However, it remains a fact that hardware platforms designed with application availability in mind can also deliver great value.

These solutions are frequently overlooked, as there is today a tremendous volume of marketing claiming that industry-standard systems are good enough. For many applications this is true, but if you need to go that extra step along the road of application resilience, fault-tolerant or fault-resistant server and storage hardware deserves consideration, especially if keeping applications running, rather than simply being able to restart them quickly after a failure, is the issue.
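
To see why faster recovery is not the same as preventing failure, consider a back-of-the-envelope availability calculation. The sketch below uses the standard steady-state formula, availability = MTBF / (MTBF + MTTR); the MTBF and MTTR figures are purely illustrative assumptions, not data from the Freeform Dynamics report:

    # Rough comparison of annual downtime under different recovery strategies.
    # availability = MTBF / (MTBF + MTTR); all figures are illustrative.
    HOURS_PER_YEAR = 8760

    def downtime_hours_per_year(mtbf_hours, mttr_hours):
        availability = mtbf_hours / (mtbf_hours + mttr_hours)
        return (1 - availability) * HOURS_PER_YEAR

    scenarios = {
        "Manual recovery (fails monthly, 4 h to restore)": (730, 4.0),
        "Virtualized fast restart (fails monthly, 10 min to restore)": (730, 10 / 60),
        "Fault-tolerant hardware (fails yearly, 10 min to restore)": (8760, 10 / 60),
    }

    for label, (mtbf, mttr) in scenarios.items():
        print(f"{label}: ~{downtime_hours_per_year(mtbf, mttr):.1f} h downtime/year")
    # Prints roughly 47.7, 2.0 and 0.2 hours respectively.

Faster restart cuts downtime by an order of magnitude, but only making failures rarer in the first place, which is what fault-tolerant designs aim for, pushes availability toward mainframe-class levels.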

A serious question needs to be asked of the vendors: do they really understand what is important for organizations when it comes to application availability? Or do many of them actually believe that 'virtualization' is the marketing storm that solves all problems at once?

At the moment it appears the latter is so. And beware--it appears cloud computing may soon be proffered as the next answer to application availability, life, the universe and everything.

Tony Lock is program director at Freeform Dynamics. He contributed this article to ZDNet Asia's sister site, Silicon.com.

Study: 68 percent of IT projects fail

By Michael Krigsman

A new report notes that success in 68 percent of technology projects is "improbable". Are well-defined requirements the key to successful projects?

According to new research, success in 68 percent of technology projects is "improbable". Poor requirements analysis causes many of these failures, meaning projects are doomed right from the start.

These are staggering numbers, hitting the high end of the Standish Chaos Report and presenting a far worse picture than Sauer, Gemino, and Reich.

Key findings from the report, The Impact of Business Requirements on the Success of Technology Projects from IAG Consulting, include (emphasis added):

  1. Companies with poor business analysis capability will have three times as many project failures as successes.
  2. Sixty-eight percent of companies are more likely to have a marginal project or outright failure than a success due to the way they approach business analysis. In fact, 50 percent of this group's projects were "runaways", meeting any two of the following criteria: taking over 180 percent of target time to deliver; consuming in excess of 160 percent of estimated budget; or delivering under 70 percent of the target required functionality (see the sketch after this list).
  3. Companies pay a premium of as much as 60 percent on time and budget when they use poor requirements practices on their projects.
  4. Over 41 percent of the IT development budget for software, staff and external professional services will be consumed by poor requirements at the average company using average analysts versus the optimal organization.
  5. The vast majority of projects surveyed did not utilize sufficient business analysis skill to consistently bring projects in on time and budget. The level of competency required is higher than that employed within projects for 70 percent of the companies surveyed.
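
To make the "runaway" definition in point two concrete, here is a small sketch applying the any-two-of-three test; the thresholds are those quoted above, while the function and parameter names are my own:

    # Classify a project as a "runaway" per the IAG criteria quoted above:
    # a runaway meets any two of the three thresholds.
    def is_runaway(time_pct, budget_pct, functionality_pct):
        # time_pct and budget_pct: actuals as a percentage of target;
        # functionality_pct: delivered as a percentage of what was required.
        criteria_met = [
            time_pct > 180,          # over 180% of target delivery time
            budget_pct > 160,        # over 160% of estimated budget
            functionality_pct < 70,  # under 70% of required functionality
        ]
        return sum(criteria_met) >= 2

    print(is_runaway(190, 150, 65))  # True: schedule and functionality criteria met
    print(is_runaway(120, 110, 95))  # False: no criteria met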
The report also charts the requirements skills gap most companies face.

The impact of this skills gap is substantial, directly increasing project time, cost, and risk of failure; the report quantifies this as a "skills gap premium".

My take. This research seems credible and insightful, intuitively corresponding to observations one sees in the field. I should mention the study talks about "companies", rather than projects, and it's unclear whether that distinction has numerical significance. Either way, the number is both high and disturbing.
It's important to quantify issues such as requirements failure, because many organizations over-estimate their capabilities in this area. As the study makes clear, few organizations perform these activities well. Let me be clearer: your organization probably does a lousy job setting up projects, which is why they fail.
The solution lies in recognizing that requirements definition is critical. Learn to make assumptions explicit; for example, if the business requests a specific requirement, do the following (a brief sketch follows the list):
  1. Write it down
  2. Expand the requirement into a set of features
  3. Share the planned features with the business to get their feedback
  4. Lather, rinse, repeat until the technical team and the business are on the same page.
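
To illustrate the loop, here is a minimal sketch of how a team might track a requirement through these four steps; it is entirely illustrative, and the data model and names are my own, not from the study:

    # A minimal model of the requirements loop described above: record the
    # requirement, expand it into features, gather business feedback, iterate.
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        description: str                              # step 1: write it down
        features: list = field(default_factory=list)  # step 2: expand into features
        approved: bool = False

    def review_cycle(req, get_feedback):
        # Steps 3 and 4: share the features with the business and iterate
        # until both sides agree the feature set matches the requirement.
        while not req.approved:
            feedback = get_feedback(req.features)     # business reviews the list
            req.approved = feedback.get("approved", False)
            req.features.extend(feedback.get("missing_features", []))
        return req

    # e.g. review_cycle(Requirement("export data"), lambda f: {"approved": True})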
I asked Helge Scheil, CA's senior vice president and general manager of the company's governance group, for comment:
Solid requirements planning establishes a clear connection between the business case, project goals, and the project outcome.
Yes, it may seem obvious, but still many projects fail. Follow this perhaps-not-so-obvious advice and more of your projects will succeed than fail.

Michael Krigsman is CEO of Asuret, a software and consulting company dedicated to reducing software implementation failures. He is also CEO of Cambridge Publications, which specializes in developing tools and processes for software implementations and related business practice automation projects. This article was first published as a blog post on ZDNet.com.