Verizon: Who needs the iPhone?

Can Verizon Wireless keep its spot as the leading wireless company in the U.S. if it doesn't have the industry's hottest phone?
Lowell McAdam, the company's chief executive, is trying to make the case that it can. Two years ago, Verizon Wireless passed on the chance to become the exclusive U.S. distributor of the Apple (AAPL) iPhone and pushed Apple into the arms of rival AT&T. Since then the iPhone has become a megahit, helping AT&T close the gap with Verizon. In the most recent quarter, AT&T added 2 million wireless subscribers, bringing its total to 81.6 million, while Verizon Wireless added 1.2 million, for a total of 89 million.
Now, McAdam is launching a slew of products designed to keep Verizon ahead. In the fourth quarter the company is rolling out its largest new-product lineup ever: 14 devices, vs. half that number a year ago. Among those will be two netbooks and five smartphones, including the Droid phone from Motorola (MOT), a sleek device with a touchscreen and keyboard that runs on Google's (GOOG) Android operating system. The new products are backed by an unusually aggressive marketing campaign. In one TV spot, Verizon takes direct aim at Apple with a series of "iDon't" quips that explain all the things an iPhone can't do. "The Droid can compete head to head" with the iPhone, says John Stratton, chief marketing officer of Verizon Wireless.
Too Many New Offerings?
Verizon's strategy is bold but risky. With the Droid and another phone from HTC, Verizon is placing a big bet on the unproven Android. The software is popular with techies and has attracted enough support from developers that 10,000 apps are available for download to Android phones, but it hasn't yet caught on with consumers. In addition, Verizon risks confusing customers with the sheer number of devices it's introducing. "Greater choice is not a guarantee of a greater quarter," says Richard Doherty, research director at Envisioneering Group.
McAdam and Stratton are firing up Verizon's marketing machinery to win over consumers. The Droid will be backed by Verizon's biggest marketing campaign ever for a single device, and total marketing spending will increase 5 percent to 10 percent in the current quarter from last year. Stratton says the anti-Apple vibe will be toned down in favor of ads that underscore the features of its phones and the reliability of its network compared with AT&T's. Verizon has said it is still interested in selling the iPhone if Apple is amenable.
Verizon is experimenting with new marketing approaches. For several weeks in November, the company has rented time on two huge electronic billboards in New York's Times Square that will show real-time results of searches people make on their Android phones. Verizon is also planning to transform Droid, the moniker for the Motorola phone, into a brand name for a whole lineup of Android devices.
Google Could Be the Key
Verizon's success may ultimately depend on how the partnership with Google works out. As cell phones become more sophisticated computing devices, wireless companies need Silicon Valley firepower to compete. Google has helped boost the number of wireless applications available on Android phones, but analysts say it has to step up its marketing of Android to gain ground on Apple. "When the iPhone was associated with Apple iTunes, that really meant a lot to consumers," says Ken Dulaney, an analyst at researcher Gartner (IT). "You want to know that the [Android] app store is being run by Google."
READ MORE - Verizon: Who needs the iPhone?

Intel seeks new 'microserver' standard

In September, Intel introduced its back-to-the-future idea of tiny "microservers." Now the company wants to make the design into a standard others can use, too.
The chipmaker will offer its design specification to the Server System Infrastructure Forum by the end of the year, said Jason Waxman, general manager of Intel's high-density computing group. If the group's board votes its approval for the specification, group members may use the designs royalty-free, he said in a meeting with reporters here.
"Before the end of the year, it will happen," Waxman said.
The computer industry is in constant tension between proprietary designs and standards that anyone may use. The former can mean tidy profits for companies, as long as the technology is widely adopted, but the latter can spur broader adoption. Intel's primary business, selling processors, benefits more from the latter when it comes to cultivating a new server market segment.
Who's it for?
Waxman believes the servers will appeal to Web site hosting companies that need a lot of servers for relatively low-traffic Web sites.
"At most Web site hosting providers, do you know what the server does? Nothing. It just sits there," Waxman said, so a low power draw when idle is an important characteristic. But when a request to view a Web page does arrive, the server must respond quickly.
This sales pitch echoes the one made for first-generation blade servers early this decade, machines that flopped commercially and were replaced by much more powerful, sophisticated, and expensive models. So what's different this time? Though the Intel microservers are simple, they deliver reasonably good performance, Waxman said.
"For the low-end, scaled-out Web hosting space, we think we can put enough power in a low enough power envelope," Waxman said.
What's inside?
The diminutive server consists of a single quad-core processor and four memory banks. Intel showed 16 microservers housed in an 8.75-inch-tall chassis that supplies them all with power, cooling, and a network connection to the outside world. Along the bottom of the chassis is a bay with 16 "sleds", each holding a trio of 2.5-inch hard drives that connect directly to one of the microservers.
The present microserver uses a 1.86GHz quad-core processor, the "Lynnfield" model of Intel's new "Nehalem" generation. Its top power consumption is 45 watts, but early in 2010, Intel will release a dual-core "Clarkdale" model that consumes only 30 watts when running flat-out.
That's at the top end, though. Intel's goal is for the entire microserver--which also includes memory and supporting chips--to idle at just 25 watts of power.
READ MORE - Intel seeks new 'microserver' standard

How to develop a better estimation matrix

An expert discusses effort estimation and shares his efforts to develop a better estimation matrix based on actual project data.

During lessons learned sessions, a common practice is to highlight what went well and identify areas for improvement on a project.
Agile enthusiasts acknowledge project teams should conduct retrospectives (a.k.a. lessons learned sessions) after every release and iteration. In waterfall projects, this activity usually occurs at the end of the project.
Regardless of your preferred methodology, I recommend comparing actual duration and actual effort to the baselined effort as part of a lessons learned session.
Effort estimation
Project teams often spend time at the end of a project documenting lessons learned, although few measure their actual performance and record it for future estimation.
During effort estimation, project teams conduct either a bottom-up or a top-down estimate. Bottom-up estimates require a significant investment in time to define scope and build an accurate estimate.
Top-down approaches use analogous estimates and rely on expert opinion to estimate duration at a high level. If you start collecting actual performance data and compare actual results against baseline estimates, you can achieve an analogous estimation tool that is based on bottom-up data from past projects.
Estimation matrix
In my system implementations, a common practice was to develop an estimation matrix that categorized reports, interfaces, conversions, enhancements, and forms (or screens) and assigned each a complexity level of low, medium, or high.
Low estimates were assigned 8 hours of effort; medium estimates were assigned 16 to 24 hours; and high estimates were assigned 32 to 80 hours. The ranges were tweaked based on the information available. The key benefit was that the matrix provided a starting point for estimation based on basic information.
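As a rough illustration, here is a minimal Python sketch of such a matrix. The deliverable categories and hour ranges follow the description above, but the data structure, function name, and lookup style are hypothetical, not the author's actual template.

```python
# Minimal sketch of the matrix described above. The categories and hour
# ranges mirror the article; the structure and function name are hypothetical.
DEFAULT_RANGES = {"low": (8, 8), "medium": (16, 24), "high": (32, 80)}  # hours

ESTIMATION_MATRIX = {
    deliverable: dict(DEFAULT_RANGES)
    for deliverable in ("report", "interface", "conversion", "enhancement", "form")
}

def estimate_hours(deliverable_type, complexity):
    """Return the (low, high) effort range in hours for a deliverable."""
    return ESTIMATION_MATRIX[deliverable_type][complexity]

print(estimate_hours("interface", "medium"))  # (16, 24)
```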
Over the past few years, I started collecting metrics against the Microsoft Project schedule so I could develop a better estimation matrix based on actual project data.
Figure A depicts a custom table that I used to track and categorize the actual project data for future comparison. In this table, I added custom fields to categorize the deliverables, assign a complexity rating, and flag which items to include in my estimation analysis.
Figure A

MyEstimation table.
Figure B includes a close-up view of the interface analysis.
Figure B

Interface actual and estimate comparison.
In this section, I can quickly see where I underestimated the code phases of specific interfaces. Interface 1 had a baseline duration of 2 days, and it took 6 days to complete the work. Since the data is recorded in days or hours, I can refine my estimates using both units.
By examining the actual task data, I can start building better estimation metrics. It helps to understand the root cause of why a medium interface moved from 2 days to 6 days. Based on the root cause, I may adjust the matrix to accommodate better estimation.
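As a hedged sketch of that baseline-versus-actual comparison, the snippet below flags deliverables whose actuals blew past the baseline. Only the Interface 1 figures (2 days baselined, 6 days actual) come from this article; the second record and the 25 percent threshold are made-up placeholders.

```python
# Hedged sketch of the baseline-versus-actual comparison. Only Interface 1's
# figures come from the article; the second record and the 25 percent
# threshold are placeholders.
records = [
    # (deliverable, complexity, baseline_days, actual_days)
    ("Interface 1", "medium", 2, 6),   # from the article
    ("Interface 2", "medium", 3, 3),   # hypothetical
]

for name, complexity, baseline, actual in records:
    overrun_pct = (actual - baseline) / baseline * 100
    if overrun_pct > 25:  # arbitrary review threshold
        print(f"{name} ({complexity}): {baseline}d baselined, {actual}d actual "
              f"(+{overrun_pct:.0f}%) -- investigate root cause, adjust matrix")
```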
After exporting the data to Excel, I can update my matrix, shown in Figure C.
Figure C

Estimation matrix
Summary
For a lessons learned session, I am not suggesting you track every task, although I do recommend tracking against the major activities required to produce specific IT deliverables.
For your next project, you'll need to conduct similar effort estimation activities. By comparing estimates against actuals for specific deliverables, you continue to build a better estimation matrix.
In my next column, I'll show how to track and export this data using a variety of views and maps in Microsoft Project.
READ MORE - How to develop a better estimation matrix

Philippine government proposes new IPR laws

With the trade in counterfeit goods growing across all industries, especially consumer electronics, the government has proposed a new set of rules for the speedy and efficient resolution of intellectual property rights (IPR) cases.
Director-general Adrian Cristobal Jr. of the IP Office of the Philippines (IP Philippines), the government agency overseeing the country's IPR, presented the draft rules to Supreme Court chief justice Reynato Puno at the IPR Week 2009 conference here Monday.
In his speech, Puno said the high tribunal would "review and act" on the proposal in the soonest possible time, adding that the rules may be "piloted" in one or a few courts to monitor and evaluate their effectiveness.
The chief magistrate said he had asked IP Philippines last year to study the existing rules and assess how these can be modified and improved to hasten the process of IPR-related litigation. "I understand that many contributed to this [proposal]: lawyers engaged in IPR practice, customs and enforcement officers, prosecutors, IP owners, and of course, judges. That you were all able to agree on something is a remarkable achievement in itself," Puno said.
As of July 2009, the number of IP cases in the country stood at more than 500, most of them in Metro Manila, he said.
The resolution of court cases in the Philippines is notoriously slow, particularly in the field of IP, where most lawyers and judges are still grappling with rapid changes in legal and technology trends. This has led to piles of unresolved lawsuits and an upsurge in IP violations, notably in the IT sector, where the piracy rate in the Philippines has stayed at 69 percent over the last two years, according to figures from IDC and the Business Software Alliance.
David Blakemore, executive director of non-profit group IPR Business Partnership Asia-Pacific, said the country's piracy problem has taken a new face lately with low-end brands being blatantly copied as well.
"It used to be high-end labels were the only ones being imitated. Now, piracy is becoming prevalent across different industries, from zippers to motorcycles to medicines," Blakemore said in an interview, on the sidelines of the conference.
Cristobal said the proposed rules were intended to improve litigation proceedings, but without compromising the country's obligations under the Trade-Related Aspects of IPR (TRIPS) multilateral agreement, outlined by the World Trade Organization.
"[The proposed laws] are not complicated or costly, and do not entail unreasonable time limits," he explained. "In practical terms, these would translate into efficient litigation and consistent decision-making by the courts."
According to Cristobal, the special rules carry five major features:
  • Designation of selected commercial courts in Metro Manila as special IP courts to try violations of IPR nationwide;
  • Streamlining the litigation process for efficiency and expediency, which include prohibiting a number of pleadings, submission of affidavits in question and answer form, and shortening waiting periods;
  • Destruction of counterfeit goods and making judgments immediately executory;
  • Technical advice and assistance for IP courts; and
  • As an alternative to adjudication, parties are allowed to go into mediation, while judges are authorized to conduct judicial dispute resolution.
READ MORE - Philippine government proposes new IPR laws

Can Windows 7 'reset' Microsoft shares higher?

A well-received debut for its newest operating system and fresh signs that businesses again want to spend on computers gave Microsoft investors reason to hope a more than yearlong sales decline is coming to an end.
But it may take big spending by consumers to pop Microsoft stock out of its tight trading range.
Shares of Microsoft gained 1.43, or 5.4 percent, on Oct. 23 to close at 28.02. Before the market opened, the company reported fiscal first-quarter sales and profit that bested analysts' forecasts. Revenue fell 14 percent and profit dropped 18 percent, but Wall Street analysts had expected worse.
"The fourth quarter of fiscal 2009 may well have been the bottom on the economic reset," Microsoft Chief Financial Officer Chris Liddell said during a conference call with analysts. He was harking back to remarks by other Microsoft officials, who have said that old demand patterns have been "reset" lower amid recession and a credit crunch.
As heartening as recent gains may be, Microsoft shares remain in the mid-20s range, where they've been mired for a couple of years. An investor who bought Microsoft shares in late October 2000 and held them would have realized a net gain of about 1 percent. The shares reached about 37 in November 2007, but then declined to about 27 by February 2008. They've barely budged since. Despite Microsoft's size and influence--the company reported US$58.4 billion in sales last year--its stock has lagged the tech industry's highfliers.
Apple shares closed at an all-time high of 205.20 on Oct. 22 after the company reported a blowout 47 percent increase in quarterly profits on Oct. 19. Amazon.com shares reached a record high on Oct. 23, closing at 118.49, after it reported a 68 percent jump in third-quarter earnings a day earlier. And Google is trading at a lofty 554, although the stock is well off its high of 715 in late 2007.
PC sales stalled in first quarter
Getting consumers to embrace Windows 7 and snap up more capable computers than the low-end mini-laptops that have accounted for the PC industry's meager growth will be key to Microsoft's earnings and stock performance, analysts said.
The difficulty of forecasting consumers' fickle buying behavior is one reason Microsoft isn't releasing a specific sales or earnings target for the current quarter. "The financial impact of [Windows 7] will be driven by the consumer market, and that's very hard to predict," said Yun Kim, an analyst at Broadpoint AmTech who has a buy rating on Microsoft shares.
During the fiscal first quarter, sales of consumer PCs rose at a single-digit pace while those of PCs for businesses dropped at a double-digit rate, Liddell said.
Business demand will need to reverse course and consumer buying will have to accelerate. A "wildly successful" year of Windows 7 sales could propel the stock to 33 and put the company's price-to-earnings ratio on par with the rest of the tech industry, Kim said. That will likely only happen if Microsoft can increase sales and earnings by more than 10 percent a quarter. "If they can grow by double digits, there's a lot of room for this stock to go higher," Kim said. "There aren't many companies doing that anymore."
For the moment, analysts seem to think Microsoft can pull it off. After several quarters of decline, Microsoft is poised to start growing again. Before the first-quarter report, analysts expected Microsoft to generate US$17.12 billion in sales for the second quarter--an increase of 14 percent--and earn 52 cents a share. The surprisingly strong first quarter will likely prompt Wall Street analysts to raise estimates even higher.
Business users ignored Vista, kept XP
One potential roadblock for Microsoft is Apple, which has been grabbing market share in the U.S., and now accounts for 8.6 percent of PC sales there, according to market researcher IDC. Apple's computer shipments grew by 12 percent in the third quarter while the overall PC market grew at an anemic 2.5 percent. Worldwide PC sales increased 2.3 percent, IDC said.
What's more, Microsoft will need to overcome weakness in sales to businesses, which largely bypassed the Windows Vista operating system released in 2007, perceiving it as slow and lacking in compelling features. Instead, companies stood pat with older machines. Consumers, meanwhile, have been gravitating to low-priced netbooks that run a discounted version of an earlier operating system, Windows XP.
Windows 7 will likely help Microsoft on both fronts. The software has received positive reviews. Windows 7 helps PC users more easily manage files and it starts and shuts computers faster. It eliminates many of the annoyances that bothered Vista users.
Windows 7 also gives Microsoft a better-performing--and higher-priced--operating system to sell on netbooks. "The end result may be that Microsoft sells more versions of Windows 7 at lower price points, but if Microsoft did not meet this emerging demand," rivals would, said Allan Krans, an analyst at consultant Technology Business Research, in an Oct. 23 report.
Microsoft has more than Windows 7 to rely on. Cost-cutting could boost shares by increasing operating margins. Microsoft's profit margins have been declining over the past seven years as the company entered expensive new businesses such as Internet search and video games. The operating margin in fiscal 2009 was 34.8 percent, down from 46.3 percent in fiscal 2001.
"One of the biggest hangups investors have had about Microsoft the past few years is their spending on the online side of the business," said Jeff Gaggin, a vice-president at Avian Securities who rates Microsoft a buy.
An eight-cent surprise, aided by Xbox
To rein in expenses, Microsoft is cutting its staff and reducing marketing and other costs. Operating expenses fell 7 percent during the quarter ended Sept. 30.
Sales declined to US$12.92 billion, and profit fell to US$3.57 billion, or 40 cents a share, Microsoft said. Wall Street analysts expected Microsoft to earn 32 cents per share on revenue of US$12.32 billion. A year ago, Microsoft earned US$4.37 billion, or 48 cents per share, on sales of US$15.06 billion. Surprisingly strong sales of Microsoft's Xbox video game console and royalties from games sold for the machine helped boost the results.
First-quarter sales were lower than they would have been had Microsoft not offered coupons for a free upgrade to Windows 7 to consumers who bought Vista in the months leading up to the Oct. 22 launch of the new software. The program resulted in US$1.47 billion of revenues, which Microsoft deferred. It will recognize most of those sales in the current quarter.
Liddell didn't give investors guidance for Microsoft's second quarter ending in December, but said businesses were ready to buy new PCs loaded with copies of Windows and Office. Microsoft will continue to hold the line on expenses for at least the next year, he added. "The fiscal discipline that you have seen will continue," he said.
READ MORE - Can Windows 7 'reset' Microsoft shares higher?

Vendors split over reuse of mobile gear

Mobile phone components can be reused in consumer and medical devices, Sony Ericsson says, but it remains to be seen if other mobile makers are keen to step in this direction.
Sony Ericsson's head of corporate sustainability, Mats Pellback-Scharp, said earlier this month in an interview with BusinessGreen.com that a growing number of companies are realizing it is more cost-effective to buy color displays, cameras and touchscreen technologies from old handsets than to build these components from scratch.
Mobile recycling volumes are "now at a scale where it is perfectly feasible for companies to take the old components and reuse them", Pellback-Scharp said, adding that there are already devices in the market today that contain Sony Ericsson's old mobile components.
In an e-mail interview with ZDNet Asia, Hirokazu Ishizuka, Sony Ericsson's corporate vice president and Asia-Pacific head, said the company aims to collect 1 million used mobile phones annually under its global "takeback" program from 2011. As part of its environmental sustainability efforts, Sony Ericsson is targeting to reduce its carbon emissions by 20 percent by 2015.
Specifically in the reuse of mobile components, the phone maker has been involved in developing guidance documents that cover various aspects such as design, collection, refurbishment, recycling and trans-border movement, he added.
"No" for Nokia
At least one handset manufacturer, however, has expressed concerns over the resale of components to a third party. Finnish mobile giant Nokia told ZDNet Asia it does not sell mobile components to third parties for use in other devices "because of safety and ethical issues".
But, the company is focused on component and material recycling, said Francis Cheong, the company's environmental affairs manager for Southeast Asia and the Pacific. For instance, plastic can be recycled and reprocessed into items such as safety cones and plastic pallets, he said in an e-mail.
Metals derived from used mobile devices can also serve a variety of purposes, he noted. Stainless steel, which is used for phone covers, external detailing and internal components, can be adapted for kitchen ware and bicycle frames. Copper, used in circuitry, can be tapped for piping in homes or musical instruments such as saxophones. Gold and platinum extracts can be turned into dental fillings or jewelry.
According to Cheong, a global study by Nokia last year found only 3 percent of mobile users recycle their old or unused phones. He said the mobile vendor observed that people are actually keen to recycle their mobile phones "if they are aware of easy and convenient ways to do so".
A 2008 estimate from the U.S. Environmental Protection Agency stated that 150 million handsets were retired on an annual basis but less than 20 percent were recycled.
READ MORE - Vendors split over reuse of mobile gear

Enterprise IT's trust level of Google will increase

ZDNet Asia's sister site TechRepublic's CIO Jury finds that IT leaders trust Microsoft more than Google as a technology partner, but tune in next year.
So why do I expect Google's standing to look better next year? Last week, I was at the Gartner IT Symposium and you couldn't escape chatter about Google and what it may be able to do for the enterprise. And here's what caught my attention: the audience--not the analysts--was yapping about Google in the enterprise.
Rest assured, skepticism about Google abounds. That's natural--and actually quite healthy. But this is the second consecutive year that Google has been on the enterprise IT radar. In 2008, tech execs were more curious about Google than anything. Anyone arguing that Google was about to take over the enterprise in 2008 was smoking hype. This year there was more curiosity, but also a lot of interest in returns and logistics. Simply put, enterprises are more than curious--they're ready to move.
Now Google isn't going to run any large enterprise whole hog, but I'm confident that the search giant will get its share--especially as e-mail moves to the cloud. It's increasingly hard to justify running your own e-mail, and at US$50 a user a year, Google's price is a bit hard to beat.
Of course, IBM sees the same thing and has its own enterprise e-mail-in-the-cloud service for less than Google will charge (assuming you don't need a lot of storage).
Read more about "Enterprise IT's trust level of Google will increase" at ZDNet.


READ MORE - Enterprise IT's trust level of Google will increase

Ubuntu's new Linux tries getting cloud-friendly

With all the hubbub about Snow Leopard and Windows 7, there's another operating system out there you may not have noticed that's getting a significant update: Ubuntu Linux.
Ubuntu backer Canonical plans to release its "Karmic Koala" version on Thursday, and both the desktop and server versions of the open-source operating system take significant steps toward cloud computing. The concept of moving work away from the computer in front of you and into the network does have some merit, but cloud computing is today's fashionable buzzword, and Canonical Chief Executive Mark Shuttleworth is sensitive to its overuse.
"What frustrates me is the term 'cloud' has come to mean anything with an Internet connection, including some stuff that really looks familiar like internal IT," said Shuttleworth in an interview. It's fair to say that in Ubuntu's case, though, it's not a stretch.
Built into the server version of Ubuntu 9.10 is Ubuntu Enterprise Cloud, technology built atop the Eucalyptus software package. Amazon Web Services (AWS), a collection of computing infrastructure accessible over the Net on a pay-as-you-go basis, is among today's most significant cloud-computing efforts, and Eucalyptus implements many of its functions so companies can build their own "private clouds" using the same services.
And in the desktop version of Ubuntu, the cloud connection is a service called Ubuntu One, which lets Ubuntu users synchronize files stored on different machines and back them up on the central service. Storage space of 2GB is free, and 50GB costs US$10 per month.
The Ubuntu software itself is free; Canonical sells Ubuntu support services.
Ubuntu remains popular among the technically savvy Linux crowd, but it faces challenges. On the server, where Linux is common and there's money to be made, Red Hat is still dominant. On desktops and laptops, Linux has yet to take on Windows or Mac OS X among mainstream computer users.
And to this day, Canonical, the company Shuttleworth founded to back Ubuntu, remains unprofitable. Shuttleworth demurs when asked when the company he's funded will go into the black.
As ever, he's optimistic that the business will bear fruit. Revenue is growing, he said.
"It takes a long time to build traction in the enterprise market, but I now see that traction," Shuttleworth said. "Our growth is something to be proud of."
Linux on the desktop?
Linux has existed for more than a decade as an alternative to Windows for people's PCs, but so far it hasn't spread far beyond programmers and other technically advanced users. It's been held back in part by the difficulties of learning and installing new operating systems and by the lack of software such as Office, Quicken, Photoshop, and games.
Overall, Shuttleworth remains optimistic. One key to the growth among consumers will be adoption through computer manufacturers, an area where he believes the company has made progress. Dell offers Ubuntu machines. And last week, IBM announced a software package called IBM Client for Smart Work that combines Ubuntu with IBM's Lotus software suite.
So how might Linux take off as an operating system for PCs? Although Ubuntu continues to work on basics such as faster booting and better audio in version 9.10, Shuttleworth believes it'll be hard to succeed if the strategy is just to out-Microsoft Microsoft.
"I don't think it'll happen if we continue to define the desktop the way Microsoft defines the desktop," Shuttleworth said.
A better idea is to try to capture growth in new markets. "It's initiatives like Chrome OS and Moblin that hold the key, whether through the pendulum shifting irreversibly to the Web, or with new users and devices," he said. Chrome OS is Google's browser-based operating system that uses Linux under the hood; Moblin is Intel's Linux product for mobile devices built with cooperation from Canonical. "I don't think we will dislodge Microsoft from the traditional desktop."
With its Chrome browser and Chrome OS, Google seeks to push operating systems into the background; applications are Web-based rather than running on the operating system embedded beneath to handle things like communicating with a keyboard, trackpad, or screen. Shuttleworth sees Chrome OS as helpful, though, since other Linux projects could benefit from the support it will bring for technology such as wireless networking hardware.
And Shuttleworth clearly is a fanboy for some of Google's latest initiatives. "I'm fascinated by Chrome, Chrome OS, and (Google) Wave. I think they're awesome," he said.
Ubuntu One, the online storage service, could provide a bit more revenue for the company from the consumer operating-system business.
"Think of it as a drive in the sky that can replicate content across multiple machines," Shuttleworth said. "Most of our users are sophisticated users. They have more than one PC and generally battle with the tension of having some content they don't want to manage on the Web but do want to have on multiple machines."
And data sync is a service that could be larger than Linux. "We expect it will span all Ubuntu devices and ultimately perhaps grow to other platforms as well," he added.
Your own private AWS
The most proven Linux market is on the server, though, where Linux is in some ways just another branch of the Unix family tree. Unix and Linux are fixtures of the server market.
Here, Canonical hopes to get ahead through the Ubuntu Enterprise Cloud technology.
The AWS Elastic Compute Cloud (EC2) provides access to raw computing power on which customers can fire up their own software from the operating system up. Those servers can store data on AWS' Simple Storage Service (S3) and tap into other AWS services. It's all paid for on the basis of how much processing power is consumed, how much storage space is needed, how much network capacity is used, and similar consumption-based pricing.
Ubuntu Enterprise Cloud is for those who want similar services on their own servers. The software interface mirrors that of AWS' EC2 and S3, so at least in principle a service that exceeds an organization's internal computing capacity could spill over to Amazon's infrastructure.
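To show what that mirrored interface means in practice, here is a hedged Python sketch in which the same client code talks either to a private EC2-compatible endpoint (such as a Eucalyptus front end) or to Amazon itself, simply by swapping the endpoint URL. The boto3 library, the example URL, and the credentials are illustrative choices, not anything Canonical or the article specifies.

```python
# Illustrative only: the same client code can talk to a private, EC2-compatible
# endpoint (such as a Eucalyptus front end) or to Amazon EC2 by swapping the
# endpoint URL. boto3, the URL, and the credentials are placeholder choices.
import boto3

def list_instances(endpoint_url=None):
    """List instances from EC2 or from any cloud exposing the EC2 API."""
    ec2 = boto3.client(
        "ec2",
        endpoint_url=endpoint_url,  # None means the public AWS endpoint
        aws_access_key_id="YOUR_ACCESS_KEY",        # placeholder
        aws_secret_access_key="YOUR_SECRET_KEY",    # placeholder
        region_name="us-east-1",
    )
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"], instance["State"]["Name"])

# Private cloud (hypothetical Eucalyptus endpoint):
#   list_instances("http://cloud.example.internal:8773/services/Eucalyptus")
# Public AWS:
#   list_instances()
```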
"In principle the goal is to provide API (application programming interface) compatibility," he said. Eventually, when such services reach mass-market appeal, standards will follow for controlling them, he believes: "We think eventually there will be a common IETF (Internet Engineering Task Force) management protocol."
One tricky piece of engineering, though, comes through virtualization, software that lets multiple operating systems run in compartments called virtual machines on one physical computer. Amazon's EC2 uses open-source virtualization called Xen, but Ubuntu's preferred foundation is another, KVM. Ubuntu 9.10, though, will be available in a Xen-based version that works on EC2.
"It's possible to build one machine image which works in both places," Shuttleworth said. "We went to a lot of trouble to make a version for 9.10 that works on EC2."
READ MORE - Ubuntu's new Linux tries getting cloud-friendly

Asean CIOs list biz intelligence top priority

CIOs view leveraging analytics to gain a competitive advantage and improve business decision making as a top priority, according to an IBM study, which polled IT heads in Asia and across the globe.
Released Thursday, the study revealed that 87 percent of CIOs in Southeast Asian nations identified the ability to see patterns in vast amounts of data and extract insights, or business intelligence and analytics, as a crucial way to enhance their organizations' competitiveness. The survey polled 2,500 CIOs worldwide, 86 of whom were from Asean.
IBM CIO Pat Toole said in the report: "In this challenging economy, CIOs understand that analytics can be key to new growth markets, whether it's new ways to manage a utility grid or smarter healthcare systems.
"Managing and leveraging new intelligence through analytics is something that today's CIO is pursuing to gain competitive advantage in these new markets," he added.
Data reliability and security were also identified as increasingly urgent concerns, according to the report, with 76 percent of Asean CIOs planning to make additional investments in risk management and compliance.
Other priorities noted in the survey included virtualization projects, mobility solutions, and enhancing activities in customer and partner collaboration.
Asean CIOs better accepted
Compared with their global counterparts, Asean CIOs ranked more highly in roles such as "visionary" and "value creator" and were perceived less as IT managers.
In fact, the survey found that CIOs in this region were more often included as members of their company's senior management team than their global counterparts. As a result, they were more actively involved, especially with their fellow leaders, in all aspects of setting their company's business strategy.
According to the study, CIOs in Asean who were employed in higher-growth organizations typically spent more time as a "business vision enabler", while their peers in lower-growth organizations focused more on their work as core technical services providers.
READ MORE - Asean CIOs list biz intelligence top priority

Analyst: Wait before rushing into LTE

Operators should wait before embarking on a full-scale LTE (long term evolution) rollout, but still plan now in order to be a part of the eventual 4G game, analyst house Ovum has advised.
According to Jeremy Green, practice leader of Ovum's mobile division, there are "few first-mover advantages" for LTE network upgrades given the lack of a clear revenue stream to justify the high capital investment. As such, there was "no need to rush headlong into implementation," he pointed out.
This is in spite of impending commercial LTE launches, set to start next year, from the likes of NTT DoCoMo and Verizon.
The disappointment of 3G's debut some ten years ago, when the technology failed to unlock new revenue streams as promised, has cast a shadow over the business case for a quick 4G rollout now, Green said Friday in a statement.
Adding to that was the burden of a global economic downturn, which blunted investors' and lenders' enthusiasm for rapid rollout projects, he added.
Green noted: "The suggestion that higher data speeds will enable new sources of revenue for mobile operators are no more likely to be true now than they proved to be then."
On the other hand, selling a new service requires operators to upgrade networks to cater to new subscriptions, which may compel operators to embark on quick LTE rollouts, Green said. But with "most mobile networks almost certainly implementing LTE" eventually, most "can afford to wait until implementation and operational issues are resolved and the business case improves," he pointed out.
In the meantime, operators should plan their LTE strategy to avoid the eventual rush, he cautioned. For example, CDMA operators may have to decide between shutting down their networks in favor of HSPA as an interim step to LTE, or operate both LTE and CDMA networks together, he said.
Verizon decided back in late 2007 to switch over to LTE, abandoning the UMB (ultramobile broadband) standard backed by its partners. It currently has plans to fast-track LTE while maintaining its CDMA network.
"All [operators'] intermediate steps should factor in the eventual migration to LTE, so that investments and incremental improvements to the network are LTE-ready," said Green.
READ MORE - Analyst: Wait before rushing into LTE

Offline SharePoint tool promised for Mac users

Microsoft's Mac Business Unit is working on a tool that will allow Mac users to take SharePoint files offline for editing.
It will not be a Mac equivalent of Groove, the Office collaboration desktop client that will become SharePoint Workspace in the upcoming Office 2010, the SharePoint team told ZDNet Asia's sister site ZDNet UK. The promised tool also will not offer Groove's existing peer-to-peer features on the Mac.
But it will allow Mac users to take SharePoint documents offline, work on them and then synchronise them back to the server in a similar way to the new SharePoint Workspace, according to Paul Cannon, the product manager for SharePoint Workspace.
"It's not the same product, but you will be able to find an offline client for the Mac as well," Cannon said in an interview at the SharePoint 2009 conference last week.
SharePoint Workspace relies on both the Office Document Cache and a protocol in Windows Communication Foundation called File Synchronization via SOAP over HTTP. The protocol, which replaces the WebDAV protocol used in previous versions of SharePoint, enables the co-authoring promised in Word 2010 and PowerPoint 2010, as well as in the new service-oriented architecture in SharePoint 2010.
Read more about "Offline SharePoint tool promised for Mac users" at ZDNet UK.
READ MORE - Offline SharePoint tool promised for Mac users

White House moves to open-source programming for its Web site, takes code from public

White House opens Web site programming to public

WASHINGTON — A programming overhaul of the White House's Web site has set the tech world abuzz. For low-techies, it's a snooze — you won't notice a thing.

The online-savvy administration on Saturday switched to open-source code for www.whitehouse.gov — meaning the site's underlying code is written in public view, available for public use and open for people to edit.

"We now have a technology platform to get more and more voices on the site," White House new media director Macon Phillips told The Associated Press hours before the new site went live on Saturday. "This is state-of-the-art technology and the government is a participant in it."

White House officials described the change as similar to rebuilding the foundation of a building without changing the street-level appearance of the facade. It was expected to make the White House site more secure — and the same could be true for other administration sites in the future.

"Security is fundamentally built into the development process because the community is made up of people from all across the world, and they look at the source code from the very start of the process until it's deployed and after," said Terri Molini of Open Source for America, an interest group that has pushed for more such programs.

Having the public write code may seem like a security risk, but it's just the opposite, experts inside and outside the government argued. Because programmers collaborate to find errors or opportunities to exploit Web code, the final product is more secure.

For instance, instead of a dozen administration programmers trying to find errors, thousands of programmers online constantly are refining the programs and finding potential pitfalls.

It will be a much faster way to change the programming behind the Web site. When the model was owned solely by the government, federal contractors had to work through reams of code to troubleshoot or upgrade it. Now it can be done in a matter of days, at no cost to taxpayers.

Obama's team, which harnessed the Web to win an electoral landslide in 2008 and raise millions, has been working toward the shift since it took office Jan. 20 with a White House site based on technology purchased at the end of President George W. Bush's administration.

That technology didn't let the tech-savvy Obama team build the new online platform it wanted. For instance, 60,000 people watched Obama's speech to a joint session of Congress on health care, and one-third of them stayed online to talk with administration officials about the speech. But there are limits; the programming used to power that discussion was built for Facebook, the popular social networking Web site.

"We want to improve the tools used by thousands of people who come to WhiteHouse.gov to engage with White House officials, and each other, in meaningful ways," Phillips said.

It's also a nod to Obama's pledge to make government more open and transparent. Aides joked that it doesn't get more transparent than showing the world the code their Web site is based on.

Under the open-source model, thousands of people can pick the code apart simultaneously, increasing security. It comes more cheaply than computer code designed for a single client, such as the Executive Office of the President. And it gives programmers around the world a chance to offer upgrades, additions or tweaks to existing programs that the White House could — or could not — include in daily updates.

Yet the system — known as Drupal — won't make the site more secure on its own, cautioned Ari Schwartz of the Center for Democracy and Technology.

"The platform that they're moving to is just something to hang other things on," he said. "They need to keep up-to-date with the latest security patches."
READ MORE - White House moves to open-source programming for its Web site, takes code from public

Twitter CEO: Why he turned down Facebook



At the Web 2.0 Summit in San Francisco, Twitter CEO Evan Williams explains to Federated Media CEO John Battelle his rationale for turning down Facebook in October of 2008. Williams says: "[We] didn't see a reason to sell... The point is really what we can build."
READ MORE - Twitter CEO: Why he turned down Facebook

Asia sees jump in hiring expectations

Hiring expectations for the fourth quarter of 2009 have risen across three Asian economies, where companies in the IT and telecommunications sector say they are willing to hire candidates who have been unemployed for some time, according to a new report released Thursday.
According to the quarterly employment report from executive recruiter Hudson, companies in Hong Kong were the most optimistic about their hiring plans: 35 percent of the 500 companies surveyed there said they planned to increase new hires in the fourth quarter of 2009, compared with 22 percent in the previous quarter.
This was followed by Singapore, where 34 percent of respondents reported plans to increase their new hires in the fourth quarter, up from 26 percent in the previous quarter. More than 600 respondents were based in the island state.
Conducted in August, the survey polled nearly 2,000 key employment decision makers in China--Beijing and Shanghai--Hong Kong, and Singapore, according to Hudson.
The recruitment firm said respondents in the IT and telecoms (IT&T) sector also indicated expectations of increased hiring activity, as many global IT companies are looking to expand their Asian operations to support existing clients and capture new business. Hudson also cited strong demand from the public sector as a driver for headcount growth in the IT&T segment.
"Employers report a significant jump in hiring expectations this quarter, and most sectors are increasingly optimistic about job prospects," Gina McLellan, Hudson's country manager for Singapore, said in the report.
Focus on talent development, retention
The survey also noted that talent development and staff retention were seen as key HR priorities, especially in the IT&T sector.
Hudson said companies in this segment are making "strenuous efforts" to retain key staff, as they face stiff competition for talent when the economic environment is favorable.
Respondents in the IT&T sector were the most open to hiring candidates who have had a prolonged period of unemployment. Among the respondents in Singapore, 60 percent said they would employ such candidates, while only 5 percent said they would not.
Some 35 percent of Singaporean respondents see specialist skills as a valid reason for hiring those with prolonged unemployment.
According to Hudson, niche skills are vital in the IT&T sector, where candidates with such skills can be difficult to find. In addition, 22 percent of respondents highlighted higher qualifications as a reason for hiring a candidate.
In Hong Kong, respondents in IT&T were more confident about finding local talent for senior positions.
READ MORE - Asia sees jump in hiring expectations

Sergey Brin: Yahoo shouldn't abandon search

He wasn't on the program, but nobody was disappointed that Google co-founder Sergey Brin showed up at the Web 2.0 Summit on Thursday afternoon and agreed to sit down for an onstage chat with conference organizer John Battelle.
Battelle said Brin had been extended an invitation to speak but turned it down, to which Brin joked, "I didn't say no, I just never responded."
But it was an appropriate time to hear from one of the minds behind Google because one of the most evident trends at the conference is that the search market is heating back up. On Wednesday alone, Microsoft announced a partnership with Twitter and Facebook for real-time search results, Google announced a similar deal with Twitter, and Google executive Marissa Mayer previewed a new "social search" feature in Google Labs.
Brin talked about the new competition with a "bring it on" attitude. "I think what Bing has reminded us is that search is a very competitive market," he said. "There are many interesting companies out there." He said he's disappointed that Yahoo is retreating from the fight and planning to strike a deal with Microsoft instead.
"I think Yahoo had a number of innovations there, and I wish they would continue to innovate in search," Brin said. He didn't go into specifics.
Yahoo CEO Carol Bartz had been slated to speak at the conference on Wednesday but canceled at the last minute, citing a bad case of the flu.
READ MORE - Sergey Brin: Yahoo shouldn't abandon search

Nokia sues Apple for patent infringement

Nokia is suing Apple over 10 patents the Finnish phone maker says it owns related to wireless handsets.
The world's largest handset maker is suing the maker of one of the world's most popular phones, the iPhone, because, according to a statement released by Nokia on Thursday, Apple has refused to license any of the patents in question.
All iPhone models dating back to the original introduced in 2007 are infringing, according to Nokia. Nokia is asking the U.S. District Court in Delaware for an injunction on sales of iPhones and for unspecified damages.
"The basic principle in the mobile industry is that those companies who contribute in technology development to establish standards create intellectual property, which others then need to compensate for," said Ilkka Rahnasto, vice president, legal and intellectual property at Nokia. "Apple is also expected to follow this principle. By refusing to agree to appropriate terms for Nokia's intellectual property, Apple is attempting to get a free ride on the back of Nokia's innovation."
Nokia has already reached licensing agreements on the patents in question with 40 other companies, including "most of the major device makers", according to Nokia spokesman Mark Durrant. Apple has thus far refused to cooperate, and filing the lawsuit was a "last resort". The two companies have been in negotiations for "some time," he added.
Nokia says it has spent more than US$60 billion on R&D related to wireless technology. The 10 patents it accuses Apple of violating are related to making phones able to run on GSM, 3G, and Wi-Fi networks. They include patents on wireless data, speech coding, security, and encryption, according to Nokia.
Apple did not respond immediately to a request for comment.
For every kind of technology you can think of (USB, wall plugs, video game controllers), there's an agreed-upon standard. It's hammered out by the companies that make products using the technology in question, working within a standards-setting organization. They gather, debate whose patented technology is best, and agree in advance that every other company in the standards group will be able to license those patents at a reasonable rate.
Apple is one of a few companies--Nokia wouldn't expand on who the others might be--that is not licensing Nokia's 10 patents. Nokia says that for any phone to run on a GSM, 3G, or Wi-Fi network, its maker would have to license at least one of those patents.
Though it is asking the court to halt sales of the iPhone, the general consensus among legal observers and those who follow Nokia is that the company isn't actually trying to pull the iPhone off the market permanently--injunctions are routinely used as leverage in these cases--but rather that it wants Apple to pay its fair share.
"There are companies that are patent trolls, that don't participate in the creation of technology, or they secretly acquire them. Nokia's not one of these companies. They're pretty up front about the patents they own," noted Jason Schultz, director of the Samuelson Law, Technology & Public Policy Clinic at the UC Berkeley School of Law. "They're probably not trying to put Apple out of business...but force Apple to play the same game that every other phone company has to play."
Apple analyst Gene Munster thinks Nokia is looking to extract a royalty payment of 1 percent to 2 percent of every iPhone sold from Apple, which would be about US$6 to US$12 per phone. With 34 million iPhones sold to date, that would be US$204 million to US$408 million in back payments Apple would have to pay if Nokia were successful in court. There's also the added risk of something called "willful infringement." Basically, if Apple were to be found in violation it'd have to pay three times the amount of whatever the judgment won by Nokia.
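A quick back-of-the-envelope check of Munster's figures, using only the numbers cited above (US$6 to US$12 per phone, 34 million iPhones sold, and trebling for willful infringement):

```python
# Back-of-the-envelope check using only the figures cited in the article.
iphones_sold = 34_000_000
royalty_low, royalty_high = 6, 12                  # US$ per phone (Munster's estimate)

back_payment_low = iphones_sold * royalty_low      # 204,000,000
back_payment_high = iphones_sold * royalty_high    # 408,000,000
print(back_payment_low, back_payment_high)         # US$204M to US$408M

# Willful infringement can treble whatever judgment Nokia wins:
print(3 * back_payment_low, 3 * back_payment_high)
```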
Apple could settle out of court, or it could try to show that Nokia either doesn't own the patents or that they're not valid in this case, both of which would be difficult, said Schultz.
"Invalidating 10 patents is a lot, that's like running the Boston Marathon. It's really hard to do. You might get one, two or even five," he said. "But 10 is a lot."
If it does go to court, strap in for a long ride. This kind of case could take up to two or three years of litigation.
READ MORE - Nokia sues Apple for patent infringement

Pacnet increases network capacity by 50 percent

Telecommunications service provider Pacnet announced plans to add 3.6 terabits per second (Tbps) of capacity to its EAC-C2C subsea cable network.
According to a statement issued today, the upgrade to EAC-C2C comes three months ahead of schedule and will be Pacnet's largest-ever bandwidth upgrade, adding 50 percent more capacity.
Pacnet said in the statement that the global economic crisis has not affected the demand for intra-Asian submarine cable capacity. Citing TeleGeography Research figures, demand for intra-Asia capacity is expected to increase at a compound annual growth rate of 48 percent between 2009 and 2015, it said.
Bill Barney, Pacnet's chief executive officer, said: "We are seeing bandwidth demand in the region fueled not only by the expanding broadband population in Asia, but also from the growing amount of digital content that is being generated from Asia."
The popularity of cloud computing and demands from stock exchanges and financial institutions for redundancy and ultra-low-latency connectivity will also drive the increase in lit-capacity requirements, said Wilfred Kwan, chief technology officer of Pacnet.
"[Pacnet's] latest network upgrades, which are targeted for completion by early 2011, will focus on increasing capacity across key network routes which connect Singapore, Hong Kong and Japan, countries that are host to some of Asia's largest enterprises and busiest stock exchanges," added Kwan.
Apart from upgrades to its submarine cable network, Pacnet also plans to upgrade its terrestrial backhaul links between cable landing stations and its points of presence, delivering city-to-city high-bandwidth connectivity around the region.
According to the statement, Pacnet completed a phase of upgrades with 3.2 Tbps of capacity added to EAC-C2C earlier in April this year.
READ MORE - Pacnet increases network capacity by 50 percent

Leaking crypto keys from mobile devices

Security researchers have discovered a way to steal cryptographic keys that are used to encrypt communications and authenticate users on mobile devices by measuring the amount of electricity consumed or the radio frequency emissions.
The attack, known as differential power analysis (DPA), can be used to target an unsuspecting victim either by using special equipment that measures electromagnetic signals emitted by chips inside the device or by attaching a sensor to the device's power supply, Benjamin Jun, vice president of technology at Cryptography Research, said on Tuesday. Cryptography Research licenses technology that helps companies prevent fraud, piracy, and counterfeiting.
An oscilloscope can then be used to capture the electrical signals or radio frequency emissions and the data can be analyzed so that the spikes and bumps correlate to specific activity around the cryptography, he said.
"While the chip performs cryptography it is massaging the secret key around in various ways. This processing causes information about the key to leak through the power consumption itself," said Jun.
For instance, someone with the proper equipment could steal the cryptographic key from a device three feet away in a cafe in as little as a few minutes, he said. An attacker could use that information to replicate the key and then read the victim's e-mail or impersonate the user in sensitive online transactions.
Smartphones and PDAs have been found to leak data this way unless they have countermeasures in place, which Cryptography Research offers, according to Jun.
He would not say exactly which devices could be snooped on in this manner and said he did not know of any attacks in the wild using this method.
"I think we're about to start seeing it on smartphones," he said. "These attacks are not theoretical."
This type of attack first surfaced about 10 years ago on cash register terminals and postage meters. Similar data leakage was found with smartIDs, secure USB tokens, smart cards, and cable boxes, he said.
Countermeasures can involve adding randomness to throw noise into the measurements or changing the way the computation is done, Jun said.
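To make the "changing the way the computation is done" countermeasure concrete, here is a purely illustrative Python sketch of a classic textbook case: the first modular-exponentiation routine does an extra multiply only on the 1-bits of the secret exponent, exactly the kind of key-dependent processing a power trace can expose, while the Montgomery-ladder variant performs the same operation sequence for every bit. This is a generic example, not code from Cryptography Research, and real countermeasures are applied at the hardware and firmware level.

```python
# Illustrative sketch only: shows the kind of key-dependent processing that
# power analysis exploits, and one "change the computation" countermeasure.
def modexp_square_and_multiply(base, exponent, modulus):
    """Naive square-and-multiply: the extra multiply on 1-bits makes the
    operation sequence (and power draw) depend on the secret exponent."""
    result = 1
    for bit in bin(exponent)[2:]:
        result = (result * result) % modulus       # always square
        if bit == "1":
            result = (result * base) % modulus     # only on 1-bits: leaks
    return result

def modexp_montgomery_ladder(base, exponent, modulus):
    """Montgomery ladder: one square and one multiply on every bit, so the
    sequence of operations no longer depends on individual key bits."""
    r0, r1 = 1, base % modulus
    for bit in bin(exponent)[2:]:
        if bit == "1":
            r0, r1 = (r0 * r1) % modulus, (r1 * r1) % modulus
        else:
            r1, r0 = (r0 * r1) % modulus, (r0 * r0) % modulus
    return r0

# Both variants compute the same result; only the leakage profile differs.
assert modexp_square_and_multiply(7, 560, 561) == pow(7, 560, 561)
assert modexp_montgomery_ladder(7, 560, 561) == pow(7, 560, 561)
```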
Asked to comment on how threatening this type of attack could be, cryptography expert Bruce Schneier said the basic question is who stands to lose?
"Honestly, I don't care if someone hacks a cable box--it's not my money. Similarly, I don't care how often a bank gets robbed as long as the bank doesn't deduct the losses out of my personal account," he said in an e-mail. "But if someone hacks my phone and either steals service that I am charged for, or causes me enough hassle to change my phone number, that's bad."
READ MORE - Leaking crypto keys from mobile devices

'Green' mobile power offers new revenue

Alternative forms of power supply for mobile phones and other mobile devices can prove a lucrative revenue stream for mobile carriers worldwide, according to the GSM Association (GSMA).
In a statement Tuesday, the industry group said it ran a study that determined off-grid charging alternatives and services, which include solar phones and external solar chargers, can provide mobile operators additional revenue to the tune of US$2.3 billion.
These products can also benefit nearly 500 million mobile users, particularly mobile communities in emerging markets, the GSMA said. According to its study, some 485 million mobile users across the globe have no access to the electricity grid, a factor which severely limits usage opportunities.
The GSMA identified a range of available charging choices that, if implemented effectively, can extend a carrier's service availability and boost average revenue per user by 10 percent to 14 percent.
"We are extremely excited that operators are able to provide people in off-grid areas with solutions to power mobile phones, as this will not only improve quality of life and access to information but can also act as a unique and significant opportunity to fuel economic growth," David Taverner, program manager at the GSMA, said in the report.
Revenue figures used to calculate the market size of off-grid charging solutions were "on the conservative side", Taverner said, so the potential financial gains for mobile operators could be greater than the estimated US$2.3 billion.
He added that this preliminary market overview marks "the start of what the GSMA believes will be an important area of industry growth in the coming years".
According to the mobile trade group, there is significant interest in off-grid solutions: about 60 percent of mobile operators interviewed for the study already have, or are exploring, off-grid charging initiatives. However, it noted that there is currently only limited understanding of the full scope of options and the associated social and business benefits.
The study was conducted over three months, the GSMA said, and included desk research to identify emerging vendors and their products, as well as interviews and surveys of mobile operators and vendors covering 50 countries across Africa, Asia and Latin America.
In a separate report, research firm Juniper Research said mobile networks are increasingly being deployed in rural areas of emerging markets, where consumer access to the electricity grid is at best limited and unreliable, and in many cases non-existent.
"Usage will in large be dependent on consumers being able to charge the handset through alternative methods, and solar-powered chargers in particular could become a key means of facilitating reliable access to mobile services in these markets," said Windsor Holden, principal analyst at Juniper Research.
READ MORE - 'Green' mobile power offers new revenue

Speed Showdown: Windows 7 vs. Windows Vista

With Windows 7 to be released tomorrow, this seemed like a perfect time to take a final look at how its performance compares with that of its much-maligned predecessor, Windows Vista. What we found probably won't surprise you very much, if you've been following the progress of Windows 7 since folks first started getting their hands on it around a year ago: Windows 7 beats Vista—just not always by a huge margin.
Given that—cosmetics aside—Windows 7 isn't really that different from Windows Vista, this was probably to be expected. Still, it was interesting to see the areas in which 7 really walloped Vista, and those in which there was little (if any) change. I'll turn the e-reins over to Michael Muchmore, who did the testing and wrote about his findings over on PCMag.com:

...The new OS starts up significantly faster than Vista on the same machine. And it's not just faster in boot time, but on a number of other benchmarks we ran, including video encoding, the SunSpider JavaScript benchmark, Geekbench, and PCMark Vantage. The only area in which the new OS didn't show at least a little improvement was in shutdown speed. We tested on clean installations of 64-bit Windows 7 and Vista on the same machine: a Dell Studio 14z running a 2.4-GHz Intel Core 2 Duo processor with 3GB of DDR3 RAM and Nvidia GeForce 9400M graphics...
In a new OS with a lot of new features, it's impressive that Microsoft has trimmed down and sped up the code.... Most of the tests showed about a 14 percent improvement—a pretty nice boost. Of course, your mileage will most definitely vary. I performed several of my tests on other laptops as well, including 32-bit systems, and got roughly similar results. Where there were differences, they were generally in Windows 7's favor. Overall, I'm confident that most users will experience noticeable performance improvement if they upgrade from Vista.
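As an illustration of how a single "about 14 percent" figure can be distilled from a mix of benchmarks, some of which are lower-is-better times and others higher-is-better scores, here is a small hypothetical sketch; the numbers in it are invented, not PCMag's results:

```python
# Hypothetical sketch of deriving one "percent improvement" figure from mixed
# benchmarks. All results below are made up for illustration.
def improvement(vista, win7, lower_is_better):
    """Percent improvement of Windows 7 over Vista for one benchmark."""
    if lower_is_better:                      # e.g. boot time or encode time
        return (vista - win7) / vista * 100
    return (win7 - vista) / vista * 100      # e.g. Geekbench or PCMark scores

benchmarks = {
    # name: (vista_result, win7_result, lower_is_better)
    "boot time (s)":     (60.0, 48.0, True),
    "video encode (s)":  (300.0, 265.0, True),
    "SunSpider (ms)":    (1500.0, 1320.0, True),
    "Geekbench (score)": (3000.0, 3350.0, False),
}

gains = {name: improvement(v, w, low) for name, (v, w, low) in benchmarks.items()}
for name, gain in gains.items():
    print(f"{name}: {gain:+.1f}%")
print(f"mean improvement: {sum(gains.values()) / len(gains):.1f}%")
```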
Be sure to read Michael's whole piece, in which he goes into greater detail about his tests and lists all of the results—which pretty conclusively point to Windows 7 as the winner of this speed match-up.
I definitely agree with Michael that not everyone can expect comparable results. On my self-built home PC, which has a Core i7-920 CPU and 6GB DDR3 RAM plugged into an Asus P6T motherboard, I haven't seen an enormous performance difference after switching between a relatively new installation of Vista and a completely fresh installation of Windows 7. The new OS gets to the login screen maybe two seconds faster than Vista did, and to a usable desktop another three seconds sooner, but Vista was never distractingly slow in these areas for me.
Sure, even that little bit of extra time is nice, but the rest of Windows and most of my programs run just as well under 7 as they did under Vista—certainly not worse, but also not appreciably better. The biggest speed gains I've experienced have been incidental ones: Installing Windows 7 in the first place took about half the time Vista did, for instance, and thanks to Jump Lists, the new taskbar has made me a lot more organized and productive.
READ MORE - Speed Showdown: Windows 7 vs. Windows Vista

10 ways to effectively estimate and control project costs

Estimating what a project will cost is only half the battle; controlling those costs during the project and after delivery is equally critical.

Building a better bottom line is just as important for an IT department as it is for the whole organization at the enterprise level.
Implementing sound financial management within an IT framework is broader than simply being more efficient. Many factors are involved: understanding the main drivers of IT costs, aligning IT spending plans with overall business strategy, using financial resources efficiently, viewing IT expenditures as investments and tracking their performance, and implementing sound processes for making IT investment decisions.
In this article, we examine some methods to predict and manage project costs, which form part of a sound basis for overall IT financial management.
1: Control baseline costs
Non-discretionary money spent maintaining established IT systems is referred to as baseline costs. These are the "grin and bear it" costs, those required just to keep things going.
Baseline costs constitute around 70 percent of all IT spending for the average organization, so this is a good place to start. These costs tend to creep over time due to the addition of new systems, meaning there's less money available for discretionary project work. Worse yet, this creep gives the appearance that IT costs are rising while the value derived from IT investments stays the same or actually goes down.
Fortunately, baseline costs can be easily controlled. Renegotiate vendor contracts, reexamine service levels, manage assets effectively, consolidate servers, sunset older applications, maintain a solid enterprise architecture, and practice good project and resource management. By so doing you can lower the percentage of the IT budget allocated to baseline costs and keep them in line, avoiding problems with opportunity costs.
Think of IT projects as an investment portfolio; the idea is to maximize value and appreciation. Baseline costs are food, clothing, and shelter; we have to spend the money but it doesn't have to overwhelm the budget.
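A quick, hypothetical illustration of why that percentage matters: every point of baseline spending you claw back goes straight into the discretionary "investment portfolio." The budget figure below is invented for illustration.

```python
# Illustrative only: how trimming baseline (non-discretionary) spend frees up
# discretionary project money. The budget figure is hypothetical.
it_budget = 10_000_000  # assumed annual IT budget, US$

for baseline_share in (0.70, 0.65, 0.60):
    baseline = it_budget * baseline_share
    discretionary = it_budget - baseline
    print(f"baseline at {baseline_share:.0%}: "
          f"US${discretionary:,.0f} left for discretionary projects")
```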
2: Acknowledge hidden IT spending impacts
Gartner estimates more than 10 percent of corporate technology spending occurs in business units, beyond the control of IT. Several factors contribute to increasing hidden IT spending:
  • Flat organizational models that are more difficult to rein in and control
  • Virtual enterprise structures ostensibly set up as nimble, agile organizational constructs but without regard for policy and procedure
  • Changing organizational authority where business unit managers are given (or take) responsibility for decentralized technology spending
  • Selective IT outsourcing, in which a business unit will independently decide it doesn’t need to participate in overall enterprise architecture to fulfill its departmental mission
The impact of all this hidden technology spending can be profound, and it prevents IT from controlling project costs. Architectural pollution from rogue projects can delay change, resulting in cost overruns and lost opportunities. Business unit-sponsored systems eventually become the responsibility of IT, increasing the cost of support and maintenance (there are those baseline costs again).
Cultural biases in business units may conflict with overall strategic goals, increasing costs and destabilizing information and knowledge. This is just as important for small companies as for large ones; fundamental business decision-making is driven by solid information, and if we don't have it, we can't do it.
3: Understand long-term application costs
As a general rule, ongoing application costs are about 40 percent to 60 percent of the original development cost for each year in an application's life cycle. Sound like a lot? These are the costs associated with application support, maintenance, operations, software licenses, infrastructure, and allocated help desk and operational staff.
Controlling these ongoing costs is critical; as a component of baseline costs, they're necessary evils. Collect and maintain information about all new development work underway throughout the entire enterprise and actively participate in all projects as a value-added business partner. Communicate effectively and relentlessly; report to senior management anticipated costs both at the start of projects and at appropriate intervals thereafter. Don't forget to maintain a historical record of all costs.
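To see what that rule of thumb means over an application's life, here is a small worked sketch; the build cost and lifespan are assumptions chosen purely for illustration.

```python
# Rough sketch of the rule of thumb above: lifetime cost of an application when
# yearly run costs are 40-60 percent of the original build cost.
def lifetime_cost(build_cost, years, annual_run_ratio):
    return build_cost + build_cost * annual_run_ratio * years

build_cost = 1_000_000   # assumed original development cost, US$
years = 5                # assumed application lifespan

for ratio in (0.40, 0.60):
    total = lifetime_cost(build_cost, years, ratio)
    print(f"at {ratio:.0%} per year: US${total:,.0f} over {years} years "
          f"({total / build_cost:.1f}x the build cost)")
```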
4: Understand IT cost estimation truths
How good an estimator of project costs are you? I'm sorry to disappoint you, but no matter how good you think you are, you're not that good.
None of us is; your crystal ball is just as cloudy as anyone else's. This is the single biggest reason IT projects have such a high failure rate. Remember: The cost of IT initiatives will typically exceed original estimates by an average of 100 percent.
Institutional knowledge is lacking as to the result of major initiatives, the advice and counsel of IT is routinely omitted or ignored, and business process change relies too heavily on IT ownership of those business processes. How often have you been called upon to estimate, if not virtually guarantee, a project cost before the scope has been fully defined?
As an IT professional, whatever your role on a project, you must provide business managers with parameters for setting funding expectations and force those business managers to explain why their assumptions are valid.
If you're an IT manager, track all major development efforts throughout the enterprise, and regardless of your role, participate in the creation of a knowledge base of maintenance and support costs to drive future verifiable and credible estimation. Don't underestimate the future costs of maintenance and support, and whatever you do, don't make the classic cardinal error: Do not, under any circumstances, pad budgets in anticipation of an underestimation. Keep track of project costs as the project unfolds and communicate, immediately and vociferously, the instant you detect even the potential for an overrun.
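One hypothetical way to turn such a knowledge base into credible numbers, without padding, is to scale new estimates by the actual-versus-estimated ratios of past projects and present the result as a range. The sketch below uses invented project data.

```python
# Hypothetical sketch: use a small knowledge base of past projects to turn a raw
# estimate into a defensible range instead of quietly padding the budget.
history = [
    # (estimated_cost, actual_cost) of completed projects, US$ (invented)
    (200_000, 340_000),
    (500_000, 910_000),
    (120_000, 150_000),
    (800_000, 1_700_000),
]

ratios = sorted(actual / estimate for estimate, actual in history)
median_ratio = ratios[len(ratios) // 2]   # simple median, for illustration
worst_ratio = ratios[-1]

raw_estimate = 300_000  # the new project's raw estimate, US$
print(f"likely cost:   US${raw_estimate * median_ratio:,.0f}")
print(f"downside case: US${raw_estimate * worst_ratio:,.0f}")
```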
5: Leverage current system investments
Applications, purchased software, networks, infrastructure, and any IT investment should all be regularly reviewed, at least on an annual basis, to ensure maximum value is being extracted and that original ROI goals are being met.
Start with the original requirements and review them to ensure return on investment goals were delivered. Examine changes in the business and review new requests to determine whether they fit with the existing systems. Consider business reengineering.
Review embedded processes to determine whether they're consistent with new organizational models and make changes where necessary. Review vendor and product features, making sure they still fit within the organization. Enterprise architecture is organic; it's not once and done. It changes over time. Keeping up with those changes allows for adjustments either at the periphery or by making modifications to existing components. This is an effective way to control overall costs.
6: Implement short-term cost cutting measures
Often we can control costs by putting in place tactical solutions. Short-term thinking can also be an effective tool in project cost estimation, in that it focuses us on the details. Getting from New York to Tokyo involves a fairly long flight, but we can't forget that we still have to figure out how we're going to get to the airport to begin with.
Try to postpone capital purchases as long as possible. This may not only provide time to negotiate better costs, but an idea for a less expensive solution may present itself after the project has begun. Always control project scope.
Come to agreement as quickly as possible with business unit customers and sponsors on the overall project scope, and put that agreement in writing. Have an effective change management process for the inevitable "just one more thing" discussions; uncontrolled scope changes are the single biggest cause of cost overruns, and a good process will limit them or postpone them until after project delivery.
Try to control human resource spending. There are only two reasons to use external consultants: to fill a knowledge gap (we don't know how to do something) or to fill a resource gap (we have too few people to complete the project on time). Negotiate the best possible rates and, where possible, use fixed-price agreements rather than T&M (time and materials).
7: Implement long-term cost cutting measures
Be tactical, but don’t forget to be strategic at the same time. Make sure there's an enterprise architecture; it's hard to put the puzzle together when you have no picture on the front of the box to go by. Eliminate duplicate processes and systems, eliminating unnecessary costs in the process.
Reprioritize and rejustify all IT projects on a regular basis. Just because something made sense in January doesn't mean it still does in August, so why waste the budget? And outsource selectively; outsourcing costs are typically among the most controllable, yet too often they lead to the highest cost overruns.
8: Implement pricing and chargeback mechanisms
I once worked for a CIO at a Fortune 500 company who decided an internal chargeback process was needed to make business units more accountable for technology costs. He successfully implemented the new approach and was credited with saving the corporation many millions of dollars. He was also fired, because this approach is the one most fraught with political peril.
Absent a chargeback mechanism, business units tend to look upon IT as a giant free toystore. Put one in place and those same business units feel free to go to the outside to get more competitive technology pricing, and IT loses control and becomes marginalized.
If your company is going to consider this, there are ways to achieve both goals: making the business units accountable and maintaining central technology architectural control. Internal IT must be competitive with external service providers. Periodic benchmarking exercises are key.
Don't underestimate the substantial resources needed to effectively administer chargeback mechanisms to ensure that business units have all the information they need and no one feels at a disadvantage. IT must have a clear understanding of all costs and manage the demand appropriately. Use client satisfaction surveys and service level agreements (a good idea no matter what the circumstances) and always show a balance between costs and benefits.
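A minimal sketch of how a usage-based chargeback might allocate a shared cost pool is shown below; the business-unit names and usage figures are invented, and a production mechanism would of course need far richer cost and demand data.

```python
# Hypothetical sketch of a usage-based chargeback: allocate a shared IT cost
# pool to business units in proportion to a usage metric.
shared_it_costs = 2_400_000  # US$ per year to recover (assumed)

usage = {               # e.g. supported seats, storage, or tickets per unit
    "Sales":      420,
    "Operations": 310,
    "Finance":    150,
    "Marketing":  120,
}

total_usage = sum(usage.values())
for unit, units_used in usage.items():
    charge = shared_it_costs * units_used / total_usage
    print(f"{unit}: US${charge:,.0f} ({units_used / total_usage:.0%} of usage)")
```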
9: Use governance to drive IT investment decisions
Too many organizations fly blind, with little synergy between IT and the business. In most organizations, IT is a discretionary expense center; there's a fundamental framework (baseline costs again) but most, if not all, of what's required beyond that isn't necessarily mission critical.
Enlightened organizations understand that IT is a value-added strategic business partner, and a successful collaboration between IT and the business drives significantly increased stakeholder value. Establish a strategy council, or join one if it already exists, to examine enterprise-level issues of strategy, politics, priorities, and funding.
Set up a business council to define priorities, oversee projects, and measure (and communicate) project success across business units. This group must, of course, have the courage to cancel projects when that becomes necessary; not everything that starts must finish. Put together a technical council to develop guidelines and principles for technology standards and practices.
These are three very different organizational constructs, and while there may be some overlap in participation, each has a distinct mission.
10: Quantify the value/benefit proposition for IT investments
Why do we do what we do? That's not an existential or rhetorical question. IT exists to provide value, to participate in the achievement of organizational strategic goals. How can we prove we’ve done so? Just because we've built a thing, that doesn't mean much. Does the thing work? Does the thing provide value? Is that value measurable and consistent with the corporate mission?
Some quantifiable benefits of IT work can be improved operating efficiencies, enhanced personal productivity, enhanced decision quality, and/or enabling or supporting organizational strategic initiatives. What's most critical is to ensure the credibility of any measurements used to justify IT investments and provide after-the-fact valuations.
You may be working on a project that will reduce a process from five person-days' worth of work to two. Does that mean three people are going to be fired, with the resulting compensation cost saving attributable to your project? Probably not. Those folks will most likely be reassigned, so don't take credit for expense reductions that aren't going to happen.
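A small sketch of that distinction, using assumed labor-cost figures, separates the capacity a project frees up from the expense reduction you can honestly claim:

```python
# Sketch of the caution above: only claim savings that will actually be realized.
# The loaded daily labor cost and run frequency are assumptions for illustration.
daily_loaded_cost = 600   # assumed fully loaded cost per person-day, US$
runs_per_year = 50        # assumed number of times the process runs per year

days_before, days_after = 5, 2
person_days_saved = (days_before - days_after) * runs_per_year
capacity_freed = person_days_saved * daily_loaded_cost

# If the freed-up staff are reassigned rather than let go, no expense disappears.
realized_expense_reduction = 0
print(f"capacity freed (theoretical): {person_days_saved} person-days, "
      f"US${capacity_freed:,.0f}")
print(f"expense reduction to claim:   US${realized_expense_reduction:,.0f}")
```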
READ MORE - 10 ways to effectively estimate and control project costs

Twitter hits 5 billion tweets

Former Current Media executive Robin Sloan appears to have posted Twitter's 5 billionth tweet, in the form of a reply to another user that otherwise read only "Oh lord."
A third-party app called Gigatweet has been measuring the service's total tweet count for some time now, and last week some onlookers picked up on the fact that it was getting awfully close to five billion. That said, Twitter's engineers have bumped up this number at least once or twice, and who knows how many test tweets were sent out in the company's early days.
But Sloan's tweet, which he has nicknamed "The Pentagigatweet," does get at least some landmark status because it actually has the number 5,000,000,000 in the URL. That's because the number at the end of a tweet's URL is apparently the running count of tweets that have been posted until that point. We've e-mailed Twitter co-founder Biz Stone for more information and will update if and when we hear back.
It's sort of fitting that Twitter's 5 billionth tweet came not from one of the celebrities or marketers who have flooded the service in recent months, but from one of the quirky Bay Area dot-com nerds who formed its first loyal pack of users.
Sloan, who lives in San Francisco, recently departed his gig at Current--which is headquartered only a few blocks away from Twitter's own home base in the South of Market neighborhood--to write a novel, "Mr. Penumbra's Twenty-Four-Hour Book Store," which he is funding through creative-microfinance site Kickstarter.
He may have just gotten a convenient leg up in publicity.
Meanwhile, some third-party observers have been remarking that Twitter's rapid growth may be slowing down. The company recently raised another round of funding at a valuation somewhere in the neighborhood of US$1 billion.
READ MORE - Twitter hits 5 billion tweets

Windows 7 business standard by next year

In just 12 months, Windows 7 will become the standard operating system for business PCs, analysts believe.
Currently, eight out of 10 new PCs run Windows XP, nearly eight years after the operating system was first released. By next year, that figure could drop by more than half, according to new research from industry watchers Forrester.
"Today, Windows XP is installed on four out of five new PCs. When we asked IT professionals to forecast their anticipated new PC deployments within 12 months from now, we discovered that Windows 7 will already be the primary OS deployed, with Windows Vista shrinking from 15 per cent to 10 per cent and Windows XP shrinking from 81 per cent to 34 per cent," Forrester's report said.
The research, which surveyed North American and European SMEs and enterprises, found that the majority of businesses using Windows XP plan to migrate straight to Windows 7 without moving to Vista first.
Of the companies polled, 61 per cent said they'll jump from XP to Windows 7, while just seven per cent said they plan to make a stop at Vista on the way.
Read more about Windows 7 becoming the business standard by next year at Silicon.com.
READ MORE - Windows 7 business standard by next year

Five steps firms should make before Win 7 move

It is "nearly inevitable" that businesses will migrate to Microsoft's upcoming operating system Windows 7 and they must examine some key issues before making the move, according to Gartner.


In an advisory released Friday, the research firm outlined five key areas companies should evaluate to prepare their migration to the new OS when it is officially launched Oct. 22.

1. Plan to be off Windows XP before end-2012.
Enterprises that had skipped Vista should plan to be off Win XP by the end of 2012. According to Gartner, while Microsoft will support Win XP with security fixes until April 2014, past experience has shown that independent software vendors (ISVs) will stop testing their apps much earlier. In fact, ISVs will start limiting their support for Win XP after 2011.
"New releases of critical business software will require Windows 7 long before Microsoft support for Win XP ends," Steve Kleynhans, research vice president at Gartner, said in the report. "Organizations that get all of their users off Win XP by the end of 2012 will avoid significant potential problems."
2. Start work on migration plans now.
A typical organization requires 12 to 18 months of waiting, testing and planning before it can start deploying a new client OS, Gartner explained.
There is also a lot of work to be done in preparation, and delays in getting started will only result in added costs later.
3. Don't wait for Windows 7 SP 1.
Gartner cautioned against waiting for Windows 7 Service Pack 1 to begin testing and deploying the new platform, noting that many companies are likely planning to wait until SP1 ships before starting their tests. Instead, organizations should start work now, especially if they have skipped Windows Vista.
4. Don't skip Windows 7.
In May 2009, Gartner advised companies to skip Vista and wait for Windows 7, because the latter has important features that Vista did not have. These new tweaks will help improve organizations' abilities to deploy the new platform.
The research house pointed out that Windows 7 is not a major architectural release, focusing mainly on "polishing" or fine-tuning the OS, and builds on the deeper "plumbing" changes Microsoft made in Windows Vista. However, Windows 7 carries improvements in memory management that will allow users to have a better experience than that on Vista, explained Michael Silver, vice president and analyst at Gartner.
5. Budget with care.
Companies should plan their budgets carefully as migration costs vary significantly, depending on the OS the organization has. Gartner estimated that it will cost US$339 to US$510 per user to move from Windows Vista to Windows 7, and US$1,035 to US$1,930 per user to move from Windows XP to Windows 7.
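Applied to a hypothetical fleet, those per-user ranges translate into a budget envelope like the one below; the machine counts are invented, while the per-user costs are the Gartner estimates quoted above.

```python
# Worked example of the per-user migration cost ranges for an assumed fleet mix.
fleet = {
    # path: (number of users, low per-user cost, high per-user cost) in US$
    "Vista -> Windows 7": (400, 339, 510),
    "XP -> Windows 7":    (1600, 1035, 1930),
}

low_total = sum(n * low for n, low, high in fleet.values())
high_total = sum(n * high for n, low, high in fleet.values())
print(f"budget range: US${low_total:,.0f} to US${high_total:,.0f}")
```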
READ MORE - Five steps firms should make before Win 7 move

Seven fundamentals of IT project success

These points cover areas such as conflicting agendas and multiple perspectives, as well as a range of business-oriented conditions.

Many folks think large projects usually fail for technical reasons--software that doesn't work as advertised, bugs, and so on. In reality, that's not the case.
In my experience, the most serious project issues come down to misplaced expectations among participants. Fundamentally, problems in human communication lie at the root of most failures.
These expectation and communication mismatches are difficult to detect systematically, because they aren't quantitative or technical in nature. Failures persist despite fancy project management methodologies, precisely because traditional approaches do not isolate and address hidden problems.
These seven points of project success touch on conflicting agendas, multiple perspectives, and a broad range of business-oriented conditions that drive projects to succeed or fail:
  1. Business case
  2. Stakeholder and user engagement
  3. Executive sponsorship
  4. Third-party relationships
  5. Project management
  6. Change management
  7. Resource availability
The Project Failures analysis
It's tempting to dismiss these points as obvious or to believe your projects have few problems in these areas. However, successful project managers dig deeper than that. For example, how do you really know that sufficient executive sponsorship is present? If you only asked one or two stakeholders, then your opinion may well be incorrect.
To gauge sponsorship accurately, you must gather perceptions across the project. After all, someone reporting directly to the CIO may have quite a different view from someone working 1,000 miles away who has never even met the sponsor.
Please do not ignore these seven fundamentals, thinking they are too "simple" or do not apply to your work. They really are that important.
READ MORE - Seven fundamentals of IT project success

Investors see IBM outside tech's recovery

Try as it might, IBM couldn't make a convincing case to investors that it's reaping the benefits of the economic recovery that is lifting other tech companies.
Even as it raised full-year profit forecasts and beat expectations for third-quarter earnings, IBM also reported a decline in contract signings, a yardstick of future business. Investors and analysts took the decline as a signal IBM isn't seeing demand rebound as quickly as some other tech providers. "The signings number was pretty low," says Brian Marshall, an analyst with Broadpoint AmTech Research in San Francisco. "Today's announcement will be viewed as a disappointment."
Armonk (N.Y.)-based IBM said new signings dropped 7 percent to US$11.8 billion. Marshall expected signings of US$13 billion.
The shortfall overshadowed other more upbeat numbers, including a 14 percent rise in profit to US$3.2 billion. IBM also boosted its forecast for 2009 profit to US$9.85 a share. That was the third increase in the forecast, which in January was at US$9.20.
The higher forecast is in line with a seasonal bump that IBM and other large IT companies see in the second half, Marshall says. "IBM usually grows sequentially 16 percent in the December quarter," as corporations and governments use up unspent money in their IT budgets, Marshall says. "The Street wanted more. We need to see the whites of the eyes of the recovery." Marshall gives IBM a neutral rating with a US$126 price target. Shares fell more than 3 percent in extended trading to US$123.46. During the regular trading day on Oct. 15, IBM stock slipped 0.3 percent to US$127.98.
Outsourcing deals
In the fourth quarter, IBM expects to reverse the trend of declining revenue that's been in place for a year, Chief Financial Officer Mark Loughridge told analysts on a conference call, though he didn't specify by how much. Analysts expect sales of US$26.8 billion for the fourth quarter and US$95 billion for the full year.
Loughridge also said he expects double-digit growth in IBM's outsourcing services business in the fourth quarter. He said three deals worth a combined US$1 billion were signed during the first two days of October just after the close of the quarter. "These three deals would have pushed our outsourcing growth rate to 16 percent," consistent with the 18 percent growth seen during the first half of the year. The company signed US$6.7 billion in outsourcing deals in the third quarter, an improvement of 1 percent.
Sales fell in IBM's hardware and services businesses and were flat in its software division.
The company achieved margin growth through cost reductions and sales of higher-margin products such as software. Gross margin, which measures profitability, widened by 1.8 percentage points to 45.1 percent. Loughridge said the shift to software accounted for almost one-third of the margin improvement. IBM has grown its margin in 20 of the last 21 quarters, Loughridge said. The company's results were also helped by a weak dollar, which contributed nearly half of an 11 percent drop in expenses.
More cost-cutting
During the conference call with analysts, Loughridge was asked by Bernstein Research analyst Toni Sacconaghi whether he believes corporate IT spending has recovered in light of what he called "tepid guidance." Loughridge said IBM is seeing an overall stabilization in the economy. He reiterated that the revised earnings-per-share (EPS) guidance reflects "the confidence we have going into the fourth quarter."
Loughridge also hinted at further savings and acquisitions. He said IBM expects to finish the year having reduced operations expenses by US$3.5 billion. The company cut 9,000 jobs in the first half of the year. Loughridge suggested more cost cuts could be in the offing. "We've got a lot of opportunity to continue to reduce structure and make our business more efficient," he said.
He suggested IBM will consider returning some cash to shareholders in the form of share buybacks and perhaps another boost in its dividend, currently at US$2.20 a year.
IBM has in recent years shifted the majority of its business away from computing hardware and toward services such as advising large companies and governments on their technology operations, which generate more stable recurring revenue. "Hardware sales follow industry trends, but that's now less than a quarter of IBM's business," says analyst Bob Djurdjevic of Annex Research. "With services during flush times IBM helps companies grow, and during tough times it helps companies save."
READ MORE - Investors see IBM outside tech's recovery