A server virtualization project success story

Westminster College's VMware ESX-based server consolidation project is in place and running very smoothly. The college's CIO details the solution and discusses why it's a success.

Westminster College's recently completed virtualization project is the second part of what began quite a while back as an ad hoc way to retire some critically aging servers.
The servers were still hosting Web applications that we were in the process of phasing out, so we didn't want to buy new servers and completely redeploy those services. Instead, we put in place a couple of VMware ESX (3 and 3.5) servers and used PlateSpin's physical-to-virtual (P2V) software to take the risk of hardware failure out of the equation.
The beginning
In early 2007, shortly after my arrival at Westminster College, it became apparent that my plan to phase out an existing portal application was going to take a whole lot longer than I had hoped.
The supported services were intertwined in many different processes; in fact, three years later, we're still running one of the applications in production, but it's the last one.
Supporting this portal application were a couple of very old servers well past their warranty expiration dates. On top of that, completely redeploying the portal application was one of the last things I wanted to do, since it was only tenuously held together, and the people who had implemented the solution were long gone, leaving behind only basic documentation.
I also wanted to reduce the number of servers we were running in our small data center; even the older servers were running at only a fraction of their capacity, yet they still had to be replaced on some kind of cycle and still drew power from our electrical outlets.
The desire to move to newer hardware without breaking the bank, reduce electrical consumption, and not have to redeploy all of our existing services led to the phase one virtualization rollout. Once we had that solution in place, we ran in that configuration for a while. Over time, we virtualized a number of newer servers as well, also using the P2V method. As new services were brought on line, we generally deployed them on one of the virtual hosts.
The hosts were simple containers to house virtual machines and were not connected to a SAN; all of the storage was local. That said, these were Westminster's first steps into VMware, and they accomplished the necessary goals at the time.
Next steps
Over the years, I've become a big believer in the "virtualize everything whenever possible" motto. The great success of the first phase led me to decide to expand virtualization to encompass everything that we could, but I wanted to do so in a much more robust way.
Our initial foray did not implement any availability methods, which was fine for the purpose, but as we moved into our "virtualize everything" mode, we needed SAN-backed ESX servers and a bit more robustness. To achieve our availability goals, we wanted to make sure that we didn't have any single points of failure.
To that end, everything is redundant, and we've deployed more servers than are necessary to support our current virtual workloads. We have room for growth, which we will need.
Again, we're a small environment, so the architecture is pretty simple, but here's what we have (a quick sanity check of the wiring follows the list):
  • An EMC AX4 SAN--iSCSI, dual controllers, 12 x 300 GB SAS + 12 x 750 GB SATA. Fully redundant.
  • 3 x Dell M600 blade servers, 2 x 2.66 GHz Quad Core Intel Xeon processors, 32 GB RAM each, 6 NICs each (chassis houses 6 x Dell M6220 switches--1 for each NIC in each server)
    • 2 x NICs for front-end connectivity
    • 2 x NICs for connectivity to AX4 (iSCSI)
      • Each of these is connected to a separate Ethernet switch.
      • Each NIC connects to a different storage processor on the AX4.
      • Each storage connection resides on a different physical network card.
    • 1 x NIC for vMotion
    • 1 x NIC for Fault Tolerance
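To make the "no single points of failure" claim concrete, here's a minimal Python sketch (not tooling we actually run) that models the NIC layout above and checks that no single switch or storage processor failure can isolate a redundant role. All the names -- vmnic0 through vmnic5, sw1 through sw6, SP-A/SP-B -- are hypothetical labels, not our real identifiers.

    # Hypothetical model of the per-host NIC wiring described above.
    nics = {
        "vmnic0": {"role": "front-end", "switch": "sw1"},
        "vmnic1": {"role": "front-end", "switch": "sw2"},
        "vmnic2": {"role": "iscsi", "switch": "sw3", "sp": "SP-A"},
        "vmnic3": {"role": "iscsi", "switch": "sw4", "sp": "SP-B"},
        "vmnic4": {"role": "vmotion", "switch": "sw5"},
        "vmnic5": {"role": "fault-tolerance", "switch": "sw6"},
    }

    # Redundant roles must not share a switch...
    for role in ("front-end", "iscsi"):
        paths = [n for n in nics.values() if n["role"] == role]
        switches = {p["switch"] for p in paths}
        assert len(switches) == len(paths), f"{role} paths share a switch"

    # ...and the two iSCSI paths must land on different storage processors.
    sps = {n["sp"] for n in nics.values() if n["role"] == "iscsi"}
    assert sps == {"SP-A", "SP-B"}, "iSCSI paths share a storage processor"
    print("Wiring check passed: no single switch or SP takes down a redundant role.")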
We're running 28 virtual machines across these three hosts. Of the processing resources in this three-host cluster, we're using, on average, about 10 percent of the available computing power (Figure A), so there is plenty of room for growth, and we have no worries about performance if one of the physical hosts fails.
On the RAM side, we're using just over 30 percent of the total RAM available in the cluster, but I think we can bring that down by paying more attention to how individual virtual machines are provisioned (Figure B).
Figure A: We're using about 10 percent of our computing resources.
Figure B: We're using a bit over 30 percent of the RAM resources of the cluster.
In figures A and B, note that there are two periods during which we experienced a problem with vCenter that affected statistics gathering. Also, while each machine has 32 GB of RAM, one of our hosts has Dell's RAM RAID capability turned on, which helps protect the host in the event of a RAM problem. As a result, that server reports only 24 GB of available RAM.
Due to having host-level redundancy, we'll be disabling this feature during a maintenance window in order to have the benefit of the full 32 GB of RAM.
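For anyone who wants to see the arithmetic behind those utilization figures, here's a quick back-of-the-envelope check in Python. It uses only the numbers quoted above; the post-failure headroom estimate is my own illustration, not output from vCenter.

    # Cluster capacity from the specs above: 3 hosts, 2 x quad-core 2.66 GHz each.
    hosts = 3
    cores_per_host = 2 * 4
    ghz_per_core = 2.66

    total_ghz = hosts * cores_per_host * ghz_per_core   # ~63.8 GHz
    used_ghz = 0.10 * total_ghz                         # ~10% average load

    ram_per_host = [32, 32, 24]    # one host loses 8 GB to RAM mirroring
    total_ram = sum(ram_per_host)  # 88 GB
    used_ram = 0.30 * total_ram    # "just over 30 percent"

    # If the biggest host (32 GB, ~21.3 GHz) fails, the survivors still
    # have comfortable headroom for the current load.
    surviving_ghz = total_ghz - cores_per_host * ghz_per_core
    surviving_ram = total_ram - 32

    print(f"CPU: {used_ghz:.1f} of {surviving_ghz:.1f} GHz "
          f"({used_ghz / surviving_ghz:.0%} of surviving capacity)")
    print(f"RAM: {used_ram:.1f} of {surviving_ram} GB "
          f"({used_ram / surviving_ram:.0%} of surviving capacity)")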
Figure C shows the full infrastructure; the 50 and 51 are simply internal identifiers.
Figure C: The whole ESX environment.
This summer, we'll make some changes to our environment to increase overall availability, including:
  • A migration from our single-server (physical) Exchange 2007 system to a multi-server (virtual) Exchange 2010 environment. The only service that will remain physical is unified messaging.
  • We're using SharePoint for a lot of stuff, including our public-facing Web site. Our existing SharePoint environment consists of two servers: a dedicated database server and the MOSS server running the other components. As we explore SharePoint 2010, we'll more likely than not migrate away from the physical SharePoint infrastructure as well.
Even if I have to add additional ESX hosts to support newer initiatives (though I don't think I will), the availability advantages are too great to ignore.
The virtualization project at Westminster exceeded all of my original goals. We've been able to very easily extend the life of aging applications, reduce power consumption, increase availability, and make a huge dent in the budget for equipment replacement in the data center.

Sunderland is Facebook capital of Britain

LONDON - Sunderland has emerged as the Facebook capital of Britain.
Dr Aleks Krotoski, presenter of the BBC2 series The Virtual Revolution, discovered that the northern city's online population is 24 per cent more likely than the UK national average to visit the social networking site to chat with pals or play online games, reports the Mirror.

There are more than 23 million Facebook users in Britain.
Sunderland band The Futureheads has attracted 8,100 fans to their Facebook page, while 5,350 users have signed onto the city's World Cup 2018 site.
Durham, Oldham, Manchester and Llandudno are among the other cities in the top 10 list for Facebook use.
The bottom 10 were in Scotland, with Kilmarnock and Motherwell having the fewest users.

Reunion dinners are like project work

Merry metal Tiger year! If your family observes the lunar new year, you would have had the traditional reunion dinner over this past weekend.

As we trickled back to work after the long weekend, my friends and colleagues exchanged updates about our family reunions--always an enthralling experience for most, to say the least. Don't even get me started on mine.

It's a pity that as we grow older, family gatherings seem to become fewer, reduced to just the obligatory once-a-year lunar reunion. And even that annual get-together may seem like a chore for some.

I thought about why that happens and saw some parallels with the corporate world.

When we were eager young kids and our parents themselves young, family gatherings were fun, enjoyable and an opportunity for the children to talk about a new toy and the adults to exchange parenting tips.

As time passes and the children turn into adults, and the adults turn into jaded retirees, our families become more nuclear and we no longer see value in family reunions. Host families grumble about having to spend money and time prepping reunion dinners that everyone else gets to enjoy at their expense.

Even potluck dinners become a chore, where some insist on showing up empty-handed and others complain that the empty-handed were allowed to show up at all.

And yet, in every family, there's usually a "campaigner", the one who's most enthusiastic about getting everyone together and who evangelizes the importance of blood ties. And let's not forget the "black sheep", the loafer who celebrates sloth and gobbles down family feasts without chipping in a single cent, or sweat.

As I thought about it, I realized I'd just described the corporate environment in some places, particularly in relation to project work that involves several team members across the various departments in the company.

At the start of a project, everyone's fresh and new, and super enthusiastic about getting the ball rolling...keeping the ball rolling, however, is another matter altogether.

There's also usually a project lead, the one who spearheads the collaborative effort and urges everyone to do their part and meet deadlines. And there are the black sheep, the ones who sit back, relax, and let others do most of the work, yet still expect to be credited when the project is completed. And then there are the disgruntled lot, who do their share of the work and get upset when their teammates don't.

Whether it's about reunion dinners or IT projects, it's very easy to get complacent, get caught up in the daily grind and lose sight of the key objective.

In any project work that involves multiple parties, and personalities, there will always be differences--be it in aptitude or attitude. It's not practical to expect everyone to have the same level of competency and to get along all the time, every time.

Over the years, I've on occasion handled tasks that should have been under another's purview. I did so for various reasons, whether it was because they couldn't cope with the load or weren't able to manage the work.

When I get asked why I'm helping to cover "someone else's ass", I usually reply that my main goal is always to get the job done. It's about identifying the bottlenecks and helping to fix the clogs so that at the end of the day, the job gets done and the key objective is achieved. So I don't really care if someone's behind is exposed...maybe he needed some fresh air...

Reunion dinners are like that, aren't they? Ultimately, the goal is to renew family ties and help ensure our kids will still be able to recognize their relatives when they meet them on the streets.

Sure, not everyone is going to contribute the same portion of money or sweat or time. But those who do give should do so without expecting any fair returns, and those who don't should be accepted as they are, imperfections included.

After all, isn't it better to live life without grudges?

Judge in Google Books case says no ruling Thursday

U.S. Federal Judge Denny Chin kicked off the much-anticipated Google Books hearing Thursday morning by making one thing clear: there will be no quick ruling in the case.
"I'm going to say right off, I'm not going to rule today," Chin said, highlighting the droves of written submissions that have come in from passionate parties on all sides of the case. "I'm going to listen to opinions carefully and I'm going to ask a few questions."
And that he's already done in morning testimony, which has been going on for more than three hours before a crowded courtroom and overflow room. For example, he interrupted an attorney testifying for Microsoft by pressing him on why Sony, a competitor, is all right with the proposed settlement, but Microsoft isn't.
Some of the other groups and businesses that have testified so far include the Electronic Frontier Foundation, Amazon, the National Federation of the Blind, and the Center for Democracy and Technology.
The proposed settlement being debated in the case--reached between Google and the Authors Guild and Association of American Publishers--would allow Google to partially display in-copyright but out-of-print books alongside books authorized by publishers and public domain works in Google Books.
The deal would make Google the only organization in the U.S. explicitly authorized to make digital copies of out-of-print yet copyright-protected books, much to the dismay of many authors and privacy advocates.
Potential competitors object to the unique rights Google would gain through the class action status of the lawsuit; authors are upset over Google's decision to scan their works without asking permission; and privacy advocates fear corporate oversight of what books people are reading. The U.S. Department of Justice has twice expressed antitrust concerns over the proposed settlement.
Stay tuned for more coverage from the hearing.

Dell earnings: Enterprise spending rebounds

Dell reported a better-than-expected fourth quarter as enterprise sales rebounded. The company said that it was "cautiously optimistic" that commercial IT spending will improve throughout the year ahead amid "ongoing signs of stabilization".
Dell reported fiscal fourth quarter net income of US$334 million, or 17 cents a share, on revenue of US$14.9 billion, up 11 percent from a year ago. That sales tally was US$1 billion more than Wall Street expected. Non-GAAP earnings were US$544 million, or 28 cents a share, a penny ahead of Wall Street estimates.
For fiscal 2010, Dell reported net income of US$1.43 billion, or 73 cents a share, on revenue of US$52.9 billion, down 13 percent from a year ago.
Read more of "Dell earnings: Enterprise spending rebounds" at ZDNet.


Yahoo and Microsoft get search deal OK

Both the US Department of Justice and the European Commission have said there are no issues to prevent Microsoft and Yahoo going ahead with their search deal.

The 10-year deal makes Yahoo the salesforce and Microsoft the search platform for their combined search. The deal was announced in July 2009 and is expected to start being implemented in the next few days. All global customers and partners are expected to have been moved to the combined system by early 2012.

The deal is aimed at taking market share and revenues from Google, which currently has around 75 percent of the market worldwide.

Read more of "Yahoo and Microsoft get search deal OK" at ZDNet.

Chip-and-PIN flaw to be investigated by industry body

The body that oversees the technology behind chip-based payment cards is to investigate chip-and-PIN security, following claims that the protocol has been broken.
The specification body, EMVCo, said it will analyze a paper by researchers from Cambridge University, who demonstrated an attack with a valid payment card that did not require a valid PIN to be entered to complete a transaction.
EMVCo, owned by American Express, JCB, MasterCard and Visa, said those debit- and credit-card payment companies will also scrutinize the paper.

Tablet commandments for PC makers

The Apple iPad is expected to help rejuvenate the tablet industry, with other PC makers also looking to cash in with their own slate products. But what will it take to emerge as a leader in a market that is gradually getting crowded with players eager for a slice of the pie?
According to research firm In-Stat, the global tablet market is projected to move some 50 million units in 2014, with Apple's upcoming slate expected to inject an additional US$4.1 billion into the semiconductor industry.
PC manufacturers such as AsusTek, Hewlett-Packard, Dell Computer and Micro-Star International (MSI) also announced plans or revealed prototypes at last month's Consumer Electronics Show (CES) in a bid to assume first-mover position in the nascent market segment.
However, what will it take for these players to succeed in this space? ZDNet Asia spoke to industry watchers and players who highlight key factors that should go into designing and producing tablets to stand out against the competition.
Identify target audience
According to Lilian Tay, principal analyst of client computing markets in Gartner's technology and service provider research group, the key consideration is the target market.
"As the slate or tablet [market] is still evolving, there are many sub-market usages that manufacturers can consider," Tay said in an e-mail interview. She added that while design specifications such as battery life or user interface are key factors, manufacturers can better cater their designs if they know which user segments the device will have more demand in.
In the education market, for example, manufacturers would have to offer devices that are more rugged to cope with the faster wear and tear due to increased usage, she explained.
"[Manufacturers] need to demonstrate how the device can fit into the lifestyles of people who already own a smartphone and a notebook," Tay said.
Determine form factor
Phil McKinney, HP's vice president and CTO of personal systems group, noted in a recent video interview that his company looks to address issues of form and function. "[What's] the right size, what's the right form factor, and what are the capabilities [this tablet should have]," he explained.
McKinney added that HP had been exploring the possibility of producing a tablet PC for the past five years and actually "built physical hardware"--about 60 units--which were distributed to consumers to gather feedback.
Based on results from this exercise, he characterized this evolving product segment, in terms of size, as: "North of what a smartphone is, and smaller than a netbook and notebook."
Content is king
If there is one thing Apple's App Store and Amazon's Kindle e-reader demonstrated, Gartner's Tay noted, it is that manufacturers with access to the most content will generate demand for their devices.
AsusTek Computer is one such manufacturer that is exploring ways to "link or enrich content" with any potential tablet device it plans to introduce.
"How to develop or cooperate with content providers will be important when considering the software and hardware that will be embedded in a tablet device," said Jessie Lee, global public relations lead from AsusTek's marketing planning division.
McKinney concurred, noting that consumers are not simply looking for a dedicated device such as the Kindle or Sony's Reader, but one that is capable of providing "immersive kind of experiences".
"[Users tell us] 'I want to browse. I want to be able to watch my movies. I want to be able listen to my music. I want to be able to read magazines and do books.' So these devices [should not only have] reading capabilities, but also give you that rich media," he said.
User interface gaining importance
In an earlier interview with ZDNet Asia, Dell's Asia-Pacific general manager for consumer business, Ian Chapman-Banks, said the "physical keyboard experience", which is absent in touchscreen mobile computing devices such as the iPad, remains a key element for users.
This, along with other user interface (UI) elements such as the capacitive touchscreens that the iPad and Fusion Garage's JooJoo device feature, is among the key considerations prospective tablet makers will have to weigh, noted Tay.

ARM, Globalfoundries outline 28nm SoC platform

ARM and Globalfoundries have released details on their upcoming system-on-a-chip (SoC) platform, which they say will combine "PC-class performance" with the portability and longevity of smartphones.
The companies revealed the details on Monday at Mobile World Congress in Barcelona. Their SoC platform is based on ARM's Cortex-A9 processor and Globalfoundries' 28nm manufacturing process, and will be used in smartphones, tablets and smartbooks.
The 28nm manufacturing process will allow devices built on the platform to have 40 percent more computing performance, 30 percent less power consumption and 100 percent greater battery life than devices built using existing 45nm processes, the companies said.
Read more of "ARM, Globalfoundries outline 28nm SoC platform " at ZDNet UK.

Body scanners may be illegal, says rights watchdog

The use of full-body scanners at airports could break UK laws on discrimination, race relations and privacy, the government equality watchdog has warned.
The Equality and Human Rights Commission (EHRC) said in a statement on Tuesday that the government needs to take immediate action over the scanners, which allow airport security personnel to view travellers as though they were naked.
The statement follows a request from EHRC to home secretary Alan Johnson in January, seeking justification for the government's profiling and body-scanning plans.
Read more of "Body scanners may be illegal, says rights watchdog" at ZDNet UK.

RIM to offer free BlackBerry Enterprise Server

Research In Motion will soon begin giving away a free version of BlackBerry Enterprise Server.

BlackBerry Enterprise Server (BES) Express will be made available as a free download in March, RIM chief executive Mike Lazaridis announced on Tuesday in a keynote speech at Mobile World Congress.

BES Express, which comes with free client licences, is server software that makes it possible to synchronise BlackBerry smartphones with Microsoft Exchange or Windows Small Business Server systems.

Read more of "RIM to offer free BlackBerry Enterprise Server" at ZDNet UK.

Microsoft Issues Fixes for Outlook 2010 Beta Bug

Microsoft on Thursday announced fixes for its Outlook 2010 beta to address an e-mail message bloat problem.
The bloat stems from the use of numbered and bulleted lists in e-mail messages. Outgoing messages containing such lists will bulk up with "redundant CSS definitions." Consequently, mail services that limit the size of incoming messages may not display the messages correctly, according to Microsoft's Outlook 2010 blog.
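As a rough illustration (this is not Microsoft's fix), a few lines of Python can show what "redundant CSS definitions" means in practice: counting style rules that appear more than once in an HTML message.

    import re
    from collections import Counter

    def redundant_css_rules(html_message: str) -> dict:
        """Count CSS rule blocks that appear more than once in an HTML e-mail."""
        rules = re.findall(r"[^{}<>]+\{[^{}]*\}", html_message)
        counts = Counter(rule.strip() for rule in rules)
        return {rule: n for rule, n in counts.items() if n > 1}

    bloated = "<style>li.x {margin:0} li.x {margin:0} li.x {margin:0}</style>"
    print(redundant_css_rules(bloated))  # {'li.x {margin:0}': 3}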
Patches for both the 32-bit and 64-bit versions of the Outlook 2010 beta are available for download.
The fixes actually apply to Microsoft Word 2010 beta and are also available via the Microsoft Download Center as Knowledge Base article KB980028. Microsoft's Jeanne Sheldon explained in the Outlook 2010 blog that "though the problem is most readily manifested in Outlook, the root cause is in Word 2010."
Applying the patch will prevent the bloat in future messages, but it does not fix e-mails already received. Microsoft recommends running Outlook 2010's "conversation clean up" feature to reduce mailbox size; the cleanup feature removes older message threads while retaining the most current message in each thread.
Microsoft fixed this bug in the release candidate version of Office 2010, according to a company blog post. However, the general public doesn't have easy access to the release candidate version, as it was privately released to testers earlier this month.
The Office 2010 beta can run on the same hardware that was used to run Office 2007, according to Microsoft. Older versions of Office do not have to be removed before installing the beta, but Microsoft adds a couple of caveats. First, the Outlook 2010 beta does not play well with other versions of Outlook installed on the same machine. Second, SharePoint Workspace (formerly known as "Microsoft Office Groove 2007") does not coexist with earlier versions.
Microsoft recommends selecting the "custom installation" option of the Office 2010 beta installation program if users want to exclude Outlook 2010 and SharePoint Workspace to avoid conflicts with older versions of those programs. Otherwise, the "express installation" option will add those applications.
Important details to know before installing the Office 2010 beta are described in a separate Microsoft blog post.
Office 2010 is expected to become generally available in June. Betas for a number of Microsoft's 2010-branded products were released in November.

Microsoft Affirms BSOD, Halts Windows Patch

Redmond is once again looking into chatter about Microsoft security patches causing "screens of death."
This time the patch in question (MS10-015) was for a long-unaddressed Windows kernel bug that could enable elevation-of-privilege control by an attacker. The patch, which was contained in Tuesday's mammoth security update, was based on a security advisory that Microsoft released in late January.
According to a discussion thread on a Microsoft Windows forum page, when Windows XP users applied the kernel patch, all they got were blue screens after they restarted their operating systems. Some users had to reopen Windows in "safe mode," while others simply got blue screens followed by error messages, according to comments on the thread.
The screens-of-death complaints in the forum thread reflect the experiences of XP users. However, Microsoft described its patch as important for Windows 2000, Windows XP, Windows Server 2003, Windows Vista, Windows Server 2008 and Windows 7 for 32-bit systems. The Windows kernel exploit has been present in all 32-bit Windows versions since Windows NT, which means the bug has been accessible for about 17 years.
Microsoft admitted in a security blog that restart issues are associated with its MS10-015 patch, and that malware on a system can cause the problem. To that end, many in the security community believe that a rootkit may be blocking the patch installation and triggering the instances of "blue screen of death" (BSOD) shutdowns.
"The possibility that the reported BSOD problems, associated with the recent Microsoft patches, are related to a malware rootkit makes a lot of sense," said Andrew Storms, director of security operations at nCircle. "As a result of their extensive quality control and testing processes, Microsoft has a terrific track record of releasing solid patches. No one expects Microsoft to test installing patches on a system that already contains malware though."
Because of the snafu and pending investigation, Microsoft has temporarily pulled security bulletin MS10-015 from automatic release through Windows Update. However, the patch still remains on Microsoft update sites for administrators to download and test.
"This issue with the patch is a prime example of why administrators should test each and every patch they deploy them to their systems," said Jason Miller, data and security team leader for Shavlik Technologies. "Microsoft tries to ensure the functionality of each patch, but it cannot be guaranteed with so many different systems and scenarios that are affected by the patch."
For those with the BSOD problem, the Windows forum moderator for Microsoft, Kevin Hau, suggested that users "boot from your Windows XP CD or DVD and start the recovery console." Hau then referred Windows users to a Knowledge Base article for more details on how to reboot safely.

Microsoft Releases Anti-Piracy Update for Windows 7

Microsoft plans to release an update for Windows 7 today to counter software piracy.
The new Windows Activation Technologies Update continues anti-piracy technologies initiated with Microsoft's Windows Genuine Advantage program in 2006. In a Windows blog post, Microsoft's Joe Williams described the update, which validates Windows 7 copies, as a means of keeping customers' PCs secure.
"The update will determine whether Windows 7 installed on a PC is genuine and will better protect customers' PCs by making sure that the integrity of key licensing components remains intact," wrote Williams, who is general manager of Genuine Windows.
Windows Activation Technologies will detect more than 70 known and potentially dangerous activation exploits.
Initially, the update will run validations every 90 days, at which time Windows will download the latest "signatures" -- similar to an anti-virus service, according to Williams. On computers running authentic software, the update will run in the background and will not be noticeable to users.
However, if the core licensing files have been tampered with, or are disabled or missing, the update will run checks and repairs weekly. In addition, "periodic" dialog boxes will pop up that offer two options: get more information or acquire a legitimate copy of Windows. The update will add reminders for users of nongenuine Windows 7, including changing the desktop to a plain background with a watermark. Desktop icons will be left intact after that change is made, according to Microsoft.
The update will be distributed first to the Home Premium, Professional, Ultimate and Enterprise edition users. It will be available online at www.microsoft.com/genuine starting Feb. 16 and on the Microsoft Download Center beginning Feb. 17. It also will be offered as an "important" update on Windows Update later this month.
Enterprise customers will be able to import the update into Windows Server Update Services through the Microsoft Update catalog.
Williams stressed that the update will not reduce Windows functionality. Customers who choose to not install the update will continue to have access to benefits afforded to genuine Windows users if the copy they are running is genuine, according to a Microsoft spokesperson by e-mail.
Compared with Microsoft's Genuine Advantage effort, this update is notable in that it tries to get customers to contact Microsoft for validation, said Michael Cherry, an analyst with Directions on Microsoft, in a telephone interview.
"When they first started this, there were some definite issues in the way in which it worked," Cherry said. "If it felt you did not have a valid copy, it might reduce the functionality until you showed you had a legitimate copy."
This update also varies from previous efforts because it includes both activation and validation components.
In a ZDNet blog posting, Ed Bott wrote that he gave Microsoft an "F" for its Windows Genuine Advantage efforts in 2006 and 2007, followed by a C+ in 2008. More recently, he wrote, activation issues have become a nonissue, and false-positive reports are practically nonexistent.
The update aims to shut down counterfeiters who sell fraudulent Windows 7 packages to consumers. According to Cherry, few businesses intentionally purchase counterfeit software or computers with pirated software. The update will alert them if they do.
Privacy advocates have criticized Microsoft for its past anti-piracy efforts, particularly requiring the Windows Genuine Advantage update for Windows XP users in 2006. Now the update is voluntary and users can decide not to install it. It also can be uninstalled at any time.
Williams stressed that the update will not jeopardize privacy. "The information we receive from PCs during these checks does not include any personally identifiable information or any other information that Microsoft can use to identify or contact you," he stated.
Cherry said that Microsoft is trying to balance making sure that people have legitimate copies against being obtrusive.
"Microsoft is working very, very hard to not collect any personally identifiable information about you in this process," Cherry said. "Piracy is a big enough problem that for a relatively modest investment, this is going to give them returns."

Verizon CTO says 4G service is on track

Verizon Wireless is on track to offer its 4G wireless service later this year, the company's chief technology officer said here Monday.
Dick Lynch, an executive vice president and CTO for Verizon Communications, said during a press conference here that Verizon Wireless is on track to launch its commercial LTE (long-term evolution) service this year. The gathering was hosted by the GSM Association, which puts on the Mobile World Congress.
Lynch said Verizon Wireless is in the final testing phase, or "Phase 4," of its LTE technology. He said he expects testing to be completed in Boston and Seattle within 60 days. After those trials are complete, Verizon will be ready to announce commercial deployments.
Verizon announced its plans to launch its LTE network in 2010 at the Mobile World Congress last year. The company has said previously that it will launch the service in 25 to 30 markets throughout the United States by the end of 2010. The company is using 700 MHz spectrum that it acquired in a Federal Communications Commission auction.
Initially, Verizon Wireless will offer USB air cards that access LTE for its laptop customers. Cell phones and other mobile devices with embedded LTE will be introduced later. That said, Lynch, along with executives from European carrier Orange and equipment maker Ericsson, said LTE handsets will be introduced sooner than anyone had anticipated.
Lynch said getting voice to work over LTE has been particularly challenging. But that challenge is getting resolved as Verizon and other members of the GSMA announced Monday they are supporting a standard that uses IMS technology to deliver voice services over LTE. Still, more work needs to be done.
Until a solution is complete, Verizon will use its CDMA network to provide voice services. And the LTE network will be used for data. Eventually, when voice over LTE becomes a reality, Verizon will use that technology.
Verizon will also have to integrate EV-DO into its LTE offering to ensure that customers can switch to the 3G EV-DO network when the 4G LTE network is not available. Even though Verizon is being aggressive in building its network, it won't happen overnight.
The next bit of news consumers can look for from Verizon is pricing. Verizon hasn't yet detailed plans for how much it will charge to use the new 4G LTE service.

Ballmer banks on Windows Phone 7 for the future

Steve Ballmer hopes "7" will be Microsoft's lucky number as the company restarts its mobile business with the release of Windows Phone 7.
On Monday, Microsoft's CEO and his team of executives took the wraps off the latest version of the Windows Mobile operating system at a press conference here at the Mobile World Congress. The new Windows Phone 7 is a fresh start for the company in mobile.
"There's no question that a year and a half ago we had to rethink everything," Ballmer said.
Instead of revamping Windows Mobile software, which first came out in 2002, Microsoft decided to start from scratch. The result is a look and feel completely new compared with previous generations of Microsoft Windows Mobile software.
The new Windows Phone 7 is primarily designed for touch-screen smartphones. It offers graphic "tiles" that let users get multiple views of their information. The goal was to create software, and a user interface, that was much more useful--and intuitive. For example, the new software integrates data such as pictures, e-mail, music, video, and contact numbers from the phone and other places (social-networking sites or multiple music services or e-mails) into easy-to-access "tiles" or virtual buttons on the phone.
Even though Microsoft is still a leading provider of smartphone software, its market share has slipped over the past several quarters. Competitors such as Apple with the iPhone, Research In Motion with its BlackBerry devices, and now Google with Android phones, have taken share away from Microsoft.
Apple and RIM have taken a different approach to the market than Microsoft has. Those companies build both the software and hardware for their phones, which has provided them more control and some edge in terms of getting new features out across an entire product line. It's also made it somewhat easier for developers to come up with new applications for these devices.
Andy Lees, senior vice president of mobile communications for Microsoft, admitted during the press conference Monday that Microsoft had questioned its strategy of not building its own hardware and instead selling software to phone manufacturers.
"We considered a lot of different things over the past year and a half to two years," he said. "We even considered building our own phone."
Instead, Lees said the company decided that working with partners offered far more value.
That said, Microsoft recognizes the need for more hardware consistency, and it plans to work closely with hardware manufacturers such as LG, Samsung, HTC, Sony Ericsson, and others to make sure there is commonality in devices. For example, Microsoft is setting standards within its partner group for screen size. It will also require that devices use the same kind of sensing technology.
In addition, it's working with carrier partners to ensure more consistency in service offerings.
From a developer's perspective, this sounds great. But it also sounds like it limits manufacturers and carriers when it comes to how they can differentiate their products. And it's unclear how handset manufacturers--which are already struggling to differentiate their products from one another--will handle the requirements.
On Sunday, at a press event here, Sony Ericsson's CEO said the company plans to eventually become operating-system-agnostic, providing consumers with a user interface that has a look and feel unique to Sony Ericsson.
Ballmer argued that the new version of its OS will offer stricter sets of criteria for devices and services using the software, and that would ultimately lead to more innovation from its partners.
"We needed a model to raise the bar and give our partners a chance to show their unique capabilities," he said. "I think it will create a bigger pool of opportunity for everyone. And when we look back, there will be greater diversity and innovation when you work from a higher foundation instead of everyone replumbing things from the lower levels of user interface."
The new Windows Phone 7 phones are expected to hit the market in time for the 2010 holiday shopping season, Ballmer said. He also said AT&T and Orange have been selected as special carrier partners. The company plans to deliver Windows Phone 7 devices on all four major U.S. carrier networks.

Adobe bringing AIR to smartphones--Android first

Adobe Systems, hard at work bringing its Flash technology to mobile phones, announced Monday that it's also working on making the same move for a related programming foundation called AIR.
AIR, short for Adobe Integrated Runtime, is a foundation for standalone applications that use Flash or Web technology. Examples of AIR applications include the New York Times Reader and TweetDeck for advanced Twitter usage.
Adobe plans to release AIR for Google's Android operating system for mobile devices in 2010, the company said at the Mobile World Congress show in Barcelona. Also at the show, Adobe announced that it's joined the LiMo Foundation to bring Flash to Linux-based mobile phones.
Adobe plans to release Flash Player 10.1 for smartphones in the first half of 2010.
The San Jose, Calif.-based company demonstrated AIR on a Motorola Droid phone, including the Tweetcards Twitter application, a "South Park"-style avatar creator, and Adobe's Connect Pro software for screen-sharing and videoconferencing.
AIR for mobile will use Flash Player 10.1, a beta version of which Adobe said was just released to partners and programmers.
"AIR leverages mobile-specific features from Flash Player 10.1, is optimized for high performance on mobile screens and designed to take advantage of native device capabilities for a richer and more immersive user experience," Adobe said in a statement. Specifically, AIR for mobile devices will support multitouch interfaces, gesture inputs, accelerometers for motion and device orientation, and geolocation for detecting position.
Flash is ubiquitous on computers but comparatively rare on mobile devices; AIR hasn't achieved Flash's penetration even on desktops. But if Adobe can persuade mobile-phone makers to support it, or persuade phone owners to install it on their own, it could open up cross-platform advantages for programmers who want the same or similar versions of a program to run on different types of equipment.
However, Adobe has its share of challenges spreading Flash and AIR to mobile devices. Although Flash Player 10.1 will run on many smartphones, it won't run on arguably the highest-profile model out there, Apple's iPhone. The absence of Flash on the iPhone and iPad has put Adobe on the defensive, and the company has begun sharing more details on its plans to improve Flash.
Motorola, whose newer Droid models of Android phones compete with the iPhone, endorsed Adobe's moves.
"We look forward to seeing AIR come to the Android platform and developers creating applications that will delight our end-users," said Christy Wyatt, Motorola's vice president of software applications and ecosystem, in a statement.
Adobe isn't giving up on the iPhone. In a blog post by Adobe's Michael Chou, the company also touted several iPhone games written with its upcoming Flash Professional CS5 Packager for iPhone software, which lets programmers write Flash applications that run on iPhones without Flash installed.

Apple bans hackers from App Store

After a long battle with hackers who have been successful at jailbreaking the iPhone from one version of the OS to another, Apple is now taking a more personal approach to locking down the device. It's been reported that known iPhone jailbreaking/unlocking hackers have had their Apple IDs banned from Apple's App Store.
One of those hackers is Sherif Hashim, who recently found an exploit in the latest iPhone OS, version 3.1.3, that could unlock baseband version 05.12.01. Hashim now gets a message saying that his Apple ID is banned for "security reasons" each time he tries to access Apple's App Store.
Baseband is the component that controls the connection between the phone and the mobile network; when unlocked, it allows the phone to work with any GSM carrier. Apple tends to release updated firmware for this chip specifically in order to relock iPhones that have previously been unlocked.
The exploit hasn't been released yet, though DevTeam, a group of hackers that develops methods to jailbreak and unlock Apple's handheld devices, has confirmed that it works.
Other hackers have also reported that their Apple IDs have been banned. It seems that this action of Apple's is merely a warning, as these hackers can always just create another Apple ID and access the App Store that way. It would be a lot harsher if Apple decided to ban their devices.
The latest firmware of the iPhone OS, version 3.1.3, doesn't seem to offer much improvement apart from relocking any jailbroken phones. If anything, it has introduced a few battery and sync issues.

Google CEO comes to Barcelona in peace

Google CEO Eric Schmidt extended an olive branch to wireless-network operators as he took the stage Tuesday afternoon at the GSM Association's Mobile World Congress.
Schmidt delivered his speech hours after the CEO of the world's largest mobile operator, Vodafone, suggested in his own keynote address that Google was getting too powerful in the mobile value chain. Earlier in the day, Vittorio Colao warned the telecommunications industry that companies controlling 70 percent to 80 percent of a market, such as Google in mobile search, should raise the attention of regulators.
Schmidt, whose company has had a contentious relationship with some mobile operators, did not respond to fears of monopolistic behavior. Instead, he focused on how he saw Google and the wireless industry working together to deliver services to consumers. Google wants to partner with wireless operators and application developers to make sure that consumers get a good mobile Web experience, and all the partners involved make money, he said.
"Ultimately, these businesses will succeed to the degree that they stay end-user-focused," Schmidt said. "And the best partnerships start from that, and not from dividing the industry or restricting what people do." He added that the best partnerships are also the ones in which all parties involved make lots of money serving consumers.
Google, which has been adding more sophisticated and data-intensive applications to its cache of products, has often garnered suspicion among mobile operators, as it moves further into the industry, not only with search applications but also with its focus on the Android mobile operating system and on hardware such as the Nexus One smartphone.
Google also raised eyebrows a couple of years ago, when it bid on wireless spectrum in the United States. Ultimately, Google did not win wireless-spectrum licenses and admitted, once the auction was over, that it had only bid so that the price of the wireless licenses reached a point where a special open-access provision was triggered in the rules.
Eyebrows were raised once again last week, when Google announced that it was going to launch an experimental fiber broadband network capable of delivering 1 gigabit of data per second.
But during the question-and-answer period of the presentation, Schmidt assured worried Mobile World Congress attendees that Google comes in peace. He said he disagreed with one audience member's assertion that Google is trying to make wireless operators "dumb pipe providers."
"We feel very strongly that we depend on the success of the carrier business," Schmidt said. "We need a sophisticated network for security and load balancing."
Schmidt explained that carriers' sophisticated billing relationship with customers is key. He also emphasized that carriers would offer support and education, serving as a basic platform for mobile services. Google will also serve customers, he said, but it will rely on customers sharing their information with Google to get better search results, more accurate location data, and more relevant applications.
Another point the Google CEO tried to get across: Google is not looking to compete with wireless operators.
"We are not going to be investing in broad-scale (communications) infrastructure," he said, adding that Google's fiber network trial and the company's investment in WiMax 4G wireless provider Clearwire are designed to help advance high-speed networks.
He also addressed concerns that Google is trying to limit how operators can manage their networks through its efforts to lobby for Net neutrality regulation in the United States.
"We understand at a fundamental level [that] wireless networks have constraints," he said. He went onto explain that wireless operators should not be choosing winners and losers when services are offered, and he conceded that current bandwidth constraints may require operators to move to a tiered pricing model.
"As people consume massive amounts of data, operators will be forced to tiered pricing to deal with the top 1 [percent] to 5 percent of users consuming 70 percent of the bandwidth," he said.
Even though Schmidt acknowledged that operators need to figure out ways to better manage their networks, he made it clear that they should not flatly deny network access to bandwidth-intensive applications. Instead, he said, operators need to find ways to accommodate user demand on their networks.
"We should embrace [changes in end-user behavior]," he said. "And we should figure out a way to make money from it together, instead of blocking it."

Digital ID beneficial for biz credibility

Consumers would be more willing to visit a corporate Web site and transact online if they trust the company, and this comfort level can be achieved by enterprises adopting digital identification (ID).
This is the sentiment of industry watchers such as Willy Lim, the co-founder of NetProfitQuest, a Singapore-based service training provider that does certification for social media marketing. Lim pointed out that digital IDs are "important for both individuals and companies to prove that they are who they say they are online".
While he acknowledged that such identification methods are still "in the early stages", this verification standard will gain more importance as more people transact online, "especially with the rapid adoption of smartphones like the iPhone".
This perspective is echoed by Thomas Crampton, leader of Ogilvy's Digital Influence team in Asia Pacific, who said: "Knowledge about who somebody is engenders a higher level of trust when taking part in a transaction or exchange."
He backed the assertion by pointing out that beyond Facebook and its Facebook Connect service, which allows users to log in to partner sites using a standardized password, others such as Twitter are following suit with features like verified accounts. This, he said, is becoming a trend.
Crampton responded in an e-mail that "an increasing number" of people are looking to manage their online profile, and this is indicative of the importance of having digital IDs.
However, not everyone agrees with this stance. IBM Singapore's country manager of software, Tan Jee Toon, said: "Our view is that sharing digital IDs makes sense within a community of interest, but is rarely relevant outside of that community. As such, we do not believe there will be a universal ID that can be used for all transactions on the Web."
He cited in his e-mail response the analogy of having a physical ID card that is issued by an international body for everyone, which should be trusted worldwide. Unfortunately, Tan said that this has not happened because each country has its own community of interest, which is why we have only nationally-recognized ID cards.
Digital IDs not safe enough?
Tan also noted that "user-centric digital IDs" are currently used only for accessing consumer online sites. However, enterprises are concerned about protecting confidentiality, intellectual property and business transactions. With so much of their business value at risk, organizations will demand a higher level of identity assurance, including "identity proofing and stronger authentication", which digital IDs may not provide.
He added that the legal framework does not "easily support arbitration" in cases where an enterprise is compromised due to a breach in a digital ID assigned by a third-party service provider; Tan also questioned whether the company or the provider should be liable.
"These technologies represent opportunities for the ID provider and its partners that share a community of interest. It allows them and their users to sign in once and go everywhere within the community, but that does not mean it is suitable for all enterprises," he said.
However, this was disputed by JanRain, a turnkey provider of digital IDs, which stated that though the adoption of digital IDs within the enterprise space "has not picked up as quickly as on the open Web", companies such as SAP are adopting OpenID, the decentralized authentication protocol JanRain uses to let users sign in to Web accounts.
The spokesperson for the company said: "By storing password information at a centralized source such as Google, Facebook or Yahoo, user information is protected by constant security audits and enhancement. The alternative is sharing a password with disparate sites that most likely will not maintain the same high level of security."
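As a rough illustration of how this kind of delegated login works, here is a minimal Python sketch of the OpenID 2.0 "checkid_setup" redirect a relying-party site would build once it has discovered the user's provider endpoint. The endpoint and URLs are hypothetical placeholders, not JanRain's or any vendor's actual integration.

    from urllib.parse import urlencode

    OP_ENDPOINT = "https://provider.example.com/openid"  # hypothetical provider

    def checkid_setup_url(claimed_id: str, return_to: str, realm: str) -> str:
        """Build the browser redirect that asks the provider to authenticate the user."""
        params = {
            "openid.ns": "http://specs.openid.net/auth/2.0",
            "openid.mode": "checkid_setup",
            "openid.claimed_id": claimed_id,
            "openid.identity": claimed_id,
            "openid.return_to": return_to,  # where the provider sends the user back
            "openid.realm": realm,          # the site the user is asked to trust
        }
        return OP_ENDPOINT + "?" + urlencode(params)

    print(checkid_setup_url("https://alice.example.org/",
                            "https://shop.example.com/openid/return",
                            "https://shop.example.com/"))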
IDs won't replace company Web sites, blogs
When asked if digital IDs will take over corporate Web sites or blogs as the main tool for engaging their online audience, Ogilvy's Crampton thinks not.
"Blogs and company Web sites will serve as the first filter for finding businesses, but verifiable digital identity will help people make the final preference between two similar offers," he said. "Digital IDs will grow alongside [these platforms]."
NetProfitQuest's Lim agreed, saying that digital IDs do not give further insight into what a company is doing, its corporate culture, or its broader objectives.
"Such understanding can only come from the company’s blog posting, social media conversations and its engagement with its online communities," Lim said.

PayPal explains Indian service suspension

PayPal has confirmed that its recent service suspension involving local bank transfers and personal payments to and from India was the result of the country's revised licensing rules.
In a blog post Tuesday, Anuj Nayar, PayPal's director of global communications, explained that the latest incident in India was a response to enquiries posed by local regulators, specifically on whether personal payments constitute remittance into India.
According to Nayar, who also posted a comment on ZDNet Asia's report on the issue, personal payments to and from India will be suspended for at least a few months until the company resolves questions from the regulators. Meanwhile, personal payment senders will need to find another payment method, he added.
Local bank withdrawals should be available to customers within the next few days, he said. PayPal will restore money into the accounts of Indian customers who have recently initiated withdrawals and these users will receive reimbursement for any withdrawal fee charges.
Customers with a negative balance due to PayPal's reversal of payments should contact the sender, and arrange for the payment to be resent if the amount was transacted for goods or services, Nayar explained. This can be done by selecting the "Send Money" tab and selecting "Purchase", he said.
Nayar added that only personal payments should have been reversed. Customers who believe their payments were mistakenly reversed should request for the payment to be sent again. Customers who still face problems can contact PayPal customer support.
The service suspension had evoked a myriad of comments from ZDNet Asia readers, several of whom expressed anger over how the service was terminated without notification.

Wireless@SG goes places with new engine

Singapore's government-funded free Wi-Fi service, Wireless@SG, has opened a location-based engine to allow developers to build location-aware apps and services over the network.
At a press briefing here Thursday, the Infocomm Development Authority of Singapore (IDA) demonstrated the new Centralised Location-based Engine (CLBE), aimed at enabling a slew of new services such as advertising, friend-finding and searching for nearby retail outlets.
This is made available today through a new process it calls Seamless and Secure Access (SSA), where users need to download an authentication application from the IDA to get connected to hotspots marked "Wireless@SGx". This differentiates the hotspots from the traditional Wireless@SG hotspots, which require users to go to a Webpage in order to sign in.
Wireless@SGx hotspots will allow automatic logins through mobile devices and PCs, and also link user profiles to the CLBE. This allows location-aware apps and services that can call up the CLBE database to push alerts and advertisements to users.
Users will not be able to opt out from receiving these ads but can choose to continue using the traditional Wireless@SG hotspots for connectivity, without the additional apps and services, according to operators present at the briefing.
iCell Network CEO Ken Chua said the company has been able to maintain a profitable service through advertising, and is looking to the new platform to increase advertising revenues as well as earn additional revenues from services provided atop Wireless@SG.
For example, Chua said, Wireless@SG may take a cut from a cashless transaction or taxi booking through the SSA.
Developers wanted for location-based apps
Alex Tan, product development head at MobileOne (M1) Connect, said Wireless@SG operators are looking to court developers to build third-party apps for their individual dashboards.
The network's three operators--iCell Network, M1 and SingTel--offer their own versions of the SSA installer, which will come with app dashboards carrying app selections unique to each operator. Users are allowed to use any of the three dashboards--or all of them, if they so choose.
As a result, Tan said the operators are looking to attract developers in order to differentiate their service offerings.
M1 has also been tasked to host and manage the CLBE, which it will make available to developers via APIs (application programming interfaces). According to M1's Web site, interested parties should apply to M1 and will be charged S$200 (US$141) per month for access to the CLBE.
Tan said the mobile operator is anticipating heavy usage to the tune of "hundreds of thousands" of API calls to the CLBE database, and has made the necessary capacity provisions on the backend.
He added that developer-submitted apps will first be screened by operators before going through a final approval process with the IDA for inclusion in the app dashboards.
iCell's Chua said the operator receives 10,000 new sign-ups every month, with approximately 50 percent of users accessing its network through mobile devices.
Wireless@SG's speeds were upgraded from 512Kbps to 1Mbps in September last year. In addition, the IDA last June committed S$9 million (US$6.3 million) to the wireless network over four years, pledging to provide the wireless broadband service for free until March 2013.
Launched in 2006, Wireless@SG encompasses 7,500 hotspots located across the island-state and supports some 1.5 million users, as of December 2009.
READ MORE - Wireless@SG goes places with new engine

Five tips for tackling a one-time project

Don't let a one-time project derail your career. An IT consultant shares tips on how to successfully manage a "once-in-a-career" event.

The CIO informs you that you've been selected to lead a special project that will impact your IT organization. When the CIO reveals the focus of the special project, you realize that you have no knowledge or experience related to the project. Everyone is watching, and many of your colleagues are not on board with the new direction that your project will take your IT organization.
To avoid derailing your career, here are five ways to tackle the project.
1. Make the project a priority from the start.
Even if you do not want to run the project, go ahead and accept the job; the faster you embrace the challenge, the faster you can make a realistic assessment of what you will need to do to be successful or to at least avoid failure.
Also, any time you spend resisting the assignment is time you will need to get over the learning curve of your new challenge. You have been given the job, and taking on the tough projects is one of the key hurdles to clear on the way to the CIO role and beyond.
Look at your planned professional activities in the immediate future and consider which activities you can delete from your calendar. You will need to delegate some noncritical activities to your managers and staff. Your staff may not like it, but everyone has to pitch in to navigate this specific career challenge.
Maximize the time you can commit to your project, because you'll need every minute.
2. Commit talented people to the project.
Managers and staff available in IT organizations to commit to special projects may not be high performers. They are available for a reason, and other managers are often willing to assign their least desirable staff to your special project.
You do not want a poorly staffed team for a career-critical project. You will need to make some hard decisions early on to free up some of your top talent to commit to the project. If you cannot control the project staffing, carefully interview each person assigned to the project to understand what he or she can contribute.
When a new project requires knowledge you do not currently possess, you will need to make managers and staff who work best in unstructured environments available for the project. While you will eventually need to carefully structure, plan, and execute your project, your initial project time will involve some level of unstructured learning. You need your team to be able to adjust quickly as you form a vision for the project and take this vision into tactical execution.
3. Get over the learning curve as fast as possible.
We all see our special projects as unique, but your project has likely already been executed at numerous companies, government agencies, nonprofits or universities. While the Internet provides access to a vast variety of information, it is also a data landfill where you can lose critical time sifting through mountains of raw information to get to the critical knowledge you need to succeed.
You should use your professional and personal contacts, as well as publicly available information, to cull lessons learned on how other organizations tackled similar projects. Social networking tools, such as LinkedIn, provide a forum where you can ask questions in specific communities to quickly gain knowledge.
For example, if your challenge is an IT service management (ITSM) implementation, there are a number of ITSM communities on LinkedIn where you can ask intelligent questions to narrow your search for relevant information. There is also excellent project information on the Web sites of a number of IT consultancies and industry trade groups. Tap professionals who have faced the same challenges you face and ask for input.
4. Plan the project work but adjust quickly.
You are committed to the project, you have gathered a good team, and you have a handle on some good lessons learned from similar projects executed at other organizations; the next step is to treat this project like any other project you have executed.
Develop a charter to define the project, support the charter with a detailed project plan with committed organizational resources, develop key milestones, and have a clear vision for what success would look like at project completion.
As you begin to execute the project, unexpected challenges will occur despite the initial project planning; be willing to quickly adjust to these challenges as you go. You'll gain knowledge as you progress in the project, and you'll probably realize that you made some faulty assumptions at project inception that need correcting. Adjust your project plan and keep moving.
5. Keep communicating.
I've seen many IT managers derail their once-in-a-career projects by failing to communicate with the various groups that are impacted by, or directly involved in, the project.
The CIO and other senior leaders do not want to be surprised by a project that has gone off track from the original project plan and is failing. Senior leadership should not learn of any challenges or potential failure from anyone other than you. Be proactive in communicating successes and challenges to the leaders of your IT organization.
You also need to keep IT managers and staff who are potentially impacted by your project informed. If this group will ultimately be responsible for executing the outcomes of your project, the earlier you engage them in the process, the better. Many project managers fail to keep up timely, focused communication while executing a project, and the project flounders in implementation.
Once-in-a-career events represent opportunity, as well as risk, to your career. Embrace the challenge, and you will give yourself the best opportunity for success!
READ MORE - Five tips for tackling a one-time project

Google's social side hopes to catch some Buzz

Google is determined not to be left behind by the social-media revolution.
The company wants to take what it does best--organizing Web content by relevancy--and apply it to social media, perhaps the most disorganized segment of the Web. Google Buzz is its most ambitious attempt to do just that, marrying the Gmail Web interface with status updates and media-sharing technology in an attempt to convince the social media addicts of the world to spend more time on Google's sites than on competitors like Facebook or Twitter, generating valuable data in the process.
"It has become a core belief of ours that organizing the social information on the Web is a Google-scale problem," said Todd Jackson, Gmail product manager, demonstrating Google Buzz at the company's headquarters a day before Tuesday's event. An astounding amount of social-media content is produced every day, across Facebook, Twitter, Flickr, YouTube, and personal blogs, and Google's faith that it could one day index and organize the entire Internet has been shaken by this explosion in Web content.
Somebody has to try, according to Google engineers. "A lot of the world's information is what's happening with my friends," said Bradley Horowitz, vice president of product management at the company. "We can't achieve (Google's) mission unless we solve these parts of that problem."
However, Google's engineers see not only an opportunity to unify the social Web and make things easier for users, but also a chance to erode Facebook's advantage in the reams of user data it has amassed behind closed walls that Google--and Google's advertisers--can't see. The only way they'll be able to do that is by creating a system that is as compelling and easy to use as Facebook.
Google is attempting to do this by taking Gmail, one of its more popular products, and integrating Buzz directly into the Gmail interface. Users can link their Twitter, Flickr, Picasa, and Google Reader accounts to their Buzz streams to see information produced by friends on those networks, as well as updates posted directly to the Buzz stream.
Google thinks it can build a competitive advantage in social media by focusing on relevancy and ranking within a social network. For example, Buzz users will be able to see all the content produced by those who they are following, but they'll also be able to see content produced by people they aren't following if their friends "liked" or commented on that content.
They'll also be able to train the ranking algorithm by clicking "Not interested" on these "recommended" status updates if they don't wish to see that particular type of update again. Google thinks users might see an advantage if they can lower the ranking of oft-repeated types of content--such as the what-I-had-for-breakfast update--without having to banish that friend's content from their feed.
The idea is to take the thinking behind core Google concepts such as PageRank and quality score and apply it to social media, and Buzz is an early example of that process at Google, Jackson said. Expect to see further updates, as Buzz fits right into Google's classic strategy of launching a product as soon as possible and making constant updates.
Getting Buzz when you're on the move
And on the mobile side of the world, where social media can be combined with location, Google wants to allow phone users to see a wealth of data about what's happening around them and get in on the location-aware services bandwagon.
Google Buzz for Mobile will essentially be a competitor to services like Foursquare and Gowalla, allowing users to "check in" by updating their Buzz status with a Google Maps link to their location. You'll be able to do this right from Google's mobile home page, and Google is also releasing a Web application for Google Buzz that will work on iPhones and Android phones.
And within Google Maps for Mobile, the company's improved mapping application, users will be able to see public Buzz content posted from mobile phones around their location, said Vic Gundotra, vice president of engineering at Google. That includes quick reviews of restaurants in the area, updates on traffic snarls further along the route, or anything else imaginable.
For those who haven't drunk from the Foursquare pitcher just yet, bear in mind that the location part of a Buzz status update is opt-in: you'll have to manually declare your location, and can post Buzz updates without having to share your exact whereabouts.
This brings up a key factor in how Google is pitching Buzz. In order to attract users, it has to offer enough privacy safeguards to allow them to live their online lives in a semi-private fashion. But it also wants to open up that data to the wider Web, where it can be analyzed and dissected to glean information about trends that advertisers demand.
One of the issues with a service like Facebook is that so much of its content is walled-off from search engines and the general public. That's nice for users, but bad for search engines and marketers, and so Facebook has gently tried to encourage its users to open up their profiles.
Buzz users can choose to make a new post public or private before publishing. Public messages are distributed to one's followers, but they are also posted to one's Google Profile, where they can be searched, indexed, and viewed by anyone. Private Buzz messages can be sent to an unlimited number of subgroups within one's follower list, separating work contacts from drinking buddies, family and groups of friends that don't travel in the same circles. That would appear to give enough cover to those who want to make their online lives semi-public, but also placate Google and its advertisers' hunger for data on how people are spending their time both online and offline.
Buzz will take some time to gather the momentum that other social media sites have enjoyed. For example, one key omission is the inability to update those external services from within the Buzz stream: you can't push a Buzz post to your Twitter feed, even though you can see what your Twitter contacts are doing in Buzz.
Google said it was working on that feature, but declined to say why it decided to leave that out at launch. Presumably, the company would prefer to build a network within Buzz that keeps those updates in house, at least at first. But those who have already established themselves as frequent Twitter users might not see a lot of value in a service that doesn't allow them to post to Twitter.
2010 is an important year for Google's social media strategy. The company has hired several veterans of the social Web to build out a new team, and executives promised a lot more to come with services like Buzz over the course of the year to erase the memories of Google as a social-media also-ran.
The problem, however, will be the increasing backlash Google is seeing from the general public over how much data the company already holds about their online habits. Will users want to take it a step further? If not, Google's social skills will have taken another hit.
READ MORE - Google's social side hopes to catch some Buzz

Indian IT market to regain 'normal' growth

The IT market in India is expected to grow at about 15.5 percent in 2010, returning to a "new normal rate" after last year's turbulent economic environment, according to a new report.
In a report titled "India IT Market Predictions 2010" released Monday, Manish Bahl, research manager at Springboard Research, said the growth would be spurred by organizations initiating new large-scale projects over the course of the year. These initiatives will require significant IT infrastructure-related investments, he noted.
Springboard Research estimates show that India's domestic IT market recorded an 11-percent year-on-year growth in 2009, down from the 14.1 percent it projected in February 2009.
Going forward, there is great interest in cost-cutting measures as CIOs in India focus on allocating around 75 percent of their budgets to "keep the lights on", added Bahl. The report pointed to a noticeable shift in enterprise IT spending from focusing on new investments to streamlining costs and improving internal efficiencies.
Reaching out beyond large enterprises to small and midsize businesses (SMBs) will be "key to success" for vendors of remote infrastructure management, desktop management and other managed services, Bahl noted. However, he added that vendors can expect to cater to "very cautious and increasingly skeptical set of IT buyers and prospects".
The report indicated that data center transformations, IT manageability advancements, software-as-a-service (SaaS) and virtualization investments are also helping companies drive "economies of scale" by reducing operational expenditures, both from the business and IT perspective, while increasing productivity.
The report also listed the top 10 trends that will shape the IT market in India this year:
1. Business enhancement with existing vendors and geographic expansion will drive strong focus from India's IT channel community.
2. Analytics and the advent of "intelligent solutions" will drive new business value.
3. Rural India IT solutions will make steady headway via public- and private-sector investments.
4. Business leaders will drive the proliferation of SaaS applications.
5. A wave of innovative new payment technologies will emerge.
6. Desktop virtualization to gain more acceptance in the enterprise.
7. Government projects will fuel smart card technologies.
8. Mobile social networking goes mainstream.
9. Convergence of computing platforms accelerates.
10. Online developer platforms and communities are the new ecosystem battleground and epicenter of application innovation.
READ MORE - Indian IT market to regain 'normal' growth

SAP leadership change just 'business reality'

The departure of SAP CEO Leo Apotheker is simply a reflection of "business reality and part of business life" and will not impact customer relationships, says the company's Asia-Pacific top exec.
Stephen Watts, SAP's new president for Asia-Pacific and Japan, told ZDNet Asia in an interview here Tuesday that Apotheker's resignation this week is not expected to affect customer confidence, and that he remains especially bullish about the region. The CEO left his post and his seat on the company's board with immediate effect after his contract was not renewed.
While Apotheker's tenure was stung by lower sales and earnings over the last year of the economic downturn, Watts said SAP has forecast a 4 percent to 8 percent recovery in revenues this year, with the Asia-Pacific region expected to fare closer to the 8 percent mark.
"Southeast Asia as a whole had the strongest quarter last year in the company's history," he said, pointing to China and India as "growth engines" that saw an average of 46 percent increase in the same quarter.
Watts said SAP's market share in the region is currently the "largest of the application software vendors", with 40 to 50 percent of the market in Southeast Asia. It has the largest share in Japan, he added, and is the largest multinational vendor in China--behind local provider Ufida.
Business intelligence pulling customers
According to Gartner figures released mid-2009, SAP's share of the North American ERP (enterprise resource planning) software market stood at 26 percent in 2008. For the BI (business intelligence) market segment, IDC ranked SAP's BusinessObjects global market share top, at 20.4 percent.
Of SAP's revenues in the Asia-Pacific region, ERP software accounts for half and BI contributes the remaining half, Watts said. "BI will be the fastest mover for us [in the region]."
He explained that the high expectations for BI stem from several areas: small companies that are beginning to streamline processes, as well as keen interest from the public sector, thanks to stimulus package spending by governments last year.
Utilities was the strongest growing vertical for SAP in the region last year, with governments looking to modernize operations with investments in billing, smart metering and process integration, said Watts. "The [utilities industry] provided a very material investment across the region," he added.
In an interview last year with ZDNet Asia, SAP said BI was providing an inroad to organizations for the vendor's main ERP product. Organizations were particularly interested in BI's quick returns and comparably smaller investment commitment, compared to embarking on a larger ERP implementation, SAP said.
Watts echoed these sentiments, noting that BI's "short and sharp" implementations help demonstrate a quicker, more predictable investment rollout, within a six-month span, for enterprise customers.
Beyond BI, companies want smaller, phased investments across their enterprise implementations, he added. "Boards want to see results faster. The expectations of technology are higher, and this creates a [higher] level of immediacy," he said.
Watts took over the role of SAP's Asia-Pacific and Japan president in January from his predecessor, Geraldine McBride. Prior to that, he served as the region's chief operating officer.
READ MORE - SAP leadership change just 'business reality'

The enduring cipher: Unbreakable for nearly 100 years

One cryptographic cipher has been mathematically proven to be unbreakable when it is used correctly, but it is only very rarely used. Chad Perrin breaks down the one-time pad cipher.

It seems to be a truism of cryptography that any cipher, no matter how strong it is considered to be in its heyday, eventually becomes a weak cipher.
What people fear when they use the current favorite strong cipher is that someone will crack it--will find a shortcut that greatly reduces the time required to use brute force calculation to decrypt something without the key.
Even if a cipher is never broken, and we forevermore need the same average number of CPU clock cycles to decrypt something encrypted with a given cipher without using the key, it still takes less time every few months to achieve the same goal than it did a few months before. This is because computers keep getting faster, allowing us to squeeze the same number of CPU clock cycles into a shorter period of time.
Ever-more complex and clever algorithms are designed to provide ever greater resistance to brute force cryptanalysis, and to replace older algorithms that have been broken or otherwise become obsolete. It is always an arms race--privacy against the attempt to penetrate that privacy.
Amongst all the wreckage of the broken and rusty ciphers that have fallen by the wayside through history, one cipher has endured for the last 93 years. It is called the one-time pad.
In 1917, Gilbert Vernam developed a cipher known as the Vernam Cipher, which used teletype technology with a paper tape key to encrypt and decrypt data. The result was a symmetric cipher that was quite strong for its time.
U.S. Army Captain Joseph Mauborgne realized that by using truly random keys, where no part of the key was repeated (except perhaps at random), the Vernam cipher could be made much stronger.
From the idea of using paper tape keys, a pad of paper with rows of random letters or numbers on each page as the means of recording keys was developed. Two copies of the same pad could be given to two people, and by using each character on each page only once (and destroying each page as its last character is used to encrypt or decrypt a message), they could pass encrypted messages between them without fear of an intercepted message ever being decrypted without the help of the key.
Because of the technique of distributing key stream data on pads of paper, this cipher became known as the one-time pad.
Claude Shannon, known as the father of information theory, mathematically proved the unbreakability of the one-time pad cipher when it is used properly--including destroying any pages containing used key data so that the key will never be reused, and so that unauthorized copies of any messages cannot later be decrypted if someone gets his hands on your used pages from the pad.
The same concept for key management can be employed digitally, of course, with the proviso that one must be very careful to avoid letting the inherent weaknesses of computers introduce flaws into the one-time pad system. For instance, expensive data recovery operations might be able to reconstruct "deleted" files, including used one-time pad data. There are things you can do to help obscure such data when simply deleting files is not enough, but one must be careful.
The one-time pad cipher can be extremely inconvenient at times, which is why it is not used more often. We do actually need theoretically weaker (if cleverer) ciphers, such as AES/Rijndael and Twofish, because of that inconvenience:
  • Because the one-time pad cipher is a symmetric cipher, both parties to an encrypted communication must have the exact same key data. For certain use cases, this makes a one-time pad completely useless: to securely exchange the key data so both parties have it, one must have a secure means of sharing data that would work just as well for sharing the eventual messages themselves. The one-time pad is useful only when you do not yet know what messages you will need to send, and when the secure means used to exchange the key data (such as physically handing it to the person) will not be available later.
  • A one-time pad encryption key must be as long as the message it is used to encrypt and decrypt. Thus, if you want to encrypt or decrypt a three gigabyte file, you need three gigabytes of one-time pad key data.
  • The same one-time pad data cannot be shared securely among more than two people. For example, in cases where different messages will be sent between some recipients of the key data, which should not be readable by other recipients, using the same one-time pad amongst all of them subverts the security of the cipher. By contrast, with an asymmetric cipher you can provide the same public key to dozens of people, and they will all be able to use that same public key to encrypt messages for you without any fear that the other people who have the public key will be able to read them--as long as the cipher is not broken and the state of the art of computing technology does not advance enough to reasonably brute force decrypt the messages. This is because when something is encrypted with the public key, only the associated private key can be used to decrypt it.
  • Reusing a key potentially breaks the security of the one-time pad cipher because it suffers a known-plaintext vulnerability. Kerckhoffs' principle states that a cryptosystem should be secure so long as the key remains secret, but where the encrypted and unencrypted (plaintext) versions of a given message are both known, the key can be derived in the case of the one-time pad cipher. This is not a problem if each string of key data is used only once, because if the plaintext is captured by the "enemy" you have already lost the game anyway. If a key is reused, however, one message's plaintext can be used as part of the set of tools used to determine the key for decrypting another message (a minimal sketch after this list illustrates the attack). The moral of the story is: Don't use a given one-time pad key more than once. There is a reason it is called a "one-time" pad.
  • When the last of your one-time pad is used up, you cannot securely send messages back and forth--encrypted using that same cipher--any longer unless you securely exchange new random key data. This kind of thing can really cramp your style when trying to communicate with someone on the other side of the world.
Other factors come into play in making use of the one-time pad cipher impractical in some circumstances, too, but these should give you a good start on seeing why other, theoretically weaker ciphers are still important.
The way the one-time pad works is deceptively simple. It involves merely comparing each of two datums, such as two letters or numbers, and using that comparison to produce a new datum. This is done for every such datum in the message you want to encrypt. The process of performing this comparison is simple and easy, one datum at a time, and (relatively) computationally cheap. A simple operator known as the XOR operator can be used to perform such a comparison. In its simplest form, the XOR operator as applied to binary numbers works something like this:
  1. First, you need a message. Let's use the word "short" as our message. Yes, that is a short message.
  2. Next, you translate that message into a binary representation. Using ASCII encoding to translate the word "short", you end up with the following string of ones and zeros:
    0111001101101000011011110111001001110100
  3. The last thing you need is a key that is exactly as long as the message. In this case, let's use this string of ones and zeros as our key:
    0110010101101010001110010010011101100100
  4. Finally, with the XOR operator, you basically perform a simple subtraction. Where the first character in each case is a zero, you see that 0 - 0 = 0. Similarly, the second character in each is a one, and 1 - 1 = 0. Where one character is a one and the other a zero, though, you get either a 1 or a -1 result. If you take the absolute value of the result, that means that both will give you a 1 result. Thus, subtracting and taking the absolute value provides the following:
    0111001101101000011011110111001001110100
    0110010101101010001110010010011101100100
    ----------------------------------------
    0001011000000010010101100101010100010000
Of course, there are no ASCII translations for some of those groups of eight binary characters in the resulting string of ones and zeros, so it is difficult to represent the data in a concise form. Using ASCII encoding is well-suited to computer use, but the traditional number-and-letter approach to implementing the one-time pad cipher is much better suited to analog, by-hand translations.
The core algorithm for the one-time pad cipher is obviously incredibly simple. It is only in designing the rest of the software that surrounds this algorithm, and finding the right use case for the cipher, that the real problems of secure cryptographic software development arise. On the other hand, if it absolutely, positively has to be securely encrypted, the one-time pad is the only cipher that is provably unbreakable when used properly--given our current understanding of mathematics.
READ MORE - The enduring cipher: Unbreakable for nearly 100 years