Gasping for air, Nortel to sell off wireless tech

Nokia Siemens Networks will buy Nortel Networks' wireless technology business for US$650 million.
Struggling Nortel, a one-time giant in telecommunications equipment, had filed for Chapter 11 bankruptcy in January in hopes of reorganizing. But that is unlikely now.
Nokia Siemens said Friday that it will use Nortel's CDMA and LTE technology to expand its presence in North America. CDMA, or code division multiple access, is one of the two major wireless network standards used in the United States, where it powers the Verizon Wireless and Sprint networks. LTE, or Long Term Evolution, is a 4G wireless technology that could eventually replace today's mobile networks.
"This agreement provides an important strategic opportunity for Nokia Siemens Networks to strengthen its position in two key areas, North America and LTE, at a price that makes good economic sense," Nokia Siemens CEO Simon Beresford-Wylie said in the statement.
As part of the deal, about 2,500 Nortel employees in the United States, Canada, Mexico and China can keep their jobs. Nortel said this represents a "significant portion" of the workers associated with that part of its business.
"Maximizing the value of our businesses in the face of a consolidating global market has been our most critical priority. We have determined the best way to do this is to find buyers for our businesses who can carry Nortel innovation forward, while preserving employment to the greatest extent possible," Nortel CEO Mike Zafirovski said in a statement.
Toronto-based Nortel also said that it is working toward selling off the other parts of its business and that it is applying to be delisted from the Toronto Stock Exchange.
The deal with Nokia Siemens, which is expected to close in the third quarter, must be approved by both U.S. Bankruptcy Court and the Ontario Superior Court of Justice.
Nortel was founded in 1895 as Northern Electric and Manufacturing and supplied telecommunications gear for Canada's young telephone system. At the height of its glory days about 10 years ago, Nortel was worth US$250 billion and had more than 90,000 employees.

Telcos missed IPTV 'window of opportunity'

Telecommunications operators have missed their "window of opportunity" to dominate the IPTV (Internet Protocol television) space, according to an Irdeto executive.
Telcos failed to exploit their chance two to three years ago to deploy IPTV most effectively, and have now been squeezed out of the content-delivery game by broadcasters, said Thierry Raymaekers, Asia-Pacific managing director of business development at Irdeto, a digital content management provider.
With plenty of competing content on cable channels and free-to-air satellite TV, telcos are finding it difficult to make IPTV a compelling offering for customers, Raymaekers explained in an interview Thursday with ZDNet Asia at imbX 2009.
Furthermore, with broadcasters now eyeing IPTV opportunities, this places more pressure on telcos because broadcasters are in a better position to combine IPTV's functions with the content they already offer, he noted.
Raymaekers said: "Telcos have not mastered [the practice of] thinking like a broadcaster. Telcos are still thinking about flow, bandwidth, and bits and bytes, and not thinking about how to appeal to customers."
Content delivery is a "different world" to telcos, which tend to see a movie as "1.5 hours of data", he said. Broadcasters, on the other hand, understand that a blockbuster carries higher value than a regular movie.
Raymaekers pointed to the success Swedish telco, TeliaSonera, had with its IPTV rollout. The operator entered the market at a time when Sweden had only two free-to-air broadcast channels. TeliaSonera's IPTV offering introduced some 30 channels to customers, immediately filling a void in the market, he said.
According to reports, TeliaSonera added 27,000 subscribers in the fourth quarter of 2008, bringing its subscriber count to 477,000.
Raymaekers noted, however, that good paid-TV content is already available in the Asian market, making it hard for telcos to find an entry point.
In Singapore, for example, SingTel's mio IPTV offering has little more to offer than StarHub's more mature cable TV service. The latter also has more exclusive content, making it an easier choice for consumers, he said.
Telcos typically have a tougher time negotiating for exclusive content rights, because content owners are more likely to sign with the broadcaster that has the biggest market share. This is typically the incumbent, not the telco, Raymaekers noted.
Unique services driving success
The good news is that the game is not over yet for telcos.
According to Raymaekers, telcos hold the advantage with their ability to offer bundled triple-play services. For example, a telco can offer discounted packages for subscribers that bundle mobile phone, home Internet and IPTV services.
To further build on their triple-play model, telcos can gain leverage over traditional broadcasters by securing exclusive rights with content owners to broadcast content across different media. This would allow telcos, for example, to offer IPTV subscribers the option of watching the same content for free on their mobiles, said Raymaekers.
The ability to offer customers what broadcasters cannot has helped drive the success of the IPTV business at Chunghwa Telecom, Taiwan's largest mobile phone operator.
In his presentation Friday at imbX, Huang Tzu-Han, president of Chunghwa Telecom's Northern business group, said the telco competes with cable players in both voice and data. Citing Chunghwa's "Karaoke on demand" service as an example, Huang said offering unique services has proven successful with the telco's target audience.
Its broadcast of the Beijing Olympic Games last year also pushed its subscriber count. Huang said: "Quality content is a growth accelerator."
Chunghwa saw its IPTV subscriber base climb 71.5 percent to 676,000 in December 2008, from 394,000 a year earlier.

Asia's 'bright' HPC future will take time

Despite a declining share in the Top500 supercomputer list, Asia's high-performance computing (HPC) future remains "bright", according to industry watchers in the region. To catch up with its Western counterparts, however, the region will require "some time".
Dennis Ang, Hewlett-Packard's Asia-Pacific and Japan general manager for HPC, noted that supercomputing or HPC in the region "has made continuous progress". Adoption across the region, he said in an e-mail interview, is rather broad-based.
"While we have seen higher adoption across high-end research institutes and universities in the past, there has been a change in the last couple of years with more uptake and high investments in HPC infrastructure from vertical industries," Ang explained, adding that the automotive and aerospace sectors were among the drivers of the new phenomenon.
"Over the next 12 to 18 months, we expect to see growth coming from companies that engage in Web hosting or building up cloud computing capabilities--both at a national level as well as for private and hybrid clouds," added Ang. "A successful example of commercializing HPC using cloud technologies is the Singapore-based Alatum [grid computing] service provided by SingTel using HP BladeSystem…the Alatum infrastructure [is] expected to list on the [upcoming] June Top500 [list]."
Concurring, Adesh Gupta, server platform marketing manager at Intel Asia-Pacific, said Asia "continues to grow in prominence" in terms of deployment of large high-performance clusters. In the recent past, there have been deployments of over 1,000 nodes in India, Australia and China.
"We continue to see future projects in the pipeline that would offer hundreds of teraflops of computing capability in a cluster," he added in an e-mail. "As [of] November 2008, there were 50 Asia-based supercomputers in the Top500 list [with] over 70 percent of the supercomputers [powered by] Intel architecture."
What was interesting, Gupta pointed out, was that 39 of the 50 supercomputer deployments in Asia were in 2008. "We are seeing [an] increase in HPC resources--capital, human talent--being deployed in mature countries and emerging economic superpowers in Asia."
Andréas Rydén, HPC sales leader at IBM's Systems & Technology Group, told ZDNet Asia there has been progress in the Asia-Pacific HPC landscape and "a steady growth in capacity of systems within the Asian agencies", with many investing in expertise and infrastructure for deeper research.
According to him, Asian countries' past investments have mainly been in smaller HPC systems, with a focus on developing expertise in applications, software and the ecosystem. "Big" HPC investments, he admitted, are still mainly government-related.
"Many countries in this region have national programs that need large capacity HPC infrastructure," said Rydén. "The future looks bright for supercomputing systems in the region."
HPC a victim of economic downturn
Meanwhile, the financial crisis has dampened the rollout of HPC in the region. Intel's Gupta reported that the economic downturn has delayed some projects in the enterprise sector, but sectors such as life sciences, universities and government continue to see deployment of clusters ranging from eight to 64 nodes, to larger scale rollouts of at least 1,000 nodes.
HP's Ang also acknowledged there has been "a bit of a slowdown" in implementations across the region due to the current economic situation. "This is particularly not surprising for large manufacturing-based countries like Japan and Korea which have been more affected by the global economic downturn than some others."
Japan's next-generation supercomputer, for instance, encountered a hiccup last month, when NEC announced it would withdraw from the manufacturing phase of the project.
A check with Riken, the agency tasked to spearhead the development of Japan's next-generation supercomputer, revealed the agency is still gearing up to roll out the initiative by 2012. "We are reviewing the plan so that we can achieve the initial target without NEC," Mitsuo Yokokawa, system development team leader of the Riken Next-Generation Supercomputer R&D Center, said in an e-mail.
An NEC spokesperson noted in an e-mail that the company was still committed to supercomputing projects within Japan. Its recently upgraded Earth Simulator recorded "the world['s] highest efficiency of 93.38 percent on the Linpack with a 122.4 teraflops Linpack score", bettering the best performer on the November 2008 Top500 list.
Asia's declining share in Top500 cause for concern?
In contrast to the picture of positive development, the number of Asia-based supercomputers in the Top500 list--a recognized ranking for global supercomputers--has declined in the last two years.
In November 2006, there were 79 systems from Asia on the list. That dropped to 72 in June 2007, and further down to 58 in November 2007. Last November, the number of machines from the region stood at 47, down marginally from 48 in June 2008.
While the Top500 list indeed is a good indicator of where the best systems in the world are based, it is "not 100 percent accurate" as there is an option for companies not to be listed, pointed out HP's Ang.
Intel's Gupta added that instead of focusing on the number of systems in the Top500, the industry should look at improvement in computing capability demonstrated by these Asia-based clusters. "If we look at the aggregate computing power for the deployments in 2006 [based on the] November 2006 listing, it is over 350 teraflops.
"The aggregate computing power for deployments in 2008 [based on the] November 2008 listing, is over 2,000 teraflops," he noted. "Clearly, 2008 saw an increase of [more than] six times over 2006 deployments."
Gupta continued: "Asian countries are putting in more resources to develop powerful supercomputers to solve complex problems. Mature economies such as Australia, Korea and Japan and emerging superpowers India and China would continue to invest in very large node clusters.
"All these nations can potentially compete with the top supercomputing countries in the world today."
Asia plays catch-up to Western regions
Still, it will be a while before Asia can dethrone its more illustrious supercomputing counterparts in the United States and Europe.
"HPC develops in parallel with the maturity of the market, especially in terms of the understanding of HPC and how to build [such clusters], skill sets, existing infrastructures, and even space constraints," HP's Ang said. "Asia still has a ways to go."
According to the NEC spokesperson, it will still take the region "some time to catch up" with North America and Europe as the landscape is fast-moving.
Intel's Gupta added more HPC resources are being assigned in universities, government institutions and public laboratories. "India and China have created national agendas for better research capabilities to solve complex problems associated with weather modeling, oil exploration, chemical and molecular research, biosciences, etc.
"It will take a few years for the supercomputers in India and China to catch up with their U.S. counterparts, but we feel that in the not too distant future, clusters delivering petaflops of computing capability would emerge in India and China."
IBM's Rydén noted that investments in areas including basic applied research and nanotechnology will help bridge the gap with global counterparts.
"China has a vision of becoming an innovation-based economy with more impetus on services," he said. "There are ambitious plans in the making of petascale computing systems to enhance research in several research areas such as nanotechnology, life sciences and weather."
Rydén added: "With the larger national agenda and focus on research in many Asian countries, we will see a growth in large systems in this part of the region. Growth would be across the region with particular impetus from China, Taiwan, Australia and India."
Will the next Top500 list, expected to be released next week, reveal winners or losers in Asian supercomputing? Only time will tell.

Build governance into e-govt, experts urge

Governance and collaboration are key elements of successful e-government frameworks, various speakers stressed at the inaugural iGov Global Exchange 2009 on Monday.
Delivering the conference keynote, Singapore Senior Minister Goh Chok Tong said the country's experience has shown that good governance--more than technology and the application of IT--serves as the foundation for successful e-government.
"It is tempting to think that to successfully implement e-government, the 'e' in e-government comes first. Too often, the marketing allure of new technologies blindsides us into thinking that all we need is to roll out new hardware and software," he noted. "Putting a PC on the desk does not in itself raise efficiency; rolling out broadband does not, by itself, lead to higher productivity."
Governance in e-government, according to the minister, required three elements:
  • transparency, accountability and incorruptibility--government leaders have to set an example, and "every credible allegation of corruption" needs to be looked into. In public sector procurement, for instance, open competitive tenders are the norm, so much so that "many suppliers and vendors regard winning a project in an open competitive bid in Singapore as a valuable endorsement of their product", said Goh.
  • continuous regulatory review--administrations need to constantly relook rules and regulations to do away with obsolete ones, as well as re-engineer delivery processes. Without continuous regulatory reviews, the government would simply be importing inefficiencies into an electronic system.
  • working as one--breaking down silos in the public service is a necessary ingredient, as a silo mentality can create problems for investors and businessmen, noted Goh. Likewise, civic-minded individuals may also be discouraged from providing feedback if they are given the run-around.
Singapore: From integrated to collaborative
Retracing the island-state's steps toward e-government, Lim Hup Seng, deputy secretary (performance) at Singapore's Ministry of Finance, pointed out that Singapore has evolved from a computerized government to an online government to its current phase of becoming an integrated government. To date, Singapore has some 1,600 e-services, or 98 percent of all services deemed feasible to deliver electronically. Some 300 of these e-services are also optimized for mobile devices, and the mobile platform handles about 3,000 transactions per month.
The next step, he said, is to become a "collaborative government", in which the administration taps the network of the public, private and people sectors to identify gaps in services and bridge them. That, essentially, constitutes a shift from "government to you" to "government with you". During a panel discussion, Lim observed that this could mean the "form of government may well have to change", with the emphasis moving to more strategic issues such as diplomacy and defense instead of minding the communities.
Good governance, Goh added, also enables the Singapore administration "to improve the effectiveness of government". For example, the government uses e-government systems to identify lower-income residents and provide them bigger cash payouts, which are channeled automatically because government databases are linked to banking systems.
Going forward, Goh said the role of the government "must evolve from being the sole provider of public services, to providing an open IT platform to nurture an ecosystem of IT services". In the new ecosystem, participants will be able to innovate freely and create value-added services on top of, or even superseding, existing public services.
E-government challenges
Qian Haiyan, division director for public administration and development management at the United Nations Department of Economic and Social Affairs, said the government of the future is one that understands and correctly exploits e-government strategies and tools. From a UN perspective, e-government stands at a crossroads in many parts of the world, she pointed out. Many administrations acquire technologies that end up redundant because they did not fully understand or implement them.
Echoing Goh's observation, Qian said countries sometimes equate e-government with having the best infrastructure and the most advanced technology. Many also run into an "integration loophole" when implementing e-government, she added. Successful implementation requires connecting online platforms with offline processes, as well as government partnerships with the private sector to pursue targeted marketing in educating citizens. At the heart of e-government progress is a need for cultural shifts, and for those shifts to be managed carefully.
Ken Cochrane, CIO of the Government of Canada between 2006 and 2008, said his country's experience of moving the public sector to a shared-services model showed that changing user mindsets was often the most challenging part of the process. "Technology was the least of our worries; the biggest challenge was trying to convince departments to use our shared services," said Cochrane, currently managing partner of SSG Southside Solutions Group.
Canada's e-government experience began in the 1990s, he shared, when it was merely about having an online presence. Two other phases--"i-Gov" in 2003 and "Gov 2.0" in 2007--followed, where the emphasis was on making internal processes and systems more efficient, and on modernizing the delivery of services and widening their scope, respectively.
Even with "solid agreement" among CIOs at various government levels, there continued to be cultural issues, which mostly arose from the middle management ranks, he noted. "A lot of the cultural issues need to be dealt with…by starting small…showing what can be done."
Experts at the conference agreed that e-government is a continuous journey, one with "no clear beginning" and "no clear end", as the Ministry of Finance's Lim summed up.

Five factors stopping the cloud taking off

Whatever the marketing people may say, there is little evidence to suggest organizations are migrating many, if any, of their existing business-critical systems to the cloud.
Beyond the deployment of a few commodity offerings, such as e-mail and collaboration, and the development of often-disposable in-house applications, most enterprise adopters are using the pay-as-you-go delivery model as a means of introducing new services without dipping into tightening capital expenditure budgets.
However, there is a great deal of interest from start-ups with little money to spend upfront, as well as from the independent software vendor community, which is looking for a platform on which to build new software-as-a-service (SaaS) applications to sell to customers.
Neil Ward-Dutton, a partner at analyst company Macehiter Ward-Dutton, said: "The majority of the market in the short term remains software vendors, which isn't surprising as there are a lot of inhibitors for enterprises."
Key challenges
The factors inhibiting wider take-up include technology-based concerns relating to security, reliability, network latency, integration and management, as well as worries about data and vendor lock-in.
However, another key challenge is cultural. "Many IT people will resist this, because if a big chunk of the data center migrates to the internet, it's their jobs you're talking about," said Ward-Dutton.
Moreover, while the decision to undertake more traditional outsourcing is typically made by the chief information officer in conjunction with the business, the mechanics of hosting a data center are perceived to be predominantly an IT issue, which makes it easier for managers to muddy the waters and introduce delaying tactics.
Nonetheless, in the medium to long term, it is likely to be a career-limiting move to turn one's back on the potential advantages that cloud computing has to offer in terms of cost-cutting and boosting staff productivity.
Because such services can act as a useful complement to on-premise systems, it makes sense for IT directors to take the time to understand and investigate them to see which might make the business more competitive.
So, while the "death of the IT department is greatly exaggerated", according to Tom Austin, head of software research at Gartner, he also believes "you can't ignore cloud". The key instead is finding a balance, rather than rushing in headlong, because, he said, "like everything, cloud has its place".
What this all means is that it is crucial for IT departments to take a measured approach to the subject. And an important place to start relates to migration issues.

'Spam king' could face criminal charges in Facebook case

In a move that could land Sanford Wallace in jail if convicted, a federal judge on Friday referred a lawsuit Facebook filed against the "spam king" to the U.S. Attorney's office for possible criminal proceedings.
A written ruling from Judge Jeremy Fogel in U.S. District Court in San Jose, Calif., is expected early next week, a court clerk said. The action came at a hearing on a Facebook motion that Wallace be found in criminal contempt for allegedly continuing to send spam on Facebook.
Facebook sued Wallace and two others in February, alleging they used phishing sites or other means to fraudulently gain access to Facebook accounts, which they then used to distribute phishing spam throughout the network.
The judge had earlier entered a preliminary injunction against Wallace for failing to appear in court for the original proceedings, said Sam O'Rourke, Facebook's lead counsel for litigation and intellectual property. Wallace appeared in court on Friday in what is believed to be his first court appearance in any of the cases filed against him, according to O'Rourke.
Facebook also had asked for a default judgment in the case, but the judge was prevented from taking action on that since Wallace filed for Chapter 11 bankruptcy protection on Thursday and civil actions seeking monetary sanctions are automatically stayed when a defendant files for bankruptcy, O'Rourke said. Facebook believes Wallace filed for bankruptcy to avoid a default judgment and criminal contempt order, he said.
Facebook plans to ask the bankruptcy court to lift the stay so a ruling can be made on the default judgment, which would make Facebook a creditor, O'Rourke said.
"We're very pleased Judge Jeremy Fogel agreed that there were grounds for criminal contempt and that the U.S. Attorney's office should investigate Wallace," Facebook said in an e-mail statement. "Wallace filed for bankruptcy, which is not unexpected and only delays our judgment temporarily. We will continue to pursue the judgment and will be reviewing his filing very closely."
The order should serve as a strong deterrent against spammers, Facebook said. "Fogel's ruling demonstrates that judges will enforce restraining orders and spammers who violate them face criminal prosecution," the statement said.
A year ago, Wallace and another defendant were ordered to pay MySpace.com $234 million following a trial at which Wallace repeatedly failed to turn over documents or even show up in court.
In the largest judgment in history for a case brought under the Can-Spam Act, the federal court in San Jose awarded Facebook $873 million in damages late last year against a Canadian man accused of spamming users of the site.

Adobe makes Acrobat.com a business with paid accounts

Adobe is taking Acrobat.com out of beta on Monday, and turning it into a business with paid user accounts.
The service, which has more than 5 million registered users, will retain its free version; however, there are now usage limits on certain features, which can be lifted by upgrading to one of two new premium plans. These can be purchased monthly or yearly: US$14.99 a month or US$149 a year for the basic plan, and US$39 a month or US$390 a year for the plus plan.
The "premium basic" plan allows for 10 PDF conversions per month, as well as up to five meeting participants through Adobe's ConnectNow tool. The "premium plus" plan dials that up to unlimited PDF conversions and meetings with up to 20 users. Both premium plans also gain phone and Web support. In comparison, free users will only be able to convert five PDFs and connect with two people at once in ConnectNow, which is just one fewer connection than users had during Acrobat.com's beta period.
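Taken at face value, the yearly prices work out to roughly a two-month discount over paying month to month. A quick illustrative calculation using the figures quoted above:

```python
# Back-of-the-envelope check on the new Acrobat.com pricing (figures as
# quoted above): how much annual billing saves versus paying month to month.
plans = {
    "premium basic": {"monthly": 14.99, "yearly": 149.00},
    "premium plus": {"monthly": 39.00, "yearly": 390.00},
}

for name, price in plans.items():
    billed_monthly = price["monthly"] * 12
    saving = billed_monthly - price["yearly"]
    print(f"{name}: ${billed_monthly:.2f}/yr billed monthly vs "
          f"${price['yearly']:.2f}/yr billed annually (save ${saving:.2f})")
```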
Along with the move to paid accounts, Acrobat.com is getting a new collaborative app called Tables that handles basic spreadsheets. Just like Buzzword, Adobe's online word processor, this lets multiple users work on a spreadsheet at once, as well as track revisions and roll back to earlier versions.
In a call with ZDNet Asia's sister site CNET News last week, Eric Larson, Adobe's director of product management and marketing for Acrobat.com, said that Tables is not quite ready to replace Microsoft's Excel, which is why it is being rolled out in Adobe's Acrobat Labs section first. Larson did stress, however, that it will allow users to do things Excel can't, like see where other people are on the document, and provide a subtle warning when users are making a visual change that will affect others.
Little things that users are used to doing in normal software, like changing column widths or sort order, yield a small warning message telling them to think twice if other people are working on the document at the same time. It also provides the option to switch to "private view", which lets users make edits without the changes going live to the main document. Adobe is hoping this type of workflow will cut down on the e-mail overload and versioning problems that typical office software creates.
Unlike Google Docs, which opens up to a sea of white cells, Tables opens with just three columns and five rows, which can be expanded one at a time. It's also incredibly responsive, letting you reorganize and snap around columns and individual cells as if you were using desktop software.
A wide range of formulas is included; however, there is not yet a way to reference individual cells, which may be a show-stopper for some financial applications. You can, however, reference entire columns.
Will this be enough to persuade users who were previously getting some of these features for free to pay up? Adobe seems to think so, and is still allowing users free and unlimited access to Buzzword, Presentations and now Tables. The big change in today's news is an expansion of the services that let users share what they have created with these free tools, either by converting files or by discussing them in the live meeting tools.
Adobe eventually hopes to take Acrobat.com beyond the browser and bring its AIR application up to parity, so that users will be able to use these same apps and access their work outside the browser. The company is also trying to give people access to these files and applications on mobile devices, where they will be able to make edits and even create new documents, although this isn't coming until later this year.
Following that, Adobe is working on an upgrade to its PDF technology that will notify collaborators when anyone else makes a change to a shared document, giving them the option to update to the newer version, similar to what happens when a developer releases an update to a software application.
With Acrobat.com, Adobe is coming a little late to a game that Google, Zoho and ThinkFree have been running for years, and that Microsoft is set to join very soon. What may make the difference is that Adobe can work these products very deeply into other pieces of its software. Whether that ends up being a liability compared to competing solutions that remain Web-only is unclear.

SAP in SaaS strategy shift

SAP has announced a major shift in its approach to software-as-a-service (SaaS), in a move intended to help it catch up to SaaS leaders such as Salesforce.com.
Under the new strategy, announced Wednesday at the OnDemand Europe conference in Amsterdam, SAP will adapt its Large Enterprise on Demand product to allow end-user organizations to mix traditional software with the online services.
Online services will be much more tightly integrated with SAP's core Business Suite product, a spokesman for the company told ZDNet Asia's sister site ZDNet UK.
The move is intended to allow customers to benefit from both the strengths of traditional software and the flexibility of online services, SAP said.
The new approach also aims to help businesses to avoid the integration problems that can arise when they use both in-house software and SaaS, according to SAP. These integration glitches can come when a company shares data between different platforms or can arise from the differences between the business processes used in various types of software.
Large Enterprise On Demand, which is SAP's SaaS product for enterprises, offers CRM (customer relationship management) and e-sourcing. Expense management capability is set for release in 2010.
Paul Daugherty, the chief technology architect at Accenture, said the news reflects the growing importance of SaaS.
"This news from SAP will add to the growing legitimacy of the SaaS model, which encourages the development of company-wide processes, making it easier for a company to focus on what differentiates it from the competition," Daugherty said in a statement.
SAP has faced problems getting into the SaaS market, notably with delays to its Business ByDesign product, which is targeted at small and medium-sized businesses.
In November, Salesforce.com's chief executive, Marc Benioff, accused SAP of failing to understand SaaS. He said the German company was doing a "terrible job" with Business ByDesign, and described the project as "a huge implosion", adding: "The way they're developing it is insane."
In May, SAP said it had signed up only about 80 customers to ByDesign.
READ MORE - SAP in SaaS strategy shift

New Linux kernel adds filesystem support

Developers have released the Linux kernel version 2.6.30, adding support for new file systems as well as performance improvements and new hardware drivers.
The Linux kernel is the core used by GNU/Linux operating system distributions from Red Hat, Novell SUSE and others. The new release was finalized last week, and was announced the next day in a mailing-list post from Linux creator Linus Torvalds.
The most prominent new features include support for two new filesystems, according to release notes published by Kernelnewbies, a group of Linux developers.
Support was added for the NILFS2 filesystem, a log-structured design still under development that is intended to be more resistant to crashes, and for POHMELFS (Parallel Optimized Host Message Exchange Layered File System), a high-performance distributed network filesystem.
The kernel also comes with updated support for other storage code, including EXOFS, a filesystem for object-based storage devices, and FS-Cache, a caching layer for network filesystems. Tweaks have also been made to improve filesystem performance generally, Kernelnewbies said.
Storage improvements include the addition of support for DST, a technology designed to simplify the creation of high-performance storage networks.
The kernel adds a feature contributed by Intel for speeding up the kernel's boot time by carrying out several steps of the boot process at once. "This feature speeds up the total kernel boot time significantly," Kernelnewbies wrote in their notes on the release.
Other changes include allowing the use of LZMA and Bzip2 compression of kernel images, so that they take up less space; and new or updated drivers that add support for additional hardware and hardware features.
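On architectures that support it, the image compression method is chosen at build time. A minimal sketch of the relevant `.config` fragment, using the option names introduced for this feature in 2.6.30 (availability varies by architecture):

```
# Selected under "General setup" -> "Kernel compression mode"
# in make menuconfig:
# CONFIG_KERNEL_GZIP is not set
# CONFIG_KERNEL_BZIP2 is not set
CONFIG_KERNEL_LZMA=y   # best compression ratio of the three, slower to build
```

Only one of the three options can be set; gzip remains the default for compatibility.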
A new architecture for putting hardware into suspend mode has also been put in place, according to Torvalds. "We're hopefully now done with the suspend/resume irq re-architecting, and have switched to a new world order," he wrote in the mailing-list post.
READ MORE - New Linux kernel adds filesystem support

HP RDX Removable Disk Backup System

READ MORE - HP RDX Removable Disk Backup System

Smartphone roulette for app makers

When Pyxis Mobile began selling business software for mobile phones in 2004, there were only a couple of major alternatives to choose from: Research In Motion's BlackBerry and devices running Microsoft's Windows Mobile software.
Today the market for sophisticated smartphones that can run advanced software is exploding. Besides RIM and Microsoft, the choices include Apple's iPhone, Nokia phones that run the Symbian operating system, Motorola phones that use Google's Android software, and Palm's new Pre, which hit the market June 6.
For the hundreds of software developers that create applications for the corporate market, this abundance of choice is a double-edged sword. While more people are using smartphones, developers can't afford to create different versions of their applications for every kind of phone.
If they choose to focus on the wrong device, they waste money and time. "It's one of the reasons my hairline is receding," laments Pyxis President Todd Christy.
At the same time, developers will have a major say in which companies dominate the mobile computing world: Corporate customers will gravitate to devices with the best applications as well as the best security.
Although these are still early days, RIM, Microsoft and Apple look like the top picks. Apple, traditionally strong with consumers but weak with business customers, added to the iPhone's corporate appeal when it announced new software Jun. 8.
The device is getting features specifically for business, including more secure communications and the ability to disable lost or stolen phones remotely.
Mobile masses
It's emblematic of a broader shift in the market. Most of the early mobile applications have been for consumers and are free or cost less than a buck. Now business-software developers are moving into the market in growing numbers, and they stand to make real money--up to thousands of dollars for programs that help run a company's operations.
They see opportunity as the number of mobile workers is expected to rise to nearly 1 billion in the next few years.
So far, RIM leads the corporate side of the smartphone market.
Sean Ryan of researcher IDC figures it sold 17.5 million BlackBerrys for business use last year, compared with 11.6 million Windows Mobile devices, 9.8 million phones running Symbian, and 3.9 million iPhones.
However, the iPhone has shaken up the market by creating demand for phones that are easy to use and pack a variety of applications.
MeLLmo, one business-software developer, is targeting the iPhone for its first product, RoamBi. The soon-to-be-launched software makes it easy to present data graphically on a compact smartphone screen. The company is evaluating the alternatives to the iPhone.
"The industry will have room for three or four at the most. I'm sure Apple and RIM will be there for the long term. The others, I'm not sure," said MeLLmo Chairman Santiago Becerra Sr.
NetSuite votes for iPhone
The iPhone is showing surprising strength among software companies that might be expected to target RIM first. NetSuite, a developer of financial and customer management software delivered via the Internet, chose the iPhone as the initial target for mobile software it is just now releasing.
"We see iPhone as groundbreaking technology," says NetSuite Chief Technology Officer Evan Goldberg.
To win over developers, the smartphone rivals are offering software tool kits that make it easier to build applications, helping with marketing, and creating online marketplaces modeled on Apple's App Store, where phone users can easily download applications.
RIM's BlackBerry App World, launched in April, offers only 1,000 applications. But RIM gives developers 80 percent of the revenue, instead of the 70 percent cut at Apple.
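To make the revenue-split difference concrete, here is a quick sketch using a hypothetical US$4.99 app (the 80 and 70 percent figures are from the article; the price is an assumption for illustration):

```python
# Developer payout per sale under each store's revenue split.
price = 4.99  # hypothetical app price in US$

app_world_payout = price * 0.80  # RIM's BlackBerry App World: 80% to developer
app_store_payout = price * 0.70  # Apple's App Store: 70% to developer

print(f"App World: ${app_world_payout:.2f} per sale")  # prints "App World: $3.99 per sale"
print(f"App Store: ${app_store_payout:.2f} per sale")  # prints "App Store: $3.49 per sale"
```

At these prices the difference is about 50 cents per sale, which adds up for high-volume apps.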
"These are serious applications for business," said RIM co-CEO Mike Lazaridis, referring to the fact that many applications on Apple's App Store are games or lightweight fare for consumers.
RIM designed the BlackBerry from the ground up as a business computing device. Corporate IT departments are focused on keeping their information secure and on controlling the use of computing devices, including smartphones.
They tend to standardize on BlackBerry, since RIM supplies technologies that address their security and control requirements.
Today, RIM. Tomorrow, Apple
RIM can't afford to be complacent, though. Coca-Cola Enterprises, which chose the BlackBerry for a recent mobile workforce project, is keeping its options open. "Today we partnered with RIM, but tomorrow it could be Apple or the Google Android program," said Esat Sezer, CCE's chief information officer.
For now, Pyxis Mobile's Christy is channeling most of his resources to BlackBerry apps, but he is hedging his bets. The company offers versions of its products for Windows Mobile and has iPhone software on the way.
"We don't see much weakening of the hold that BlackBerry has in the corporation," said Christy, "but it's a crazy market to predict."
READ MORE - Smartphone roulette for app makers

Airlines won't mandate online check-in

Most airlines have extended their services online to provide passengers added ease and accessibility. However, these carriers have no plans to remove traditional service channels, preferring to offer Web-based services as an additional channel for customers.

ZDNet Asia spoke with three airlines--AirAsia, British Airways and Singapore Airlines--all of which run portals that offer their passengers a range of online services, including flight reservation and Internet check-in. Passengers can also choose their seats and pre-order meals for their flights via the portal.

However, these carriers said they have no plans to make online check-in a compulsory procedure for travel, a move that was recently announced by a European budget airline.

Ryanair last month said it will remove all of its check-in counters from 146 airports and make Internet check-in compulsory for its passengers. Ryanair customers will also need to print their own boarding pass online, or face a US$64.50 fee if they fail to do so. In addition, passengers will have to fork out about US$8 for the online check-in service.

The three airlines said they will continue to operate physical check-in desks at airports, where online check-in will remain as an additional option for passengers. This, the carriers said, offers their passengers added "flexibility" and convenience.

According to a British Airways spokesperson, online services give passengers complete control of their flight details and provide "speedy" progress through airport procedures.

Nicholas Ionides, vice president of public affairs at Singapore Airlines, said online services facilitate shorter queues and faster counter check-ins at the airport.

Ionides explained in an e-mail interview that passengers with self-printed boarding passes will have to get these documents verified, and drop off their luggage, at designated counters in the airport. This reduces the processing time, compared to physical check-in counters, he added.

A spokesperson from AirAsia's communications department said in an e-mail: "We have always used technology to grow our business across all fronts. It has enabled us to offer cutting-edge products, and to continue to offer quality service to our constantly expanding passenger base."

The spokesperson added that over 80 percent of the airline's flights are booked via the Internet. The mobile platform has also allowed AirAsia to disseminate information to its customers, who can receive these updates via their handheld devices, he said.

Ryanair did not respond to ZDNet Asia's queries at press time.
READ MORE - Airlines won't mandate online check-in

Software liability law could divide open source

The world of open source development could be divided if the European Commission (EC) succeeds in passing a law extending consumer protection rules to software, according to experts.
The EC proposes software companies be held liable in the European Union (EU) for the security and efficacy of their products.
David Mitchell, senior vice president of IT Research at Ovum, thinks this may lead to a situation boosting current open source vendors' business models, but making it more difficult for independent developers to thrive.
The EC proposal is likely to push vendors to bundle support and maintenance agreements with each purchase, in order to help vendors fulfill their warranty obligations, said Mitchell.
This is already in line with the business models of current open source vendors such as Red Hat and Canonical, which sell support services. On the other hand, the "garage open source model" of independent developers who do not have the scale to guarantee their products at that level, will likely suffer, Mitchell said in an interview with ZDNet Asia.
Bryan Tan, director at Keystone Law Corporation, had predicted in an earlier blog post the "caving in" of open source software due to similar worries over liability on the parts of independent developers.
"Gone are the days where software could be written in a garage by two guys," Tan wrote.
Tan also told ZDNet Asia the proposed law would likely inflate prices for consumers outside the EU as well--including in the Asia-Pacific region--as a result of vendors' need to carry insurance. Furthermore, the "death" of some smaller vendors would reduce competition and push prices up further, he added.
While the EC has said the proposal is in the interest of consumers, Ovum's Mitchell thinks it will instead create a "huge amount of market uncertainty".
"Customers will find that their existing support and maintenance agreements are now ambiguous, or in contradiction with any new legislation," he said. Businesses would also have to undertake longer testing cycles, resulting in project delays, Mitchell added.
Realistically, liability will be hard to pinpoint because of the interdependency between hardware and software, Mitchell noted. The failure of a piece of software could be blamed on other installed software or on the hardware itself.
"[The legislation] promises to be a lawyer's dream [come true] but not to deliver any tangible benefit for the customers," he said.
However, Stanley Lai, partner at Allen & Gledhill, thinks consumers will benefit. While he agrees that software prices will likely go up, "it remains to be seen whether consumers will consider that the price to be paid in return for quality assurance is an adverse effect".
Lai also said it is "premature and over-simplistic" to predict the demise of open source software. He said with code open and more easily-corrected--the oft-quoted "many eyeballs" effect--users and consumers of open source software may be more likely to get errors fixed through the community and less likely to pursue direct recourse to liability under the proposed legislation.
READ MORE - Software liability law could divide open source

Cisco enters rack-mount server market

Cisco is to move into the rack-mount server business as part of its drive towards what it calls 'unified computing'.
Unified computing, says the company, involves combining network, compute and virtualization resources into a single system. Cisco's Unified Computing System (UCS) C-series rack-mount servers, announced on Wednesday, add a new element to a portfolio that already includes UCS blade servers.
Like the UCS blade servers, which were announced in March, the C-series rack-mount servers are based on Intel's Xeon 5500 Nehalem chips.
Cisco's UCS range is the company's first foray into the server hardware business. The company is better known for its networking products.
Also last week, Cisco announced two new individual IT certifications for customers and channel partners: the Cisco Data Center Unified Computing Design Specialist and the Cisco Data Center Unified Computing Support Specialist.
The certifications cover skills such as storage networking, data center networking infrastructure, data center application services and virtualization.
Cisco's UCS C-series rack-mount servers will be available in the fourth quarter of this year.
READ MORE - Cisco enters rack-mount server market

Bing off to solid start, but not that good

Microsoft's new Bing search engine has received positive reviews in its first week on the planet, but did that early buzz translate into traffic?
A report from Statcounter picked up by TechCrunch suggested that Bing's debut was successful enough to eclipse Yahoo Search during its first week, but subsequent analysis from Search Engine Land as well as data from CNET's network of sites suggests otherwise.
Statcounter, a Web traffic tracking company, reported that as of Thursday, Jun. 4, Bing accounted for 16.28 percent of the U.S. search market, surpassing Yahoo's 10.22 percent just days after going live on Monday. Worldwide, Bing's advantage was said to be slimmer (5.62 percent to Yahoo's 5.13 percent), but that was enough for Statcounter to proclaim "Bing overtakes Yahoo!"
However, it's not quite that simple. Statcounter's data is "based on aggregate data collected by Statcounter on a sample exceeding 4 billion page views per month collected from across the Statcounter network of more than 3 million Web sites. Stats are updated and made available every 4 hours, however are subject to quality assurance testing and revision for 7 days from publication", according to the company.
Therefore, it will be interesting to see if those numbers change next week. Search Engine Land checked in with Comscore, Nielsen, and Hitwise, and found that over the past week in the United States, Yahoo Search drew about three times more traffic than Bing, roughly the same ratio as the week before, when Microsoft-branded search consisted of Live Search and MSN Search.
Nielsen figures show that there was indeed a surge in interest among U.S. Web surfers related to Bing on Jun. 1, the first day it went live. But that's not all that surprising given natural curiosity surrounding something new and shiny, and Bing's surge appeared to neatly replace the corresponding drop-off in traffic to Live Search and MSN Search.
CNET data suggests a similar story. For the first four days that Bing was live, the new search engine accounted for 2.2 percent of all session starts across the various CNET sites, including News, Reviews, Download, CNET TV, and CNET Shopper. Yahoo searches accounted for a little more than twice as many session starts, or 4.5 percent. Google, of course, was responsible for the rest. Bing did better than Yahoo on some sites, but worse on others.
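As a rough illustration of what those session-start figures imply, the ratio can be computed directly (the percentages come from the CNET data above; the code itself is just illustrative arithmetic):

```python
# Share of session starts across CNET sites attributed to each engine
# during Bing's first four days, as reported above.
shares = {"bing": 2.2, "yahoo": 4.5}  # percent of all session starts

# Yahoo's traffic relative to Bing's on this measure
ratio = shares["yahoo"] / shares["bing"]
print(f"Yahoo drove about {ratio:.1f}x as many session starts as Bing")
# prints "Yahoo drove about 2.0x as many session starts as Bing"
```

That is consistent with the "a little more than twice as many" characterization above.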
Measuring Internet market share is notoriously tricky, and five different companies could very well reach five different conclusions. But even Microsoft has said that its basic goal for Bing over the next year is to pick up 2 percentage points of share, which unless Yahoo goes completely dark will still leave it solidly in third place behind Google and Yahoo.
READ MORE - Bing off to solid start, but not that good

For maps, detail is king

Streetdirectory.com's edge lies in its detail, said Firdhaus Akber, Streetdirectory.com managing director.
Akber told ZDNet Asia in an interview that the Singaporean mapping site still holds its edge over competitors like Google Maps because of the level of detail captured on its maps, such as bus stops and carpark locations.
The company has a stable of some 40 surveyors collecting street data, and spends about half its operating costs on data collection, as well as training these surveyors to provide usable data to the site.
Akber said this is necessary to keep the maps accurate, as well as frequently updated because "Singapore roads change so fast".
Unlike Google, the company appears to be launching cautiously into the territory of user-generated data.
Still playing the accuracy card, Akber said Streetdirectory.com accepts user feedback, but sends surveyors out to verify submitted information before the site's editorial team adds it to the maps.
Google Maps, on the other hand, relies on user-submitted tags to demarcate locations. Akber said this has resulted in numerous inaccurately tagged landmarks on the search giant's maps service, which "can be confusing" to users. Landmarks are also often tagged multiple times at slightly different spots on the map, leaving users to guess which tag is correct, he added.
This accuracy, said Akber, has allowed the site to retain its audience of some 150,000 unique visitors a day, in spite of a six-month hiatus last year.
"No one has overtaken our traffic in the six months we were down," he said.
The site was rebuilt from scratch and relaunched last August, to avoid legal issues after its then-owner, Virtual Maps, lost a copyright infringement case to the Singapore Land Authority.
But the site is not cutting out user-generated content altogether. Its restaurant listings section, built around a site redesign launched Friday, allows users to submit reviews of restaurants which are "less policed"--scanned for offensive content but uncensored for negative reviews.
"Google Maps is about breadth, if you want to jump from [maps of] country to country. We're focusing on Asia," said Akber.
Streetdirectory.com plans to collect map data for Jakarta, the Philippines, Kuala Lumpur and the Malaysian state of Johor next.
READ MORE - For maps, detail is king

Data center survey: Storage appears to be CIOs' focus

Cisco's Unified Computing System is garnering interest, but storage appears to be the focus of CIOs as they ponder the next-gen data center, study reveals.

Cisco's Unified Computing System is garnering interest, but storage appears to be the focus of CIOs as they ponder the next generation data center and that's good news for EMC and NetApp, according to a Goldman Sachs survey.
Goldman Sachs surveyed 100 IT executives at Fortune 1000 companies to get a read on their data center plans two to three years from now.
Among the takeaways:
Cisco's Unified Computing System (UCS) has found "a surprisingly receptive ear", according to Goldman Sachs. Indeed, 18 percent are planning to evaluate Cisco's UCS in the next 12 months, an impressive figure for a product that was announced only in March. Another two-thirds of IT execs say they expect Cisco to have a larger server presence over the next two to three years.
Among those surveyed, 18 percent said they will evaluate UCS in the next 12 months, 44 percent said no and 38 percent were unsure.
Cisco, HP and Dell were vendors expected to increase data center share, according to respondents. Sun and IBM are seen decreasing.

The next-gen data center push is benefiting pure storage players. EMC and NetApp are seen gaining share in the next-gen data center. A key point: as tech giants try to further integrate hardware and software, independent storage vendors NetApp and EMC are benefiting. Why? These vendors work with any architecture, and they're ahead on storage virtualization.

VMware is seen as the most strategic software vendor, but Microsoft has a better-than-expected finish. Meanwhile, Oracle got a mention as being strategic on the virtualization front.
Cisco and Juniper defend switching turf. Goldman Sachs notes:
Despite the heightened activity in data center networking, including the launch of Juniper's new high-end switching platform as well as HP's ProCurve partner ecosystem, Cisco is expected to further extend its already sizable lead in the long term. This is consistent with our IT Survey's results pointing to share gains in the near term. Juniper also appears to be gaining traction in switching as our survey points to the company increasing its presence in the data center, with nearly 70 percent of the respondents citing share gains over the next two to three years.
READ MORE - Data center survey: Storage appears to be CIOs' focus

Reviewing pandemic plans has just become job #1

With Influenza A hitting over 60 countries, IT managers need to step up contingency planning. One CIO shares his organization's experience and plans.

As of June 1, Influenza A, also known as H1N1, has infected over 17,000 people worldwide and caused 115 deaths.
This most recent flu outbreak is of the same variety as that responsible for an estimated 40 to 50 million deaths worldwide in 1918. Although health officials have yet to determine a number of facts, including the origin of the disease and its virulence, the outbreak should spur renewed conversations regarding contingency plans in the event that the current outbreak turns into something more serious.
Many organizations developed contingency plans based on concerns about bird flu, so there is probably already at least a framework in place, if not a full plan.
A contingency plan put into motion for a pandemic is likely to be different from many other business continuity-type plans. For example, if headquarters is wiped out by a tornado, setting up operations at an alternate site makes a lot of sense. However, when it comes to something that, quite frankly, scares people away from the office, such as a virulent disease, the path isn't always as clear.
In cases like this, in order to maintain operations, the organization would need to keep at least a skeleton staff on site and significantly enhance remote-worker capabilities for those who need to work but who, for whatever reason, can't or won't make it to the office.
At Westminster College, we do have campus plans for what to do in the event of a pandemic or significant public health emergency. However, for the kind of campus we are--very traditional with no online classes and all courses taught on site--we don't maintain regular significant remote access capabilities so our normal operations don't include what we'd need in the event of, well, an event.
With the news continuing to come out regarding the spread of confirmed cases of H1N1, my staff and I are taking a few relatively minor steps in preparation for a possible problem:
  • First, we're verifying our VPN services to make sure that we have enough licenses and capacity for increased volume. We don't currently have many VPN users. Again, we're a very traditional, very residential campus, with VPN used primarily by those who travel on college business.
  • We're also going to prep a couple of additional servers as terminal servers. Through this and VPN, users will be able to continue to easily run their normal applications from anywhere.
  • A part of the campus-wide plan calls for staff who will stay on site for long periods of time in the event of a pandemic. Given that my staff has seen a lot of turnover since the campus pandemic plan was developed, we'll have conversations regarding this point.
  • We will verify with our service providers, including Internet and electrical service providers, our points of contact in the event of a pandemic. Although we have this information in our campus pandemic plan, periodic review is essential to keep the information current.
At this point, we won't go overboard in preparing for what could turn out to be a whole lot of nothing. Even if the whole thing fizzles out right now--and I hope it does--it's a valuable reminder that we need to stay vigilant with regard to our disaster and pandemic planning and make sure that we're ready for whatever comes our way.
Scott Lowe has spent 15 years in the IT world and is currently the chief information officer for Westminster College in Fulton, Missouri. Scott is also a regular contributor to TechRepublic and has authored one book, Home Networking: The Missing Manual (O’Reilly) and coauthored the Microsoft Exchange Server 2007 Administrator's Companion (MS Press).
READ MORE - Reviewing pandemic plans has just become job #1